Sample records for decision making algorithm

  1. Decision-making without a brain: how an amoeboid organism solves the two-armed bandit.

    PubMed

    Reid, Chris R; MacDonald, Hannelore; Mann, Richard P; Marshall, James A R; Latty, Tanya; Garnier, Simon

    2016-06-01

    Several recent studies hint at shared patterns in decision-making between taxonomically distant organisms, yet few studies demonstrate and dissect mechanisms of decision-making in simpler organisms. We examine decision-making in the unicellular slime mould Physarum polycephalum using a classical decision problem adapted from human and animal decision-making studies: the two-armed bandit problem. This problem has previously only been used to study organisms with brains, yet here we demonstrate that a brainless unicellular organism compares the relative qualities of multiple options, integrates over repeated samplings to perform well in random environments, and combines information on reward frequency and magnitude in order to make correct and adaptive decisions. We extend our inquiry by using Bayesian model selection to determine the most likely algorithm used by the cell when making decisions. We deduce that this algorithm centres around a tendency to exploit environments in proportion to their reward experienced through past sampling. The algorithm is intermediate in computational complexity between simple, reactionary heuristics and calculation-intensive optimal performance algorithms, yet it has very good relative performance. Our study provides insight into ancestral mechanisms of decision-making and suggests that fundamental principles of decision-making, information processing and even cognition are shared among diverse biological systems.
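
    A minimal sketch of the kind of rule the authors describe, assuming a Bernoulli two-armed bandit: the organism samples each arm in proportion to the total reward experienced on it through past sampling. The payoff probabilities, seeding and trial count below are invented for illustration; this is not the authors' fitted model.

      import random

      def proportional_bandit(arms, n_trials, seed=0):
          """Play a two-armed bandit, sampling each arm in proportion to
          the total reward experienced on it so far (a matching-law rule)."""
          rng = random.Random(seed)
          totals = [1.0] * len(arms)  # optimistic seed so every arm gets tried
          pulls = [0] * len(arms)
          for _ in range(n_trials):
              r, acc, choice = rng.random() * sum(totals), 0.0, 0
              for i, t in enumerate(totals):
                  acc += t
                  if r <= acc:
                      choice = i
                      break
              reward = 1.0 if rng.random() < arms[choice] else 0.0
              totals[choice] += reward
              pulls[choice] += 1
          return pulls

      # Arm 0 pays off 60% of the time, arm 1 only 30%:
      print(proportional_bandit([0.6, 0.3], 1000))  # most pulls go to arm 0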

  2. Reconciliation of Decision-Making Heuristics Based on Decision Trees Topologies and Incomplete Fuzzy Probabilities Sets

    PubMed Central

    Doubravsky, Karel; Dohnal, Mirko

    2015-01-01

    Complex decision-making tasks of different natures, e.g. in economics, safety engineering, ecology and biology, are based on vague, sparse, partially inconsistent and subjective knowledge. Moreover, decision-making economists and engineers are usually not willing to invest much time in the study of complex formal theories. They require decisions that can be (re)checked by human-like common-sense reasoning. One important problem related to realistic decision-making tasks is the incomplete data sets required by the chosen decision-making algorithm. This paper presents a relatively simple algorithm by which some missing input information items (IIIs) can be generated, using mainly decision tree topologies, and integrated into incomplete data sets. The algorithm is based on easy-to-understand heuristics, e.g. that a longer decision tree sub-path is less probable. This heuristic can solve decision problems under total ignorance, i.e. when the decision tree topology is the only information available. In practice, however, isolated information items, e.g. some vaguely known probabilities (e.g. fuzzy probabilities), are usually available, which means that a realistic problem is analysed under partial ignorance. The proposed algorithm reconciles topology-related heuristics and additional fuzzy sets using fuzzy linear programming. A case study, represented by a tree with six lotteries and one fuzzy probability, is presented in detail. PMID:26158662
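
    One way to realise the heuristic that a longer decision-tree sub-path is less probable is to give each root-to-leaf path a prior weight that decays geometrically with its depth and then renormalise. The sketch below, with an invented decay parameter, illustrates the total-ignorance case only; the paper's reconciliation with fuzzy probabilities via fuzzy linear programming is not reproduced.

      def path_length_priors(paths, decay=0.5):
          """Assign each root-to-leaf path a prior proportional to
          decay**len(path), so longer sub-paths are less probable."""
          weights = {p: decay ** len(p) for p in paths}
          total = sum(weights.values())
          return {p: w / total for p, w in weights.items()}

      # Three lotteries reached by paths of depth 1, 2 and 3:
      print(path_length_priors([("A",), ("B", "B1"), ("B", "B2", "X")]))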

  4. Out-of-Home Placement Decision-Making and Outcomes in Child Welfare: A Longitudinal Study

    PubMed Central

    McClelland, Gary M.; Weiner, Dana A.; Jordan, Neil; Lyons, John S.

    2015-01-01

    After children enter the child welfare system, subsequent out-of-home placement decisions and their impact on children’s well-being are complex and under-researched. This study examined two placement decision-making models: a multidisciplinary team approach, and a decision support algorithm using a standardized assessment. Based on 3,911 placement records in the Illinois child welfare system over 4 years, concordant (agreement) and discordant (disagreement) decisions between the two models were compared. Concordant decisions consistently predicted improvement in children’s well-being regardless of placement type. Discordant decisions showed greater variability. In general, placing children in settings less restrictive than the algorithm suggested (“under-placing”) was associated with less severe baseline functioning but also less improvement over time than placing children according to the algorithm. “Over-placing” children in settings more restrictive than the algorithm recommended was associated with more severe baseline functioning but fewer significant results in rate of improvement than predicted by concordant decisions. The implications of placement decision-making for policy, restrictiveness of placement, and the delivery of treatments and services in child welfare are discussed. PMID:24677172

  5. Automated Decision-Making and Big Data: Concerns for People With Mental Illness.

    PubMed

    Monteith, Scott; Glenn, Tasha

    2016-12-01

    Automated decision-making by computer algorithms based on data from our behaviors is fundamental to the digital economy. Automated decisions impact everyone, occurring routinely in education, employment, health care, credit, and government services. Technologies that generate tracking data, including smartphones, credit cards, websites, social media, and sensors, offer unprecedented benefits. However, people are vulnerable to errors and biases in the underlying data and algorithms, especially those with mental illness. Algorithms based on big data from seemingly unrelated sources may create obstacles to community integration. Voluntary online self-disclosure and constant tracking blur traditional concepts of public versus private data, medical versus non-medical data, and human versus automated decision-making. In contrast to sharing sensitive information with a physician in a confidential relationship, there may be numerous readers of information revealed online; data may be sold repeatedly, used in proprietary algorithms, and retained effectively permanently. Technological changes challenge traditional norms affecting privacy and decision-making, and continued discussions on new approaches to provide privacy protections are needed.

  6. Improving family satisfaction and participation in decision making in an intensive care unit.

    PubMed

    Huffines, Meredith; Johnson, Karen L; Smitz Naranjo, Linda L; Lissauer, Matthew E; Fishel, Marmie Ann-Michelle; D'Angelo Howes, Susan M; Pannullo, Diane; Ralls, Mindy; Smith, Ruth

    2013-10-01

    Background: Survey data revealed that families of patients in a surgical intensive care unit were not satisfied with their participation in decision making or with how well the multidisciplinary team worked together. Objectives: To develop and implement an evidence-based communication algorithm and evaluate its effect in improving satisfaction among patients' families. Methods: A multidisciplinary team developed an algorithm that included bundles of communication interventions at 24, 72, and 96 hours after admission to the unit. The algorithm included clinical triggers, which if present escalated the algorithm. A pre-post design using process improvement methods was used to compare families' satisfaction scores before and after implementation of the algorithm. Results: Satisfaction scores for participation in decision making (45% vs 68%; z = -2.62, P = .009) and how well the health care team worked together (64% vs 83%; z = -2.10, P = .04) improved significantly after implementation. Conclusions: Use of an evidence-based structured communication algorithm may be a way to improve satisfaction of families of intensive care patients with their participation in decision making and their perception of how well the unit's team works together.

  7. Algorithms for optimizing the treatment of depression: making the right decision at the right time.

    PubMed

    Adli, M; Rush, A J; Möller, H-J; Bauer, M

    2003-11-01

    Medication algorithms for the treatment of depression are designed to optimize both treatment implementation and the appropriateness of treatment strategies. Thus, they are essential tools for treating and avoiding refractory depression. Treatment algorithms are explicit treatment protocols that provide specific therapeutic pathways and decision-making tools at critical decision points throughout the treatment process. The present article provides an overview of major projects of algorithm research in the field of antidepressant therapy. The Berlin Algorithm Project and the Texas Medication Algorithm Project (TMAP) compare algorithm-guided treatments with treatment as usual. The Sequenced Treatment Alternatives to Relieve Depression Project (STAR*D) compares different treatment strategies in treatment-resistant patients.

  8. Research on AHP decision algorithms based on BP algorithm

    NASA Astrophysics Data System (ADS)

    Ma, Ning; Guan, Jianhe

    2017-10-01

    Decision making is the thinking activity by which people choose or judge, and scientific decision making has long been a topic of active research. The Analytic Hierarchy Process (AHP) is a simple and practical multi-criteria, multi-objective decision-making method that combines quantitative and qualitative analysis and expresses subjective judgments in numerical form. In decision analysis using the AHP, the consistency of the pairwise comparison judgment matrix has a great influence on the decision result. However, when dealing with real problems, the judgment matrix produced by pairwise comparison is often inconsistent, that is, it does not meet the consistency requirement. The BP neural network algorithm is an adaptive nonlinear dynamic system with powerful collective computing and learning abilities; it refines the data by iteratively modifying the weights and thresholds of the network so as to minimize the mean square error. In this paper, the BP algorithm is used to deal with the consistency of the pairwise comparison judgment matrix in the AHP.
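
    The inconsistency that the paper's BP network is trained to repair is conventionally quantified by Saaty's consistency ratio, CR = CI/RI with CI = (lambda_max - n)/(n - 1). A minimal sketch with an invented 3 x 3 pairwise comparison matrix (standard AHP bookkeeping, not the paper's neural correction):

      import numpy as np

      # Saaty's random-index values for matrix orders 3..10
      RI = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32,
            8: 1.41, 9: 1.45, 10: 1.49}

      def consistency_ratio(A):
          """CR = CI / RI with CI = (lambda_max - n) / (n - 1).
          A is a positive reciprocal pairwise-comparison matrix, n >= 3."""
          n = A.shape[0]
          lam_max = max(np.linalg.eigvals(A).real)
          ci = (lam_max - n) / (n - 1)
          return ci / RI[n]

      A = np.array([[1, 3, 5],
                    [1/3, 1, 2],
                    [1/5, 1/2, 1]])
      print(consistency_ratio(A))  # < 0.10 is conventionally acceptable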

  9. Demonstration of Prognostics-Enabled Decision Making Algorithms on a Hardware Mobile Robot Test Platform

    DTIC Science & Technology

    2014-10-02

    were described in (Balaban, Saxena, Bansal, Goebel, & Curran, 2009; Poll et al., 2011), and, in the course of this work, three types of sensor faults ... enabled decision making algorithms. International Journal of Prognostics and Health Management, 4(1). Balaban, E., Saxena, A., Bansal, P., Goebel, K. F

  10. Decision Aids for Naval Air ASW

    DTIC Science & Technology

    1980-03-15

    Algorithm for Zone Optimization Investigation) NADC Developing Sonobuoy Pattern for Air ASW Search DAISY (Decision Aiding Information System) Wharton ... sion making behavior. • Artificial intelligence sequential pattern recognition algorithm for reconstructing the decision maker's utility functions. • ... display presenting the uncertainty area of the target. 3.1.5 Algorithm for Zone Optimization Investigation (AZOI) -- Naval Air Development Center • A

  11. Analysis of decision fusion algorithms in handling uncertainties for integrated health monitoring systems

    NASA Astrophysics Data System (ADS)

    Zein-Sabatto, Saleh; Mikhail, Maged; Bodruzzaman, Mohammad; DeSimio, Martin; Derriso, Mark; Behbahani, Alireza

    2012-06-01

    It has been widely accepted that data fusion and information fusion methods can improve the accuracy and robustness of decision-making in structural health monitoring systems. Decision-level fusion is arguably just as beneficial when applied to integrated health monitoring systems. Several decisions at low levels of abstraction may be produced by different decision-makers; however, decision-level fusion is required at the final stage of the process to provide an accurate assessment of the health of the monitored system as a whole. An example of such integrated systems with complex decision-making scenarios is the integrated health monitoring of aircraft. A thorough understanding of the characteristics of decision-fusion methodologies is a crucial step for successful implementation of such decision-fusion systems. In this paper, we first present the major information fusion methodologies reported in the literature, i.e., probabilistic, evidential, and artificial-intelligence-based methods. The theoretical basis and characteristics of these methodologies are explained and their performances analyzed. Second, candidate methods from the above fusion methodologies, i.e., Bayesian, Dempster-Shafer, and fuzzy logic algorithms, are selected and their applications extended to decision fusion. Finally, fusion algorithms are developed based on the selected fusion methods, and their performance is tested on decisions generated from synthetic data and from experimental data. A modeling methodology for generating synthetic decisions, the cloud model, is also presented and used. Using the cloud model, both types of uncertainty involved in real decision-making, randomness and fuzziness, are modeled. Synthetic decisions are generated with an unbiased process and varying interaction complexities among decisions to provide a fair performance comparison of the selected decision-fusion algorithms. For verification purposes, implementation results of the developed fusion algorithms on structural health monitoring data collected from experimental tests are also reported.
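
    Of the three candidate methods named (Bayesian, Dempster-Shafer, fuzzy logic), the Dempster-Shafer combination step is the least familiar. A minimal sketch of Dempster's rule for two mass functions over a two-state health frame; the masses assigned to {healthy}, {damaged} and the full frame (ignorance) are invented:

      from itertools import product

      def dempster_combine(m1, m2):
          """Combine two mass functions (dicts: frozenset -> mass) with
          Dempster's rule, renormalising by the conflict mass K."""
          combined, conflict = {}, 0.0
          for (a, x), (b, y) in product(m1.items(), m2.items()):
              inter = a & b
              if inter:
                  combined[inter] = combined.get(inter, 0.0) + x * y
              else:
                  conflict += x * y
          return {s: v / (1.0 - conflict) for s, v in combined.items()}

      H, D = frozenset({"healthy"}), frozenset({"damaged"})
      theta = H | D  # mass on the whole frame expresses ignorance
      m1 = {H: 0.6, D: 0.1, theta: 0.3}   # decision from monitor 1
      m2 = {H: 0.5, D: 0.2, theta: 0.3}   # decision from monitor 2
      print(dempster_combine(m1, m2))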

  12. Data quality system using reference dictionaries and edit distance algorithms

    NASA Astrophysics Data System (ADS)

    Karbarz, Radosław; Mulawka, Jan

    2015-09-01

    In the real art of management, it is important to make smart decisions, which in most cases is not a trivial task. Those decisions may determine production levels, allocate funds for investments, etc. Most of the parameters in the decision-making process, such as interest rates, goods values or exchange rates, may change. It is well known that these decision-making parameters are based on the data contained in data marts or a data warehouse. However, if the information derived from the processed data sets is the basis for the most important management decisions, the data are required to be accurate, complete and current. In order to achieve high-quality data, and to gain measurable business benefits from them, a data quality system should be used. The article describes the approach to the problem, shows the algorithms in detail and explains their usage. Finally, test results are provided. The test results show the best algorithms (in terms of quality and quantity) for different parameters and data distributions.
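
    The matching step of such a system can be sketched with the classic dynamic-programming edit distance: a dirty value is replaced by the nearest entry in a reference dictionary when it lies within a small edit radius. The dictionary, threshold and test value below are invented; the paper's exact algorithms are not reproduced.

      def levenshtein(a, b):
          """Classic dynamic-programming edit distance."""
          prev = list(range(len(b) + 1))
          for i, ca in enumerate(a, 1):
              cur = [i]
              for j, cb in enumerate(b, 1):
                  cur.append(min(prev[j] + 1,                  # deletion
                                 cur[j - 1] + 1,               # insertion
                                 prev[j - 1] + (ca != cb)))    # substitution
              prev = cur
          return prev[-1]

      def clean(value, dictionary, max_dist=2):
          """Replace a dirty value with its nearest reference entry
          if it is within max_dist edits; otherwise keep it as-is."""
          best = min(dictionary, key=lambda ref: levenshtein(value, ref))
          return best if levenshtein(value, best) <= max_dist else value

      cities = ["Warszawa", "Krakow", "Gdansk"]   # reference dictionary
      print(clean("Warsawa", cities))             # -> "Warszawa"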

  13. Multi-objective optimisation and decision-making of space station logistics strategies

    NASA Astrophysics Data System (ADS)

    Zhu, Yue-he; Luo, Ya-zhong

    2016-10-01

    Space station logistics strategy optimisation is a complex engineering problem with multiple objectives. Finding a decision-maker-preferred compromise solution becomes more significant when solving such a problem. However, the designer-preferred solution is not easy to determine using the traditional method. Thus, a hybrid approach that combines the multi-objective evolutionary algorithm, physical programming, and differential evolution (DE) algorithm is proposed to deal with the optimisation and decision-making of space station logistics strategies. A multi-objective evolutionary algorithm is used to acquire a Pareto frontier and help determine the range parameters of the physical programming. Physical programming is employed to convert the four-objective problem into a single-objective problem, and a DE algorithm is applied to solve the resulting physical programming-based optimisation problem. Five kinds of objective preference are simulated and compared. The simulation results indicate that the proposed approach can produce good compromise solutions corresponding to different decision-makers' preferences.
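
    The DE step can be sketched with the standard DE/rand/1/bin scheme; here it minimises a toy sphere function standing in for the single-objective problem produced by physical programming. The population size, mutation factor and crossover rate are invented defaults, not the authors' settings.

      import random

      def differential_evolution(f, bounds, np_=20, cr=0.9, fmut=0.8,
                                 gens=200, seed=1):
          """Minimal DE/rand/1/bin minimiser over box bounds."""
          rng = random.Random(seed)
          dim = len(bounds)
          pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(np_)]
          cost = [f(x) for x in pop]
          for _ in range(gens):
              for i in range(np_):
                  a, b, c = rng.sample([j for j in range(np_) if j != i], 3)
                  jrand = rng.randrange(dim)   # force at least one mutated gene
                  trial = [pop[a][k] + fmut * (pop[b][k] - pop[c][k])
                           if (rng.random() < cr or k == jrand) else pop[i][k]
                           for k in range(dim)]
                  trial = [min(max(t, lo), hi)
                           for t, (lo, hi) in zip(trial, bounds)]
                  if (fc := f(trial)) < cost[i]:     # greedy selection
                      pop[i], cost[i] = trial, fc
          return min(zip(cost, pop))

      print(differential_evolution(lambda x: sum(v * v for v in x),
                                   [(-5, 5)] * 3))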

  14. Multi-criteria group decision making for evaluating the performance of e-waste recycling programs under uncertainty.

    PubMed

    Wibowo, Santoso; Deng, Hepu

    2015-06-01

    This paper presents a multi-criteria group decision making approach for effectively evaluating the performance of e-waste recycling programs under uncertainty in an organization. Intuitionistic fuzzy numbers are used for adequately representing the subjective and imprecise assessments of the decision makers in evaluating the relative importance of evaluation criteria and the performance of individual e-waste recycling programs with respect to individual criteria in a given situation. An interactive fuzzy multi-criteria decision making algorithm is developed for facilitating consensus building in a group decision making environment, ensuring that all the interests of individual decision makers are appropriately considered in evaluating alternative e-waste recycling programs with respect to their corporate sustainability performance. The developed algorithm is then incorporated into a multi-criteria decision support system that makes the overall performance evaluation process effective and simple to use. Such a multi-criteria decision making system provides organizations with a proactive mechanism for incorporating the concept of corporate sustainability into their regular planning decisions and business practices. An example is presented for demonstrating the applicability of the proposed approach in evaluating the performance of e-waste recycling programs in organizations.
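
    A common way to pool intuitionistic fuzzy assessments across decision makers is the intuitionistic fuzzy weighted averaging (IFWA) operator over (membership mu, non-membership nu) pairs with mu + nu <= 1. A minimal sketch with invented ratings and weights; the paper's interactive consensus-building procedure is not reproduced:

      from math import prod

      def ifwa(values, weights):
          """Intuitionistic fuzzy weighted average of (mu, nu) pairs:
          mu = 1 - prod((1 - mu_i)^w_i),  nu = prod(nu_i^w_i)."""
          mu = 1.0 - prod((1.0 - m) ** w for (m, _), w in zip(values, weights))
          nu = prod(n ** w for (_, n), w in zip(values, weights))
          return mu, nu

      # Three decision makers rate one recycling program as (mu, nu):
      ratings = [(0.7, 0.2), (0.6, 0.3), (0.8, 0.1)]
      weights = [0.4, 0.3, 0.3]   # decision makers' importance, sums to 1
      print(ifwa(ratings, weights))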

  15. [Adequacy of clinical interventions in patients with advanced and complex disease. Proposal of a decision making algorithm].

    PubMed

    Ameneiros-Lago, E; Carballada-Rico, C; Garrido-Sanjuán, J A; García Martínez, A

    2015-01-01

    Decision making in the patient with chronic advanced disease is especially complex. Health professionals are obliged to prevent avoidable suffering and not to add any more damage to that of the disease itself. The adequacy of the clinical interventions consists of offering only those diagnostic and therapeutic procedures appropriate to the clinical situation of the patient, and performing only those allowed by the patient or the patient's representative. In this article, the use of an algorithm is proposed that should serve to help health professionals in this decision making process.

  16. Possibility expectation and its decision making algorithm

    NASA Technical Reports Server (NTRS)

    Keller, James M.; Yan, Bolin

    1992-01-01

    The fuzzy integral has been shown to be an effective tool for the aggregation of evidence in decision making. Of primary importance in the development of a fuzzy integral pattern recognition algorithm is the choice (construction) of the measure which embodies the importance of subsets of sources of evidence. Sugeno fuzzy measures have received the most attention due to the recursive nature of the fabrication of the measure on nested sequences of subsets. Possibility measures exhibit an even simpler generation capability, but usually require that one of the sources of information possess complete credibility. In real applications, such normalization may not be possible, or even desirable. In this report, both the theory and a decision making algorithm for a variation of the fuzzy integral are presented. This integral is based on a possibility measure where it is not required that the measure of the universe be unity. A training algorithm for the possibility densities in a pattern recognition application is also presented with the results demonstrated on the shuttle-earth-space training and testing images.
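
    A minimal sketch of the underlying computation, assuming the standard Sugeno fuzzy integral with a possibility measure g(A) = max of the densities over A, and, as in the record, without forcing the measure of the universe to be 1. The sources, evidence values and densities are invented:

      def sugeno_integral(h, density):
          """Sugeno fuzzy integral of evidence h (source -> value in [0,1])
          w.r.t. a possibility measure g(A) = max of densities over A.
          The measure of the whole universe need not be 1."""
          items = sorted(h.items(), key=lambda kv: kv[1], reverse=True)
          best, g_so_far = 0.0, 0.0
          for source, value in items:
              g_so_far = max(g_so_far, density[source])  # g of the top-i set
              best = max(best, min(value, g_so_far))
          return best

      h = {"edges": 0.8, "texture": 0.6, "shape": 0.3}   # evidence per source
      g = {"edges": 0.5, "texture": 0.9, "shape": 0.4}   # possibility densities
      print(sugeno_integral(h, g))                        # -> 0.6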

  17. Fleeing from Frankenstein's Monster and Meeting Kafka on the Way: Algorithmic Decision-Making in Higher Education

    ERIC Educational Resources Information Center

    Prinsloo, Paul

    2017-01-01

    In the socio-technical imaginary of higher education, algorithmic decision-making offers huge potential, but we also cannot deny the risks and ethical concerns. In fleeing from Frankenstein's monster, there is a real possibility that we will meet Kafka on our path, and not find our way out of the maze of ethical considerations in the nexus between…

  18. Artificial intelligence in cardiology.

    PubMed

    Bonderman, Diana

    2017-12-01

    Decision-making is complex in modern medicine and should ideally be based on available data, structured knowledge and proper interpretation in the context of an individual patient. Automated algorithms, also termed artificial intelligence, which are able to extract meaningful patterns from data collections and build decisions upon identified patterns, may be useful assistants in clinical decision-making processes. In this article, artificial intelligence-based studies in clinical cardiology are reviewed. The text also touches on the ethical issues and speculates on the future roles of automated algorithms versus clinicians in cardiology and medicine in general.

  19. Managing and learning with multiple models: Objectives and optimization algorithms

    USGS Publications Warehouse

    Probert, William J. M.; Hauser, C.E.; McDonald-Madden, E.; Runge, M.C.; Baxter, P.W.J.; Possingham, H.P.

    2011-01-01

    The quality of environmental decisions should be gauged according to managers' objectives. Management objectives generally seek to maximize quantifiable measures of system benefit, for instance population growth rate. Reaching these goals often requires a certain degree of learning about the system. Learning can occur by using management action in combination with a monitoring system. Furthermore, actions can be chosen strategically to obtain specific kinds of information. Formal decision making tools can choose actions to favor such learning in two ways: implicitly via the optimization algorithm that is used when there is a management objective (for instance, when using adaptive management), or explicitly by quantifying knowledge and using it as the fundamental project objective, an approach new to conservation. This paper outlines three conservation project objectives - a pure management objective, a pure learning objective, and an objective that is a weighted mixture of these two. We use eight optimization algorithms to choose actions that meet project objectives and illustrate them in a simulated conservation project. The algorithms provide a taxonomy of decision making tools in conservation management when there is uncertainty surrounding competing models of system function. The algorithms build upon each other such that their differences are highlighted and practitioners may see where their decision making tools can be improved.
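
    A minimal sketch of the weighted-mixture objective, under invented numbers: each action is scored by w times its expected management benefit (averaged over the model weights) plus (1 - w) times its expected reduction in uncertainty about which model is correct (expected entropy reduction over the model weights). The two-model, two-outcome setup is illustrative only.

      from math import log2

      def entropy(ws):
          return -sum(w * log2(w) for w in ws if w > 0)

      def expected_info_gain(prior, outcome_probs):
          """Expected entropy reduction over model weights after observing
          one outcome; outcome_probs[m][o] = P(outcome o | model m)."""
          gain = 0.0
          for o in range(len(outcome_probs[0])):
              p_o = sum(w * probs[o] for w, probs in zip(prior, outcome_probs))
              if p_o == 0:
                  continue
              post = [w * probs[o] / p_o
                      for w, probs in zip(prior, outcome_probs)]
              gain += p_o * (entropy(prior) - entropy(post))
          return gain

      def score(action_benefit, prior, outcome_probs, w):
          """Weighted mixture of management benefit and learning value."""
          benefit = sum(wm * b for wm, b in zip(prior, action_benefit))
          return w * benefit + (1 - w) * expected_info_gain(prior, outcome_probs)

      prior = [0.5, 0.5]                 # two competing system models
      # Action A: informative (models disagree), modest benefit:
      print(score([0.4, 0.6], prior, [[0.9, 0.1], [0.2, 0.8]], w=0.7))
      # Action B: uninformative (models agree), higher benefit:
      print(score([0.7, 0.7], prior, [[0.5, 0.5], [0.5, 0.5]], w=0.7))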

  20. Evidence-based algorithm for heparin dosing before cardiopulmonary bypass. Part 1: Development of the algorithm.

    PubMed

    McKinney, Mark C; Riley, Jeffrey B

    2007-12-01

    The incidence of heparin resistance during adult cardiac surgery with cardiopulmonary bypass has been reported at 15%-20%. The consistent use of a clinical decision-making algorithm may increase the consistency of patient care and likely reduce the total required heparin dose and other problems associated with heparin dosing. After a directed survey of practicing perfusionists regarding treatment of heparin resistance and a literature search for high-level evidence regarding the diagnosis and treatment of heparin resistance, an evidence-based decision-making algorithm was constructed. The face validity of the algorithm decisive steps and logic was confirmed by a second survey of practicing perfusionists. The algorithm begins with review of the patient history to identify predictors for heparin resistance. The definition for heparin resistance contained in the algorithm is an activated clotting time < 450 seconds with > 450 IU/kg heparin loading dose. Based on the literature, the treatment for heparin resistance used in the algorithm is anti-thrombin III supplement. The algorithm seems to be valid and is supported by high-level evidence and clinician opinion. The next step is a human randomized clinical trial to test the clinical procedure guideline algorithm vs. current standard clinical practice.
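
    The decisive step, as stated in the record, reduces to a simple rule. A sketch for illustration only, not a clinical tool; the thresholds are those quoted in the abstract:

      def heparin_resistant(act_seconds, heparin_iu_per_kg):
          """Rule from the record: ACT < 450 s despite a loading dose
          > 450 IU/kg indicates heparin resistance."""
          return act_seconds < 450 and heparin_iu_per_kg > 450

      def next_step(act_seconds, heparin_iu_per_kg):
          if heparin_resistant(act_seconds, heparin_iu_per_kg):
              return "treat as heparin resistance: antithrombin III supplement"
          return "proceed per standard heparin protocol"

      print(next_step(act_seconds=410, heparin_iu_per_kg=500))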

  1. Intelligent Medical Systems for Aerospace Emergency Medical Services

    NASA Technical Reports Server (NTRS)

    Epler, John; Zimmer, Gary

    2004-01-01

    The purpose of this project is to develop a portable, hands free device for emergency medical decision support to be used in remote or confined settings by non-physician providers. Phase I of the project will entail the development of a voice-activated device that will utilize an intelligent algorithm to provide guidance in establishing an airway in an emergency situation. The interactive, hands free software will process requests for assistance based on verbal prompts and algorithmic decision-making. The device will allow the CMO to attend to the patient while receiving verbal instruction. The software will also feature graphic representations where it is felt helpful in aiding in procedures. We will also develop a training program to orient users to the algorithmic approach, the use of the hardware and specific procedural considerations. We will validate the efficacy of this mode of technology application by testing in the Johns Hopkins Department of Emergency Medicine. Phase I of the project will focus on the validation of the proposed algorithm, testing and validation of the decision making tool and modifications of medical equipment. In Phase II, we will produce the first generation software for hands-free, interactive medical decision making for use in acute care environments.

  2. The Computational Complexity of Valuation and Motivational Forces in Decision-Making Processes.

    PubMed

    Redish, A David; Schultheiss, Nathan W; Carter, Evan C

    2016-01-01

    The concept of value is fundamental to most theories of motivation and decision making. However, value has to be measured experimentally. Different methods of measuring value produce incompatible valuation hierarchies. Taking the agent's perspective (rather than the experimenter's), we interpret the different valuation measurement methods as accessing different decision-making systems and show how these different systems depend on different information processing algorithms. This identifies the translation from these multiple decision-making systems into a single action taken by a given agent as one of the most important open questions in decision making today. We conclude by looking at how these different valuation measures accessing different decision-making systems can be used to understand and treat decision dysfunction such as in addiction.

  3. Computed Tomography Evaluation of Esophagogastric Necrosis After Caustic Ingestion.

    PubMed

    Chirica, Mircea; Resche-Rigon, Matthieu; Zagdanski, Anne Marie; Bruzzi, Matthieu; Bouda, Damien; Roland, Eric; Sabatier, François; Bouhidel, Fatiha; Bonnet, Francine; Munoz-Bongrand, Nicolas; Marc Gornet, Jean; Sarfati, Emile; Cattan, Pierre

    2016-07-01

    Endoscopy is the standard of care for emergency patient evaluation after caustic ingestion. However, the inaccuracy of endoscopy in determining the depth of intramural necrosis may lead to inappropriate decision-making with devastating consequences. Our aim was to evaluate the use of computed tomography (CT) for the emergency diagnostic workup of patients with caustic injuries. In a prospective study, we used a combined endoscopy-CT decision-making algorithm. The primary outcome was pathology-confirmed digestive necrosis. The respective utility of CT and endoscopy in the decision-making process was compared. Transmural endoscopic necrosis was defined as grade 3b injuries; signs of transmural CT necrosis included absence of postcontrast gastric/esophageal-wall enhancement, esophageal-wall blurring, and periesophageal-fat blurring. We included 120 patients (59 men, median age 44 years). Emergency surgery was performed in 24 patients (20%) and digestive resection was completed in 16. Three patients (3%) died and 28 patients (23%) experienced complications. Pathology revealed transmural necrosis in 9/11 esophagectomy and 16/16 gastrectomy specimens. Severe oropharyngeal injuries (P = 0.015), increased levels of blood lactate (P = 0.007), alanine aminotransferase (P = 0.027), bilirubin (P = 0.005), and low platelet counts (P < 0.0001) were predictive of digestive necrosis. Decision-making relying on CT alone or on a combined CT-endoscopy algorithm was similar and would have spared 19 unnecessary esophagectomies and 16 explorative laparotomies compared with an endoscopy-alone algorithm. Endoscopy never rectified an incorrect CT decision. Emergency decision-making after caustic injuries can rely on CT alone.

  4. A Modified Decision Tree Algorithm Based on Genetic Algorithm for Mobile User Classification Problem

    PubMed Central

    Liu, Dong-sheng; Fan, Shu-jiang

    2014-01-01

    In order to offer mobile customers better service, we should first classify the mobile users. To address the limitations of previous classification methods, this paper puts forward a modified decision tree algorithm for mobile user classification, which introduces a genetic algorithm to optimize the results of the decision tree algorithm. We also take context information as classification attributes for the mobile user, and we classify the context into public context and private context classes. We then analyze the processes and operators of the algorithm. Finally, we run an experiment on mobile user data with the algorithm; we can classify mobile users into Basic service user, E-service user, Plus service user, and Total service user classes, and we can also derive some rules about the mobile users. Compared to the C4.5 decision tree algorithm and the SVM algorithm, the algorithm proposed in this paper has higher accuracy and greater simplicity. PMID:24688389
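
    A sketch of the general hybrid idea, assuming scikit-learn trees, synthetic data and a toy genetic algorithm over two tree hyperparameters; the paper's own encoding, operators and context attributes are not reproduced.

      import random
      from sklearn.datasets import make_classification
      from sklearn.model_selection import cross_val_score
      from sklearn.tree import DecisionTreeClassifier

      X, y = make_classification(n_samples=400, n_features=8, random_state=0)
      rng = random.Random(0)

      def fitness(genome):
          depth, min_leaf = genome
          clf = DecisionTreeClassifier(max_depth=depth,
                                       min_samples_leaf=min_leaf,
                                       random_state=0)
          return cross_val_score(clf, X, y, cv=3).mean()

      pop = [(rng.randint(1, 12), rng.randint(1, 20)) for _ in range(12)]
      for _ in range(10):                       # generations
          pop.sort(key=fitness, reverse=True)
          parents = pop[:4]                     # truncation selection
          children = []
          while len(children) < len(pop) - len(parents):
              a, b = rng.sample(parents, 2)
              child = (a[0], b[1])              # one-point crossover
              if rng.random() < 0.3:            # mutation
                  child = (max(1, child[0] + rng.choice([-1, 1])), child[1])
              children.append(child)
          pop = parents + children
      print("best genome (max_depth, min_samples_leaf):", pop[0])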

  5. Aneurysmal subarachnoid hemorrhage prognostic decision-making algorithm using classification and regression tree analysis.

    PubMed

    Lo, Benjamin W Y; Fukuda, Hitoshi; Angle, Mark; Teitelbaum, Jeanne; Macdonald, R Loch; Farrokhyar, Forough; Thabane, Lehana; Levine, Mitchell A H

    2016-01-01

    Classification and regression tree analysis involves the creation of a decision tree by recursive partitioning of a dataset into more homogeneous subgroups. Thus far, there is scarce literature on using this technique to create clinical prediction tools for aneurysmal subarachnoid hemorrhage (SAH). The classification and regression tree analysis technique was applied to the multicenter Tirilazad database (3551 patients) in order to create the decision-making algorithm. In order to elucidate prognostic subgroups in aneurysmal SAH, neurologic, systemic, and demographic factors were taken into account. The dependent variable used for analysis was the dichotomized Glasgow Outcome Score at 3 months. Classification and regression tree analysis revealed seven prognostic subgroups. Neurological grade, occurrence of post-admission stroke, occurrence of post-admission fever, and age represented the explanatory nodes of this decision tree. Split sample validation revealed classification accuracy of 79% for the training dataset and 77% for the testing dataset. In addition, the occurrence of fever at 1-week post-aneurysmal SAH is associated with increased odds of post-admission stroke (odds ratio: 1.83, 95% confidence interval: 1.56-2.45, P < 0.01). A clinically useful classification tree was generated, which serves as a prediction tool to guide bedside prognostication and clinical treatment decision making. This prognostic decision-making algorithm also shed light on the complex interactions between a number of risk factors in determining outcome after aneurysmal SAH.
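
    CART-style recursive partitioning with split-sample validation can be sketched with scikit-learn. The data below are synthetic stand-ins shaped around the abstract's four explanatory nodes (grade, stroke, fever, age), not the Tirilazad database:

      import numpy as np
      from sklearn.model_selection import train_test_split
      from sklearn.tree import DecisionTreeClassifier, export_text

      rng = np.random.default_rng(0)
      n = 1000
      grade = rng.integers(1, 6, n)          # neurological grade 1-5
      stroke = rng.integers(0, 2, n)         # post-admission stroke
      fever = rng.integers(0, 2, n)          # post-admission fever
      age = rng.integers(20, 90, n)
      logit = -2.0 + 0.6 * grade + 1.2 * stroke + 0.7 * fever + 0.02 * age
      poor_outcome = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

      X = np.column_stack([grade, stroke, fever, age])
      X_tr, X_te, y_tr, y_te = train_test_split(X, poor_outcome,
                                                test_size=0.3, random_state=0)
      tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=50,
                                    random_state=0).fit(X_tr, y_tr)
      print("train accuracy:", tree.score(X_tr, y_tr))
      print("test accuracy:", tree.score(X_te, y_te))
      print(export_text(tree, feature_names=["grade", "stroke", "fever", "age"]))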

  6. Strategic Decision-Making Learning from Label Distributions: An Approach for Facial Age Estimation.

    PubMed

    Zhao, Wei; Wang, Han

    2016-06-28

    Nowadays, label distribution learning is among the state-of-the-art methodologies in facial age estimation. It takes the age of each facial image instance as a label distribution with a series of age labels rather than the single chronological age label that is commonly used. However, this methodology is deficient in its simple decision-making criterion: the final predicted age is only selected at the one with maximum description degree. In many cases, different age labels may have very similar description degrees. Consequently, blindly deciding the estimated age by virtue of the highest description degree would miss or neglect other valuable age labels that may contribute a lot to the final predicted age. In this paper, we propose a strategic decision-making label distribution learning algorithm (SDM-LDL) with a series of strategies specialized for different types of age label distribution. Experimental results from the most popular aging face database, FG-NET, show the superiority and validity of all the proposed strategic decision-making learning algorithms over the existing label distribution learning and other single-label learning algorithms for facial age estimation. The inner properties of SDM-LDL are further explored, revealing additional advantages.
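
    The criticised decision criterion and one alternative strategy can be contrasted in a few lines; the distribution below is invented, and the actual SDM-LDL strategies are more elaborate:

      def argmax_age(distribution):
          """Baseline rule the paper criticises: pick the single age
          with maximum description degree."""
          return max(distribution, key=distribution.get)

      def expected_age(distribution):
          """One alternative strategy: let every label contribute in
          proportion to its description degree."""
          return sum(age * d for age, d in distribution.items())

      # Two ages are almost equally well supported:
      dist = {24: 0.31, 25: 0.30, 26: 0.25, 27: 0.14}
      print(argmax_age(dist))    # 24 -- ignores the near-tie
      print(expected_age(dist))  # ~25.2 -- uses the whole distribution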

  8. Maximum-likelihood soft-decision decoding of block codes using the A* algorithm

    NASA Technical Reports Server (NTRS)

    Ekroot, L.; Dolinar, S.

    1994-01-01

    The A* algorithm finds the path in a finite depth binary tree that optimizes a function. Here, it is applied to maximum-likelihood soft-decision decoding of block codes where the function optimized over the codewords is the likelihood function of the received sequence given each codeword. The algorithm considers codewords one bit at a time, making use of the most reliable received symbols first and pursuing only the partially expanded codewords that might be maximally likely. A version of the A* algorithm for maximum-likelihood decoding of block codes has been implemented for block codes up to 64 bits in length. The efficiency of this algorithm makes simulations of codes up to length 64 feasible. This article details the implementation currently in use, compares the decoding complexity with that of exhaustive search and Viterbi decoding algorithms, and presents performance curves obtained with this implementation of the A* algorithm for several codes.
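
    A minimal sketch of the approach for a (7,4) Hamming code: bits are decided in order of reliability |r_i|, the path cost is the correlation of the decided bits with the received values, and the heuristic optimistically assumes every undecided bit can match the received sign, which keeps it admissible. The tiny codebook is checked directly; the article's length-64 implementation is far more refined.

      import heapq
      from itertools import product

      # Generator matrix of the (7,4) Hamming code.
      G = [[1, 0, 0, 0, 1, 1, 0],
           [0, 1, 0, 0, 0, 1, 1],
           [0, 0, 1, 0, 1, 1, 1],
           [0, 0, 0, 1, 1, 0, 1]]
      CODEBOOK = [tuple(sum(m[i] * G[i][j] for i in range(4)) % 2
                        for j in range(7)) for m in product([0, 1], repeat=4)]

      def a_star_decode(r):
          """Maximum-likelihood soft decoding: find the codeword c maximising
          sum_i r_i * (1 - 2*c_i), expanding the most reliable bits first."""
          order = sorted(range(7), key=lambda i: -abs(r[i]))
          heap = [(-sum(abs(x) for x in r), ())]   # (-(g + h), decided bits)
          while heap:
              neg_f, prefix = heapq.heappop(heap)
              if len(prefix) == 7:                 # first goal popped is optimal
                  cw = [0] * 7
                  for pos, bit in zip(order, prefix):
                      cw[pos] = bit
                  return tuple(cw)
              for bit in (0, 1):
                  new = prefix + (bit,)
                  # keep only prefixes extendable to a valid codeword
                  if not any(all(c[p] == b for p, b in zip(order, new))
                             for c in CODEBOOK):
                      continue
                  g = sum(r[p] * (1 - 2 * b) for p, b in zip(order, new))
                  h = sum(abs(r[p]) for p in order[len(new):])  # optimistic
                  heapq.heappush(heap, (-(g + h), new))

      # All-zero codeword sent as +1 symbols, observed through noise:
      received = [0.9, 1.1, -0.2, 0.8, 1.0, 0.1, 0.7]
      print(a_star_decode(received))  # -> (0, 0, 0, 0, 0, 0, 0)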

  9. Equipment Selection by using Fuzzy TOPSIS Method

    NASA Astrophysics Data System (ADS)

    Yavuz, Mahmut

    2016-10-01

    In this study, the Fuzzy TOPSIS method was applied to the selection of an open pit truck, and the optimal solution of the problem was investigated. Data from Turkish Coal Enterprises were used in the application of the method. This paper explains the Fuzzy TOPSIS approach with a group decision-making application in an open pit coal mine in Turkey. An algorithm for multi-person, multi-criteria decision making with a fuzzy set approach was applied to an equipment selection problem. It was found that Fuzzy TOPSIS with group decision making is a method that may help decision-makers solve different decision-making problems in mining.
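
    The crisp TOPSIS skeleton, which the fuzzy variant extends by replacing the ratings with fuzzy numbers and aggregating group judgments, looks as follows; the truck criteria, weights and ratings are invented:

      import numpy as np

      def topsis(X, w, benefit):
          """Rank alternatives (rows of X) by closeness to the ideal.
          w: criteria weights; benefit[j] True if larger-is-better."""
          R = X / np.linalg.norm(X, axis=0)         # vector normalisation
          V = R * w                                  # weighted matrix
          ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
          anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
          d_pos = np.linalg.norm(V - ideal, axis=1)
          d_neg = np.linalg.norm(V - anti, axis=1)
          return d_neg / (d_pos + d_neg)             # closeness coefficient

      # Three trucks rated on capacity (t), price (k$), availability (%):
      X = np.array([[90.0, 850, 95],
                    [100.0, 940, 90],
                    [85.0, 800, 97]])
      w = np.array([0.5, 0.3, 0.2])
      benefit = np.array([True, False, True])        # price is a cost criterion
      scores = topsis(X, w, benefit)
      print(scores, "best:", int(scores.argmax()))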

  10. Differences in spirometry interpretation algorithms: influence on decision making among primary-care physicians.

    PubMed

    He, Xiao-Ou; D'Urzo, Anthony; Jugovic, Pieter; Jhirad, Reuven; Sehgal, Prateek; Lilly, Evan

    2015-03-12

    Spirometry is recommended for the diagnosis of asthma and chronic obstructive pulmonary disease (COPD) in international guidelines and may be useful for distinguishing asthma from COPD. Numerous spirometry interpretation algorithms (SIAs) are described in the literature, but no studies highlight how different SIAs may influence the interpretation of the same spirometric data. We examined how two different SIAs may influence decision making among primary-care physicians. Data for this initiative were gathered from 113 primary-care physicians attending accredited workshops in Canada between 2011 and 2013. Physicians were asked to interpret nine spirograms presented twice in random sequence using two different SIAs and touch pad technology for anonymous data recording. We observed differences in the interpretation of spirograms using two different SIAs. When the pre-bronchodilator FEV1/FVC (forced expiratory volume in one second/forced vital capacity) ratio was >0.70, algorithm 1 led to a 'normal' interpretation (78% of physicians), whereas algorithm 2 prompted a bronchodilator challenge revealing changes in FEV1 that were consistent with asthma, an interpretation selected by 94% of physicians. When the FEV1/FVC ratio was <0.70 after bronchodilator challenge but FEV1 increased >12% and 200 ml, 76% suspected asthma and 10% suspected COPD using algorithm 1, whereas 74% suspected asthma versus COPD using algorithm 2 across five separate cases. The absence of a post-bronchodilator FEV1/FVC decision node in algorithm 1 did not permit consideration of possible COPD. This study suggests that differences in SIAs may influence decision making and lead clinicians to interpret the same spirometry data differently.
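
    The decision node whose absence differentiates the two SIAs can be sketched as below, using the thresholds quoted in the abstract (0.70 ratio; reversibility > 12% and 200 ml). This is an illustration of the logic only, not a validated clinical algorithm.

      def interpret(pre_ratio, post_ratio, fev1_pre_l, fev1_post_l):
          """Sketch of an SIA with a post-bronchodilator decision node
          (the element whose absence differentiates algorithm 1)."""
          gain = fev1_post_l - fev1_pre_l
          reversible = gain > 0.200 and gain / fev1_pre_l > 0.12
          if post_ratio < 0.70:
              return "asthma suspected" if reversible else "COPD suspected"
          if pre_ratio < 0.70 and reversible:
              return "asthma suspected"
          return "normal" if pre_ratio >= 0.70 else "indeterminate"

      print(interpret(pre_ratio=0.65, post_ratio=0.66,
                      fev1_pre_l=2.0, fev1_post_l=2.35))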

  11. A tuning algorithm for model predictive controllers based on genetic algorithms and fuzzy decision making.

    PubMed

    van der Lee, J H; Svrcek, W Y; Young, B R

    2008-01-01

    Model Predictive Control is a valuable tool for the process control engineer in a wide variety of applications. Because of this, the structure of an MPC can vary dramatically from application to application. There have been a number of works dedicated to MPC tuning for specific cases. Since MPCs can differ significantly, these tuning methods become inapplicable and a trial-and-error tuning approach must be used. This can be quite time-consuming and can result in non-optimum tuning. In an attempt to resolve this, a generalized automated tuning algorithm for MPCs was developed. This approach is numerically based and combines a genetic algorithm with multi-objective fuzzy decision-making. The key advantages to this approach are that genetic algorithms are not problem specific and only need to be adapted to account for the number and ranges of tuning parameters for a given MPC. As well, multi-objective fuzzy decision-making can handle qualitative statements of what optimum control is, in addition to being able to use multiple inputs to determine tuning parameters that best match the desired results. This is particularly useful for multi-input, multi-output (MIMO) cases where the definition of "optimum" control is subject to the opinion of the control engineer tuning the system. A case study is presented in order to illustrate the use of the tuning algorithm, including how different definitions of "optimum" control can arise and how they are accounted for in the multi-objective decision-making algorithm. The resulting tuning parameters from each of the definition sets are compared, showing that the tuning parameters vary in order to meet each definition of optimum control, and thus that the generalized automated tuning algorithm approach for tuning MPCs is feasible.
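
    The fuzzy decision-making half of such a tuner can be sketched with Bellman-Zadeh-style min-aggregation: each qualitative statement of good control becomes a membership function, and a candidate tuning's fitness is its worst membership. The objectives, breakpoints and values below are invented.

      def ramp_down(x, good, bad):
          """Membership 1 below `good`, falling linearly to 0 at `bad`."""
          if x <= good:
              return 1.0
          if x >= bad:
              return 0.0
          return (bad - x) / (bad - good)

      def fuzzy_fitness(overshoot_pct, settling_s, effort):
          """Aggregate several control objectives with the min operator:
          a candidate tuning is only as good as its worst objective."""
          return min(ramp_down(overshoot_pct, 2.0, 15.0),
                     ramp_down(settling_s, 5.0, 30.0),
                     ramp_down(effort, 0.5, 2.0))

      # Two candidate MPC tunings evaluated from closed-loop simulations:
      print(fuzzy_fitness(overshoot_pct=4.0, settling_s=8.0, effort=0.8))
      print(fuzzy_fitness(overshoot_pct=1.0, settling_s=28.0, effort=0.4))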

  13. Network-centric decision architecture for financial or 1/f data models

    NASA Astrophysics Data System (ADS)

    Jaenisch, Holger M.; Handley, James W.; Massey, Stoney; Case, Carl T.; Songy, Claude G.

    2002-12-01

    This paper presents a decision architecture algorithm for training neural-equation-based networks to make autonomous, multi-goal-oriented, multi-class decisions. These architectures make decisions based on their individual goals and draw from the same network-centric feature set. Traditionally, such architectures are composed of neural networks that offer marginal performance due to a lack of convergence on the training set. We present an approach for autonomously extracting sample points as I/O exemplars for the generation of multi-branch, multi-node decision architectures populated by adaptively derived neural equations. To test the robustness of this architecture, open-source data sets in the form of financial time series were used, requiring a three-class decision space analogous to the lethal, non-lethal, and clutter discrimination problem. This algorithm and the results of its application are presented here.

  14. Smart algorithms and adaptive methods in computational fluid dynamics

    NASA Astrophysics Data System (ADS)

    Tinsley Oden, J.

    1989-05-01

    A review is presented of the use of smart algorithms which employ adaptive methods in processing large amounts of data in computational fluid dynamics (CFD). Smart algorithms use a rationally based set of criteria for automatic decision making in an attempt to produce optimal simulations of complex fluid dynamics problems. The information needed to make these decisions is not known beforehand and evolves in structure and form during the numerical solution of flow problems. Once the code makes a decision based on the available data, the structure of the data may change, and criteria may be reapplied in order to direct the analysis toward an acceptable end. Intelligent decisions are made by processing vast amounts of data that evolve unpredictably during the calculation. The basic components of adaptive methods and their application to complex problems of fluid dynamics are reviewed. The basic components of adaptive methods are: (1) data structures, that is what approaches are available for modifying data structures of an approximation so as to reduce errors; (2) error estimation, that is what techniques exist for estimating error evolution in a CFD calculation; and (3) solvers, what algorithms are available which can function in changing meshes. Numerical examples which demonstrate the viability of these approaches are presented.

  16. Composite collective decision-making

    PubMed Central

    Czaczkes, Tomer J.; Czaczkes, Benjamin; Iglhaut, Carolin; Heinze, Jürgen

    2015-01-01

    Individual animals are adept at making decisions and have cognitive abilities, such as memory, which allow them to hone their decisions. Social animals can also share information. This allows social animals to make adaptive group-level decisions. Both individual and collective decision-making systems also have drawbacks and limitations, and while both are well studied, the interaction between them is still poorly understood. Here, we study how individual and collective decision-making interact during ant foraging. We first gathered empirical data on memory-based foraging persistence in the ant Lasius niger. We used these data to create an agent-based model where ants may use social information (trail pheromones), private information (memories) or both to make foraging decisions. The combined use of social and private information by individuals results in greater efficiency at the group level than when either information source was used alone. The modelled ants couple consensus decision-making, allowing them to quickly exploit high-quality food sources, and combined decision-making, allowing different individuals to specialize in exploiting different resource patches. Such a composite collective decision-making system reaps the benefits of both its constituent parts. Exploiting such insights into composite collective decision-making may lead to improved decision-making algorithms. PMID:26019155
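
    A minimal sketch of such an agent rule, with invented parameters: the probability of taking a branch grows with both its trail pheromone (social information) and the ant's own memory of success there (private information), via a Deneubourg-style choice function.

      import random

      def choose_branch(pheromone, memory, k=5.0, alpha=2.0, beta=1.0,
                        rng=random.Random(0)):
          """Probability of taking a branch rises with both trail pheromone
          (social) and the ant's own memory of past success (private):
          score = (k + pheromone)^alpha * (memory + 0.1)^beta."""
          scores = [((k + p) ** alpha) * ((m + 0.1) ** beta)
                    for p, m in zip(pheromone, memory)]
          r, acc = rng.random() * sum(scores), 0.0
          for branch, s in enumerate(scores):
              acc += s
              if r <= acc:
                  return branch
          return len(scores) - 1

      # Branch 0 carries more pheromone, but this ant remembers branch 1:
      picks = [choose_branch([8.0, 2.0], [0.1, 0.9]) for _ in range(1000)]
      print(picks.count(0), picks.count(1))  # private memory often wins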

  17. A probabilistic, distributed, recursive mechanism for decision-making in the brain

    PubMed Central

    Gurney, Kevin N.

    2018-01-01

    Decision formation recruits many brain regions, but the procedure they jointly execute is unknown. Here we characterize its essential composition, using as a framework a novel recursive Bayesian algorithm that makes decisions based on spike-trains with the statistics of those in sensory cortex (MT). Using it to simulate the random-dot-motion task, we demonstrate it quantitatively replicates the choice behaviour of monkeys, whilst predicting losses of otherwise usable information from MT. Its architecture maps to the recurrent cortico-basal-ganglia-thalamo-cortical loops, whose components are all implicated in decision-making. We show that the dynamics of its mapped computations match those of neural activity in the sensorimotor cortex and striatum during decisions, and forecast those of basal ganglia output and thalamus. This also predicts which aspects of neural dynamics are and are not part of inference. Our single-equation algorithm is probabilistic, distributed, recursive, and parallel. Its success at capturing anatomy, behaviour, and electrophysiology suggests that the mechanism implemented by the brain has these same characteristics. PMID:29614077
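
    The skeleton of such a recursive Bayesian decision rule can be sketched for two motion hypotheses and Poisson spike counts: each time step multiplies the posterior by the likelihood of the observed counts and renormalises, and the decision is taken when the posterior crosses a threshold. The rates and threshold are invented, and this generative model is far simpler than the paper's MT spike-train statistics.

      import math
      import random

      def rng_poisson(rng, lam):
          """Knuth's multiplicative Poisson sampler."""
          L, k, p = math.exp(-lam), 0, 1.0
          while True:
              p *= rng.random()
              if p <= L:
                  return k
              k += 1

      def poisson_pmf(k, lam):
          return math.exp(-lam) * lam ** k / math.factorial(k)

      def decide(rates, true_dir, threshold=0.99, seed=0):
          """Recursive Bayes over hypotheses 'left'/'right'; rates[h][n] is
          the expected spike count of neuron n per step under hypothesis h."""
          rng = random.Random(seed)
          post = {"left": 0.5, "right": 0.5}
          for t in range(1, 1001):
              counts = [rng_poisson(rng, lam) for lam in rates[true_dir]]
              for h in post:                  # multiply in step likelihood
                  for k, lam in zip(counts, rates[h]):
                      post[h] *= poisson_pmf(k, lam)
              z = sum(post.values())
              post = {h: p / z for h, p in post.items()}
              if max(post.values()) > threshold:
                  return max(post, key=post.get), t
          return max(post, key=post.get), t

      rates = {"left": [4.0, 1.0], "right": [1.0, 4.0]}
      print(decide(rates, "right"))   # e.g. ('right', 3)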

  18. HUMAN DECISIONS AND MACHINE PREDICTIONS.

    PubMed

    Kleinberg, Jon; Lakkaraju, Himabindu; Leskovec, Jure; Ludwig, Jens; Mullainathan, Sendhil

    2018-02-01

    Can machine learning improve human decision making? Bail decisions provide a good test case. Millions of times each year, judges make jail-or-release decisions that hinge on a prediction of what a defendant would do if released. The concreteness of the prediction task combined with the volume of data available makes this a promising machine-learning application. Yet comparing the algorithm to judges proves complicated. First, the available data are generated by prior judge decisions. We only observe crime outcomes for released defendants, not for those judges detained. This makes it hard to evaluate counterfactual decision rules based on algorithmic predictions. Second, judges may have a broader set of preferences than the variable the algorithm predicts; for instance, judges may care specifically about violent crimes or about racial inequities. We deal with these problems using different econometric strategies, such as quasi-random assignment of cases to judges. Even accounting for these concerns, our results suggest potentially large welfare gains: one policy simulation shows crime reductions up to 24.7% with no change in jailing rates, or jailing rate reductions up to 41.9% with no increase in crime rates. Moreover, all categories of crime, including violent crimes, show reductions; and these gains can be achieved while simultaneously reducing racial disparities. These results suggest that while machine learning can be valuable, realizing this value requires integrating these tools into an economic framework: being clear about the link between predictions and decisions; specifying the scope of payoff functions; and constructing unbiased decision counterfactuals. JEL Codes: C10 (Econometric and statistical methods and methodology), C55 (Large datasets: Modeling and analysis), K40 (Legal procedure, the legal system, and illegal behavior).

  20. Algorithm of choosing type of mechanical assembly production of instrument making enterprises of Industry 4.0

    NASA Astrophysics Data System (ADS)

    Zakoldaev, D. A.; Shukalov, A. V.; Zharinov, I. O.; Zharinov, O. O.

    2018-05-01

    The problem of choosing the type of mechanical assembly production for instrument-making enterprises of Industry 4.0 is studied, and two project algorithms, one for Industry 3.0 and one for Industry 4.0, are compared. The algorithm for choosing the type of mechanical assembly production for instrument-making enterprises of Industry 4.0 is based on an analysis of the technological route of the manufacturing process in a company equipped with cyber-physical systems. The algorithm may yield project solutions selected from either the primary or the auxiliary part of the production. The algorithm's decision rules are based on an optimality criterion.

  1. Reinforcement Learning for Weakly-Coupled MDPs and an Application to Planetary Rover Control

    NASA Technical Reports Server (NTRS)

    Bernstein, Daniel S.; Zilberstein, Shlomo

    2003-01-01

    Weakly-coupled Markov decision processes can be decomposed into subprocesses that interact only through a small set of bottleneck states. We study a hierarchical reinforcement learning algorithm designed to take advantage of this particular type of decomposability. To test our algorithm, we use a decision-making problem faced by autonomous planetary rovers. In this problem, a Mars rover must decide which activities to perform and when to traverse between science sites in order to make the best use of its limited resources. In our experiments, the hierarchical algorithm performs better than Q-learning in the early stages of learning, but unlike Q-learning it converges to a suboptimal policy. This suggests that it may be advantageous to use the hierarchical algorithm when training time is limited.
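
    The flat Q-learning baseline the hierarchical algorithm is compared against can be sketched on a toy chain MDP (the rover domain itself is not reproduced); the states, rewards and hyperparameters below are invented:

      import random

      def q_learning(n_states=6, n_actions=2, episodes=500,
                     alpha=0.1, gamma=0.95, eps=0.1, seed=0):
          """Tabular Q-learning on a toy chain: action 1 moves right,
          action 0 moves left; only the rightmost state pays reward."""
          rng = random.Random(seed)
          Q = [[0.0] * n_actions for _ in range(n_states)]
          for _ in range(episodes):
              s = 0
              for _ in range(50):                    # step limit per episode
                  a = (rng.randrange(n_actions) if rng.random() < eps
                       else max(range(n_actions), key=lambda x: Q[s][x]))
                  s2 = min(s + 1, n_states - 1) if a == 1 else max(s - 1, 0)
                  r = 1.0 if s2 == n_states - 1 else 0.0
                  Q[s][a] += alpha * (r + gamma * max(Q[s2]) - Q[s][a])
                  s = s2
                  if r > 0:
                      break
          return Q

      for s, row in enumerate(q_learning()):
          print(s, [round(q, 2) for q in row])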

  2. Application of majority voting and consensus voting algorithms in N-version software

    NASA Astrophysics Data System (ADS)

    Tsarev, R. Yu; Durmuş, M. S.; Üstoglu, I.; Morozov, V. A.

    2018-05-01

    N-version programming is one of the most common techniques used to improve the reliability of software by building in fault tolerance and redundancy and by decreasing common-cause failures. N different equivalent software versions are developed by N different and isolated workgroups from the same software specifications. The versions solve the same task and return results that have to be compared to determine the correct result. The decisions of the N versions are evaluated by a voting algorithm, the so-called voter. In this paper, two of the most commonly used software voting algorithms, the majority voting algorithm and the consensus voting algorithm, are studied. The distinctive features of N-version programming with majority voting and N-version programming with consensus voting are described. Both algorithms decide on the correct result on the basis of the agreement matrix. However, if the equivalence relation on the agreement matrix is not satisfied, it is impossible to make a decision. It is shown that the agreement matrix can be transformed, using Boolean compositions, into an appropriate form in which the equivalence relation is satisfied.
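
    A minimal sketch of the two voters operating directly on raw version outputs (assuming exact-equality comparison rather than the paper's agreement-matrix formulation):

```python
from collections import Counter

def majority_vote(outputs, n_versions):
    """Accept a result iff more than half of the N versions agree on it."""
    value, count = Counter(outputs).most_common(1)[0]
    return value if count > n_versions / 2 else None  # None: no decision

def consensus_vote(outputs):
    """Accept the result of the largest agreeing group (plurality),
    even if it falls short of an absolute majority."""
    value, _ = Counter(outputs).most_common(1)[0]
    return value

versions = [3.14, 3.14, 2.71, 3.14, 9.99]   # outputs of N = 5 versions
print(majority_vote(versions, 5))  # 3.14 (3 of 5 agree)
print(consensus_vote(versions))    # 3.14
```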

  3. Improved hybridization of Fuzzy Analytic Hierarchy Process (FAHP) algorithm with Fuzzy Multiple Attribute Decision Making - Simple Additive Weighting (FMADM-SAW)

    NASA Astrophysics Data System (ADS)

    Zaiwani, B. E.; Zarlis, M.; Efendi, S.

    2018-03-01

    Earlier research hybridized the Fuzzy Analytic Hierarchy Process (FAHP) with the Fuzzy Technique for Order Preference by Similarity to Ideal Solution (FTOPSIS) to select the best bank chief inspector based on several qualitative and quantitative criteria with various priorities. To improve on that work, this research adopts a hybridization of FAHP with the Fuzzy Multiple Attribute Decision Making - Simple Additive Weighting (FMADM-SAW) algorithm, applying FAHP to the weighting process and SAW to the ranking process in order to determine the promotion of employees at a government institution. The improved average Efficiency Rate (ER) is 85.24%, compared with 77.82% in the previous research. Keywords: Ranking and Selection, Fuzzy AHP, Fuzzy TOPSIS, FMADM-SAW.
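
    The SAW ranking step is straightforward to illustrate. A minimal sketch follows, with criterion weights standing in for an FAHP weighting stage; all names and numbers are made up.

```python
# Simple Additive Weighting (SAW): normalise each criterion column,
# then rank alternatives by the weighted sum of normalised scores.
# Weights here stand in for an FAHP weighting stage; values are made up.
weights = [0.5, 0.3, 0.2]                 # e.g. produced by an FAHP step
candidates = {
    "A": [80, 70, 90],
    "B": [90, 60, 70],
    "C": [70, 95, 80],
}

# benefit-criterion normalisation: score / column maximum
maxima = [max(col) for col in zip(*candidates.values())]
scores = {
    name: sum(w * v / m for w, v, m in zip(weights, vals, maxima))
    for name, vals in candidates.items()
}
for name, s in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(name, round(s, 4))
```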

  4. A novel clinical decision support algorithm for constructing complete medication histories.

    PubMed

    Long, Ju; Yuan, Michael Juntao

    2017-07-01

    A patient's complete medication history is a crucial element for physicians to develop a full understanding of the patient's medical conditions and treatment options. However, due to the fragmented nature of medical data, assembling this history can be very time-consuming, and it is often impossible for physicians to construct a complete medication history for complex patients. In this paper, we describe an accurate, computationally efficient and scalable algorithm to construct a medication history timeline. The algorithm is developed and validated on 1 million random prescription records from a large national prescription data aggregator. Our evaluation shows that the algorithm can be scaled horizontally on demand, making it suitable for future delivery in a cloud-computing environment. We also propose that this cloud-based medication history computation algorithm could be integrated into Electronic Medical Records, enabling informed clinical decision-making at the point of care. Copyright © 2017 Elsevier B.V. All rights reserved.
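
    The abstract does not spell out the algorithm; one plausible core step of any such timeline builder, merging a drug's prescription fills into continuous medication episodes, might be sketched as follows (the record format and the grace period are assumptions, not the paper's design).

```python
from datetime import date, timedelta

# Hypothetical core step of a medication-history timeline: merge one
# drug's prescription fill intervals into continuous "episodes".
prescriptions = [
    ("metformin", date(2017, 1, 1), 30),   # (drug, fill date, days supply)
    ("metformin", date(2017, 1, 28), 30),
    ("metformin", date(2017, 6, 1), 30),
]

GRACE = timedelta(days=7)  # assumed tolerance for small refill gaps

def episodes(records):
    spans = sorted((d, d + timedelta(days=n)) for _, d, n in records)
    merged = [list(spans[0])]
    for start, end in spans[1:]:
        if start <= merged[-1][1] + GRACE:   # overlaps or near-continuous
            merged[-1][1] = max(merged[-1][1], end)
        else:
            merged.append([start, end])
    return merged

print(episodes(prescriptions))
# -> two episodes: continuous Jan-Feb use, then a separate June episode
```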

  5. Decision theory, reinforcement learning, and the brain.

    PubMed

    Dayan, Peter; Daw, Nathaniel D

    2008-12-01

    Decision making is a core competence for animals and humans acting and surviving in environments they only partially comprehend, gaining rewards and punishments for their troubles. Decision-theoretic concepts permeate experiments and computational models in ethology, psychology, and neuroscience. Here, we review a well-known, coherent Bayesian approach to decision making, showing how it unifies issues in Markovian decision problems, signal detection psychophysics, sequential sampling, and optimal exploration, and we discuss paradigmatic psychological and neural examples of each problem. We discuss computational issues concerning what subjects know about their task and how ambitious they are in seeking optimal solutions; we address algorithmic topics concerning model-based and model-free methods for making choices; and we highlight key aspects of the neural implementation of decision making.

  6. Patterns of out-of-home placement decision-making in child welfare.

    PubMed

    Chor, Ka Ho Brian; McClelland, Gary M; Weiner, Dana A; Jordan, Neil; Lyons, John S

    2013-10-01

    Out-of-home placement decision-making in child welfare is founded on the best interest of the child in the least restrictive setting. After a child is removed from home, however, little is known about the mechanism of placement decision-making. This study aims to systematically examine the patterns of out-of-home placement decisions made in a state's child welfare system by comparing two models of placement decision-making: a multidisciplinary team decision-making model and a clinically based decision support algorithm. Based on records of 7816 placement decisions representing 6096 children over a 4-year period, hierarchical log-linear modeling characterized concordance or agreement, and discordance or disagreement, when comparing the two models and accounting for age-appropriate placement options. Children aged below 16 had an overall concordance rate of 55.7%, most apparent in the least restrictive (20.4%) and the most restrictive placement (18.4%). Older youth showed greater discordant distributions (62.9%). Log-linear analysis confirmed the overall robustness of concordance (odds ratios [ORs] range: 2.9-442.0), though discordance was most evident in small deviations from the decision support algorithm, such as one-level under-placement in group home (OR=5.3) and one-level over-placement in residential treatment center (OR=4.8). Concordance should be further explored using child-level clinical and placement stability outcomes. Discordance might be explained by dynamic factors such as availability of placements, caregiver preferences, or policy changes and could be justified by positive child-level outcomes. Empirical placement decision-making is critical to a child's journey in child welfare and should be continuously improved to effect positive child welfare outcomes. Copyright © 2013 Elsevier Ltd. All rights reserved.

  7. Prospective Analysis of Decision Making During Joint Cardiology Cardiothoracic Conference in Treatment of 107 Consecutive Children with Congenital Heart Disease.

    PubMed

    Duignan, Sophie; Ryan, Aedin; O'Keeffe, Dara; Kenny, Damien; McMahon, Colin J

    2018-05-12

    The complexity and potential biases involved in decision making have long been recognised and examined in both the aviation and business industries. More recently, the medical community has started to explore this concept and its particular importance in our field. Paediatric cardiology is a rapidly expanding field, and for many of the conditions we treat there is limited evidence available to support our decision-making. Variability exists within decision-making in paediatric cardiology, and this may influence outcomes. There are no validated tools available to support and examine consistent decision-making for various treatment strategies in children with congenital heart disease in a multidisciplinary cardiology and cardiothoracic institution. Our primary objective was to analyse the complexity of decision-making for children with cardiac conditions in the context of our joint cardiology and cardiothoracic conference (JCC). Two paediatric cardiologists acted as investigators by observing the weekly JCC and prospectively evaluating the degree of complexity of decision-making in the management of 107 sequential children with congenital heart disease discussed there. Additionally, the group consensus on the same patients was prospectively assessed for comparison with the independent observers. Of the 107 consecutive children discussed at our JCC, 32 (27%) went on to receive surgical intervention, 20 (17%) underwent catheterisation and 65 (56%) received medical treatment. There were 53 (50%) cases rated as simple by one senior observer, while 54 (50%) were rated as complex to some degree. There was high inter-observer agreement, with a Krippendorff's alpha of ≥ 0.8 between the 2 observers and between the 2 observers and the group consensus as a whole, for grading of the complexity of decision-making. Different decisions were occasionally made on patients with the same data set. Discussions revisiting the same patient, in complex cases, resulted in different management decisions being reached in this series. Anchoring of decision-making was witnessed in certain cases. The potential application of decision-making algorithms to paediatric cardiology patients is discussed. Decision-making in our institution's JCC proved to be complex in approximately half of our patients. Inconsistency in decision-making for patients with the same diagnosis, and different decisions made for the same complex patient at different time points, confound the reliability of the decision-making process. These novel data highlight the absence of evidence-based medicine for many decisions, an occasional lack of consistency, and the impact of anchoring, heuristics and other biases in complex cases. Validated decision-making algorithms may assist in providing consistency to decision-making in this setting.

  8. Bayesian design of decision rules for failure detection

    NASA Technical Reports Server (NTRS)

    Chow, E. Y.; Willsky, A. S.

    1984-01-01

    The formulation of the decision making process of a failure detection algorithm as a Bayes sequential decision problem provides a simple conceptualization of the decision rule design problem. As the optimal Bayes rule is not computable, a methodology that is based on the Bayesian approach and aimed at a reduced computational requirement is developed for designing suboptimal rules. A numerical algorithm is constructed to facilitate the design and performance evaluation of these suboptimal rules. The result of applying this design methodology to an example shows that this approach is potentially a useful one.
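
    The paper's suboptimal rules are not specified in the abstract; a classic sequential decision rule in the same spirit is Wald's SPRT, sketched here for detecting a mean shift in Gaussian residuals (all parameters are illustrative assumptions).

```python
import math, random

# Wald's sequential probability ratio test (SPRT): accumulate
# log-likelihood ratios until a decision threshold is crossed.
MU0, MU1, SIGMA = 0.0, 1.0, 1.0        # nominal vs failed residual mean
ALPHA, BETA = 0.01, 0.01               # target error probabilities
A = math.log((1 - BETA) / ALPHA)       # upper (declare failure) threshold
B = math.log(BETA / (1 - ALPHA))       # lower (declare nominal) threshold

def sprt(observations):
    llr = 0.0
    for t, x in enumerate(observations, 1):
        # Gaussian log-likelihood ratio of H1 (failed) vs H0 (nominal)
        llr += ((MU1 - MU0) * x - (MU1**2 - MU0**2) / 2) / SIGMA**2
        if llr >= A:
            return "failure", t
        if llr <= B:
            return "nominal", t
    return "undecided", len(observations)

random.seed(0)
faulty = [random.gauss(MU1, SIGMA) for _ in range(200)]
print(sprt(faulty))   # typically declares "failure" after a few samples
```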

  9. Development of Decision-Making Automated System for Optimal Placement of Physical Access Control System’s Elements

    NASA Astrophysics Data System (ADS)

    Danilova, Olga; Semenova, Zinaida

    2018-04-01

    The objective of this study is a detailed analysis of the development of physical protection systems for information resources. The mathematical apparatus of optimization and decision-making theory is used to correctly formulate the problem and to create a selection algorithm for the optimal configuration of a security system, taking into account the locations of the secured object's access points and zones. The result of this study is a software implementation scheme of a decision-making system for the optimal placement of the physical access control system's elements.

  10. Composite collective decision-making.

    PubMed

    Czaczkes, Tomer J; Czaczkes, Benjamin; Iglhaut, Carolin; Heinze, Jürgen

    2015-06-22

    Individual animals are adept at making decisions and have cognitive abilities, such as memory, which allow them to hone their decisions. Social animals can also share information. This allows social animals to make adaptive group-level decisions. Both individual and collective decision-making systems also have drawbacks and limitations, and while both are well studied, the interaction between them is still poorly understood. Here, we study how individual and collective decision-making interact during ant foraging. We first gathered empirical data on memory-based foraging persistence in the ant Lasius niger. We used these data to create an agent-based model where ants may use social information (trail pheromones), private information (memories) or both to make foraging decisions. The combined use of social and private information by individuals results in greater efficiency at the group level than when either information source was used alone. The modelled ants couple consensus decision-making, allowing them to quickly exploit high-quality food sources, and combined decision-making, allowing different individuals to specialize in exploiting different resource patches. Such a composite collective decision-making system reaps the benefits of both its constituent parts. Exploiting such insights into composite collective decision-making may lead to improved decision-making algorithms. © 2015 The Author(s) Published by the Royal Society. All rights reserved.
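
    A toy version of an agent rule combining the two information sources, not the paper's fitted model, could look like this (the weighting scheme and all numbers are illustrative):

```python
import random

# Toy ant choosing between two feeders using both social information
# (pheromone on each trail) and private information (memory of past
# success). The linear blend and the numbers are illustrative only.
def choose(pheromone, memory, w_social=0.5):
    # blend the two information sources into a preference per option
    pref = [w_social * p + (1 - w_social) * m
            for p, m in zip(pheromone, memory)]
    r = random.uniform(0, sum(pref))    # roulette-wheel selection
    return 0 if r < pref[0] else 1

random.seed(1)
pheromone = [5.0, 1.0]   # trail 0 is socially popular
memory = [0.1, 0.9]      # but this ant remembers success at feeder 1
counts = [0, 0]
for _ in range(10000):
    counts[choose(pheromone, memory)] += 1
print(counts)            # both sources shape the choice distribution
```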

  11. Helping parents cope with crying babies: decision-making and interaction at NHS Direct.

    PubMed

    Smith, Suzanne

    2010-02-01

    This paper is a report of a study of how nurses at a national telephone triage centre in England (NHS Direct) make different use of the algorithms and organizational protocols to make decisions and give advice to parents with crying babies, how their clinical knowledge and experience influences these decisions, and the techniques used to enhance parental coping ability. Parents of persistently crying babies state that they need to be listened to, understood, believed and reassured to help them cope. Nurses at NHS Direct use their clinical judgement in decision-making, and see the software as a guide that can be both valuable and problematic. The study design was influenced by grounded theory and incorporated discourse and thematic analysis. It had two phases involving data collection and analysis over the period 2002-2006. A theoretical sample of 11 calls was analysed and later a focus group of six nurses at the same site. NHS Direct nurses used the 'crying baby' algorithm in various ways, influenced by their experience and confidence to use the algorithm to support their clinical knowledge. Its medical elements were regarded as safe but its non-medical elements, including questions about the likelihood of shaking a child, were treated differently. Nurses were reluctant to deviate from the algorithm when dealing with child-focused calls. However, this reluctance did not apply when they were prompted to ask the caller if they felt that they were reaching a point where they might shake their baby, or when prompted to give related advice.

  12. Syndromic treatment of gonococcal and chlamydial infections in women seeking primary care for the genital discharge syndrome: decision-making.

    PubMed Central

    Behets, F. M.; Miller, W. C.; Cohen, M. S.

    2001-01-01

    The syndromic treatment of gonococcal and chlamydial infections in women seeking primary care in clinics where resources are scarce, as recommended by WHO and implemented in many developing countries, necessitates a balance to be struck between overtreatment and undertreatment. The present paper identifies factors that are relevant to the selection of specific strategies for syndromic treatment in the above circumstances. Among them are the general aspects of decision-making and caveats concerning the rational decision-making approach. The positive and negative implications are outlined of providing or withholding treatment following a specific algorithm with a given accuracy to detect infection, i.e. sensitivity, specificity and predictive values. Other decision-making considerations that are identified are related to implementation and include the stability of risk factors with regard to time, space and the implementer, acceptability by stakeholders, and environmental constraints. There is a need to consider empirically developed treatment algorithms as a basis for policy discourse, to be evaluated together with the evidence, alternatives and arguments by the stakeholders. PMID:11731816
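
    The overtreatment/undertreatment balance turns on the predictive values; a short worked sketch of how they follow from sensitivity, specificity and prevalence (numbers purely illustrative):

```python
def predictive_values(sens, spec, prev):
    """Positive/negative predictive value of a diagnostic algorithm."""
    ppv = sens * prev / (sens * prev + (1 - spec) * (1 - prev))
    npv = spec * (1 - prev) / (spec * (1 - prev) + (1 - sens) * prev)
    return ppv, npv

# Illustrative numbers: an algorithm with 80% sensitivity and 60%
# specificity applied where 20% of women presenting are infected.
ppv, npv = predictive_values(0.80, 0.60, 0.20)
print(f"PPV={ppv:.2f}  NPV={npv:.2f}")   # PPV=0.33  NPV=0.92
# only one in three treated is actually infected: the overtreatment
# cost of a low-specificity algorithm at modest prevalence
```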

  13. Highlights from the 17th International Conference on Multi-Criteria Decision Making, Whistler, BC, August 6-11, 2004

    DTIC Science & Technology

    2005-04-01

    related to one of the following areas: 1. Group Decision Support Methods; 2. Decision Support Methods; 3. AHP applications; 4. Multi-Objective Linear Programming (MOLP) algorithms; 5. Industrial engineering applications; 6. Behavioural considerations, and 7. Fuzzy MCDM. … making. This is especially important when using software like AHP or when constructing questionnaires for SMEs (see [10] for many examples …

  14. Value-based decision-making battery: A Bayesian adaptive approach to assess impulsive and risky behavior.

    PubMed

    Pooseh, Shakoor; Bernhardt, Nadine; Guevara, Alvaro; Huys, Quentin J M; Smolka, Michael N

    2018-02-01

    Using simple mathematical models of choice behavior, we present a Bayesian adaptive algorithm to assess measures of impulsive and risky decision making. Practically, these measures are characterized by discounting rates and are used to classify individuals or population groups, to distinguish unhealthy behavior, and to predict developmental courses. However, a constant demand for improved tools to assess these constructs remains unanswered. The algorithm is based on trial-by-trial observations. At each step, a choice is made between immediate (certain) and delayed (risky) options. Then the current parameter estimates are updated by the likelihood of observing the choice, and the next offers are provided from the indifference point, so that they will acquire the most informative data based on the current parameter estimates. The procedure continues for a certain number of trials in order to reach a stable estimation. The algorithm is discussed in detail for the delay discounting case, and results from decision making under risk for gains, losses, and mixed prospects are also provided. Simulated experiments using prescribed parameter values were performed to justify the algorithm in terms of the reproducibility of its parameters for individual assessments, and to test the reliability of the estimation procedure in a group-level analysis. The algorithm was implemented as an experimental battery to measure temporal and probability discounting rates together with loss aversion, and was tested on a healthy participant sample.
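
    A grid-based sketch of the trial-by-trial idea for the delay-discounting case follows; the hyperbolic value function is standard, but the softmax likelihood and all numbers are assumptions rather than the battery's actual settings.

```python
import math, random

# Sketch of adaptive estimation of a hyperbolic discount rate k:
# keep a posterior over a grid of k values, offer the next immediate
# amount at the current indifference point, update by choice likelihood.
GRID = [10 ** (i / 10) for i in range(-40, 11)]   # k from 1e-4 to 10
posterior = [1 / len(GRID)] * len(GRID)
REWARD, DELAY, TEMP = 100.0, 30.0, 0.2            # delayed option, days

def p_immediate(k, amount):
    """Softmax probability of taking `amount` now over REWARD in DELAY days."""
    v_delayed = REWARD / (1 + k * DELAY)          # hyperbolic discounting
    return 1 / (1 + math.exp(-(amount - v_delayed) / (TEMP * REWARD)))

random.seed(7)
true_k = 0.05                                     # simulated participant
for trial in range(60):
    k_hat = sum(k * p for k, p in zip(GRID, posterior))   # posterior mean
    offer = REWARD / (1 + k_hat * DELAY)                  # indifference point
    chose_now = random.random() < p_immediate(true_k, offer)
    # Bayes update over the grid
    like = [p_immediate(k, offer) if chose_now else 1 - p_immediate(k, offer)
            for k in GRID]
    posterior = [l * p for l, p in zip(like, posterior)]
    z = sum(posterior)
    posterior = [p / z for p in posterior]

# posterior mean: typically close to true_k = 0.05 after 60 trials
print(round(sum(k * p for k, p in zip(GRID, posterior)), 3))
```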

  15. Remote Sensing Applications to Water Quality Management in Florida

    NASA Astrophysics Data System (ADS)

    Lehrter, J. C.; Schaeffer, B. A.; Hagy, J.; Spiering, B.; Barnes, B.; Hu, C.; Le, C.; McEachron, L.; Underwood, L. W.; Ellis, C.; Fisher, B.

    2013-12-01

    Optical datasets from estuarine and coastal systems are increasingly available for remote sensing algorithm development, validation, and application. With validated algorithms, the data streams from satellite sensors can provide unprecedented spatial and temporal data for local and regional coastal water quality management. Our presentation will highlight two recent applications of optical data and remote sensing to water quality decision-making in coastal regions of the state of Florida: (1) informing the development of estuarine and coastal nutrient criteria for the state of Florida and (2) informing the rezoning of the Florida Keys National Marine Sanctuary. These efforts involved building up the underlying science to demonstrate the applicability of satellite data as well as an outreach component to educate decision-makers about the use, utility, and uncertainties of remote sensing data products. Scientific developments included testing existing algorithms and generating new algorithms for water clarity and chlorophyll-a in case II (CDOM- or turbidity-dominated) estuarine and coastal waters, and demonstrating the accuracy of remote sensing data products in comparison to traditional field-based measurements. Including members from decision-making organizations on the research team and interacting with decision-makers early and often in the process were key factors in the success of the outreach efforts and the eventual adoption of satellite data into the data records and analyses used in decision-making. [Figure captions: Florida coastal water bodies (black boxes) for which remote sensing imagery was applied to derive numeric nutrient criteria, and in situ observations (black dots) used to validate imagery; Florida ocean color applied to the development of numeric nutrient criteria.]

  16. Reinforcement learning and decision making in monkeys during a competitive game.

    PubMed

    Lee, Daeyeol; Conroy, Michelle L; McGreevy, Benjamin P; Barraclough, Dominic J

    2004-12-01

    Animals living in a dynamic environment must adjust their decision-making strategies through experience. To gain insights into the neural basis of such adaptive decision-making processes, we trained monkeys to play a competitive game against a computer in an oculomotor free-choice task. The animal selected one of two visual targets in each trial and was rewarded only when it selected the same target as the computer opponent. To determine how the animal's decision-making strategy can be affected by the opponent's strategy, the computer opponent was programmed with three different algorithms that exploited different aspects of the animal's choice and reward history. When the computer selected its targets randomly with equal probabilities, animals selected one of the targets more often, violating the prediction of probability matching, and their choices were systematically influenced by the choice history of the two players. When the computer exploited only the animal's choice history but not its reward history, animal's choice became more independent of its own choice history but was still related to the choice history of the opponent. This bias was substantially reduced, but not completely eliminated, when the computer used the choice history of both players in making its predictions. These biases were consistent with the predictions of reinforcement learning, suggesting that the animals sought optimal decision-making strategies using reinforcement learning algorithms.
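
    A minimal sketch of the reinforcement-learning account, a softmax learner updating action values against an opponent that exploits its recent choice history, might look like this (the parameters and the opponent rule are illustrative assumptions, not the paper's fitted models).

```python
import math, random

# Softmax reinforcement learner in a matching-pennies-style game: the
# learner is rewarded for matching the computer's target, while the
# computer predicts the learner's recent modal choice and avoids it.
ALPHA, BETA = 0.2, 3.0          # learning rate, softmax inverse temperature
values = [0.0, 0.0]
history, reward_total = [], 0.0
random.seed(1)

def softmax_choice(v):
    p0 = 1 / (1 + math.exp(-BETA * (v[0] - v[1])))
    return 0 if random.random() < p0 else 1

for trial in range(5000):
    choice = softmax_choice(values)
    # opponent exploits choice history: predict modal recent choice,
    # then pick the other target so the learner fails to match
    recent = history[-20:]
    predicted = 1 if sum(recent) > len(recent) / 2 else 0
    computer = 1 - predicted
    r = 1.0 if choice == computer else 0.0
    values[choice] += ALPHA * (r - values[choice])   # Rescorla-Wagner update
    history.append(choice)
    reward_total += r

# the learner is driven toward balanced play and ~0.5 reward per trial
print(values, reward_total / 5000)
```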

  17. A Simulation Tool for Distributed Databases.

    DTIC Science & Technology

    1981-09-01

    11-8 . Reed’s multiversion system [RE1T8] may also be viewed aa updating only copies until the commit is made. The decision to make the changes...distributed voting, and Ellis’ ring algorithm. Other, significantly different algorithms not covered in his work include Reed’s multiversion algorithm, the

  18. 78 FR 54502 - Self-Regulatory Organizations; Financial Industry Regulatory Authority, Inc.; Notice of Filing of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-04

    ... transactions. Transactions that originate from unrelated algorithms or separate and distinct trading strategies... transactions were undertaken for manipulative or other fraudulent purposes. Algorithms or trading strategies... activity and the use of algorithms by firms to make trading decisions, FINRA has observed an increase in...

  19. Toward detecting deception in intelligent systems

    NASA Astrophysics Data System (ADS)

    Santos, Eugene, Jr.; Johnson, Gregory, Jr.

    2004-08-01

    Contemporary decision makers often must choose a course of action using knowledge from several sources. Knowledge may be provided from many diverse sources including electronic sources such as knowledge-based diagnostic or decision support systems or through data mining techniques. As the decision maker becomes more dependent on these electronic information sources, detecting deceptive information from these sources becomes vital to making a correct, or at least more informed, decision. This applies to unintentional disinformation as well as intentional misinformation. Our ongoing research focuses on employing models of deception and deception detection from the fields of psychology and cognitive science to these systems as well as implementing deception detection algorithms for probabilistic intelligent systems. The deception detection algorithms are used to detect, classify and correct attempts at deception. Algorithms for detecting unexpected information rely upon a prediction algorithm from the collaborative filtering domain to predict agent responses in a multi-agent system.

  20. Evaluation of an Agricultural Meteorological Disaster Based on Multiple Criterion Decision Making and Evolutionary Algorithm

    PubMed Central

    Yu, Xiaobing; Yu, Xianrui; Lu, Yiqun

    2018-01-01

    The evaluation of a meteorological disaster can be regarded as a multiple-criteria decision making problem because it involves many indexes. Firstly, a comprehensive indexing system for an agricultural meteorological disaster is proposed, which includes the disaster rate, the inundated rate, and the complete loss rate. Following this, the relative weights of the three criteria are acquired using a novel proposed evolutionary algorithm. The proposed algorithm consists of a differential evolution algorithm and an evolution strategy. Finally, a novel evaluation model, based on the proposed algorithm and the Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS), is presented to estimate the agricultural meteorological disaster of 2008 in China. The geographic information system (GIS) technique is employed to depict the disaster. The experimental results demonstrated that the agricultural meteorological disaster of 2008 was very serious, especially in Hunan and Hubei provinces. Some useful suggestions are provided to relieve agriculture meteorological disasters. PMID:29597243
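
    The TOPSIS step can be sketched compactly; here fixed weights stand in for the paper's evolutionary weighting stage, and all data are invented.

```python
import math

# TOPSIS sketch: rank regions by closeness to the ideal profile on the
# three cost criteria (disaster, inundated, complete-loss rates); lower
# is better, so closeness 1 means least disaster-stricken. Data made up.
weights = [0.4, 0.3, 0.3]
regions = {"R1": [0.30, 0.20, 0.10],
           "R2": [0.50, 0.35, 0.25],
           "R3": [0.10, 0.05, 0.02]}

# vector-normalise each column and apply weights
norms = [math.sqrt(sum(row[j] ** 2 for row in regions.values()))
         for j in range(3)]
V = {k: [w * x / n for w, x, n in zip(weights, row, norms)]
     for k, row in regions.items()}

cols = list(zip(*V.values()))
ideal = [min(c) for c in cols]       # cost criteria: ideal is the minimum
worst = [max(c) for c in cols]

def closeness(v):
    d_pos = math.dist(v, ideal)
    d_neg = math.dist(v, worst)
    return d_neg / (d_pos + d_neg)

for k in sorted(V, key=lambda k: closeness(V[k]), reverse=True):
    print(k, round(closeness(V[k]), 3))   # least affected region first
```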

  1. Bridge health monitoring metrics : updating the bridge deficiency algorithm.

    DOT National Transportation Integrated Search

    2009-10-01

    As part of its bridge management system, the Alabama Department of Transportation (ALDOT) must decide how best to spend its bridge replacement funds. In making these decisions, ALDOT managers currently use a deficiency algorithm to rank bridges that ...

  2. Water flow algorithm decision support tool for travelling salesman problem

    NASA Astrophysics Data System (ADS)

    Kamarudin, Anis Aklima; Othman, Zulaiha Ali; Sarim, Hafiz Mohd

    2016-08-01

    This paper discusses the role of a Decision Support Tool (DST) for the Travelling Salesman Problem (TSP), intended to help researchers working in the same area obtain better results from a proposed algorithm. A study was conducted using the Rapid Application Development (RAD) model as the methodology, which includes requirement planning, user design, construction and cutover. A Water Flow Algorithm (WFA) with an improved initialization technique is used as the proposed algorithm for evaluating effectiveness on TSP cases. The DST evaluation consisted of usability testing covering system use, quality of information, quality of interface and overall satisfaction. This evaluation is needed to determine whether the tool can assist users in making decisions to solve TSP problems with the proposed algorithm. Statistical results show the tool's ability to help researchers conduct experiments on the WFA with the improved TSP initialization.

  3. Probabilistic soft sets and dual probabilistic soft sets in decision making with positive and negative parameters

    NASA Astrophysics Data System (ADS)

    Fatimah, F.; Rosadi, D.; Hakim, R. B. F.

    2018-03-01

    In this paper, we motivate and introduce probabilistic soft sets and dual probabilistic soft sets for handling decision-making problems in the presence of positive and negative parameters. We propose several types of algorithms related to this problem. Our procedures are flexible and adaptable. An example on real data is also given.

  4. Intelligent Soft Computing on Forex: Exchange Rates Forecasting with Hybrid Radial Basis Neural Network

    PubMed Central

    Marcek, Dusan; Durisova, Maria

    2016-01-01

    This paper deals with the application of quantitative soft computing prediction models in the financial area, as reliable and accurate prediction models can be very helpful in the management decision-making process. The authors suggest a new hybrid neural network which is a combination of the standard RBF neural network, a genetic algorithm, and a moving average. The moving average is supposed to enhance the outputs of the network using the error part of the original neural network. The authors test the suggested model on high-frequency time series data of USD/CAD and examine its ability to forecast exchange rate values for a horizon of one day. To determine the forecasting efficiency, they perform a comparative statistical out-of-sample analysis of the tested model against autoregressive models and the standard neural network. They also incorporate a genetic algorithm as an optimizing technique for adapting the parameters of the ANN, which is then compared with standard backpropagation and backpropagation combined with the K-means clustering algorithm. Finally, the authors find that their suggested hybrid neural network is able to produce more accurate forecasts than the standard models and can help eliminate the risk of making bad decisions in the decision-making process. PMID:26977450

  5. Intelligent Soft Computing on Forex: Exchange Rates Forecasting with Hybrid Radial Basis Neural Network.

    PubMed

    Falat, Lukas; Marcek, Dusan; Durisova, Maria

    2016-01-01

    This paper deals with the application of quantitative soft computing prediction models in the financial area, as reliable and accurate prediction models can be very helpful in the management decision-making process. The authors suggest a new hybrid neural network which is a combination of the standard RBF neural network, a genetic algorithm, and a moving average. The moving average is supposed to enhance the outputs of the network using the error part of the original neural network. The authors test the suggested model on high-frequency time series data of USD/CAD and examine its ability to forecast exchange rate values for a horizon of one day. To determine the forecasting efficiency, they perform a comparative statistical out-of-sample analysis of the tested model against autoregressive models and the standard neural network. They also incorporate a genetic algorithm as an optimizing technique for adapting the parameters of the ANN, which is then compared with standard backpropagation and backpropagation combined with the K-means clustering algorithm. Finally, the authors find that their suggested hybrid neural network is able to produce more accurate forecasts than the standard models and can help eliminate the risk of making bad decisions in the decision-making process.

  6. Evaluation of the Effect of Diagnostic Molecular Testing on the Surgical Decision-Making Process for Patients With Thyroid Nodules.

    PubMed

    Noureldine, Salem I; Najafian, Alireza; Aragon Han, Patricia; Olson, Matthew T; Genther, Dane J; Schneider, Eric B; Prescott, Jason D; Agrawal, Nishant; Mathur, Aarti; Zeiger, Martha A; Tufano, Ralph P

    2016-07-01

    Diagnostic molecular testing is used in the workup of thyroid nodules. While these tests appear to be promising in more definitively assigning a risk of malignancy, their effect on surgical decision making has yet to be demonstrated. To investigate the effect of diagnostic molecular profiling of thyroid nodules on the surgical decision-making process. A surgical management algorithm was developed and published after peer review that incorporated individual Bethesda System for Reporting Thyroid Cytopathology classifications with clinical, laboratory, and radiological results. This algorithm was created to formalize the decision-making process selected herein in managing patients with thyroid nodules. Between April 1, 2014, and March 31, 2015, a prospective study of patients who had undergone diagnostic molecular testing of a thyroid nodule before being seen for surgical consultation was performed. The recommended management undertaken by the surgeon was then prospectively compared with the corresponding one in the algorithm. Patients with thyroid nodules who did not undergo molecular testing and were seen for surgical consultation during the same period served as a control group. All pertinent treatment options were presented to each patient, and any deviation from the algorithm was recorded prospectively. To evaluate the appropriateness of any change (deviation) in management, the surgical histopathology diagnosis was correlated with the surgery performed. The study cohort comprised 140 patients who underwent molecular testing. Their mean (SD) age was 50.3 (14.6) years, and 75.0% (105 of 140) were female. Over a 1-year period, 20.3% (140 of 688) had undergone diagnostic molecular testing before surgical consultation, and 79.7% (548 of 688) had not undergone molecular testing. The surgical management deviated from the treatment algorithm in 12.9% (18 of 140) with molecular testing and in 10.2% (56 of 548) without molecular testing (P = .37). In the group with molecular testing, the surgical management plan of only 7.9% (11 of 140) was altered as a result of the molecular test. All but 1 of those patients were found to be overtreated relative to the surgical histopathology analysis. Molecular testing did not significantly affect the surgical decision-making process in this study. Among patients whose treatment was altered based on these markers, there was evidence of overtreatment.

  7. Acting Irrationally to Improve Performance in Stochastic Worlds

    NASA Astrophysics Data System (ADS)

    Belavkin, Roman V.

    Despite many theories and algorithms for decision-making, after estimating the utility function the choice is usually made by maximising its expected value (the max EU principle). This traditional and 'rational' conclusion of the decision-making process is compared in this paper with several 'irrational' techniques that make the choice in Monte-Carlo fashion. The comparison is made by evaluating the performance of simple decision-theoretic agents in stochastic environments. It is shown not only that random choice strategies can achieve performance comparable to the max EU method, but also that, under certain conditions, Monte-Carlo choice methods perform almost twice as well as max EU. The paper concludes by citing evidence from recent cognitive modelling work as well as the famous decision-making paradoxes.
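
    A toy illustration of the paper's theme, under assumed numbers: with misleading initial utility estimates, a greedy max-EU chooser can lock onto the inferior option, while a Monte-Carlo chooser keeps sampling and recovers. This is not the paper's experiment, only a sketch of the underlying effect.

```python
import random

# Greedy max-EU vs Monte-Carlo choice under noisy utility estimates.
# Initial estimates are deliberately misleading: the worse option (0)
# looks better. All numbers are illustrative only.
P = [0.3, 0.7]                                 # true reward probabilities

def run(policy, trials=2000):
    est, n, total = [0.9, 0.1], [1, 1], 0.0    # misleading initial estimates
    for _ in range(trials):
        if policy == "max_eu":
            a = 0 if est[0] >= est[1] else 1   # maximise expected utility
        else:                                  # Monte-Carlo: roulette wheel
            a = 0 if random.random() < est[0] / (est[0] + est[1]) else 1
        r = 1.0 if random.random() < P[a] else 0.0
        n[a] += 1
        est[a] += (r - est[a]) / n[a]          # incremental mean estimate
        total += r
    return total / trials

random.seed(3)
print("max EU     :", run("max_eu"))       # stuck near 0.3 reward/trial
print("Monte-Carlo:", run("monte_carlo"))  # climbs well above max EU here
```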

  8. Comprehensible knowledge model creation for cancer treatment decision making.

    PubMed

    Afzal, Muhammad; Hussain, Maqbool; Ali Khan, Wajahat; Ali, Taqdir; Lee, Sungyoung; Huh, Eui-Nam; Farooq Ahmad, Hafiz; Jamshed, Arif; Iqbal, Hassan; Irfan, Muhammad; Abbas Hydari, Manzar

    2017-03-01

    A wealth of clinical data exists in clinical documents in the form of electronic health records (EHRs). This data can be used for developing knowledge-based recommendation systems that can assist clinicians in clinical decision making and education. One of the big hurdles in developing such systems is the lack of automated mechanisms for knowledge acquisition to enable and educate clinicians in informed decision making. An automated knowledge acquisition methodology with a comprehensible knowledge model for cancer treatment (CKM-CT) is proposed. With the CKM-CT, clinical data are acquired automatically from documents. Quality of data is ensured by correcting errors and transforming various formats into a standard data format. Data preprocessing involves dimensionality reduction and missing value imputation. Predictive algorithm selection is performed on the basis of the ranking score of the weighted sum model. The knowledge builder prepares knowledge for knowledge-based services: clinical decisions and education support. Data is acquired from 13,788 head and neck cancer (HNC) documents for 3447 patients, including 1526 patients of the oral cavity site. In the data quality task, 160 staging values are corrected. In the preprocessing task, 20 attributes and 106 records are eliminated from the dataset. The Classification and Regression Trees (CRT) algorithm is selected and provides 69.0% classification accuracy in predicting HNC treatment plans, consisting of 11 decision paths that yield 11 decision rules. Our proposed methodology, CKM-CT, is helpful to find hidden knowledge in clinical documents. In CKM-CT, the prediction models are developed to assist and educate clinicians for informed decision making. The proposed methodology is generalizable to apply to data of other domains such as breast cancer with a similar objective to assist clinicians in decision making and education. Copyright © 2017 Elsevier Ltd. All rights reserved.

  9. Health decision making: lynchpin of evidence-based practice.

    PubMed

    Spring, Bonnie

    2008-01-01

    Health decision making is both the lynchpin and the least developed aspect of evidence-based practice. The evidence-based practice process requires integrating the evidence with consideration of practical resources and patient preferences and doing so via a process that is genuinely collaborative. Yet, the literature is largely silent about how to accomplish integrative, shared decision making. Implications for evidence-based practice are discussed for 2 theories of clinician decision making (expected utility and fuzzy trace) and 2 theories of patient health decision making (transtheoretical model and reasoned action). Three suggestions are offered. First, it would be advantageous to have theory-based algorithms that weight and integrate the 3 data strands (evidence, resources, preferences) in different decisional contexts. Second, patients, not providers, make the decisions of greatest impact on public health, and those decisions are behavioral. Consequently, theory explicating how provider-patient collaboration can influence patient lifestyle decisions made miles from the provider's office is greatly needed. Third, although the preponderance of data on complex decisions supports a computational approach, such an approach to evidence-based practice is too impractical to be widely applied at present. More troublesomely, until patients come to trust decisions made computationally more than they trust their providers' intuitions, patient adherence will remain problematic. A good theory of integrative, collaborative health decision making remains needed.

  10. Health Decision Making: Lynchpin of Evidence-Based Practice

    PubMed Central

    Spring, Bonnie

    2008-01-01

    Health decision making is both the lynchpin and the least developed aspect of evidence-based practice. The evidence-based practice process requires integrating the evidence with consideration of practical resources and patient preferences and doing so via a process that is genuinely collaborative. Yet, the literature is largely silent about how to accomplish integrative, shared decision making. Implications for evidence-based practice are discussed for 2 theories of clinician decision making (expected utility and fuzzy trace) and 2 theories of patient health decision making (transtheoretical model and reasoned action). Three suggestions are offered. First, it would be advantageous to have theory-based algorithms that weight and integrate the 3 data strands (evidence, resources, preferences) in different decisional contexts. Second, patients, not providers, make the decisions of greatest impact on public health, and those decisions are behavioral. Consequently, theory explicating how provider-patient collaboration can influence patient lifestyle decisions made miles from the provider's office is greatly needed. Third, although the preponderance of data on complex decisions supports a computational approach, such an approach to evidence-based practice is too impractical to be widely applied at present. More troublesomely, until patients come to trust decisions made computationally more than they trust their providers’ intuitions, patient adherence will remain problematic. A good theory of integrative, collaborative health decision making remains needed. PMID:19015288

  11. Expanded envelope concepts for aircraft control-element failure detection and identification

    NASA Technical Reports Server (NTRS)

    Weiss, Jerold L.; Hsu, John Y.

    1988-01-01

    The purpose of this effort was to develop and demonstrate concepts for expanding the envelope of failure detection and isolation (FDI) algorithms for aircraft-path failures. An algorithm which uses analytic-redundancy in the form of aerodynamic force and moment balance equations was used. Because aircraft-path FDI uses analytical models, there is a tradeoff between accuracy and the ability to detect and isolate failures. For single flight condition operation, design and analysis methods are developed to deal with this robustness problem. When the departure from the single flight condition is significant, algorithm adaptation is necessary. Adaptation requirements for the residual generation portion of the FDI algorithm are interpreted as the need for accurate, large-motion aero-models, over a broad range of velocity and altitude conditions. For the decision-making part of the algorithm, adaptation may require modifications to filtering operations, thresholds, and projection vectors that define the various hypothesis tests performed in the decision mechanism. Methods of obtaining and evaluating adequate residual generation and decision-making designs have been developed. The application of the residual generation ideas to a high-performance fighter is demonstrated by developing adaptive residuals for the AFTI-F-16 and simulating their behavior under a variety of maneuvers using the results of a NASA F-16 simulation.

  12. A Hierarchical Approach to Target Recognition and Tracking. Summary of Results for the Period April 1, 1989-November 30, 1989

    DTIC Science & Technology

    1990-02-07

    performance assessment, human intervention, or operator training. Algorithms on different levels are allowed to deal with the world with different degrees … have on the decisions made by the driver are a complex combination of human factors, driving experience, mission objectives, tactics, etc., and … motion. The distinction here is that the decision making program may not necessarily make its decisions based on the same factors as the human

  13. PGA/MOEAD: a preference-guided evolutionary algorithm for multi-objective decision-making problems with interval-valued fuzzy preferences

    NASA Astrophysics Data System (ADS)

    Luo, Bin; Lin, Lin; Zhong, ShiSheng

    2018-02-01

    In this research, we propose a preference-guided optimisation algorithm for multi-criteria decision-making (MCDM) problems with interval-valued fuzzy preferences. The interval-valued fuzzy preferences are decomposed into a series of precise and evenly distributed preference-vectors (reference directions) regarding the objectives to be optimised on the basis of uniform design strategy firstly. Then the preference information is further incorporated into the preference-vectors based on the boundary intersection approach, meanwhile, the MCDM problem with interval-valued fuzzy preferences is reformulated into a series of single-objective optimisation sub-problems (each sub-problem corresponds to a decomposed preference-vector). Finally, a preference-guided optimisation algorithm based on MOEA/D (multi-objective evolutionary algorithm based on decomposition) is proposed to solve the sub-problems in a single run. The proposed algorithm incorporates the preference-vectors within the optimisation process for guiding the search procedure towards a more promising subset of the efficient solutions matching the interval-valued fuzzy preferences. In particular, lots of test instances and an engineering application are employed to validate the performance of the proposed algorithm, and the results demonstrate the effectiveness and feasibility of the algorithm.

  14. A mechanism for value-sensitive decision-making.

    PubMed

    Pais, Darren; Hogan, Patrick M; Schlegel, Thomas; Franks, Nigel R; Leonard, Naomi E; Marshall, James A R

    2013-01-01

    We present a dynamical systems analysis of a decision-making mechanism inspired by collective choice in house-hunting honeybee swarms, revealing the crucial role of cross-inhibitory 'stop-signalling' in improving the decision-making capabilities. We show that strength of cross-inhibition is a decision-parameter influencing how decisions depend both on the difference in value and on the mean value of the alternatives; this is in contrast to many previous mechanistic models of decision-making, which are typically sensitive to decision accuracy rather than the value of the option chosen. The strength of cross-inhibition determines when deadlock over similarly valued alternatives is maintained or broken, as a function of the mean value; thus, changes in cross-inhibition strength allow adaptive time-dependent decision-making strategies. Cross-inhibition also tunes the minimum difference between alternatives required for reliable discrimination, in a manner similar to Weber's law of just-noticeable difference. Finally, cross-inhibition tunes the speed-accuracy trade-off realised when differences in the values of the alternatives are sufficiently large to matter. We propose that the model, and the significant role of the values of the alternatives, may describe other decision-making systems, including intracellular regulatory circuits, and simple neural circuits, and may provide guidance in the design of decision-making algorithms for artificial systems, particularly those functioning without centralised control.
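
    A simplified cross-inhibition model in the spirit of the mechanism can be simulated directly; the parameterisation below (commitment rates proportional to value, abandonment inversely proportional, plus a stop-signal term) is a stand-in and not necessarily the paper's exact equations.

```python
# Euler simulation of a simplified cross-inhibition ("stop-signal")
# model: y1, y2 are fractions of the swarm committed to options of
# value v1, v2; yU = 1 - y1 - y2 is uncommitted. This parameterisation
# is an assumed stand-in, not the paper's exact model.
def simulate(v1, v2, sigma, y1=0.011, y2=0.009, dt=0.01, steps=20000):
    for _ in range(steps):
        yU = 1.0 - y1 - y2
        dy1 = v1 * yU - y1 / v1 + v1 * y1 * yU - sigma * y1 * y2
        dy2 = v2 * yU - y2 / v2 + v2 * y2 * yU - sigma * y1 * y2
        y1, y2 = y1 + dt * dy1, y2 + dt * dy2
    return round(y1, 3), round(y2, 3)

# equal-valued options, tiny initial asymmetry: weak cross-inhibition
# sustains deadlock (y1 close to y2); strong cross-inhibition breaks
# the deadlock in favour of one option
print(simulate(3.0, 3.0, sigma=0.5))
print(simulate(3.0, 3.0, sigma=20.0))
```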

  15. A haptic-inspired audio approach for structural health monitoring decision-making

    NASA Astrophysics Data System (ADS)

    Mao, Zhu; Todd, Michael; Mascareñas, David

    2015-03-01

    Haptics is the field at the interface of human touch (tactile sensation) and classification, whereby tactile feedback is used to train and inform a decision-making process. In structural health monitoring (SHM) applications, haptic devices have been introduced and applied in a simplified laboratory-scale scenario, in which nonlinearity, representing the presence of damage, was encoded into a vibratory manual interface. In this paper, the "spirit" of haptics is adopted, but here ultrasonic guided wave scattering information is transformed into audio-range (rather than tactile) signals. After sufficient training, the structural damage condition, including occurrence and location, can be identified through the encoded audio waveforms. Different algorithms are employed in this paper to generate the transformed audio signals; the performance of each encoding algorithm is compared, and also compared with standard machine learning classifiers. In the long run, this haptic-inspired decision-making approach aims to detect and classify structural damage in more rigorous environments, approaching a baseline-free fashion with embedded temperature compensation.

  16. Towards a web-based decision support tool for selecting appropriate statistical test in medical and biological sciences.

    PubMed

    Suner, Aslı; Karakülah, Gökhan; Dicle, Oğuz

    2014-01-01

    Statistical hypothesis testing is an essential component of biological and medical studies for making inferences and estimations from the data collected in the study; however, misuse of statistical tests is widely common. To prevent possible errors in selecting a suitable statistical test, it is currently possible to consult available test-selection algorithms developed for various purposes. However, the lack of an algorithm presenting the most common statistical tests used in biomedical research in a single flowchart causes several problems, such as shifting users among algorithms, poor decision support in test selection, and dissatisfaction among potential users. Herein, we demonstrate a unified flowchart that covers the most commonly used statistical tests in the biomedical domain, to provide decision aid to non-statistician users in choosing the appropriate statistical test for their hypothesis. We also discuss some of the findings made while integrating the flowcharts into each other to develop a single but more comprehensive decision algorithm.
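
    A toy fragment of such a flowchart can be encoded as code; the sketch below covers only the two-group comparison branch, and the normality threshold and scope are illustrative assumptions (the paper's unified flowchart covers far more cases).

```python
from scipy import stats

# Tiny test-selection flow for comparing two groups of continuous data:
# check normality, then branch to a parametric or non-parametric test.
def is_normal(x, alpha=0.05):
    _, p = stats.shapiro(x)          # Shapiro-Wilk normality test
    return p > alpha

def choose_two_group_test(a, b, paired=False):
    normal = is_normal(a) and is_normal(b)
    if paired:
        return stats.ttest_rel(a, b) if normal else stats.wilcoxon(a, b)
    return stats.ttest_ind(a, b) if normal else stats.mannwhitneyu(a, b)

group1 = [5.1, 4.9, 5.6, 5.0, 5.4, 4.8, 5.2]
group2 = [5.9, 6.1, 5.7, 6.3, 5.8, 6.0, 6.2]
print(choose_two_group_test(group1, group2))
```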

  17. Interacting neural networks.

    PubMed

    Metzler, R; Kinzel, W; Kanter, I

    2000-08-01

    Several scenarios of interacting neural networks which are trained either in an identical or in a competitive way are solved analytically. In the case of identical training, each perceptron receives the output of its neighbor. The symmetry of the stationary state as well as the sensitivity to the training algorithm used are investigated. Two competitive perceptrons trained on mutually exclusive learning aims, and a perceptron which is trained on the opposite of its own output, are examined analytically. An ensemble of competitive perceptrons is used as the decision-making algorithm in a model of a closed market (the El Farol Bar problem, or the Minority Game, in which a set of agents must each make a binary decision); each network is trained on the history of minority decisions. This ensemble of perceptrons relaxes to a stationary state whose performance can be better than random.

  18. Interacting neural networks

    NASA Astrophysics Data System (ADS)

    Metzler, R.; Kinzel, W.; Kanter, I.

    2000-08-01

    Several scenarios of interacting neural networks which are trained either in an identical or in a competitive way are solved analytically. In the case of identical training, each perceptron receives the output of its neighbor. The symmetry of the stationary state as well as the sensitivity to the training algorithm used are investigated. Two competitive perceptrons trained on mutually exclusive learning aims, and a perceptron which is trained on the opposite of its own output, are examined analytically. An ensemble of competitive perceptrons is used as the decision-making algorithm in a model of a closed market (the El Farol Bar problem, or the Minority Game, in which a set of agents must each make a binary decision); each network is trained on the history of minority decisions. This ensemble of perceptrons relaxes to a stationary state whose performance can be better than random.

  19. Automatic design of decision-tree induction algorithms tailored to flexible-receptor docking data.

    PubMed

    Barros, Rodrigo C; Winck, Ana T; Machado, Karina S; Basgalupp, Márcio P; de Carvalho, André C P L F; Ruiz, Duncan D; de Souza, Osmar Norberto

    2012-11-21

    This paper addresses the prediction of the free energy of binding of a drug candidate with the enzyme InhA associated with Mycobacterium tuberculosis. This problem is found within rational drug design, where interactions between drug candidates and target proteins are verified through molecular docking simulations. In this application, it is important not only to correctly predict the free energy of binding, but also to provide a comprehensible model that can be validated by a domain specialist. Decision-tree induction algorithms have been successfully used in drug-design-related applications, especially considering that decision trees are simple to understand, interpret, and validate. There are several decision-tree induction algorithms available for general use, but each one has a bias that makes it more suitable for a particular data distribution. In this article, we propose and investigate the automatic design of decision-tree induction algorithms tailored to particular drug-enzyme binding data sets. We investigate the performance of our new method for evaluating binding conformations of different drug candidates to InhA, and we analyze our findings with respect to decision-tree accuracy, comprehensibility, and biological relevance. The empirical analysis indicates that our method is capable of automatically generating decision-tree induction algorithms that significantly outperform the traditional C4.5 algorithm with respect to both accuracy and comprehensibility. In addition, we provide the biological interpretation of the rules generated by our approach, reinforcing the importance of comprehensible predictive models in this particular bioinformatics application. We conclude that automatically designing a decision-tree algorithm tailored to molecular docking data is a promising alternative for the prediction of the free energy from the binding of a drug candidate with a flexible receptor.

  20. Automatic design of decision-tree induction algorithms tailored to flexible-receptor docking data

    PubMed Central

    2012-01-01

    Background This paper addresses the prediction of the free energy of binding of a drug candidate with the enzyme InhA associated with Mycobacterium tuberculosis. This problem is found within rational drug design, where interactions between drug candidates and target proteins are verified through molecular docking simulations. In this application, it is important not only to correctly predict the free energy of binding, but also to provide a comprehensible model that can be validated by a domain specialist. Decision-tree induction algorithms have been successfully used in drug-design-related applications, especially considering that decision trees are simple to understand, interpret, and validate. There are several decision-tree induction algorithms available for general use, but each one has a bias that makes it more suitable for a particular data distribution. In this article, we propose and investigate the automatic design of decision-tree induction algorithms tailored to particular drug-enzyme binding data sets. We investigate the performance of our new method for evaluating binding conformations of different drug candidates to InhA, and we analyze our findings with respect to decision-tree accuracy, comprehensibility, and biological relevance. Results The empirical analysis indicates that our method is capable of automatically generating decision-tree induction algorithms that significantly outperform the traditional C4.5 algorithm with respect to both accuracy and comprehensibility. In addition, we provide the biological interpretation of the rules generated by our approach, reinforcing the importance of comprehensible predictive models in this particular bioinformatics application. Conclusions We conclude that automatically designing a decision-tree algorithm tailored to molecular docking data is a promising alternative for the prediction of the free energy from the binding of a drug candidate with a flexible receptor. PMID:23171000

  1. Intersubjective decision-making for computer-aided forging technology design

    NASA Astrophysics Data System (ADS)

    Kanyukov, S. I.; Konovalov, A. V.; Muizemnek, O. Yu.

    2017-12-01

    We propose a concept of intersubjective decision-making for problems of open-die forging technology design. The intersubjective decisions are chosen from a set of feasible decisions using the fundamentals of decision-making theory in a fuzzy environment, according to the Bellman-Zadeh scheme. We consider the formalization of subjective goals and the choice of membership functions for the decisions depending on those goals, and we study how these functions are combined into an intersubjective membership function, constructed for the resulting decision chosen from the set of feasible decisions. The choice of the final intersubjective decision is discussed. All the issues are exemplified by a specific technological problem. The proposed concept of solving technological problems under fuzzy goals allows one to choose the most efficient decisions, corresponding to the stated goals, from a set of feasible ones, and to reduce human participation in automated design. The concept can be used to develop algorithms and design programs for forging numerous types of forged parts.
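
    The Bellman-Zadeh scheme itself is standard and easy to sketch: the fuzzy decision set is the pointwise minimum of the goal and constraint membership functions, and the chosen alternative maximises that minimum. The forging variants and membership values below are invented for illustration.

```python
# Bellman-Zadeh fuzzy decision scheme: intersect all goal/constraint
# membership functions (pointwise minimum), then choose the feasible
# alternative that maximises the resulting membership.
feasible = {
    "variant_1": {"low_cost": 0.8, "accuracy": 0.6, "low_waste": 0.7},
    "variant_2": {"low_cost": 0.5, "accuracy": 0.9, "low_waste": 0.6},
    "variant_3": {"low_cost": 0.9, "accuracy": 0.4, "low_waste": 0.9},
}

def decision_membership(memberships):
    return min(memberships.values())     # fuzzy intersection of all goals

best = max(feasible, key=lambda d: decision_membership(feasible[d]))
print(best, decision_membership(feasible[best]))   # variant_1, 0.6
```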

  2. Algorithms for in-season nutrient management in cereals

    USDA-ARS?s Scientific Manuscript database

    The demand for improved decision making products for cereal production systems has placed added emphasis on using plant sensors in-season, and that incorporate real-time, site specific, growing environments. The objective of this work was to describe validated in-season sensor based algorithms prese...

  3. A systematic approach to embedded biomedical decision making.

    PubMed

    Song, Zhe; Ji, Zhongkai; Ma, Jian-Guo; Sputh, Bernhard; Acharya, U Rajendra; Faust, Oliver

    2012-11-01

    Embedded decision making is a key feature of many biomedical systems. In most cases human life directly depends on correct decisions made by these systems, so they have to work reliably. This paper describes how we applied systems engineering principles to design a high performance embedded classification system in a systematic and well structured way. We introduce the structured design approach by discussing requirements capturing, specification refinement, implementation and testing. Thereby, we follow systems engineering principles and execute each of these processes as formally as possible. The requirements, which motivate the system design, describe an automated decision making system for diagnostic support. These requirements are refined into the implementation of a support vector machine (SVM) algorithm, which enables us to integrate automated decision making in embedded systems. With a formal model we establish the functionality, stability and reliability of the system. Furthermore, we investigated different parallel processing configurations of this computationally complex algorithm and found that, by adding SVM processes, an almost linear speedup is possible. Once we established these system properties, we translated the formal model into an implementation. The resulting implementation was tested on XMOS processors with both normal and failure cases, to build up trust in the implementation. Finally, we demonstrated that our parallel implementation achieves the speedup predicted by the formal model. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
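    The paper's XMOS implementation is not public; the following is a minimal desktop-side sketch of the same kind of SVM decision component, using scikit-learn and synthetic data rather than the authors' biomedical signals.

    ```python
    # Hedged analogue of the paper's SVM classifier for diagnostic support:
    # train on synthetic data and evaluate held-out accuracy. Not the authors'
    # embedded implementation.
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC

    X, y = make_classification(n_samples=500, n_features=10, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    clf = SVC(kernel="rbf", C=1.0).fit(X_train, y_train)
    print("held-out accuracy:", clf.score(X_test, y_test))

    # The near-linear speedup the paper reports comes from splitting test
    # vectors across SVM processes; conceptually each worker runs
    # clf.predict() on its own chunk and the results are concatenated.
    ```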

  4. Manual and computer-aided materials selection for industrial production: An exercise in decision making

    NASA Technical Reports Server (NTRS)

    Bates, Seth P.

    1990-01-01

    Students are introduced to methods and concepts for the systematic selection and evaluation of materials to be used in manufacturing specific products in industry. For this laboratory exercise, students work in groups to identify and describe a product, then proceed through the selection process to arrive at a short list of three candidate materials from which the item could be made. The exercise draws on knowledge of mechanical, physical, and chemical properties, common materials testing techniques, and resource management skills in finding and assessing property data. An important part of the exercise is introducing students to decision making algorithms and teaching them to apply these algorithms to a complex decision making process.
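    One classic decision-making algorithm used in materials selection exercises of this kind is a weighted decision matrix; the sketch below, with invented materials, scores, and weights, shows the idea.

    ```python
    # Weighted decision matrix: score each candidate on each criterion,
    # weight the criteria, rank by weighted sum. All values are hypothetical.
    criteria_weights = {"strength": 0.4, "cost": 0.35, "corrosion": 0.25}

    # Scores normalized to [0, 1], higher is better (cost score: cheaper is better).
    candidates = {
        "6061 aluminum": {"strength": 0.6, "cost": 0.7, "corrosion": 0.8},
        "304 stainless": {"strength": 0.9, "cost": 0.4, "corrosion": 0.9},
        "ABS plastic":   {"strength": 0.3, "cost": 0.9, "corrosion": 0.7},
    }

    def weighted_score(scores, weights):
        return sum(weights[c] * scores[c] for c in weights)

    for m in sorted(candidates,
                    key=lambda m: weighted_score(candidates[m], criteria_weights),
                    reverse=True):
        print(f"{m}: {weighted_score(candidates[m], criteria_weights):.2f}")
    ```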

  5. Satellite image processing for precision agriculture and agroindustry using convolutional neural network and genetic algorithm

    NASA Astrophysics Data System (ADS)

    Firdaus; Arkeman, Y.; Buono, A.; Hermadi, I.

    2017-01-01

    Translating satellite imagery into data useful for decision making is currently done manually by humans. In this research, we translate satellite imagery using artificial intelligence methods, specifically a convolutional neural network and a genetic algorithm, into data useful for decision making, especially for precision agriculture and agroindustry. We focus on sustainable land use planning with three objectives: maximizing economic return, minimizing CO2 emissions, and minimizing land degradation. Results show that these artificial intelligence methods can produce good Pareto-optimal solutions in a short time.
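    At the core of any such multi-objective genetic search is a Pareto dominance test; a sketch under the record's three objectives follows, with invented plan data.

    ```python
    # One land-use plan dominates another if it is no worse on all three
    # objectives and strictly better on at least one. Objectives follow the
    # record (maximize profit, minimize CO2, minimize degradation); the plan
    # tuples are made up.

    def dominates(a, b):
        """a, b: (profit, co2, degradation); profit maximized, others minimized."""
        no_worse = a[0] >= b[0] and a[1] <= b[1] and a[2] <= b[2]
        strictly = a[0] > b[0] or a[1] < b[1] or a[2] < b[2]
        return no_worse and strictly

    plans = [(120, 30, 0.2), (100, 20, 0.1), (90, 35, 0.3)]
    pareto = [p for p in plans if not any(dominates(q, p) for q in plans)]
    print(pareto)  # the third plan is dominated by the second
    ```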

  6. Evolutionary Algorithm Based Automated Reverse Engineering and Defect Discovery

    DTIC Science & Technology

    2007-09-21

    a previous application of a GP as a data mining function to evolve fuzzy decision trees symbolically [3-5], the terminal set consisted of fuzzy...of input and output information is required. In the case of fuzzy decision trees, the database represented a collection of scenarios about which the...fuzzy decision tree to be evolved would make decisions. The database also had entries created by experts representing decisions about the scenarios

  7. Design and implementation of intelligent electronic warfare decision making algorithm

    NASA Astrophysics Data System (ADS)

    Peng, Hsin-Hsien; Chen, Chang-Kuo; Hsueh, Chi-Shun

    2017-05-01

    Electromagnetic signals, and the requirement to respond to them in a timely manner, have grown rapidly in modern electronic warfare. Although jammers are limited resources, it is possible to achieve the best electronic warfare efficiency through tactical decisions. This paper proposes an intelligent electronic warfare decision support system. In this work, we develop a novel hybrid algorithm, Digital Pheromone Particle Swarm Optimization, based on Particle Swarm Optimization (PSO), Ant Colony Optimization (ACO) and the Shuffled Frog Leaping Algorithm (SFLA). We use PSO to solve the problem and combine it with the pheromone concept from ACO to accumulate useful information during the spatial search and speed up finding the optimal solution. The proposed algorithm finds the optimal solution in reasonable computation time by using the matrix conversion method from SFLA. The results indicate that jammer allocation becomes more effective. The system based on the hybrid algorithm provides electronic warfare commanders with critical information to assist them in effectively managing the complex electromagnetic battlefield.
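    The authors' Digital Pheromone PSO is not reproduced here; the sketch below shows the plain PSO component it builds on, applied to a stand-in jammer-allocation cost function (the cost function and parameters are assumptions for illustration).

    ```python
    # Minimal particle swarm optimization: particles track personal and global
    # bests and move under inertia plus cognitive and social attraction.
    import random

    def pso(cost, dim, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
        pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
        vel = [[0.0] * dim for _ in range(n_particles)]
        pbest = [p[:] for p in pos]
        gbest = min(pbest, key=cost)
        for _ in range(iters):
            for i in range(n_particles):
                for d in range(dim):
                    r1, r2 = random.random(), random.random()
                    vel[i][d] = (w * vel[i][d]
                                 + c1 * r1 * (pbest[i][d] - pos[i][d])
                                 + c2 * r2 * (gbest[d] - pos[i][d]))
                    pos[i][d] += vel[i][d]
                if cost(pos[i]) < cost(pbest[i]):
                    pbest[i] = pos[i][:]
            gbest = min(pbest, key=cost)
        return gbest, cost(gbest)

    # Stand-in cost: distance of an allocation vector from an "ideal" allocation.
    ideal = [1.0, -2.0, 0.5]
    print(pso(lambda x: sum((xi - t) ** 2 for xi, t in zip(x, ideal)), dim=3))
    ```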

  8. Using Virtual Reality to Assess Ethical Decisions in Road Traffic Scenarios: Applicability of Value-of-Life-Based Models and Influences of Time Pressure.

    PubMed

    Sütfeld, Leon R; Gast, Richard; König, Peter; Pipa, Gordon

    2017-01-01

    Self-driving cars are posing a new challenge to our ethics. By using algorithms to make decisions in situations where harming humans is possible, probable, or even unavoidable, a self-driving car's ethical behavior comes pre-defined. Ad hoc decisions are made in milliseconds, but can be based on extensive research and debates. The same algorithms are also likely to be used in millions of cars at a time, increasing the impact of any inherent biases, and increasing the importance of getting it right. Previous research has shown that moral judgment and behavior are highly context-dependent, and comprehensive and nuanced models of the underlying cognitive processes are out of reach to date. Models of ethics for self-driving cars should thus aim to match human decisions made in the same context. We employed immersive virtual reality to assess ethical behavior in simulated road traffic scenarios, and used the collected data to train and evaluate a range of decision models. In the study, participants controlled a virtual car and had to choose which of two given obstacles they would sacrifice in order to spare the other. We randomly sampled obstacles from a variety of inanimate objects, animals and humans. Our model comparison shows that simple models based on one-dimensional value-of-life scales are suited to describe human ethical behavior in these situations. Furthermore, we examined the influence of severe time pressure on the decision-making process. We found that it decreases consistency in the decision patterns, thus providing an argument for algorithmic decision-making in road traffic. This study demonstrates the suitability of virtual reality for the assessment of ethical behavior in humans, delivering consistent results across subjects, while closely matching the experimental settings to the real world scenarios in question.
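    A one-dimensional value-of-life model of the kind compared in this study can be sketched as a logistic choice rule over a scalar value scale; the categories, values, and temperature below are illustrative assumptions, not the paper's fitted parameters.

    ```python
    # Each obstacle category gets a scalar "value of life"; the probability of
    # sparing one obstacle over another is a logistic function of the value
    # difference. All numbers are made up for illustration.
    import math

    value = {"trash can": 0.1, "dog": 0.5, "adult": 0.9}

    def p_spare_first(a, b, temperature=0.2):
        """Probability the driver spares obstacle a (i.e., sacrifices b)."""
        return 1.0 / (1.0 + math.exp(-(value[a] - value[b]) / temperature))

    print(p_spare_first("adult", "dog"))  # ~0.88: the adult is usually spared
    print(p_spare_first("dog", "adult"))  # ~0.12: the complementary probability
    ```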

  9. Using Virtual Reality to Assess Ethical Decisions in Road Traffic Scenarios: Applicability of Value-of-Life-Based Models and Influences of Time Pressure

    PubMed Central

    Sütfeld, Leon R.; Gast, Richard; König, Peter; Pipa, Gordon

    2017-01-01

    Self-driving cars are posing a new challenge to our ethics. By using algorithms to make decisions in situations where harming humans is possible, probable, or even unavoidable, a self-driving car's ethical behavior comes pre-defined. Ad hoc decisions are made in milliseconds, but can be based on extensive research and debates. The same algorithms are also likely to be used in millions of cars at a time, increasing the impact of any inherent biases, and increasing the importance of getting it right. Previous research has shown that moral judgment and behavior are highly context-dependent, and comprehensive and nuanced models of the underlying cognitive processes are out of reach to date. Models of ethics for self-driving cars should thus aim to match human decisions made in the same context. We employed immersive virtual reality to assess ethical behavior in simulated road traffic scenarios, and used the collected data to train and evaluate a range of decision models. In the study, participants controlled a virtual car and had to choose which of two given obstacles they would sacrifice in order to spare the other. We randomly sampled obstacles from a variety of inanimate objects, animals and humans. Our model comparison shows that simple models based on one-dimensional value-of-life scales are suited to describe human ethical behavior in these situations. Furthermore, we examined the influence of severe time pressure on the decision-making process. We found that it decreases consistency in the decision patterns, thus providing an argument for algorithmic decision-making in road traffic. This study demonstrates the suitability of virtual reality for the assessment of ethical behavior in humans, delivering consistent results across subjects, while closely matching the experimental settings to the real world scenarios in question. PMID:28725188

  10. Comparison of rule induction, decision trees and formal concept analysis approaches for classification

    NASA Astrophysics Data System (ADS)

    Kotelnikov, E. V.; Milov, V. R.

    2018-05-01

    Rule-based learning algorithms are more transparent and easier to interpret than neural networks and deep learning algorithms. These properties make such algorithms effective for descriptive data mining tasks. The choice of an algorithm, however, also depends on its ability to solve predictive tasks. The article compares classification quality on binary and multiclass problems, based on experiments with six datasets from the UCI Machine Learning Repository. The authors investigate three algorithms: Ripper (rule induction), C4.5 (decision trees), and In-Close (formal concept analysis). The results of the experiments show that In-Close demonstrates the best classification quality in comparison with Ripper and C4.5; however, the latter two generate more compact rule sets.
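    As a stand-in for C4.5, which has no canonical Python implementation, the following trains scikit-learn's CART decision tree (a close relative of C4.5) on a UCI-style dataset and prints the induced rules, illustrating the transparency the article emphasizes.

    ```python
    # Train a shallow decision tree and print its rules as readable text.
    from sklearn.datasets import load_iris
    from sklearn.tree import DecisionTreeClassifier, export_text

    data = load_iris()
    tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(data.data, data.target)
    print(export_text(tree, feature_names=list(data.feature_names)))
    ```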

  11. Team formation and breakup in multiagent systems

    NASA Astrophysics Data System (ADS)

    Rao, Venkatesh Guru

    The goal of this dissertation is to pose and solve problems involving team formation and breakup in two specific multiagent domains: formation travel and space-based interferometric observatories. The methodology employed comprises elements drawn from control theory, scheduling theory and artificial intelligence (AI). The original contribution of the work comprises three elements. The first contribution, the partitioned state-space approach, is a technique for formulating and solving co-ordinated motion problems using calculus-of-variations techniques. The approach is applied to obtain optimal two-agent formation travel trajectories on graphs. The second contribution is the class of MixTeam algorithms, a class of team dispatchers that extends classical dispatching by accommodating team formation and breakup and exploration/exploitation learning. The algorithms are applied to observation scheduling and constellation geometry design for interferometric space telescopes. The use of feedback control for team scheduling is also demonstrated with these algorithms. The third contribution is the analysis of the optimality properties of greedy, or myopic, decision-making for a simple class of team dispatching problems. This analysis represents a first step towards the complete analysis of complex team schedulers such as the MixTeam algorithms. The contributions extend the literature on team dynamics in control theory. The broad conclusions that emerge from this research are that greedy or myopic decision-making strategies for teams perform well when specific parameters in the domain are weakly affected by an agent's actions, and that intelligent systems require a closer integration of domain knowledge in decision-making functions.

  12. FINITE-STATE APPROXIMATIONS TO DENUMERABLE-STATE DYNAMIC PROGRAMS,

    DTIC Science & Technology

    AIR FORCE OPERATIONS, LOGISTICS), (*INVENTORY CONTROL, DYNAMIC PROGRAMMING), (*DYNAMIC PROGRAMMING, APPROXIMATION(MATHEMATICS)), INVENTORY CONTROL, DECISION MAKING, STOCHASTIC PROCESSES, GAME THEORY, ALGORITHMS, CONVERGENCE

  13. Response time distributions in rapid chess: a large-scale decision making experiment.

    PubMed

    Sigman, Mariano; Etchemendy, Pablo; Slezak, Diego Fernández; Cecchi, Guillermo A

    2010-01-01

    Rapid chess provides an unparalleled laboratory to understand decision making in a natural environment. In a chess game, players choose consecutively around 40 moves in a finite time budget. The goodness of each choice can be determined quantitatively, since current chess algorithms estimate the value of a position precisely. Web-based chess produces vast amounts of data, millions of decisions per day, incommensurable with traditional psychological experiments. We generated a database of response times (RTs) and position values in rapid chess games. We measured robust emergent statistical observables: (1) RT distributions are long-tailed and show qualitatively distinct forms at different stages of the game; (2) RTs of successive moves are highly correlated both for intra- and inter-player moves. These findings have theoretical implications, since they deny two basic assumptions of sequential decision making algorithms: RTs are not stationary and cannot be generated by a state function. Our results also have practical implications. First, we characterized the capacity of blunders and score fluctuations to predict a player's strength, which is still an open problem in chess software. Second, we show that the winning likelihood can be reliably estimated from a weighted combination of remaining times and position evaluation.

  14. Response Time Distributions in Rapid Chess: A Large-Scale Decision Making Experiment

    PubMed Central

    Sigman, Mariano; Etchemendy, Pablo; Slezak, Diego Fernández; Cecchi, Guillermo A.

    2010-01-01

    Rapid chess provides an unparalleled laboratory to understand decision making in a natural environment. In a chess game, players choose consecutively around 40 moves in a finite time budget. The goodness of each choice can be determined quantitatively, since current chess algorithms estimate the value of a position precisely. Web-based chess produces vast amounts of data, millions of decisions per day, incommensurable with traditional psychological experiments. We generated a database of response times (RTs) and position values in rapid chess games. We measured robust emergent statistical observables: (1) RT distributions are long-tailed and show qualitatively distinct forms at different stages of the game; (2) RTs of successive moves are highly correlated both for intra- and inter-player moves. These findings have theoretical implications, since they deny two basic assumptions of sequential decision making algorithms: RTs are not stationary and cannot be generated by a state function. Our results also have practical implications. First, we characterized the capacity of blunders and score fluctuations to predict a player's strength, which is still an open problem in chess software. Second, we show that the winning likelihood can be reliably estimated from a weighted combination of remaining times and position evaluation. PMID:21031032

  15. A collaborative scheduling model for the supply-hub with multiple suppliers and multiple manufacturers.

    PubMed

    Li, Guo; Lv, Fei; Guan, Xu

    2014-01-01

    This paper investigates a collaborative scheduling model for an assembly system in which multiple suppliers deliver their components to multiple manufacturers under the operation of a Supply-Hub. We first develop two different scenarios to examine the impact of the Supply-Hub: one in which suppliers and manufacturers make their decisions separately, and another in which the Supply-Hub makes joint decisions with collaborative scheduling. We show that scheduling with the Supply-Hub is an NP-complete problem, and we therefore propose an auto-adapted differential evolution algorithm to solve it. Moreover, we illustrate that the performance of collaborative scheduling by the Supply-Hub is superior to separate decisions made by each manufacturer and supplier. Furthermore, we show that the proposed algorithm has good convergence and reliability, and can be applied to more complicated supply chain environments.

  16. Quantum ensembles of quantum classifiers.

    PubMed

    Schuld, Maria; Petruccione, Francesco

    2018-02-09

    Quantum machine learning witnesses an increasing number of quantum algorithms for data-driven decision making, a problem with potential applications ranging from automated image recognition to medical diagnosis. Many of those algorithms are implementations of quantum classifiers, or models for the classification of data inputs with a quantum computer. Following the success of collective decision making with ensembles in classical machine learning, this paper introduces the concept of quantum ensembles of quantum classifiers. Creating the ensemble corresponds to a state preparation routine, after which the quantum classifiers are evaluated in parallel and their combined decision is accessed by a single-qubit measurement. This framework naturally allows for exponentially large ensembles in which, similar to Bayesian learning, the individual classifiers do not have to be trained. As an example, we analyse an exponentially large quantum ensemble in which each classifier is weighted according to its performance in classifying the training data, leading to new results for quantum as well as classical machine learning.
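    The quantum routine itself cannot be sketched classically, but the weighting rule it analyses has a classical analogue: weigh each untrained classifier by its accuracy on the training data and take a weighted vote. The toy threshold classifiers and data below are assumptions for illustration.

    ```python
    # Classical analogue of the accuracy-weighted ensemble: many arbitrary
    # (untrained) threshold classifiers, each weighted by training accuracy.
    import random

    random.seed(1)
    data = [(random.random(),) for _ in range(200)]
    labels = [1 if x[0] > 0.5 and random.random() > 0.1 else 0 for x in data]

    thresholds = [i / 20 for i in range(1, 20)]
    classifiers = [lambda x, t=t: 1 if x[0] > t else 0 for t in thresholds]

    # Weight of each classifier = its accuracy on the training data.
    weights = [sum(c(x) == y for x, y in zip(data, labels)) / len(data)
               for c in classifiers]

    def predict(x):
        score = sum(w * (1 if c(x) == 1 else -1) for c, w in zip(classifiers, weights))
        return 1 if score > 0 else 0

    print(predict((0.8,)), predict((0.2,)))  # expected: 1 0
    ```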

  17. A Collaborative Scheduling Model for the Supply-Hub with Multiple Suppliers and Multiple Manufacturers

    PubMed Central

    Lv, Fei; Guan, Xu

    2014-01-01

    This paper investigates a collaborative scheduling model for an assembly system in which multiple suppliers deliver their components to multiple manufacturers under the operation of a Supply-Hub. We first develop two different scenarios to examine the impact of the Supply-Hub: one in which suppliers and manufacturers make their decisions separately, and another in which the Supply-Hub makes joint decisions with collaborative scheduling. We show that scheduling with the Supply-Hub is an NP-complete problem, and we therefore propose an auto-adapted differential evolution algorithm to solve it. Moreover, we illustrate that the performance of collaborative scheduling by the Supply-Hub is superior to separate decisions made by each manufacturer and supplier. Furthermore, we show that the proposed algorithm has good convergence and reliability, and can be applied to more complicated supply chain environments. PMID:24892104

  18. Benefits Assessment of Algorithmically Combining Generic High Altitude Airspace Sectors

    NASA Technical Reports Server (NTRS)

    Bloem, Michael; Gupta, Pramod; Lai, Chok Fung; Kopardekar, Parimal

    2009-01-01

    In today's air traffic control operations, sectors that have traffic demand below capacity are combined so that fewer controller teams are required to manage air traffic. Controllers in current operations are certified to control a group of six to eight sectors, known as an area of specialization. Sector combinations are restricted to occur within areas of specialization. Since there are few sector combination possibilities in each area of specialization, human supervisors can effectively make sector combination decisions. In the future, automation and procedures will allow any appropriately trained controller to control any of a large set of generic sectors. The primary benefit of this will be increased controller staffing flexibility. Generic sectors will also allow more options for combining sectors, making sector combination decisions difficult for human supervisors. A sector-combining algorithm can assist supervisors as they make generic sector combination decisions. A heuristic algorithm for combining under-utilized airspace sectors to conserve air traffic control resources has been described and analyzed. Analysis of the algorithm and comparisons with operational sector combinations indicate that this algorithm could more efficiently utilize air traffic control resources than current sector combinations. This paper investigates the benefits of using the sector-combining algorithm proposed in previous research to combine high altitude generic airspace sectors. Simulations are conducted in which all the high altitude sectors in a center are allowed to combine, as will be possible in generic high altitude airspace. Furthermore, the algorithm is adjusted to use a version of the simplified dynamic density (SDD) workload metric that has been modified to account for workload reductions due to automatic handoffs and Automatic Dependent Surveillance Broadcast (ADS-B). This modified metric is referred to here as future simplified dynamic density (FSDD). Finally, traffic demand sets with increased air traffic demand are used in the simulations to capture the expected growth in air traffic demand by the mid-term.
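    A hedged sketch of a greedy sector-combining heuristic in the spirit of the one evaluated here (the paper's actual algorithm and workload metric are not reproduced): repeatedly merge the pair of sector groups whose combined workload is lowest and stays below capacity, until no merge is feasible. Sector names and workloads are invented.

    ```python
    # Greedy combination of under-utilized sectors under a capacity limit.
    def combine_sectors(workloads, capacity):
        """workloads: dict sector -> workload. Returns combined sector groups."""
        groups = [[s] for s in workloads]      # start with one group per sector
        load = lambda g: sum(workloads[s] for s in g)
        merged = True
        while merged:
            merged = False
            best = None                        # (combined load, i, j) of best merge
            for i in range(len(groups)):
                for j in range(i + 1, len(groups)):
                    combined = load(groups[i]) + load(groups[j])
                    if combined <= capacity and (best is None or combined < best[0]):
                        best = (combined, i, j)
            if best:
                _, i, j = best
                groups[i] = groups[i] + groups[j]
                del groups[j]
                merged = True
        return groups

    demand = {"S1": 0.3, "S2": 0.2, "S3": 0.6, "S4": 0.4, "S5": 0.1}
    print(combine_sectors(demand, capacity=1.0))
    ```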

  19. Final findings on the development and evaluation of an en-route fuel optimal conflict resolution algorithm to support strategic decision-making.

    DOT National Transportation Integrated Search

    2012-01-01

    The novel strategic conflict-resolution algorithm for fuel minimization that is documented in this report : provides air traffic controllers and/or pilots with fuel-optimal heading, speed, and altitude : recommendations in the en route flight phase, ...

  20. Human-like machines: Transparency and comprehensibility.

    PubMed

    Patrzyk, Piotr M; Link, Daniela; Marewski, Julian N

    2017-01-01

    Artificial intelligence algorithms seek inspiration from human cognitive systems in areas where humans outperform machines. But on what level should algorithms try to approximate human cognition? We argue that human-like machines should be designed to make decisions in transparent and comprehensible ways, which can be achieved by accurately mirroring human cognitive processes.

  1. Does Decision Quality (Always) Increase with the Size of Information Samples? Some Vicissitudes in Applying the Law of Large Numbers

    ERIC Educational Resources Information Center

    Fiedler, Klaus; Kareev, Yaakov

    2006-01-01

    Adaptive decision making requires that contingencies between decision options and their relative assets be assessed accurately and quickly. The present research addresses the challenging notion that contingencies may be more visible from small than from large samples of observations. An algorithmic account for such a seemingly paradoxical effect…

  2. Pattern recognition for passive polarimetric data using nonparametric classifiers

    NASA Astrophysics Data System (ADS)

    Thilak, Vimal; Saini, Jatinder; Voelz, David G.; Creusere, Charles D.

    2005-08-01

    Passive polarization-based imaging is a useful tool in computer vision and pattern recognition. A passive polarization imaging system forms a polarimetric image from the reflection of ambient light that contains useful information for computer vision tasks such as object detection (classification) and recognition. Applications of polarization-based pattern recognition include material classification and automatic shape recognition. In this paper, we present two target detection algorithms for images captured by a passive polarimetric imaging system. The proposed detection algorithms are based on Bayesian decision theory. In these approaches, an object can belong to one of any given number of classes, and classification involves making decisions that minimize the average probability of making incorrect decisions. This minimum is achieved by assigning an object to the class that maximizes the a posteriori probability. Computing a posteriori probabilities requires estimates of class-conditional probability density functions (likelihoods) and prior probabilities. A probabilistic neural network (PNN), a nonparametric method that can compute Bayes-optimal boundaries, and a k-nearest neighbor (KNN) classifier are used for density estimation and classification. The proposed algorithms are applied to polarimetric image data gathered in the laboratory with a liquid crystal-based system. The experimental results validate the effectiveness of the above algorithms for target detection from polarimetric data.
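    A hedged sketch of the decision rule described above: assign a feature vector to the class with maximum a posteriori probability. Here scikit-learn's k-nearest-neighbor classifier stands in for the paper's PNN/KNN density estimators, and the polarimetric features are synthetic.

    ```python
    # Two synthetic classes of polarimetric features; the KNN classifier
    # approximates the MAP decision rule from local density estimates.
    import numpy as np
    from sklearn.neighbors import KNeighborsClassifier

    rng = np.random.default_rng(0)
    target = rng.normal(loc=[0.6, 0.2], scale=0.1, size=(100, 2))
    background = rng.normal(loc=[0.3, 0.4], scale=0.1, size=(100, 2))
    X = np.vstack([target, background])
    y = np.array([1] * 100 + [0] * 100)

    knn = KNeighborsClassifier(n_neighbors=5).fit(X, y)
    print(knn.predict([[0.55, 0.25]]))       # likely class 1 (target)
    print(knn.predict_proba([[0.45, 0.3]]))  # posterior estimates for both classes
    ```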

  3. A novel medical information management and decision model for uncertain demand optimization.

    PubMed

    Bi, Ya

    2015-01-01

    Accurately planning the procurement volume is an effective measure for controlling medicine inventory costs. Due to uncertain demand, it is difficult to make accurate decisions on procurement volume. For biomedicines whose demand is sensitive to time and season, fitting the uncertain demand with fuzzy mathematics is clearly better than using general random distribution functions. The objective is to establish a novel medical information management and decision model for uncertain demand optimization. A novel optimal management and decision model under uncertain demand is presented, based on fuzzy mathematics and a new comprehensively improved particle swarm algorithm. The model can effectively reduce medicine inventory costs. The proposed improved particle swarm optimization is a simple and effective algorithm that improves the fuzzy inference and hence reduces the computational complexity of the optimal management and decision model. The new model can therefore be used for accurate decisions on procurement volume under uncertain demand.

  4. Using Anticipative Malware Analysis to Support Decision Making

    DTIC Science & Technology

    2010-11-01

    specifically, we have designed and implemented a network sandbox, i.e. a sandbox that allows us to study malware behaviour from the network perspective. We...plan to use this sandbox to generate malware-sample profiles that can be used by decision making algorithms to help network administrators and security...also allows the user to specify the network topology to be used. 1 INTRODUCTION Once the presence of a malicious software (malware) threat has been

  5. Modeling Confidence Judgments, Response Times, and Multiple Choices in Decision Making: Recognition Memory and Motion Discrimination

    PubMed Central

    Ratcliff, Roger; Starns, Jeffrey J.

    2014-01-01

    Confidence in judgments is a fundamental aspect of decision making, and tasks that collect confidence judgments are an instantiation of multiple-choice decision making. We present a model for confidence judgments in recognition memory tasks that uses a multiple-choice diffusion decision process with separate accumulators of evidence for the different confidence choices. The accumulator that first reaches its decision boundary determines which choice is made. Five algorithms for accumulating evidence were compared, and one of them produced proportions of responses for each of the choices and full response time distributions for each choice that closely matched empirical data. With this algorithm, an increase in the evidence in one accumulator is accompanied by a decrease in the others so that the total amount of evidence in the system is constant. Application of the model to the data from an earlier experiment (Ratcliff, McKoon, & Tindall, 1994) uncovered a relationship between the shapes of z-transformed receiver operating characteristics and the behavior of response time distributions. Both are explained in the model by the behavior of the decision boundaries. For generality, we also applied the decision model to a 3-choice motion discrimination task and found it accounted for data better than a competing class of models. The confidence model presents a coherent account of confidence judgments and response time that cannot be explained with currently popular signal detection theory analyses or dual-process models of recognition. PMID:23915088
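    The winning accumulation rule can be sketched in a few lines: several accumulators, one per confidence choice, race to a boundary, and each increment to one accumulator is offset by decrements to the others so total evidence is conserved. Drift rates, boundary, and noise below are illustrative assumptions, not the paper's fitted parameters.

    ```python
    # Race of evidence accumulators with conserved total evidence: the first
    # accumulator to reach the boundary determines the choice; the step count
    # serves as a crude response time.
    import random

    def race(drifts, boundary=30.0, noise=1.0):
        acc = [0.0] * len(drifts)
        t = 0
        while max(acc) < boundary:
            t += 1
            i = random.choices(range(len(drifts)), weights=drifts)[0]
            step = abs(random.gauss(1.0, noise))
            acc[i] += step
            for j in range(len(acc)):          # conserve total evidence
                if j != i:
                    acc[j] -= step / (len(acc) - 1)
        return acc.index(max(acc)), t          # (choice, response time)

    random.seed(2)
    print(race([0.5, 0.3, 0.2]))  # choice 0 wins most often, with variable RT
    ```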

  6. Clean birth kits to improve birth practices: development and testing of a country level decision support tool.

    PubMed

    Hundley, Vanora A; Avan, Bilal I; Ahmed, Haris; Graham, Wendy J

    2012-12-19

    Clean birth practices can prevent sepsis, one of the leading causes of both maternal and newborn mortality. Evidence suggests that clean birth kits (CBKs), as part of a package that includes education, are associated with a reduction in newborn mortality, omphalitis, and puerperal sepsis. However, questions remain about how best to approach the introduction of CBKs in-country. We set out to develop a practical decision support tool for programme managers of public health systems who are considering the potential role of CBKs in their strategy for care at birth. Development and testing of the decision support tool was a three-stage process involving an international expert group and country-level testing. In Stage 1, the tool was developed by the Birth Kit Working Group through a review of the evidence, a consensus meeting, drafting of the proposed tool and expert review. In Stage 2, the tool was tested with users through nine interviews and a focus group with federal and provincial level decision makers in Pakistan. In Stage 3, the findings from the country-level testing were reviewed by the expert group. The decision support tool comprised three separate algorithms to guide the policy maker or programme manager through the specific steps required in making the country-level decision about whether to use CBKs. The algorithms were supported by a series of questions (that could be administered by interview, focus group or questionnaire) to help the decision maker identify the information needed. The country-level testing revealed that the decision support tool was easy to follow and helpful in making decisions about the potential role of CBKs. Minor modifications were made and the final algorithms are presented. Testing of the tool with users in Pakistan suggests that the tool facilitates discussion and aids decision making. However, testing in other countries is needed to determine whether these results can be replicated and to identify how the tool can be adapted to meet country-specific needs.

  7. Multi-Parent Clustering Algorithms from Stochastic Grammar Data Models

    NASA Technical Reports Server (NTRS)

    Mjolsness, Eric; Castano, Rebecca; Gray, Alexander

    1999-01-01

    We introduce a statistical data model and an associated optimization-based clustering algorithm which allows data vectors to belong to zero, one or several "parent" clusters. For each data vector the algorithm makes a discrete decision among these alternatives. Thus, a recursive version of this algorithm would place data clusters in a Directed Acyclic Graph rather than a tree. We test the algorithm with synthetic data generated according to the statistical data model. We also illustrate the algorithm using real data from large-scale gene expression assays.

  8. Enhancement of Fast Face Detection Algorithm Based on a Cascade of Decision Trees

    NASA Astrophysics Data System (ADS)

    Khryashchev, V. V.; Lebedev, A. A.; Priorov, A. L.

    2017-05-01

    A face detection algorithm based on a cascade of ensembles of decision trees (CEDT) is presented. The new approach detects faces in positions other than frontal through the use of multiple classifiers, each trained for a specific range of head rotation angles. The results show high performance for CEDT on standard-size images. The algorithm increases the area under the ROC curve by 13% compared to the standard Viola-Jones face detection algorithm. The final realization of the algorithm consists of five different cascades for frontal and non-frontal faces. The simulation results also show the low computational complexity of the CEDT algorithm in comparison with the standard Viola-Jones approach. This could prove important in the embedded system and mobile device industries, because it can reduce hardware costs and extend battery life.

  9. Intelligent deflection routing in buffer-less networks.

    PubMed

    Haeri, Soroush; Trajković, Ljiljana

    2015-02-01

    Deflection routing is employed to ameliorate packet loss caused by contention in buffer-less architectures such as optical burst-switched networks. The main goal of deflection routing is to successfully deflect a packet based only on the limited knowledge that network nodes possess about their environment. In this paper, we present a framework that introduces intelligence to deflection routing (iDef). iDef decouples the design of the signaling infrastructure from the underlying learning algorithm. It consists of a signaling module and a decision-making module. The signaling module implements a feedback management protocol, while the decision-making module implements a reinforcement learning algorithm. We also propose several learning-based deflection routing protocols, implement them in iDef using the ns-3 network simulator, and compare their performance.
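    iDef's ns-3 framework is not reproduced here; the sketch below shows the kind of reinforcement-learning decision module it can host: a node learns by epsilon-greedy, bandit-style updates which neighbour to deflect to, driven by delivered/dropped feedback. The topology and delivery rates are toy assumptions.

    ```python
    # Epsilon-greedy learning of the best deflection neighbour per node.
    import random

    neighbours = ["A", "B", "C"]
    q = {n: 0.0 for n in neighbours}       # learned value of deflecting via each neighbour
    alpha, epsilon = 0.1, 0.1

    def deflect():
        if random.random() < epsilon:      # explore
            return random.choice(neighbours)
        return max(q, key=q.get)           # exploit best-known neighbour

    def feedback(n, delivered):
        """Update after learning whether the deflected packet survived."""
        reward = 1.0 if delivered else -1.0
        q[n] += alpha * (reward - q[n])

    random.seed(3)
    # Simulate: neighbour B delivers 90% of deflected packets, A and C only 40%.
    success = {"A": 0.4, "B": 0.9, "C": 0.4}
    for _ in range(2000):
        n = deflect()
        feedback(n, random.random() < success[n])
    print(q)  # q["B"] should be clearly highest
    ```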

  10. How do small groups make decisions? : A theoretical framework to inform the implementation and study of clinical competency committees.

    PubMed

    Chahine, Saad; Cristancho, Sayra; Padgett, Jessica; Lingard, Lorelei

    2017-06-01

    In the competency-based medical education (CBME) approach, clinical competency committees are responsible for making decisions about trainees' competence. However, we currently lack a theoretical model for group decision-making to inform this emerging assessment phenomenon. This paper proposes an organizing framework to study and guide the decision-making processes of clinical competency committees. This is an explanatory, non-exhaustive review, tailored to identify relevant theoretical and evidence-based papers related to small group decision-making. The search was conducted using Google Scholar, Web of Science, MEDLINE, ERIC, and PsycINFO for relevant literature. Using a thematic analysis, two researchers (SC & JP) met four times between April-June 2016 to consolidate the literature included in this review. Three theoretical orientations towards group decision-making emerged from the review: schema, constructivist, and social influence. Schema orientations focus on how groups use algorithms for decision-making. Constructivist orientations focus on how groups construct their shared understanding. Social influence orientations focus on how individual members influence the group's perspective on a decision. Moderators of decision-making relevant to all orientations include: guidelines, stressors, authority, and leadership. Clinical competency committees are the mechanisms by which groups of clinicians will be in charge of interpreting multiple assessment data points and coming to a shared decision about trainee competence. The way in which these committees make decisions can have huge implications for trainee progression and, ultimately, patient care. Therefore, there is a pressing need to build the science of how such group decision-making works in practice. This synthesis suggests a preliminary organizing framework that can be used in the implementation and study of clinical competency committees.

  11. Noise, cost and speed-accuracy trade-offs: decision-making in a decentralized system

    PubMed Central

    Marshall, James A.R.; Dornhaus, Anna; Franks, Nigel R.; Kovacs, Tim

    2005-01-01

    Many natural and artificial decision-making systems face decision problems where there is an inherent compromise between two or more objectives. One such common compromise is between the speed and accuracy of a decision. The ability to exploit the characteristics of a decision problem in order to vary between the extremes of making maximally rapid, or maximally accurate decisions, is a useful property of such systems. Colonies of the ant Temnothorax albipennis (formerly Leptothorax albipennis) are a paradigmatic decentralized decision-making system, and have been shown flexibly to compromise accuracy for speed when making decisions during house-hunting. During emigration, a colony must typically evaluate and choose between several possible alternative new nest sites of differing quality. In this paper, we examine this speed-accuracy trade-off through modelling, and conclude that noise and time-cost of assessing alternative choices are likely to be significant for T. albipennis. Noise and cost of such assessments are likely to mean that T. albipennis' decision-making mechanism is Pareto-optimal in one crucial regard; increasing the willingness of individuals to change their decisions cannot improve collective accuracy overall without impairing speed. We propose that a decentralized control algorithm based on this emigration behaviour may be derived for applications in engineering domains and specify the characteristics of the problems to which it should be suited, based on our new results. PMID:16849234

  12. Decision making based on data analysis and optimization algorithm applied for cogeneration systems integration into a grid

    NASA Astrophysics Data System (ADS)

    Asmar, Joseph Al; Lahoud, Chawki; Brouche, Marwan

    2018-05-01

    Cogeneration and trigeneration systems can contribute to the reduction of primary energy consumption and greenhouse gas emissions in the residential and tertiary sectors, by reducing fossil fuel demand and grid losses with respect to conventional systems. Cogeneration systems are characterized by very high energy efficiency (80 to 90%) and are less polluting than conventional energy production. The integration of these systems into the energy network must simultaneously take into account their economic and environmental challenges. This paper introduces a decision-making strategy in two parts: the first is based on a multi-objective optimization tool with data analysis, and the second on an optimization algorithm. The power dispatching of the Lebanese electricity grid is then simulated and considered as a case study to verify the compatibility of the cogeneration power calculated by our decision-making technique. In addition, the thermal energy produced by the cogeneration systems whose capacity is selected by our technique is shown to be compatible with the thermal demand for district heating.

  13. A controllable sensor management algorithm capable of learning

    NASA Astrophysics Data System (ADS)

    Osadciw, Lisa A.; Veeramacheneni, Kalyan K.

    2005-03-01

    Sensor management technology progress is challenged by the geographic space it spans, the heterogeneity of the sensors, and the real-time timeframes within which plans controlling the assets are executed. This paper presents a new sensor management paradigm and demonstrates its application in a sensor management algorithm designed for a biometric access control system. This approach consists of an artificial intelligence (AI) algorithm focused on uncertainty measures, which makes the high level decisions to reduce uncertainties and interfaces with the user, integrated cohesively with a bottom up evolutionary algorithm, which optimizes the sensor network's operation as determined by the AI algorithm. The sensor management algorithm presented is composed of a Bayesian network, the AI algorithm component, and a swarm optimization algorithm, the evolutionary algorithm. Thus, the algorithm can change its own performance goals in real-time and will modify its own decisions based on observed measures within the sensor network. The definition of the measures as well as the Bayesian network determine the robustness of the algorithm and its utility in reacting dynamically to changes in the global system.

  14. Robust Satisficing Decision Making for Unmanned Aerial Vehicle Complex Missions under Severe Uncertainty

    PubMed Central

    Ji, Xiaoting; Niu, Yifeng; Shen, Lincheng

    2016-01-01

    This paper presents a robust satisficing decision-making method for Unmanned Aerial Vehicles (UAVs) executing complex missions in an uncertain environment. Motivated by the info-gap decision theory, we formulate this problem as a novel robust satisficing optimization problem, of which the objective is to maximize the robustness while satisfying some desired mission requirements. Specifically, a new info-gap based Markov Decision Process (IMDP) is constructed to abstract the uncertain UAV system and specify the complex mission requirements with the Linear Temporal Logic (LTL). A robust satisficing policy is obtained to maximize the robustness to the uncertain IMDP while ensuring a desired probability of satisfying the LTL specifications. To this end, we propose a two-stage robust satisficing solution strategy which consists of the construction of a product IMDP and the generation of a robust satisficing policy. In the first stage, a product IMDP is constructed by combining the IMDP with an automaton representing the LTL specifications. In the second, an algorithm based on robust dynamic programming is proposed to generate a robust satisficing policy, while an associated robustness evaluation algorithm is presented to evaluate the robustness. Finally, through Monte Carlo simulation, the effectiveness of our algorithms is demonstrated on a UAV search mission under severe uncertainty so that the resulting policy can maximize the robustness while reaching the desired performance level. Furthermore, by comparing the proposed method with other robust decision-making methods, it can be concluded that our policy can tolerate higher uncertainty so that the desired performance level can be guaranteed, which indicates that the proposed method is much more effective in real applications. PMID:27835670

  15. Robust Satisficing Decision Making for Unmanned Aerial Vehicle Complex Missions under Severe Uncertainty.

    PubMed

    Ji, Xiaoting; Niu, Yifeng; Shen, Lincheng

    2016-01-01

    This paper presents a robust satisficing decision-making method for Unmanned Aerial Vehicles (UAVs) executing complex missions in an uncertain environment. Motivated by the info-gap decision theory, we formulate this problem as a novel robust satisficing optimization problem, of which the objective is to maximize the robustness while satisfying some desired mission requirements. Specifically, a new info-gap based Markov Decision Process (IMDP) is constructed to abstract the uncertain UAV system and specify the complex mission requirements with the Linear Temporal Logic (LTL). A robust satisficing policy is obtained to maximize the robustness to the uncertain IMDP while ensuring a desired probability of satisfying the LTL specifications. To this end, we propose a two-stage robust satisficing solution strategy which consists of the construction of a product IMDP and the generation of a robust satisficing policy. In the first stage, a product IMDP is constructed by combining the IMDP with an automaton representing the LTL specifications. In the second, an algorithm based on robust dynamic programming is proposed to generate a robust satisficing policy, while an associated robustness evaluation algorithm is presented to evaluate the robustness. Finally, through Monte Carlo simulation, the effectiveness of our algorithms is demonstrated on a UAV search mission under severe uncertainty so that the resulting policy can maximize the robustness while reaching the desired performance level. Furthermore, by comparing the proposed method with other robust decision-making methods, it can be concluded that our policy can tolerate higher uncertainty so that the desired performance level can be guaranteed, which indicates that the proposed method is much more effective in real applications.

  16. Revisiting the age of enlightenment from a collective decision making systems perspective

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rodriguez, Marko A; Watkins, Jennifer H

    2009-01-01

    The ideals of the eighteenth century's Age of Enlightenment are the foundation of modern democracies. The era was characterized by thinkers who promoted progressive social reforms that opposed the long-established aristocracies and monarchies of the time. Prominent examples of such reforms include the establishment of inalienable human rights, self-governing republics, and market capitalism. Twenty-first century democratic nations can benefit from revisiting the systems developed during the Enlightenment and reframing them within the techno-social context of the Information Age. This article explores the application of social algorithms that make use of Thomas Paine's (English: 1737--1809) representatives, Adam Smith's (Scottish: 1723--1790) self-interested actors, and Marquis de Condorcet's (French: 1743--1794) optimal decision making groups. It is posited that technology-enabled social algorithms can better realize the ideals articulated during the Enlightenment.
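    Condorcet's contribution referenced here is the jury theorem: if each voter is independently correct with probability p greater than 0.5, the probability that the majority is correct grows toward 1 as the group grows. A direct computation of the majority probability:

    ```python
    # Condorcet jury theorem: probability that a majority of n independent
    # voters (n odd), each correct with probability p, reaches the correct decision.
    from math import comb

    def majority_correct(n, p):
        return sum(comb(n, k) * p**k * (1 - p)**(n - k)
                   for k in range((n // 2) + 1, n + 1))

    for n in (1, 11, 101):
        print(n, round(majority_correct(n, 0.6), 4))
    # 1 -> 0.6, 11 -> ~0.75, 101 -> ~0.98: group accuracy rises with group size.
    ```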

  17. Fuzzy bilevel programming with multiple non-cooperative followers: model, algorithm and application

    NASA Astrophysics Data System (ADS)

    Ke, Hua; Huang, Hu; Ralescu, Dan A.; Wang, Lei

    2016-04-01

    In centralized decision problems, it is not complicated for decision-makers to make modelling technique selections under uncertainty. When a decentralized decision problem is considered, however, choosing appropriate models is no longer easy due to the difficulty in estimating the other decision-makers' inconclusive decision criteria. These decision criteria may vary with different decision-makers because of their particular risk tolerances and management requirements. Considering the general differences among the decision-makers in decentralized systems, we propose a general framework of fuzzy bilevel programming including hybrid models (integrated with different modelling methods at different levels). Specifically, we discuss two of these models which may have wide applications in many fields. Furthermore, we apply the two proposed models to formulate a pricing decision problem in a decentralized supply chain with fuzzy coefficients. To solve these models, a hybrid intelligent algorithm integrating fuzzy simulation, a neural network and particle swarm optimization based on a penalty function approach is designed. Some suggestions on the applications of these models are also presented.

  18. Using multicriteria decision analysis during drug development to predict reimbursement decisions.

    PubMed

    Williams, Paul; Mauskopf, Josephine; Lebiecki, Jake; Kilburg, Anne

    2014-01-01

    Pharmaceutical companies design clinical development programs to generate the data that they believe will support reimbursement for the experimental compound. The objective of the study was to present a process for using multicriteria decision analysis (MCDA) by a pharmaceutical company to estimate the probability of a positive recommendation for reimbursement for a new drug given drug and environmental attributes. The MCDA process included 1) selection of decision makers who were representative of those making reimbursement decisions in a specific country; 2) two pre-workshop questionnaires to identify the most important attributes and their relative importance for a positive recommendation for a new drug; 3) a 1-day workshop during which participants undertook three tasks: i) they agreed on a final list of decision attributes and their importance weights, ii) they developed level descriptions for these attributes and mapped each attribute level to a value function, and iii) they developed profiles for hypothetical products 'just likely to be reimbursed'; and 4) use of the data from the workshop to develop a prediction algorithm based on a logistic regression analysis. The MCDA process is illustrated using case studies for three countries, the United Kingdom, Germany, and Spain. The extent to which the prediction algorithms for each country captured the decision processes for the workshop participants in our case studies was tested using a post-meeting questionnaire that asked the participants to make recommendations for a set of hypothetical products. The data collected in the case study workshops resulted in a prediction algorithm: 1) for the United Kingdom, the probability of a positive recommendation for different ranges of cost-effectiveness ratios; 2) for Spain, the probability of a positive recommendation at the national and regional levels; and 3) for Germany, the probability of a determination of clinical benefit. The results from the post-meeting questionnaire revealed a high predictive value for the algorithm developed using MCDA. Prediction algorithms developed using MCDA could be used by pharmaceutical companies when designing their clinical development programs to estimate the likelihood of a favourable reimbursement recommendation for different product profiles and for different positions in the treatment pathway.
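    Step 4 of the process, fitting a logistic regression that maps a product profile's attribute scores to the probability of a positive recommendation, can be sketched as follows; the profiles, attributes, and workshop decisions below are invented stand-ins for the paper's country-specific data.

    ```python
    # Fit a logistic regression on hypothetical workshop decisions, then
    # estimate the reimbursement probability for a new product profile.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Columns: clinical benefit score, cost-effectiveness score, unmet-need score.
    profiles = np.array([
        [0.9, 0.8, 0.7], [0.8, 0.6, 0.9], [0.4, 0.3, 0.5],
        [0.2, 0.4, 0.3], [0.7, 0.7, 0.6], [0.3, 0.2, 0.4],
    ])
    recommended = np.array([1, 1, 0, 0, 1, 0])  # invented workshop "decisions"

    model = LogisticRegression().fit(profiles, recommended)
    new_profile = np.array([[0.6, 0.5, 0.8]])
    print(model.predict_proba(new_profile)[0, 1])  # P(positive recommendation)
    ```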

  19. Using multicriteria decision analysis during drug development to predict reimbursement decisions

    PubMed Central

    Williams, Paul; Mauskopf, Josephine; Lebiecki, Jake; Kilburg, Anne

    2014-01-01

    Background Pharmaceutical companies design clinical development programs to generate the data that they believe will support reimbursement for the experimental compound. Objective The objective of the study was to present a process for using multicriteria decision analysis (MCDA) by a pharmaceutical company to estimate the probability of a positive recommendation for reimbursement for a new drug given drug and environmental attributes. Methods The MCDA process included 1) selection of decision makers who were representative of those making reimbursement decisions in a specific country; 2) two pre-workshop questionnaires to identify the most important attributes and their relative importance for a positive recommendation for a new drug; 3) a 1-day workshop during which participants undertook three tasks: i) they agreed on a final list of decision attributes and their importance weights, ii) they developed level descriptions for these attributes and mapped each attribute level to a value function, and iii) they developed profiles for hypothetical products ‘just likely to be reimbursed’; and 4) use of the data from the workshop to develop a prediction algorithm based on a logistic regression analysis. The MCDA process is illustrated using case studies for three countries, the United Kingdom, Germany, and Spain. The extent to which the prediction algorithms for each country captured the decision processes for the workshop participants in our case studies was tested using a post-meeting questionnaire that asked the participants to make recommendations for a set of hypothetical products. Results The data collected in the case study workshops resulted in a prediction algorithm: 1) for the United Kingdom, the probability of a positive recommendation for different ranges of cost-effectiveness ratios; 2) for Spain, the probability of a positive recommendation at the national and regional levels; and 3) for Germany, the probability of a determination of clinical benefit. The results from the post-meeting questionnaire revealed a high predictive value for the algorithm developed using MCDA. Conclusions Prediction algorithms developed using MCDA could be used by pharmaceutical companies when designing their clinical development programs to estimate the likelihood of a favourable reimbursement recommendation for different product profiles and for different positions in the treatment pathway.

  20. Implementation of an Evidence-Based Seizure Algorithm in Intellectual Disability Nursing: A Pilot Study

    ERIC Educational Resources Information Center

    Auberry, Kathy; Cullen, Deborah

    2016-01-01

    Based on the results of the Surrogate Decision-Making Self Efficacy Scale (Lopez, 2009a), this study sought to determine whether nurses working in the field of intellectual disability (ID) experience increased confidence when they implemented the American Association of Neuroscience Nurses (AANN) Seizure Algorithm during telephone triage. The…

  1. Functional specialization of the primate frontal cortex during decision making.

    PubMed

    Lee, Daeyeol; Rushworth, Matthew F S; Walton, Mark E; Watanabe, Masataka; Sakagami, Masamichi

    2007-08-01

    Economic theories of decision making are based on the principle of utility maximization, and reinforcement-learning theory provides computational algorithms that can be used to estimate the overall reward expected from alternative choices. These formal models not only account for a large range of behavioral observations in human and animal decision makers, but also provide useful tools for investigating the neural basis of decision making. Nevertheless, in reality, decision makers must combine different types of information about the costs and benefits associated with each available option, such as the quality and quantity of expected reward and required work. In this article, we put forward the hypothesis that different subdivisions of the primate frontal cortex may be specialized to focus on different aspects of dynamic decision-making processes. In this hypothesis, the lateral prefrontal cortex is primarily involved in maintaining the state representation necessary to identify optimal actions in a given environment. In contrast, the orbitofrontal cortex and the anterior cingulate cortex might be primarily involved in encoding and updating the utilities associated with different sensory stimuli and alternative actions, respectively. These cortical areas are also likely to contribute to decision making in a social context.
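    A minimal instance of the reinforcement-learning account referred to here is a delta-rule value update combined with softmax action selection; the learning rate, inverse temperature, and payoff probabilities below are illustrative assumptions.

    ```python
    # Track the expected reward of each option with a delta-rule update and
    # choose probabilistically via softmax over the learned values.
    import math, random

    values = [0.0, 0.0]        # expected reward of two options
    alpha, beta = 0.1, 3.0     # learning rate, inverse temperature
    payoff_prob = [0.3, 0.7]   # option 1 pays off more often

    def choose():
        exps = [math.exp(beta * v) for v in values]
        z = sum(exps)
        return random.choices([0, 1], weights=[e / z for e in exps])[0]

    random.seed(4)
    for _ in range(500):
        a = choose()
        r = 1.0 if random.random() < payoff_prob[a] else 0.0
        values[a] += alpha * (r - values[a])   # delta-rule update
    print(values)  # values[1] should approach ~0.7
    ```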

  2. Primary Repair of Moderate Severity Rhegmatogenous Retinal Detachment: A Critical Decision-Making Algorithm.

    PubMed

    Velez-Montoya, Raul; Jacobo-Oceguera, Paola; Flores-Preciado, Javier; Dalma-Weiszhausz, Jose; Guerrero-Naranjo, Jose; Salcedo-Villanueva, Guillermo; Garcia-Aguirre, Gerardo; Fromow-Guerra, Jans; Morales-Canton, Virgilio

    2016-01-01

    We reviewed all the available data regarding the current management of non-complex rhegmatogenous retinal detachment and propose a new decision-making algorithm intended to improve the single-surgery success rate for mid-severity rhegmatogenous retinal detachment. An online review of the PubMed database was performed. We searched for all available manuscripts on the anatomical and functional outcomes after the surgical management of retinal detachment, by either scleral buckle or primary pars plana vitrectomy. The search was limited to articles published from January 1995 to December 2015. All articles obtained from the search were carefully screened, and their references were manually reviewed for additional relevant data. Our search specifically focused on preoperative clinical data associated with the surgical outcomes. After categorizing the available data according to level of evidence, with randomized controlled clinical trials as the highest possible level, followed by retrospective studies, and retrospective case series as the lowest, we proceeded to design a logical decision-making algorithm, enhanced by our experience as retinal surgeons. A total of 7 randomized controlled clinical trials, 19 retrospective studies, and 9 case series were considered. Additional articles were also included to support the observations further. Rhegmatogenous retinal detachment is a potentially blinding disorder. Its surgical management seems to depend more on a surgeon's preference than on solid scientific data or a good clinical history and examination. The algorithms proposed herein strive to offer a more rational approach to improve both anatomical and functional outcomes after the first surgery.

  3. Primary Repair of Moderate Severity Rhegmatogenous Retinal Detachment: A Critical Decision-Making Algorithm

    PubMed Central

    VELEZ-MONTOYA, Raul; JACOBO-OCEGUERA, Paola; FLORES-PRECIADO, Javier; DALMA-WEISZHAUSZ, Jose; GUERRERO-NARANJO, Jose; SALCEDO-VILLANUEVA, Guillermo; GARCIA-AGUIRRE, Gerardo; FROMOW-GUERRA, Jans; MORALES-CANTON, Virgilio

    2016-01-01

    We reviewed all available data on the current management of non-complex rhegmatogenous retinal detachment and propose a new decision-making algorithm intended to improve the single-surgery success rate for mid-severity rhegmatogenous retinal detachment. An online review of the PubMed database was performed. We searched for all available manuscripts on the anatomical and functional outcomes of the surgical management of retinal detachment, by either scleral buckle or primary pars plana vitrectomy. The search was limited to articles published from January 1995 to December 2015. All articles obtained from the search were carefully screened, and their references were manually reviewed for additional relevant data. Our search focused specifically on preoperative clinical data associated with surgical outcomes. After categorizing the available data by level of evidence, with randomized controlled clinical trials as the highest level, followed by retrospective studies, and retrospective case series as the lowest, we designed a logical decision-making algorithm, refined by our experience as retinal surgeons. A total of 7 randomized controlled clinical trials, 19 retrospective studies, and 9 case series were considered. Additional articles were included to further support the observations. Rhegmatogenous retinal detachment is a potentially blinding disorder. Its surgical management often seems to depend more on a surgeon's preference than on solid scientific data or a thorough clinical history and examination. The algorithms proposed herein strive to offer a more rational approach to improving both anatomical and functional outcomes after the first surgery. PMID:28289689

  4. A wavelet-based adaptive fusion algorithm of infrared polarization imaging

    NASA Astrophysics Data System (ADS)

    Yang, Wei; Gu, Guohua; Chen, Qian; Zeng, Haifang

    2011-08-01

    The purpose of infrared polarization imaging is to highlight man-made targets against complex natural backgrounds. Because infrared polarization images distinguish targets from backgrounds by their differing features, this paper presents a wavelet-based infrared polarization image fusion algorithm. The method concentrates on the high-frequency portion of the signal; the low-frequency portion is fused with a conventional weighted average. The high-frequency part is processed as follows: first, high-frequency information is extracted from each source image by wavelet transform; then the signal strength within a 3*3 window is calculated, and the ratio of regional signal intensities between the source images serves as a matching measure. How the detail coefficients are extracted and selected is determined by a decision-making module, and the fusion quality depends closely on that module's threshold. Instead of the commonly used trial-and-error approach, a quadratic interpolation optimization algorithm is proposed to obtain the threshold: the endpoints and midpoint of the threshold search interval serve as initial interpolation nodes, the minimum of the quadratic interpolant is computed, and the best threshold is obtained by comparing these minima. A series of image quality evaluations shows that the method improves fusion quality, and that it is effective not only for individual images but also across a large number of images.
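
    A minimal sketch of this style of fusion, assuming PyWavelets and SciPy are available: one decomposition level, a 3x3 regional signal strength, and a fixed threshold standing in for the paper's quadratic-interpolation-optimized one (the normalized match measure used here is a common textbook choice, not necessarily the authors' intensity ratio):

    ```python
    import numpy as np
    import pywt
    from scipy.ndimage import uniform_filter

    def fuse(img_a, img_b, wavelet="db2", threshold=0.6):
        """Fuse two registered, equal-size images: weighted average for the
        low-frequency band, region-based selection for the high-frequency bands."""
        cA_a, details_a = pywt.dwt2(img_a, wavelet)
        cA_b, details_b = pywt.dwt2(img_b, wavelet)
        cA = 0.5 * cA_a + 0.5 * cA_b                  # low frequency: weighted average
        fused_details = []
        for d_a, d_b in zip(details_a, details_b):
            # Regional signal strength: local mean of squared coefficients (3x3 window).
            e_a = uniform_filter(d_a ** 2, size=3)
            e_b = uniform_filter(d_b ** 2, size=3)
            # Match measure in [0, 1]: how similar the two regions are.
            match = 2 * np.sqrt(e_a * e_b) / (e_a + e_b + 1e-12)
            select = np.where(e_a >= e_b, d_a, d_b)   # pick the stronger source
            average = 0.5 * (d_a + d_b)
            # Dissimilar regions (below threshold): select; similar ones: average.
            fused_details.append(np.where(match < threshold, select, average))
        return pywt.idwt2((cA, tuple(fused_details)), wavelet)

    a = np.random.rand(64, 64)
    b = np.random.rand(64, 64)
    print(fuse(a, b).shape)   # (64, 64)
    ```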

  5. Prediction of insemination outcomes in Holstein dairy cattle using alternative machine learning algorithms.

    PubMed

    Shahinfar, Saleh; Page, David; Guenther, Jerry; Cabrera, Victor; Fricke, Paul; Weigel, Kent

    2014-02-01

    When making the decision about whether or not to breed a given cow, knowledge about the expected outcome would have an economic impact on the profitability of the breeding program and the net income of the farm. The outcome of each breeding can be affected by many management and physiological factors that vary between farms and interact with each other. Hence, the ability of machine learning algorithms to accommodate complex relationships in the data and missing values for explanatory variables makes these algorithms well suited for investigating reproductive performance in dairy cattle. The objective of this study was to develop a user-friendly and intuitive on-farm tool to help farmers make reproduction management decisions. Several different machine learning algorithms were applied to predict the insemination outcomes of individual cows based on phenotypic and genotypic data. Data from 26 dairy farms in the Alta Genetics (Watertown, WI) Advantage Progeny Testing Program were used, representing a 10-yr period from 2000 to 2010. Health, reproduction, and production data were extracted from on-farm dairy management software, and estimated breeding values were downloaded from the US Department of Agriculture Agricultural Research Service Animal Improvement Programs Laboratory (Beltsville, MD) database. The edited data set consisted of 129,245 breeding records from primiparous Holstein cows and 195,128 breeding records from multiparous Holstein cows. Each data point in the final data set included 23 (primiparous) or 25 (multiparous) explanatory variables and 1 binary outcome; the best-performing algorithm achieved predictive performance of 0.756 ± 0.005 and 0.736 ± 0.005 for primiparous and multiparous cows, respectively. The naïve Bayes algorithm, Bayesian network, and decision tree algorithms showed somewhat poorer classification performance. An information-based variable selection procedure identified herd average conception rate, incidence of ketosis, number of previous (failed) inseminations, days in milk at breeding, and mastitis as the most effective explanatory variables in predicting pregnancy outcome. Copyright © 2014 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
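
    The kind of algorithm comparison described can be sketched with scikit-learn on synthetic stand-in data (the real farm records are not public; the feature count mirrors the abstract, everything else is assumed):

    ```python
    from sklearn.datasets import make_classification
    from sklearn.model_selection import cross_val_score
    from sklearn.naive_bayes import GaussianNB
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.ensemble import RandomForestClassifier

    # Stand-in data: 25 explanatory variables, 1 binary insemination outcome.
    X, y = make_classification(n_samples=5000, n_features=25, n_informative=8,
                               weights=[0.65, 0.35], random_state=1)

    # Cross-validated AUC for three algorithm families named in the abstract.
    for name, clf in [("naive Bayes", GaussianNB()),
                      ("decision tree", DecisionTreeClassifier(max_depth=6)),
                      ("random forest", RandomForestClassifier(n_estimators=200))]:
        auc = cross_val_score(clf, X, y, cv=5, scoring="roc_auc")
        print(f"{name:14s} AUC = {auc.mean():.3f} +/- {auc.std():.3f}")
    ```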

  6. Merkel cell carcinoma: An algorithm for multidisciplinary management and decision-making.

    PubMed

    Prieto, Isabel; Pérez de la Fuente, Teresa; Medina, Susana; Castelo, Beatriz; Sobrino, Beatriz; Fortes, Jose R; Esteban, David; Cassinello, Fernando; Jover, Raquel; Rodríguez, Nuria

    2016-02-01

    Merkel cell carcinoma (MCC) is a rare and aggressive neuroendocrine tumor of the skin. Therapeutic approach is often unclear, and considerable controversy exists regarding MCC pathogenesis and optimal management. Due to its rising incidence and poor prognosis, it is imperative to establish the optimal therapy for both the tumor and the lymph node basin, and for treatment to include sentinel node biopsy. Sentinel node biopsy is currently the most consistent predictor of survival for MCC patients, although there are conflicting views and a lack of awareness regarding node management. Tumor and node management involve different specialists, and their respective decisions and interventions are interrelated. No effective systemic treatment has been made available to date, and therefore patients continue to experience distant failure, often without local failure. This review aims to improve multidisciplinary decision-making by presenting scientific evidence of the contributions of each team member implicated in MCC management. Following this review of previously published research, the authors conclude that multidisciplinary team management is beneficial for care, and propose a multidisciplinary decision algorithm for managing this tumor. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  7. Automated power management and control

    NASA Technical Reports Server (NTRS)

    Dolce, James L.

    1991-01-01

    A comprehensive automation design is being developed for Space Station Freedom's electric power system. A joint effort between NASA's Office of Aeronautics and Exploration Technology and NASA's Office of Space Station Freedom, it strives to increase station productivity by applying expert systems and conventional algorithms to automate power system operation. The initial station operation will use ground-based dispatchers to perform the necessary command and control tasks. These tasks constitute planning and decision-making activities that strive to eliminate unplanned outages. We perceive an opportunity to help these dispatchers make fast and consistent on-line decisions by automating three key tasks: failure detection and diagnosis, resource scheduling, and security analysis. Expert systems will be used for the diagnostics and for the security analysis; conventional algorithms will be used for the resource scheduling.

  8. A Decision Processing Algorithm for CDC Location Under Minimum Cost SCM Network

    NASA Astrophysics Data System (ADS)

    Park, N. K.; Kim, J. Y.; Choi, W. Y.; Tian, Z. M.; Kim, D. J.

    The location of a CDC within a supply-chain network is of growing concern. Current methods for siting a CDC rely mainly on manual spreadsheet calculations aimed at minimising logistics cost. This study focuses on the development of a new processing algorithm that overcomes the limits of present methods, and examines the suitability of this algorithm through a case study. The suggested algorithm is based on the principle of optimization over a directed graph of the SCM model, utilizing classical techniques such as minimum spanning trees (MST) and shortest-path finding. The results help assess the suitability of an existing SCM network and can serve as a criterion in the decision-making process for building an optimal SCM network for future demand.
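
    A toy version of the shortest-path evaluation step, using networkx on a hypothetical directed SCM graph (node names and costs are invented for illustration; the study's full algorithm is not reproduced):

    ```python
    import networkx as nx

    # Hypothetical directed SCM network: edge weights are unit logistics costs.
    g = nx.DiGraph()
    g.add_weighted_edges_from([
        ("factory", "cdc1", 4), ("factory", "cdc2", 6),
        ("cdc1", "region_a", 3), ("cdc1", "region_b", 7),
        ("cdc2", "region_a", 5), ("cdc2", "region_b", 2),
    ])

    demand_nodes = ["region_a", "region_b"]

    # Score each candidate CDC by total shortest-path cost factory -> CDC -> demand.
    for cdc in ["cdc1", "cdc2"]:
        inbound = nx.shortest_path_length(g, "factory", cdc, weight="weight")
        outbound = sum(nx.shortest_path_length(g, cdc, d, weight="weight")
                       for d in demand_nodes)
        print(cdc, "total cost:", inbound + outbound)
    ```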

  9. Tactical assessment in a squad of intelligent bots

    NASA Astrophysics Data System (ADS)

    Gołuński, Marcel; Wasiewicz, Piotr

    2010-09-01

    In this paper we explore the problem of communication and coordination in a team of intelligent game bots (also known as embodied agents). We present a tactical decision-making system controlling the behavior of an autonomous bot, followed by the concept of a team-level tactical decision-making system controlling a team of intelligent bots. The algorithms introduced have been implemented in the Java language using the Pogamut 2 framework, which interfaces the bot logic with the Unreal Tournament 2004 virtual environment.

  10. An EGR performance evaluation and decision-making approach based on grey theory and grey entropy analysis

    PubMed Central

    2018-01-01

    Exhaust gas recirculation (EGR) is one of the main methods of reducing NOx emissions and has been widely used in marine diesel engines. This paper proposes an optimized comprehensive assessment method based on multi-objective grey situation decision theory, grey relational theory, and grey entropy analysis to evaluate EGR performance and determine the optimal EGR rate, tasks that currently lack clear theoretical guidance. First, multi-objective grey situation decision theory is used to establish the initial decision-making model according to the main EGR parameters, transforming the optimal compromise between diesel engine combustion and emission performance into a decision-making target-weight problem. After establishing the initial model and considering the characteristics of EGR under different conditions, an optimized target-weight algorithm based on grey relational theory and grey entropy analysis is applied to generate the comprehensive evaluation and decision-making model. Finally, the proposed method is successfully applied to a TBD234V12 turbocharged diesel engine, and the results clearly illustrate the feasibility of the proposed method for providing theoretical support and a reference for further EGR optimization. PMID:29377956

  11. An EGR performance evaluation and decision-making approach based on grey theory and grey entropy analysis.

    PubMed

    Zu, Xianghuan; Yang, Chuanlei; Wang, Hechun; Wang, Yinyan

    2018-01-01

    Exhaust gas recirculation (EGR) is one of the main methods of reducing NOx emissions and has been widely used in marine diesel engines. This paper proposes an optimized comprehensive assessment method based on multi-objective grey situation decision theory, grey relational theory, and grey entropy analysis to evaluate EGR performance and determine the optimal EGR rate, tasks that currently lack clear theoretical guidance. First, multi-objective grey situation decision theory is used to establish the initial decision-making model according to the main EGR parameters, transforming the optimal compromise between diesel engine combustion and emission performance into a decision-making target-weight problem. After establishing the initial model and considering the characteristics of EGR under different conditions, an optimized target-weight algorithm based on grey relational theory and grey entropy analysis is applied to generate the comprehensive evaluation and decision-making model. Finally, the proposed method is successfully applied to a TBD234V12 turbocharged diesel engine, and the results clearly illustrate the feasibility of the proposed method for providing theoretical support and a reference for further EGR optimization.
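
    A compact sketch of the two building blocks named above, a grey relational grade computed with entropy-derived target weights (the indicator matrix is hypothetical, and the paper's full multi-objective grey situation model is not reproduced):

    ```python
    import numpy as np

    # Rows: candidate EGR rates; columns: performance indicators (e.g. NOx, soot,
    # fuel consumption) normalized so that larger is better. Values are invented.
    x = np.array([[0.90, 0.55, 0.70],
                  [0.75, 0.80, 0.65],
                  [0.60, 0.95, 0.85]])

    # Entropy-based criterion weights: criteria whose values vary more weigh more.
    p = x / x.sum(axis=0)
    e = -(p * np.log(p)).sum(axis=0) / np.log(x.shape[0])
    w = (1 - e) / (1 - e).sum()

    # Grey relational coefficients against the ideal alternative, resolution rho = 0.5.
    ref = x.max(axis=0)
    delta = np.abs(x - ref)
    rho = 0.5
    xi = (delta.min() + rho * delta.max()) / (delta + rho * delta.max())

    grade = xi @ w                       # grey relational grade per alternative
    print("weights:", np.round(w, 3))
    print("best alternative:", grade.argmax(), np.round(grade, 3))
    ```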

  12. Multi-alternative decision-making with non-stationary inputs.

    PubMed

    Nunes, Luana F; Gurney, Kevin

    2016-08-01

    One of the most widely implemented models for multi-alternative decision-making is the multihypothesis sequential probability ratio test (MSPRT). It is asymptotically optimal, straightforward to implement, and has found application in modelling biological decision-making. However, the MSPRT is limited in application to discrete ('trial-based'), non-time-varying scenarios. By contrast, real world situations will be continuous and entail stimulus non-stationarity. In these circumstances, decision-making mechanisms (like the MSPRT) which work by accumulating evidence, must be able to discard outdated evidence which becomes progressively irrelevant. To address this issue, we introduce a new decision mechanism by augmenting the MSPRT with a rectangular integration window and a transparent decision boundary. This allows selection and de-selection of options as their evidence changes dynamically. Performance was enhanced by adapting the window size to problem difficulty. Further, we present an alternative windowing method which exponentially decays evidence and does not significantly degrade performance, while greatly reducing the memory resources necessary. The methods presented have proven successful at allowing for the MSPRT algorithm to function in a non-stationary environment.
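
    The windowed variant can be sketched as follows, assuming unit-variance Gaussian hypotheses (the paper's adaptive window sizing and exponential-decay alternative are omitted; all parameter values are arbitrary):

    ```python
    import numpy as np
    from collections import deque

    rng = np.random.default_rng(2)

    means = np.array([-1.0, 0.0, 1.0])  # three hypotheses: unit-variance Gaussian sources
    window = deque(maxlen=30)           # rectangular window: old evidence is discarded
    threshold = 0.95                    # posterior needed to (re-)select an option
    selected = None

    true_mean = means[2]
    for t in range(200):
        if t == 100:
            true_mean = means[0]        # non-stationarity: the source switches
        x = rng.normal(true_mean, 1.0)
        window.append(-0.5 * (x - means) ** 2)     # per-sample log-likelihoods
        L = np.array(window).sum(axis=0)           # evidence within the window only
        post = np.exp(L - L.max())
        post /= post.sum()
        if post.max() > threshold:
            selected = int(post.argmax())          # selection changes with the evidence
    print("final selection:", selected)            # tracks the switch to hypothesis 0
    ```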

  13. Decision-making in ileocecal Crohn's disease management: surgery versus pharmacotherapy.

    PubMed

    Eshuis, Emma J; Stokkers, Pieter Cf; Bemelman, Willem A

    2010-04-01

    Ileocecal Crohn's disease (CD) can be treated medically as well as surgically. Both treatment modalities have been improved markedly in the last two decades, making CD more manageable. However, multidisciplinary research, addressing issues such as timing of surgery or medical treatment versus surgery, is scarce. Particularly in limited ileocecal CD, ileocolic resection might be a good alternative to long-term medical therapy. This review discusses the evidence on medical and surgical treatment options for ileocecal CD. It provides an aid in decision-making by discussing a treatment algorithm that can be used until further evidence on treatment is available.

  14. Integration of Dynamic Models in Range Operations

    NASA Technical Reports Server (NTRS)

    Bardina, Jorge; Thirumalainambi, Rajkumar

    2004-01-01

    This work addresses the various model interactions needed, in real time, to make an efficient internet-based decision-making tool for Shuttle launches. The decision-making tool depends on the launch commit criteria coupled with physical models. Dynamic interaction between a wide variety of simulation applications and techniques, embedded algorithms, and data visualizations is needed to exploit the full potential of modeling and simulation. This paper also discusses in depth the details of web-based 3-D graphics and their application to range safety. The advantages of this dynamic model integration are secure accessibility and distribution of real-time information to other NASA centers.

  15. Guidelines, Algorithms, and Evidence-Based Psychopharmacology Training for Psychiatric Residents

    ERIC Educational Resources Information Center

    Osser, David N.; Patterson, Robert D.; Levitt, James J.

    2005-01-01

    Objective: The authors describe a course of instruction for psychiatry residents that attempts to provide the cognitive and informational tools necessary to make scientifically grounded decision making a routine part of clinical practice. Methods: In weekly meetings over two academic years, the course covers the psychopharmacology of various…

  16. Study on Data Clustering and Intelligent Decision Algorithm of Indoor Localization

    NASA Astrophysics Data System (ADS)

    Liu, Zexi

    2018-01-01

    Indoor positioning technology gives people positional awareness within architectural spaces, but single-network coverage is limited and location data are highly redundant. This article therefore studies data clustering and intelligent decision-making for indoor positioning: it sets out the basic ideas of multi-source indoor positioning technology and analyzes fingerprint localization based on distance measurement combined with the position and orientation of integrated inertial devices. By optimizing the clustering of massive indoor location data, including normalization pre-treatment, multi-dimensional controllable clustering centres, and multi-factor clustering, the redundancy of the location data is reduced. In addition, a path-planning approach based on neural-network inference and decision is proposed, with a sparse-data input layer, a dynamic feedback hidden layer, and an output layer whose low-dimensional results improve intelligent navigation path planning.
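
    Fingerprint localization of the kind analyzed here is often reduced to weighted nearest-neighbour matching in signal space; a minimal sketch with an invented four-access-point radio map (this stands in for, rather than reproduces, the article's clustering pipeline):

    ```python
    import numpy as np

    # Offline radio map: RSSI fingerprints (dBm) from 4 access points, recorded at
    # known positions. All values are hypothetical.
    fingerprints = np.array([[-40, -70, -80, -65],
                             [-55, -50, -75, -70],
                             [-70, -45, -60, -80],
                             [-80, -60, -45, -60]])
    positions = np.array([[0.0, 0.0], [5.0, 0.0], [5.0, 5.0], [0.0, 5.0]])

    def locate(rssi, k=2):
        """Weighted k-nearest-neighbour fingerprint localization."""
        d = np.linalg.norm(fingerprints - rssi, axis=1)   # distance in signal space
        nearest = np.argsort(d)[:k]
        w = 1.0 / (d[nearest] + 1e-9)                     # closer fingerprints weigh more
        return (positions[nearest] * w[:, None]).sum(axis=0) / w.sum()

    print(locate(np.array([-50, -55, -78, -68])))         # estimated (x, y)
    ```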

  17. Linguistic hesitant fuzzy multi-criteria decision-making method based on evidential reasoning

    NASA Astrophysics Data System (ADS)

    Zhou, Huan; Wang, Jian-qiang; Zhang, Hong-yu; Chen, Xiao-hong

    2016-01-01

    Linguistic hesitant fuzzy sets (LHFSs), which can be used to represent decision-makers' qualitative preferences as well as reflect their hesitancy and inconsistency, have attracted a great deal of attention due to their flexibility and efficiency. This paper focuses on a multi-criteria decision-making approach that combines LHFSs with the evidential reasoning (ER) method. After reviewing existing studies of LHFSs, a new order relationship and a Hamming distance between LHFSs are introduced and some linguistic scale functions are applied. Then, the ER algorithm is used to aggregate the distributed assessment of each alternative, and the aggregated assessments across criteria are further combined to obtain the overall value of each alternative. Furthermore, a nonlinear programming model is developed and genetic algorithms are used to obtain the optimal weights of the criteria. Finally, two illustrative examples are provided to show the feasibility and usability of the method, and a comparative analysis with the existing method is performed.

  18. A Theoretical Analysis of Why Hybrid Ensembles Work.

    PubMed

    Hsu, Kuo-Wei

    2017-01-01

    Inspired by the group decision making process, ensembles or combinations of classifiers have been found favorable in a wide variety of application domains. Some researchers propose to use the mixture of two different types of classification algorithms to create a hybrid ensemble. Why does such an ensemble work? The question remains. Following the concept of diversity, which is one of the fundamental elements of the success of ensembles, we conduct a theoretical analysis of why hybrid ensembles work, connecting using different algorithms to accuracy gain. We also conduct experiments on classification performance of hybrid ensembles of classifiers created by decision tree and naïve Bayes classification algorithms, each of which is a top data mining algorithm and often used to create non-hybrid ensembles. Therefore, through this paper, we provide a complement to the theoretical foundation of creating and using hybrid ensembles.
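
    One simple way to build such a mixture, sketched with scikit-learn's soft-voting combiner (a stand-in for the paper's ensemble construction, not a reproduction of it):

    ```python
    from sklearn.datasets import load_breast_cancer
    from sklearn.ensemble import VotingClassifier
    from sklearn.model_selection import cross_val_score
    from sklearn.naive_bayes import GaussianNB
    from sklearn.tree import DecisionTreeClassifier

    X, y = load_breast_cancer(return_X_y=True)

    tree = DecisionTreeClassifier(random_state=0)
    nb = GaussianNB()
    # Hybrid: average the predicted class probabilities of the two algorithm types.
    hybrid = VotingClassifier([("tree", tree), ("nb", nb)], voting="soft")

    for name, clf in [("tree", tree), ("naive Bayes", nb), ("hybrid", hybrid)]:
        print(name, cross_val_score(clf, X, y, cv=10).mean().round(3))
    ```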

  19. A Decision Support Prototype Tool for Predicting Student Performance in an ODL Environment

    ERIC Educational Resources Information Center

    Kotsiantis, S. B.; Pintelas, P. E.

    2004-01-01

    Machine Learning algorithms fed with data sets which include information such as attendance data, test scores and other student information can provide tutors with powerful tools for decision-making. Until now, much of the research has been limited to the relation between single variables and student performance. Combining multiple variables as…

  20. Fuzzy set methods for object recognition in space applications

    NASA Technical Reports Server (NTRS)

    Keller, James M.

    1991-01-01

    Progress on the following tasks is reported: (1) fuzzy set-based decision-making methodologies; (2) feature calculation; (3) clustering for curve and surface fitting; and (4) acquisition of images. The general structure of networks based on fuzzy set connectives, which are being used for information fusion and decision making in space applications, is described, along with the structure and training techniques for such networks consisting of generalized means and gamma-operators. The use of other hybrid operators in multicriteria decision making is currently being examined. Numerous classical features of image regions, such as gray-level statistics, edge and curve primitives, texture measures from the co-occurrence matrix, and size and shape parameters, were implemented. Several fractal geometric features, which may have a considerable impact on characterizing cluttered backgrounds such as clouds, dense star patterns, or some planetary surfaces, were used. A new approach to a fuzzy C-shell algorithm is addressed. NASA personnel are in the process of acquiring suitable simulation data and, it is hoped, videotaped actual shuttle imagery. Photographs have been digitized for use in the algorithms. Also, a model of the shuttle was assembled, and a mechanism is being constructed to orient this model in 3-D so it can be digitized for pose-estimation experiments.

  1. [The guideline for the treatment of mood disorders in USA and Japan].

    PubMed

    Higuchi, T

    2001-08-01

    Recently, the number of available antidepressants has increased dramatically and psychopharmacological treatment is becoming complex, so it is important to provide guidelines to support clinical decision making. Three different kinds of guideline for the treatment of mood disorders, namely the APA-style guideline, the algorithm, and the consensus guideline, have been developed in our country. The APA-style guideline and the algorithm are basically evidence-based, while the consensus guideline is developed through a consensus-panel format. These guidelines should be used as 'a starting point' for specifying decisions that will occasionally be modified.

  2. Bayesian inference and decision theory - A framework for decision making in natural resource management

    USGS Publications Warehouse

    Dorazio, R.M.; Johnson, F.A.

    2003-01-01

    Bayesian inference and decision theory may be used in the solution of relatively complex problems of natural resource management, owing to recent advances in statistical theory and computing. In particular, Markov chain Monte Carlo algorithms provide a computational framework for fitting models of adequate complexity and for evaluating the expected consequences of alternative management actions. We illustrate these features using an example based on management of waterfowl habitat.
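
    A minimal random-walk Metropolis sketch in the same spirit, fitting an occupancy probability from invented monitoring counts and averaging a management-relevant quantity over the posterior (the waterfowl-habitat example itself is not reproduced):

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Hypothetical monitoring data: 18 of 30 managed wetland plots were occupied.
    n, k = 30, 18

    def log_post(p):
        # Binomial likelihood with a flat prior on the occupancy probability.
        if not 0 < p < 1:
            return -np.inf
        return k * np.log(p) + (n - k) * np.log(1 - p)

    # Random-walk Metropolis sampler.
    samples, p = [], 0.5
    for _ in range(20000):
        prop = p + rng.normal(0, 0.1)
        if np.log(rng.uniform()) < log_post(prop) - log_post(p):
            p = prop
        samples.append(p)
    samples = np.array(samples[5000:])   # discard burn-in

    # Expected consequences of alternative actions are averaged over the posterior.
    print("posterior mean:", samples.mean().round(3))
    print("P(occupancy > 0.5):", (samples > 0.5).mean().round(3))
    ```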

  3. A dynamic data source selection system for smartwatch platform.

    PubMed

    Nemati, Ebrahim; Sideris, Konstantinos; Kalantarian, Haik; Sarrafzadeh, Majid

    2016-08-01

    A novel data source selection algorithm is proposed for ambulatory activity tracking of elderly people. The algorithm introduces the concept of dynamic switching between the data collection modules (a smartwatch and a smartphone) to improve accuracy and battery life using contextual information. We show that by making offloading decisions as a function of activity, the proposed algorithm improves on previous work, extending battery life by 7 hours and raising accuracy by 5% relative to the baseline.

  4. A Briefing on Metrics and Risks for Autonomous Decision-Making in Aerospace Applications

    NASA Technical Reports Server (NTRS)

    Frost, Susan; Goebel, Kai Frank; Galvan, Jose Ramon

    2012-01-01

    Significant technology advances will enable future aerospace systems to safely and reliably make decisions autonomously, or without human interaction. The decision-making may result in actions that enable an aircraft or spacecraft in an off-nominal state or with slightly degraded components to achieve mission performance and safety goals while reducing or avoiding damage to the aircraft or spacecraft. Some key technology enablers for autonomous decision-making include: a continuous state awareness through the maturation of the prognostics health management field, novel sensor development, and the considerable gains made in computation power and data processing bandwidth versus system size. Sophisticated algorithms and physics based models coupled with these technological advances allow reliable assessment of a system, subsystem, or components. Decisions that balance mission objectives and constraints with remaining useful life predictions can be made autonomously to maintain safety requirements, optimal performance, and ensure mission objectives. This autonomous approach to decision-making will come with new risks and benefits, some of which will be examined in this paper. To start, an account of previous work to categorize or quantify autonomy in aerospace systems will be presented. In addition, a survey of perceived risks in autonomous decision-making in the context of piloted aircraft and remotely piloted or completely autonomous unmanned autonomous systems (UAS) will be presented based on interviews that were conducted with individuals from industry, academia, and government.

  5. Some Results of Weak Anticipative Concept Applied in Simulation Based Decision Support in Enterprise

    NASA Astrophysics Data System (ADS)

    Kljajić, Miroljub; Kofjač, Davorin; Kljajić Borštnar, Mirjana; Škraba, Andrej

    2010-11-01

    Simulation models are used for decision support and learning in enterprises and in schools. Cases of successful application demonstrate the usefulness of weak anticipative information. The first, job-shop scheduling with a makespan criterion, presents a real case of customized flexible furniture production optimization; a genetic algorithm for job-shop scheduling optimization is presented. The second describes simulation-based inventory control for products with stochastic lead time and demand, where dynamic programming and fuzzy control algorithms reduce the total cost without producing stock-outs in most cases. The value of decision-making information based on simulation is also discussed. Both cases are discussed from the optimization, modeling, and learning points of view.

  6. Analysis of methods of processing of expert information by optimization of administrative decisions

    NASA Astrophysics Data System (ADS)

    Churakov, D. Y.; Tsarkova, E. G.; Marchenko, N. D.; Grechishnikov, E. V.

    2018-03-01

    This work proposes a methodology for defining measures for the expert estimation of the quality and reliability of application-oriented software products. Methods for aggregating expert estimates are described using the example of a collective choice among instrumental-control projects in the development of special-purpose software for institutional needs. Results from the operation of an interactive decision-making support system are given, together with an algorithm for solving the selection task based on the analytic hierarchy process. The developed algorithm can be applied in the development of expert systems for solving a wide class of tasks connected with multi-criteria choice.
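
    The hierarchy-analysis step can be illustrated with the standard principal-eigenvector weighting of a pairwise-comparison matrix (the matrix values are hypothetical; this is the generic AHP computation, not the paper's system):

    ```python
    import numpy as np

    # Hypothetical pairwise-comparison matrix for three candidate projects
    # (Saaty scale; a[i, j] = how strongly experts prefer project i over j).
    a = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])

    # Priority weights = normalized principal eigenvector of the matrix.
    vals, vecs = np.linalg.eig(a)
    principal = vecs[:, vals.real.argmax()].real
    weights = principal / principal.sum()

    # Consistency ratio guards against contradictory expert judgements
    # (random index RI = 0.58 for a 3x3 matrix).
    ci = (vals.real.max() - len(a)) / (len(a) - 1)
    print("weights:", np.round(weights, 3), "CR:", round(ci / 0.58, 3))
    ```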

  7. Assessing an AI knowledge-base for asymptomatic liver diseases.

    PubMed

    Babic, A; Mathiesen, U; Hedin, K; Bodemar, G; Wigertz, O

    1998-01-01

    Discovering previously unseen knowledge in clinical data is important in the field of asymptomatic liver diseases. Avoiding liver biopsy, which is used as the ultimate confirmation of diagnosis, by making the decision based on relevant laboratory findings only would be an essential form of support. The system, based on Quinlan's ID3 algorithm, was simple and efficient in extracting the sought knowledge. The basic principles of applying AI systems are therefore described and complemented with medical evaluation. Some of the diagnostic rules were found to be useful as decision algorithms, i.e. they could be applied directly in clinical work and made part of the knowledge base of the Liver Guide, an automated decision support system.
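
    The rule-extraction idea can be sketched with scikit-learn's entropy-criterion tree, the closest stock analogue of ID3 (stand-in data; the laboratory feature names are assumptions for illustration):

    ```python
    from sklearn.datasets import make_classification
    from sklearn.tree import DecisionTreeClassifier, export_text

    # Stand-in data for laboratory findings; the feature names are invented.
    X, y = make_classification(n_samples=400, n_features=4, n_informative=3,
                               n_redundant=1, random_state=0)
    labels = ["ALT", "AST", "GGT", "ALP"]

    # criterion="entropy" uses information gain, as ID3 does.
    tree = DecisionTreeClassifier(criterion="entropy", max_depth=3).fit(X, y)
    print(export_text(tree, feature_names=labels))   # human-readable decision rules
    ```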

  8. Application of a Dynamic Programming Algorithm for Weapon Target Assignment

    DTIC Science & Technology

    2016-02-01

    Lloyd Hammond. This report documents the methodology used to identify, develop and assess optimisation techniques to support the decision-making process for weapon target assignment.
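
    For the restricted case of one target per weapon, the assignment can be solved exactly with a small bitmask dynamic program; the instance below is invented, and the report's actual formulation may differ:

    ```python
    from functools import lru_cache

    # Hypothetical static WTA instance: kill probabilities p[i][j] of weapon i
    # against target j, and target values v[j]; each weapon engages one target.
    p = [[0.7, 0.3, 0.5],
         [0.4, 0.8, 0.6],
         [0.2, 0.5, 0.9]]
    v = [10.0, 6.0, 8.0]
    n = len(p)

    @lru_cache(maxsize=None)
    def best(i, used):
        """Max expected destroyed value assigning weapons i..n-1; `used` is a bitmask."""
        if i == n:
            return 0.0
        return max(p[i][j] * v[j] + best(i + 1, used | (1 << j))
                   for j in range(n) if not used & (1 << j))

    print(best(0, 0))   # optimal expected destroyed value
    ```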

  9. Vision Based Autonomous Robotic Control for Advanced Inspection and Repair

    NASA Technical Reports Server (NTRS)

    Wehner, Walter S.

    2014-01-01

    The advanced inspection system is an autonomous control and analysis system that improves the inspection and remediation operations for ground and surface systems. It uses optical imaging technology with intelligent computer vision algorithms to analyze physical features of the real-world environment to make decisions and learn from experience. The advanced inspection system plans to control a robotic manipulator arm, an unmanned ground vehicle and cameras remotely, automatically and autonomously. There are many computer vision, image processing and machine learning techniques available as open source for using vision as a sensory feedback in decision-making and autonomous robotic movement. My responsibilities for the advanced inspection system are to create a software architecture that integrates and provides a framework for all the different subsystem components; identify open-source algorithms and techniques; and integrate robot hardware.

  10. A new intuitionistic fuzzy rule-based decision-making system for an operating system process scheduler.

    PubMed

    Butt, Muhammad Arif; Akram, Muhammad

    2016-01-01

    We present a new intuitionistic fuzzy rule-based decision-making system, based on intuitionistic fuzzy sets, for the process scheduler of a batch operating system. Our proposed intuitionistic fuzzy scheduling algorithm takes as input the nice value and burst time of all available processes in the ready queue, intuitionistically fuzzifies the input values, triggers the appropriate rules of our intuitionistic fuzzy inference engine, and finally calculates the dynamic priority (dp) of all the processes in the ready queue. Once the dp of every process has been calculated, the ready queue is sorted in decreasing order of dp, and the process with the maximum dp value is sent to the central processing unit for execution. Finally, we show the complete working of our algorithm on two different data sets and give comparisons with some standard non-preemptive process schedulers.
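
    A much-reduced sketch of the scheduling idea using ordinary (not intuitionistic) fuzzy memberships, which carry only membership degrees rather than the membership/non-membership pairs the paper uses; the membership shapes, rules, and rule outputs are all invented:

    ```python
    def tri(x, a, b, c):
        """Triangular membership function."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x < b else (c - x) / (c - b)

    def dynamic_priority(nice, burst):
        """Toy two-rule fuzzy inference: short bursts and low nice values raise dp."""
        short = tri(burst, -1, 0, 50)        # membership of 'short burst'
        urgent = tri(nice, -21, -20, 20)     # membership of 'high-priority nice value'
        # Rule strengths combined by a weighted average (centroid-like defuzzification).
        rules = [(min(short, urgent), 1.0),            # short & urgent -> dp high
                 (max(1 - short, 1 - urgent), 0.2)]    # long or lazy   -> dp low
        num = sum(s * out for s, out in rules)
        den = sum(s for s, _ in rules) or 1.0
        return num / den

    ready_queue = [("p1", 0, 12), ("p2", -10, 40), ("p3", 5, 3)]  # (pid, nice, burst)
    ready_queue.sort(key=lambda t: dynamic_priority(t[1], t[2]), reverse=True)
    print([pid for pid, _, _ in ready_queue])   # process with the highest dp runs first
    ```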

  11. Protocol-based care: the standardisation of decision-making?

    PubMed

    Rycroft-Malone, Jo; Fontenla, Marina; Seers, Kate; Bick, Debra

    2009-05-01

    To explore how protocol-based care affects clinical decision-making. In the context of evidence-based practice, protocol-based care is a mechanism for facilitating the standardisation of care and streamlining decision-making through rationalising the information with which to make judgements and, ultimately, decisions. However, whether protocol-based care does, in the reality of practice, standardise decision-making is unknown. This paper reports on a study that explored the impact of protocol-based care on nurses' decision-making. Theoretically informed by realistic evaluation and the Promoting Action on Research Implementation in Health Services framework, a case study design using ethnographic methods was used. Two sites were purposively sampled: a diabetic and endocrine unit and a cardiac medical unit. Within each site, data collection included observation, post-observation semi-structured interviews with staff and patients, field notes, feedback sessions and document review. Data were inductively and thematically analysed. Decisions made by nurses in both sites varied according to many different and interacting factors. While several standardised care approaches were available for use, in reality a variety of information sources informed decision-making. The primary approach to knowledge exchange and acquisition was person-to-person; decision-making was a social activity. Rarely were standardised care approaches obviously referred to; nurses described following a mental flowchart, not necessarily linked to a particular guideline or protocol. When standardised care approaches were used, it was reported that they were used flexibly and were particularised. While the logic of protocol-based care is algorithmic, in the reality of clinical practice other sources of information supported nurses' decision-making process. This has significant implications for the political goal of standardisation. The successful implementation and judicious use of tools such as protocols and guidelines will likely be dependent on approaches that facilitate the development of nurses' decision-making processes in parallel with paying attention to the influence of context.

  12. Goal-Directed Decision Making with Spiking Neurons.

    PubMed

    Friedrich, Johannes; Lengyel, Máté

    2016-02-03

    Behavioral and neuroscientific data on reward-based decision making point to a fundamental distinction between habitual and goal-directed action selection. The formation of habits, which requires simple updating of cached values, has been studied in great detail, and the reward prediction error theory of dopamine function has enjoyed prominent success in accounting for its neural bases. In contrast, the neural circuit mechanisms of goal-directed decision making, requiring extended iterative computations to estimate values online, are still unknown. Here we present a spiking neural network that provably solves the difficult online value estimation problem underlying goal-directed decision making in a near-optimal way and reproduces behavioral as well as neurophysiological experimental data on tasks ranging from simple binary choice to sequential decision making. Our model uses local plasticity rules to learn the synaptic weights of a simple neural network to achieve optimal performance and solves one-step decision-making tasks, commonly considered in neuroeconomics, as well as more challenging sequential decision-making tasks within 1 s. These decision times, and their parametric dependence on task parameters, as well as the final choice probabilities match behavioral data, whereas the evolution of neural activities in the network closely mimics neural responses recorded in frontal cortices during the execution of such tasks. Our theory provides a principled framework to understand the neural underpinning of goal-directed decision making and makes novel predictions for sequential decision-making tasks with multiple rewards. Goal-directed actions requiring prospective planning pervade decision making, but their circuit-level mechanisms remain elusive. We show how a model circuit of biologically realistic spiking neurons can solve this computationally challenging problem in a novel way. The synaptic weights of our network can be learned using local plasticity rules such that its dynamics devise a near-optimal plan of action. By systematically comparing our model results to experimental data, we show that it reproduces behavioral decision times and choice probabilities as well as neural responses in a rich set of tasks. Our results thus offer the first biologically realistic account for complex goal-directed decision making at a computational, algorithmic, and implementational level. Copyright © 2016 the authors 0270-6474/16/361529-18$15.00/0.

  13. Goal-Directed Decision Making with Spiking Neurons

    PubMed Central

    Lengyel, Máté

    2016-01-01

    Behavioral and neuroscientific data on reward-based decision making point to a fundamental distinction between habitual and goal-directed action selection. The formation of habits, which requires simple updating of cached values, has been studied in great detail, and the reward prediction error theory of dopamine function has enjoyed prominent success in accounting for its neural bases. In contrast, the neural circuit mechanisms of goal-directed decision making, requiring extended iterative computations to estimate values online, are still unknown. Here we present a spiking neural network that provably solves the difficult online value estimation problem underlying goal-directed decision making in a near-optimal way and reproduces behavioral as well as neurophysiological experimental data on tasks ranging from simple binary choice to sequential decision making. Our model uses local plasticity rules to learn the synaptic weights of a simple neural network to achieve optimal performance and solves one-step decision-making tasks, commonly considered in neuroeconomics, as well as more challenging sequential decision-making tasks within 1 s. These decision times, and their parametric dependence on task parameters, as well as the final choice probabilities match behavioral data, whereas the evolution of neural activities in the network closely mimics neural responses recorded in frontal cortices during the execution of such tasks. Our theory provides a principled framework to understand the neural underpinning of goal-directed decision making and makes novel predictions for sequential decision-making tasks with multiple rewards. SIGNIFICANCE STATEMENT Goal-directed actions requiring prospective planning pervade decision making, but their circuit-level mechanisms remain elusive. We show how a model circuit of biologically realistic spiking neurons can solve this computationally challenging problem in a novel way. The synaptic weights of our network can be learned using local plasticity rules such that its dynamics devise a near-optimal plan of action. By systematically comparing our model results to experimental data, we show that it reproduces behavioral decision times and choice probabilities as well as neural responses in a rich set of tasks. Our results thus offer the first biologically realistic account for complex goal-directed decision making at a computational, algorithmic, and implementational level. PMID:26843636

  14. Cyborg practices: call-handlers and computerised decision support systems in urgent and emergency care.

    PubMed

    Pope, Catherine; Halford, Susan; Turnbull, Joanne; Prichard, Jane

    2014-06-01

    This article draws on data collected during a 2-year project examining the deployment of a computerised decision support system. This computerised decision support system was designed to be used by non-clinical staff for dealing with calls to emergency (999) and urgent care (out-of-hours) services. One of the promises of computerised decision support technologies is that they can 'hold' vast amounts of sophisticated clinical knowledge and combine it with decision algorithms to enable standardised decision-making by non-clinical (clerical) staff. This article draws on our ethnographic study of this computerised decision support system in use, and we use our analysis to question the 'automated' vision of decision-making in healthcare call-handling. We show that embodied and experiential (human) expertise remains central and highly salient in this work, and we propose that the deployment of the computerised decision support system creates something new: this conjunction of computer and human creates a cyborg practice.

  15. Multi-test decision tree and its application to microarray data classification.

    PubMed

    Czajkowski, Marcin; Grześ, Marek; Kretowski, Marek

    2014-05-01

    A desirable property of tools used to investigate biological data is that they produce easy-to-understand models and predictive decisions. Decision trees are particularly promising in this regard due to their comprehensible nature, which resembles the hierarchical process of human decision making. However, existing algorithms for learning decision trees have a tendency to underfit gene expression data. The main aim of this work is to improve the performance and stability of decision trees with only a small increase in their complexity. We propose a multi-test decision tree (MTDT); our main contribution is the application of several univariate tests in each non-terminal node of the decision tree. We also search for alternative, lower-ranked features in order to obtain more stable and reliable predictions. Experimental validation was performed on several real-life gene expression datasets. Comparison results with eight classifiers show that MTDT has a statistically significantly higher accuracy than popular decision tree classifiers, and it was highly competitive with ensemble learning algorithms. The proposed solution outperformed its baseline algorithm on 14 datasets by an average of 6%. A study performed on one of the datasets showed that the discovered genes used in the MTDT classification model are supported by biological evidence in the literature. This paper introduces a new type of decision tree which is more suitable for solving biological problems. MTDTs are relatively easy to analyze and much more powerful in modeling high-dimensional microarray data than their popular counterparts. Copyright © 2014 Elsevier B.V. All rights reserved.

  16. The analysis of the pilot's cognitive and decision processes

    NASA Technical Reports Server (NTRS)

    Curry, R. E.

    1975-01-01

    Articles are presented on pilot performance in zero-visibility precision approach, failure detection by pilots during automatic landing, experiments in pilot decision-making during simulated low visibility approaches, a multinomial maximum likelihood program, and a random search algorithm for laboratory computers. Other topics discussed include detection of system failures in multi-axis tasks and changes in pilot workload during an instrument landing.

  17. A Theoretical Analysis of Why Hybrid Ensembles Work

    PubMed Central

    2017-01-01

    Inspired by the group decision making process, ensembles or combinations of classifiers have been found favorable in a wide variety of application domains. Some researchers propose to use the mixture of two different types of classification algorithms to create a hybrid ensemble. Why does such an ensemble work? The question remains. Following the concept of diversity, which is one of the fundamental elements of the success of ensembles, we conduct a theoretical analysis of why hybrid ensembles work, connecting using different algorithms to accuracy gain. We also conduct experiments on classification performance of hybrid ensembles of classifiers created by decision tree and naïve Bayes classification algorithms, each of which is a top data mining algorithm and often used to create non-hybrid ensembles. Therefore, through this paper, we provide a complement to the theoretical foundation of creating and using hybrid ensembles. PMID:28255296

  18. A clinical decision-making algorithm for penicillin allergy.

    PubMed

    Soria, Angèle; Autegarden, Elodie; Amsler, Emmanuelle; Gaouar, Hafida; Vial, Amandine; Francès, Camille; Autegarden, Jean-Eric

    2017-12-01

    About 10% of subjects report suspected penicillin allergy, but 85-90% of these patients are not truly allergic and could safely receive beta-lactam antibiotics. Objective: To design and validate a clinical decision-making algorithm, based on anamnesis (chronology, severity, and duration of the suspected allergic reactions) and aiming for 100% sensitivity and negative predictive value, to assess the allergy risk related to a penicillin prescription in general practice. All patients were included prospectively and investigated according to ENDA/EAACI recommendations. Results of the penicillin allergy work-up (gold standard) were compared with the results of the algorithm. The allergological work-up diagnosed penicillin hypersensitivity in 41/259 patients (15.8%) [95% CI: 11.5-20.3]. Three of these patients were diagnosed as having immediate-type hypersensitivity to penicillin but had been misclassified as low-risk patients by the clinical algorithm. Thus, the sensitivity and negative predictive value of the algorithm were 92.7% [95% CI: 80.1-98.5] and 96.3% [95% CI: 89.6-99.2], respectively, and the probability that a patient with true penicillin allergy had been misclassified was 3.7% [95% CI: 0.8-10.4]. Although the risk of misclassification is low, we cannot recommend the use of this algorithm in general practice. However, the algorithm can be useful in emergency situations in hospital settings. Key messages: True penicillin allergy is considerably less frequent than alleged penicillin allergy (15.8%; 41 of the 259 patients with suspected penicillin allergy). A clinical algorithm based on the patient's clinical history of the supposed allergic event to penicillin misclassified 3 of the 41 (7.3%) truly allergic patients.
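
    The reported metrics can be reconstructed from the abstract's counts; the size of the low-risk group (81 patients) is not stated and is inferred here from the 3.7% misclassification probability, so it is an assumption:

    ```python
    # Reconstructing the reported metrics from the abstract's counts (assuming the
    # algorithm classified 81 patients as low risk, implied by 3/81 = 3.7%).
    true_allergic = 41
    false_negatives = 3            # truly allergic but labelled low risk
    low_risk = 81                  # assumed size of the low-risk group
    true_negatives = low_risk - false_negatives

    sensitivity = (true_allergic - false_negatives) / true_allergic
    npv = true_negatives / low_risk
    print(f"sensitivity = {sensitivity:.1%}, NPV = {npv:.1%}")  # 92.7%, 96.3%
    ```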

  19. Control fast or control smart: When should invading pathogens be controlled?

    PubMed

    Thompson, Robin N; Gilligan, Christopher A; Cunniffe, Nik J

    2018-02-01

    The intuitive response to an invading pathogen is to start disease management as rapidly as possible, since this would be expected to minimise the future impacts of disease. However, since more spread data become available as an outbreak unfolds, the processes underpinning pathogen transmission can almost always be characterised more precisely later in epidemics. This allows the future progression of any outbreak to be forecast more accurately, and so enables control interventions to be targeted more precisely. There is also the chance that the outbreak might die out without any intervention whatsoever, making prophylactic control unnecessary. Optimal decision-making involves continuously balancing these potential benefits of waiting against the possible costs of further spread. We introduce a generic, extensible data-driven algorithm based on parameter estimation and outbreak simulation for making decisions in real time concerning when and how to control an invading pathogen. The Control Smart Algorithm (CSA) resolves the trade-off between the competing advantages of controlling as soon as possible and controlling later when more information has become available. Using a generic mathematical model representing the transmission of a pathogen of agricultural animals or plants through a population of farms or fields, we show how the CSA allows the timing and level of deployment of vaccination or chemical control to be optimised. In particular, the algorithm outperforms simpler strategies such as intervening when the outbreak size reaches a pre-specified threshold, or controlling when the outbreak has persisted for a threshold length of time. This remains the case even if the simpler methods are fully optimised in advance. Our work highlights the potential benefits of giving careful consideration to the question of when to start disease management during emerging outbreaks, and provides a concrete framework to allow policy-makers to make this decision.
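
    The underlying trade-off can be seen in a stripped-down Monte Carlo comparison of control-trigger sizes on a branching-process outbreak (this omits the CSA's real-time parameter estimation; the model, rates, and costs are all invented):

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    def mean_cost(trigger_size, r=1.3, cost_per_case=1.0, control_cost=20.0,
                  horizon=30, n_sims=5000):
        """Mean cost of triggering control once an outbreak reaches `trigger_size`
        farms. Spread is a simple branching process; control halts transmission."""
        total = 0.0
        for _ in range(n_sims):
            size, active, controlled = 1, 1, False
            for _ in range(horizon):
                if not controlled and size >= trigger_size:
                    controlled = True     # control stops all further transmission
                if controlled or active == 0:
                    break
                active = rng.poisson(r * active)   # new cases this generation
                size += active
            total += size * cost_per_case + (control_cost if controlled else 0.0)
        return total / n_sims

    for trig in [1, 5, 20, 10**9]:   # control at once, wait, wait longer, never
        print(f"trigger at {trig}: mean cost {mean_cost(trig):.1f}")
    ```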

  20. Multisensor satellite data for water quality analysis and water pollution risk assessment: decision making under deep uncertainty with fuzzy algorithm in framework of multimodel approach

    NASA Astrophysics Data System (ADS)

    Kostyuchenko, Yuriy V.; Sztoyka, Yulia; Kopachevsky, Ivan; Artemenko, Igor; Yuschenko, Maxim

    2017-10-01

    A multi-model approach for remote sensing data processing and interpretation is described, and the problem of utilizing satellite data within this approach for socio-ecological risk assessment is formally defined. The method for utilizing observation, measurement, and modeling data in the framework of the multi-model approach is described, and the methodology and models for risk assessment within a decision support approach are defined. A method of water quality assessment using satellite observation data, based on analysis of the spectral reflectance of aquifers, is presented. Spectral signatures of freshwater bodies and offshore waters are analyzed, and correlations between spectral reflectance, pollution, and selected water quality parameters are analyzed and quantified. Data from the MODIS, MISR, AIRS, and Landsat sensors received in 2002-2014 were utilized and verified by in-field spectrometry and laboratory measurements. A fuzzy-logic-based approach for decision support regarding the risk of water quality degradation is discussed: the decision on the water quality category is made by a fuzzy algorithm using a limited set of uncertain parameters, drawing on data from satellite observations, field measurements, and modeling. It is shown that this algorithm allows estimation of the water quality degradation rate and pollution risks. The problems of constructing spatial and temporal distributions of the calculated parameters, as well as the problem of data regularization, are discussed. Using the proposed approach, maps of surface water pollution risk from point and diffuse sources are calculated and discussed.

  1. Clinical decision making: how surgeons do it.

    PubMed

    Crebbin, Wendy; Beasley, Spencer W; Watters, David A K

    2013-06-01

    Clinical decision making is a core competency of surgical practice. It involves two distinct types of mental process best considered as the ends of a continuum, ranging from intuitive and subconscious to analytical and conscious. In practice, individual decisions are usually reached by a combination of each, according to the complexity of the situation and the experience/expertise of the surgeon. An expert moves effortlessly along this continuum, according to need, able to apply learned rules or algorithms to specific presentations, choosing these as a result of either pattern recognition or analytical thinking. The expert recognizes and responds quickly to any mismatch between what is observed and what was expected, coping with gaps in information and making decisions even where critical data may be uncertain or unknown. Even for experts, the cognitive processes involved are difficult to articulate as they tend to be very complex. However, if surgeons are to assist trainees in developing their decision-making skills, the processes need to be identified and defined, and the competency needs to be measurable. This paper examines the processes of clinical decision making in three contexts: making a decision about how to manage a patient; preparing for an operative procedure; and reviewing progress during an operative procedure. The models represented here are an exploration of the complexity of the processes, designed to assist surgeons understand how expert clinical decision making occurs and to highlight the challenge of teaching these skills to surgical trainees. © 2013 The Authors. ANZ Journal of Surgery © 2013 Royal Australasian College of Surgeons.

  2. Obstetric Anaesthetists' Association and Difficult Airway Society guidelines for the management of difficult and failed tracheal intubation in obstetrics.

    PubMed

    Mushambi, M C; Kinsella, S M; Popat, M; Swales, H; Ramaswamy, K K; Winton, A L; Quinn, A C

    2015-11-01

    The Obstetric Anaesthetists' Association and Difficult Airway Society have developed the first national obstetric guidelines for the safe management of difficult and failed tracheal intubation during general anaesthesia. They comprise four algorithms and two tables. A master algorithm provides an overview. Algorithm 1 gives a framework on how to optimise a safe general anaesthetic technique in the obstetric patient, and emphasises: planning and multidisciplinary communication; how to prevent the rapid oxygen desaturation seen in pregnant women by advocating nasal oxygenation and mask ventilation immediately after induction; limiting intubation attempts to two; and consideration of early release of cricoid pressure if difficulties are encountered. Algorithm 2 summarises the management after declaring failed tracheal intubation with clear decision points, and encourages early insertion of a (preferably second-generation) supraglottic airway device if appropriate. Algorithm 3 covers the management of the 'can't intubate, can't oxygenate' situation and emergency front-of-neck airway access, including the necessity for timely perimortem caesarean section if maternal oxygenation cannot be achieved. Table 1 gives a structure for assessing the individual factors relevant in the decision to awaken or proceed should intubation fail, which include: urgency related to maternal or fetal factors; seniority of the anaesthetist; obesity of the patient; surgical complexity; aspiration risk; potential difficulty with provision of alternative anaesthesia; and post-induction airway device and airway patency. This decision should be considered by the team in advance of performing a general anaesthetic to make a provisional plan should failed intubation occur. The table is also intended to be used as a teaching tool to facilitate discussion and learning regarding the complex nature of decision-making when faced with a failed intubation. Table 2 gives practical considerations of how to awaken or proceed with surgery. The background paper covers recommendations on drugs, new equipment, teaching and training. © 2015 The Authors. Anaesthesia published by John Wiley & Sons Ltd on behalf of Association of Anaesthetists of Great Britain and Ireland.

  3. Obstetric Anaesthetists' Association and Difficult Airway Society guidelines for the management of difficult and failed tracheal intubation in obstetrics*

    PubMed Central

    Mushambi, M C; Kinsella, S M; Popat, M; Swales, H; Ramaswamy, K K; Winton, A L; Quinn, A C

    2015-01-01

    The Obstetric Anaesthetists' Association and Difficult Airway Society have developed the first national obstetric guidelines for the safe management of difficult and failed tracheal intubation during general anaesthesia. They comprise four algorithms and two tables. A master algorithm provides an overview. Algorithm 1 gives a framework on how to optimise a safe general anaesthetic technique in the obstetric patient, and emphasises: planning and multidisciplinary communication; how to prevent the rapid oxygen desaturation seen in pregnant women by advocating nasal oxygenation and mask ventilation immediately after induction; limiting intubation attempts to two; and consideration of early release of cricoid pressure if difficulties are encountered. Algorithm 2 summarises the management after declaring failed tracheal intubation with clear decision points, and encourages early insertion of a (preferably second-generation) supraglottic airway device if appropriate. Algorithm 3 covers the management of the ‘can't intubate, can't oxygenate’ situation and emergency front-of-neck airway access, including the necessity for timely perimortem caesarean section if maternal oxygenation cannot be achieved. Table 1 gives a structure for assessing the individual factors relevant in the decision to awaken or proceed should intubation fail, which include: urgency related to maternal or fetal factors; seniority of the anaesthetist; obesity of the patient; surgical complexity; aspiration risk; potential difficulty with provision of alternative anaesthesia; and post-induction airway device and airway patency. This decision should be considered by the team in advance of performing a general anaesthetic to make a provisional plan should failed intubation occur. The table is also intended to be used as a teaching tool to facilitate discussion and learning regarding the complex nature of decision-making when faced with a failed intubation. Table 2 gives practical considerations of how to awaken or proceed with surgery. The background paper covers recommendations on drugs, new equipment, teaching and training. PMID:26449292

  4. Eye Tracking and Pupillometry are Indicators of Dissociable Latent Decision Processes

    PubMed Central

    Cavanagh, James F.; Wiecki, Thomas V.; Kochar, Angad; Frank, Michael J.

    2014-01-01

    Can you predict what someone is going to do just by watching them? This is certainly difficult: it would require a clear mapping between observable indicators and unobservable cognitive states. In this report we demonstrate how this is possible by monitoring eye gaze and pupil dilation, which predict dissociable biases during decision making. We quantified decision making using the Drift Diffusion Model (DDM), which provides an algorithmic account of how evidence accumulation and response caution contribute to decisions through separate latent parameters of drift rate and decision threshold, respectively. We used a hierarchical Bayesian estimation approach to assess the single trial influence of observable physiological signals on these latent DDM parameters. Increased eye gaze dwell time specifically predicted an increased drift rate toward the fixated option, irrespective of the value of the option. In contrast, greater pupil dilation specifically predicted an increase in decision threshold during difficult decisions. These findings suggest that eye tracking and pupillometry reflect the operations of dissociated latent decision processes. PMID:24548281
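
    A minimal simulation sketch of the drift diffusion process described above, assuming a hypothetical linear coupling beta between gaze dwell-time advantage and drift rate (the study itself estimates such effects with hierarchical Bayesian fitting, which is not reproduced here):

    ```python
    import numpy as np

    def simulate_ddm(drift, threshold, dt=0.001, noise=1.0, max_t=5.0, rng=None):
        """Euler-Maruyama simulation of one drift-diffusion trial.
        Returns (choice, reaction_time); choice 1 = upper bound, 0 = lower."""
        rng = rng or np.random.default_rng()
        x, t = 0.0, 0.0
        while abs(x) < threshold and t < max_t:
            x += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
            t += dt
        return (1 if x > 0 else 0), t

    # Hypothetical coupling: a dwell-time advantage for the fixated option
    # raises the drift rate toward it; pupil dilation would instead be
    # modelled as raising 'threshold' on difficult trials.
    base_drift, beta, dwell_advantage = 0.5, 0.8, 0.3
    choice, rt = simulate_ddm(base_drift + beta * dwell_advantage, threshold=1.2)
    print(choice, round(rt, 3))
    ```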

  5. Self-growing neural network architecture using crisp and fuzzy entropy

    NASA Technical Reports Server (NTRS)

    Cios, Krzysztof J.

    1992-01-01

    The paper briefly describes the self-growing neural network algorithm, CID3, which makes decision trees equivalent to hidden layers of a neural network. The algorithm generates a feedforward architecture using crisp and fuzzy entropy measures. The results of a real-life recognition problem of distinguishing defects in a glass ribbon and of a benchmark problem of differentiating two spirals are shown and discussed.

  6. Self-growing neural network architecture using crisp and fuzzy entropy

    NASA Technical Reports Server (NTRS)

    Cios, Krzysztof J.

    1992-01-01

    The paper briefly describes the self-growing neural network algorithm, CID3, which makes decision trees equivalent to hidden layers of a neural network. The algorithm generates a feedforward architecture using crisp and fuzzy entropy measures. The results for a real-life recognition problem of distinguishing defects in a glass ribbon, and for a benchmark problem of telling two spirals apart, are shown and discussed.
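
    For the crisp-entropy side of such tree-growing schemes, the split criterion is ordinary information gain. A self-contained sketch of that criterion (standard Shannon entropy, not the CID3 implementation itself):

    ```python
    import numpy as np

    def entropy(labels):
        """Crisp (Shannon) entropy of a 1-D array of class labels."""
        _, counts = np.unique(labels, return_counts=True)
        p = counts / counts.sum()
        return float(-(p * np.log2(p)).sum())

    def information_gain(feature, labels, threshold):
        """Entropy reduction from splitting on feature <= threshold."""
        mask = feature <= threshold
        left, right = labels[mask], labels[~mask]
        if len(left) == 0 or len(right) == 0:
            return 0.0                      # degenerate split: no gain
        w = len(left) / len(labels)
        return entropy(labels) - (w * entropy(left) + (1 - w) * entropy(right))

    y = np.array([0, 0, 1, 1])
    x = np.array([1.0, 2.0, 3.0, 4.0])
    print(information_gain(x, y, 2.5))      # perfect split -> 1.0 bit
    ```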

  7. Flexible Multi-agent Algorithm for Distributed Decision Making

    DTIC Science & Technology

    2015-01-01

    How, J. P. Consensus-Based Auction Approaches for Decentralized Task Assignment. Proceedings of the AIAA Guidance, Navigation, and Control...G.; Kim, Y. Market-Based Decentralized Task Assignment for Cooperative UAV Mission Including Rendezvous. Proceedings of the AIAA Guidance...scalable and adaptable to a variety of specific mission tasks. Additionally, the algorithm could easily be adapted for use on land- or sea-based systems

  8. Information Search and Decision Making: The Effects of Age and Complexity on Strategy Use

    PubMed Central

    Queen, Tara L.; Hess, Thomas M.; Ennis, Gilda E.; Dowd, Keith; Grühn, Daniel

    2012-01-01

    The impact of task complexity on information search strategy and decision quality was examined in a sample of 135 young, middle-aged, and older adults. We were particularly interested in the competing roles of fluid cognitive ability and domain knowledge and experience, with the former being a negative influence and the latter being a positive influence on older adults’ performance. Participants utilized two decision matrices, which varied in complexity, regarding a consumer purchase. Using process tracing software and an algorithm developed to assess decision strategy, we recorded search behavior, strategy selection, and final decision. Contrary to expectations, older adults were not more likely than the younger age groups to engage in information-minimizing search behaviors in response to increases in task complexity. Similarly, adults of all ages used comparable decision strategies and adapted their strategies to the demands of the task. We also examined decision outcomes in relation to participants’ preferences. Overall, it seems that older adults utilize simpler sets of information primarily reflecting the most valued attributes in making their choice. The results of this study suggest that older adults are adaptive in their approach to decision making and this ability may benefit from accrued knowledge and experience. PMID:22663157

  9. Applied Swarm-based medicine: collecting decision trees for patterns of algorithms analysis.

    PubMed

    Panje, Cédric M; Glatzer, Markus; von Rappard, Joscha; Rothermundt, Christian; Hundsberger, Thomas; Zumstein, Valentin; Plasswilm, Ludwig; Putora, Paul Martin

    2017-08-16

    The objective consensus methodology has recently been applied to consensus finding in several studies on medical decision-making among clinical experts or guidelines. The main advantages of this method are an automated analysis and comparison of the treatment algorithms of the participating centers, which can be performed anonymously. Based on the experience from completed consensus analyses, the main steps for the successful implementation of the objective consensus methodology were identified and discussed among the main investigators. The following steps for the successful collection and conversion of decision trees were identified and defined in detail: problem definition, population selection, draft input collection, tree conversion, criteria adaptation, problem re-evaluation, results distribution and refinement, tree finalisation, and analysis. This manuscript provides information on the main steps for the successful collection of decision trees and summarizes important aspects at each point of the analysis.

  10. Fusion of Heterogeneous Intrusion Detection Systems for Network Attack Detection

    PubMed Central

    Kaliappan, Jayakumar; Thiagarajan, Revathi; Sundararajan, Karpagam

    2015-01-01

    An intrusion detection system (IDS) helps to identify different types of attacks in general, and the detection rate will be higher for some specific category of attacks. This paper is designed on the idea that each IDS is efficient in detecting a specific type of attack. In the proposed Multiple IDS Unit (MIU), there are five IDS units, and each IDS follows a unique algorithm to detect attacks. Feature selection is done with the help of a genetic algorithm. The selected features of the input traffic are passed on to the MIU for processing. The decision from each IDS is termed a local decision. The fusion unit inside the MIU processes all the local decisions with the help of a majority voting rule and makes the final decision. The proposed system shows a very good improvement in detection rate and reduces the false alarm rate. PMID:26295058
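
    The fusion step itself is straightforward; a minimal sketch of majority voting over hypothetical per-IDS local decisions (the genetic-algorithm feature-selection stage is omitted):

    ```python
    from collections import Counter

    def fuse_decisions(local_decisions):
        """Majority-vote fusion of the local decisions emitted by each IDS."""
        label, _ = Counter(local_decisions).most_common(1)[0]
        return label

    # Five hypothetical IDS units classify one traffic record.
    print(fuse_decisions(["dos", "normal", "dos", "dos", "probe"]))  # -> dos
    ```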

  11. Fusion of Heterogeneous Intrusion Detection Systems for Network Attack Detection.

    PubMed

    Kaliappan, Jayakumar; Thiagarajan, Revathi; Sundararajan, Karpagam

    2015-01-01

    An intrusion detection system (IDS) helps to identify different types of attacks in general, and the detection rate will be higher for some specific category of attacks. This paper is designed on the idea that each IDS is efficient in detecting a specific type of attack. In the proposed Multiple IDS Unit (MIU), there are five IDS units, and each IDS follows a unique algorithm to detect attacks. Feature selection is done with the help of a genetic algorithm. The selected features of the input traffic are passed on to the MIU for processing. The decision from each IDS is termed a local decision. The fusion unit inside the MIU processes all the local decisions with the help of a majority voting rule and makes the final decision. The proposed system shows a very good improvement in detection rate and reduces the false alarm rate.

  12. Clinical decision-making and secondary findings in systems medicine.

    PubMed

    Fischer, T; Brothers, K B; Erdmann, P; Langanke, M

    2016-05-21

    Systems medicine is the name for an assemblage of scientific strategies and practices that include bioinformatics approaches to human biology (especially systems biology); "big data" statistical analysis; and medical informatics tools. Whereas personalized and precision medicine involve similar analytical methods applied to genomic and medical record data, systems medicine draws on these as well as other sources of data. Given this distinction, the clinical translation of systems medicine poses a number of important ethical and epistemological challenges for researchers working to generate systems medicine knowledge and clinicians working to apply it. This article focuses on three key challenges: First, we will discuss the conflicts in decision-making that can arise when healthcare providers committed to principles of experimental medicine or evidence-based medicine encounter individualized recommendations derived from computer algorithms. We will explore in particular whether controlled experiments, such as comparative effectiveness trials, should mediate the translation of systems medicine, or if instead individualized findings generated through "big data" approaches can be applied directly in clinical decision-making. Second, we will examine the case of the Riyadh Intensive Care Program Mortality Prediction Algorithm, pejoratively referred to as the "death computer," to demonstrate the ethical challenges that can arise when big-data-driven scoring systems are applied in clinical contexts. We argue that the uncritical use of predictive clinical algorithms, including those envisioned for systems medicine, challenge basic understandings of the doctor-patient relationship. Third, we will build on the recent discourse on secondary findings in genomics and imaging to draw attention to the important implications of secondary findings derived from the joint analysis of data from diverse sources, including data recorded by patients in an attempt to realize their "quantified self." This paper examines possible ethical challenges that are likely to be raised as systems medicine to be translated into clinical medicine. These include the epistemological challenges for clinical decision-making, the use of scoring systems optimized by big data techniques and the risk that incidental and secondary findings will significantly increase. While some ethical implications remain still hypothetical we should use the opportunity to prospectively identify challenges to avoid making foreseeable mistakes when systems medicine inevitably arrives in routine care.

  13. hs-CRP is strongly associated with coronary heart disease (CHD): A data mining approach using decision tree algorithm.

    PubMed

    Tayefi, Maryam; Tajfard, Mohammad; Saffar, Sara; Hanachi, Parichehr; Amirabadizadeh, Ali Reza; Esmaeily, Habibollah; Taghipour, Ali; Ferns, Gordon A; Moohebati, Mohsen; Ghayour-Mobarhan, Majid

    2017-04-01

    Coronary heart disease (CHD) is an important public health problem globally. Algorithms incorporating the assessment of clinical biomarkers together with several established traditional risk factors can help clinicians to predict CHD and support clinical decision making with respect to interventions. A decision tree (DT) is a data mining model for extracting hidden knowledge from large databases. We aimed to establish a predictive model for coronary heart disease using a decision tree algorithm. Here we used a dataset of 2346 individuals including 1159 healthy participants and 1187 participants who had undergone coronary angiography (405 participants with negative angiography and 782 participants with positive angiography). We entered 10 variables of a total of 12 into the DT algorithm (including age, sex, FBG, TG, hs-CRP, TC, HDL, LDL, SBP and DBP). Our model could identify the associated risk factors of CHD with a sensitivity, specificity and accuracy of 96%, 87% and 94%, respectively. Serum hs-CRP level was at the top of the tree in our model, followed by FBG, gender and age. Our model appears to be an accurate, specific and sensitive model for identifying the presence of CHD, but will require validation in prospective studies. Copyright © 2017 Elsevier B.V. All rights reserved.
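
    A sketch of the modelling step using scikit-learn's DecisionTreeClassifier on synthetic stand-in data; the ten variable names follow the abstract, but the data, tree depth and split are illustrative, not the study's:

    ```python
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.metrics import accuracy_score, recall_score

    # Synthetic placeholder data; columns stand in for
    # age, sex, FBG, TG, hs-CRP, TC, HDL, LDL, SBP, DBP.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 10))
    y = (X[:, 4] + 0.5 * X[:, 2] + rng.normal(scale=0.5, size=500) > 0).astype(int)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    tree = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_tr, y_tr)
    pred = tree.predict(X_te)
    print("sensitivity:", recall_score(y_te, pred),
          "accuracy:", accuracy_score(y_te, pred))
    ```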

  14. A conceptual evolutionary aseismic decision support framework for hospitals

    NASA Astrophysics Data System (ADS)

    Hu, Yufeng; Dargush, Gary F.; Shao, Xiaoyun

    2012-12-01

    In this paper, a conceptual evolutionary framework for aseismic decision support for hospitals that attempts to integrate a range of engineering and sociotechnical models is presented. Genetic algorithms are applied to find the optimal decision sets. A case study is completed to demonstrate how the framework may apply to a specific hospital. The simulations show that the proposed evolutionary decision support framework is able to discover robust policy sets in either uncertain or fixed environments. The framework also qualitatively identifies some of the characteristic behavior of the critical care organization. Thus, by utilizing the proposed framework, the decision makers are able to make more informed decisions, especially to enhance the seismic safety of the hospitals.

  15. The missing link in preconceptional care: the role of comparative effectiveness research.

    PubMed

    Salihu, Hamisu M; Salinas, Abraham; Mogos, Mulubrhan

    2013-07-01

    This paper discusses an important element that is missing from the existing algorithm of preconception care, namely, comparative effectiveness research (CER). To our knowledge, there has been limited assessment of the comparative effectiveness of diverse interventions that promote preconception health, conditions under which these are most effective, for which particular populations, and their comparative costs. CER can improve the decision making process for the funding, development, implementation, and evaluation of comprehensive preconception care programs, specifically by identifying the most effective interventions with acceptable costs to society. This paper will examine the framework behind preconception care and how the inclusion of comparative effectiveness research and evaluation into the existing algorithm of preconception care could foster improvement in maternal and child health. We discuss challenges and opportunities regarding the utilization of CER in the decision making process in preconception health, and finally, we provide recommendations for future directions.

  16. The Missing Link in Preconceptional Care: The Role of Comparative Effectiveness Research

    PubMed Central

    Salihu, Hamisu M.; Salinas, Abraham; Mogos, Mulubrhan

    2012-01-01

    This paper discusses an important element that is missing from the existing algorithm of preconception care, namely, comparative effectiveness research (CER). To our knowledge, there has been limited assessment of the comparative effectiveness of diverse interventions that promote preconception health, conditions under which these are most effective, for which particular populations, and their comparative costs. CER can improve the decision making process for the funding, development, implementation, and evaluation of comprehensive preconception care programs, specifically by identifying the most effective interventions with acceptable costs to society. This paper will examine the framework behind preconception care and how the inclusion of comparative effectiveness research and evaluation into the existing algorithm of preconception care could foster improvement in maternal and child health. We discuss challenges and opportunities regarding the utilization of CER in the decision making process in preconception health, and finally, we provide recommendations for future directions. PMID:22718466

  17. Neonatal physical therapy. Part I: clinical competencies and neonatal intensive care unit clinical training models.

    PubMed

    Sweeney, Jane K; Heriza, Carolyn B; Blanchard, Yvette

    2009-01-01

    To describe clinical training models, delineate clinical competencies, and outline a clinical decision-making algorithm for neonatal physical therapy. In these updated practice guidelines, advanced clinical training models, including precepted practicum and residency or fellowship training, are presented to guide practitioners in organizing mentored, competency-based preparation for neonatal care. Clinical competencies in neonatal physical therapy are outlined with advanced clinical proficiencies and knowledge areas specific to each role. An algorithm for decision making on examination, evaluation, intervention, and re-examination processes provides a framework for clinical reasoning. Because of advanced-level competency requirements and the continuous examination, evaluation, and modification of procedures during each patient contact, the intensive care unit is a restricted practice area for physical therapist assistants, physical therapist generalists, and physical therapy students. Accountable, ethical physical therapy for neonates requires advanced, competency-based training with a preceptor in the pediatric subspecialty of neonatology.

  18. Artificial Intelligence based technique for BTS placement

    NASA Astrophysics Data System (ADS)

    Alenoghena, C. O.; Emagbetere, J. O.; Aibinu, A. M.

    2013-12-01

    The increase of base transceiver stations (BTS) in most urban areas can be traced to the drive by network providers to meet demand for coverage and capacity. In traditional network planning, the final decision on BTS placement is taken by a team of radio planners; this decision is not foolproof against regulatory requirements. In this paper, an intelligence-based algorithm for optimal BTS site placement is proposed. The proposed technique takes neighbour and regulation considerations into account objectively while determining cell sites. Its application will lead to a quantitatively unbiased decision-making process in BTS placement. Experimental data for a 2 km by 3 km territory were simulated to test the new algorithm; the results obtained show a 100% performance of the neighbour-constrained algorithm in BTS placement optimization. Results on the application of the GA with the neighbourhood constraint indicate that the choice of location can be unbiased and that optimization of facility placement for network design can be carried out.
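
    A toy sketch of a genetic algorithm with the neighbour rule expressed as a fitness penalty, in the spirit of the approach described; the grid size, coverage radius, separation rule and penalty weight are all invented for illustration:

    ```python
    import random

    random.seed(1)
    DEMAND = [(random.uniform(0, 2), random.uniform(0, 3)) for _ in range(60)]
    N_SITES, RADIUS, MIN_SEP = 4, 0.6, 0.5

    def fitness(sites):
        """Covered demand points minus a penalty for violating separation."""
        covered = sum(any((x - sx) ** 2 + (y - sy) ** 2 <= RADIUS ** 2
                          for sx, sy in sites) for x, y in DEMAND)
        clashes = sum((ax - bx) ** 2 + (ay - by) ** 2 < MIN_SEP ** 2
                      for i, (ax, ay) in enumerate(sites)
                      for bx, by in sites[i + 1:])
        return covered - 20 * clashes

    def random_sites():
        return [(random.uniform(0, 2), random.uniform(0, 3)) for _ in range(N_SITES)]

    pop = [random_sites() for _ in range(40)]
    for _ in range(80):                              # generations
        pop.sort(key=fitness, reverse=True)
        parents, children = pop[:10], []
        while len(children) < 30:
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, N_SITES)
            child = a[:cut] + b[cut:]                # one-point crossover
            if random.random() < 0.3:                # mutation: jitter one site
                i = random.randrange(N_SITES)
                child[i] = (child[i][0] + random.gauss(0, 0.1),
                            child[i][1] + random.gauss(0, 0.1))
            children.append(child)
        pop = parents + children
    print("best fitness:", fitness(max(pop, key=fitness)))
    ```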

  19. Antibiotic prophylaxis in cataract surgery in the setting of penicillin allergy: A decision-making algorithm.

    PubMed

    LaHood, Benjamin R; Andrew, Nicholas H; Goggin, Michael

    Cataract surgery is the most commonly performed surgical procedure in many developed countries. Postoperative endophthalmitis is a rare complication with potentially devastating visual outcomes. Currently, there is no global consensus regarding antibiotic prophylaxis in cataract surgery despite growing evidence of the benefits of prophylactic intracameral cefuroxime at the conclusion of surgery. The decision about which antibiotic regimen to use is further complicated in patients reporting penicillin allergy. Historic statistics suggesting cross-reactivity of penicillins and cephalosporins have persisted into modern surgery. It is important for ophthalmologists to consider all available antibiotic options and have an up-to-date knowledge of antibiotic cross-reactivity when faced with the dilemma of choosing appropriate antibiotic prophylaxis for patients undergoing cataract surgery with a history of penicillin allergy. Each option carries risks, and the choice may have medicolegal implications in the event of an adverse outcome. We assess the options for antibiotic prophylaxis in cataract surgery in the setting of penicillin allergy and provide an algorithm to assist decision-making for individual patients. Crown Copyright © 2017. Published by Elsevier Inc. All rights reserved.

  20. Applications of fuzzy logic to control and decision making

    NASA Technical Reports Server (NTRS)

    Lea, Robert N.; Jani, Yashvant

    1991-01-01

    Long range space missions will require high operational efficiency as well as autonomy to enhance the effectiveness of performance. Fuzzy logic technology has been shown to be powerful and robust in interpreting imprecise measurements and generating appropriate control decisions for many space operations. Several applications are underway, studying the fuzzy logic approach to solving control and decision making problems. Fuzzy logic algorithms for relative motion and attitude control have been developed and demonstrated for proximity operations. Based on this experience, motion control algorithms that include obstacle avoidance were developed for a Mars Rover prototype for maneuvering during the sample collection process. A concept of an intelligent sensor system that can identify objects, track them continuously and learn from its environment is under development to support traffic management and proximity operations around the Space Station Freedom. For safe and reliable operation of Lunar/Mars based crew quarters, high speed controllers with the ability to combine imprecise measurements from several sensors are required. A fuzzy logic approach that uses high speed fuzzy hardware chips is being studied.
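
    A minimal illustration of the fuzzy-logic pattern involved: shoulder memberships, min-rule firing and weighted-average defuzzification. The braking scenario and all ranges are hypothetical, not flight software:

    ```python
    def ramp_up(x, a, b):
        """Shoulder membership: 0 below a, 1 above b, linear in between."""
        if x <= a:
            return 0.0
        if x >= b:
            return 1.0
        return (x - a) / (b - a)

    def fuzzy_braking(distance, closing_rate):
        """Two-rule Mamdani-style controller for a proximity manoeuvre."""
        near = 1.0 - ramp_up(distance, 10.0, 50.0)   # fully near below 10 m
        fast = ramp_up(closing_rate, 0.5, 2.0)       # fully fast above 2 m/s
        # Each rule: firing strength via min, consequent braking level in [0, 1].
        rules = [(min(near, fast), 0.9),             # near & fast -> brake hard
                 (min(1 - near, 1 - fast), 0.1)]     # far & slow -> brake gently
        den = sum(w for w, _ in rules)
        return sum(w * out for w, out in rules) / den if den else 0.0

    print(round(fuzzy_braking(distance=20.0, closing_rate=1.8), 3))
    ```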

  1. Constructing a clinical decision-making framework for image-guided radiotherapy using a Bayesian Network

    NASA Astrophysics Data System (ADS)

    Hargrave, C.; Moores, M.; Deegan, T.; Gibbs, A.; Poulsen, M.; Harden, F.; Mengersen, K.

    2014-03-01

    A decision-making framework for image-guided radiotherapy (IGRT) is being developed using a Bayesian Network (BN) to graphically describe, and probabilistically quantify, the many interacting factors that are involved in this complex clinical process. Outputs of the BN will provide decision-support for radiation therapists to assist them to make correct inferences relating to the likelihood of treatment delivery accuracy for a given image-guided set-up correction. The framework is being developed as a dynamic object-oriented BN, allowing for complex modelling with specific subregions, as well as representation of the sequential decision-making and belief updating associated with IGRT. A prototype graphic structure for the BN was developed by analysing IGRT practices at a local radiotherapy department and incorporating results obtained from a literature review. Clinical stakeholders reviewed the BN to validate its structure. The BN consists of a sub-network for evaluating the accuracy of IGRT practices and technology. The directed acyclic graph (DAG) contains nodes and directional arcs representing the causal relationship between the many interacting factors such as tumour site and its associated critical organs, technology and technique, and inter-user variability. The BN was extended to support on-line and off-line decision-making with respect to treatment plan compliance. Following conceptualisation of the framework, the BN will be quantified. It is anticipated that the finalised decision-making framework will provide a foundation to develop better decision-support strategies and automated correction algorithms for IGRT.
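
    The belief-updating core of such a framework can be illustrated with a two-node network and Bayes' rule; all probabilities below are invented placeholders, not values elicited from the radiotherapy department:

    ```python
    # Two-node BN: hidden set-up accuracy -> observed image-guidance match.
    p_inaccurate = 0.15                  # prior P(set-up inaccurate)
    p_mismatch_if_inacc = 0.80           # P(image mismatch | inaccurate)
    p_mismatch_if_acc = 0.05             # P(image mismatch | accurate)

    def posterior_inaccurate(mismatch_observed: bool) -> float:
        """Update the belief about set-up accuracy after an imaging observation."""
        like_inacc = p_mismatch_if_inacc if mismatch_observed else 1 - p_mismatch_if_inacc
        like_acc = p_mismatch_if_acc if mismatch_observed else 1 - p_mismatch_if_acc
        evidence = like_inacc * p_inaccurate + like_acc * (1 - p_inaccurate)
        return like_inacc * p_inaccurate / evidence

    print(round(posterior_inaccurate(True), 3))   # ~0.739: mismatch raises concern
    print(round(posterior_inaccurate(False), 3))  # ~0.036: clean match lowers it
    ```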

  2. Rough Set Based Splitting Criterion for Binary Decision Tree Classifiers

    DTIC Science & Technology

    2006-09-26

    Alata, O., Fernandez-Maloigne, C., and Ferrie, J.C. (2001). Unsupervised Algorithm for the Segmentation of Three-Dimensional Magnetic Resonance Brain ...instinctual and learned responses in the brain, causing it to make decisions based on patterns in the stimuli. Using this deceptively simple process...2001. [2] Bohn, C. (1997). An Incremental Unsupervised Learning Scheme for Function Approximation. In: Proceedings of the 1997 IEEE International

  3. Econ's optimal decision model of wheat production and distribution-documentation

    NASA Technical Reports Server (NTRS)

    1977-01-01

    The report documents the computer programs written to implement the ECON optimal decision model. The programs were written in APL, an extremely compact and powerful language particularly well suited to this model, which makes extensive use of matrix manipulations. The algorithms used are presented and listings of and descriptive information on the APL programs used are given. Possible changes in input data are also given.

  4. Modeling paradigms for medical diagnostic decision support: a survey and future directions.

    PubMed

    Wagholikar, Kavishwar B; Sundararajan, Vijayraghavan; Deshpande, Ashok W

    2012-10-01

    Use of computer-based decision tools to aid clinical decision making has been a primary goal of research in biomedical informatics. Research in the last five decades has led to the development of Medical Decision Support (MDS) applications using a variety of modeling techniques, for a diverse range of medical decision problems. This paper surveys literature on modeling techniques for diagnostic decision support, with a focus on decision accuracy. Trends and shortcomings of research in this area are discussed and future directions are provided. The authors suggest that: (i) improvement in the accuracy of MDS applications may be possible by modeling of vague and temporal data, research on inference algorithms, integration of patient information from diverse sources and improvement in gene profiling algorithms; (ii) MDS research would be facilitated by public release of de-identified medical datasets, and development of open-source data-mining toolkits; (iii) comparative evaluations of different modeling techniques are required to understand characteristics of the techniques, which can guide developers in the choice of technique for a particular medical decision problem; and (iv) evaluations of MDS applications in clinical settings are necessary to foster physicians' utilization of these decision aids.

  5. Mutually Augmented Cognition

    NASA Astrophysics Data System (ADS)

    Friesdorf, Florian; Pangercic, Dejan; Bubb, Heiner; Beetz, Michael

    In MAC, an ergonomic dialog system and algorithms will be developed that enable human experts and companions to be integrated into the knowledge gathering and decision making processes of highly complex cognitive systems (e.g. the Assistive Household, as manifested further in the paper). To this end we propose to join algorithms and methodologies coming from Ergonomics and Artificial Intelligence that: a) make cognitive systems more congenial for non-expert humans, b) facilitate their comprehension by utilizing a high-level expandable control code for human experts and c) augment the representation of such cognitive systems into a “deep representation” obtained through interaction with human companions.

  6. Data-driven modeling of hydroclimatic trends and soil moisture: Multi-scale data integration and decision support

    NASA Astrophysics Data System (ADS)

    Coopersmith, Evan Joseph

    The techniques and information employed for decision-making vary with the spatial and temporal scope of the assessment required. In modern agriculture, the farm owner or manager makes decisions on a day-to-day or even hour-to-hour basis for dozens of fields scattered over as much as a fifty-mile radius from some central location. Following precipitation events, land begins to dry. Land-owners and managers often trace serpentine paths of 150+ miles every morning to inspect the conditions of their various parcels. His or her objective lies in appropriate resource usage -- is a given tract of land dry enough to be workable at this moment or would he or she be better served waiting patiently? Longer-term, these owners and managers decide upon which seeds will grow most effectively and which crops will make their operations profitable. At even longer temporal scales, decisions are made regarding which fields must be acquired and sold and what types of equipment will be necessary in future operations. This work develops and validates algorithms for these shorter-term decisions, along with models of national climate patterns and climate changes to enable longer-term operational planning. A test site at the University of Illinois South Farms (Urbana, IL, USA) served as the primary location to validate machine learning algorithms, employing public sources of precipitation and potential evapotranspiration to model the wetting/drying process. In expanding such local decision support tools to locations on a national scale, one must recognize the heterogeneity of hydroclimatic and soil characteristics throughout the United States. Machine learning algorithms modeling the wetting/drying process must address this variability, and yet it is wholly impractical to construct a separate algorithm for every conceivable location. For this reason, a national hydrological classification system is presented, allowing clusters of hydroclimatic similarity to emerge naturally from annual regime curve data and facilitate the development of cluster-specific algorithms. Given the desire to enable intelligent decision-making at any location, this classification system is developed in a manner that will allow for classification anywhere in the U.S., even in an ungauged basin. Daily time series data from 428 catchments in the MOPEX database are analyzed to produce an empirical classification tree, partitioning the United States into regions of hydroclimatic similarity. In constructing a classification tree based upon 55 years of data, it is important to recognize the non-stationary nature of climate data. The shifts in climatic regimes will cause certain locations to shift their ultimate position within the classification tree, requiring decision-makers to alter land usage, farming practices, and equipment needs, and algorithms to adjust accordingly. This work adapts the classification model to address the issue of regime shifts over larger temporal scales and suggests how land-usage and farming protocol may vary from hydroclimatic shifts in decades to come. Finally, the generalizability of the hydroclimatic classification system is tested with a physically-based soil moisture model calibrated at several locations throughout the continental United States. The soil moisture model is calibrated at a given site and then applied with the same parameters at other sites within and outside the same hydroclimatic class. 
The model's performance deteriorates minimally if the calibration and validation locations are within the same hydroclimatic class, but deteriorates significantly if the calibration and validation sites are located in different hydroclimatic classes. These soil moisture estimates at the field scale are then further refined by the introduction of LiDAR elevation data, distinguishing faster-drying peaks and ridges from slower-drying valleys. The inclusion of LiDAR enabled multiple locations within the same field to be predicted accurately despite non-identical topography. This cross-application of parametric calibrations and LiDAR-driven disaggregation facilitates decision-support at locations without proximally-located soil moisture sensors.

  7. Identification of a Threshold Value for the DEMATEL Method: Using the Maximum Mean De-Entropy Algorithm

    NASA Astrophysics Data System (ADS)

    Chung-Wei, Li; Gwo-Hshiung, Tzeng

    To deal with complex problems, structuring them through graphical representations and analyzing causal influences can aid in illuminating complex issues, systems, or concepts. The DEMATEL method is a methodology which can be used for researching and solving complicated and intertwined problem groups. The end product of the DEMATEL process is a visual representation—the impact-relations map—by which respondents organize their own actions in the world. The applicability of the DEMATEL method is widespread, ranging from analyzing world problematique decision making to industrial planning. The most important property of the DEMATEL method used in the multi-criteria decision making (MCDM) field is to construct interrelations between criteria. In order to obtain a suitable impact-relations map, an appropriate threshold value is needed to obtain adequate information for further analysis and decision-making. In this paper, we propose a method based on the entropy approach, the maximum mean de-entropy algorithm, to achieve this purpose. Using real cases of finding the interrelationships between the criteria for evaluating effects in e-learning programs as an example, we compare the results obtained from the respondents with those obtained from our method, and discuss the differences between the impact-relations maps produced by these two methods.
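
    The matrix algebra behind the impact-relations map is standard DEMATEL: normalise the direct-influence matrix D and form the total-relation matrix T = N(I - N)^-1. The sketch below applies a simple mean-based cut-off for illustration, whereas the paper derives the threshold from the maximum mean de-entropy criterion:

    ```python
    import numpy as np

    # Illustrative 3-criteria direct-influence matrix (0 = none .. 4 = very high).
    D = np.array([[0, 3, 2],
                  [1, 0, 4],
                  [2, 1, 0]], dtype=float)

    N = D / D.sum(axis=1).max()               # normalise by the largest row sum
    T = N @ np.linalg.inv(np.eye(3) - N)      # total-relation matrix T = N(I-N)^-1

    threshold = T.mean()                      # naive cut-off; the MMDE algorithm
    impact_map = T > threshold                # would choose this value instead
    print(np.round(T, 2), round(float(threshold), 2), impact_map, sep="\n")
    ```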

  8. Multimodal Logistics Network Design over Planning Horizon through a Hybrid Meta-Heuristic Approach

    NASA Astrophysics Data System (ADS)

    Shimizu, Yoshiaki; Yamazaki, Yoshihiro; Wada, Takeshi

    Logistics has been acknowledged increasingly as a key issue of supply chain management for improving business efficiency under global competition and diversified customer demands. This study aims at improving the quality of strategic decision making associated with the dynamic natures of logistics network optimization. In particular, noting the importance of multimodal logistics over multiple terms, we have extended a previous approach termed hybrid tabu search (HybTS). The extension aims to deploy strategic planning more concretely, so that the strategic plan can be linked to operational decision making. The idea is a smart extension of HybTS to solve a dynamic mixed integer programming problem. It is a two-level iterative method composed of a sophisticated tabu search for the location problem at the upper level and a graph algorithm for the route selection at the lower level. To retain efficiency while coping with the resulting extremely large-scale problem, we invented a systematic procedure to transform the original linear program at the lower level into a minimum cost flow problem solvable by the graph algorithm. Through numerical experiments, we verified that the proposed method outperformed commercial software. The results indicate that the proposed approach can make conventional strategic decisions much more practical and is promising for real-world applications.
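
    The lower-level route-selection step can be posed as a minimum cost flow problem, as the abstract notes. A tiny sketch with networkx on an invented plant-depot-customer network (the upper-level tabu search over facility openings is not shown):

    ```python
    import networkx as nx

    G = nx.DiGraph()
    G.add_node("plant", demand=-10)                 # supplies 10 units
    G.add_node("customer", demand=10)               # requires 10 units
    for depot, cost in [("depot_A", 3), ("depot_B", 5)]:
        G.add_edge("plant", depot, weight=cost, capacity=8)
        G.add_edge(depot, "customer", weight=2, capacity=8)

    flow = nx.min_cost_flow(G)                      # flow[u][v] = units shipped
    print(flow["plant"])                            # 8 via depot_A, 2 via depot_B
    ```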

  9. Many-objective robust decision making for water allocation under climate change.

    PubMed

    Yan, Dan; Ludwig, Fulco; Huang, He Qing; Werners, Saskia E

    2017-12-31

    Water allocation is facing profound challenges due to climate change uncertainties. To identify adaptive water allocation strategies that are robust to climate change uncertainties, a model framework combining many-objective robust decision making and biophysical modeling is developed for large rivers. The framework was applied to the Pearl River basin (PRB), China, where sufficient flow to the delta is required to reduce saltwater intrusion in the dry season. Before identifying and assessing robust water allocation plans for the future, the performance of ten state-of-the-art MOEAs (multi-objective evolutionary algorithms) is evaluated for the water allocation problem in the PRB. The Borg multi-objective evolutionary algorithm (Borg MOEA), which is a self-adaptive optimization algorithm, has the best performance during the historical periods. Therefore it is selected to generate new water allocation plans for the future (2079-2099). This study shows that robust decision making using carefully selected MOEAs can help limit saltwater intrusion in the Pearl River Delta. However, the framework could perform poorly due to larger than expected climate change impacts on water availability. Results also show that subjective design choices by the researchers and/or water managers could potentially affect the performance of the model framework and cause even the most robust water allocation plans to fail under future climate change. Developing robust allocation plans in a river basin suffering from increasing water shortage requires researchers and water managers to characterize well both the future climate change of the study regions and the vulnerabilities of their tools. Copyright © 2017 Elsevier B.V. All rights reserved.
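
    At the heart of many-objective comparison is Pareto dominance. A small sketch of a non-dominated filter over hypothetical allocation plans scored on minimisation objectives (the Borg MOEA adds adaptive operators, archiving and epsilon-dominance on top of this basic idea):

    ```python
    import numpy as np

    def non_dominated(points):
        """Boolean mask of Pareto-efficient rows; all objectives are minimised."""
        pts = np.asarray(points, dtype=float)
        keep = np.ones(len(pts), dtype=bool)
        for i in range(len(pts)):
            for j in range(len(pts)):
                if i != j and np.all(pts[j] <= pts[i]) and np.any(pts[j] < pts[i]):
                    keep[i] = False          # plan i is dominated by plan j
                    break
        return keep

    # Hypothetical plans scored on [cost, dry-season shortage]; lower is better.
    plans = np.array([[2.0, 5.0], [3.0, 4.0], [4.0, 6.0]])
    print(non_dominated(plans))              # [ True  True False]
    ```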

  10. A Decision Making Methodology in Support of the Business Rules Lifecycle

    NASA Technical Reports Server (NTRS)

    Wild, Christopher; Rosca, Daniela

    1998-01-01

    The business rules that underlie an enterprise emerge as a new category of system requirements that represent decisions about how to run the business, and which are characterized by their business-orientation and their propensity for change. In this report, we introduce a decision making methodology which addresses several aspects of the business rules lifecycle: acquisition, deployment and evolution. We describe a meta-model for representing business rules in terms of an enterprise model, and also a decision support submodel for reasoning about and deriving the rules. The possibility for lifecycle automated assistance is demonstrated in terms of the automatic extraction of business rules from the decision structure. A system based on the metamodel has been implemented, including the extraction algorithm. This is the final report for Daniela Rosca's PhD fellowship. It describes the work we have done over the past year, current research and the list of publications associated with her thesis topic.

  11. Uncertainty Representation and Interpretation in Model-Based Prognostics Algorithms Based on Kalman Filter Estimation

    NASA Technical Reports Server (NTRS)

    Galvan, Jose Ramon; Saxena, Abhinav; Goebel, Kai Frank

    2012-01-01

    This article discusses several aspects of uncertainty representation and management for model-based prognostics methodologies based on our experience with Kalman filters when applied to prognostics for electronics components. In particular, it explores the implications of modeling remaining useful life prediction as a stochastic process, and how this relates to uncertainty representation and management and to the role of prognostics in decision-making. A distinction between two interpretations of the estimated remaining useful life probability density function is explained, and a cautionary argument is provided against mixing the two interpretations when using prognostics to make critical decisions.

  12. Imaging spectroscopy: Earth and planetary remote sensing with the USGS Tetracorder and expert systems

    USGS Publications Warehouse

    Clark, Roger N.; Swayze, Gregg A.; Livo, K. Eric; Kokaly, Raymond F.; Sutley, Steve J.; Dalton, J. Brad; McDougal, Robert R.; Gent, Carol A.

    2003-01-01

    Imaging spectroscopy is a tool that can be used to spectrally identify and spatially map materials based on their specific chemical bonds. Spectroscopic analysis requires significantly more sophistication than has been employed in conventional broadband remote sensing analysis. We describe a new system that is effective at material identification and mapping: a set of algorithms within an expert system decision‐making framework that we call Tetracorder. The expertise in the system has been derived from scientific knowledge of spectral identification. The expert system rules are implemented in a decision tree where multiple algorithms are applied to spectral analysis, additional expert rules and algorithms can be applied based on initial results, and more decisions are made until spectral analysis is complete. Because certain spectral features are indicative of specific chemical bonds in materials, the system can accurately identify and map those materials. In this paper we describe the framework of the decision making process used for spectral identification, describe specific spectral feature analysis algorithms, and give examples of what analyses and types of maps are possible with imaging spectroscopy data. We also present the expert system rules that describe which diagnostic spectral features are used in the decision making process for a set of spectra of minerals and other common materials. We demonstrate the applications of Tetracorder to identify and map surface minerals, to detect sources of acid rock drainage, and to map vegetation species, ice, melting snow, water, and water pollution, all with one set of expert system rules. Mineral mapping can aid in geologic mapping and fault detection and can provide a better understanding of weathering, mineralization, hydrothermal alteration, and other geologic processes. Environmental site assessment, such as mapping source areas of acid mine drainage, has resulted in the acceleration of site cleanup, saving millions of dollars and years in cleanup time. Imaging spectroscopy data and Tetracorder analysis can be used to study both terrestrial and planetary science problems. Imaging spectroscopy can be used to probe planetary systems, including their atmospheres, oceans, and land surfaces.

  13. Thresholds for conservation and management: structured decision making as a conceptual framework

    USGS Publications Warehouse

    Nichols, James D.; Eaton, Mitchell J.; Martin, Julien; Edited by Guntenspergen, Glenn R.

    2014-01-01

    Ecological thresholds are values of system state variables at which small changes bring about substantial changes in system dynamics. They are frequently incorporated into ecological models used to project system responses to management actions. Utility thresholds are components of management objectives and are values of state or performance variables at which small changes yield substantial changes in the value of the management outcome. Decision thresholds are values of system state variables at which small changes prompt changes in management actions in order to reach specified management objectives. Decision thresholds are derived from the other components of the decision process. We advocate a structured decision making (SDM) approach within which the following components are identified: objectives (possibly including utility thresholds), potential actions, models (possibly including ecological thresholds), monitoring program, and a solution algorithm (which produces decision thresholds). Adaptive resource management (ARM) is described as a special case of SDM developed for recurrent decision problems that are characterized by uncertainty. We believe that SDM, in general, and ARM, in particular, provide good approaches to conservation and management. Use of SDM and ARM also clarifies the distinct roles of ecological thresholds, utility thresholds, and decision thresholds in informed decision processes.

  14. MADM-based smart parking guidance algorithm

    PubMed Central

    Li, Bo; Pei, Yijian; Wu, Hao; Huang, Dijiang

    2017-01-01

    In smart parking environments, how to choose suitable parking facilities with various attributes to satisfy certain criteria is an important decision issue. Based on the multiple attributes decision making (MADM) theory, this study proposed a smart parking guidance algorithm by considering three representative decision factors (i.e., walk duration, parking fee, and the number of vacant parking spaces) and various preferences of drivers. In this paper, the expected number of vacant parking spaces is regarded as an important attribute to reflect the difficulty degree of finding available parking spaces, and a queueing theory-based theoretical method was proposed to estimate this expected number for candidate parking facilities with different capacities, arrival rates, and service rates. The effectiveness of the MADM-based parking guidance algorithm was investigated and compared with a blind search-based approach in comprehensive scenarios with various distributions of parking facilities, traffic intensities, and user preferences. Experimental results show that the proposed MADM-based algorithm is effective to choose suitable parking resources to satisfy users’ preferences. Furthermore, it has also been observed that this newly proposed Markov Chain-based availability attribute is more effective to represent the availability of parking spaces than the arrival rate-based availability attribute proposed in existing research. PMID:29236698
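
    The weighted-sum MADM step can be sketched in a few lines; the facility attributes, benefit/cost flags and driver weights below are invented, and the paper's queueing-based estimate of vacant spaces is replaced by a precomputed column of expected values:

    ```python
    import numpy as np

    # Columns: walk duration (min), parking fee ($/h), expected vacant spaces.
    facilities = np.array([[5.0, 2.0, 12.0],
                           [2.0, 4.0,  3.0],
                           [8.0, 1.0, 30.0]])
    benefit = np.array([False, False, True])   # higher-is-better flag per column
    weights = np.array([0.5, 0.3, 0.2])        # a driver who mostly hates walking

    norm = facilities / facilities.max(axis=0) # linear rescale to [0, 1]
    scores = np.where(benefit, norm, 1 - norm) @ weights
    print("choose facility", int(scores.argmax()))   # -> facility 2 here
    ```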

  15. Basic physiological systems indicator's informative assessment for children and adolescents obesity diagnosis tasks

    NASA Astrophysics Data System (ADS)

    Marukhina, O. V.; Berestneva, O. G.; Emelyanova, Yu A.; Romanchukov, S. V.; Petrova, L.; Lombardo, C.; Kozlova, N. V.

    2018-05-01

    The computerization of healthcare creates opportunities for the development of clinical decision support systems. In the course of diagnosis, a doctor manipulates a considerable amount of data and makes decisions under uncertainty, drawing on first-hand experience and knowledge. The situation is exacerbated by the fact that the scope of knowledge in medicine is growing steadily, while the time available for decision-making does not increase. The amount of medical malpractice is growing, leading to various negative effects, including an increase in mortality. The development of IT solutions for clinical purposes is one of the most promising and efficient ways to prevent these effects. That is why the efforts of many IT specialists are directed at software that simulates a doctor's heuristics and at the development of expert-based medical decision-making algorithms. Thus, the objective of this study is to develop techniques and approaches for assessing the informative value of basic physiological system indicators for evaluating the degree of obesity, based on diagnostic findings.

  16. Decision making and problem solving with computer assistance

    NASA Technical Reports Server (NTRS)

    Kraiss, F.

    1980-01-01

    In modern guidance and control systems, the human as manager, supervisor, decision maker, problem solver and trouble shooter often has to cope with a marginal mental workload. To improve this situation, computers should be used to relieve the operator of mental stress. This should not be done solely by increased automation, but by a reasonable sharing of tasks in a human-computer team, where the computer supports the human intelligence. Recent developments in this area are summarized. It is shown that interactive support of the operator by an intelligent computer is feasible during information evaluation, decision making and problem solving. The applied artificial intelligence algorithms comprise pattern recognition and classification, adaptation and machine learning, as well as dynamic and heuristic programming. Elementary examples are presented to explain the basic principles.

  17. Research on intelligent recommendation algorithm of e-commerce based on association rules

    NASA Astrophysics Data System (ADS)

    Shen, Jiajie; Cheng, Xianyi

    2017-09-01

    As the commodities offered by e-commerce become richer and richer, more and more consumers are willing to shop online; however, faced with such abundant commodity information, customers often experience choice fatigue. Therefore, we need a recommendation algorithm that uses customers' recent behaviour, including browsing and purchasing, to predict and intelligently recommend the goods they need, thereby improving customer satisfaction and increasing the profit of e-commerce. This paper first discusses recommendation algorithms and then improves the Apriori algorithm. Finally, a commodity recommendation algorithm is implemented in the R language. The results show that this algorithm provides useful decision support for customers buying commodities.
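
    A compact level-wise frequent-itemset search in the spirit of Apriori (support counting by subset tests; the baskets are invented placeholders, and the paper's improvements to Apriori are not reproduced):

    ```python
    def frequent_itemsets(transactions, min_support=0.5):
        """Tiny Apriori-style miner: grow itemsets level by level, keeping
        only those whose support meets the threshold."""
        n = len(transactions)
        support = lambda s: sum(s <= t for t in transactions) / n
        level = {s for s in {frozenset([i]) for t in transactions for i in t}
                 if support(s) >= min_support}
        frequent, k = set(level), 2
        while level:
            candidates = {a | b for a in level for b in level if len(a | b) == k}
            level = {c for c in candidates if support(c) >= min_support}
            frequent |= level
            k += 1
        return frequent

    baskets = [frozenset(b) for b in
               [{"phone", "case"}, {"phone", "charger"}, {"phone", "case", "charger"}]]
    print(frequent_itemsets(baskets, min_support=0.6))
    ```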

  18. MDCT quantification is the dominant parameter in decision-making regarding chest tube drainage for stable patients with traumatic pneumothorax

    PubMed Central

    Cai, Wenli; Lee, June-Goo; Fikry, Karim; Yoshida, Hiroyuki; Novelline, Robert; de Moya, Marc

    2013-01-01

    It is commonly believed that the size of a pneumothorax is an important determinant of treatment decision, in particular regarding whether chest tube drainage (CTD) is required. However, the volumetric quantification of pneumothoraces has not routinely been performed in clinics. In this paper, we introduce an automated computer-aided volumetry (CAV) scheme for quantification of the volume of pneumothoraces in chest multi-detector CT (MDCT) images. Moreover, we investigated the impact of accurate volume of pneumothoraces on the improvement of performance in decision-making regarding CTD in the management of traumatic pneumothoraces. For this purpose, an occurrence frequency map was calculated for quantitative analysis of the importance of each clinical parameter in the decision-making regarding CTD by a computer simulation of decision-making using a genetic algorithm (GA) and a support vector machine (SVM). A total of 14 clinical parameters, including the volume of pneumothorax calculated by our CAV scheme, were collected as parameters available for decision-making. The results showed that volume was the dominant parameter in decision-making regarding CTD, with an occurrence frequency value of 1.00. The results also indicated that the inclusion of volume provided the best performance, which was statistically significant compared to the other tests in which volume was excluded from the clinical parameters. This study provides scientific evidence for the application of a CAV scheme in MDCT volumetric quantification of pneumothoraces in the management of clinically stable chest trauma patients with traumatic pneumothorax. PMID:22560899

  19. Précis of Simple heuristics that make us smart.

    PubMed

    Todd, P M; Gigerenzer, G

    2000-10-01

    How can anyone be rational in a world where knowledge is limited, time is pressing, and deep thought is often an unattainable luxury? Traditional models of unbounded rationality and optimization in cognitive science, economics, and animal behavior have tended to view decision-makers as possessing supernatural powers of reason, limitless knowledge, and endless time. But understanding decisions in the real world requires a more psychologically plausible notion of bounded rationality. In Simple heuristics that make us smart (Gigerenzer et al. 1999), we explore fast and frugal heuristics--simple rules in the mind's adaptive toolbox for making decisions with realistic mental resources. These heuristics can enable both living organisms and artificial systems to make smart choices quickly and with a minimum of information by exploiting the way that information is structured in particular environments. In this précis, we show how simple building blocks that control information search, stop search, and make decisions can be put together to form classes of heuristics, including: ignorance-based and one-reason decision making for choice, elimination models for categorization, and satisficing heuristics for sequential search. These simple heuristics perform comparably to more complex algorithms, particularly when generalizing to new data--that is, simplicity leads to robustness. We present evidence regarding when people use simple heuristics and describe the challenges to be addressed by this research program.
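
    One of these one-reason heuristics, take-the-best, is easy to state in code: inspect cues in order of validity and decide on the first cue that discriminates between the options. A sketch with hypothetical cue values:

    ```python
    def take_the_best(option_a, option_b, cues):
        """Decide between two options using the most valid discriminating cue.
        Cue values are 1 (positive), 0 (negative) or None (unknown)."""
        for cue in sorted(cues, key=lambda c: c["validity"], reverse=True):
            a, b = cue["values"][option_a], cue["values"][option_b]
            if a is not None and b is not None and a != b:
                return option_a if a > b else option_b
        return option_a                      # nothing discriminates: guess

    # Hypothetical city-size cues in the style of Gigerenzer's examples.
    cues = [{"validity": 0.9, "values": {"A": 1, "B": 0}},   # major airport?
            {"validity": 0.8, "values": {"A": 0, "B": 1}}]   # state capital?
    print(take_the_best("A", "B", cues))     # most valid cue decides: A
    ```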

  20. Information search and decision making: effects of age and complexity on strategy use.

    PubMed

    Queen, Tara L; Hess, Thomas M; Ennis, Gilda E; Dowd, Keith; Grühn, Daniel

    2012-12-01

    The impact of task complexity on information search strategy and decision quality was examined in a sample of 135 young, middle-aged, and older adults. We were particularly interested in the competing roles of fluid cognitive ability and domain knowledge and experience, with the former being a negative influence and the latter being a positive influence on older adults' performance. Participants utilized 2 decision matrices, which varied in complexity, regarding a consumer purchase. Using process tracing software and an algorithm developed to assess decision strategy, we recorded search behavior, strategy selection, and final decision. Contrary to expectations, older adults were not more likely than the younger age groups to engage in information-minimizing search behaviors in response to increases in task complexity. Similarly, adults of all ages used comparable decision strategies and adapted their strategies to the demands of the task. We also examined decision outcomes in relation to participants' preferences. Overall, it seems that older adults utilize simpler sets of information primarily reflecting the most valued attributes in making their choice. The results of this study suggest that older adults are adaptive in their approach to decision making and that this ability may benefit from accrued knowledge and experience. 2013 APA, all rights reserved

  1. Neuroethology of Decision-making

    PubMed Central

    Adams, Geoffrey K.; Watson, Karli K.; Pearson, John; Platt, Michael L.

    2012-01-01

    A neuroethological approach to decision-making considers the effect of evolutionary pressures on neural circuits mediating choice. In this view, decision systems are expected to enhance fitness with respect to the local environment, and particularly efficient solutions to specific problems should be conserved, expanded, and repurposed to solve other problems. Here, we discuss basic prerequisites for a variety of decision systems from this viewpoint. We focus on two of the best-studied and most widely represented decision problems. First, we examine patch leaving, a prototype of environmentally based switching between action patterns. Second, we consider social information seeking, a process resembling foraging with search costs. We argue that while the specific neural solutions to these problems sometimes differ across species, both the problems themselves and the algorithms instantiated by biological hardware are repeated widely throughout nature. The behavioral and mathematical study of ubiquitous decision processes like patch leaving and social information seeking thus provides a powerful new approach to uncovering the fundamental design structure of nervous systems. PMID:22902613
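
    One classic algorithmic account of patch leaving (in the marginal-value style) is easy to sketch: leave once the instantaneous intake rate in a diminishing-returns patch drops below the long-run average rate including travel time. The gain function and parameters are illustrative:

    ```python
    import numpy as np

    def leave_time(g_max=10.0, tau=3.0, travel=2.0):
        """First time the within-patch intake rate falls below the average rate."""
        t = np.linspace(0.01, 30.0, 3000)
        gain = g_max * (1 - np.exp(-t / tau))           # cumulative, saturating
        instantaneous = (g_max / tau) * np.exp(-t / tau)
        average = gain / (t + travel)                   # includes travel cost
        return t[np.argmax(instantaneous < average)]    # first crossing

    print(round(float(leave_time()), 2), "time units in the patch")
    ```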

  2. Making decisions at the end of life when caring for a person with dementia: a literature review to explore the potential use of heuristics in difficult decision-making.

    PubMed

    Mathew, R; Davies, N; Manthorpe, J; Iliffe, S

    2016-07-19

    Decision-making, when providing care and treatment for a person with dementia at the end of life, can be complex and challenging. There is a lack of guidance available to support practitioners and family carers, and even those experienced in end of life dementia care report a lack of confidence in decision-making. It is thought that the use of heuristics (rules of thumb) may aid decision-making. The aim of this study is to identify whether heuristics are used in end of life dementia care, and if so, to identify the context in which they are being used. A narrative literature review was conducted taking a systematic approach to the search strategy, using the Centre for Reviews and Dissemination guidelines. Rapid appraisal methodology was used in order to source specific and relevant literature regarding the use of heuristics in end of life dementia care. A search using terms related to dementia, palliative care and decision-making was conducted across 4 English language electronic databases (MEDLINE, EMBASE, PsycINFO and CINAHL) in 2015. The search identified 12 papers that contained an algorithm, guideline, decision tool or set of principles that we considered compatible with heuristic decision-making. The papers addressed swallowing and feeding difficulties, the treatment of pneumonia, management of pain and agitation, rationalising medication, ending life-sustaining treatment, and ensuring a good death. The use of heuristics in palliative or end of life dementia care is not described in the research literature. However, this review identified important decision-making principles, which are largely a reflection of expert opinion. These principles may have the potential to be developed into simple heuristics that could be used in practice. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  3. Making decisions at the end of life when caring for a person with dementia: a literature review to explore the potential use of heuristics in difficult decision-making

    PubMed Central

    Mathew, R; Davies, N; Manthorpe, J; Iliffe, S

    2016-01-01

    Objective Decision-making, when providing care and treatment for a person with dementia at the end of life, can be complex and challenging. There is a lack of guidance available to support practitioners and family carers, and even those experienced in end of life dementia care report a lack of confidence in decision-making. It is thought that the use of heuristics (rules of thumb) may aid decision-making. The aim of this study is to identify whether heuristics are used in end of life dementia care, and if so, to identify the context in which they are being used. Design A narrative literature review was conducted taking a systematic approach to the search strategy, using the Centre for Reviews and Dissemination guidelines. Rapid appraisal methodology was used in order to source specific and relevant literature regarding the use of heuristics in end of life dementia care. Data sources A search using terms related to dementia, palliative care and decision-making was conducted across 4 English language electronic databases (MEDLINE, EMBASE, PsycINFO and CINAHL) in 2015. Results The search identified 12 papers that contained an algorithm, guideline, decision tool or set of principles that we considered compatible with heuristic decision-making. The papers addressed swallowing and feeding difficulties, the treatment of pneumonia, management of pain and agitation, rationalising medication, ending life-sustaining treatment, and ensuring a good death. Conclusions The use of heuristics in palliative or end of life dementia care is not described in the research literature. However, this review identified important decision-making principles, which are largely a reflection of expert opinion. These principles may have the potential to be developed into simple heuristics that could be used in practice. PMID:27436665

  4. MDCT quantification is the dominant parameter in decision-making regarding chest tube drainage for stable patients with traumatic pneumothorax.

    PubMed

    Cai, Wenli; Lee, June-Goo; Fikry, Karim; Yoshida, Hiroyuki; Novelline, Robert; de Moya, Marc

    2012-07-01

    It is commonly believed that the size of a pneumothorax is an important determinant of treatment decisions, in particular regarding whether chest tube drainage (CTD) is required. However, volumetric quantification of pneumothoraces has not routinely been performed in clinics. In this paper, we introduce an automated computer-aided volumetry (CAV) scheme for quantifying the volume of pneumothoraces in chest multi-detector CT (MDCT) images. Moreover, we investigate the impact of accurate pneumothorax volume on improving the performance of decision-making regarding CTD in the management of traumatic pneumothoraces. For this purpose, an occurrence frequency map was calculated for quantitative analysis of the importance of each clinical parameter in the decision-making regarding CTD by a computer simulation of decision-making using a genetic algorithm (GA) and a support vector machine (SVM). A total of 14 clinical parameters, including volume of pneumothorax calculated by our CAV scheme, were collected as parameters available for decision-making. The results showed that volume was the dominant parameter in decision-making regarding CTD, with an occurrence frequency value of 1.00. The results also indicated that the inclusion of volume provided the best performance, statistically significantly better than the tests in which volume was excluded from the clinical parameters. This study provides scientific evidence for the application of a CAV scheme for MDCT volumetric quantification of pneumothoraces in the management of clinically stable chest trauma patients with traumatic pneumothorax. Copyright © 2012 Elsevier Ltd. All rights reserved.
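
    The occurrence-frequency analysis described above can be sketched as a genetic-algorithm wrapper around an SVM classifier: each chromosome is a bitmask over the clinical parameters, fitness is cross-validated SVM accuracy, and the frequency with which a parameter survives into the best subsets across independent runs is its occurrence value. The synthetic data and all GA settings below are illustrative assumptions, not the authors' implementation.

```python
# Sketch of GA-wrapped SVM feature selection with an "occurrence frequency"
# readout. Synthetic data and GA settings are illustrative assumptions.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 14))           # 14 clinical parameters (synthetic)
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(0, 0.5, 120) > 0).astype(int)

def fitness(mask):
    if not mask.any():
        return 0.0
    return cross_val_score(SVC(kernel="linear"), X[:, mask], y, cv=5).mean()

def run_ga(pop_size=20, generations=30, p_mut=0.05):
    pop = rng.integers(0, 2, size=(pop_size, X.shape[1])).astype(bool)
    for _ in range(generations):
        scores = np.array([fitness(ind) for ind in pop])
        # tournament selection of parents
        parents = pop[[max(rng.choice(pop_size, 2), key=lambda i: scores[i])
                       for _ in range(pop_size)]]
        # one-point crossover followed by bit-flip mutation
        children = parents.copy()
        for i in range(0, pop_size - 1, 2):
            cut = rng.integers(1, X.shape[1])
            children[i, cut:] = parents[i + 1, cut:]
            children[i + 1, cut:] = parents[i, cut:]
        children ^= rng.random(children.shape) < p_mut
        pop = children
    scores = np.array([fitness(ind) for ind in pop])
    return pop[scores.argmax()]

# Occurrence frequency: fraction of independent GA runs whose best feature
# subset includes each parameter (1.00 = selected in every run).
best = np.array([run_ga() for _ in range(10)])
print(best.mean(axis=0))
```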

  5. Herding, social influence and economic decision-making: socio-psychological and neuroscientific analyses.

    PubMed

    Baddeley, Michelle

    2010-01-27

    Typically, modern economics has steered away from the analysis of sociological and psychological factors and has focused on narrow behavioural assumptions in which expectations are formed on the basis of mathematical algorithms. Blending together ideas from the social and behavioural sciences, this paper argues that the behavioural approach adopted in most economic analysis, in its neglect of sociological and psychological forces and its simplistically dichotomous categorization of behaviour as either rational or not rational, is too narrow and stark. Behaviour may reflect an interaction of cognitive and emotional factors and this can be captured more effectively using an approach that focuses on the interplay of different decision-making systems. In understanding the mechanisms affecting economic and financial decision-making, an interdisciplinary approach is needed which incorporates ideas from a range of disciplines including sociology, economic psychology, evolutionary biology and neuroeconomics.

  6. A Multicriteria Decision Making Approach for Estimating the Number of Clusters in a Data Set

    PubMed Central

    Peng, Yi; Zhang, Yong; Kou, Gang; Shi, Yong

    2012-01-01

    Determining the number of clusters in a data set is an essential yet difficult step in cluster analysis. Since this task involves more than one criterion, it can be modeled as a multiple criteria decision making (MCDM) problem. This paper proposes an MCDM-based approach to estimate the number of clusters for a given data set. In this approach, MCDM methods treat different numbers of clusters as alternatives and the outputs of any clustering algorithm on validity measures as criteria. The proposed method is examined in an experimental study using three MCDM methods, the well-known k-means clustering algorithm, ten relative measures, and fifteen public-domain UCI machine learning data sets. The results show that MCDM methods work fairly well in estimating the number of clusters in the data and outperform the ten relative measures considered in the study. PMID:22870181
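
    The MCDM framing can be sketched as follows: candidate cluster counts are the alternatives, clustering validity indices are the criteria, and an aggregation rule ranks the alternatives. A plain weighted-sum ranking stands in here for the MCDM methods used in the paper; the data set and equal weights are illustrative assumptions.

```python
# Sketch: candidate k values as alternatives, validity indices as criteria,
# weighted-sum aggregation as a stand-in MCDM ranking rule.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs
from sklearn.metrics import (silhouette_score, calinski_harabasz_score,
                             davies_bouldin_score)

X, _ = make_blobs(n_samples=300, centers=4, random_state=0)
ks = list(range(2, 9))
rows = []
for k in ks:
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    rows.append([silhouette_score(X, labels),
                 calinski_harabasz_score(X, labels),
                 -davies_bouldin_score(X, labels)])  # negate: lower is better

M = np.array(rows)
# min-max normalise each criterion, then rank alternatives by weighted sum
M = (M - M.min(axis=0)) / (M.max(axis=0) - M.min(axis=0))
scores = M @ np.array([1 / 3, 1 / 3, 1 / 3])
print("estimated number of clusters:", ks[int(scores.argmax())])
```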

  7. Modeling Mixed Groups of Humans and Robots with Reflexive Game Theory

    NASA Astrophysics Data System (ADS)

    Tarasenko, Sergey

    Reflexive Game Theory (RGT) is based on decision-making principles similar to those used by humans. The theory considers groups of subjects and makes it possible to predict which action from a given set each subject in the group will choose; it is also possible to influence a subject's decision so that he or she makes a particular choice. The purpose of this study is to illustrate how robots can deter humans from risky actions. To identify risky actions, Asimov's Three Laws of Robotics are employed. By fusing RGT's power to convince humans on the mental level with the safety of Asimov's Laws, we illustrate how robots in mixed groups of humans and robots can influence human subjects in order to deter them from risky actions. We suggest that this fusion has the potential to equip robots that move and look like humans with human-like decision-making algorithms.

  8. Normalization is a general neural mechanism for context-dependent decision making

    PubMed Central

    Louie, Kenway; Khaw, Mel W.; Glimcher, Paul W.

    2013-01-01

    Understanding the neural code is critical to linking brain and behavior. In sensory systems, divisive normalization seems to be a canonical neural computation, observed in areas ranging from retina to cortex and mediating processes including contrast adaptation, surround suppression, visual attention, and multisensory integration. Recent electrophysiological studies have extended these insights beyond the sensory domain, demonstrating an analogous algorithm for the value signals that guide decision making, but the effects of normalization on choice behavior are unknown. Here, we show that choice models using normalization generate significant (and classically irrational) choice phenomena driven by either the value or number of alternative options. In value-guided choice experiments, both monkey and human choosers show novel context-dependent behavior consistent with normalization. These findings suggest that the neural mechanism of value coding critically influences stochastic choice behavior and provide a generalizable quantitative framework for examining context effects in decision making. PMID:23530203
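
    A worked toy version of the normalization computation: each option's value is divided by a semisaturation term plus the summed value of all options, and a softmax over the normalized values yields choice probabilities. Parameter values are illustrative assumptions; the point is that adding a third option reshuffles the relative preference between the first two.

```python
# Minimal sketch of divisive value normalisation followed by a softmax
# choice rule; sigma and beta values are illustrative assumptions.
import numpy as np

def choice_probs(values, sigma=1.0, beta=8.0):
    v = np.asarray(values, dtype=float)
    v_norm = v / (sigma + v.sum())            # divisive normalisation
    e = np.exp(beta * (v_norm - v_norm.max()))
    return e / e.sum()                        # softmax over normalised values

# Adding a low-value third option changes the *relative* preference between
# the first two options -- the context effect reported in the study.
print(choice_probs([10.0, 8.0]).round(3))
print(choice_probs([10.0, 8.0, 6.0]).round(3))
```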

  9. Herding, social influence and economic decision-making: socio-psychological and neuroscientific analyses

    PubMed Central

    Baddeley, Michelle

    2010-01-01

    Typically, modern economics has steered away from the analysis of sociological and psychological factors and has focused on narrow behavioural assumptions in which expectations are formed on the basis of mathematical algorithms. Blending together ideas from the social and behavioural sciences, this paper argues that the behavioural approach adopted in most economic analysis, in its neglect of sociological and psychological forces and its simplistically dichotomous categorization of behaviour as either rational or not rational, is too narrow and stark. Behaviour may reflect an interaction of cognitive and emotional factors and this can be captured more effectively using an approach that focuses on the interplay of different decision-making systems. In understanding the mechanisms affecting economic and financial decision-making, an interdisciplinary approach is needed which incorporates ideas from a range of disciplines including sociology, economic psychology, evolutionary biology and neuroeconomics. PMID:20026466

  10. Eye tracking and pupillometry are indicators of dissociable latent decision processes.

    PubMed

    Cavanagh, James F; Wiecki, Thomas V; Kochar, Angad; Frank, Michael J

    2014-08-01

    Can you predict what people are going to do just by watching them? This is certainly difficult: it would require a clear mapping between observable indicators and unobservable cognitive states. In this report, we demonstrate how this is possible by monitoring eye gaze and pupil dilation, which predict dissociable biases during decision making. We quantified decision making using the drift diffusion model (DDM), which provides an algorithmic account of how evidence accumulation and response caution contribute to decisions through separate latent parameters of drift rate and decision threshold, respectively. We used a hierarchical Bayesian estimation approach to assess the single trial influence of observable physiological signals on these latent DDM parameters. Increased eye gaze dwell time specifically predicted an increased drift rate toward the fixated option, irrespective of the value of the option. In contrast, greater pupil dilation specifically predicted an increase in decision threshold during difficult decisions. These findings suggest that eye tracking and pupillometry reflect the operations of dissociated latent decision processes. PsycINFO Database Record (c) 2014 APA, all rights reserved.
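
    The drift diffusion model itself is easy to simulate: evidence accumulates at a drift rate v toward one of two boundaries separated by a threshold a. The sketch below mirrors only the direction of the reported effects (gaze dwell time as a drift increment, pupil dilation as a threshold increment); all numbers are illustrative assumptions rather than the study's fitted hierarchical Bayesian estimates.

```python
# Minimal Euler simulation of the drift diffusion model (DDM): evidence
# drifts at rate v toward one of two boundaries at +/- a/2. All parameter
# values are illustrative assumptions.
import numpy as np

def ddm_trial(v=0.3, a=2.0, dt=0.001, noise=1.0, rng=None):
    rng = rng or np.random.default_rng()
    x, t = 0.0, 0.0
    while abs(x) < a / 2:
        x += v * dt + noise * np.sqrt(dt) * rng.normal()
        t += dt
    return ("upper" if x > 0 else "lower"), t

rng = np.random.default_rng(1)
base = [ddm_trial(rng=rng) for _ in range(500)]
gazed = [ddm_trial(v=0.5, rng=rng) for _ in range(500)]    # longer dwell time
dilated = [ddm_trial(a=2.6, rng=rng) for _ in range(500)]  # larger pupil
for name, trials in [("base", base), ("gaze", gazed), ("pupil", dilated)]:
    choices, rts = zip(*trials)
    print(name, round(choices.count("upper") / 500, 2), round(np.mean(rts), 2))
```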

  11. Diagnostic decision-making and strategies to improve diagnosis.

    PubMed

    Thammasitboon, Satid; Cutrer, William B

    2013-10-01

    A significant portion of diagnostic errors arises through cognitive errors resulting from inadequate knowledge, faulty data gathering, and/or faulty verification. Experts estimate that 75% of diagnostic failures can be attributed to failures in clinicians' diagnostic thinking. The cognitive processes that underlie the diagnostic thinking of clinicians are complex and intriguing, and it is imperative that clinicians acquire an explicit appreciation and application of different cognitive approaches to make better decisions. A dual-process model that unifies many theories of decision-making has emerged as a promising template for understanding how clinicians think and judge efficiently in the diagnostic reasoning process. The identification and implementation of strategies for decreasing or preventing such diagnostic errors has become a growing area of interest and research. Suggested strategies to decrease the incidence of diagnostic error include increasing clinicians' clinical expertise and avoiding inherent cognitive errors. Implementing interventions focused solely on avoiding errors may work effectively for patient safety issues such as medication errors. Addressing cognitive errors, however, requires equal effort on expanding the individual clinician's expertise. Providing cognitive support to clinicians for robust diagnostic decision-making serves as the final strategic target for decreasing diagnostic errors. Clinical guidelines and algorithms offer another method for streamlining decision-making and decreasing the likelihood of cognitive diagnostic errors. Addressing cognitive processing errors is undeniably the most challenging task in reducing diagnostic errors. While many suggested approaches exist, they are mostly based on theories and sciences in cognitive psychology, decision-making, and education. The proposed interventions are primarily suggestions, and very few of them have been tested in actual practice settings. A collaborative research effort is required to effectively address cognitive processing errors. Researchers in various areas, including patient safety/quality improvement, decision-making, and problem solving, must work together to make medical diagnosis more reliable. © 2013 Mosby, Inc. All rights reserved.

  12. Robot Science Autonomy in the Atacama Desert and Beyond

    NASA Technical Reports Server (NTRS)

    Thompson, David R.; Wettergreen, David S.

    2013-01-01

    Science-guided autonomy augments rovers with reasoning to make observations and take actions related to the objectives of scientific exploration. When rovers can directly interpret instrument measurements then scientific goals can inform and adapt ongoing navigation decisions. These autonomous explorers will make better scientific observations and collect massive, accurate datasets. In current astrobiology studies in the Atacama Desert we are applying algorithms for science autonomy to choose effective observations and measurements. Rovers are able to decide when and where to take follow-up actions that deepen scientific understanding. These techniques apply to planetary rovers, which we can illustrate with algorithms now used by Mars rovers and by discussing future missions.

  13. Medial elbow injury in young throwing athletes

    PubMed Central

    Gregory, Bonnie; Nyland, John

    2013-01-01

    Summary This report reviews the anatomy, overhead throwing biomechanics, injury mechanism and incidence, physical examination and diagnosis, diagnostic imaging and conservative treatment of medial elbow injuries in young throwing athletes. Based on this information, a clinical management decision-making algorithm is presented. PMID:23888291

  14. Airborne Network Optimization with Dynamic Network Update

    DTIC Science & Technology

    2015-03-26

    Modern networks employ congestion and routing management algorithms that can perform...airborne networks. Intelligent agents can make use of Kalman filter predictions to make informed decisions to manage communication in airborne networks.

  15. An intelligent value-driven scheduling system for Space Station Freedom with special emphasis on the electric power system

    NASA Technical Reports Server (NTRS)

    Krupp, Joseph C.

    1991-01-01

    The Electric Power Control System (EPCS) created by Decision-Science Applications, Inc. (DSA) for the Lewis Research Center is discussed. This system makes decisions on what to schedule and when to schedule it, including making choices among various options or ways of performing a task. The system is goal-directed and seeks to shape resource usage in an optimal manner using a value-driven approach. Discussed here are considerations governing what makes a good schedule, how to design a value function to find the best schedule, and how to design the algorithm that finds the schedule that maximizes this value function. Results are shown which demonstrate the usefulness of the techniques employed.

  16. Using the modified Delphi method to establish clinical consensus for the diagnosis and treatment of patients with rotator cuff pathology.

    PubMed

    Eubank, Breda H; Mohtadi, Nicholas G; Lafave, Mark R; Wiley, J Preston; Bois, Aaron J; Boorman, Richard S; Sheps, David M

    2016-05-20

    Patients presenting to the healthcare system with rotator cuff pathology do not always receive high quality care. High quality care occurs when a patient receives care that is accessible, appropriate, acceptable, effective, efficient, and safe. The aim of this study was twofold: 1) to develop a clinical pathway algorithm that sets forth a stepwise process for making decisions about the diagnosis and treatment of rotator cuff pathology presenting to primary, secondary, and tertiary healthcare settings; and 2) to establish clinical practice guidelines for the diagnosis and treatment of rotator cuff pathology to inform decision-making processes within the algorithm. A three-step modified Delphi method was used to establish consensus. Fourteen experts representing athletic therapy, physiotherapy, sport medicine, and orthopaedic surgery were invited to participate as the expert panel. In round 1, 123 best practice statements were distributed to the panel. Panel members were asked to mark "agree" or "disagree" beside each statement, and provide comments. The same voting method was again used for round 2. Round 3 consisted of a final face-to-face meeting. In round 1, statements were grouped and reduced to 44 statements that met consensus. In round 2, five statements reached consensus. In round 3, ten statements reached consensus. Consensus was reached for 59 statements representing five domains: screening, diagnosis, physical examination, investigations, and treatment. The final face-to-face meeting was also used to develop clinical pathway algorithms (i.e., clinical care pathways) for three types of rotator cuff pathology: acute, chronic, and acute-on-chronic. This consensus guideline will help to standardize care, provide guidance on the diagnosis and treatment of rotator cuff pathology, and assist in clinical decision-making for all healthcare professionals.

  17. Brain network response underlying decisions about abstract reinforcers.

    PubMed

    Mills-Finnerty, Colleen; Hanson, Catherine; Hanson, Stephen Jose

    2014-12-01

    Decision making studies typically use tasks that involve concrete action-outcome contingencies, in which subjects do something and get something. No studies have addressed decision making involving abstract reinforcers, where there are no action-outcome contingencies and choices are entirely hypothetical. The present study examines these kinds of choices, as well as whether the same biases that exist for concrete reinforcer decisions, specifically framing effects, also apply during abstract reinforcer decisions. We use both a General Linear Model and Bayes network connectivity analysis using the Independent Multi-sample Greedy Equivalence Search (IMaGES) algorithm to examine the network response underlying choices for abstract reinforcers under positive and negative framing. We find for the first time that abstract reinforcer decisions activate the same network of brain regions as concrete reinforcer decisions, including the striatum, insula, anterior cingulate, and VMPFC, results that are further supported via comparison to a meta-analysis of decision making studies. Positive and negative framing activated different parts of this network, with stronger activation in VMPFC during negative framing and in DLPFC during positive framing, suggesting different decision making pathways depending on frame. These results were further clarified using connectivity analysis, which revealed stronger connections between anterior cingulate, insula, and accumbens during negative framing compared to positive framing. Taken together, these results suggest that not only do abstract reinforcer decisions rely on the same brain substrates as concrete reinforcers, but the response underlying framing effects on abstract reinforcers also resembles that for concrete reinforcers, specifically increased limbic system connectivity during negative frames. Copyright © 2014 Elsevier Inc. All rights reserved.

  18. Collaborative human-machine analysis to disambiguate entities in unstructured text and structured datasets

    NASA Astrophysics Data System (ADS)

    Davenport, Jack H.

    2016-05-01

    Intelligence analysts demand rapid information fusion capabilities to develop and maintain accurate situational awareness and understanding of dynamic enemy threats in asymmetric military operations. The ability to extract relationships between people, groups, and locations from a variety of text datasets is critical to proactive decision making. The derived network of entities must be automatically created and presented to analysts to assist in decision making. DECISIVE ANALYTICS Corporation (DAC) provides capabilities to automatically extract entities, relationships between entities, semantic concepts about entities, and network models of entities from text and multi-source datasets. DAC's Natural Language Processing (NLP) Entity Analytics model entities as complex systems of attributes and interrelationships which are extracted from unstructured text via NLP algorithms. The extracted entities are automatically disambiguated via machine learning algorithms, and resolution recommendations are presented to the analyst for validation; the analyst's expertise is leveraged in this hybrid human/computer collaborative model. Military capability is enhanced by these NLP Entity Analytics because analysts can now create/update an entity profile with intelligence automatically extracted from unstructured text, thereby fusing entity knowledge from structured and unstructured data sources. Operational and sustainment costs are reduced since analysts do not have to manually tag and resolve entities.

  19. Model of Decision Making through Consensus in Ranking Case

    NASA Astrophysics Data System (ADS)

    Tarigan, Gim; Darnius, Open

    2018-01-01

    The basic problem in determining a ranking consensus is to combine several rankings decided by two or more decision makers (DMs) into a single consensus ranking. DMs are frequently asked to present their preferences over a group of objects in terms of ranks, for example to determine a new project, a new product, a candidate in an election, and so on. Ranking problems can be classified into two major categories; namely, cardinal and ordinal rankings. The objective of the study is to obtain the ranking consensus by applying several algorithms and methods. The algorithms and methods used in this study were a partial algorithm, optimal ranking consensus, and the BAK (Borda-Kendall) model. A method proposed as an alternative for ranking consensus is the Weighted Distance Forward-Backward (WDFB) method, which gave a slightly different consensus ranking compared to the result of the example solved by Cook et al. (2005).
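
    As a concrete reference point, the Borda count is the classical consensus baseline underlying the BAK model mentioned above; the WDFB method itself is not reproduced here. The example rankings are illustrative assumptions.

```python
# Sketch of a Borda-count consensus ranking, a classical baseline for
# combining several decision makers' rankings into one.
def borda_consensus(rankings):
    """rankings: list of lists, each a best-to-worst ordering of objects."""
    n = len(rankings[0])
    points = {}
    for ranking in rankings:
        for position, obj in enumerate(ranking):
            points[obj] = points.get(obj, 0) + (n - 1 - position)
    return sorted(points, key=points.get, reverse=True)

# Three decision makers rank four project candidates.
dms = [["A", "B", "C", "D"],
       ["B", "A", "C", "D"],
       ["A", "C", "B", "D"]]
print(borda_consensus(dms))   # -> ['A', 'B', 'C', 'D']
```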

  20. A duality theorem-based algorithm for inexact quadratic programming problems: Application to waste management under uncertainty

    NASA Astrophysics Data System (ADS)

    Kong, X. M.; Huang, G. H.; Fan, Y. R.; Li, Y. P.

    2016-04-01

    In this study, a duality theorem-based algorithm (DTA) for inexact quadratic programming (IQP) is developed for municipal solid waste (MSW) management under uncertainty. It improves upon the existing numerical solution method for IQP problems. The comparison between DTA and derivative algorithm (DAM) shows that the DTA method provides better solutions than DAM with lower computational complexity. It is not necessary to identify the uncertain relationship between the objective function and decision variables, which is required for the solution process of DAM. The developed method is applied to a case study of MSW management and planning. The results indicate that reasonable solutions have been generated for supporting long-term MSW management and planning. They could provide more information as well as enable managers to make better decisions to identify desired MSW management policies in association with minimized cost under uncertainty.

  1. Modeling spatial decisions with graph theory: logging roads and forest fragmentation in the Brazilian Amazon.

    PubMed

    Walker, Robert; Arima, Eugenio; Messina, Joe; Soares-Filho, Britaldo; Perz, Stephen; Vergara, Dante; Sales, Marcio; Pereira, Ritaumaria; Castro, Williams

    2013-01-01

    This article addresses the spatial decision-making of loggers and implications for forest fragmentation in the Amazon basin. It provides a behavioral explanation for fragmentation by modeling how loggers build road networks, typically abandoned upon removal of hardwoods. Logging road networks provide access to land, and the settlers who take advantage of them clear fields and pastures that accentuate their spatial signatures. In shaping agricultural activities, these networks organize emergent patterns of forest fragmentation, even though the loggers move elsewhere. The goal of the article is to explicate how loggers shape their road networks, in order to theoretically explain an important type of forest fragmentation found in the Amazon basin, particularly in Brazil. This is accomplished by adapting graph theory to represent the spatial decision-making of loggers, and by implementing computational algorithms that build graphs interpretable as logging road networks. The economic behavior of loggers is conceptualized as a profit maximization problem, and translated into spatial decision-making by establishing a formal correspondence between mathematical graphs and road networks. New computational approaches, adapted from operations research, are used to construct graphs and simulate spatial decision-making as a function of discount rates, land tenure, and topographic constraints. The algorithms employed bracket a range of behavioral settings appropriate for areas of terras devolutas, public lands that have not been set aside for environmental protection, indigenous peoples, or colonization. The simulation target sites are located in or near so-called Terra do Meio, once a major logging frontier in the lower Amazon Basin. Simulation networks are compared to empirical ones identified by remote sensing and then used to draw inferences about factors influencing the spatial behavior of loggers. Results overall suggest that Amazonia's logging road networks induce more fragmentation than necessary to access fixed quantities of wood. The paper concludes by considering implications of the approach and findings for Brazil's move to a system of concession logging.

  2. An approach to decision-making with triangular fuzzy reciprocal preference relations and its application

    NASA Astrophysics Data System (ADS)

    Meng, Fanyong

    2018-02-01

    Triangular fuzzy reciprocal preference relations (TFRPRs) are powerful tools for denoting decision-makers' fuzzy judgments, permitting decision-makers to use triangular fuzzy ratios rather than real numbers to express their judgments. Consistency analysis is one of the most crucial issues for preference relations, as it can guarantee a reasonable ranking order. However, previous consistency concepts cannot adequately address this type of preference relation. Based on the operational laws of triangular fuzzy numbers, this paper introduces an additive consistency concept for TFRPRs by using quasi-TFRPRs, which can be seen as a natural extension of the crisp case. Using this consistency concept, models for judging the additive consistency of TFRPRs and for estimating missing values in incomplete TFRPRs are constructed. Then, an algorithm for decision-making with TFRPRs is developed. Finally, two numerical examples are offered to illustrate the application of the proposed procedure, and a comparative analysis is performed.

  3. Leveraging human decision making through the optimal management of centralized resources

    NASA Astrophysics Data System (ADS)

    Hyden, Paul; McGrath, Richard G.

    2016-05-01

    Combining results from mixed integer optimization, stochastic modeling and queuing theory, we will advance the interdisciplinary problem of efficiently and effectively allocating centrally managed resources. Academia currently fails to address this, as the esoteric demands of each of these large research areas limit work across traditional boundaries. The commercial space does not currently address these challenges due to the absence of a profit metric. By constructing algorithms that explicitly use inputs across boundaries, we are able to incorporate the advantages of using human decision makers. Key improvements in the underlying algorithms are made possible by aligning decision maker goals with the feedback loops introduced between the core optimization step and the modeling of the overall stochastic process of supply and demand. A key observation is that human decision-makers must be explicitly included in the analysis for these approaches to be ultimately successful. Transformative access gives warfighters and mission owners greater understanding of global needs and allows for relationships to guide optimal resource allocation decisions. Mastery of demand processes and optimization bottlenecks reveals long term maximum marginal utility gaps in capabilities.

  4. Acquisition and production of skilled behavior in dynamic decision-making tasks

    NASA Technical Reports Server (NTRS)

    Kirlik, Alex

    1992-01-01

    Detailed summaries of two NASA-funded research projects are provided. The first project was an ecological task analysis of the Star Cruiser model. Star Cruiser is a psychological model designed to test a subject's level of cognitive activity. Ecological task analysis is used as a framework to predict the types of cognitive activity required to achieve productive behavior and to suggest how interfaces can be manipulated to alleviate certain types of cognitive demands. The second project is presented in the form of a thesis for the Masters Degree. The thesis discusses the modeling of decision-making through the use of neural network and genetic-algorithm machine learning technologies.

  5. DeMAID/GA USER'S GUIDE Design Manager's Aid for Intelligent Decomposition with a Genetic Algorithm

    NASA Technical Reports Server (NTRS)

    Rogers, James L.

    1996-01-01

    Many companies are looking for new tools and techniques to aid a design manager in making decisions that can reduce the time and cost of a design cycle. One tool that is available to aid in this decision-making process is the Design Manager's Aid for Intelligent Decomposition (DeMAID). Since the initial release of DeMAID in 1989, numerous enhancements have been added to aid the design manager in saving both cost and time in a design cycle. The key enhancement is a genetic algorithm (GA), and the enhanced version is called DeMAID/GA. The GA orders the sequence of design processes to minimize the cost and time to converge to a solution. These enhancements, as well as the existing features of the original version of DeMAID, are described. Two sample problems are used to show how these enhancements can be applied to improve the design cycle. This report serves as a user's guide for DeMAID/GA.

  6. Efficient decision-making by volume-conserving physical object

    NASA Astrophysics Data System (ADS)

    Kim, Song-Ju; Aono, Masashi; Nameda, Etsushi

    2015-08-01

    Decision-making is one of the most important intellectual abilities not only of humans but also of other biological organisms, helping their survival. This ability, however, may not be limited to biological systems and may be exhibited by physical systems. Here we demonstrate that any physical object, as long as its volume is conserved when coupled with suitable operations, provides a sophisticated decision-making capability. We consider the multi-armed bandit problem (MBP), the problem of finding, as accurately and quickly as possible, the most profitable option from a set of options that gives stochastic rewards. Efficient MBP solvers are useful for many practical applications, because the MBP abstracts a variety of decision-making problems in real-world situations in which an efficient trial-and-error is required. These decisions are made as dictated by a physical object, which is moved in a manner similar to the fluctuations of a rigid body in a tug-of-war (TOW) game. This method, called 'TOW dynamics', exhibits higher efficiency than conventional reinforcement learning algorithms. We present analytical calculations that explain the statistical reasons why TOW dynamics achieves this high performance despite its simplicity. These results imply that various physical systems in which some conservation law holds can be used to implement an efficient 'decision-making object'. The proposed scheme will provide a new perspective for opening up a physics-based analog computing paradigm and for understanding the biological information-processing principles that exploit their underlying physics.
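
    A heavily simplified reading of TOW dynamics for a two-armed bandit: a single conserved displacement variable is pulled toward whichever arm was just rewarded and pushed away on a loss, and its sign selects the next arm. The update weights and fluctuation term below are illustrative assumptions, not the paper's exact formulation.

```python
# Simplified sketch of tug-of-war (TOW) dynamics on a two-armed bandit.
# All constants are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(2)
p = [0.4, 0.6]        # hidden reward probabilities of the two arms
x = 0.0               # displacement: x >= 0 plays arm 1, x < 0 plays arm 0
omega = 0.5           # weight of a non-reward (loss) pull
wins = 0
for t in range(1000):
    arm = 1 if x >= 0 else 0
    reward = rng.random() < p[arm]
    wins += reward
    pull = 1.0 if reward else -omega   # reward pulls toward arm, loss away
    x += pull if arm == 1 else -pull
    x += rng.normal(0, 0.1)            # fluctuation term
print("fraction of plays rewarded:", wins / 1000)
```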

  7. Novel Blind Recognition Algorithm of Frame Synchronization Words Based on Soft-Decision in Digital Communication Systems.

    PubMed

    Qin, Jiangyi; Huang, Zhiping; Liu, Chunwu; Su, Shaojing; Zhou, Jing

    2015-01-01

    A novel blind recognition algorithm of frame synchronization words is proposed to recognize the frame synchronization word parameters in digital communication systems. In this paper, a blind recognition method of frame synchronization words based on hard decisions is derived in detail, and the criteria for parameter recognition are given. Compared with blind recognition based on hard decisions, utilizing soft decisions can improve the accuracy of blind recognition. Therefore, combining the characteristics of the Quadrature Phase Shift Keying (QPSK) signal, an improved blind recognition algorithm based on soft decisions is proposed; the improved algorithm can also be extended to other signal modulation forms. The complete blind recognition steps of the hard-decision algorithm and the soft-decision algorithm are then given in detail. Finally, the simulation results show that both the hard-decision algorithm and the soft-decision algorithm can blindly recognize the parameters of frame synchronization words, and that the improved algorithm clearly enhances the accuracy of blind recognition.
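
    The hard- versus soft-decision contrast can be sketched with a sliding correlation against a known sync word: hard decisions slice the received samples to ±1 before correlating, while soft decisions correlate the noisy samples directly and so preserve per-symbol reliability information. The Barker-13 sync word, frame layout, and noise level below are illustrative assumptions.

```python
# Sketch of locating a frame synchronisation word by sliding correlation,
# comparing hard-sliced and soft received samples.
import numpy as np

rng = np.random.default_rng(3)
# Barker-13 code as the known frame synchronisation word (bipolar form)
sync = np.array([1, 1, 1, 1, 1, -1, -1, 1, 1, -1, 1, -1, 1], dtype=float)
payload = rng.choice([-1.0, 1.0], size=120)
tx = np.concatenate([payload[:57], sync, payload[57:]])  # sync buried at 57
rx = tx + rng.normal(0, 0.5, tx.size)                    # noisy soft samples

def locate(samples):
    """Return the offset with the highest correlation against the sync word."""
    return int(np.correlate(samples, sync, mode="valid").argmax())

print("hard-decision estimate:", locate(np.sign(rx)))  # slice to +/-1 first
print("soft-decision estimate:", locate(rx))           # keep soft values
```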

  8. Prospective Architectures for Onboard vs Cloud-Based Decision Making for Unmanned Aerial Systems

    NASA Technical Reports Server (NTRS)

    Sankararaman, Shankar; Teubert, Christopher

    2017-01-01

    This paper investigates prospective architectures for decision-making in unmanned aerial systems. When these unmanned vehicles operate in urban environments, there are several sources of uncertainty that affect their behavior, and decision-making algorithms need to be robust to account for these different sources of uncertainty. It is important to account for several risk-factors that affect the flight of these unmanned systems, and facilitate decision-making by taking into consideration these various risk-factors. In addition, there are several technical challenges related to autonomous flight of unmanned aerial systems; these challenges include sensing, obstacle detection, path planning and navigation, trajectory generation and selection, etc. Many of these activities require significant computational power and in many situations, all of these activities need to be performed in real-time. In order to efficiently integrate these activities, it is important to develop a systematic architecture that can facilitate real-time decision-making. Four prospective architectures are discussed in this paper; on one end of the spectrum, the first architecture considers all activities/computations being performed onboard the vehicle whereas on the other end of the spectrum, the fourth and final architecture considers all activities/computations being performed in the cloud, using a new service known as Prognostics as a Service that is being developed at NASA Ames Research Center. The four different architectures are compared, their advantages and disadvantages are explained and conclusions are presented.

  9. A tunable algorithm for collective decision-making.

    PubMed

    Pratt, Stephen C; Sumpter, David J T

    2006-10-24

    Complex biological systems are increasingly understood in terms of the algorithms that guide the behavior of system components and the information pathways that link them. Much attention has been given to robust algorithms, or those that allow a system to maintain its functions in the face of internal or external perturbations. At the same time, environmental variation imposes a complementary need for algorithm versatility, or the ability to alter system function adaptively as external circumstances change. An important goal of systems biology is thus the identification of biological algorithms that can meet multiple challenges rather than being narrowly specified to particular problems. Here we show that emigrating colonies of the ant Temnothorax curvispinosus tune the parameters of a single decision algorithm to respond adaptively to two distinct problems: rapid abandonment of their old nest in a crisis and deliberative selection of the best available new home when their old nest is still intact. The algorithm uses a stepwise commitment scheme and a quorum rule to integrate information gathered by numerous individual ants visiting several candidate homes. By varying the rates at which they search for and accept these candidates, the ants yield a colony-level response that adaptively emphasizes either speed or accuracy. We propose such general but tunable algorithms as a design feature of complex systems, each algorithm providing elegant solutions to a wide range of problems.
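
    A toy rendition of the tunable algorithm: scouts probabilistically accept candidate nests in proportion to quality, and a site is chosen once its recruited population crosses a quorum. Raising the acceptance rate and lowering the quorum emphasizes speed; the reverse emphasizes accuracy. All rates and sizes below are illustrative assumptions, not fitted colony parameters.

```python
# Toy sketch of a quorum-based emigration rule with tunable speed/accuracy.
import numpy as np

def emigrate(quality, accept_rate, quorum, rng, steps=2000, n_scouts=50):
    at_site = np.zeros(len(quality), dtype=int)
    for _ in range(steps):
        site = rng.integers(len(quality))        # a scout assesses a site
        # acceptance is more likely at better sites and under higher urgency
        if rng.random() < accept_rate * quality[site]:
            at_site[site] += 1
        if at_site[site] >= quorum:              # quorum met: commit
            return int(site)
        if at_site.sum() >= n_scouts:
            break
    return int(at_site.argmax())

rng = np.random.default_rng(4)
quality = [0.3, 0.5, 0.9]                        # site 2 is the best nest
deliberate = [emigrate(quality, 0.2, 15, rng) for _ in range(200)]
crisis = [emigrate(quality, 0.8, 4, rng) for _ in range(200)]
print("deliberate accuracy:", deliberate.count(2) / 200)
print("crisis accuracy:    ", crisis.count(2) / 200)
```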

  10. Conflicts of interest improve collective computation of adaptive social structures

    PubMed Central

    Brush, Eleanor R.; Krakauer, David C.; Flack, Jessica C.

    2018-01-01

    In many biological systems, the functional behavior of a group is collectively computed by the system’s individual components. An example is the brain’s ability to make decisions via the activity of billions of neurons. A long-standing puzzle is how the components’ decisions combine to produce beneficial group-level outputs, despite conflicts of interest and imperfect information. We derive a theoretical model of collective computation from mechanistic first principles, using results from previous work on the computation of power structure in a primate model system. Collective computation has two phases: an information accumulation phase, in which (in this study) pairs of individuals gather information about their fighting abilities and make decisions about their dominance relationships, and an information aggregation phase, in which these decisions are combined to produce a collective computation. To model information accumulation, we extend a stochastic decision-making model—the leaky integrator model used to study neural decision-making—to a multiagent game-theoretic framework. We then test alternative algorithms for aggregating information—in this study, decisions about dominance resulting from the stochastic model—and measure the mutual information between the resultant power structure and the “true” fighting abilities. We find that conflicts of interest can improve accuracy to the benefit of all agents. We also find that the computation can be tuned to produce different power structures by changing the cost of waiting for a decision. The successful application of a similar stochastic decision-making model in neural and social contexts suggests general principles of collective computation across substrates and scales. PMID:29376116
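
    The information-accumulation phase can be sketched with the leaky integrator named above: evidence about the difference in fighting abilities accumulates with a leak term until a decision bound is reached. Parameters are illustrative assumptions, not values fitted to the primate data.

```python
# Minimal leaky-integrator sketch of a pairwise dominance contest.
import numpy as np

def contest(ability_a, ability_b, leak=0.05, bound=3.0, dt=0.1, rng=None):
    rng = rng or np.random.default_rng()
    x = 0.0   # positive evidence favours "A dominates B"
    while abs(x) < bound:
        drift = ability_a - ability_b
        x += dt * (drift - leak * x) + np.sqrt(dt) * rng.normal()
    return "A>B" if x > 0 else "B>A"

rng = np.random.default_rng(5)
outcomes = [contest(1.0, 0.6, rng=rng) for _ in range(100)]
print("A judged dominant in", outcomes.count("A>B"), "of 100 contests")
```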

  11. Design for sustainability of industrial symbiosis based on emergy and multi-objective particle swarm optimization.

    PubMed

    Ren, Jingzheng; Liang, Hanwei; Dong, Liang; Sun, Lu; Gao, Zhiqiu

    2016-08-15

    Industrial symbiosis provides a novel and practical pathway to design for sustainability. A decision support tool for its verification is necessary for practitioners and policy makers, yet to date quantitative research has been limited. The objective of this work is to present an innovative approach for supporting decision-making in design for sustainability with the implementation of industrial symbiosis in a chemical complex. By incorporating emergy theory, the model is formulated as a multi-objective approach that can optimize both the economic benefit and the sustainability performance of the integrated industrial system. A set of emergy-based evaluation indices is designed. A multi-objective particle swarm algorithm is proposed to solve the model, and decision-makers are allowed to choose suitable solutions from the Pareto solutions. An illustrative case has been studied with the proposed method; a number of compromises between high profitability and high sustainability can be obtained for decision-makers/stakeholders to consider. Copyright © 2016 Elsevier B.V. All rights reserved.

  12. Meeting the nutritional needs of patients with severe dysphagia following a stroke: an interdisciplinary approach.

    PubMed

    Rodrigue, Nathalie; Côté, Robert; Kirsch, Connie; Germain, Chantal; Couturier, Céline; Fraser, Roxanne

    2002-03-01

    Dysphagia is a common problem for individuals who have experienced a stroke. The interdisciplinary stroke team noted that delays in clinical decision-making, or in implementing plans for patients with severe dysphagia requiring an alternative to oral feeding, such as enteral feeding via Dobhoff (naso-jejunal) or PEG (percutaneous endoscopic gastrostomy) tubes, occurred because protocols had not been established. This resulted in undernourishment, which in turn contributed to clinical problems, such as infections and confusion, which delayed rehabilitation and contributed to excess disability. The goal of the project was to improve quality of care and quality of life for stroke patients experiencing swallowing problems by creating a dysphagia management decision-making process. The project began with a retrospective chart review of 91 cases over a period of six months to describe the population characteristics, dysphagia frequency, stroke and dysphagia severity, and delays encountered in decision-making regarding dysphagia management. A literature search was conducted, and experts in the field were consulted to provide current knowledge prior to beginning the project. Descriptive statistics showed that dysphagia was present in 44% of the stroke population and that 69% had mild to moderate stroke severity deficits. Delays were found in the decision to insert a PEG (mean 10 days) and in the time between decision and PEG insertion (mean 12 days). Critical periods were examined in order to speed up the process of decision-making and intervention. This resulted in the creation of a decision-making algorithm based on stroke and dysphagia severity that will be tested during winter 2002.

  13. Toward Accountable Discrimination-Aware Data Mining: The Importance of Keeping the Human in the Loop-and Under the Looking Glass.

    PubMed

    Berendt, Bettina; Preibusch, Sören

    2017-06-01

    "Big Data" and data-mined inferences are affecting more and more of our lives, and concerns about their possible discriminatory effects are growing. Methods for discrimination-aware data mining and fairness-aware data mining aim at keeping decision processes supported by information technology free from unjust grounds. However, these formal approaches alone are not sufficient to solve the problem. In the present article, we describe reasons why discrimination with data can and typically does arise through the combined effects of human and machine-based reasoning, and argue that this requires a deeper understanding of the human side of decision-making with data mining. We describe results from a large-scale human-subjects experiment that investigated such decision-making, analyzing the reasoning that participants reported during their task to assess whether a loan request should or would be granted. We derive data protection by design strategies for making decision-making discrimination-aware in an accountable way, grounding these requirements in the accountability principle of the European Union General Data Protection Regulation, and outline how their implementations can integrate algorithmic, behavioral, and user interface factors.

  14. Bounded-Degree Approximations of Stochastic Networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quinn, Christopher J.; Pinar, Ali; Kiyavash, Negar

    2017-06-01

    We propose algorithms to approximate directed information graphs. Directed information graphs are probabilistic graphical models that depict causal dependencies between stochastic processes in a network. The proposed algorithms identify optimal and near-optimal approximations in terms of Kullback-Leibler divergence. The user-chosen sparsity trades off the quality of the approximation against visual conciseness and computational tractability. One class of approximations contains graphs with specified in-degrees. Another class additionally requires that the graph is connected. For both classes, we propose algorithms to identify the optimal approximations and also near-optimal approximations, using a novel relaxation of submodularity. We also propose algorithms to identify the r-best approximations among these classes, enabling robust decision making.

  15. Autonomous Flight Safety System - Phase III

    NASA Technical Reports Server (NTRS)

    2008-01-01

    The Autonomous Flight Safety System (AFSS) is a joint KSC and Wallops Flight Facility project that uses tracking and attitude data from onboard Global Positioning System (GPS) and inertial measurement unit (IMU) sensors and configurable rule-based algorithms to make flight termination decisions. AFSS objectives are to increase launch capabilities by permitting launches from locations without range safety infrastructure, reduce costs by eliminating some downrange tracking and communication assets, and reduce the reaction time for flight termination decisions.

  16. Contour classification in thermographic images for detection of breast cancer

    NASA Astrophysics Data System (ADS)

    Okuniewski, Rafał; Nowak, Robert M.; Cichosz, Paweł; Jagodziński, Dariusz; Matysiewicz, Mateusz; Neumann, Łukasz; Oleszkiewicz, Witold

    2016-09-01

    Thermographic images of the breast taken by the Braster device are uploaded to a web application, which uses different classification algorithms to automatically decide whether a patient should be examined more thoroughly. This article presents an approach to the task of classifying contours visible in thermographic breast images taken by the Braster device, in order to make a decision about the existence of cancerous tumors in the breast. It presents the results of research conducted on the different classification algorithms.

  17. Decision algorithm for data center vortex beam receiver

    NASA Astrophysics Data System (ADS)

    Kupferman, Judy; Arnon, Shlomi

    2017-12-01

    We present a new scheme for a vortex beam communications system which exploits the radial component p of Laguerre-Gauss modes in addition to the azimuthal component l generally used. We derive a new encoding algorithm which makes use of the spatial distribution of intensity to create an alphabet dictionary for communication. We suggest an application of the scheme as part of an optical wireless link for intra data center communication. We investigate the probability of error in decoding, for several detector options.
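
    The encoding idea can be sketched as a dictionary over (l, p) index pairs of Laguerre-Gauss modes, so that the radial index p multiplies the alphabet size available from the azimuthal index l alone. The index ranges and mapping below are illustrative assumptions, not the paper's dictionary.

```python
# Sketch of an (l, p) mode alphabet: both the azimuthal and radial indices
# of Laguerre-Gauss modes carry information. Index ranges are illustrative.
from itertools import product
from math import log2

L_RANGE = range(-3, 4)   # azimuthal index l
P_RANGE = range(0, 4)    # radial index p

alphabet = {symbol: i for i, symbol in enumerate(product(L_RANGE, P_RANGE))}
decode = {i: s for s, i in alphabet.items()}
print(f"{len(alphabet)} symbols -> {log2(len(alphabet)):.1f} bits per symbol")

message = [(1, 0), (-2, 3), (0, 1)]   # sequence of transmitted modes
indices = [alphabet[m] for m in message]
print(indices, "->", [decode[i] for i in indices])
```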

  18. A CNN based neurobiology inspired approach for retinal image quality assessment.

    PubMed

    Mahapatra, Dwarikanath; Roy, Pallab K; Sedai, Suman; Garnavi, Rahil

    2016-08-01

    Retinal image quality assessment (IQA) algorithms use different hand-crafted features for training classifiers without considering the working of the human visual system (HVS), which plays an important role in IQA. We propose a convolutional neural network (CNN) based approach that determines image quality using the underlying principles behind the working of the HVS. CNNs provide a principled approach to feature learning and hence higher accuracy in decision making. Experimental results demonstrate the superior performance of our proposed algorithm over competing methods.

  19. Adaptive Decision Making and Coordination in Variable Structure Organizations

    DTIC Science & Technology

    1994-09-01

    behavior of the net. The design problem is addressed by (a) focusing on algorithms that relate structural properties of the Petri Net model to behavioral characteristics; and (b) by incorporating design requirements in the Lattice algorithm. ... the more resource-consuming the process is. The architecture designer has to deal with these two parameters and perform some tradeoffs.

  20. Use of the AHP methodology in system dynamics: Modelling and simulation for health technology assessments to determine the correct prosthesis choice for hernia diseases.

    PubMed

    Improta, Giovanni; Russo, Mario Alessandro; Triassi, Maria; Converso, Giuseppe; Murino, Teresa; Santillo, Liberatina Carmela

    2018-05-01

    Health technology assessments (HTAs) are often difficult to conduct because of the decisive procedures of the HTA algorithm, which are often complex and not easy to apply. Thus, their use is not always convenient or possible for the assessment of technical requests requiring a multidisciplinary approach. This paper aims to address this issue through a multi-criteria analysis focusing on the analytic hierarchy process (AHP). This methodology allows the decision maker to analyse and evaluate different alternatives and monitor their impact on different actors during the decision-making process. However, the multi-criteria analysis is implemented through a simulation model to overcome the limitations of the AHP methodology. Simulations help decision-makers to make an appropriate decision and avoid unnecessary and costly attempts. Finally, a decision problem regarding the evaluation of two health technologies, namely, the evaluation of two biological prostheses for incisional infected hernias, will be analysed to assess the effectiveness of the model. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.
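
    The AHP step at the core of the methodology can be sketched as deriving a priority vector from a reciprocal pairwise-comparison matrix via its principal eigenvector, together with Saaty's consistency ratio. The 3x3 judgment matrix below is an illustrative assumption.

```python
# Minimal AHP sketch: priorities from the principal eigenvector of a
# reciprocal judgment matrix, plus Saaty's consistency check.
import numpy as np

A = np.array([[1,     3,   5],
              [1 / 3, 1,   2],
              [1 / 5, 1 / 2, 1]], dtype=float)   # reciprocal judgment matrix

eigvals, eigvecs = np.linalg.eig(A)
k = eigvals.real.argmax()                 # principal eigenvalue (lambda_max)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                              # priority vector

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)      # consistency index
cr = ci / 0.58                            # Saaty random index RI = 0.58 (n=3)
print("priorities:", w.round(3), "CR:", round(cr, 3))
```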

  1. Observations on the Invalid Scoring Algorithm of "NASA" and Similar Consensus Tasks.

    ERIC Educational Resources Information Center

    Slevin, Dennis P.

    1978-01-01

    The NASA ranking task and similar ranking activities used to demonstrate the superiority of group thinking are examined. It is argued that the current scores cannot be used to prove the superiority of group-consensus decision making in either training or research settings. (Author)

  2. An Evolutionary Analysis of Learned Attention

    ERIC Educational Resources Information Center

    Hullinger, Richard A.; Kruschke, John K.; Todd, Peter M.

    2015-01-01

    Humans and many other species selectively attend to stimuli or stimulus dimensions--but why should an animal constrain information input in this way? To investigate the adaptive functions of attention, we used a genetic algorithm to evolve simple connectionist networks that had to make categorization decisions in a variety of environmental…

  3. Extending the Educational Planning Discourse: Conceptual and Paradigmatic Explorations.

    ERIC Educational Resources Information Center

    Adams, Don

    1988-01-01

    Argues that rational, functionalist models of educational planning that conceptualize decision-making as an algorithmic process are relevant to a limited number of educational problems. Suggests that educational questions pertaining to goals, needs, equity, and quality must be solved with soft systems thinking and its interpretivist and relativist…

  4. Soldier Decision-Making for Allocation of Intelligence, Surveillance, and Reconnaissance Assets

    DTIC Science & Technology

    2014-06-01

    Actuarial or algorithmic judgments (also called statistical judgments) are studied across computer science, psychology, and statistics. ... [17] R. M. Dawes, D. Faust, and P. E. Meehl, "Clinical versus Actuarial Judgment," Science, vol. 243, no. 4899, pp. 1668–1674, 1989.

  5. Ocean Data Quality Control

    DTIC Science & Technology

    2011-11-18

    the aerosol at the coincident time and location of the satellite SST retrievals. This information is available in the daytime for the anti-solar...are of the same form, such as probabilities or standard normal deviates. A quality control decision-making algorithm in use at the U.S. Navy oceano

  6. Robust Bayesian Algorithm for Targeted Compound Screening in Forensic Toxicology.

    PubMed

    Woldegebriel, Michael; Gonsalves, John; van Asten, Arian; Vivó-Truyols, Gabriel

    2016-02-16

    As part of forensic toxicological investigation of cases involving unexpected death of an individual, targeted or untargeted xenobiotic screening of post-mortem samples is normally conducted. To this end, liquid chromatography (LC) coupled to high-resolution mass spectrometry (MS) is typically employed. For data analysis, almost all commonly applied algorithms are threshold-based (frequentist). These algorithms examine the value of a certain measurement (e.g., peak height) to decide whether a certain xenobiotic of interest (XOI) is present/absent, yielding a binary output. Frequentist methods pose a problem when several sources of information [e.g., shape of the chromatographic peak, isotopic distribution, estimated mass-to-charge ratio (m/z), adduct, etc.] need to be combined, requiring the approach to make arbitrary decisions at substep levels of data analysis. We hereby introduce a novel Bayesian probabilistic algorithm for toxicological screening. The method tackles the problem with a different strategy. It is not aimed at reaching a final conclusion regarding the presence of the XOI, but it estimates its probability. The algorithm effectively and efficiently combines all possible pieces of evidence from the chromatogram and calculates the posterior probability of the presence/absence of XOI features. This way, the model can accommodate more information by updating the probability if extra evidence is acquired. The final probabilistic result assists the end user to make a final decision with respect to the presence/absence of the xenobiotic. The Bayesian method was validated and found to perform better (in terms of false positives and false negatives) than the vendor-supplied software package.
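
    The core probabilistic move can be sketched with Bayes' rule in odds form: each piece of evidence contributes a likelihood ratio, and the product of these ratios updates the prior into a posterior probability of the compound's presence. The likelihood-ratio values below are illustrative assumptions, not the paper's fitted evidence models.

```python
# Sketch of combining independent evidence sources as likelihood ratios
# and updating a prior into a posterior probability of presence.
def posterior_presence(prior, likelihood_ratios):
    """likelihood_ratios: P(evidence | present) / P(evidence | absent)."""
    odds = prior / (1 - prior)
    for lr in likelihood_ratios:
        odds *= lr
    return odds / (1 + odds)

# Weak prior, then three supporting observations and one ambiguous one.
evidence = [4.0,   # chromatographic peak shape matches
            6.0,   # isotopic distribution matches
            8.0,   # m/z within tolerance
            0.9]   # adduct pattern slightly atypical
print(round(posterior_presence(0.01, evidence), 3))
```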

  7. Load balancing prediction method of cloud storage based on analytic hierarchy process and hybrid hierarchical genetic algorithm.

    PubMed

    Zhou, Xiuze; Lin, Fan; Yang, Lvqing; Nie, Jing; Tan, Qian; Zeng, Wenhua; Zhang, Nian

    2016-01-01

    With the continuous expansion of the cloud computing platform scale and the rapid growth of users and applications, how to efficiently use system resources to improve the overall performance of cloud computing has become a crucial issue. To address this issue, this paper proposes a method that uses an analytic hierarchy process group decision (AHPGD) to evaluate the load state of server nodes. Training was carried out by using a hybrid hierarchical genetic algorithm (HHGA) to optimize a radial basis function neural network (RBFNN). The AHPGD produces an aggregate load indicator for the virtual machines in the cloud, which becomes the input parameter of the predictive RBFNN. This paper also proposes a new dynamic load balancing scheduling algorithm combined with a weighted round-robin algorithm, which uses the periodical load values of nodes predicted by the AHPGD and the HHGA-optimized RBFNN, then calculates the corresponding weight values of the nodes and updates them continually. Meanwhile, it keeps the advantages and avoids the shortcomings of the static weighted round-robin algorithm.
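
    The scheduling half of the method can be sketched in isolation: predicted load values (stand-ins for the RBFNN output) are converted into integer weights, and requests are dispatched by weighted round-robin until the weights are refreshed at the next prediction period. Node names and the weight mapping below are illustrative assumptions.

```python
# Sketch of the dispatch step only: predicted load -> weights -> weighted
# round-robin. The simple expanded-pool cycle is bursty; a smoother
# interleaving would spread the bursts, but this keeps the sketch short.
import itertools

def weights_from_load(predicted_load, scale=10):
    # lighter predicted load -> larger weight
    return {node: max(1, round(scale * (1 - load)))
            for node, load in predicted_load.items()}

def weighted_round_robin(weights):
    pool = [node for node, w in weights.items() for _ in range(w)]
    return itertools.cycle(pool)

predicted = {"node-1": 0.2, "node-2": 0.5, "node-3": 0.8}  # from predictor
scheduler = weighted_round_robin(weights_from_load(predicted))
dispatch = [next(scheduler) for _ in range(12)]
print(dispatch)   # node-1 appears most often, node-3 least
```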

  8. True Numerical Cognition in the Wild.

    PubMed

    Piantadosi, Steven T; Cantlon, Jessica F

    2017-04-01

    Cognitive and neural research over the past few decades has produced sophisticated models of the representations and algorithms underlying numerical reasoning in humans and other animals. These models make precise predictions for how humans and other animals should behave when faced with quantitative decisions, yet primarily have been tested only in laboratory tasks. We used data from wild baboons' troop movements recently reported by Strandburg-Peshkin, Farine, Couzin, and Crofoot (2015) to compare a variety of models of quantitative decision making. We found that the decisions made by these naturally behaving wild animals rely specifically on numerical representations that have key homologies with the psychophysics of human number representations. These findings provide important new data on the types of problems human numerical cognition was designed to solve and constitute the first robust evidence of true numerical reasoning in wild animals.
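
    The standard psychophysical model such studies test against is compact enough to state directly: the probability of choosing the larger of two quantities depends on their ratio through a Weber fraction w. The value of w below is an illustrative assumption.

```python
# Sketch of the scalar-variability (Weber) model of numerical comparison.
from math import erf, sqrt

def p_correct(n1, n2, w=0.25):
    """P(choose larger) under Gaussian scalar variability, Weber fraction w."""
    z = abs(n1 - n2) / (w * sqrt(n1**2 + n2**2))
    return 0.5 * (1 + erf(z / sqrt(2)))

# Accuracy depends on the ratio, not the absolute difference:
print(round(p_correct(10, 5), 3), round(p_correct(20, 10), 3))  # same ratio
print(round(p_correct(10, 9), 3))                               # harder pair
```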

  9. Collaborative mining and interpretation of large-scale data for biomedical research insights.

    PubMed

    Tsiliki, Georgia; Karacapilidis, Nikos; Christodoulou, Spyros; Tzagarakis, Manolis

    2014-01-01

    Biomedical research becomes increasingly interdisciplinary and collaborative in nature. Researchers need to efficiently and effectively collaborate and make decisions by meaningfully assembling, mining and analyzing available large-scale volumes of complex multi-faceted data residing in different sources. In line with related research directives revealing that, in spite of the recent advances in data mining and computational analysis, humans can easily detect patterns which computer algorithms may have difficulty in finding, this paper reports on the practical use of an innovative web-based collaboration support platform in a biomedical research context. Arguing that dealing with data-intensive and cognitively complex settings is not a technical problem alone, the proposed platform adopts a hybrid approach that builds on the synergy between machine and human intelligence to facilitate the underlying sense-making and decision making processes. User experience shows that the platform enables more informed and quicker decisions, by displaying the aggregated information according to their needs, while also exploiting the associated human intelligence.

  10. Collaborative Mining and Interpretation of Large-Scale Data for Biomedical Research Insights

    PubMed Central

    Tsiliki, Georgia; Karacapilidis, Nikos; Christodoulou, Spyros; Tzagarakis, Manolis

    2014-01-01

    Biomedical research becomes increasingly interdisciplinary and collaborative in nature. Researchers need to efficiently and effectively collaborate and make decisions by meaningfully assembling, mining and analyzing available large-scale volumes of complex multi-faceted data residing in different sources. In line with related research indicating that, in spite of the recent advances in data mining and computational analysis, humans can easily detect patterns which computer algorithms may have difficulty in finding, this paper reports on the practical use of an innovative web-based collaboration support platform in a biomedical research context. Arguing that dealing with data-intensive and cognitively complex settings is not a technical problem alone, the proposed platform adopts a hybrid approach that builds on the synergy between machine and human intelligence to facilitate the underlying sense-making and decision making processes. User experience shows that the platform enables more informed and quicker decisions by displaying aggregated information according to users' needs, while also exploiting the associated human intelligence. PMID:25268270

  11. Brain pathways for cognitive-emotional decision making in the human animal.

    PubMed

    Levine, Daniel S

    2009-04-01

    As roles for different brain regions become clearer, a picture emerges of how primate prefrontal cortex executive circuitry influences subcortical decision making pathways inherited from other mammals. The human's basic needs or drives can be interpreted as residing in an on-center off-surround network in motivational regions of the hypothalamus and brain stem. Such a network has multiple attractors that, in this case, represent the amount of satisfaction of these needs, and we consider and interpret neurally a continuous-time simulated annealing algorithm for moving between attractors under the influence of noise that represents "discontent" combined with "initiative." For decision making on specific tasks, we employ a variety of rules whose neural circuitry appears to involve the amygdala and the orbital, cingulate, and dorsolateral regions of prefrontal cortex. These areas can be interpreted as connected in a three-layer adaptive resonance network. The vigilance of the network, which is influenced by the state of the hypothalamic needs network, determines the level of sophistication of the rule being utilized.

  12. Neurologic Complications in Infective Endocarditis

    PubMed Central

    Morris, Nicholas A.; Matiello, Marcelo; Samuels, Martin A.

    2014-01-01

    Neurologic complications of infective endocarditis (IE) are common and frequently life threatening. Neurologic events are not always obvious. The prediction and management of neurologic complications of IE are not easily approached algorithmically, and the impact they have on timing and ability to surgically repair or replace the affected valve often requires a painstaking evaluation and joint effort across multiple medical disciplines in order to achieve the best possible outcome. Although specific recommendations are always tailored to the individual patient, there are some guiding principles that can be used to help direct the decision-making process. Herein, we review the pathophysiology, epidemiology, manifestations, and diagnosis of neurological complications of IE and further consider the impact they have on clinical decision making. PMID:25360207

  13. A Discussion on Uncertainty Representation and Interpretation in Model-Based Prognostics Algorithms based on Kalman Filter Estimation Applied to Prognostics of Electronics Components

    NASA Technical Reports Server (NTRS)

    Celaya, Jose R.; Saxena, Abhinav; Goebel, Kai

    2012-01-01

    This article discusses several aspects of uncertainty representation and management for model-based prognostics methodologies based on our experience with Kalman Filters when applied to prognostics for electronics components. In particular, it explores the implications of modeling remaining useful life prediction as a stochastic process and how it relates to uncertainty representation, management, and the role of prognostics in decision-making. A distinction between the interpretations of estimated remaining useful life probability density function and the true remaining useful life probability density function is explained and a cautionary argument is provided against mixing interpretations for the two while considering prognostics in making critical decisions.
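
    As a rough illustration of the kind of Kalman-filter-based prognostics the article discusses, the sketch below tracks a linear degradation state and extrapolates its mean trajectory to a failure threshold. All noise levels, the threshold, and the degradation model are invented for illustration; note that the deterministic extrapolation in `rul_estimate` is exactly the kind of shortcut the article cautions against when the full RUL distribution matters.

```python
import numpy as np

# State: [degradation level, degradation rate]; linear drift model.
F = np.array([[1.0, 1.0], [0.0, 1.0]])   # state transition (unit time step)
H = np.array([[1.0, 0.0]])               # we only observe the level
Q = np.diag([1e-4, 1e-5])                # process noise (assumed)
R = np.array([[0.05]])                   # measurement noise (assumed)
FAIL = 10.0                              # failure threshold (assumed)

def kf_step(x, P, z):
    """One Kalman predict/update cycle."""
    x, P = F @ x, F @ P @ F.T + Q                  # predict
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)                 # Kalman gain
    x = x + K @ (z - H @ x)                        # update state
    P = (np.eye(2) - K @ H) @ P
    return x, P

def rul_estimate(x):
    """Deterministic RUL: time until the mean trajectory crosses the
    threshold (ignores uncertainty, the very distinction the article
    warns about when interpreting RUL densities)."""
    level, rate = x
    return np.inf if rate <= 0 else (FAIL - level) / rate

rng = np.random.default_rng(0)
x, P = np.array([0.0, 0.1]), np.eye(2)
for t in range(1, 61):
    z = 0.12 * t + rng.normal(0, 0.2)    # simulated noisy degradation data
    x, P = kf_step(x, P, np.array([z]))
print(f"estimated RUL: {rul_estimate(x):.1f} time steps")
```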

  14. Abyss or Shelter? On the Relevance of Web Search Engines' Search Results When People Google for Suicide.

    PubMed

    Haim, Mario; Arendt, Florian; Scherr, Sebastian

    2017-02-01

    Despite evidence that suicide rates can increase after suicides are widely reported in the media, appropriate depictions of suicide in the media can help people to overcome suicidal crises and can thus elicit preventive effects. We argue on the level of individual media users that a similar ambivalence can be postulated for search results on online suicide-related search queries. Importantly, the filter bubble hypothesis (Pariser, 2011) states that search results are biased by algorithms based on a person's previous search behavior. In this study, we investigated whether suicide-related search queries, including either potentially suicide-preventive or -facilitative terms, influence subsequent search results. This might thus protect or harm suicidal Internet users. We utilized a 3 (search history: suicide-related harmful, suicide-related helpful, and suicide-unrelated) × 2 (reactive: clicking the top-most result link and no clicking) experimental design applying agent-based testing. While findings show no influences either of search histories or of reactivity on search results in a subsequent situation, the presentation of a helpline offer raises concerns about possible detrimental algorithmic decision-making: Algorithms "decided" whether or not to present a helpline, and this automated decision, then, followed the agent throughout the rest of the observation period. Implications for policy-making and search providers are discussed.

  15. A model of interaction between anticorruption authority and corruption groups

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Neverova, Elena G.; Malafeyef, Oleg A.

    The paper provides a model of interaction between an anticorruption unit and corruption groups. The main policy functions of the anticorruption unit involve reducing corrupt practices in some entities through an optimal approach to resource allocation and effective anticorruption policy. We develop a model based on a Markov decision process and use Howard's policy-improvement algorithm to solve for an optimal decision strategy. We examine the assumption that corruption groups retaliate against the anticorruption authority to protect themselves. The model was implemented as a stochastic game.
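
    Howard's policy-improvement algorithm alternates exact policy evaluation with greedy improvement. The sketch below runs it on a two-state toy MDP loosely styled after the inspection setting; all transition probabilities and costs are invented, not taken from the paper.

```python
import numpy as np

# Toy MDP standing in for the anticorruption setting: 2 states
# (low / high corruption), 2 actions (light / intensive inspection).
# P[a][s, s'] and R[a][s] are illustrative numbers only.
P = [np.array([[0.9, 0.1], [0.4, 0.6]]),    # action 0: light inspection
     np.array([[0.95, 0.05], [0.7, 0.3]])]  # action 1: intensive inspection
R = [np.array([0.0, -2.0]),                 # action 0: no inspection cost
     np.array([-1.0, -2.5])]                # action 1: adds inspection cost
gamma = 0.95

def policy_iteration(P, R, gamma):
    """Howard's algorithm: alternate exact policy evaluation
    (solve the linear system) with greedy policy improvement."""
    n = P[0].shape[0]
    policy = np.zeros(n, dtype=int)
    while True:
        Ppi = np.array([P[policy[s]][s] for s in range(n)])
        Rpi = np.array([R[policy[s]][s] for s in range(n)])
        v = np.linalg.solve(np.eye(n) - gamma * Ppi, Rpi)   # evaluate
        q = np.array([R[a] + gamma * P[a] @ v for a in range(len(P))])
        new_policy = q.argmax(axis=0)                       # improve
        if np.array_equal(new_policy, policy):
            return policy, v
        policy = new_policy

policy, v = policy_iteration(P, R, gamma)
print("optimal action per state:", policy, "values:", v.round(2))
```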

  16. An Injury Severity-, Time Sensitivity-, and Predictability-Based Advanced Automatic Crash Notification Algorithm Improves Motor Vehicle Crash Occupant Triage.

    PubMed

    Stitzel, Joel D; Weaver, Ashley A; Talton, Jennifer W; Barnard, Ryan T; Schoell, Samantha L; Doud, Andrea N; Martin, R Shayn; Meredith, J Wayne

    2016-06-01

    Advanced Automatic Crash Notification algorithms use vehicle telemetry measurements to predict risk of serious motor vehicle crash injury. The objective of the study was to develop an Advanced Automatic Crash Notification algorithm to reduce response time, increase triage efficiency, and improve patient outcomes by minimizing undertriage (<5%) and overtriage (<50%), as recommended by the American College of Surgeons. A list of injuries associated with a patient's need for Level I/II trauma center treatment known as the Target Injury List was determined using an approach based on 3 facets of injury: severity, time sensitivity, and predictability. Multivariable logistic regression was used to predict an occupant's risk of sustaining an injury on the Target Injury List based on crash severity and restraint factors for occupants in the National Automotive Sampling System - Crashworthiness Data System 2000-2011. The Advanced Automatic Crash Notification algorithm was optimized and evaluated to minimize triage rates, per American College of Surgeons recommendations. The following rates were achieved: <50% overtriage and <5% undertriage in side impacts and 6% to 16% undertriage in other crash modes. Nationwide implementation of our algorithm is estimated to improve triage decisions for 44% of undertriaged and 38% of overtriaged occupants. Annually, this translates to more appropriate care for >2,700 seriously injured occupants and reduces unnecessary use of trauma center resources for >162,000 minimally injured occupants. The algorithm could be incorporated into vehicles to inform emergency personnel of recommended motor vehicle crash triage decisions. Lower under- and overtriage was achieved, and nationwide implementation of the algorithm would yield improved triage decision making for an estimated 165,000 occupants annually. Copyright © 2016. Published by Elsevier Inc.
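
    The core statistical step, predicting Target Injury List risk with multivariable logistic regression and then choosing a triage threshold against the American College of Surgeons targets, can be sketched on synthetic data. The features, coefficients, and data below are invented stand-ins for the NASS-CDS variables.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 5000
# Synthetic crash features standing in for vehicle telemetry:
# delta-v (km/h), belt use (0/1), multiple impacts (0/1).
delta_v = rng.gamma(shape=3.0, scale=10.0, size=n)
belted = rng.integers(0, 2, size=n)
multi = rng.integers(0, 2, size=n)
# Synthetic serious-injury outcome with an assumed relationship.
logit = 0.08 * delta_v - 1.5 * belted + 0.8 * multi - 3.0
y = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([delta_v, belted, multi])
model = LogisticRegression(max_iter=1000).fit(X, y)
risk = model.predict_proba(X)[:, 1]

# Sweep the risk threshold; look for one meeting ACS-style targets
# (<5% undertriage, <50% overtriage) where possible.
for thr in np.linspace(0.05, 0.5, 10):
    flag = risk >= thr
    undertriage = (~flag & y).sum() / max(y.sum(), 1)
    overtriage = (flag & ~y).sum() / max(flag.sum(), 1)
    print(f"thr={thr:.2f} undertriage={undertriage:.1%} overtriage={overtriage:.1%}")
```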

  17. Behavioral Stage of Change and Dialysis Decision-Making

    PubMed Central

    McGrail, Anna; Lewis, Steven A.; Schold, Jesse; Lawless, Mary Ellen; Sehgal, Ashwini R.; Perzynski, Adam T.

    2015-01-01

    Background and objectives Behavioral stage of change (SoC) algorithms classify patients’ readiness for medical treatment decision-making. In the precontemplation stage, patients have no intention to take action within 6 months. In the contemplation stage, action is intended within 6 months. In the preparation stage, patients intend to take action within 30 days. In the action stage, the change has been made. This study examines the influence of SoC on dialysis modality decision-making. Design, setting, participants, & measurements SoC and relevant covariates were measured, and associations with dialysis decision-making were determined. In-depth interviews were conducted with 16 patients on dialysis to elicit experiences. Qualitative interview data informed the survey design. Surveys were administered to adults with CKD (eGFR≤25 ml/min/1.73 m2) from August, 2012 to June, 2013. Multivariable logistic regression modeled dialysis decision-making with predictors: SoC, provider connection, and dialysis knowledge score. Results Fifty-five patients completed the survey (71% women, 39% white, and 59% black), and median annual income was $17,500. In total, 65% of patients were in the precontemplation/contemplation (thinking) and 35% of patients were in the preparation/maintenance (acting) SoC; 62% of patients had made dialysis modality decisions. Doctors explaining modality options, higher dialysis knowledge scores, and fewer lifestyle barriers were associated with acting versus thinking SoC (all P<0.02). Patients making modality decisions had doctors who explained dialysis options (76% versus 43%), were in the acting versus the thinking SoC (50% versus 10%), had higher dialysis knowledge scores (1.4 versus 0.5), and had lower eGFR (13.9 versus 16.8 ml/min/1.73 m2; all P<0.05). In adjusted analyses, dialysis knowledge was significantly associated with decision-making (odds ratio, 4.2; 95% confidence interval, 1.4 to 12.9; P=0.01), and SoC was of borderline significance (odds ratio, 5.8; 95% confidence interval, 1.0 to 32.6; P=0.05). The model C statistic was 0.87. Conclusions Dialysis decision-making was associated with SoC, dialysis knowledge, and physicians discussing treatment options. Future studies determining ways to assist patients with CKD in making satisfying modality decisions are warranted. PMID:25591499

  18. Algorithms in the First-Line Treatment of Metastatic Clear Cell Renal Cell Carcinoma--Analysis Using Diagnostic Nodes.

    PubMed

    Rothermundt, Christian; Bailey, Alexandra; Cerbone, Linda; Eisen, Tim; Escudier, Bernard; Gillessen, Silke; Grünwald, Viktor; Larkin, James; McDermott, David; Oldenburg, Jan; Porta, Camillo; Rini, Brian; Schmidinger, Manuela; Sternberg, Cora; Putora, Paul M

    2015-09-01

    With the advent of targeted therapies, many treatment options in the first-line setting of metastatic clear cell renal cell carcinoma (mccRCC) have emerged. Guidelines and randomized trial reports usually do not elucidate the decision criteria for the different treatment options. In order to extract the decision criteria for the optimal therapy for patients, we performed an analysis of treatment algorithms from experts in the field. Treatment algorithms for the treatment of mccRCC from experts of 11 institutions were obtained, and decision trees were deduced. Treatment options were identified and a list of unified decision criteria determined. The final decision trees were analyzed with a methodology based on diagnostic nodes, which allows for an automated cross-comparison of decision trees. The most common treatment recommendations were determined, and areas of discordance were identified. The analysis revealed heterogeneity in most clinical scenarios. The recommendations selected for first-line treatment of mccRCC included sunitinib, pazopanib, temsirolimus, interferon-α combined with bevacizumab, high-dose interleukin-2, sorafenib, axitinib, everolimus, and best supportive care. The criteria relevant for treatment decisions were performance status, Memorial Sloan Kettering Cancer Center risk group, only or mainly lung metastases, cardiac insufficiency, hepatic insufficiency, age, and "zugzwang" (composite of multiple, related criteria). In the present study, we used diagnostic nodes to compare treatment algorithms in the first-line treatment of mccRCC. The results illustrate the heterogeneity of the decision criteria and treatment strategies for mccRCC and how available data are interpreted and implemented differently among experts. The data provided in the present report should not be considered to serve as treatment recommendations for the management of treatment-naïve patients with multiple metastases from metastatic clear cell renal cell carcinoma outside a clinical trial; however, the data highlight the different treatment options and the criteria used to select them. The diversity in decision making and how results from phase III trials can be interpreted and implemented differently in daily practice are demonstrated. ©AlphaMed Press.

  19. Out-of-Hospital Decision-Making and Factors Influencing the Regional Distribution of Injured Patients in a Trauma System

    PubMed Central

    Newgard, Craig D.; Nelson, Maria J.; Kampp, Michael; Saha, Somnath; Zive, Dana; Schmidt, Terri; Daya, Mohamud; Jui, Jonathan; Wittwer, Lynn; Warden, Craig; Sahni, Ritu; Stevens, Mark; Gorman, Kyle; Koenig, Karl; Gubler, Dean; Rosteck, Pontine; Lee, Jan; Hedges, Jerris R.

    2011-01-01

    Background The decision-making processes used for out-of-hospital trauma triage and hospital selection in regionalized trauma systems remain poorly understood. The objective of this study was to understand the process of field triage decision-making in an established trauma system. Methods We used a mixed methods approach, including EMS records to quantify triage decisions and reasons for hospital selection in a population-based, injury cohort (2006 - 2008), plus a focused ethnography to understand EMS cognitive reasoning in making triage decisions. The study included 10 EMS agencies providing service to a 4-county regional trauma system with 3 trauma centers and 13 non-trauma hospitals. For qualitative analyses, we conducted field observation and interviews with 35 EMS field providers and a round-table discussion with 40 EMS management personnel to generate an empirical model of out-of-hospital decision making in trauma triage. Results 64,190 injured patients were evaluated by EMS, of whom 56,444 (88.0%) were transported to acute care hospitals and 9,637 (17.1% of transports) were field trauma activations. For non-trauma activations, patient/family preference and proximity accounted for 78% of destination decisions. EMS provider judgment was cited in 36% of field trauma activations and was the sole criterion in 23% of trauma patients. The empirical model demonstrated that trauma triage is driven primarily by EMS provider “gut feeling” (judgment) and relies heavily on provider experience, mechanism of injury, and early visual cues at the scene. Conclusions Provider cognitive reasoning for field trauma triage is more heuristic than algorithmic and driven primarily by provider judgment, rather than specific triage criteria. PMID:21817971

  20. A fourth dimension in decision making in hepatology.

    PubMed

    Ilan, Yaron

    2010-12-01

    Today, the assessment of liver function in patients suffering from acute or chronic liver disease is based on liver biopsy and blood tests including synthetic function, liver enzymes and viral load, most of which provide only circumstantial evidence as to the degree of hepatic impairment. Most of these tests lack the degree of sensitivity to be useful for follow-up of these patients at the frequency that is needed for decision making in clinical hepatology. Accurate assessment of liver function is essential to determine both short- and long-term prognosis, and for making decisions about liver and non-liver surgery, TIPS, chemoembolization or radiofrequency ablation in patients with chronic liver disease. Liver function tests can serve as the basis for accurate decision-making regarding the need for liver transplantation in the setting of acute failure or in patients with chronic liver disease. The liver metabolic breath test relies on measuring exhaled (13)C-tagged methacetin, which is metabolized only by the liver. Measuring this liver-specific substrate by means of molecular correlation spectroscopy is a rapid, non-invasive method for assessing liver function at the point-of-care. The (13)C methacetin breath test (MBT) is a powerful tool to aid clinical hepatologists in bedside decision-making. Our recent findings regarding the ability of the point-of-care (13)C MBT to assess the hepatic functional reserve in patients with acute and chronic liver disease are reviewed along with suggested treatment algorithms for common liver disorders. © 2010 The Japan Society of Hepatology.

  1. WATCHMAN: A Data Warehouse Intelligent Cache Manager

    NASA Technical Reports Server (NTRS)

    Scheuermann, Peter; Shim, Junho; Vingralek, Radek

    1996-01-01

    Data warehouses store large volumes of data which are used frequently by decision support applications. Such applications involve complex queries. Query performance in such an environment is critical because decision support applications often require interactive query response time. Because data warehouses are updated infrequently, it becomes possible to improve query performance by caching sets retrieved by queries in addition to query execution plans. In this paper we report on the design of an intelligent cache manager for sets retrieved by queries, called WATCHMAN, which is particularly well suited for a data warehousing environment. Our cache manager employs two novel, complementary algorithms for cache replacement and cache admission. WATCHMAN aims at minimizing query response time, and its cache replacement policy swaps out entire retrieved sets of queries instead of individual pages. The cache replacement and admission algorithms make use of a profit metric, which considers for each retrieved set its average rate of reference, its size, and the execution cost of the associated query. We report on a performance evaluation based on the TPC-D and Set Query benchmarks. These experiments show that WATCHMAN achieves a substantial performance improvement in a decision support environment when compared to a traditional LRU replacement algorithm.
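
    A toy version of the profit metric described, the average rate of reference times the query's execution cost divided by the retrieved set's size, can be sketched as follows. The class and field names are hypothetical; eviction and admission both operate on whole retrieved sets, as in the abstract.

```python
import time

class ProfitCache:
    """Toy cache of query result sets. Each set's 'profit' is
    reference_rate * query_cost / size; eviction removes whole
    retrieved sets, never individual pages."""

    def __init__(self, capacity):
        self.capacity, self.used, self.entries = capacity, 0, {}

    def _profit(self, e):
        rate = e["refs"] / max(time.time() - e["born"], 1e-9)
        return rate * e["cost"] / e["size"]

    def get(self, query):
        e = self.entries.get(query)
        if e:
            e["refs"] += 1
        return e["rows"] if e else None

    def put(self, query, rows, size, cost):
        # Candidate victims, lowest profit first, until the set fits.
        victims = sorted(self.entries, key=lambda q: self._profit(self.entries[q]))
        freed, evict = 0, []
        for q in victims:
            if self.used + size - freed <= self.capacity:
                break
            evict.append(q)
            freed += self.entries[q]["size"]
        # Admission test: admit only if the newcomer's profit beats the
        # combined profit of everything it would displace.
        new = {"rows": rows, "size": size, "cost": cost, "refs": 1, "born": time.time()}
        if size <= self.capacity and (not evict or
                self._profit(new) > sum(self._profit(self.entries[q]) for q in evict)):
            for q in evict:
                self.used -= self.entries.pop(q)["size"]
            self.entries[query] = new
            self.used += size

cache = ProfitCache(capacity=100)
cache.put("SELECT region, SUM(sales) ...", rows=[("EU", 42)], size=30, cost=5.0)
print(cache.get("SELECT region, SUM(sales) ..."))
```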

  2. Interactive entity resolution in relational data: a visual analytic tool and its evaluation.

    PubMed

    Kang, Hyunmo; Getoor, Lise; Shneiderman, Ben; Bilgic, Mustafa; Licamele, Louis

    2008-01-01

    Databases often contain uncertain and imprecise references to real-world entities. Entity resolution, the process of reconciling multiple references to underlying real-world entities, is an important data cleaning process required before accurate visualization or analysis of the data is possible. In many cases, in addition to noisy data describing entities, there is data describing the relationships among the entities. This relational data is important during the entity resolution process; it is useful both for the algorithms which determine likely database references to be resolved and for visual analytic tools which support the entity resolution process. In this paper, we introduce a novel user interface, D-Dupe, for interactive entity resolution in relational data. D-Dupe effectively combines relational entity resolution algorithms with a novel network visualization that enables users to make use of an entity's relational context for making resolution decisions. Since resolution decisions often are interdependent, D-Dupe facilitates understanding this complex process through animations which highlight combined inferences and a history mechanism which allows users to inspect chains of resolution decisions. An empirical study with 12 users confirmed the benefits of the relational context visualization on the performance of entity resolution tasks in relational data in terms of time as well as users' confidence and satisfaction.

  3. Using procalcitonin-guided algorithms to improve antimicrobial therapy in ICU patients with respiratory infections and sepsis.

    PubMed

    Schuetz, Philipp; Raad, Issam; Amin, Devendra N

    2013-10-01

    In patients with systemic bacterial infections hospitalized in ICUs, the inflammatory biomarker procalcitonin (PCT) has been shown to aid diagnosis, antibiotic stewardship, and risk stratification. Our aim is to summarize recent evidence about the utility of PCT in the critical care setting and discuss the potential benefits and limitations of PCT when used for clinical decision-making. A growing body of evidence supports PCT use to differentiate bacterial from viral respiratory infections (including influenza), to help risk stratify patients, and to guide decisions about optimal duration of antibiotic therapy. Different PCT protocols were evaluated for these and similar purposes in randomized controlled trials in patients with varying severities of predominantly respiratory tract infection and sepsis. These trials demonstrated effectiveness of monitoring PCT to de-escalate antibiotic treatment earlier without increasing rates of relapsing infections or other adverse outcomes. Although serial PCT measurement has shown value in risk stratification of ICU patients, PCT-guided antibiotic escalation protocols have not yet shown benefit for patients. Inclusion of PCT data in clinical algorithms improves individualized decision-making regarding antibiotic treatment in patients in critical care for respiratory infections or sepsis. Future research should focus on use of repeated PCT measurements to risk-stratify patients and guide treatment to improve their outcomes.

  4. Optimizing the response to surveillance alerts in automated surveillance systems.

    PubMed

    Izadi, Masoumeh; Buckeridge, David L

    2011-02-28

    Although much research effort has been directed toward refining algorithms for disease outbreak alerting, considerably less attention has been given to the response to alerts generated from statistical detection algorithms. Given the inherent inaccuracy in alerting, it is imperative to develop methods that help public health personnel identify optimal policies in response to alerts. This study evaluates the application of dynamic decision making models to the problem of responding to outbreak detection methods, using anthrax surveillance as an example. Adaptive optimization through approximate dynamic programming is used to generate a policy for decision making following outbreak detection. We investigate the degree to which the model can tolerate noise theoretically, in order to keep near-optimal behavior. We also evaluate the policy from our model empirically and compare it with current approaches in routine public health practice for investigating alerts. Timeliness of outbreak confirmation and total costs associated with the decisions made are used as performance measures. Using our approach, on average, 80 per cent of outbreaks were confirmed prior to the fifth post-attack day, with considerably less cost compared to response strategies currently in use. Experimental results are also provided to illustrate the robustness of the adaptive optimization approach and to show the realization of the derived error bounds in practice. Copyright © 2011 John Wiley & Sons, Ltd.
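
    The paper applies approximate dynamic programming to a richer model learned from simulation; as a compressed illustration of the underlying idea, the sketch below value-iterates over a discretized outbreak belief with a wait-versus-investigate choice. All costs, the belief dynamics, and the discretization are invented.

```python
import numpy as np

# States: belief that an outbreak is underway, on a discrete grid.
# Actions: wait (cheap, but a real outbreak grows) or investigate
# (costly, resolves the alert). All numbers are illustrative.
beliefs = np.linspace(0, 1, 21)
WAIT_COST, INVESTIGATE_COST, MISSED_COST = 1.0, 20.0, 200.0
gamma = 0.95

def step_belief(b):
    """Waiting yields more evidence: belief drifts away from 0.5."""
    return np.clip(b + 0.3 * (b - 0.5), 0, 1)

V = np.zeros_like(beliefs)
for _ in range(200):                      # value iteration to convergence
    nxt = np.interp(step_belief(beliefs), beliefs, V)
    q_wait = WAIT_COST + beliefs * 0.1 * MISSED_COST + gamma * nxt
    q_inv = INVESTIGATE_COST * np.ones_like(beliefs)
    V = np.minimum(q_wait, q_inv)

policy = np.where(q_wait <= q_inv, "wait", "investigate")
for b, a in zip(beliefs[::4], policy[::4]):
    print(f"belief={b:.2f} -> {a}")
```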

  5. The Effects of Sensor Performance as Modeled by Signal Detection Theory on the Performance of Reinforcement Learning in a Target Acquisition Task

    NASA Astrophysics Data System (ADS)

    Quirion, Nate

    Unmanned Aerial Systems (UASs) today are fulfilling more roles than ever before. There is a general push to have these systems feature more advanced autonomous capabilities in the near future. Achieving autonomous behavior requires some unique approaches to control and decision making. More advanced versions of these approaches are able to adapt their own behavior and examine their past experiences to increase their future mission performance. To achieve adaptive behavior and decision making capabilities, this study used Reinforcement Learning (RL) algorithms. In this research the effects of sensor performance, as modeled through Signal Detection Theory (SDT), on the ability of RL algorithms to accomplish a target localization task are examined. Three levels of sensor sensitivity are simulated and compared to the results of the same system using a perfect sensor. To accomplish the target localization task, a hierarchical architecture used two distinct agents. A simulated human operator is assumed to be a perfect decision maker and is used in the system feedback. An evaluation of the system is performed using multiple metrics, including episodic reward curves and the time taken to locate all targets. Statistical analyses are employed to detect significant differences in the comparison of steady-state behavior of different systems.
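
    Sensor performance under Signal Detection Theory is typically summarized by a sensitivity d' and a response criterion. The sketch below simulates the standard equal-variance Gaussian model at three sensitivity levels, mirroring the study design's three simulated sensors; the criterion value is arbitrary.

```python
import numpy as np
from scipy.stats import norm

def sdt_sensor(target_present, d_prime, criterion, rng):
    """Equal-variance Gaussian SDT sensor: the internal response is
    N(0,1) for noise and N(d',1) when a target is present; the sensor
    reports 'detect' when the response exceeds the criterion."""
    mean = d_prime if target_present else 0.0
    return rng.normal(mean, 1.0) > criterion

rng = np.random.default_rng(42)
for d_prime in (0.5, 1.0, 2.0):   # three sensitivity levels, as in the study design
    hits = sum(sdt_sensor(True, d_prime, 0.5, rng) for _ in range(10_000)) / 10_000
    fas = sum(sdt_sensor(False, d_prime, 0.5, rng) for _ in range(10_000)) / 10_000
    print(f"d'={d_prime}: hit rate={hits:.2f}, false-alarm rate={fas:.2f}")

# Theoretical rates for criterion c: P(hit) = 1 - Phi(c - d'), P(FA) = 1 - Phi(c).
print("theory d'=1:", 1 - norm.cdf(0.5 - 1.0), 1 - norm.cdf(0.5))
```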

  6. Integrated Traffic Flow Management Decision Making

    NASA Technical Reports Server (NTRS)

    Grabbe, Shon R.; Sridhar, Banavar; Mukherjee, Avijit

    2009-01-01

    A generalized approach is proposed to support integrated traffic flow management decision making studies at both the U.S. national and regional levels. It can consider tradeoffs between alternative optimization and heuristic based models, strategic versus tactical flight controls, and system versus fleet preferences. Preliminary testing was accomplished by implementing thirteen unique traffic flow management models, which included all of the key components of the system, and by conducting 85 six-hour fast-time simulation experiments. These experiments considered variations in the strategic planning look-ahead times, the replanning intervals, and the types of traffic flow management control strategies. Initial testing indicates that longer strategic planning look-ahead times and replanning intervals result in steadily decreasing levels of sector congestion for a fixed delay level. This applies when accurate estimates of the air traffic demand, airport capacities, and airspace capacities are available. In general, the distribution of the delays amongst the users was found to be most equitable when scheduling flights using a heuristic scheduling algorithm, such as ration-by-distance. On the other hand, equity was the worst when using scheduling algorithms that took into account the number of seats aboard each flight. Though the scheduling algorithms were effective at alleviating sector congestion, the tactical rerouting algorithm was the primary control for avoiding en route weather hazards. Finally, the modeled levels of sector congestion, the number of weather incursions, and the total system delays were found to be in fair agreement with the values that were operationally observed on both good and bad weather days.

  7. Quantitative imaging biomarkers: the application of advanced image processing and analysis to clinical and preclinical decision making.

    PubMed

    Prescott, Jeffrey William

    2013-02-01

    The importance of medical imaging for clinical decision making has been steadily increasing over the last four decades. Recently, there has also been an emphasis on medical imaging for preclinical decision making, i.e., for use in pharmaceutical and medical device development. There is also a drive towards quantification of imaging findings by using quantitative imaging biomarkers, which can improve sensitivity, specificity, accuracy and reproducibility of imaged characteristics used for diagnostic and therapeutic decisions. An important component of the discovery, characterization, validation and application of quantitative imaging biomarkers is the extraction of information and meaning from images through image processing and subsequent analysis. However, many advanced image processing and analysis methods are not applied directly to questions of clinical interest, i.e., for diagnostic and therapeutic decision making, which is a consideration that should be closely linked to the development of such algorithms. This article is meant to address these concerns. First, quantitative imaging biomarkers are introduced by providing definitions and concepts. Then, potential applications of advanced image processing and analysis to areas of quantitative imaging biomarker research are described; specifically, research into osteoarthritis (OA), Alzheimer's disease (AD) and cancer is presented. Then, challenges in quantitative imaging biomarker research are discussed. Finally, a conceptual framework for integrating clinical and preclinical considerations into the development of quantitative imaging biomarkers and their computer-assisted methods of extraction is presented.

  8. Short-term cascaded hydroelectric system scheduling based on chaotic particle swarm optimization using improved logistic map

    NASA Astrophysics Data System (ADS)

    He, Yaoyao; Yang, Shanlin; Xu, Qifa

    2013-07-01

    In order to solve the model of short-term cascaded hydroelectric system scheduling, a novel chaotic particle swarm optimization (CPSO) algorithm using an improved logistic map is introduced; it uses water discharge as the decision variable, combined with a death penalty function. Following the principle of maximum power generation, the proposed approach exploits the ergodicity, symmetry, and stochastic properties of the improved logistic chaotic map to enhance the performance of the particle swarm optimization (PSO) algorithm. The new hybrid method has been examined and tested on two test functions and a practical cascaded hydroelectric system. The experimental results show the effectiveness and robustness of the proposed CPSO algorithm in comparison with other traditional algorithms.
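
    The role of the chaotic map is straightforward to illustrate: chaotic sequences replace pseudo-random draws in initialization and in the velocity update. The sketch below uses the classic logistic map in its fully chaotic regime (mu = 4) as a stand-in for the paper's improved variant, with a sphere test function; coefficients are conventional PSO values, not the authors'.

```python
import numpy as np

def logistic_map(x):
    """Classic logistic map in its fully chaotic regime (mu = 4);
    stands in for the paper's 'improved' variant."""
    return 4.0 * x * (1.0 - x)

def chaotic_pso(f, dim, n=30, iters=200, lo=-5.0, hi=5.0, seed=0):
    rng = np.random.default_rng(seed)
    # Chaotic initialization: iterate the map to spread particles ergodically.
    c = rng.random((n, dim))
    for _ in range(50):
        c = logistic_map(c)
    x = lo + (hi - lo) * c
    v = np.zeros_like(x)
    pbest, pval = x.copy(), np.apply_along_axis(f, 1, x)
    gbest = pbest[pval.argmin()].copy()
    for _ in range(iters):
        c = logistic_map(c)               # chaotic sequences replace uniform r1, r2
        r1, r2 = c, logistic_map(c)
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        val = np.apply_along_axis(f, 1, x)
        better = val < pval
        pbest[better], pval[better] = x[better], val[better]
        gbest = pbest[pval.argmin()].copy()
    return gbest, pval.min()

sphere = lambda z: float(np.sum(z * z))   # standard test function
best, best_val = chaotic_pso(sphere, dim=5)
print(best_val)
```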

  9. ACL Return to Sport Guidelines and Criteria.

    PubMed

    Davies, George J; McCarty, Eric; Provencher, Matthew; Manske, Robert C

    2017-09-01

    Because of the epidemiological incidence of anterior cruciate ligament (ACL) injuries, the high reinjury rates when returning to sports, the limited number of patients who actually return to their premorbid level of competition, the high incidence of osteoarthritis at 5-10-year follow-ups, and the effects on the long-term health of the knee and the quality of life of the patient, individualizing the return to sports after ACL reconstruction (ACL-R) is critical. However, a challenging and unsolved dilemma is what criteria and clinical decision making should be used to return an athlete to sports following ACL-R. This article describes an example of a functional testing algorithm (FTA) as one method of clinical decision making, based on quantitative and qualitative testing and assessment, used to make informed decisions about returning an athlete to sports safely and without compromised performance. The methods were a review of the best current evidence to support an FTA. To evaluate all the complicated domains of clinical decision making for individualizing the return to sports after ACL-R, numerous assessments need to be performed, including biopsychosocial concepts, impairment testing, strength and power testing, functional testing, and patient-reported outcomes (PROs). The optimum criteria for individualizing the return to sports after ACL-R remain elusive. However, since this decision needs to be made on a regular basis with the safety and performance of the patient at stake, the FTA provides one method of making these decisions quantitatively and qualitatively. Admittedly, the system has no established predictive validity, but it does provide practical guidelines to facilitate the clinical decision-making process for return to sports. The clinical decision to return an athlete to competition has significant implications ranging from the safety of the athlete to performance factors and actual litigation issues. Using a multifactorial FTA, such as the one described, provides quantitative and qualitative criteria for making an informed decision in the best interests of the athlete.

  10. A clinical decision-making mechanism for context-aware and patient-specific remote monitoring systems using the correlations of multiple vital signs.

    PubMed

    Forkan, Abdur Rahim Mohammad; Khalil, Ibrahim

    2017-02-01

    In home-based context-aware monitoring, real-time data on a patient's multiple vital signs (e.g., heart rate, blood pressure) are continuously generated from wearable sensors. The changes in such vital parameters are highly correlated; they are also patient-centric and can be either recurrent or fluctuating. The objective of this study is to develop an intelligent method for personalized monitoring and clinical decision support through early estimation of patient-specific vital sign values and prediction of anomalies using the interrelation among multiple vital signs. In this paper, multi-label classification algorithms are applied in classifier design to forecast these values and related abnormalities. We propose a new approach to patient-specific vital sign prediction that uses the correlations among the vitals. The developed technique can guide healthcare professionals in making accurate clinical decisions. Moreover, our model can support many patients with various clinical conditions concurrently by utilizing the power of cloud computing technology. The developed method also reduces the rate of false predictions in remote monitoring centres. In the experimental settings, the statistical features and correlations of six vital signs are formulated as a multi-label classification problem. Eight multi-label classification algorithms, along with three fundamental machine learning algorithms, are used and tested on a public dataset of 85 patients. Different multi-label classification evaluation measures, such as Hamming score, F1-micro average, and accuracy, are used for interpreting the prediction performance of patient-specific situation classifications. We achieved Hamming score values of 90-95% across 24 classifier combinations for the 85 patients used in our experiment. The results are compared with single-label classifiers and with classifiers that do not consider the correlations among the vitals. The comparisons show that the multi-label method is the best technique for this problem domain. The evaluation results reveal that multi-label classification techniques using the correlations among multiple vitals are an effective way to estimate future values of those vitals early. In context-aware remote monitoring, this process can greatly help doctors in making quick diagnostic decisions. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
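
    One standard way to exploit label correlations in multi-label classification is a classifier chain, in which each label's prediction is fed to the next classifier. The sketch below applies scikit-learn's ClassifierChain to synthetic vital-sign abnormality labels with an invented correlation structure; it illustrates the technique family, not the authors' pipeline.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.multioutput import ClassifierChain

rng = np.random.default_rng(7)
n = 1000
# Synthetic features summarizing recent vital-sign windows (means, trends).
X = rng.normal(size=(n, 6))
# Correlated abnormality labels: e.g., tachycardia and hypotension tend
# to co-occur here (an invented structure for illustration only).
tachy = X[:, 0] + 0.5 * rng.normal(size=n) > 0.8
hypo = 0.7 * X[:, 0] + X[:, 1] > 1.0
desat = X[:, 2] > 1.2
Y = np.column_stack([tachy, hypo, desat]).astype(int)

X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, random_state=0)
# A classifier chain feeds each label's prediction into the next model,
# exploiting inter-vital correlations that independent classifiers miss.
chain = ClassifierChain(LogisticRegression(), order=[0, 1, 2], random_state=0)
chain.fit(X_tr, Y_tr)
pred = chain.predict(X_te)
hamming_score = (pred == Y_te).mean()   # per-label accuracy, cf. the paper's Hamming score
print(f"Hamming score: {hamming_score:.2f}")
```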

  11. How Can Bee Colony Algorithm Serve Medicine?

    PubMed Central

    Salehahmadi, Zeinab; Manafi, Amir

    2014-01-01

    Healthcare professionals must often make complex decisions with far-reaching consequences and associated risks in healthcare fields. As demonstrated in other industries, the ability to drill down into pertinent data and explore the knowledge behind it can greatly facilitate superior, informed decisions grounded in the facts. Nature has always inspired researchers to develop problem-solving models. The bee colony algorithm (BCA), based on the self-organized behavior of social insects, is one of the most popular members of the family of population-oriented, nature-inspired meta-heuristic swarm intelligence methods, and it has proved superior to several other nature-inspired algorithms. The objective of this model is to identify valid, novel, potentially useful, and understandable correlations and patterns in existing data. This review employs a thematic analysis of a series of online academic papers to outline BCA in the medical hive, reducing response and computational time and optimizing problem solving. To illustrate the benefits of this model, cases of disease diagnosis systems are presented. PMID:25489530

  12. How can bee colony algorithm serve medicine?

    PubMed

    Salehahmadi, Zeinab; Manafi, Amir

    2014-07-01

    Healthcare professionals must often make complex decisions with far-reaching consequences and associated risks in healthcare fields. As demonstrated in other industries, the ability to drill down into pertinent data and explore the knowledge behind it can greatly facilitate superior, informed decisions grounded in the facts. Nature has always inspired researchers to develop problem-solving models. The bee colony algorithm (BCA), based on the self-organized behavior of social insects, is one of the most popular members of the family of population-oriented, nature-inspired meta-heuristic swarm intelligence methods, and it has proved superior to several other nature-inspired algorithms. The objective of this model is to identify valid, novel, potentially useful, and understandable correlations and patterns in existing data. This review employs a thematic analysis of a series of online academic papers to outline BCA in the medical hive, reducing response and computational time and optimizing problem solving. To illustrate the benefits of this model, cases of disease diagnosis systems are presented.
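
    For readers unfamiliar with the method this review surveys, the sketch below implements a canonical artificial bee colony loop (employed, onlooker, and scout phases) minimizing a generic test function. Parameter values are conventional defaults; the medical applications discussed above would supply the objective function.

```python
import numpy as np

def abc_minimize(f, dim, n_food=20, iters=200, limit=20, lo=-5.0, hi=5.0, seed=0):
    """Canonical artificial bee colony: employed bees refine food
    sources, onlookers reinforce good ones, scouts replace exhausted
    ones. A generic sketch of the BCA family this review surveys."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(lo, hi, (n_food, dim))      # food sources = candidate solutions
    fit = np.apply_along_axis(f, 1, X)
    trials = np.zeros(n_food, dtype=int)
    for _ in range(iters):
        prob = (1 / (1 + fit)) / (1 / (1 + fit)).sum()
        # Employed and onlooker phases share the same local-move rule.
        for phase in ("employed", "onlooker"):
            for i in range(n_food):
                j = i if phase == "employed" else rng.choice(n_food, p=prob)
                k = rng.integers(n_food - 1)
                k += k >= j                      # random partner != j
                d = rng.integers(dim)
                cand = X[j].copy()
                cand[d] += rng.uniform(-1, 1) * (X[j, d] - X[k, d])
                cand[d] = np.clip(cand[d], lo, hi)
                cf = f(cand)
                if cf < fit[j]:
                    X[j], fit[j], trials[j] = cand, cf, 0
                else:
                    trials[j] += 1
        # Scout phase: abandon sources that stopped improving.
        worn = trials > limit
        if worn.any():
            X[worn] = rng.uniform(lo, hi, (worn.sum(), dim))
            fit[worn] = np.apply_along_axis(f, 1, X[worn])
            trials[worn] = 0
    return X[fit.argmin()], fit.min()

best, val = abc_minimize(lambda z: float(np.sum(z * z)), dim=4)
print(val)
```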

  13. Optimal GENCO bidding strategy

    NASA Astrophysics Data System (ADS)

    Gao, Feng

    Electricity industries worldwide are undergoing a period of profound upheaval. The conventional vertically integrated mechanism is being replaced by a competitive market environment. Generation companies have incentives to apply novel technologies to lower production costs, for example Combined Cycle units. Economic dispatch with Combined Cycle units becomes a non-convex optimization problem, which is difficult if not impossible to solve by conventional methods. Several techniques are proposed here: Mixed Integer Linear Programming, a hybrid method, and Evolutionary Algorithms. Evolutionary Algorithms share a common mechanism, stochastic searching per generation. The stochastic property makes evolutionary algorithms robust and adaptive enough to solve a non-convex optimization problem. This research implements Genetic Algorithm (GA), Evolutionary Programming (EP), and Particle Swarm (PS) algorithms for economic dispatch with Combined Cycle units, and compares them with classical Mixed Integer Linear Programming. The electricity market equilibrium model not only helps the Independent System Operator/Regulator analyze market performance and market power, but also provides Market Participants the ability to build optimal bidding strategies based on microeconomic analysis. Supply Function Equilibrium (SFE) is attractive compared to traditional models. This research identifies a proper SFE model that can be applied to a multiple-period situation. The equilibrium condition using discrete-time optimal control is then developed for fuel resource constraints. Finally, the research discusses the issues of multiple equilibria and mixed strategies, which are caused by the transmission network. Additionally, an advantage of the proposed model for merchant transmission planning is discussed. A market simulator is a valuable training and evaluation tool to assist sellers, buyers, and regulators in understanding market performance and making better decisions. A traditional optimization model may not be enough to represent the distributed, large-scale, and complex energy market. This research compares the performance and search paths of different artificial life techniques such as GA, EP, and PS, and looks for a proper method to emulate Generation Companies' (GENCOs') bidding strategies. After deregulation, GENCOs face risk and uncertainty associated with the fast-changing market environment. A profit-based bidding decision support system is critical for GENCOs to keep a competitive position in the new environment. Most past research does not pay special attention to the piecewise staircase characteristic of generator offer curves. This research proposes an optimal bidding strategy based on Parametric Linear Programming. The proposed algorithm is able to handle actual piecewise staircase energy offer curves. The proposed method is then extended to incorporate incomplete information based on Decision Analysis. Finally, the author develops an optimal bidding tool (GenBidding) and applies it to the RTS96 test system.

  14. Evaluation of a treatment-based classification algorithm for low back pain: a cross-sectional study.

    PubMed

    Stanton, Tasha R; Fritz, Julie M; Hancock, Mark J; Latimer, Jane; Maher, Christopher G; Wand, Benedict M; Parent, Eric C

    2011-04-01

    Several studies have investigated criteria for classifying patients with low back pain (LBP) into treatment-based subgroups. A comprehensive algorithm was created to translate these criteria into a clinical decision-making guide. This study investigated the translation of the individual subgroup criteria into a comprehensive algorithm by studying the prevalence of patients meeting the criteria for each treatment subgroup and the reliability of the classification. This was a cross-sectional, observational study. Two hundred fifty patients with acute or subacute LBP were recruited from the United States and Australia to participate in the study. Trained physical therapists performed standardized assessments on all participants. The researchers used these findings to classify participants into subgroups. Thirty-one participants were reassessed to determine interrater reliability of the algorithm decision. Based on individual subgroup criteria, 25.2% (95% confidence interval [CI]=19.8%-30.6%) of the participants did not meet the criteria for any subgroup, 49.6% (95% CI=43.4%-55.8%) of the participants met the criteria for only one subgroup, and 25.2% (95% CI=19.8%-30.6%) of the participants met the criteria for more than one subgroup. The most common combination of subgroups was manipulation + specific exercise (68.4% of the participants who met the criteria for 2 subgroups). Reliability of the algorithm decision was moderate (kappa=0.52, 95% CI=0.27-0.77, percentage of agreement=67%). Due to a relatively small patient sample, reliability estimates are somewhat imprecise. These findings provide important clinical data to guide future research and revisions to the algorithm. The finding that 25% of the participants met the criteria for more than one subgroup has important implications for the sequencing of treatments in the algorithm. Likewise, the finding that 25% of the participants did not meet the criteria for any subgroup provides important information regarding potential revisions to the algorithm's bottom table (which guides unclear classifications). Reliability of the algorithm is sufficient for clinical use.

  15. Knowledge of Fecal Calprotectin and Infliximab Trough Levels Alters Clinical Decision-making for IBD Outpatients on Maintenance Infliximab Therapy.

    PubMed

    Huang, Vivian W; Prosser, Connie; Kroeker, Karen I; Wang, Haili; Shalapay, Carol; Dhami, Neil; Fedorak, Darryl K; Halloran, Brendan; Dieleman, Levinus A; Goodman, Karen J; Fedorak, Richard N

    2015-06-01

    Infliximab is an effective therapy for inflammatory bowel disease (IBD). However, more than 50% of patients lose response. Empiric dose intensification is not effective for all patients because not all patients have objective disease activity or a subtherapeutic drug level. The aim was to determine how an objective marker of disease activity or therapeutic drug monitoring affects clinical decisions regarding maintenance infliximab therapy in outpatients with IBD. Consecutive patients with IBD on maintenance infliximab therapy were invited to participate by providing preinfusion stool and blood samples. Fecal calprotectin (FCP) and infliximab trough levels (ITLs) were measured by enzyme-linked immunosorbent assay. Three decisions were compared: (1) the actual clinical decision, (2) algorithmic FCP or ITL decisions, and (3) expert panel decisions based on (a) clinical data, (b) clinical data plus FCP, and (c) clinical data plus FCP plus ITL. In a secondary analysis, receiver operating characteristic curves were used to assess the ability of FCP and ITL to predict clinical disease activity or remission. A total of 36 sets of blood and stool were available for analysis; median FCP 191.5 μg/g, median ITL 7.3 μg/mL. The actual clinical decision differed from the hypothetical decision in 47.2% (FCP algorithm), 69.4% (ITL algorithm), 25.0% (expert panel clinical decision), 44.4% (expert panel clinical plus FCP), and 58.3% (expert panel clinical plus FCP plus ITL) of cases. FCP predicted clinical relapse (area under the curve [AUC] = 0.417; 95% confidence interval [CI], 0.197-0.641) and subtherapeutic ITL (AUC = 0.774; 95% CI, 0.536-1.000). ITL predicted clinical remission (AUC = 0.498; 95% CI, 0.254-0.742) and objective remission (AUC = 0.773; 95% CI, 0.622-0.924). Using FCP and ITLs in addition to clinical data results in an increased number of decisions to optimize management in outpatients with IBD on stable maintenance infliximab therapy.

  16. Using Decision Trees to Detect and Isolate Simulated Leaks in the J-2X Rocket Engine

    NASA Technical Reports Server (NTRS)

    Schwabacher, Mark A.; Aguilar, Robert; Figueroa, Fernando F.

    2009-01-01

    The goal of this work was to use data-driven methods to automatically detect and isolate faults in the J-2X rocket engine. It was decided to use decision trees, since they tend to be easier to interpret than other data-driven methods. The decision tree algorithm automatically "learns" a decision tree by performing a search through the space of possible decision trees to find one that fits the training data. The particular decision tree algorithm used is known as C4.5. Simulated J-2X data from a high-fidelity simulator developed at Pratt & Whitney Rocketdyne and known as the Detailed Real-Time Model (DRTM) was used to "train" and test the decision tree. Fifty-six DRTM simulations were performed for this purpose, with different leak sizes, different leak locations, and different times of leak onset. To make the simulations as realistic as possible, they included simulated sensor noise, and included a gradual degradation in both fuel and oxidizer turbine efficiency. A decision tree was trained using 11 of these simulations, and tested using the remaining 45 simulations. In the training phase, the C4.5 algorithm was provided with labeled examples of data from nominal operation and data including leaks in each leak location. From the data, it "learned" a decision tree that can classify unseen data as having no leak or having a leak in one of the five leak locations. In the test phase, the decision tree produced very low false alarm rates and low missed detection rates on the unseen data. It had very good fault isolation rates for three of the five simulated leak locations, but it tended to confuse the remaining two locations, perhaps because a large leak at one of these two locations can look very similar to a small leak at the other location.
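
    C4.5 itself is not implemented in common Python libraries, so as an approximation the sketch below trains a CART decision tree with the entropy criterion on synthetic sensor data with overlapping leak signatures; the sensor channels and mean shifts are invented stand-ins for DRTM outputs.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(3)

def simulate(leak_loc, n):
    """Toy stand-in for DRTM output: 3 sensor channels whose means
    shift differently for each leak location (0 = no leak)."""
    shift = {0: (0, 0, 0), 1: (2, 0, 0), 2: (0, 2, 0), 3: (1, 1, 0)}[leak_loc]
    return rng.normal(loc=shift, scale=1.0, size=(n, 3))

X = np.vstack([simulate(loc, 200) for loc in range(4)])
y = np.repeat(np.arange(4), 200)

# CART with the entropy criterion as a C4.5-like learner (scikit-learn
# does not implement C4.5 itself; this is an approximation).
tree = DecisionTreeClassifier(criterion="entropy", max_depth=3).fit(X, y)
print(export_text(tree, feature_names=["sensor_a", "sensor_b", "sensor_c"]))
```

    With the overlapping mean shifts chosen for locations 1 and 3, the learned tree tends to confuse those two classes, echoing the confusion between two leak locations reported above.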

  17. Zero-block mode decision algorithm for H.264/AVC.

    PubMed

    Lee, Yu-Ming; Lin, Yinyi

    2009-03-01

    In a previous paper, we proposed a zero-block intermode decision algorithm for H.264 video coding based upon the number of zero-blocks of 4 x 4 DCT coefficients between the current macroblock and the co-located macroblock. The proposed algorithm achieves significant savings in computation, but its performance is limited for high bit-rate coding. To improve computation efficiency, in this paper we suggest an enhanced zero-block decision algorithm, which uses an early zero-block detection method to compute the number of zero-blocks instead of direct DCT and quantization (DCT/Q) calculation, and which incorporates two suitable decision methods for the semi-stationary and nonstationary regions of a video sequence. In addition, the zero-block decision algorithm is also applied to intramode prediction in the P frame. The enhanced zero-block decision algorithm yields an average 27% reduction in total encoding time compared with the zero-block decision algorithm.
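
    The zero-block statistic itself is easy to illustrate: partition the residual into 4 x 4 blocks, transform, quantize, and count the blocks whose coefficients all quantize to zero. The sketch below uses a floating-point DCT and a simple deadzone quantizer as stand-ins for the H.264 integer transform and quantizer; the quantization step is arbitrary.

```python
import numpy as np
from scipy.fft import dctn

def zero_blocks(residual, qstep=16):
    """Count 4x4 blocks whose transform coefficients all quantize to
    zero -- the zero-block statistic the mode decision keys on."""
    h, w = residual.shape
    count = 0
    for r in range(0, h, 4):
        for c in range(0, w, 4):
            coeffs = dctn(residual[r:r+4, c:c+4], norm="ortho")
            if np.all(np.abs(coeffs) < qstep / 2):   # deadzone quantizer -> all zeros
                count += 1
    return count

rng = np.random.default_rng(5)
flat = rng.normal(0, 1, (16, 16))    # near-static region: mostly zero-blocks
busy = rng.normal(0, 40, (16, 16))   # high-activity region: few zero-blocks
print(zero_blocks(flat), "vs", zero_blocks(busy), "zero-blocks out of 16")
```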

  18. Using Multimodal Input for Autonomous Decision Making for Unmanned Systems

    NASA Technical Reports Server (NTRS)

    Neilan, James H.; Cross, Charles; Rothhaar, Paul; Tran, Loc; Motter, Mark; Qualls, Garry; Trujillo, Anna; Allen, B. Danette

    2016-01-01

    Autonomous decision making in the presence of uncertainty is a deeply studied problem space, particularly in the area of autonomous systems operations for land, air, sea, and space vehicles. Various techniques, ranging from single-algorithm solutions to complex ensemble classifier systems, have been utilized in a research context to solve mission-critical flight decisions. Realizing such systems on actual autonomous hardware, however, is a difficult systems integration problem that constitutes a majority of applied robotics development timelines. The ability to reliably and repeatedly classify objects during a vehicle's mission execution is vital if the vehicle is to mitigate both static and dynamic environmental concerns, complete the mission successfully, and operate and return safely. In this paper, the Autonomy Incubator proposes and discusses an ensemble learning and recognition system planned for our autonomous framework, AEON, in selected domains, which fuses decision criteria using prior experience at both the individual classifier layer and the ensemble layer to mitigate environmental uncertainty during operation.

  19. Digital Analytics in Professional Work and Learning

    ERIC Educational Resources Information Center

    Edwards, Richard; Fenwick, Tara

    2016-01-01

    In a wide range of fields, professional practice is being transformed by the increasing influence of digital analytics: the massive volumes of big data, and software algorithms that are collecting, comparing and calculating that data to make predictions and even decisions. Researchers in a number of social sciences have been calling attention to…

  20. Random Forest as a Predictive Analytics Alternative to Regression in Institutional Research

    ERIC Educational Resources Information Center

    He, Lingjun; Levine, Richard A.; Fan, Juanjuan; Beemer, Joshua; Stronach, Jeanne

    2018-01-01

    In institutional research, modern data mining approaches are seldom considered to address predictive analytics problems. The goal of this paper is to highlight the advantages of tree-based machine learning algorithms over classic (logistic) regression methods for data-informed decision making in higher education problems, and stress the success of…
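
    The paper's central comparison, a tree ensemble versus logistic regression on a prediction task, can be reproduced in miniature on synthetic data; the dataset below is an invented stand-in for institutional records such as retention or course-success indicators.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for institutional data (e.g., predicting retention
# from engagement features); the nonlinearity favors the tree ensemble.
X, y = make_classification(n_samples=2000, n_features=20, n_informative=8,
                           random_state=0)
for name, model in [("logistic", LogisticRegression(max_iter=1000)),
                    ("random forest", RandomForestClassifier(random_state=0))]:
    acc = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name}: {acc:.3f} cross-validated accuracy")
```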

  1. Adapting the RoboCup Simulation for Autonomous Vehicle Team Information Fusion and Decision Making Experimentation

    DTIC Science & Technology

    2010-06-01

    researchers outside the government to produce the kinds of algorithms and software that would easily transition into solutions for teams of autonomous ... vehicles for military scenarios. To accomplish this, we began modifying the RoboCup soccer game step-by-step to incorporate rules that simulate these

  2. Identification and Reconfigurable Control of Impaired Multi-Rotor Drones

    NASA Technical Reports Server (NTRS)

    Stepanyan, Vahram; Krishnakumar, Kalmanje; Bencomo, Alfredo

    2016-01-01

    The paper presents an algorithm for the control and safe landing of impaired multi-rotor drones when one or more motors fail, simultaneously or in any sequence. It includes three main components: an identification block, a reconfigurable control block, and a decision-making block. The identification block monitors each motor's load characteristics and current draw, based on which failures are detected. The control block generates the required total thrust and three axis torques for the altitude, horizontal position, and/or orientation control of the drone based on time-scale separation and nonlinear dynamic inversion. Horizontal displacement is controlled by modulating the roll and pitch angles. The decision-making algorithm maps the total thrust and three torques into the individual motor thrusts based on the information provided by the identification block. The drone continues mission execution as long as the functioning motors keep it controllable. Otherwise, the controller switches to a safe mode, which gives up yaw control and commands a safe landing spot and descent rate while maintaining horizontal attitude.
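
    The mapping from commanded total thrust and three torques to individual motor thrusts is a linear allocation problem, and reconfiguration after a failure can be expressed as re-solving it over the surviving motors. The sketch below uses a quad-X allocation matrix with illustrative geometry constants; it is a simplified stand-in for the paper's mapping, including the safe-mode step of dropping the yaw channel.

```python
import numpy as np

# Allocation matrix for a quad-X layout: rows map motor thrusts to
# [total thrust, roll torque, pitch torque, yaw torque]. Arm length L
# and yaw coefficient K are illustrative values.
L, K = 0.25, 0.05
A = np.array([
    [ 1,  1,  1,  1],    # thrust
    [-L,  L,  L, -L],    # roll
    [ L, -L,  L, -L],    # pitch
    [ K,  K, -K, -K],    # yaw (rotor drag)
])

def allocate(cmd, failed=()):
    """Map commanded [thrust, roll, pitch, yaw] to motor thrusts.
    Failed motors are zeroed and the command is redistributed via the
    pseudoinverse of the reduced allocation matrix."""
    alive = [m for m in range(4) if m not in failed]
    u = np.zeros(4)
    u[alive] = np.linalg.pinv(A[:, alive]) @ cmd
    return u

cmd = np.array([9.81, 0.0, 0.0, 0.0])   # hover: support weight, zero torques
print("all motors:", allocate(cmd).round(2))

# Safe mode with motor 2 out: give up yaw (drop the yaw row) and solve
# exactly for thrust, roll, and pitch over the surviving motors.
A_no_yaw, alive = A[:3], [0, 1, 3]
u = np.zeros(4)
u[alive] = np.linalg.pinv(A_no_yaw[:, alive]) @ cmd[:3]
print("motor 2 failed, yaw released:", u.round(2))
```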

  3. The GIS map coloring support decision-making system based on case-based reasoning and simulated annealing algorithm

    NASA Astrophysics Data System (ADS)

    Deng, Shuang; Xiang, Wenting; Tian, Yangge

    2009-10-01

    Map coloring is a hard task even for experienced map experts. In GIS projects, maps usually need to be colored according to the customer's requirements, which makes the work more complex. As GIS has developed, more and more programmers who lack training in cartography have joined project teams, and their colored maps are less likely to meet customer requirements. Experience shows that customers with similar backgrounds usually have similar tastes in map coloring. We therefore developed a GIS color-scheme decision-making system that retrieves the color schemes of similar customers from a case base for customers to select and adjust. The system mixes browser/server and client/server architectures; the client side uses JSP, which allows system developers to remotely call the color-scheme cases on the database server and communicate with customers. Unlike general case-based reasoning, even very similar customers may make different selections, so it is hard to provide a single "best" option. We therefore use the Simulated Annealing Algorithm (SAA) to arrange the order in which different color schemes are presented. Customers can also dynamically adjust the colors of certain features based on an existing case. The results show that the system facilitates communication between designers and customers and improves the quality and efficiency of map coloring.
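
    The role of simulated annealing here, arranging the order in which retrieved color schemes are presented, can be sketched with a position-weighted cost over case-base similarity scores. The scheme names, similarity values, and energy function below are all hypothetical.

```python
import math
import random

random.seed(0)
# Candidate color schemes retrieved from the case base, each with a
# similarity score to the current customer (hypothetical numbers).
schemes = {"scheme_a": 0.91, "scheme_b": 0.55, "scheme_c": 0.78,
           "scheme_d": 0.62, "scheme_e": 0.85}

def energy(order):
    """Penalize presenting low-similarity schemes early: a position-
    weighted cost, so annealing surfaces likely matches first."""
    return sum((len(order) - i) * (1 - schemes[s]) for i, s in enumerate(order))

order = list(schemes)
best = order[:]
temp, cooling = 5.0, 0.98
for _ in range(2000):
    i, j = random.sample(range(len(order)), 2)
    cand = order[:]
    cand[i], cand[j] = cand[j], cand[i]          # swap two presentation slots
    delta = energy(cand) - energy(order)
    if delta < 0 or random.random() < math.exp(-delta / temp):
        order = cand                             # accept downhill, or uphill with prob e^(-delta/T)
        if energy(order) < energy(best):
            best = order[:]
    temp *= cooling
print("presentation order:", best)
```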

  4. Evaluation of supervised machine-learning algorithms to distinguish between inflammatory bowel disease and alimentary lymphoma in cats.

    PubMed

    Awaysheh, Abdullah; Wilcke, Jeffrey; Elvinger, François; Rees, Loren; Fan, Weiguo; Zimmerman, Kurt L

    2016-11-01

    Inflammatory bowel disease (IBD) and alimentary lymphoma (ALA) are common gastrointestinal diseases in cats. The very similar clinical signs and histopathologic features of these diseases make the distinction between them diagnostically challenging. We tested the use of supervised machine-learning algorithms to differentiate between the 2 diseases using data generated from noninvasive diagnostic tests. Three prediction models were developed using 3 machine-learning algorithms: naive Bayes, decision trees, and artificial neural networks. The models were trained and tested on data from complete blood count (CBC) and serum chemistry (SC) results for the following 3 groups of client-owned cats: normal, inflammatory bowel disease (IBD), or alimentary lymphoma (ALA). Naive Bayes and artificial neural networks achieved higher classification accuracy (sensitivities of 70.8% and 69.2%, respectively) than the decision tree algorithm (63%, p < 0.0001). The areas under the receiver-operating characteristic curve for classifying cases into the 3 categories were 83% for naive Bayes, 79% for the decision tree, and 82% for artificial neural networks. Prediction models using machine learning provided a method for distinguishing between ALA-IBD, ALA-normal, and IBD-normal. The naive Bayes and artificial neural network classifiers used 10 and 4 of the CBC and SC variables, respectively, to outperform the C4.5 decision tree, which used 5 CBC and SC variables in classifying cats into the 3 classes. These models can provide another noninvasive diagnostic tool to assist clinicians with differentiating between IBD and ALA, and between diseased and nondiseased cats. © 2016 The Author(s).

  5. Evolutionary image simplification for lung nodule classification with convolutional neural networks.

    PubMed

    Lückehe, Daniel; von Voigt, Gabriele

    2018-05-29

    Understanding the decisions of deep learning techniques is important. Especially in the medical field, the reasons for a decision in a classification task are as crucial as the pure classification results. In this article, we propose a new approach to compute the relevant parts of a medical image. Knowing the relevant parts makes it easier to understand decisions. In our approach, a convolutional neural network is employed to learn the structures of images of lung nodules. Then, an evolutionary algorithm is applied to compute a simplified version of an unknown image based on the structures learned by the convolutional neural network. In the simplified version, irrelevant parts are removed from the original image. In the results, we show simplified images which allow the observer to focus on the relevant parts. In these images, more than 50% of the pixels are simplified. The simplified pixels do not change the meaning of the images with respect to the structures learned by the convolutional neural network. An experimental analysis shows the potential of the approach. Besides examples of simplified images, we analyze how the run time develops. Simplified images make it easier to focus on relevant parts and to find reasons for a decision. The combination of an evolutionary algorithm with a learned convolutional neural network is well suited to the simplification task. From a research perspective, it is interesting which areas of the images are simplified and which parts are taken as relevant.

  6. A stochastic conflict resolution model for trading pollutant discharge permits in river systems.

    PubMed

    Niksokhan, Mohammad Hossein; Kerachian, Reza; Amin, Pedram

    2009-07-01

    This paper presents an efficient methodology for developing pollutant discharge permit trading in river systems, considering the conflicting interests of the decision-makers and stakeholders involved. In this methodology, a trade-off curve between objectives is developed using a powerful, recently developed multi-objective genetic algorithm, the Nondominated Sorting Genetic Algorithm-II (NSGA-II). The best non-dominated solution on the trade-off curve is identified using the Young conflict resolution theory, which considers the utility functions of the decision makers and stakeholders of the system. These utility functions are related to the total treatment cost and a fuzzy risk of violating the water quality standards. The fuzzy risk is evaluated using Monte Carlo analysis. Finally, an optimization model provides the discharge permit trading policies. The practical utility of the proposed methodology in decision-making is illustrated through a realistic example of the Zarjub River in northern Iran.

  7. Modelling the impacts of new diagnostic tools for tuberculosis in developing countries to enhance policy decisions.

    PubMed

    Langley, Ivor; Doulla, Basra; Lin, Hsien-Ho; Millington, Kerry; Squire, Bertie

    2012-09-01

    The introduction and scale-up of new tools for the diagnosis of Tuberculosis (TB) in developing countries have the potential to make a huge difference to the lives of millions of people living in poverty. To achieve this, policy makers need the information to make the right decisions about which new tools to implement and where in the diagnostic algorithm to apply them most effectively. These decisions are difficult because the new tools are often expensive to implement and use, and the health system and patient impacts are uncertain, particularly in developing countries where there is a high burden of TB. The authors demonstrate that a discrete event simulation model could play a significant part in improving and informing these decisions. The feasibility of linking the discrete event simulation to a dynamic epidemiology model is also explored in order to take account of longer term impacts on the incidence of TB. Results from two diagnostic districts in Tanzania are used to illustrate how the approach could be used to improve decisions.

  8. Assessing decision quality in patient-centred care requires a preference-sensitive measure

    PubMed Central

    Kaltoft, Mette; Cunich, Michelle; Salkeld, Glenn; Dowie, Jack

    2014-01-01

    A theory-based instrument for measuring the quality of decisions made using any form of decision technology, including both decision-aided and unaided clinical consultations, is required to enable person- and patient-centred care and to respond positively to individual heterogeneity in the value aspects of decision making. Current instruments using the term ‘decision quality’ have adopted a decision- and thus condition-specific approach. We argue that patient-centred care requires decision quality to be regarded as both preference-sensitive across multiple relevant criteria and generic across all conditions and decisions. MyDecisionQuality is grounded in prescriptive multi-criteria decision analysis and employs a simple expected value algorithm to calculate a score for the quality of a decision that combines, in the clinical case, the patient’s individual preferences for eight quality criteria (expressed as importance weights) and their ratings of the decision just taken on each of these criteria (expressed as performance rates). It thus provides an index of decision quality that encompasses both these aspects. It also provides patients with help in prioritizing quality criteria for future decision making by calculating, for each criterion, the Incremental Value of Perfect Rating, that is, the increase in their decision quality score that would result if their performance rating on the criterion had been 100%, weightings unchanged. MyDecisionQuality, a web-based, generic, and preference-sensitive instrument, can constitute a key patient-reported measure of the quality of the decision-making process. It can provide the basis for future decision improvement, especially when the clinician (or other stakeholders) completes the equivalent instrument and the extent and nature of concordance and discordance can be established. Apart from its role in decision preparation and evaluation, it can also provide real time and relevant documentation for the patient’s record. PMID:24335587
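
    The score itself is a simple weighted sum, and the Incremental Value of Perfect Rating follows directly from it. A minimal sketch, assuming eight criteria with importance weights normalised to 1 and performance ratings rescaled to [0, 1]; the weights and ratings below are hypothetical.

    def decision_quality(weights, ratings):
        """Expected-value decision quality score plus, per criterion, the
        Incremental Value of Perfect Rating (IVPR): the gain in the score
        if that criterion's rating had been 100%, weights unchanged."""
        dq = sum(w * r for w, r in zip(weights, ratings))
        ivpr = [w * (1.0 - r) for w, r in zip(weights, ratings)]
        return dq, ivpr

    w = [0.25, 0.20, 0.15, 0.10, 0.10, 0.10, 0.05, 0.05]   # importance weights
    r = [0.8, 0.6, 1.0, 0.5, 0.9, 0.7, 1.0, 0.4]           # performance ratings
    dq, ivpr = decision_quality(w, r)
    print(f"decision quality = {dq:.2f}; "
          f"criterion {ivpr.index(max(ivpr)) + 1} has the largest IVPR")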

  9. Processing Technology Selection for Municipal Sewage Treatment Based on a Multi-Objective Decision Model under Uncertainty.

    PubMed

    Chen, Xudong; Xu, Zhongwen; Yao, Liming; Ma, Ning

    2018-03-05

    This study considers the two factors of environmental protection and economic benefits to address municipal sewage treatment. Based on considerations regarding the sewage treatment plant construction site, processing technology, capital investment, operation costs, water pollutant emissions, water quality and other indicators, we establish a general multi-objective decision model for optimizing municipal sewage treatment plant construction. Using the construction of a sewage treatment plant in a suburb of Chengdu as an example, this paper tests the general model of multi-objective decision-making for the sewage treatment plant construction by implementing a genetic algorithm. The results show the applicability and effectiveness of the multi-objective decision model for the sewage treatment plant. This paper provides decision and technical support for the optimization of municipal sewage treatment.

  10. Active Learning Using Hint Information.

    PubMed

    Li, Chun-Liang; Ferng, Chun-Sung; Lin, Hsuan-Tien

    2015-08-01

    The abundance of real-world data and the limited labeling budget call for active learning, an important learning paradigm for reducing human labeling efforts. Many recently developed active learning algorithms consider both uncertainty and representativeness when making querying decisions. However, exploiting representativeness with uncertainty concurrently usually requires tackling sophisticated and challenging learning tasks, such as clustering. In this letter, we propose a new active learning framework, called hinted sampling, which takes both uncertainty and representativeness into account in a simpler way. We design a novel active learning algorithm within the hinted sampling framework with an extended support vector machine. Experimental results validate that the novel active learning algorithm can result in a better and more stable performance than that achieved by state-of-the-art algorithms. We also show that the hinted sampling framework allows improving another active learning algorithm derived from the transductive support vector machine.

  11. Advancing beyond the system: telemedicine nurses' clinical reasoning using a computerised decision support system for patients with COPD - an ethnographic study.

    PubMed

    Barken, Tina Lien; Thygesen, Elin; Söderhamn, Ulrika

    2017-12-28

    Telemedicine is changing traditional nursing care: it entails nurses performing advanced and complex care within a new clinical environment and monitoring patients at a distance. Telemedicine practice requires complex disease management, which argues for supporting the nurses' reasoning and decision-making processes. Computerised decision support systems are being used increasingly to assist reasoning and decision-making in different situations. However, little research has focused on the clinical reasoning of nurses using a computerised decision support system in a telemedicine setting. Therefore, the objective of the study is to explore the process of telemedicine nurses' clinical reasoning when using a computerised decision support system for the management of patients with chronic obstructive pulmonary disease. The factors influencing the reasoning and decision-making processes were investigated. In this ethnographic study, a combination of data collection methods, including participatory observations, the think-aloud technique, and a focus group interview, was employed. Collected data were analysed using qualitative content analysis. When telemedicine nurses used a computerised decision support system for the management of patients with complex, unstable chronic obstructive pulmonary disease, two categories emerged: "the process of telemedicine nurses' reasoning to assess health change" and "the influence of the telemedicine setting on nurses' reasoning and decision-making processes". An overall theme, termed "advancing beyond the system", represented the connection between the reasoning processes and the telemedicine work and setting, where being familiar with the patient functioned as a foundation for the nurses' clinical reasoning process. In the telemedicine setting, when supported by a computerised decision support system, nurses' reasoning was enabled by the continuous flow of digital clinical data, regular video-mediated contact and shared decision-making with the patient. These factors fostered an in-depth knowledge of the patients and acted as a foundation for the nurses' reasoning process. Nurses' reasoning frequently advanced beyond the computerised decision support system recommendations. Future studies are warranted to develop more accurate algorithms, increase system maturity, and improve the integration of the digital clinical information with clinical experiences, to support telemedicine nurses' reasoning process.

  12. Quantum-Like Models for Decision Making in Psychology and Cognitive Science

    NASA Astrophysics Data System (ADS)

    Khrennikov, Andrei

    2009-02-01

    We show that (in contrast to a rather common opinion) the domain of applications of the mathematical formalism of quantum mechanics is not restricted to physics. This formalism can be applied to the description of various kinds of quantum-like (QL) information processing. In particular, the calculus of quantum (and more general QL) probabilities can be used to explain some paradoxical statistical data collected in psychology and cognitive science. The main lesson of our study is that one should sharply distinguish the mathematical apparatus of QM from QM as a physical theory: the domain of application of the mathematical apparatus is essentially wider than quantum physics. Keywords: quantum-like representation algorithm, formula of total probability, interference of probabilities, psychology, cognition, decision making.

  13. Dynamic Computation Offloading for Low-Power Wearable Health Monitoring Systems.

    PubMed

    Kalantarian, Haik; Sideris, Costas; Mortazavi, Bobak; Alshurafa, Nabil; Sarrafzadeh, Majid

    2017-03-01

    The objective of this paper is to describe and evaluate an algorithm to reduce power usage and increase battery lifetime for wearable health-monitoring devices. We describe a novel dynamic computation offloading scheme for real-time wearable health monitoring devices that adjusts the partitioning of data processing between the wearable device and mobile application as a function of desired classification accuracy. By making the correct offloading decision based on current system parameters, we show that we are able to reduce system power by as much as 20%. We demonstrate that computation offloading can be applied to real-time monitoring systems, and yields significant power savings. Making correct offloading decisions for health monitoring devices can extend battery life and improve adherence.
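
    A minimal sketch of such an offloading decision rule, choosing the lower-energy option among those that meet the desired classification accuracy. The per-sample compute energy, per-kilobyte radio energy, and accuracy figures are illustrative assumptions, not the paper's measured system parameters.

    def offload_decision(n_samples, local_mj_per_sample, radio_mj_per_kb,
                         kb_per_sample, min_accuracy, acc_local, acc_remote):
        """Pick where to classify a window of sensor data: on the wearable
        or on the phone."""
        e_local = n_samples * local_mj_per_sample                  # compute on device
        e_offload = n_samples * kb_per_sample * radio_mj_per_kb    # ship raw data
        # Keep only the options that meet the desired accuracy, then take
        # the cheaper one in energy; default to local if none qualifies.
        options = [(e, where) for e, where, acc in
                   [(e_local, "local", acc_local),
                    (e_offload, "offload", acc_remote)]
                   if acc >= min_accuracy]
        return min(options)[1] if options else "local"

    print(offload_decision(500, local_mj_per_sample=0.08, radio_mj_per_kb=0.9,
                           kb_per_sample=0.05, min_accuracy=0.90,
                           acc_local=0.88, acc_remote=0.95))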

  14. Privacy-preserving heterogeneous health data sharing.

    PubMed

    Mohammed, Noman; Jiang, Xiaoqian; Chen, Rui; Fung, Benjamin C M; Ohno-Machado, Lucila

    2013-05-01

    Privacy-preserving data publishing addresses the problem of disclosing sensitive data when mining for useful information. Among existing privacy models, ε-differential privacy provides one of the strongest privacy guarantees and makes no assumptions about an adversary's background knowledge. All existing solutions that ensure ε-differential privacy handle the problem of disclosing relational and set-valued data in a privacy-preserving manner separately. In this paper, we propose an algorithm that considers both relational and set-valued data in differentially private disclosure of healthcare data. The proposed approach makes a simple yet fundamental switch in differentially private algorithm design: instead of listing all possible records (ie, a contingency table) for noise addition, records are generalized before noise addition. The algorithm first generalizes the raw data in a probabilistic way, and then adds noise to guarantee ε-differential privacy. We showed that the disclosed data could be used effectively to build a decision tree induction classifier. Experimental results demonstrated that the proposed algorithm is scalable and performs better than existing solutions for classification analysis. The resulting utility may degrade when the output domain size is very large, making it potentially inappropriate to generate synthetic data for large health databases. Unlike existing techniques, the proposed algorithm allows the disclosure of health data containing both relational and set-valued data in a differentially private manner, and can retain essential information for discriminative analysis.
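
    A sketch in the spirit of the generalize-then-perturb idea, reduced to a single relational attribute: values are coarsened into ranges (generalization) and Laplace(1/ε) noise is then added to each range count. The paper's probabilistic generalization and its handling of set-valued data are more involved than this.

    import numpy as np

    def dp_histogram(ages, bin_width, epsilon, seed=None):
        """Coarsen ages into ranges, then add Laplace noise to the counts so
        that the released histogram satisfies epsilon-differential privacy."""
        rng = np.random.default_rng(seed)
        lo = (min(ages) // bin_width) * bin_width
        hi = (max(ages) // bin_width + 1) * bin_width
        edges = np.arange(lo, hi + bin_width, bin_width)
        counts, _ = np.histogram(ages, bins=edges)         # generalization step
        noisy = counts + rng.laplace(scale=1.0 / epsilon, size=counts.size)
        return edges, np.clip(np.round(noisy), 0, None)    # no negative counts

    edges, noisy = dp_histogram([23, 25, 31, 34, 35, 41, 44, 58],
                                bin_width=10, epsilon=0.5, seed=1)
    for a, b, c in zip(edges[:-1], edges[1:], noisy):
        print(f"[{a}, {b}): {int(c)}")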

  15. Privacy-preserving heterogeneous health data sharing

    PubMed Central

    Mohammed, Noman; Jiang, Xiaoqian; Chen, Rui; Fung, Benjamin C M; Ohno-Machado, Lucila

    2013-01-01

    Objective Privacy-preserving data publishing addresses the problem of disclosing sensitive data when mining for useful information. Among existing privacy models, ε-differential privacy provides one of the strongest privacy guarantees and makes no assumptions about an adversary's background knowledge. All existing solutions that ensure ε-differential privacy handle the problem of disclosing relational and set-valued data in a privacy-preserving manner separately. In this paper, we propose an algorithm that considers both relational and set-valued data in differentially private disclosure of healthcare data. Methods The proposed approach makes a simple yet fundamental switch in differentially private algorithm design: instead of listing all possible records (ie, a contingency table) for noise addition, records are generalized before noise addition. The algorithm first generalizes the raw data in a probabilistic way, and then adds noise to guarantee ε-differential privacy. Results We showed that the disclosed data could be used effectively to build a decision tree induction classifier. Experimental results demonstrated that the proposed algorithm is scalable and performs better than existing solutions for classification analysis. Limitation The resulting utility may degrade when the output domain size is very large, making it potentially inappropriate to generate synthetic data for large health databases. Conclusions Unlike existing techniques, the proposed algorithm allows the disclosure of health data containing both relational and set-valued data in a differentially private manner, and can retain essential information for discriminative analysis. PMID:23242630

  16. A decision support system using combined-classifier for high-speed data stream in smart grid

    NASA Astrophysics Data System (ADS)

    Yang, Hang; Li, Peng; He, Zhian; Guo, Xiaobin; Fong, Simon; Chen, Huajun

    2016-11-01

    Large volumes of high-speed streaming data are generated continuously by big power grids. In order to detect and avoid power grid failures, decision support systems (DSSs) are commonly adopted in power grid enterprises. Among decision-making algorithms, the incremental decision tree is the most widely used. In this paper, we propose a combined classifier that is a composite of a cache-based classifier (CBC) and a main tree classifier (MTC). We integrate this classifier into a stream processing engine on top of the DSS so that high-speed streaming data can be transformed into operational intelligence efficiently. Experimental results show that our proposed classifier returns more accurate answers than existing ones.

  17. An Evidence-Based Medicine Approach to Antihyperglycemic Therapy in Diabetes Mellitus to Overcome Overtreatment.

    PubMed

    Makam, Anil N; Nguyen, Oanh K

    2017-01-10

    Overtreatment is pervasive in medicine and leads to potential patient harms and excessive costs in health care. Although evidence-based medicine is often derided as rote, algorithmic practice of medicine, the appropriate application of key evidence-based medicine principles in clinical decision making is fundamental to preventing overtreatment and promoting high-value, individualized patient-centered care. Specifically, this article discusses the importance of (1) using absolute rather than relative estimates of benefits to inform treatment decisions; (2) considering the time horizon to benefit of treatments; (3) balancing potential harms and benefits; and (4) using shared decision making by physicians to incorporate the patient's values and preferences into treatment decisions. Here, we illustrate the application of these principles to the decision of whether or not to recommend intensive glycemic control to patients to minimize microvascular and cardiovascular complications in type 2 diabetes mellitus. Through this lens, this example illustrates how an evidence-based medicine approach can be used to individualize glycemic goals and prevent overtreatment, and it can serve as a template for applying evidence-based medicine to inform treatment decisions for other conditions to optimize health and individualize patient care. © 2017 American Heart Association, Inc.
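
    Principle (1) is easy to make concrete: the sketch below converts a relative risk reduction into the absolute risk reduction and the number needed to treat. The 10-year complication risks used here are hypothetical, for illustration only, not trial results.

    def absolute_effect(risk_control, risk_treated):
        """Absolute risk reduction (ARR), relative risk reduction (RRR) and
        number needed to treat (NNT): the 'absolute' framing the article
        recommends over relative estimates alone."""
        arr = risk_control - risk_treated
        rrr = arr / risk_control
        nnt = 1.0 / arr
        return arr, rrr, nnt

    # Hypothetical 10-year microvascular-complication risks without and with
    # intensive glycemic control:
    arr, rrr, nnt = absolute_effect(0.10, 0.08)
    print(f"RRR = {rrr:.0%}, but ARR = {arr:.0%}; NNT = {nnt:.0f} over 10 years")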

  18. The Separatrix Algorithm for Synthesis and Analysis of Stochastic Simulations with Applications in Disease Modeling

    PubMed Central

    Klein, Daniel J.; Baym, Michael; Eckhoff, Philip

    2014-01-01

    Decision makers in epidemiology and other disciplines are faced with the daunting challenge of designing interventions that will be successful with high probability and robust against a multitude of uncertainties. To facilitate the decision making process in the context of a goal-oriented objective (e.g., eradicate polio by ), stochastic models can be used to map the probability of achieving the goal as a function of parameters. Each run of a stochastic model can be viewed as a Bernoulli trial in which “success” is returned if and only if the goal is achieved in simulation. However, each run can take a significant amount of time to complete, and many replicates are required to characterize each point in parameter space, so specialized algorithms are required to locate desirable interventions. To address this need, we present the Separatrix Algorithm, which strategically locates parameter combinations that are expected to achieve the goal with a user-specified probability of success (e.g. 95%). Technically, the algorithm iteratively combines density-corrected binary kernel regression with a novel information-gathering experiment design to produce results that are asymptotically correct and work well in practice. The Separatrix Algorithm is demonstrated on several test problems, and on a detailed individual-based simulation of malaria. PMID:25078087

  19. Tracks detection from high-orbit space objects

    NASA Astrophysics Data System (ADS)

    Shumilov, Yu. P.; Vygon, V. G.; Grishin, E. A.; Konoplev, A. O.; Semichev, O. P.; Shargorodskii, V. D.

    2017-05-01

    The paper presents the results of studies of a composite algorithm for the detection of high-orbit space objects. Before the algorithm is applied, a series of frames with weak tracks of space objects, which may be discrete, is recorded. The algorithm includes pre-processing that is classical in astronomy, matched filtering of each frame and its threshold processing, a shear transformation, median filtering of the transformed series of frames, repeated threshold processing, and the detection decision. Weak tracks of space objects were modeled on real frames of the night starry sky obtained with a stationary telescope. It is shown that the penetrating power of the optoelectronic device increased by almost 2 magnitudes.

  20. Projection pursuit water quality evaluation model based on chicken swarm algorithm

    NASA Astrophysics Data System (ADS)

    Hu, Zhe

    2018-03-01

    In view of the uncertainty and ambiguity of each index in water quality evaluation, and in order to resolve the incompatibility of the evaluation results of individual water quality indexes, a projection pursuit model based on the chicken swarm algorithm is proposed. A projection index function that reflects water quality conditions is constructed; the chicken swarm algorithm (CSA) is introduced to optimize the projection index function and seek its best projection direction, and the best projection value is obtained to realize the water quality evaluation. Comparison between this method and other methods shows that it is reasonable and feasible and can provide a decision-making basis for water pollution control in the basin.

  1. Transforming clinical practice guidelines and clinical pathways into fast-and-frugal decision trees to improve clinical care strategies.

    PubMed

    Djulbegovic, Benjamin; Hozo, Iztok; Dale, William

    2018-02-27

    Contemporary delivery of health care is inappropriate in many ways, largely due to suboptimal decision-making. A typical approach to improving practitioners' decision-making is to develop evidence-based clinical practice guidelines (CPGs) through guidelines panels, who are instructed to use their judgment to derive practice recommendations. However, the mechanisms for formulating guideline judgments remain a "black-box" operation: a process with defined inputs and outputs but without sufficient knowledge of its internal workings. Increased explicitness and transparency in the process can be achieved by implementing CPGs as clinical pathways (CPs), also known as clinical algorithms or flow-charts. However, clinical recommendations thus derived are typically ad hoc and developed by experts in a theory-free environment. As any recommendation can be right (true positive or negative) or wrong (false positive or negative), the lack of theoretical structure precludes quantitative assessment of the management strategies recommended by CPGs/CPs. To realize the full potential of CPGs/CPs, they need to be placed on more solid theoretical ground. We believe this potential can best be realized by recasting CPGs/CPs within the heuristic theory of decision-making, often implemented as fast-and-frugal (FFT) decision trees. This is possible because the FFT heuristic strategy of decision-making can be linked to signal detection theory, evidence accumulation theory, and a threshold model of decision-making, which, in turn, allows quantitative analysis of the accuracy of clinical management strategies. Fast-and-frugal trees provide a simple and transparent, yet solid and robust, methodological framework connecting decision science to clinical care: a sorely needed missing link between CPGs/CPs and patient outcomes. We therefore advocate that all guidelines panels express their recommendations as CPs, which in turn should be converted into FFTs to guide clinical care. © 2018 John Wiley & Sons, Ltd.
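
    To make the FFT structure concrete: cues are checked in a fixed order, every cue but the last offers a single immediate exit, and the last cue exits both ways. The cue names, tests, and decisions in this sketch are hypothetical, illustrating only the shape of such a tree.

    def fft_classify(patient, cues, final_exit="discharge"):
        """Fast-and-frugal tree: walk the ordered cues and leave at the first
        one whose test fires; otherwise take the last cue's other exit."""
        for name, test, exit_decision in cues:
            if test(patient):
                return exit_decision           # one-sided exit at this cue
        return final_exit                      # the final cue's other exit

    cues = [
        ("ST changes",        lambda p: p["st_change"],      "coronary care"),
        ("chest pain chief",  lambda p: not p["chest_pain"], "discharge"),
        ("other risk factor", lambda p: p["risk_factor"],    "coronary care"),
    ]
    print(fft_classify({"st_change": False, "chest_pain": True,
                        "risk_factor": False}, cues))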

  2. A Genetic Algorithm for the Bi-Level Topological Design of Local Area Networks

    PubMed Central

    Camacho-Vallejo, José-Fernando; Mar-Ortiz, Julio; López-Ramos, Francisco; Rodríguez, Ricardo Pedraza

    2015-01-01

    Local access networks (LANs) are commonly used as communication infrastructures that meet the demand of a set of users in a local environment. Usually these networks consist of several LAN segments connected by bridges. The bi-level topological LAN design problem consists of assigning users to clusters and connecting the clusters by bridges so as to obtain a network with minimum response time and minimum connection cost. The leader decides the optimal assignment of users to clusters, and the follower decides how to connect all the clusters into a spanning tree. In this paper, we propose a genetic algorithm for solving the bi-level topological design of a local access network. Our solution method uses the Stackelberg equilibrium to solve the bi-level problem. The Stackelberg-Genetic algorithm procedure deals with the fact that the follower’s problem cannot be optimally solved in a straightforward manner. Computational results on two different sets of instances show that the developed algorithm is efficient and more suitable for solving the bi-level problem than a previous Nash-Genetic approach. PMID:26102502

  3. A bio-inspired swarm robot coordination algorithm for multiple target searching

    NASA Astrophysics Data System (ADS)

    Meng, Yan; Gan, Jing; Desai, Sachi

    2008-04-01

    The coordination of a multi-robot system searching for multiple targets is challenging in dynamic environments, since the multi-robot system demands both group coherence (agents need the incentive to work together faithfully) and group competence (agents need to know how to work together well). In our previously proposed bio-inspired coordination method, Local Interaction through Virtual Stigmergy (LIVS), one problem is the considerable randomness of robot movement during coordination, which may lead to more power consumption and longer search times. To address these issues, an adaptive LIVS (ALIVS) method is proposed in this paper, which not only considers travel cost and target weight, but also predicts the target/robot ratio and potential robot redundancy with respect to the detected targets. Furthermore, dynamic weight adjustment is applied to improve the search performance. The new method is truly distributed: each robot makes its own decisions based on its local sensing information and information from its neighbors. Basically, each robot communicates only with its neighbors through a virtual stigmergy mechanism and makes its local movement decisions based on a Particle Swarm Optimization (PSO) algorithm. The proposed ALIVS algorithm has been implemented on the embodied robot simulator Player/Stage in a target-searching task. The simulation results demonstrate its efficiency and robustness, in a power-efficient manner, under real-world constraints.

  4. Uncertainty quantification in downscaling procedures for effective decisions in energy systems

    NASA Astrophysics Data System (ADS)

    Constantinescu, E. M.

    2010-12-01

    Weather is a major driver both of energy supply and demand, and with the massive adoption of renewable energy sources and changing economic and producer-consumer paradigms, the management of the next-generation energy systems is becoming ever more challenging. The operational and planning decisions in energy systems are guided by efficiency and reliability, and therefore a central role in these decisions will be played by the ability to obtain weather condition forecasts with accurate uncertainty estimates. The appropriate temporal and spatial resolutions needed for effective decision-making, be it operational or planning, is not clear. It is arguably certain however, that such temporal scales as hourly variations of temperature or wind conditions and ramp events are essential in this process. Planning activities involve decade or decades-long projections of weather. One sensible way to achieve this is to embed regional weather models in a global climate system. This strategy acts as a downscaling procedure. Uncertainty modeling techniques must be developed in order to quantify and minimize forecast errors as well as target variables that impact the decision-making process the most. We discuss the challenges of obtaining a realistic uncertainty quantification estimate using mathematical algorithms based on scalable matrix-free computations and physics-based statistical models. The process of making decisions for energy management systems based on future weather scenarios is a very complex problem. We shall focus on the challenges in generating wind power predictions based on regional weather predictions, and discuss the implications of making the common assumptions about the uncertainty models.

  5. Perspective on intelligent avionics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, H.L.

    1987-01-01

    Technical issues which could potentially limit the capability and acceptability of expert-system decision-making for avionics applications are addressed. These issues are: real-time AI, mission-critical software, conventional algorithms, pilot interface, knowledge acquisition, and distributed expert systems. Examples from ongoing expert system development programs are presented to illustrate likely architectures and applications of future intelligent avionics systems. 13 references.

  6. Application of wildfire simulation models for risk analysis

    Treesearch

    Alan A. Ager; Mark A. Finney

    2009-01-01

    Wildfire simulation models are being widely used by fire and fuels specialists in the U.S. to support tactical and strategic decisions related to the mitigation of wildfire risk. Much of this application has resulted from the development of a minimum travel time (MTT) fire spread algorithm (M. Finney) that makes it computationally feasible to simulate thousands of...

  7. An In-Process Surface Roughness Recognition System in End Milling Operations

    ERIC Educational Resources Information Center

    Yang, Lieh-Dai; Chen, Joseph C.

    2004-01-01

    To develop an in-process quality control system, a sensor technique and a decision-making algorithm need to be applied during machining operations. Several sensor techniques have been used in the in-process prediction of quality characteristics in machining operations. For example, an accelerometer sensor can be used to monitor the vibration of…

  8. Predictive models for mortality after ruptured aortic aneurysm repair do not predict futility and are not useful for clinical decision making.

    PubMed

    Thompson, Patrick C; Dalman, Ronald L; Harris, E John; Chandra, Venita; Lee, Jason T; Mell, Matthew W

    2016-12-01

    The clinical decision-making utility of scoring algorithms for predicting mortality after ruptured abdominal aortic aneurysms (rAAAs) remains unknown. We sought to determine the clinical utility of the algorithms compared with our clinical decision making and outcomes for management of rAAA during a 10-year period. Patients admitted with a diagnosis of rAAA at a large university hospital were identified from 2005 to 2014. The Glasgow Aneurysm Score, Hardman Index, Vancouver Score, Edinburgh Ruptured Aneurysm Score, University of Washington Ruptured Aneurysm Score, Vascular Study Group of New England rAAA Risk Score, and the Artificial Neural Network Score were analyzed for accuracy in predicting mortality. Among patients stratified into the highest-risk group (predicted mortality >80%-85%), we compared the predicted with the actual outcome to determine how well these scores predicted futility. The cohort comprised 64 patients. Of those, 24 (38%) underwent open repair, 36 (56%) underwent endovascular repair, and 4 (6%) received only comfort care. Overall mortality was 30% (open repair, 26%; endovascular repair, 24%; no repair, 100%). As assessed by the scoring systems, 5% to 35% of patients were categorized as high-mortality risk. Intersystem agreement was poor, with κ values ranging from 0.06 to 0.79. Actual mortality was lower than the predicted mortality (50%-70% vs 78%-100%) for all scoring systems, with each scoring system overestimating mortality by 10% to 50%. Mortality rates for patients not designated to the high-risk cohort were dramatically lower, ranging from 7% to 29%. Futility, defined as 100% mortality, was predicted in five of 63 patients by the Hardman Index and in two of 63 by the University of Washington score. Of these, surgery was not offered to one of five and one of two patients, respectively. If one of these two models had been used to withhold operative intervention, the mortality of these patients would have been 100%. The actual mortality for these patients was 60% and 50%, respectively. Clinical algorithms for predicting mortality after rAAA were not useful for predicting futility. Most patients with rAAA were not classified in the highest-risk group by the clinical decision models. Among patients identified as highest risk, predicted mortality was overestimated compared with actual mortality. The data from this study support the limited value to surgeons of the currently published algorithms. Copyright © 2016 Society for Vascular Surgery. Published by Elsevier Inc. All rights reserved.

  9. A Space Object Detection Algorithm using Fourier Domain Likelihood Ratio Test

    NASA Astrophysics Data System (ADS)

    Becker, D.; Cain, S.

    Space object detection is of great importance in the highly dependent yet competitive and congested space domain. Detection algorithms employed play a crucial role in fulfilling the detection component in the situational awareness mission to detect, track, characterize and catalog unknown space objects. Many current space detection algorithms use a matched filter or a spatial correlator to make a detection decision at a single pixel point of a spatial image based on the assumption that the data follows a Gaussian distribution. This paper explores the potential for detection performance advantages when operating in the Fourier domain of long exposure images of small and/or dim space objects from ground based telescopes. A binary hypothesis test is developed based on the joint probability distribution function of the image under the hypothesis that an object is present and under the hypothesis that the image only contains background noise. The detection algorithm tests each pixel point of the Fourier transformed images to make the determination if an object is present based on the criteria threshold found in the likelihood ratio test. Using simulated data, the performance of the Fourier domain detection algorithm is compared to the current algorithm used in space situational awareness applications to evaluate its value.
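
    A matched-filter stand-in for the pixel-wise Fourier-domain test: the frame is correlated with the point-spread function via the FFT and the statistic is thresholded at each pixel. The paper's likelihood ratio test is built from the joint distributions under both hypotheses, which this sketch does not reproduce; the PSF, frame, and threshold are illustrative.

    import numpy as np

    def detection_map(frame, psf, threshold):
        """Correlate a long-exposure frame with the PSF in the Fourier
        domain and make a per-pixel detection decision by thresholding."""
        F = np.fft.fft2(frame)
        H = np.fft.fft2(psf, s=frame.shape)            # zero-padded PSF
        stat = np.real(np.fft.ifft2(F * np.conj(H)))   # matched filtering
        return stat > threshold

    rng = np.random.default_rng(2)
    frame = rng.normal(size=(64, 64))
    frame[30:33, 40:43] += 3.0                         # faint simulated object
    psf = np.ones((3, 3)) / 9.0
    print(detection_map(frame, psf, threshold=2.0).sum(), "pixels flagged")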

  10. A Predictive Algorithm to Detect Opioid Use Disorder

    PubMed Central

    Lee, Chee; Sharma, Maneesh; Kantorovich, Svetlana

    2018-01-01

    Purpose: The purpose of this study was to determine the clinical utility of an algorithm-based decision tool designed to assess risk associated with opioid use in the primary care setting. Methods: A prospective, longitudinal study was conducted to assess the utility of precision medicine testing in 1822 patients across 18 family medicine/primary care clinics in the United States. Using the profile, patients were categorized into low, moderate, and high risk for opioid use. Physicians who ordered testing were asked to complete patient evaluations and document their actions, decisions, and perceptions regarding the utility of the precision medicine tests. Results: Approximately 47% of primary care physicians surveyed used the profile to guide clinical decision-making. These physicians rated the benefit of the profile on patient care an average of 3.6 on a 5-point scale (1 indicating no benefit and 5 indicating significant benefit). Eighty-eight percent of all clinicians surveyed felt the test exhibited some benefit to their patient care. The most frequent utilization for the profile was to guide a change in opioid prescribed. Physicians reported greater benefit of profile utilization for minority patients. Patients whose treatment was guided by the profile had pain levels that were reduced, on average, 2.7 levels on the numeric rating scale. Conclusions: The profile provided primary care physicians with a useful tool to stratify the risk of opioid use disorder and was rated as beneficial for decision-making and patient improvement by the majority of physicians surveyed. Physicians reported the profile resulted in greater clinical improvement for minorities, highlighting the objective use of this profile to guide judicial use of opioids in high-risk patients. Significantly, when physicians used the profile to guide treatment decisions, patient-reported pain was greatly reduced. PMID:29383324

  11. A Predictive Algorithm to Detect Opioid Use Disorder: What Is the Utility in a Primary Care Setting?

    PubMed

    Lee, Chee; Sharma, Maneesh; Kantorovich, Svetlana; Brenton, Ashley

    2018-01-01

    The purpose of this study was to determine the clinical utility of an algorithm-based decision tool designed to assess risk associated with opioid use in the primary care setting. A prospective, longitudinal study was conducted to assess the utility of precision medicine testing in 1822 patients across 18 family medicine/primary care clinics in the United States. Using the profile, patients were categorized into low, moderate, and high risk for opioid use. Physicians who ordered testing were asked to complete patient evaluations and document their actions, decisions, and perceptions regarding the utility of the precision medicine tests. Approximately 47% of primary care physicians surveyed used the profile to guide clinical decision-making. These physicians rated the benefit of the profile on patient care an average of 3.6 on a 5-point scale (1 indicating no benefit and 5 indicating significant benefit). Eighty-eight percent of all clinicians surveyed felt the test exhibited some benefit to their patient care. The most frequent utilization for the profile was to guide a change in opioid prescribed. Physicians reported greater benefit of profile utilization for minority patients. Patients whose treatment was guided by the profile had pain levels that were reduced, on average, 2.7 levels on the numeric rating scale. The profile provided primary care physicians with a useful tool to stratify the risk of opioid use disorder and was rated as beneficial for decision-making and patient improvement by the majority of physicians surveyed. Physicians reported the profile resulted in greater clinical improvement for minorities, highlighting the objective use of this profile to guide judicial use of opioids in high-risk patients. Significantly, when physicians used the profile to guide treatment decisions, patient-reported pain was greatly reduced.

  12. Food Culture, Preferences and Ethics in Dysphagia Management.

    PubMed

    Kenny, Belinda

    2015-11-01

    Adults with dysphagia experience difficulties swallowing food and fluids with potentially harmful health and psychosocial consequences. Speech pathologists who manage patients with dysphagia are frequently required to address ethical issues when patients' food culture and/ or preferences are inconsistent with recommended diets. These issues incorporate complex links between food, identity and social participation. A composite case has been developed to reflect ethical issues identified by practising speech pathologists for the purposes of illustrating ethical concerns in dysphagia management. The case examines a speech pathologist's role in supporting patient autonomy when patients and carers express different goals and values. The case presents a 68-year-old man of Australian/Italian heritage with severe swallowing impairment and strong values attached to food preferences. The case is examined through application of the dysphagia algorithm, a tool for shared decision-making when patients refuse dietary modifications. Case analysis revealed the benefits and challenges of shared decision-making processes in dysphagia management. Four health professional skills and attributes were identified as synonymous with shared decision making: communication, imagination, courage and reflection. © 2015 John Wiley & Sons Ltd.

  13. Connecting clinical and actuarial prediction with rule-based methods.

    PubMed

    Fokkema, Marjolein; Smits, Niels; Kelderman, Henk; Penninx, Brenda W J H

    2015-06-01

    Meta-analyses comparing the accuracy of clinical versus actuarial prediction have shown actuarial methods to outperform clinical methods, on average. However, actuarial methods are still not widely used in clinical practice, and there has been a call for the development of actuarial prediction methods for clinical practice. We argue that rule-based methods may be more useful than the linear main effect models usually employed in prediction studies, from a data and decision analytic as well as a practical perspective. In addition, decision rules derived with rule-based methods can be represented as fast and frugal trees, which, unlike main effects models, can be used in a sequential fashion, reducing the number of cues that have to be evaluated before making a prediction. We illustrate the usability of rule-based methods by applying RuleFit, an algorithm for deriving decision rules for classification and regression problems, to a dataset on prediction of the course of depressive and anxiety disorders from Penninx et al. (2011). The RuleFit algorithm provided a model consisting of 2 simple decision rules, requiring evaluation of only 2 to 4 cues. Predictive accuracy of the 2-rule model was very similar to that of a logistic regression model incorporating 20 predictor variables, originally applied to the dataset. In addition, the 2-rule model required, on average, evaluation of only 3 cues. Therefore, the RuleFit algorithm appears to be a promising method for creating decision tools that are less time consuming and easier to apply in psychological practice, and with accuracy comparable to traditional actuarial methods. © 2015 APA, all rights reserved.

  14. Evolution with Reinforcement Learning in Negotiation

    PubMed Central

    Zou, Yi; Zhan, Wenjie; Shao, Yuan

    2014-01-01

    Adaptive behavior depends less on the details of the negotiation process and makes more robust predictions in the long term than in the short term. However, the extant literature on population dynamics for behavior adjustment has only examined the current situation. To offset this limitation, we propose a synergy of evolutionary algorithm and reinforcement learning to investigate long-term collective performance and strategy evolution. The model adopts reinforcement learning with a tradeoff between historical and current information to make decisions when the strategies of agents evolve through repeated interactions. The results demonstrate that the strategies in populations converge to stable states, and the agents gradually form steady negotiation habits. Agents that adopt reinforcement learning perform better in payoff, fairness, and stability than their counterparts using a classic evolutionary algorithm. PMID:25048108

  15. Evolution with reinforcement learning in negotiation.

    PubMed

    Zou, Yi; Zhan, Wenjie; Shao, Yuan

    2014-01-01

    Adaptive behavior depends less on the details of the negotiation process and makes more robust predictions in the long term than in the short term. However, the extant literature on population dynamics for behavior adjustment has only examined the current situation. To offset this limitation, we propose a synergy of evolutionary algorithm and reinforcement learning to investigate long-term collective performance and strategy evolution. The model adopts reinforcement learning with a tradeoff between historical and current information to make decisions when the strategies of agents evolve through repeated interactions. The results demonstrate that the strategies in populations converge to stable states, and the agents gradually form steady negotiation habits. Agents that adopt reinforcement learning perform better in payoff, fairness, and stability than their counterparts using a classic evolutionary algorithm.

  16. Autonomous mechanism of internal choice estimate underlies decision inertia.

    PubMed

    Akaishi, Rei; Umeda, Kazumasa; Nagase, Asako; Sakai, Katsuyuki

    2014-01-08

    Our choice is influenced by choices we made in the past, but the mechanism responsible for this choice bias remains elusive. Here we show that the history-dependent choice bias can be explained by an autonomous learning rule whereby an estimate of the likelihood of a choice being made is updated on each trial by comparing the actual and expected choices. We found that in perceptual decision making without performance feedback, a decision on an ambiguous stimulus is repeated on the subsequent trial more often than a decision on a salient stimulus. This inertia of decision was not accounted for by biases in motor response, sensory processing, or attention. The posterior cingulate cortex and frontal eye field represent the choice prediction error and the choice estimate in the learning algorithm, respectively. Interactions between the two regions during the intertrial interval are associated with decision inertia on the subsequent trial. Copyright © 2014 Elsevier Inc. All rights reserved.
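
    The learning rule described can be written as a one-line delta rule. The sketch below is an interpretation of the abstract, with the learning rate alpha as a free assumption: the internal estimate of the likelihood of a choice is nudged by the choice prediction error, and the updated estimate biases the next trial.

    def update_choice_estimate(p_left, chose_left, alpha=0.3):
        """Nudge the estimated probability of choosing 'left' by the choice
        prediction error (actual choice minus expected choice)."""
        prediction_error = float(chose_left) - p_left
        return p_left + alpha * prediction_error

    # Repeating a choice pushes the estimate toward it, producing inertia:
    p = 0.5
    for choice in [True, True, False, True]:
        p = update_choice_estimate(p, choice)
        print(f"choice={'L' if choice else 'R'} -> P(choose L) = {p:.2f}")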

  17. An algorithm that improves speech intelligibility in noise for normal-hearing listeners.

    PubMed

    Kim, Gibak; Lu, Yang; Hu, Yi; Loizou, Philipos C

    2009-09-01

    Traditional noise-suppression algorithms have been shown to improve speech quality, but not speech intelligibility. Motivated by prior intelligibility studies of speech synthesized using the ideal binary mask, an algorithm is proposed that decomposes the input signal into time-frequency (T-F) units and makes binary decisions, based on a Bayesian classifier, as to whether each T-F unit is dominated by the target or the masker. Speech corrupted at low signal-to-noise ratio (SNR) levels (-5 and 0 dB) using different types of maskers was synthesized by this algorithm and presented to normal-hearing listeners for identification. Results indicated substantial improvements in intelligibility (over 60 percentage points in -5 dB babble) over that attained by human listeners with unprocessed stimuli. The findings from this study suggest that algorithms that can reliably estimate the SNR in each T-F unit can improve speech intelligibility.
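
    An oracle version of the masking idea, assuming access to the clean and noise signals so that the local SNR of every time-frequency unit is known exactly; the paper replaces this oracle with a trained Bayesian classifier per unit. Signal, sampling rate, and threshold are illustrative.

    import numpy as np
    from scipy.signal import stft, istft

    def ideal_binary_mask(clean, noise, fs, lc_db=0.0, nperseg=256):
        """Keep only the time-frequency units whose local SNR clears the
        local criterion (lc_db); zero out the masker-dominated units."""
        _, _, S = stft(clean, fs, nperseg=nperseg)
        _, _, N = stft(noise, fs, nperseg=nperseg)
        snr_db = 20 * np.log10(np.abs(S) + 1e-12) - 20 * np.log10(np.abs(N) + 1e-12)
        mask = snr_db > lc_db                          # 1 = target-dominated unit
        _, out = istft((S + N) * mask, fs, nperseg=nperseg)
        return out

    fs = 8000
    t = np.arange(fs) / fs
    clean = np.sin(2 * np.pi * 440 * t)                # stand-in for speech
    noise = np.random.default_rng(0).normal(size=fs)
    print(ideal_binary_mask(clean, noise, fs).shape)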

  18. Application of Particle Swarm Optimization Algorithm in the Heating System Planning Problem

    PubMed Central

    Ma, Rong-Jiang; Yu, Nan-Yang; Hu, Jun-Yi

    2013-01-01

    Based on the life cycle cost (LCC) approach, this paper presents an integral mathematical model and a particle swarm optimization (PSO) algorithm for the heating system planning (HSP) problem. The proposed mathematical model minimizes the cost of the heating system over a given life cycle time. Owing to the particularities of the HSP problem, the general particle swarm optimization algorithm was improved. An actual case study was computed to check the approach's feasibility in practical use. The results show that the improved particle swarm optimization (IPSO) algorithm solves the HSP problem better than the standard PSO algorithm. Moreover, the results show the potential to provide useful information for decisions in the practical planning process. Therefore, it is believed that if this approach is applied correctly and in combination with other elements, it can become a powerful and effective optimization tool for the HSP problem. PMID:23935429
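
    For reference, a plain PSO loop on a stand-in objective; the paper's IPSO adds problem-specific improvements for heating system planning that are not reproduced here, and the life-cycle-cost objective is replaced by a toy quadratic.

    import numpy as np

    def pso(cost, dim, n=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
        """Standard particle swarm optimization: each particle is pulled
        toward its personal best and the global best position."""
        rng = np.random.default_rng(seed)
        x = rng.uniform(-5, 5, (n, dim))               # positions
        v = np.zeros_like(x)                           # velocities
        pbest = x.copy()
        pbest_c = np.apply_along_axis(cost, 1, x)
        g = pbest[pbest_c.argmin()].copy()             # global best
        for _ in range(iters):
            r1, r2 = rng.random((2, n, dim))
            v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
            x = x + v
            c = np.apply_along_axis(cost, 1, x)
            better = c < pbest_c
            pbest[better], pbest_c[better] = x[better], c[better]
            g = pbest[pbest_c.argmin()].copy()
        return g, pbest_c.min()

    best_x, best_c = pso(lambda z: np.sum((z - 1.0) ** 2), dim=3)
    print(best_x.round(3), round(best_c, 6))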

  19. Enhanced Handover Decision Algorithm in Heterogeneous Wireless Network

    PubMed Central

    Abdullah, Radhwan Mohamed; Zukarnain, Zuriati Ahmad

    2017-01-01

    Transferring a huge amount of data between different network locations over the network links depends on the network’s traffic capacity and data rate. Traditionally, vertical handover operations have considered only one criterion, the Received Signal Strength (RSS). The use of a single criterion may cause service interruption, an unbalanced network load and an inefficient vertical handover. In this paper, we propose an enhanced vertical handover decision algorithm based on multiple criteria in the heterogeneous wireless network. The algorithm consists of three technology interfaces: Long-Term Evolution (LTE), Worldwide interoperability for Microwave Access (WiMAX) and Wireless Local Area Network (WLAN). It also employs three types of vertical handover decision algorithms: equal priority, mobile priority and network priority. The simulation results illustrate that the three types of decision algorithms outperform the traditional network decision algorithm in terms of the handover number probability and the handover failure probability. In addition, the network priority handover decision algorithm produces better results than the equal priority and the mobile priority handover decision algorithms. Finally, the simulation results are validated by the analytical model. PMID:28708067
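
    A minimal sketch of the multi-criteria scoring such a decision algorithm rests on: each candidate network is rated on normalised criteria and a weight profile plays the role of the equal-, mobile- or network-priority variants. All criterion values and weights below are illustrative assumptions.

    def choose_network(candidates, weights):
        """Weighted-sum handover decision over normalised criteria in [0, 1];
        cost and load enter as 'low_cost' and 'low_load' so that larger is
        always better."""
        return max(candidates, key=lambda net: sum(weights[k] * net[k]
                                                   for k in weights))["name"]

    nets = [
        {"name": "LTE",   "rss": 0.7, "bandwidth": 0.8, "low_cost": 0.3, "low_load": 0.5},
        {"name": "WiMAX", "rss": 0.6, "bandwidth": 0.6, "low_cost": 0.6, "low_load": 0.7},
        {"name": "WLAN",  "rss": 0.8, "bandwidth": 0.5, "low_cost": 0.9, "low_load": 0.4},
    ]
    equal   = {"rss": 0.25, "bandwidth": 0.25, "low_cost": 0.25, "low_load": 0.25}
    network = {"rss": 0.20, "bandwidth": 0.30, "low_cost": 0.10, "low_load": 0.40}
    print(choose_network(nets, equal), choose_network(nets, network))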

  20. The minimally invasive spinal deformity surgery algorithm: a reproducible rational framework for decision making in minimally invasive spinal deformity surgery.

    PubMed

    Mummaneni, Praveen V; Shaffrey, Christopher I; Lenke, Lawrence G; Park, Paul; Wang, Michael Y; La Marca, Frank; Smith, Justin S; Mundis, Gregory M; Okonkwo, David O; Moal, Bertrand; Fessler, Richard G; Anand, Neel; Uribe, Juan S; Kanter, Adam S; Akbarnia, Behrooz; Fu, Kai-Ming G

    2014-05-01

    Minimally invasive surgery (MIS) is an alternative to open deformity surgery for the treatment of patients with adult spinal deformity. However, at this time MIS techniques are not as versatile as open deformity techniques, and MIS techniques have been reported to result in suboptimal sagittal plane correction or pseudarthrosis when used for severe deformities. The minimally invasive spinal deformity surgery (MISDEF) algorithm was created to provide a framework for rational decision making for surgeons who are considering MIS versus open spine surgery. A team of experienced spinal deformity surgeons developed the MISDEF algorithm that incorporates a patient's preoperative radiographic parameters and leads to one of 3 general plans ranging from MIS direct or indirect decompression to open deformity surgery with osteotomies. The authors surveyed fellowship-trained spine surgeons experienced with spinal deformity surgery to validate the algorithm using a set of 20 cases to establish interobserver reliability. They then resurveyed the same surgeons 2 months later with the same cases presented in a different sequence to establish intraobserver reliability. Responses were collected and tabulated. Fleiss' analysis was performed using MATLAB software. Over a 3-month period, 11 surgeons completed the surveys. Responses for MISDEF algorithm case review demonstrated an interobserver kappa of 0.58 for the first round of surveys and an interobserver kappa of 0.69 for the second round of surveys, consistent with substantial agreement. In at least 10 cases there was perfect agreement between the reviewing surgeons. The mean intraobserver kappa for the 2 surveys was 0.86 ± 0.15 (± SD) and ranged from 0.62 to 1. The use of the MISDEF algorithm provides consistent and straightforward guidance for surgeons who are considering either an MIS or an open approach for the treatment of patients with adult spinal deformity. The MISDEF algorithm was found to have substantial inter- and intraobserver agreement. Although further studies are needed, the application of this algorithm could provide a platform for surgeons to achieve the desired goals of surgery.

  1. Integrated consensus-based frameworks for unmanned vehicle routing and targeting assignment

    NASA Astrophysics Data System (ADS)

    Barnawi, Waleed T.

    Unmanned aerial vehicles (UAVs) are increasingly deployed in complex and dynamic environments to perform multiple tasks cooperatively with other UAVs that contribute to overarching mission effectiveness. Studies by the Department of Defense (DoD) indicate future operations may include anti-access/area-denial (A2AD) environments which limit human teleoperator decision-making and control. This research addresses the problem of decentralized vehicle re-routing and task reassignment through consensus-based UAV decision-making. An Integrated Consensus-Based Framework (ICF) is formulated as a solution to the combined single task assignment problem and vehicle routing problem. The multiple assignment and vehicle routing problem is solved with the Integrated Consensus-Based Bundle Framework (ICBF). The frameworks are hierarchically decomposed into two levels. The bottom layer utilizes Dijkstra's algorithm. The top layer addresses task assignment with two methods. The single assignment approach, the Caravan Auction (CarA) Algorithm, extends the Consensus-Based Auction Algorithm (CBAA) to provide awareness of task completion by agents and to adopt abandoned tasks. The multiple assignment approach, the Caravan Auction Bundle (CarAB) Algorithm, extends the Consensus-Based Bundle Algorithm (CBBA) by providing awareness of lost resources, prioritizing remaining tasks, and adopting abandoned tasks. Research questions regarding the novelty and performance of the proposed frameworks are investigated through hypothesis testing, with Monte Carlo simulations providing the supporting evidence. The approach addresses current and future military operations for unmanned aerial vehicles, but the general framework is adaptable to any unmanned vehicle. Civil missions in which human observability is limited, such as exploration and fire surveillance, could also benefit from independent UAV task assignment.

  2. The Interface of Clinical Decision-Making With Study Protocols for Knowledge Translation From a Walking Recovery Trial.

    PubMed

    Hershberg, Julie A; Rose, Dorian K; Tilson, Julie K; Brutsch, Bettina; Correa, Anita; Gallichio, Joann; McLeod, Molly; Moore, Craig; Wu, Sam; Duncan, Pamela W; Behrman, Andrea L

    2017-01-01

    Despite efforts to translate knowledge into clinical practice, barriers often arise in adapting the strict protocols of a randomized, controlled trial (RCT) to the individual patient. The Locomotor Experience Applied Post-Stroke (LEAPS) RCT demonstrated equal effectiveness of 2 intervention protocols for walking recovery poststroke; both protocols were more effective than usual care physical therapy. The purpose of this article was to provide knowledge-translation tools to facilitate implementation of the LEAPS RCT protocols into clinical practice. Participants from 2 of the trial's intervention arms, (1) the early Locomotor Training Program (LTP) and (2) the Home Exercise Program (HEP), were chosen for case presentation. The two cases illustrate how the protocols are used in synergy with individual patient presentations and clinical expertise. Decision algorithms and guidelines for progression represent the interface between implementation of an RCT standardized intervention protocol and clinical decision-making. In each case, the participant presents with a distinct clinical challenge that the therapist addresses by integrating the participant's unique presentation with the therapist's expertise while maintaining fidelity to the LEAPS protocol. Both participants progressed through an increasingly challenging intervention despite their own unique presentations. Decision algorithms and exercise progression for the LTP and HEP protocols facilitate translation of the RCT protocol to the real world of clinical practice. The two case examples facilitate translation of the LEAPS RCT into clinical practice by enhancing understanding of the protocols, their progression, and their application to individual participants. Video Abstract available for more insights from the authors (see Supplemental Digital Content 1, available at: http://links.lww.com/JNPT/A147).

  3. Optimizing in a complex world: A statistician's role in decision making

    DOE PAGES

    Anderson-Cook, Christine M.

    2016-08-09

    As applied statisticians increasingly participate as active members of problem-solving and decision-making teams, our role continues to evolve. Historically, we may have been seen as those who can help with data collection strategies or answer a specific question from a set of data. Nowadays, we are, or strive to be, more deeply involved throughout the entire problem-solving process. An emerging role is to provide a set of leading choices from which subject matter experts and managers can choose to make informed decisions. A key to success is to provide vehicles for understanding the trade-offs between candidates and interpreting the merits of each choice in the context of the decision-makers' priorities. To achieve this objective, it is helpful to be able to (a) help subject matter experts identify quantitative criteria that match their priorities, (b) eliminate non-competitive choices through the use of a Pareto front, and (c) provide summary tools with which the trade-offs between alternatives can be quantitatively evaluated and discussed. A structured but flexible process for contributing to team decisions is described both for situations when all choices can easily be enumerated and for those when a search algorithm is required to explore a vast number of potential candidates. In conclusion, a collection of diverse examples, ranging from model selection through multiple response optimization to designing an experiment, illustrates the approach.
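
    Step (b), eliminating non-competitive choices with a Pareto front, is straightforward to operationalize. A minimal Python sketch (our own illustration, assuming every criterion is to be maximized):

        import numpy as np

        def pareto_front(points):
            # Indices of non-dominated candidates (all criteria maximized).
            points = np.asarray(points, dtype=float)
            keep = []
            for i, p in enumerate(points):
                others_geq = np.all(points >= p, axis=1)
                others_gt = np.any(points > p, axis=1)
                if not np.any(others_geq & others_gt):
                    keep.append(i)
            return keep

        # Three candidates scored on two criteria; the middle one is dominated.
        print(pareto_front([[0.9, 0.2], [0.5, 0.1], [0.3, 0.8]]))  # -> [0, 2]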

  4. Optimizing in a complex world: A statistician's role in decision making

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson-Cook, Christine M.

    As applied statisticians increasingly participate as active members of problem-solving and decision-making teams, our role continues to evolve. Historically, we may have been seen as those who can help with data collection strategies or answer a specific question from a set of data. Nowadays, we are, or strive to be, more deeply involved throughout the entire problem-solving process. An emerging role is to provide a set of leading choices from which subject matter experts and managers can choose to make informed decisions. A key to success is to provide vehicles for understanding the trade-offs between candidates and interpreting the merits of each choice in the context of the decision-makers' priorities. To achieve this objective, it is helpful to be able to (a) help subject matter experts identify quantitative criteria that match their priorities, (b) eliminate non-competitive choices through the use of a Pareto front, and (c) provide summary tools with which the trade-offs between alternatives can be quantitatively evaluated and discussed. A structured but flexible process for contributing to team decisions is described both for situations when all choices can easily be enumerated and for those when a search algorithm is required to explore a vast number of potential candidates. In conclusion, a collection of diverse examples, ranging from model selection through multiple response optimization to designing an experiment, illustrates the approach.

  5. Anticipation of the Impact of Human Papillomavirus on Clinical Decision Making for the Head and Neck Cancer Patient.

    PubMed

    Gillison, Maura L; Restighini, Carlo

    2015-12-01

    Human papillomavirus (HPV) is the cause of a distinct subset of oropharyngeal cancer rising in incidence in the United States and other developed countries. This increased incidence, combined with the strong effect of tumor HPV status on survival, has had a profound effect on the head and neck cancer discipline. The multidisciplinary field of head and neck cancer is in the midst of re-evaluating evidence-based algorithms for clinical decision making, developed from clinical trials conducted in an era when HPV-negative cancer predominated. This article reviews relationships between tumor HPV status and gender, cancer incidence trends, overall survival, treatment response, racial disparities, tumor staging, risk stratification, survival post disease progression, and clinical trial design. Copyright © 2015 Elsevier Inc. All rights reserved.

  6. Engineering tradeoff problems viewed as multiple objective optimizations and the VODCA methodology

    NASA Astrophysics Data System (ADS)

    Morgan, T. W.; Thurgood, R. L.

    1984-05-01

    This paper summarizes a rational model for making engineering tradeoff decisions. The model is a hybrid from the fields of social welfare economics, communications, and operations research. A solution methodology (Vector Optimization Decision Convergence Algorithm - VODCA) firmly grounded in the economic model is developed both conceptually and mathematically. The primary objective for developing the VODCA methodology was to improve the process for extracting relative value information about the objectives from the appropriate decision makers. This objective was accomplished by employing data filtering techniques to increase the consistency of the relative value information and decrease the amount of information required. VODCA is applied to a simplified hypothetical tradeoff decision problem. Possible use of multiple objective analysis concepts and the VODCA methodology in product-line development and market research are discussed.

  7. Is it worth changing pattern recognition methods for structural health monitoring?

    NASA Astrophysics Data System (ADS)

    Bull, L. A.; Worden, K.; Cross, E. J.; Dervilis, N.

    2017-05-01

    The key element of this work is to demonstrate alternative strategies for using pattern recognition algorithms in structural health monitoring. This paper seeks to determine whether it makes any difference which of a range of established classification techniques is chosen: decision trees, support vector machines, or Gaussian processes. Classification algorithms are tested on adjustable synthetic data to establish performance metrics, then all techniques are applied to real SHM data. To aid the selection of training data, an informative chain of artificial intelligence tools is used to explore an active learning interaction between meaningful clusters of data.

  8. A Web-Based Search Service to Support Imaging Spectrometer Instrument Operations

    NASA Technical Reports Server (NTRS)

    Smith, Alexander; Thompson, David R.; Sayfi, Elias; Xing, Zhangfan; Castano, Rebecca

    2013-01-01

    Imaging spectrometers yield rich and informative data products, but interpreting them demands time and expertise. There is a continual need for new algorithms and methods for rapid first-draft analyses to assist analysts during instrument operations. Intelligent data analyses can summarize scenes to draft geologic maps, searching images to direct operator attention to key features. This validates data quality while facilitating rapid tactical decision making to select follow-up targets. Ideally these algorithms would operate in seconds, never grow bored, and be free from observation bias about the kinds of mineralogy that will be found.

  9. The art and science of switching antipsychotic medications, part 2.

    PubMed

    Weiden, Peter J; Miller, Alexander L; Lambert, Tim J; Buckley, Peter F

    2007-01-01

    In the presentation "Switching and Metabolic Syndrome," Weiden summarizes reasons to switch antipsychotics, highlighting weight gain and other metabolic adverse events as recent treatment targets. In "Texas Medication Algorithm Project (TMAP)," Miller reviews the TMAP study design, discusses results related to the algorithm versus treatment as usual, and concludes with the implications of the study. Lambert's presentation, "Dosing and Titration Strategies to Optimize Patient Outcome When Switching Antipsychotic Therapy," reviews the decision-making process when switching patients' medication, addresses dosing and titration strategies to effectively transition between medications, and examines other factors to consider when switching pharmacotherapy.

  10. Distributed Load Shedding over Directed Communication Networks with Time Delays

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Tao; Wu, Di

    When generation is insufficient to support all loads under emergencies, effective and efficient load shedding needs to be deployed in order to maintain the supply-demand balance. This paper presents a distributed load shedding algorithm, which makes efficient decisions based on discovered global information. In the global information discovery process, each load communicates only with its neighboring loads via directed communication links, possibly with arbitrarily large but bounded time-varying communication delays. We propose a novel distributed information discovery algorithm based on ratio consensus. Simulation results are used to validate the proposed method.
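
    The ratio-consensus primitive referenced above is itself only a few lines: every node runs two coupled linear iterations with column-stochastic weights and takes their ratio, which converges at every node to the ratio of the global sums. A synchronous, delay-free Python sketch (our illustration; handling directed links with bounded time-varying delays is precisely what the proposed algorithm adds):

        import numpy as np

        # Column-stochastic weights for a directed 3-node ring with self-loops.
        W = np.array([[0.5, 0.0, 0.5],
                      [0.5, 0.5, 0.0],
                      [0.0, 0.5, 0.5]])

        y = np.array([4.0, 1.0, 1.0])   # e.g., local quantities to aggregate
        z = np.ones(3)                  # normalization states

        for _ in range(100):
            y, z = W @ y, W @ z         # linear iterations over directed links

        print(y / z)                    # every node approaches sum(y0)/sum(z0) = 2.0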

  11. Algorithmic tools for interpreting vital signs.

    PubMed

    Rathbun, Melina C; Ruth-Sahd, Lisa A

    2009-07-01

    Today's complex world of nursing practice challenges nurse educators to develop teaching methods that promote critical thinking skills and foster quick problem solving in the novice nurse. Traditional pedagogies previously used in the classroom and clinical setting are no longer adequate to prepare nursing students for entry into practice. In addition, educators have expressed frustration when encouraging students to apply newly learned theoretical content to direct the care of assigned patients in the clinical setting. This article presents algorithms as an innovative teaching strategy to guide novice student nurses in the interpretation and decision making related to vital sign assessment in an acute care setting.

  12. Decision based on big data research for non-small cell lung cancer in medical artificial system in developing country.

    PubMed

    Wu, Jia; Tan, Yanlin; Chen, Zhigang; Zhao, Ming

    2018-06-01

    Non-small cell lung cancer (NSCLC) is a high-risk cancer that is usually scanned by PET-CT for detection and prediction before treatment methods are chosen. However, in an actual hospital system, at least 640 images must be generated for each patient through PET-CT scanning, and, especially in developing countries, a huge number of NSCLC patients are attended by doctors. An artificial medical system can predict and make decisions rapidly, whereas manual selection and inspection of the images results in low work efficiency for doctors. In this study, data on 2,789,675 patients from three hospitals in China were collected, compiled, and used as the research basis; these data are obtained through image acquisition and a diagnostic-parameter machine decision-making method, on the basis of a machine diagnosis and medical system design model for adjuvant therapy. By combining image and diagnostic parameters, a machine decision diagnosis auxiliary algorithm is established. Experimental results show that the accuracy reaches 77% for NSCLC. Copyright © 2018 Elsevier B.V. All rights reserved.

  13. When Does Reward Maximization Lead to Matching Law?

    PubMed Central

    Sakai, Yutaka; Fukai, Tomoki

    2008-01-01

    What kind of strategies subjects follow in various behavioral circumstances has been a central issue in decision making. In particular, which behavioral strategy, maximizing or matching, is more fundamental to animals' decision behavior has been a matter of debate. Here, we prove that any algorithm to achieve the stationary condition for maximizing the average reward should lead to matching when it ignores the dependence of the expected outcome on the subject's past choices. We may term this strategy of partial reward maximization the “matching strategy”. Then, this strategy is applied to the case where the subject's decision system updates the information used for making a decision. Such information includes the subject's past actions or sensory stimuli, and the internal storage of this information is often called “state variables”. We demonstrate that the matching strategy provides an easy way to maximize reward when combined with the exploration of state variables that correctly represent the crucial information for reward maximization. Our results reveal for the first time how a strategy to achieve matching behavior is beneficial to reward maximization, providing novel insight into the relationship between maximizing and matching. PMID:19030101
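
    For reference, the matching relation at issue is Herrnstein's matching law: the fraction of behavior allocated to an option equals the fraction of total reward obtained from it. In LaTeX,

        \[ \frac{B_i}{\sum_j B_j} = \frac{R_i}{\sum_j R_j}, \]

    where \(B_i\) is the rate of responding on option \(i\) and \(R_i\) the reward obtained from it; the paper's claim is that reward-maximizing updates that ignore the dependence of outcomes on past choices settle on exactly this allocation.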

  14. Autonomous spacecraft landing through human pre-attentive vision.

    PubMed

    Schiavone, Giuseppina; Izzo, Dario; Simões, Luís F; de Croon, Guido C H E

    2012-06-01

    In this work, we exploit a computational model of human pre-attentive vision to guide the descent of a spacecraft on extraterrestrial bodies. Providing the spacecraft with high degrees of autonomy is a challenge for future space missions. Up to present, major effort in this research field has been concentrated in hazard avoidance algorithms and landmark detection, often by reference to a priori maps, ranked by scientists according to specific scientific criteria. Here, we present a bio-inspired approach based on the human ability to quickly select intrinsically salient targets in the visual scene; this ability is fundamental for fast decision-making processes in unpredictable and unknown circumstances. The proposed system integrates a simple model of the spacecraft and optimality principles which guarantee minimum fuel consumption during the landing procedure; detected salient sites are used for retargeting the spacecraft trajectory, under safety and reachability conditions. We compare the decisions taken by the proposed algorithm with that of a number of human subjects tested under the same conditions. Our results show how the developed algorithm is indistinguishable from the human subjects with respect to areas, occurrence and timing of the retargeting.

  15. A decision support system and rule-based algorithm to augment the human interpretation of the 12-lead electrocardiogram.

    PubMed

    Cairns, Andrew W; Bond, Raymond R; Finlay, Dewar D; Guldenring, Daniel; Badilini, Fabio; Libretti, Guido; Peace, Aaron J; Leslie, Stephen J

    The 12-lead electrocardiogram (ECG) has been used to detect cardiac abnormalities in the same format for more than 70 years. However, due to the complex nature of 12-lead ECG interpretation, a significant cognitive workload is required from the interpreter. This complexity often leads to errors in diagnosis and subsequent treatment. We have previously reported on the development of an ECG interpretation support system designed to augment the human interpretation process, a computerised decision support system named 'Interactive Progressive based Interpretation' (IPI). In this study, a decision support algorithm was built into the IPI system to suggest potential diagnoses based on the interpreter's annotations of the 12-lead ECG, with the aims of improving interpretation accuracy and reducing missed co-abnormalities. We hypothesise that semi-automatic interpretation using a digital assistant can be an optimal man-machine model for ECG interpretation. The Differential Diagnoses Algorithm (DDA) was developed using web technologies: diagnostic ECG criteria are defined in an open storage format, JavaScript Object Notation (JSON), which is queried using a rule-based reasoning algorithm to suggest diagnoses. To test our hypothesis, a counterbalanced trial was designed in which subjects interpreted ECGs using the conventional approach and using the IPI+DDA approach. A total of 375 interpretations were collected. The IPI+DDA approach improved diagnostic accuracy by 8.7% (although not statistically significantly, p=0.1852), and the IPI+DDA suggested the correct interpretation more often than the human interpreter in 7/10 cases (with varying statistical significance). Human interpretation accuracy increased to 70% when seven suggestions were generated. Although the results were not statistically significant, we found that (1) our decision support tool increased the number of correct interpretations, (2) the DDA algorithm suggested the correct interpretation more often than humans, and (3) as many as 7 computerised diagnostic suggestions augmented human decision-making in ECG interpretation. Statistical significance may be achieved by expanding the sample size. Copyright © 2017 Elsevier Inc. All rights reserved.
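
    The storage-and-query pattern described for the DDA (criteria in JSON, matched by a rule-based reasoner against the interpreter's annotations) can be illustrated in a few lines. This Python sketch is ours, with invented criteria and annotation strings, not the published IPI+DDA code:

        import json

        # Hypothetical criteria: each diagnosis lists annotations that must be present.
        CRITERIA = json.loads("""
        {
          "First-degree AV block": ["PR interval > 200 ms", "Sinus rhythm"],
          "Sinus bradycardia":     ["Sinus rhythm", "Rate < 60 bpm"]
        }
        """)

        def suggest(annotations):
            # Return every diagnosis whose criteria are all present.
            found = set(annotations)
            return [dx for dx, rules in CRITERIA.items() if found.issuperset(rules)]

        print(suggest(["Sinus rhythm", "Rate < 60 bpm", "PR interval > 200 ms"]))
        # -> ['First-degree AV block', 'Sinus bradycardia']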

  16. Screening Algorithm to Guide Decisions on Whether to Conduct a Health Impact Assessment

    EPA Pesticide Factsheets

    Provides a visual aid in the form of a decision algorithm that helps guide discussions about whether to proceed with an HIA. The algorithm can help structure, standardize, and document the decision process.

  17. Machine Learning in the Presence of an Adversary: Attacking and Defending the SpamBayes Spam Filter

    DTIC Science & Technology

    2008-05-20

    Machine learning techniques are often used for decision making in security critical applications such as intrusion detection and spam filtering...filter. The defenses shown in this thesis are able to work against the attacks developed against SpamBayes and are sufficiently generic to be easily extended into other statistical machine learning algorithms.

  18. Distributed topology control algorithm for multihop wireless networks

    NASA Technical Reports Server (NTRS)

    Borbash, S. A.; Jennings, E. H.

    2002-01-01

    We present a network initialization algorithm for wireless networks with distributed intelligence. Each node (agent) has only local, incomplete knowledge, and it must make local decisions to meet a predefined global objective. Our objective is to use power control to establish a topology based on the relative neighborhood graph, which has good overall performance in terms of power usage, low interference, and reliability.

  19. A Framework and Algorithms for Multivariate Time Series Analytics (MTSA): Learning, Monitoring, and Recommendation

    ERIC Educational Resources Information Center

    Ngan, Chun-Kit

    2013-01-01

    Making decisions over multivariate time series is an important topic which has gained significant interest in the past decade. A time series is a sequence of data points which are measured and ordered over uniform time intervals. A multivariate time series is a set of multiple, related time series in a particular domain in which domain experts…

  20. Delft-FEWS: A Decision-Making Platform to Integrate Data, Models, and Algorithms for Large-Scale River Basin Water Management

    NASA Astrophysics Data System (ADS)

    Yang, T.; Welles, E.

    2017-12-01

    In this paper, we introduce a flood forecasting and decision-making platform, named Delft-FEWS, which has been developed over the years at Delft Hydraulics and now at Deltares. The philosophy of Delft-FEWS is to provide water managers and operators with an open shell tool, which allows the integration of a variety of hydrological, hydraulic, river routing, and reservoir models with hydrometeorological forecast data. Delft-FEWS serves as a powerful tool for both basin-scale and national-scale water resources management. The essential novelty of Delft-FEWS is to change flood forecasting and water resources management from a single-model or agency-centric paradigm to an integrated framework in which different models, data, algorithms, and stakeholders are strongly linked together. The paper starts with the challenges in water resources management and the concept and philosophy of Delft-FEWS, then details data handling and the linkage of Delft-FEWS with different hydrological, hydraulic, and reservoir models. Last, several case studies and applications of Delft-FEWS are demonstrated, including the National Weather Service and the Bonneville Power Administration in the USA, and a national application for the water boards in the Netherlands.

  1. Optimizing decision making at the end of life of a product

    NASA Astrophysics Data System (ADS)

    Gonzalez-Torre, Beatriz; Adenso-Diaz, Belarmino

    2004-02-01

    European environmental legislation has evolved significantly over the last few years, forcing manufacturers to be more environmentally aware and to introduce ecological criteria into their traditional practices. One of the most important goals of this set of regulations is to reduce the amount of solid waste generated per unit of time by promoting recycling, repair, reuse and other recovery strategies at the product end of life (EOL). However, one of the most difficult steps for manufacturers is deciding which of these options, or which combination of them, should be implemented to obtain the maximum recovery value, taking into account the specific characteristics of each product. In this paper, a recurrent algorithm is proposed to determine the optimal end-of-life strategy. On the basis of the product bill of materials and its graphical CAD/CAM representation, the model determines to what extent the product should be disassembled and what the final destination of each disassembled part should be (reuse, recycling or disposal). The paper starts by presenting an overview of the model, then focuses on the CAD-integrated algorithm for determining the optimum disassembly sequence, a necessary step in EOL decision-making.

  2. Endoscopic feature tracking for augmented-reality assisted prosthesis selection in mitral valve repair

    NASA Astrophysics Data System (ADS)

    Engelhardt, Sandy; Kolb, Silvio; De Simone, Raffaele; Karck, Matthias; Meinzer, Hans-Peter; Wolf, Ivo

    2016-03-01

    Mitral valve annuloplasty is a surgical procedure in which an artificial prosthesis is sutured onto the anatomical structure of the mitral annulus to re-establish the valve's functionality. Choosing an appropriate commercially available ring size and shape is a difficult decision the surgeon has to make intraoperatively according to experience. In our augmented-reality framework, digitalized ring models are superimposed onto endoscopic image streams without using any additional hardware. To place the ring model at the proper position within the endoscopic image plane, a pose estimation is performed that depends on the localization of sutures placed by the surgeon around the leaflet origins and punctured through the stiffer structure of the annulus. In this work, the tissue penetration points are tracked by the real-time-capable Lucas-Kanade optical flow algorithm. The accuracy and robustness of this tracking algorithm are investigated with respect to the question of whether outliers influence the subsequent pose estimation. Our results suggest that optical flow is very stable for a variety of different endoscopic scenes and that tracking errors do not affect the position of the superimposed virtual objects in the scene, making this approach a viable candidate for annuloplasty augmented-reality-enhanced decision support.
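
    Pyramidal Lucas-Kanade tracking of sparse points, as used here for the penetration points, is available off the shelf in OpenCV. A minimal Python sketch on a synthetic frame pair (our illustration, not the authors' pipeline; the second frame is the first shifted 5 pixels to the right):

        import cv2
        import numpy as np

        rng = np.random.default_rng(0)
        prev = cv2.GaussianBlur((rng.random((240, 320)) * 255).astype(np.uint8), (7, 7), 0)
        nxt = np.roll(prev, 5, axis=1)                 # simulate camera/tissue motion

        # Points to track (e.g., suture sites): shape (N, 1, 2), float32.
        pts = np.array([[[100.0, 120.0]], [[200.0, 80.0]]], dtype=np.float32)

        nxt_pts, status, err = cv2.calcOpticalFlowPyrLK(
            prev, nxt, pts, None, winSize=(21, 21), maxLevel=3)
        print(nxt_pts[status.ravel() == 1])            # x-coordinates shift by about +5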

  3. Decision-making for complex scapula and ipsilateral clavicle fractures: a review.

    PubMed

    Hess, Florian; Zettl, Ralph; Smolen, Daniel; Knoth, Christoph

    2018-03-23

    Complex scapula fractures with ipsilateral clavicle fractures remain a challenge, and treatment recommendations are still lacking. This review provides an overview of the evolution of the definition, classification and treatment strategies for complex scapula and ipsilateral clavicle fractures. As with other rare conditions, consensus has not been reached on the most suitable management strategies for these patients. The aim of this review is twofold: to compile and summarize the currently available literature on this topic, and to recommend treatment approaches. Included in the review are the following topics: biomechanics of scapula and ipsilateral clavicle fractures, preoperative radiological evaluation, surgical treatment of the clavicle only, surgical treatment of both the clavicle and scapula, and nonsurgical treatment options. A decision-making algorithm is proposed for different treatment strategies based on preoperative parameters, and an example of a case treated at our institution is presented to illustrate use of the algorithm. The role of instability in complex scapula fractures with ipsilateral clavicle fractures remains unclear. The question of stability is preoperatively less relevant than the question of whether the dislocated fragments lead to compromised shoulder function.

  4. A mathematical framework for combining decisions of multiple experts toward accurate and remote diagnosis of malaria using tele-microscopy.

    PubMed

    Mavandadi, Sam; Feng, Steve; Yu, Frank; Dimitrov, Stoyan; Nielsen-Saines, Karin; Prescott, William R; Ozcan, Aydogan

    2012-01-01

    We propose a methodology for digitally fusing diagnostic decisions made by multiple medical experts in order to improve the accuracy of diagnosis. Toward this goal, we report an experimental study involving nine experts, where each one was given more than 8,000 digital microscopic images of individual human red blood cells and asked to identify malaria infected cells. The results of this experiment reveal that even highly trained medical experts are not always self-consistent in their diagnostic decisions and that there exists a fair level of disagreement among experts, even for binary decisions (i.e., infected vs. uninfected). To tackle this general medical diagnosis problem, we propose a probabilistic algorithm to fuse the decisions made by trained medical experts to robustly achieve higher levels of accuracy than individual experts making such decisions. By modelling the decisions of experts as a three-component mixture model and solving for the underlying parameters using the Expectation Maximisation algorithm, we demonstrate the efficacy of our approach, which significantly improves the overall diagnostic accuracy for malaria-infected cells. Additionally, we present a mathematical framework for performing 'slide-level' diagnosis by using individual 'cell-level' diagnosis data, shedding more light on the statistical rules that should govern routine practice in the examination of, e.g., thin blood smear samples. This framework could be generalized for various other tele-pathology needs, and can be used by trained experts within an efficient tele-medicine platform.
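
    The paper's three-component mixture is specific to its data, but the underlying EM pattern for fusing binary expert labels, estimating a per-expert sensitivity and specificity together with the latent truth, fits in a short function. A simplified Python sketch with toy data (our illustration in the spirit of the approach, not the published model):

        import numpy as np

        def em_fuse(L, iters=50, eps=1e-3):
            # L: (cells x experts) binary labels. Returns P(infected) per cell.
            n, m = L.shape
            sens = np.full(m, 0.8)          # P(expert says 1 | truly 1)
            spec = np.full(m, 0.8)          # P(expert says 0 | truly 0)
            prior = 0.5
            for _ in range(iters):
                # E-step: posterior probability that each cell is truly infected.
                l1 = np.log(prior) + (L*np.log(sens) + (1-L)*np.log(1-sens)).sum(1)
                l0 = np.log(1-prior) + ((1-L)*np.log(spec) + L*np.log(1-spec)).sum(1)
                p = 1.0 / (1.0 + np.exp(l0 - l1))
                # M-step: re-estimate expert reliabilities and the prior.
                sens = np.clip((p[:, None] * L).sum(0) / p.sum(), eps, 1-eps)
                spec = np.clip(((1-p)[:, None] * (1-L)).sum(0) / (1-p).sum(), eps, 1-eps)
                prior = p.mean()
            return p

        # Toy data: 6 cells, 3 experts; the third expert is noisier.
        L = np.array([[1,1,1], [1,1,0], [0,0,1], [0,0,0], [1,1,1], [0,0,0]])
        print(np.round(em_fuse(L), 2))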

  5. Simulation of empty container logistic management at depot

    NASA Astrophysics Data System (ADS)

    Sze, San-Nah; Sek, Siaw-Ying Doreen; Chiew, Kang-Leng; Tiong, Wei-King

    2017-07-01

    This study focuses on the empty container management problem in a deficit regional area. A deficit area is an area with more export activity than import activity, and thus a persistent shortage of empty containers. This environment challenges trading companies' decision-making in distributing the empty containers. A simulation model that fits this environment is developed. Besides, a simple heuristic algorithm considering both hard and soft constraints is proposed to plan the logistics of empty container supply. The feasible route with the minimum cost is then determined by applying the proposed heuristic algorithm. The heuristic algorithm can be divided into three main phases: data sorting, data assigning and time window updating.

  6. Adaptive awareness for personal and small group decision making.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perano, Kenneth J.; Tucker, Steve; Pancerella, Carmen M.

    2003-12-01

    Many situations call for the use of sensors monitoring physiological and environmental data. In order to use the large amounts of sensor data to affect decision making, we are coupling heterogeneous sensors with small, light-weight processors, other powerful computers, wireless communications, and embedded intelligent software. The result is an adaptive awareness and warning tool, which provides both situation awareness and personal awareness to individuals and teams. Central to this tool is a sensor-independent architecture, which combines both software agents and a reusable core software framework that manages the available hardware resources and provides services to the agents. Agents can recognize cues from the data, warn humans about situations, and act as decision-making aids. Within the agents, self-organizing maps (SOMs) are used to process physiological data in order to provide personal awareness. We have employed a novel clustering algorithm to train the SOM to discern individual body states and activities. This awareness tool has broad applicability to emergency teams, military squads, military medics, individual exercise and fitness monitoring, health monitoring for sick and elderly persons, and environmental monitoring in public places. This report discusses our hardware decisions, software framework, and a pilot awareness tool, which has been developed at Sandia National Laboratories.
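
    The SOM ingredient is worth making concrete: training finds the best-matching unit for each sample and pulls it, together with its map neighbours, toward that sample under a shrinking neighbourhood. A generic 1-D Python sketch on synthetic two-state data (our illustration, not Sandia's clustering variant):

        import numpy as np

        rng = np.random.default_rng(1)
        X = np.vstack([rng.normal(0, 0.3, (50, 2)),     # "resting" readings
                       rng.normal(3, 0.3, (50, 2))])    # "active" readings

        k = 10
        W = rng.random((k, 2)) * 3                      # codebook vectors
        for t in range(2000):
            x = X[rng.integers(len(X))]
            bmu = np.argmin(((W - x) ** 2).sum(axis=1)) # best-matching unit
            lr = 0.5 * (1 - t / 2000)                   # decaying learning rate
            sigma = max(0.1, 3 * (1 - t / 2000))        # shrinking neighbourhood
            h = np.exp(-(np.arange(k) - bmu) ** 2 / (2 * sigma ** 2))
            W += lr * h[:, None] * (x - W)              # pull neighbourhood toward x

        print(np.round(W, 1))                           # units settle on the two states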

  7. A De-centralized Scheduling and Load Balancing Algorithm for Heterogeneous Grid Environments

    NASA Technical Reports Server (NTRS)

    Arora, Manish; Das, Sajal K.; Biswas, Rupak

    2002-01-01

    In the past two decades, numerous scheduling and load balancing techniques have been proposed for locally distributed multiprocessor systems. However, they all suffer from significant deficiencies when extended to a Grid environment: some use a centralized approach that renders the algorithm unscalable, while others assume the overhead involved in searching for appropriate resources to be negligible. Furthermore, classical scheduling algorithms do not consider a Grid node to be N-resource rich and merely work towards maximizing the utilization of one of the resources. In this paper, we propose a new scheduling and load balancing algorithm for a generalized Grid model of N-resource nodes that not only takes into account the node and network heterogeneity, but also considers the overhead involved in coordinating among the nodes. Our algorithm is decentralized, scalable, and overlaps the node coordination time with that of the actual processing of ready jobs, thus saving valuable clock cycles needed for making decisions. The proposed algorithm is studied by conducting simulations using the Message Passing Interface (MPI) paradigm.

  8. A De-Centralized Scheduling and Load Balancing Algorithm for Heterogeneous Grid Environments

    NASA Technical Reports Server (NTRS)

    Arora, Manish; Das, Sajal K.; Biswas, Rupak; Biegel, Bryan (Technical Monitor)

    2002-01-01

    In the past two decades, numerous scheduling and load balancing techniques have been proposed for locally distributed multiprocessor systems. However, they all suffer from significant deficiencies when extended to a Grid environment: some use a centralized approach that renders the algorithm unscalable, while others assume the overhead involved in searching for appropriate resources to be negligible. Furthermore, classical scheduling algorithms do not consider a Grid node to be N-resource rich and merely work towards maximizing the utilization of one of the resources. In this paper, we propose a new scheduling and load balancing algorithm for a generalized Grid model of N-resource nodes that not only takes into account the node and network heterogeneity, but also considers the overhead involved in coordinating among the nodes. Our algorithm is de-centralized, scalable, and overlaps the node coordination time with that of the actual processing of ready jobs, thus saving valuable clock cycles needed for making decisions. The proposed algorithm is studied by conducting simulations using the Message Passing Interface (MPI) paradigm.

  9. Atlas-guided volumetric diffuse optical tomography enhanced by generalized linear model analysis to image risk decision-making responses in young adults.

    PubMed

    Lin, Zi-Jing; Li, Lin; Cazzell, Mary; Liu, Hanli

    2014-08-01

    Diffuse optical tomography (DOT) is a variant of functional near infrared spectroscopy and has the capability of mapping or reconstructing three dimensional (3D) hemodynamic changes due to brain activity. Common methods used in DOT image analysis to define brain activation have limitations because the selection of the activation period is relatively subjective. General linear model (GLM)-based analysis can overcome this limitation. In this study, we combine atlas-guided 3D DOT image reconstruction with GLM-based analysis (i.e., voxel-wise GLM analysis) to investigate the brain activity that is associated with risk decision-making processes. Risk decision-making is an important cognitive process and thus is an essential topic in the field of neuroscience. The Balloon Analog Risk Task (BART) is a valid experimental model and has been commonly used to assess human risk-taking actions and tendencies while facing risks. We have used the BART paradigm with a blocked design to investigate brain activations in the prefrontal and frontal cortical areas during decision-making from 37 human participants (22 males and 15 females). Voxel-wise GLM analysis was performed after a human brain atlas template and a depth compensation algorithm were combined to form atlas-guided DOT images. In this work, we demonstrate the value of using voxel-wise GLM analysis with DOT to image and study cognitive functions during risk decision-making. Results have shown significant hemodynamic changes in the dorsal lateral prefrontal cortex (DLPFC) during the active-choice mode and different activation patterns between genders; these findings correlate well with published literature in functional magnetic resonance imaging (fMRI) and fNIRS studies. Copyright © 2014 The Authors. Human Brain Mapping Published by Wiley Periodicals, Inc.
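
    Voxel-wise GLM analysis reduces, per voxel, to ordinary least squares of the hemodynamic time course against a design matrix encoding the block paradigm. A generic Python sketch on synthetic data (our illustration; the HRF convolution and inference details of the actual pipeline are omitted):

        import numpy as np

        T, V = 200, 1000                                  # time points, voxels
        rng = np.random.default_rng(0)
        task = (np.arange(T) % 40 < 20).astype(float)     # blocked on/off regressor
        X = np.column_stack([task, np.ones(T)])           # design matrix with intercept

        beta_true = np.zeros(V); beta_true[:50] = 1.0     # 5% of voxels respond
        Y = task[:, None] * beta_true + rng.normal(0, 1, (T, V))

        beta, *_ = np.linalg.lstsq(X, Y, rcond=None)      # OLS for all voxels at once
        resid = Y - X @ beta
        se = np.sqrt(resid.var(axis=0, ddof=2) * np.linalg.inv(X.T @ X)[0, 0])
        tmap = beta[0] / se                               # t-statistic per voxel
        print((tmap > 3.0).sum(), "voxels flagged as active")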

  10. The Structural Consequences of Big Data-Driven Education.

    PubMed

    Zeide, Elana

    2017-06-01

    Educators and commenters who evaluate big data-driven learning environments focus on specific questions: whether automated education platforms improve learning outcomes, invade student privacy, and promote equality. This article puts aside separate unresolved-and perhaps unresolvable-issues regarding the concrete effects of specific technologies. It instead examines how big data-driven tools alter the structure of schools' pedagogical decision-making, and, in doing so, change fundamental aspects of America's education enterprise. Technological mediation and data-driven decision-making have a particularly significant impact in learning environments because the education process primarily consists of dynamic information exchange. In this overview, I highlight three significant structural shifts that accompany school reliance on data-driven instructional platforms that perform core school functions: teaching, assessment, and credentialing. First, virtual learning environments create information technology infrastructures featuring constant data collection, continuous algorithmic assessment, and possibly infinite record retention. This undermines the traditional intellectual privacy and safety of classrooms. Second, these systems displace pedagogical decision-making from educators serving public interests to private, often for-profit, technology providers. They constrain teachers' academic autonomy, obscure student evaluation, and reduce parents' and students' ability to participate or challenge education decision-making. Third, big data-driven tools define what "counts" as education by mapping the concepts, creating the content, determining the metrics, and setting desired learning outcomes of instruction. These shifts cede important decision-making to private entities without public scrutiny or pedagogical examination. In contrast to the public and heated debates that accompany textbook choices, schools often adopt education technologies ad hoc. Given education's crucial impact on individual and collective success, educators and policymakers must consider the implications of data-driven education proactively and explicitly.

  11. "Radio-oncomics" : The potential of radiomics in radiation oncology.

    PubMed

    Peeken, Jan Caspar; Nüsslin, Fridtjof; Combs, Stephanie E

    2017-10-01

    Radiomics, a recently introduced concept, describes quantitative computerized algorithm-based feature extraction from imaging data including computed tomography (CT), magnetic resonance imaging (MRI), or positron-emission tomography (PET) images. For radiation oncology it offers the potential to significantly influence clinical decision-making and thus therapy planning and follow-up workflow. After image acquisition, image preprocessing, and defining regions of interest by structure segmentation, algorithms are applied to calculate shape, intensity, texture, and multiscale filter features. By combining multiple features and correlating them with clinical outcome, prognostic models can be created. Retrospective studies have proposed radiomics classifiers predicting, e.g., overall survival, radiation treatment response, distant metastases, or radiation-related toxicity. Besides, radiomics features can be correlated with genomic information ("radiogenomics") and could be used for tumor characterization. Distinct patterns based on data-based as well as genomics-based features will influence radiation oncology in the future. Individualized treatments in terms of dose level adaptation and target volume definition, as well as other outcome-related parameters, will depend on radiomics and radiogenomics. By integration of various datasets, the prognostic power can be increased, making radiomics a valuable part of future precision medicine approaches. This perspective demonstrates the evidence for the radiomics concept in radiation oncology. The necessity of further studies to integrate radiomics classifiers into clinical decision-making and the radiation therapy workflow is emphasized.

  12. Remote-sensing-based rapid assessment of flood crop loss to support USDA flooding decision-making

    NASA Astrophysics Data System (ADS)

    Di, L.; Yu, G.; Yang, Z.; Hipple, J.; Shrestha, R.

    2016-12-01

    Floods often cause significant crop loss in the United States. Timely and objective assessment of flood-related crop loss is very important for crop monitoring and risk management in agricultural and disaster-related decision-making in USDA. Among all flood-related information, crop yield loss is particularly important: decisions on proper mitigation, relief, and monetary compensation rely on it. Currently USDA mostly relies on field surveys to obtain crop loss information and compensate farmers' loss claims. Such methods are expensive, labor intensive, and time-consuming, especially for a large flood that affects a large geographic area. Recent studies have demonstrated that Earth observation (EO) data are useful for assessing post-flood crop loss over a large geographic area objectively, timely, accurately, and cost-effectively. There are three stages of flood damage assessment: rapid assessment, early recovery assessment, and in-depth assessment. EO-based flood assessment methods currently rely on a time series of vegetation index to assess the yield loss. Such methods are suitable for in-depth assessment but are less suitable for rapid assessment, since the after-flood vegetation index time series is not yet available. This presentation presents a new EO-based method for the rapid assessment of crop yield loss immediately after a flood event to support USDA flood decision-making. The method is based on historic records of flood severity, flood duration, flood date, crop type, EO-based crop conditions both before and immediately after the flood, and corresponding crop yield loss. It hypothesizes that a flood of the same severity occurring at the same phenological stage of a crop will cause similar damage to the crop yield regardless of the flood year. With this hypothesis, a regression-based rapid assessment algorithm can be developed by learning from historic records of flood events and corresponding crop yield loss. In this study, historic records of MODIS-based flood and vegetation products and USDA/NASS crop type and crop yield data are used to train the regression-based rapid assessment algorithm. Validation of the rapid assessment algorithm indicates it can predict yield loss with 90% accuracy, which is accurate enough to support USDA in rapid flood response and mitigation.

  13. Sensorimotor Learning Biases Choice Behavior: A Learning Neural Field Model for Decision Making

    PubMed Central

    Schöner, Gregor; Gail, Alexander

    2012-01-01

    According to a prominent view of sensorimotor processing in primates, selection and specification of possible actions are not sequential operations. Rather, a decision for an action emerges from competition between different movement plans, which are specified and selected in parallel. For action choices which are based on ambiguous sensory input, the frontoparietal sensorimotor areas are considered part of the common underlying neural substrate for selection and specification of action. These areas have been shown capable of encoding alternative spatial motor goals in parallel during movement planning, and show signatures of competitive value-based selection among these goals. Since the same network is also involved in learning sensorimotor associations, competitive action selection (decision making) should not only be driven by the sensory evidence and expected reward in favor of either action, but also by the subject's learning history of different sensorimotor associations. Previous computational models of competitive neural decision making used predefined associations between sensory input and corresponding motor output. Such hard-wiring does not allow modeling of how decisions are influenced by sensorimotor learning or by changing reward contingencies. We present a dynamic neural field model which learns arbitrary sensorimotor associations with a reward-driven Hebbian learning algorithm. We show that the model accurately simulates the dynamics of action selection with different reward contingencies, as observed in monkey cortical recordings, and that it correctly predicted the pattern of choice errors in a control experiment. With our adaptive model we demonstrate how network plasticity, which is required for association learning and adaptation to new reward contingencies, can influence choice behavior. The field model provides an integrated and dynamic account for the operations of sensorimotor integration, working memory and action selection required for decision making in ambiguous choice situations. PMID:23166483

  14. Application of Domain Knowledge to Software Quality Assurance

    NASA Technical Reports Server (NTRS)

    Wild, Christian W.

    1997-01-01

    This work focused on capturing, using, and evolving a qualitative decision support structure across the life cycle of a project. The particular application of this study was business process reengineering and the representation of the business process in a set of Business Rules (BR). In this work, we defined a decision model which captured the qualitative decision deliberation process. It represented arguments both for and against proposed alternatives to a problem. It was felt that the subjective nature of many critical business policy decisions required a qualitative modeling approach similar to that of Lee and Mylopoulos. While previous work was limited almost exclusively to the decision capture phase, which occurs early in the project life cycle, we investigated the use of such a model during the later stages as well. One of our significant developments was the use of the decision model during the operational phase of a project. By operational phase, we mean the phase in which the system or set of policies decided earlier is deployed and put into practice. By making the decision model available to operational decision makers, they have access to the arguments pro and con for a variety of actions and can thus make a more informed decision which balances the often conflicting criteria by which the value of an action is measured. We also developed the concept of a 'monitored decision' in which metrics of performance were identified during the decision making process and used to evaluate the quality of that decision. It is important to monitor those decisions which seem at highest risk of not meeting their stated objectives. Operational decisions are also potentially high risk decisions. Finally, we investigated the use of performance metrics for monitored decisions and audit logs of operational decisions in order to feed an evolutionary phase of the life cycle. During evolution, decisions are revisited, assumptions verified or refuted, and possible reassessments resulting in new policy are made. In this regard we implemented a machine learning algorithm which automatically defined business rules based on expert assessment of the quality of operational decisions as recorded during deployment.

  15. Decision tree and ensemble learning algorithms with their applications in bioinformatics.

    PubMed

    Che, Dongsheng; Liu, Qi; Rasheed, Khaled; Tao, Xiuping

    2011-01-01

    Machine learning approaches have wide applications in bioinformatics, and decision tree is one of the successful approaches applied in this field. In this chapter, we briefly review decision tree and related ensemble algorithms and show the successful applications of such approaches on solving biological problems. We hope that by learning the algorithms of decision trees and ensemble classifiers, biologists can get the basic ideas of how machine learning algorithms work. On the other hand, by being exposed to the applications of decision trees and ensemble algorithms in bioinformatics, computer scientists can get better ideas of which bioinformatics topics they may work on in their future research directions. We aim to provide a platform to bridge the gap between biologists and computer scientists.
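
    In practice, both families of methods reviewed in the chapter are a few lines apart in common libraries. A minimal Python sketch with scikit-learn on a bundled dataset standing in for a bioinformatics problem (our example, not the chapter's code):

        from sklearn.datasets import load_breast_cancer
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score
        from sklearn.tree import DecisionTreeClassifier

        X, y = load_breast_cancer(return_X_y=True)

        for model in (DecisionTreeClassifier(max_depth=4, random_state=0),
                      RandomForestClassifier(n_estimators=200, random_state=0)):
            acc = cross_val_score(model, X, y, cv=5).mean()
            print(type(model).__name__, round(acc, 3))   # the ensemble typically wins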

  16. Improved multi-objective ant colony optimization algorithm and its application in complex reasoning

    NASA Astrophysics Data System (ADS)

    Wang, Xinqing; Zhao, Yang; Wang, Dong; Zhu, Huijie; Zhang, Qing

    2013-09-01

    The problem of fault reasoning has aroused great concern in scientific and engineering fields. However, fault investigation and reasoning for a complex system is not a simple reasoning decision-making problem; it is a typical multi-constraint, multi-objective reticulate optimization decision-making problem subject to many influencing factors and constraints. So far, little research has been carried out in this field. This paper transforms the fault reasoning problem of a complex system into a path-searching problem from known symptoms to fault causes. Three optimization objectives are considered simultaneously: maximum average fault probability, maximum average importance, and minimum average complexity of test. Under the constraints of both the known symptoms and the causal relationships among different components, a multi-objective optimization mathematical model is set up, taking the minimization of fault-reasoning cost as the objective function. Since the problem is non-deterministic polynomial-hard (NP-hard), a modified multi-objective ant colony algorithm is proposed, in which a reachability matrix constrains the feasible search nodes of the ants, and a new pseudo-random-proportional rule and a pheromone adjustment mechanism are constructed to balance conflicts between the optimization objectives. At last, a Pareto optimal set is acquired. Evaluation functions based on the validity and tendency of reasoning paths are defined to refine the noninferior set, through which the final fault causes can be identified according to decision-making demands, thus realizing fault reasoning for the multi-constraint, multi-objective complex system. Reasoning results demonstrate that the improved multi-objective ant colony optimization (IMACO) can precisely locate fault positions by solving the multi-objective fault diagnosis model, which provides a new method for multi-constraint, multi-objective fault diagnosis and reasoning of complex systems.
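
    The pseudo-random-proportional rule being modified here is the standard ant-colony-system transition rule: with probability q0 take the greedy pheromone-times-heuristic choice, otherwise sample in proportion to it. A generic Python sketch in which the paper's reachability-matrix constraint appears only as a feasibility mask (our illustration, not the IMACO implementation):

        import numpy as np

        def next_node(tau, eta, feasible, q0=0.9, beta=2.0, rng=np.random.default_rng()):
            # Attractiveness of each candidate successor, zeroed where infeasible.
            attract = tau * eta ** beta * feasible
            if rng.random() < q0:
                return int(np.argmax(attract))           # exploitation
            p = attract / attract.sum()                  # biased exploration
            return int(rng.choice(len(p), p=p))

        tau = np.array([0.5, 1.0, 0.2, 0.8])       # pheromone on candidate edges
        eta = np.array([1.0, 0.5, 2.0, 1.0])       # heuristic desirability
        feasible = np.array([1.0, 1.0, 0.0, 1.0])  # e.g., from the reachability matrix
        print(next_node(tau, eta, feasible))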

  17. Normal pressure hydrocephalus: survey on contemporary diagnostic algorithms and therapeutic decision-making in clinical practice.

    PubMed

    Krauss, J K; Halve, B

    2004-04-01

    There is no agreement on the best diagnostic criteria for selecting patients with normal pressure hydrocephalus (NPH) for CSF shunting. The primary objective of the present study was to provide a contemporary survey on diagnostic algorithms and therapeutic decision-making in clinical practice. The secondary objective was to estimate the incidence of NPH. Standardized questionnaires with sections on the incidence of NPH and the frequency of shunting, evaluation of clinical symptoms, and signs, diagnostic studies, therapeutic decision-making and operative techniques, postoperative outcome and complications, and the profiles of different centers, were sent to 82 neurosurgical centers in Germany known to participate in the care of patients with NPH. Data were analyzed from 49 of 53 centers which responded to the survey (65%). The estimated annual incidence of NPH was 1.8 cases/100.000 inhabitants. Gait disturbance was defined as the most important sign of NPH (61%). There was a wide variety in the choice of diagnostic tests. Cisternography was performed routinely only in single centers. Diagnostic CSF removal was used with varying frequency by all centers except one, but the amount of CSF removed by lumbar puncture differed markedly between centers. There was poor agreement on criteria for evaluation of continuous intracranial pressure recordings regarding both the amplitude and the relative frequency of B-waves. Both periventricular and deep white matter lesions were present in about 50% of patients being shunted, indicating that vascular comorbidity in NPH patients has gained more acceptance. Programmable shunts were used by more than half of the centers, and newer valve types such as gravitational valves have become more popular. According to the present survey, new diagnostic and therapeutic concepts on NPH have penetrated daily routine to a certain extent. Wide variability, however, still exists among different neurosurgical centers.

  18. Failure detection system design methodology. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Chow, E. Y.

    1980-01-01

    The design of a failure detection and identification system consists of designing a robust residual generation process and a high performance decision making process. The design of these two processes is examined separately. Residual generation is based on analytical redundancy. Redundancy relations that are insensitive to modelling errors and noise effects are important for designing robust residual generation processes. The characterization of the concept of analytical redundancy in terms of a generalized parity space provides a framework in which a systematic approach to the determination of robust redundancy relations is developed. The Bayesian approach is adopted for the design of high performance decision processes. The FDI decision problem is formulated as a Bayes sequential decision problem. Since the optimal decision rule is incomputable, a methodology for designing suboptimal rules is proposed. A numerical algorithm is developed to facilitate the design and performance evaluation of suboptimal rules.

  19. Using evidence-based algorithms to improve clinical decision making: the case of a first-time anterior shoulder dislocation.

    PubMed

    Federer, Andrew E; Taylor, Dean C; Mather, Richard C

    2013-09-01

    Decision making in health care has evolved substantially over the last century. Up until the late 1970s, medical decision making was predominantly intuitive and anecdotal; it was based on trial and error and involved high levels of problem solving. The 1980s gave way to empirical medicine, which was evidence-based and probabilistic and involved pattern recognition and less problem solving. Although this represented a major advance in the quality of medical decision making, limitations existed. The advantages of the gold standard of the randomized controlled clinical trial (RCT) are well known, and this technique is irreplaceable in its ability to answer critical clinical questions. However, the RCT does have drawbacks. RCTs are expensive and can only capture a snapshot in time. As treatments change and new technologies emerge, new expensive clinical trials must be undertaken to reevaluate them. Furthermore, in order to best evaluate a single intervention, other factors must be controlled. In addition, the study population may not match that of another organization or provider. Although evidence-based medicine has provided powerful data for clinicians, effectively and efficiently tailoring it to the individual has not yet evolved. We are now in a period of transition from this evidence-based era to one dominated by the personalization and customization of care. It will be fueled by policy decisions that shift financial responsibility to the patient, creating a powerful and sophisticated consumer, unlike any patient we have known before. The challenge will be to apply medical evidence and personal preferences to medical decisions and to deliver this efficiently in the increasingly busy clinical setting. In this article, we provide a robust review of the concepts of customized care and some of the techniques to deliver it. We illustrate this through a personalized decision model for the treatment decision after a first-time anterior shoulder dislocation.

  20. Amoeba-inspired Tug-of-War algorithms for exploration-exploitation dilemma in extended Bandit Problem.

    PubMed

    Aono, Masashi; Kim, Song-Ju; Hara, Masahiko; Munakata, Toshinori

    2014-03-01

    The true slime mold Physarum polycephalum, a single-celled amoeboid organism, is capable of efficiently allocating a constant amount of intracellular resource to its pseudopod-like branches that best fit the environment where dynamic light stimuli are applied. Inspired by this resource allocation process, the authors formulated a concurrent search algorithm, called the Tug-of-War (TOW) model, for maximizing the profit in the multi-armed Bandit Problem (BP). A player (gambler) of the BP must decide as quickly and accurately as possible which slot machine to invest in out of the N machines, and faces an "exploration-exploitation dilemma": a trade-off between the speed and accuracy of the decision making, which are conflicting objectives. The TOW model maintains a constant intracellular resource volume while collecting environmental information by concurrently expanding and shrinking its branches. The conservation law entails a nonlocal correlation among the branches, i.e., a volume increment in one branch is immediately compensated by volume decrement(s) in the other branch(es). Owing to this nonlocal correlation, the TOW model can efficiently manage the dilemma. In this study, we extend the TOW model to a stretched variant of the BP, the Extended Bandit Problem (EBP), which is the problem of selecting the best M-tuple of the N machines. We demonstrate that the extended TOW model exhibits better performance for 2-tuple-3-machine and 2-tuple-4-machine instances of the EBP compared with extended versions of two well-known algorithms for the BP, the ϵ-Greedy and SoftMax algorithms, particularly in terms of its short-term decision-making capability, which is essential for the survival of the amoeba in a hostile environment. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
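
    A loose two-armed sketch conveys the TOW mechanism: each branch accumulates positive evidence for a win and weighted negative evidence for a loss, and the fixed total resource couples the branches so that evidence for one arm immediately counts against the other. The Python below is our simplified reading with invented parameter values, not the authors' formulation:

        import numpy as np

        rng = np.random.default_rng(0)
        probs = np.array([0.3, 0.7])   # hidden reward probabilities of the two arms
        omega = 0.6                    # weight on "no reward" evidence
        Q = np.zeros(2)
        wins = 0

        for t in range(1000):
            # Conservation couples the arms: each arm's pull is its own evidence
            # minus the other's, plus a fluctuation that sustains exploration.
            x = Q - Q[::-1] + rng.normal(0, 1.0, 2)
            k = int(np.argmax(x))
            reward = rng.random() < probs[k]
            Q[k] += 1.0 if reward else -omega
            wins += int(reward)

        print("fraction of plays rewarded:", wins / 1000)  # approaches the better arm's 0.7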

  1. Do we face a fourth paradigm shift in medicine--algorithms in education?

    PubMed

    Eitel, F; Kanz, K G; Hortig, E; Tesche, A

    2000-08-01

    Medicine has evolved toward rationalization since the Enlightenment, favouring quantitative measures. Now, a paradigm shift toward control through formalization can be observed in health care whose structures and processes are subjected to increasing standardization. However, educational reforms and curricula do not yet adequately respond to this shift. The aim of this article is to describe innovative approaches in medical education for adapting to these changes. The study design is a descriptive case report relying on a literature review and on a reform project's evaluation. Concept mapping is used to graphically represent relationships among concepts, i.e. defined terms from educational literature. Definitions of 'concept map', 'guideline' and 'algorithm' are presented. A prototypical algorithm for organizational decision making in the project's instructional design is shown. Evaluation results of intrinsic learning motivation are demonstrated: intrinsic learning motivation depends upon students' perception of their competence exhibiting path coefficients varying from 0.42 to 0.51. Perception of competence varies with the type of learning environment. An innovative educational format, called 'evidence-based learning (EBL)' is deduced from these findings and described here. Effects of formalization consist of structuring decision making about implementation of different learning environments or about minimizing variance in teaching or learning. Unintended effects of formalization such as implementation problems and bureaucracy are discussed. Formalized tools for designing medical education are available. Specific instructional designs influence students' learning motivation. Concept maps are suitable for controlling educational quality, thus enabling the paradigm shift in medical education.

  2. Semantics of directly manipulating spatializations.

    PubMed

    Hu, Xinran; Bradel, Lauren; Maiti, Dipayan; House, Leanna; North, Chris; Leman, Scotland

    2013-12-01

    When high-dimensional data are visualized in a 2D plane using parametric projection algorithms, users may wish to manipulate the layout of the data points to better reflect their domain knowledge or to explore alternative structures. However, few users are well-versed in the algorithms behind the visualizations, making parameter tweaking more of a guessing game than a series of decisive interactions. Translating user interactions into algorithmic input is a key component of Visual to Parametric Interaction (V2PI) [13]. Instead of adjusting parameters, users directly move data points on the screen, which then updates the underlying statistical model. However, we have found that some data points that are not moved by the user are just as important in the interactions as the data points that are moved. Users frequently move data points relative to other 'unmoved' data points that they consider spatially contextual. In current V2PI interactions, however, these contextual points are not explicitly identified when the moved points are manipulated. We design a richer set of interactions that makes this context explicit, together with a new algorithm and weighting scheme that incorporate the importance of these unmoved data points into V2PI.

  3. An Efficient Deterministic Approach to Model-based Prediction Uncertainty Estimation

    NASA Technical Reports Server (NTRS)

    Daigle, Matthew J.; Saxena, Abhinav; Goebel, Kai

    2012-01-01

    Prognostics deals with the prediction of the end of life (EOL) of a system. EOL is a random variable, due to the presence of process noise and uncertainty in the future inputs to the system. Prognostics algorithms must account for this inherent uncertainty. In addition, these algorithms never know exactly the state of the system at the desired time of prediction, or the exact model describing the future evolution of the system, which accumulates additional uncertainty into the predicted EOL. Prediction algorithms that do not account for these sources of uncertainty misrepresent the EOL and can lead to poor decisions based on their results. In this paper, we explore the impact of uncertainty in the prediction problem. We develop a general model-based prediction algorithm that incorporates these sources of uncertainty, and propose a novel approach to efficiently handle uncertainty in the future input trajectories of a system by using the unscented transformation. Using this approach, we not only reduce the computational load but also estimate the bounds of uncertainty in a deterministic manner, which can be useful during decision-making. Using a lithium-ion battery as a case study, we perform several simulation-based experiments to explore these issues, and validate the overall approach using experimental data from a battery testbed.
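
    The unscented transformation at the heart of this approach can be illustrated independently of the battery model. The sketch below propagates a mean and covariance through a nonlinear function via sigma points, assuming the standard scaled sigma-point formulation; the parameter defaults and the toy nonlinearity are assumptions, not the authors' implementation:

    ```python
    import numpy as np

    def unscented_transform(f, mean, cov, alpha=1.0, beta=2.0, kappa=0.0):
        """Propagate (mean, cov) through a nonlinear f using 2n+1 sigma points."""
        n = len(mean)
        lam = alpha**2 * (n + kappa) - n
        S = np.linalg.cholesky((n + lam) * cov)            # matrix square root
        sigma = np.vstack([mean, mean + S.T, mean - S.T])  # sigma points
        wm = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))     # mean weights
        wc = wm.copy()                                     # covariance weights
        wm[0] = lam / (n + lam)
        wc[0] = lam / (n + lam) + (1 - alpha**2 + beta)
        y = np.array([f(p) for p in sigma])                # deterministic propagation
        y_mean = wm @ y
        d = y - y_mean
        y_cov = (wc[:, None] * d).T @ d
        return y_mean, y_cov

    # Toy example: a scalar state pushed through an exponential decay
    m, P = np.array([1.0]), np.array([[0.04]])
    print(unscented_transform(lambda x: np.exp(-x), m, P))
    ```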

  4. Visual saliency-based fast intracoding algorithm for high efficiency video coding

    NASA Astrophysics Data System (ADS)

    Zhou, Xin; Shi, Guangming; Zhou, Wei; Duan, Zhemin

    2017-01-01

    Intraprediction has been significantly improved in high efficiency video coding (HEVC) over H.264/AVC, with a quad-tree-based coding unit (CU) structure from size 64×64 down to 8×8 and more prediction modes. However, these techniques cause a dramatic increase in computational complexity. An intracoding algorithm is proposed that consists of a perceptual fast CU size decision algorithm and a fast intraprediction mode decision algorithm. First, based on visual saliency detection, an adaptive and fast CU size decision method is proposed to alleviate intraencoding complexity. Furthermore, a fast intraprediction mode decision algorithm, with a step-halving rough mode decision method and an early mode-pruning algorithm, is presented to selectively check the potential modes and effectively reduce the computational complexity. Experimental results show that the proposed fast method reduces the encoding time of the current HM reference software by about 57%, with only a 0.37% increase in BD rate. Meanwhile, the proposed fast algorithm incurs acceptable peak signal-to-noise ratio losses and nearly the same subjective perceptual quality.

  5. To Issue of Mathematical Management Methods Applied for Investment-Building Complex under Conditions of Economic Crisis

    NASA Astrophysics Data System (ADS)

    Novikova, V.; Nikolaeva, O.

    2017-11-01

    In this article, the authors consider a cognitive method for managing the investment-building complex under crisis conditions. The factors influencing the choice of an investment strategy are studied, and the basic lines of activity in the field of crisis management are defined from the standpoint of mathematical modelling. A general approach to decision-making on investment in real assets is offered, based on discrete systems and the optimal control theory. Using a discrete maximum principle, a solution to the stated problem is found. A numerical algorithm for determining the optimal investment control is formulated. Analytical solutions are obtained for the case of constant profitability of fixed assets.

  6. DoD Research and Engineering Enterprise

    DTIC Science & Technology

    2014-05-01

    Secretary of Defense Hagel, Pentagon Press Briefing Room, February 24, 2014 Technological superiority has been central to the strategy of the...understand the environment, to software algorithms that can make a decision or seek human assistance. Through autonomy, we should be able to greatly reduce...computers are a commercial product 1 , and quantum key distribution for data encryption is nearly a commercial product. These two applications are

  7. Simulation Experiments with Goal-Seeking Adaptive Elements.

    DTIC Science & Technology

    1984-02-01

    when it comes to cognition and particularly bad when it comes to remote sensing, goal seeking, adaptation and decision making, where brains excel. In...Erlbaum 1981, 161-187 Hinton, G. E., & Sejnowski, T. J. Analyzing Cooperative Computation. Proceedings of the Fifth Annual Conference of the Cognitive ...Algorithms and Applications. Springer-Verlag, 1981. Lenat, D. B., Hayes-Roth, F., Klahr, P. Cognitive economy. Stanford Heuristic Program- ming Project HPP

  8. Humans and Autonomy: Implications of Shared Decision Making for Military Operations

    DTIC Science & Technology

    2017-01-01

    and machine learning transparency are identified as future research opportunities. 15. SUBJECT TERMS autonomy, human factors, intelligent agents...network as either the mission changes or an agent becomes disabled (DSB 2012). Fig. 2 Control structures for human agent teams. Robots without tools... learning (ML) algorithms monitor progress. However, operators have final executive authority; they are able to tweak the plan or choose an option

  9. 11.2 YIP Human In the Loop Statistical RelationalLearners

    DTIC Science & Technology

    2017-10-23

    learning formalisms including inverse reinforcement learning [4] and statistical relational learning [7, 5, 8]. We have also applied our algorithms in...one introduced for label preferences. 4 Figure 2: Active Advice Seeking for Inverse Reinforcement Learning. active advice seeking is in selecting the...learning tasks. 1.2.1 Sequential Decision-Making Our previous work on advice for inverse reinforcement learning (IRL) defined advice as action

  10. Knowledge discovery from patients' behavior via clustering-classification algorithms based on weighted eRFM and CLV model: An empirical study in public health care services.

    PubMed

    Zare Hosseini, Zeinab; Mohammadzadeh, Mahdi

    2016-01-01

    The rapid growth of information technology (IT) motivates and creates competitive advantages in the health care industry. Nowadays, many hospitals try to build successful customer relationship management (CRM) to recognize target and potential patients, increase patient loyalty and satisfaction, and ultimately maximize their profitability. Many hospitals have large data warehouses containing customer demographic and transaction information. Data mining techniques can be used to analyze these data and discover hidden knowledge about customers. This research develops an extended RFM model, namely RFML (with the added parameter Length), based on health care services for a public-sector hospital in Iran, recognizing the contrast between patient and customer loyalty, to estimate the customer lifetime value (CLV) of each patient. We used the Two-step and K-means algorithms as clustering methods and a decision tree (CHAID) as the classification technique to segment the patients and find target, potential and loyal customers in order to strengthen CRM. Two approaches are used for classification: first, the result of clustering is taken as the decision attribute in the classification process; second, the result of segmentation based on the CLV value of patients (estimated by RFML) is taken as the decision attribute. Finally, the results of the CHAID algorithm reveal significant hidden rules and identify existing patterns among hospital consumers.
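
    A minimal sketch of the clustering-then-classification pipeline described above, using scikit-learn on a synthetic stand-in for the RFML matrix. CHAID itself is not provided by scikit-learn, so a CART decision tree stands in for it here:

    ```python
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.tree import DecisionTreeClassifier, export_text

    rng = np.random.default_rng(0)
    # Hypothetical RFML matrix: Recency, Frequency, Monetary, Length per patient
    X = rng.random((200, 4))

    # Step 1: segment patients with K-means (labels play the role of CLV segments)
    segments = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

    # Step 2: learn interpretable rules mapping RFML attributes to segments
    tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, segments)
    print(export_text(tree, feature_names=["R", "F", "M", "L"]))
    ```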

  11. Knowledge discovery from patients’ behavior via clustering-classification algorithms based on weighted eRFM and CLV model: An empirical study in public health care services

    PubMed Central

    Zare Hosseini, Zeinab; Mohammadzadeh, Mahdi

    2016-01-01

    The rapid growth of information technology (IT) motivates and creates competitive advantages in the health care industry. Nowadays, many hospitals try to build successful customer relationship management (CRM) to recognize target and potential patients, increase patient loyalty and satisfaction, and ultimately maximize their profitability. Many hospitals have large data warehouses containing customer demographic and transaction information. Data mining techniques can be used to analyze these data and discover hidden knowledge about customers. This research develops an extended RFM model, namely RFML (with the added parameter Length), based on health care services for a public-sector hospital in Iran, recognizing the contrast between patient and customer loyalty, to estimate the customer lifetime value (CLV) of each patient. We used the Two-step and K-means algorithms as clustering methods and a decision tree (CHAID) as the classification technique to segment the patients and find target, potential and loyal customers in order to strengthen CRM. Two approaches are used for classification: first, the result of clustering is taken as the decision attribute in the classification process; second, the result of segmentation based on the CLV value of patients (estimated by RFML) is taken as the decision attribute. Finally, the results of the CHAID algorithm reveal significant hidden rules and identify existing patterns among hospital consumers. PMID:27610177

  12. Fuzzy logic and optical correlation-based face recognition method for patient monitoring application in home video surveillance

    NASA Astrophysics Data System (ADS)

    Elbouz, Marwa; Alfalou, Ayman; Brosseau, Christian

    2011-06-01

    Home automation is being implemented in more and more homes of the elderly and disabled in order to maintain their independence and safety. For that purpose, we propose and validate a surveillance video system that detects various posture-based events. One of the novel points of this system is the use of adapted VanderLugt correlator (VLC) and joint transform correlator (JTC) techniques to make decisions on the identity of a patient and his three-dimensional (3-D) position, in order to overcome the problem of crowded environments. We propose a fuzzy logic technique to reach decisions on the subject's behavior. Our system is focused on the goals of accuracy, convenience, and cost, and in addition does not require any devices attached to the subject. The system permits one to study and model subject responses to behavioral change intervention, because several levels of alarm can be incorporated according to the different situations considered. Our algorithm performs a fast 3-D recovery of the subject's head position by locating the eyes within the face image, and involves model-based prediction and optical correlation techniques to guide the tracking procedure. The object detection is based on the (hue, saturation, value) color space. The system also involves an adapted fuzzy logic control algorithm to make a decision based on the information given to the system. Furthermore, the principles described here are applicable to a very wide range of situations and robust enough to be implementable in ongoing experiments.

  13. Emergency Physicians’ Perceptions and Decision-making Processes Regarding Patients Presenting with Palpitations

    PubMed Central

    Probst, Marc A.; Kanzaria, Hemal K.; Hoffman, Jerome R.; Mower, William R.; Moheimani, Roya S.; Sun, Benjamin C.; Quigley, Denise D.

    2015-01-01

    Background Palpitations are a common emergency department (ED) complaint, yet relatively little research exists on this topic from an emergency care perspective. Objectives We sought to describe the perceptions and clinical decision-making processes of emergency physicians (EPs) surrounding patients with palpitations. Methods We conducted 21 semistructured interviews with a convenience sample of EPs. We recruited participants from academic and community practice settings in four regions of the US. The transcribed interviews were analyzed using a combination of structural coding and grounded theory approaches with ATLAS.ti, a qualitative data analysis software program. Results EPs perceive palpitations to be a common but generally benign chief complaint. EPs' clinical approach to palpitations, with regard to testing, treatment and ED management, can be classified as relating to one or more of the following themes: (1) risk stratification, (2) diagnostic categorization, (3) algorithmic management, and (4) case-specific gestalt. With regard to disposition decisions, four main themes emerged: (1) presence of a serious diagnosis, (2) perceived need for further cardiac testing/monitoring, (3) presence of key associated symptoms, and (4) another physician's request or the patient's preference. The inter-rater reliability exercise yielded a Fleiss' kappa of 0.69, indicating substantial agreement between coders. Conclusion EPs perceive palpitations to be a common but generally benign chief complaint. EPs rely on one or more of four main clinical approaches to manage these patients. These findings could help guide future efforts at developing risk-stratification tools and clinical algorithms for patients with palpitations. PMID:25943288

  14. The Applications of Genetic Algorithms in Medicine.

    PubMed

    Ghaheri, Ali; Shoar, Saeed; Naderan, Mohammad; Hoseini, Sayed Shahabuddin

    2015-11-01

    A great wealth of information is hidden amid medical research data that in some cases cannot be easily analyzed, if at all, using classical statistical methods. Inspired by nature, metaheuristic algorithms have been developed to offer optimal or near-optimal solutions to complex data analysis and decision-making tasks in a reasonable time. Due to their powerful features, metaheuristic algorithms have frequently been used in other fields of science. In medicine, however, the use of these algorithms is not well known to physicians, who may well benefit from applying them to solve complex medical problems. Therefore, in this paper, we introduce the genetic algorithm and its applications in medicine. The use of the genetic algorithm has promising implications in various medical specialties including radiology, radiotherapy, oncology, pediatrics, cardiology, endocrinology, surgery, obstetrics and gynecology, pulmonology, infectious diseases, orthopedics, rehabilitation medicine, neurology, pharmacotherapy, and health care management. This review introduces the applications of the genetic algorithm in disease screening, diagnosis, treatment planning, pharmacovigilance, prognosis, and health care management, and enables physicians to envision possible applications of this metaheuristic method in their medical careers.
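
    For readers unfamiliar with the method, a minimal binary genetic algorithm (tournament selection, one-point crossover, bit-flip mutation) looks roughly as follows; the toy fitness function is illustrative only and not one of the medical applications in the review:

    ```python
    import random

    def genetic_algorithm(fitness, n_bits=20, pop_size=50, generations=100,
                          crossover_rate=0.9, mutation_rate=0.01):
        """Minimal binary GA: selection, crossover, mutation, replacement."""
        pop = [[random.randint(0, 1) for _ in range(n_bits)]
               for _ in range(pop_size)]
        for _ in range(generations):
            scores = [fitness(ind) for ind in pop]

            def tournament():
                a, b = random.randrange(pop_size), random.randrange(pop_size)
                return pop[a] if scores[a] >= scores[b] else pop[b]

            children = []
            while len(children) < pop_size:
                p1, p2 = tournament(), tournament()
                if random.random() < crossover_rate:      # one-point crossover
                    cut = random.randrange(1, n_bits)
                    p1, p2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
                children += [[1 - g if random.random() < mutation_rate else g
                              for g in child] for child in (p1, p2)]
            pop = children[:pop_size]
        return max(pop, key=fitness)

    # Toy objective: maximize the number of ones in the chromosome
    print(genetic_algorithm(sum))
    ```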

  15. The Applications of Genetic Algorithms in Medicine

    PubMed Central

    Ghaheri, Ali; Shoar, Saeed; Naderan, Mohammad; Hoseini, Sayed Shahabuddin

    2015-01-01

    A great wealth of information is hidden amid medical research data that in some cases cannot be easily analyzed, if at all, using classical statistical methods. Inspired by nature, metaheuristic algorithms have been developed to offer optimal or near-optimal solutions to complex data analysis and decision-making tasks in a reasonable time. Due to their powerful features, metaheuristic algorithms have frequently been used in other fields of science. In medicine, however, the use of these algorithms is not well known to physicians, who may well benefit from applying them to solve complex medical problems. Therefore, in this paper, we introduce the genetic algorithm and its applications in medicine. The use of the genetic algorithm has promising implications in various medical specialties including radiology, radiotherapy, oncology, pediatrics, cardiology, endocrinology, surgery, obstetrics and gynecology, pulmonology, infectious diseases, orthopedics, rehabilitation medicine, neurology, pharmacotherapy, and health care management. This review introduces the applications of the genetic algorithm in disease screening, diagnosis, treatment planning, pharmacovigilance, prognosis, and health care management, and enables physicians to envision possible applications of this metaheuristic method in their medical careers. PMID:26676060

  16. Log-linear model based behavior selection method for artificial fish swarm algorithm.

    PubMed

    Huang, Zhehuang; Chen, Yidong

    2015-01-01

    Artificial fish swarm algorithm (AFSA) is a population-based optimization technique inspired by the social behavior of fishes. In the past several years, AFSA has been successfully applied in many research and application areas. The behavior of the fishes has a crucial impact on the performance of AFSA, affecting, for example, its global exploration ability and convergence speed. How to construct and select the behaviors of the fishes is therefore an important task. To address these problems, an improved artificial fish swarm algorithm based on a log-linear model is proposed and implemented in this paper. This work makes three main contributions. First, we propose a new behavior selection algorithm based on a log-linear model, which enhances the decision-making ability of behavior selection. Second, an adaptive movement behavior based on adaptive weights is presented, which can adjust dynamically according to the diversity of the fishes. Finally, some new behaviors are defined and introduced into the artificial fish swarm algorithm for the first time to improve its global optimization capability. Experiments on high-dimensional function optimization showed that the improved algorithm has more powerful global exploration ability and reasonable convergence speed compared with the standard artificial fish swarm algorithm.
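
    In its simplest form, log-linear behavior selection amounts to a softmax over per-behavior scores. A sketch follows; the behavior names, scores, and temperature are purely hypothetical, since the paper's exact model features are not given in the abstract:

    ```python
    import math
    import random

    def select_behavior(scores, temperature=1.0):
        """Choose a behavior with probability proportional to exp(score / T),
        i.e. a log-linear (softmax) selection rule."""
        m = max(scores.values())  # subtract the max for numerical stability
        weights = {b: math.exp((s - m) / temperature) for b, s in scores.items()}
        total = sum(weights.values())
        r = random.random() * total
        for behavior, w in weights.items():   # roulette-wheel draw
            r -= w
            if r <= 0:
                return behavior
        return behavior  # fallback for floating-point round-off

    # Hypothetical per-behavior scores for one artificial fish
    scores = {"prey": 1.2, "swarm": 0.8, "follow": 1.5, "random_move": 0.1}
    print(select_behavior(scores, temperature=0.5))
    ```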

  17. Assessing neural activity related to decision-making through flexible odds ratio curves and their derivatives.

    PubMed

    Roca-Pardiñas, Javier; Cadarso-Suárez, Carmen; Pardo-Vazquez, Jose L; Leboran, Victor; Molenberghs, Geert; Faes, Christel; Acuña, Carlos

    2011-06-30

    It is well established that neural activity is stochastically modulated over time. Therefore, direct comparisons across experimental conditions and determination of change points or maximum firing rates are not straightforward. This study sought to compare temporal firing probability curves that may vary across groups defined by different experimental conditions. Odds-ratio (OR) curves were used as a measure of comparison, and the main goal was to provide a global test to detect significant differences between such curves through the study of their derivatives. An algorithm is proposed that enables ORs based on generalized additive models, including factor-by-curve-type interactions, to be flexibly estimated. Bootstrap methods were used to draw inferences from the derivative curves, and binning techniques were applied to speed up computation in the estimation and testing processes. A simulation study was conducted to assess the validity of these bootstrap-based tests. This methodology was applied to study premotor ventral cortex neural activity associated with decision-making. The proposed statistical procedures proved very useful in revealing the neural activity correlates of decision-making in a visual discrimination task. Copyright © 2011 John Wiley & Sons, Ltd.

  18. The solution of target assignment problem in command and control decision-making behaviour simulation

    NASA Astrophysics Data System (ADS)

    Li, Ni; Huai, Wenqing; Wang, Shaodan

    2017-08-01

    C2 (command and control) has been understood to be a critical military component in meeting an increasing demand for rapid information gathering and real-time decision-making in a dynamically changing battlefield environment. In this article, to improve a C2 behaviour model's reusability and interoperability, a behaviour modelling framework is proposed that specifies a C2 model's internal modules and a set of interoperability interfaces based on the C-BML (coalition battle management language). WTA (weapon target assignment) is a typical C2 autonomous decision-making behaviour modelling problem. Unlike most WTA problem descriptions, here sensors were considered available detection resources, and the relationship constraints between weapons and sensors were also taken into account, which brings the formulation much closer to actual applications. A modified differential evolution (MDE) algorithm was developed to solve this high-dimensional optimisation problem and obtained an optimal assignment plan with high efficiency. In a case study, we built a simulation system to validate the proposed C2 modelling framework and interoperability interface specification, and the new optimisation solution solved the WTA problem efficiently and successfully.
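
    The abstract does not detail the modifications in the MDE algorithm, so the sketch below shows only standard DE/rand/1/bin on a continuous toy objective; an actual WTA formulation is combinatorial and would need an encoding or repair step on top of this:

    ```python
    import random

    def differential_evolution(objective, bounds, pop_size=20, F=0.8, CR=0.9,
                               generations=200):
        """Minimal DE/rand/1/bin minimizer; `bounds` is a list of (low, high)."""
        dim = len(bounds)
        pop = [[random.uniform(lo, hi) for lo, hi in bounds]
               for _ in range(pop_size)]
        cost = [objective(x) for x in pop]
        for _ in range(generations):
            for i in range(pop_size):
                a, b, c = random.sample(
                    [j for j in range(pop_size) if j != i], 3)
                j_rand = random.randrange(dim)   # force at least one mutated gene
                trial = [pop[a][d] + F * (pop[b][d] - pop[c][d])
                         if (random.random() < CR or d == j_rand) else pop[i][d]
                         for d in range(dim)]
                trial = [min(max(v, lo), hi)     # clip to the search box
                         for v, (lo, hi) in zip(trial, bounds)]
                f = objective(trial)
                if f <= cost[i]:                 # greedy one-to-one selection
                    pop[i], cost[i] = trial, f
        best = min(range(pop_size), key=lambda i: cost[i])
        return pop[best], cost[best]

    # Toy continuous stand-in for an assignment cost (sphere function)
    print(differential_evolution(lambda x: sum(v * v for v in x), [(-5, 5)] * 4))
    ```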

  19. Knowledge discovery of drug data on the example of adverse reaction prediction

    PubMed Central

    2014-01-01

    Background Antibiotics are among the most widely prescribed drugs for children and the most likely to be associated with adverse reactions. Records of adverse reactions and allergies to antibiotics considerably affect prescription choices. We consider this a biomedical decision-making problem and explore hidden knowledge in survey results on data extracted from a big data pool of health records of children from the Health Center of Osijek, Eastern Croatia. Results We applied and evaluated a k-means algorithm on the dataset to generate clusters with similar features. Our results highlight that certain types of antibiotics form distinct clusters, an insight that can help clinicians make better-supported decisions. Conclusions Medical professionals can investigate the clusters that our study revealed, thus gaining useful knowledge and insight into these data for their clinical studies. PMID:25079450

  20. Decision-Making Strategy for Rectal Cancer Management Using Radiation Therapy for Elderly or Comorbid Patients.

    PubMed

    Wang, Shang-Jui; Hathout, Lara; Malhotra, Usha; Maloney-Patel, Nell; Kilic, Sarah; Poplin, Elizabeth; Jabbour, Salma K

    2018-03-15

    Rectal cancer predominantly affects patients older than 70 years, with peak incidence at age 80 to 85 years. However, the standard treatment paradigm for rectal cancer often cannot feasibly be applied to these patients owing to frailty or comorbid conditions. There is currently little information, and there are no treatment guidelines, to help direct therapy for patients who are elderly and/or have significant comorbidities, because most such patients are not included or specifically studied in clinical trials. More recently, various alternative treatment options have been brought to light that may potentially be utilized in this group of patients. This critical review examines the available literature on alternative therapies for rectal cancer and proposes a treatment algorithm to help guide clinicians in treatment decision making for elderly and comorbid patients. Copyright © 2017 Elsevier Inc. All rights reserved.

  1. Effective heart disease prediction system using data mining techniques.

    PubMed

    Singh, Poornima; Singh, Sanjay; Pandi-Jain, Gayatri S

    2018-01-01

    The health care industries collect huge amounts of data that contain hidden information useful for making effective decisions. Advanced data mining techniques are used to provide appropriate results and support effective decisions on these data. In this study, an effective heart disease prediction system (EHDPS) is developed using a neural network for predicting the risk level of heart disease. The system uses 15 medical parameters such as age, sex, blood pressure, cholesterol, and obesity for prediction. The EHDPS predicts the likelihood of patients getting heart disease. It enables significant knowledge, e.g., relationships between medical factors related to heart disease and patterns, to be established. We have employed the multilayer perceptron neural network with backpropagation as the training algorithm. The obtained results illustrate that the designed diagnostic system can effectively predict the risk level of heart disease.
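
    A minimal sketch of such a system, assuming scikit-learn's MLPClassifier (which trains a multilayer perceptron by backpropagation) and synthetic stand-in data for the 15 medical parameters; none of this reproduces the authors' actual dataset or network configuration:

    ```python
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier

    # Stand-in data: 15 numeric features mimicking the medical parameters
    X, y = make_classification(n_samples=500, n_features=15, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                              random_state=0)

    # Multilayer perceptron trained with backpropagation
    clf = MLPClassifier(hidden_layer_sizes=(20,), max_iter=1000, random_state=0)
    clf.fit(X_tr, y_tr)
    print("risk-level accuracy:", clf.score(X_te, y_te))
    ```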

  2. The incremental impact of cardiac MRI on clinical decision-making.

    PubMed

    Rajwani, Adil; Stewart, Michael J; Richardson, James D; Child, Nicholas M; Maredia, Neil

    2016-01-01

    Despite a significant expansion in the use of cardiac MRI (CMR), there is inadequate evaluation of its incremental impact on clinical decision-making over and above other well-established modalities. We sought to determine the incremental utility of CMR in routine practice. 629 consecutive CMR studies referred by 44 clinicians from 9 institutions were evaluated. Pre-defined algorithms were used to determine the incremental influence on diagnostic thinking, influence on clinical management and thus the overall clinical utility. Studies were also subdivided and evaluated according to the indication for CMR. CMR provided incremental information to the clinician in 85% of cases, with incremental influence on diagnostic thinking in 85% of cases and incremental impact on management in 42% of cases. The overall incremental utility of CMR exceeded 90% in 7 out of the 13 indications, whereas in settings such as the evaluation of unexplained ventricular arrhythmia or mild left ventricular systolic dysfunction, this was <50%. CMR was frequently able to inform and influence decision-making in routine clinical practice, even with analyses that accepted only incremental clinical information and excluded a redundant duplication of imaging. Significant variations in yield were noted according to the indication for CMR. These data support a wider integration of CMR services into cardiac imaging departments. These data are the first to objectively evaluate the incremental value of a UK CMR service in clinical decision-making. Such data are essential when seeking justification for a CMR service.

  3. Multi-objective evolutionary algorithms for fuzzy classification in survival prediction.

    PubMed

    Jiménez, Fernando; Sánchez, Gracia; Juárez, José M

    2014-03-01

    This paper presents a novel rule-based fuzzy classification methodology for survival/mortality prediction in severely burnt patients. Due to the ethical aspects involved in this medical scenario, physicians tend not to accept a computer-based evaluation unless they understand why and how such a recommendation is given. Therefore, any fuzzy classifier model must be both accurate and interpretable. The proposed methodology is a three-step process: (1) multi-objective constrained optimization on a patient data set, using Pareto-based elitist multi-objective evolutionary algorithms to maximize accuracy and minimize the complexity (number of rules) of classifiers, subject to interpretability constraints; this step produces a set of alternative (Pareto) classifiers; (2) linguistic labeling, which assigns a linguistic label to each fuzzy set of the classifiers; this step is essential to the interpretability of the classifiers; (3) decision making, whereby a classifier is chosen, if it is satisfactory, according to the preferences of the decision maker. If no classifier is satisfactory for the decision maker, the process starts again in step (1) with a different input parameter set. The performance of three multi-objective evolutionary algorithms, the niched pre-selection multi-objective algorithm, the elitist Pareto-based multi-objective evolutionary algorithm for diversity reinforcement (ENORA) and the non-dominated sorting genetic algorithm (NSGA-II), was tested using a patient data set from an intensive care burn unit and a standard data set from a machine learning repository. The results are compared using the hypervolume multi-objective metric. In addition, the results have been compared with other non-evolutionary techniques and validated with a multi-objective cross-validation technique. Our proposal improves the classification rate obtained by other non-evolutionary techniques (decision trees, artificial neural networks, Naive Bayes, and case-based reasoning), obtaining with ENORA a classification rate of 0.9298, specificity of 0.9385, and sensitivity of 0.9364, with 14.2 interpretable fuzzy rules on average. Our proposal improves the accuracy and interpretability of the classifiers compared with other non-evolutionary techniques. We also conclude that ENORA outperforms the niched pre-selection and NSGA-II algorithms. Moreover, given that our multi-objective evolutionary methodology is non-combinatorial and based on real-parameter optimization, the time cost is significantly reduced compared with other evolutionary approaches in the literature based on combinatorial optimization. Copyright © 2014 Elsevier B.V. All rights reserved.
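
    The Pareto-based selection in step (1) hinges on the dominance relation between candidate classifiers. A minimal sketch, with hypothetical (error rate, rule count) objective pairs standing in for the accuracy/complexity trade-off:

    ```python
    def dominates(a, b):
        """True if objective vector `a` Pareto-dominates `b` (minimization)."""
        return (all(x <= y for x, y in zip(a, b))
                and any(x < y for x, y in zip(a, b)))

    def pareto_front(points):
        """Return the non-dominated subset of objective vectors."""
        return [p for p in points
                if not any(dominates(q, p) for q in points if q is not p)]

    # Each tuple: (error rate, number of fuzzy rules) of a candidate classifier
    candidates = [(0.07, 14), (0.05, 30), (0.10, 8), (0.07, 20), (0.04, 45)]
    print(pareto_front(candidates))  # (0.07, 20) is dominated and drops out
    ```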

  4. Clinician time used for decision making: a best case workflow study using cardiovascular risk assessments and Ask Mayo Expert algorithmic care process models.

    PubMed

    North, Frederick; Fox, Samuel; Chaudhry, Rajeev

    2016-07-20

    Risk calculation is increasingly used in lipid management, congestive heart failure, and atrial fibrillation. The risk scores are then used for decisions about statin use, anticoagulation, and implantable defibrillator use. Calculating risks for patients and making decisions based on these risks is often done at the point of care and is an additional time burden for clinicians that can be decreased by automating the tasks and using clinical decision-making support. Using Morae Recorder software, we timed 30 healthcare providers tasked with calculating the overall risk of cardiovascular events, sudden death in heart failure, and thrombotic event risk in atrial fibrillation. Risk calculators used were the American College of Cardiology Atherosclerotic Cardiovascular Disease risk calculator (AHA-ASCVD risk), Seattle Heart Failure Model (SHFM risk), and CHA2DS2VASc. We also timed the 30 providers using Ask Mayo Expert care process models for lipid management, heart failure management, and atrial fibrillation management based on the calculated risk scores. We used the Mayo Clinic primary care panel to estimate time for calculating an entire panel risk. Mean provider times to complete the CHA2DS2VASc, AHA-ASCVD risk, and SHFM were 36, 45, and 171 s respectively. For decision making about atrial fibrillation, lipids, and heart failure, the mean times (including risk calculations) were 85, 110, and 347 s respectively. Even under best case circumstances, providers take a significant amount of time to complete risk assessments. For a complete panel of patients this can lead to hours of time required to make decisions about prescribing statins, use of anticoagulation, and medications for heart failure. Informatics solutions are needed to capture data in the medical record and serve up automatically calculated risk assessments to physicians and other providers at the point of care.
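
    Of the three calculators, CHA2DS2VASc is the simplest to automate because it is purely additive, which is part of why informatics solutions can remove this time burden. A sketch, assuming the standard published weights (2 points for age ≥75 and for prior stroke/TIA, 1 point for each of the other factors):

    ```python
    def cha2ds2_vasc(age, female, chf, hypertension, diabetes,
                     stroke_tia, vascular_disease):
        """CHA2DS2-VASc stroke-risk score for atrial fibrillation (0-9)."""
        score = 0
        score += 2 if age >= 75 else (1 if age >= 65 else 0)  # A2 / A
        score += 1 if female else 0                           # Sc (sex category)
        score += 1 if chf else 0                              # C
        score += 1 if hypertension else 0                     # H
        score += 1 if diabetes else 0                         # D
        score += 2 if stroke_tia else 0                       # S2
        score += 1 if vascular_disease else 0                 # V
        return score

    # A 72-year-old woman with hypertension and diabetes scores 4
    print(cha2ds2_vasc(age=72, female=True, chf=False, hypertension=True,
                       diabetes=True, stroke_tia=False, vascular_disease=False))
    ```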

  5. Hybrid analysis for indicating patients with breast cancer using temperature time series.

    PubMed

    Silva, Lincoln F; Santos, Alair Augusto S M D; Bravo, Renato S; Silva, Aristófanes C; Muchaluat-Saade, Débora C; Conci, Aura

    2016-07-01

    Breast cancer is the most common cancer among women worldwide. Diagnosis and treatment in early stages increase the chances of cure. The temperature of cancerous tissue is generally higher than that of the healthy surrounding tissue, making thermography an option to be considered in screening strategies for this cancer type. This paper proposes a hybrid methodology for analyzing dynamic infrared thermography in order to identify patients at risk of breast cancer, using both unsupervised and supervised machine learning techniques, which is what characterizes the methodology as hybrid. Dynamic infrared thermography monitors, or quantitatively measures, temperature changes on the examined surface after a thermal stress. During a dynamic infrared thermography examination, a sequence of breast thermograms is generated. In the proposed methodology, this sequence is processed and analyzed by several techniques. First, the region of the breasts is segmented and the thermograms of the sequence are registered. Then, temperature time series are built and the k-means algorithm is applied to these series using various values of k. The clustering formed by the k-means algorithm, for each value of k, is evaluated using clustering validation indices, generating values treated as features in the classification model construction step. A data mining tool was used to solve the combined algorithm selection and hyperparameter optimization (CASH) problem in the classification tasks. Besides the classification algorithm recommended by the data mining tool, classifiers based on Bayesian networks, neural networks, decision rules and decision trees were executed on the data set used for evaluation. Test results support that the proposed analysis methodology is able to identify patients with breast cancer. Among 39 tested classification algorithms, K-Star and Bayes Net presented 100% classification accuracy. Furthermore, among the Bayes Net, multi-layer perceptron, decision table and random forest classification algorithms, an average accuracy of 95.38% was obtained. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  6. Fast Image Texture Classification Using Decision Trees

    NASA Technical Reports Server (NTRS)

    Thompson, David R.

    2011-01-01

    Texture analysis would permit improved autonomous, onboard science data interpretation for adaptive navigation, sampling, and downlink decisions. These analyses would assist with terrain analysis and instrument placement in both macroscopic and microscopic image data products. Unfortunately, most state-of-the-art texture analysis demands computationally expensive convolutions of filters involving many floating-point operations. This makes them infeasible for radiation-hardened computers and spaceflight hardware. A new method approximates traditional texture classification of each image pixel with a fast decision-tree classifier. The classifier uses image features derived from simple filtering operations involving integer arithmetic. The texture analysis method is therefore amenable to implementation on FPGA (field-programmable gate array) hardware. Image features based on the "integral image" transform produce descriptive and efficient texture descriptors. Training the decision tree on a set of training data yields a classification scheme that produces reasonable approximations of optimal "texton" analysis at a fraction of the computational cost. A decision-tree learning algorithm employing the traditional k-means criterion of inter-cluster variance is used to learn tree structure from training data. The result is an efficient and accurate summary of surface morphology in images. This work is an evolutionary advance that unites several previous algorithms (k-means clustering, integral images, decision trees) and applies them to a new problem domain (morphology analysis for autonomous science during remote exploration). Advantages include order-of-magnitude improvements in runtime, feasibility for FPGA hardware, and significant improvements in texture classification accuracy.
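
    The integral image ("summed-area table") behind these features is simple to state: each entry stores the sum of all pixels above and to the left of it, after which any box sum costs at most four lookups. A minimal sketch (NumPy is used here for brevity; the idea itself needs only integer arithmetic):

    ```python
    import numpy as np

    def integral_image(img):
        """Summed-area table: I[r, c] = sum of img[:r+1, :c+1]."""
        return img.cumsum(axis=0).cumsum(axis=1)

    def box_sum(I, r0, c0, r1, c1):
        """Sum of img[r0:r1+1, c0:c1+1] from the integral image in O(1)."""
        total = I[r1, c1]
        if r0 > 0: total -= I[r0 - 1, c1]
        if c0 > 0: total -= I[r1, c0 - 1]
        if r0 > 0 and c0 > 0: total += I[r0 - 1, c0 - 1]
        return total

    img = np.arange(16, dtype=np.int64).reshape(4, 4)
    I = integral_image(img)
    assert box_sum(I, 1, 1, 2, 2) == img[1:3, 1:3].sum()
    print(box_sum(I, 1, 1, 2, 2))   # 5 + 6 + 9 + 10 = 30
    ```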

  7. Effectiveness of physical therapy for patients with neck pain: an individualized approach using a clinical decision-making algorithm.

    PubMed

    Wang, Wendy T J; Olson, Sharon L; Campbell, Anne H; Hanten, William P; Gleeson, Peggy B

    2003-03-01

    The purpose of this study was to determine the effectiveness of an individualized physical therapy intervention in treating neck pain based on a clinical reasoning algorithm. Treatment effectiveness was examined by assessing changes in impairment, physical performance, and disability in response to intervention. One treatment group of 30 patients with neck pain completed physical therapy treatment. The control group of convenience was formed by a cohort group of 27 subjects who also had neck pain but did not receive treatment for various reasons. There were no significant differences between groups in demographic data and the initial test scores of the outcome measures. A quasi-experimental, nonequivalent, pretest-posttest control group design was used. A physical therapist rendered an eclectic intervention to the treatment group based on a clinical decision-making algorithm. Treatment outcome measures included the following five dependent variables: cervical range of motion, numeric pain rating, timed weighted overhead endurance, the supine capital flexion endurance test, and the Patient Specific Functional Scale. Both the treatment and control groups completed the initial and follow-up examinations, with an average duration of 4 wk between tests. Five mixed analyses of variance with follow-up tests showed a significant difference for all outcome measures in the treatment group compared with the control group. After an average 4 wk of physical therapy intervention, patients in the treatment group demonstrated statistically significant increases of cervical range of motion, decrease of pain, increases of physical performance measures, and decreases in the level of disability. The control group showed no differences in all five outcome variables between the initial and follow-up test scores. This study delineated algorithm-based clinical reasoning strategies for evaluating and treating patients with cervical pain. The algorithm can help clinicians classify patients with cervical pain into clinical patterns and provides pattern-specific guidelines for physical therapy interventions. An organized and specific physical therapy program was effective in improving the status of patients with neck pain.

  8. Development of a computer-based clinical decision support tool for selecting appropriate rehabilitation interventions for injured workers.

    PubMed

    Gross, Douglas P; Zhang, Jing; Steenstra, Ivan; Barnsley, Susan; Haws, Calvin; Amell, Tyler; McIntosh, Greg; Cooper, Juliette; Zaiane, Osmar

    2013-12-01

    To develop a classification algorithm and accompanying computer-based clinical decision support tool to help categorize injured workers toward optimal rehabilitation interventions based on unique worker characteristics. Population-based historical cohort design. Data were extracted from a Canadian provincial workers' compensation database on all claimants undergoing work assessment between December 2009 and January 2011. Data were available on: (1) numerous personal, clinical, occupational, and social variables; (2) type of rehabilitation undertaken; and (3) outcomes following rehabilitation (receiving time loss benefits or undergoing repeat programs). Machine learning, concerned with the design of algorithms to discriminate between classes based on empirical data, was the foundation of our approach to build a classification system with multiple independent and dependent variables. The population included 8,611 unique claimants. Subjects were predominantly employed (85 %) males (64 %) with diagnoses of sprain/strain (44 %). Baseline clinician classification accuracy was high (ROC = 0.86) for selecting programs that lead to successful return-to-work. Classification performance for machine learning techniques outperformed the clinician baseline classification (ROC = 0.94). The final classifiers were multifactorial and included the variables: injury duration, occupation, job attachment status, work status, modified work availability, pain intensity rating, self-rated occupational disability, and 9 items from the SF-36 Health Survey. The use of machine learning classification techniques appears to have resulted in classification performance better than clinician decision-making. The final algorithm has been integrated into a computer-based clinical decision support tool that requires additional validation in a clinical sample.

  9. Evolutionary and Neural Computing Based Decision Support System for Disease Diagnosis from Clinical Data Sets in Medical Practice.

    PubMed

    Sudha, M

    2017-09-27

    As a recent trend, various computational intelligence and machine learning approaches have been used to mine inferences hidden in large clinical databases to assist the clinician in strategic decision making. In any target data, irrelevant information may be detrimental, confusing the mining algorithm and degrading the prediction outcome. To address this issue, this study attempts to identify an intelligent approach to assist the disease diagnostic procedure using an optimal set of attributes instead of all attributes present in the clinical data set. In the proposed Application Specific Intelligent Computing (ASIC) decision support system, a rough-set-based genetic algorithm is employed in the pre-processing phase and a back propagation neural network is applied in the training and testing phase. ASIC has two phases: the first phase handles outliers, noisy data, and missing values to obtain qualitative target data, and generates appropriate attribute reduct sets from the input data using a rough-computing-based genetic algorithm centred on a relative fitness function measure. The succeeding phase involves both training and testing of a back propagation neural network classifier on the selected reducts. The model performance is evaluated against widely adopted existing classifiers. The proposed ASIC system for clinical decision support has been tested with breast cancer, fertility diagnosis and heart disease data sets from the University of California at Irvine (UCI) machine learning repository. The proposed system outperformed the existing approaches, attaining accuracy rates of 95.33%, 97.61%, and 93.04% for breast cancer, fertility and heart disease diagnosis, respectively.

  10. Using an improved association rules mining optimization algorithm in web-based mobile-learning system

    NASA Astrophysics Data System (ADS)

    Huang, Yin; Chen, Jianhua; Xiong, Shaojun

    2009-07-01

    Mobile-Learning (M-learning) gives many learners the advantages of both traditional learning and E-learning. Currently, Web-based Mobile-Learning Systems have created many new ways of learning and defined new relationships between educators and learners. Association rule mining is one of the most important fields in data mining and knowledge discovery in databases. Rule explosion is a serious problem that causes great concern, as conventional mining algorithms often produce too many rules for decision makers to digest. Since a Web-based Mobile-Learning System collects vast amounts of student profile data, data mining and knowledge discovery techniques can be applied to find interesting relationships between attributes of learners, assessments, the solution strategies adopted by learners, and so on. Therefore, this paper focuses on a new data-mining algorithm that combines the advantages of the genetic algorithm and the simulated annealing algorithm, called ARGSA (Association Rules based on an improved Genetic Simulated Annealing Algorithm), to mine association rules. The paper first takes advantage of a parallel genetic algorithm and a simulated annealing algorithm designed specifically for discovering association rules. Moreover, analysis and experiments are presented to show that the proposed method is superior to the Apriori algorithm in this Mobile-Learning system.
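
    The abstract does not specify how ARGSA encodes candidate rules, so the sketch below shows only the simulated annealing half of the hybrid: the geometric cooling schedule and the Metropolis acceptance rule, applied to a toy 1-D objective rather than a rule-mining fitness:

    ```python
    import math
    import random

    def simulated_annealing(objective, neighbor, x0, t0=1.0, cooling=0.995,
                            steps=5000):
        """Minimal simulated annealing minimizer with geometric cooling."""
        x, fx, t = x0, objective(x0), t0
        best, fbest = x, fx
        for _ in range(steps):
            y = neighbor(x)
            fy = objective(y)
            # Always accept improvements; accept worse moves with
            # probability exp(-(fy - fx) / t), which shrinks as t cools.
            if fy <= fx or random.random() < math.exp((fx - fy) / t):
                x, fx = y, fy
                if fx < fbest:
                    best, fbest = x, fx
            t *= cooling
        return best, fbest

    # Toy: minimize a 1-D function by jittering the current solution
    print(simulated_annealing(lambda v: (v - 3) ** 2,
                              lambda v: v + random.gauss(0, 0.5), x0=0.0))
    ```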

  11. A prediction algorithm for first onset of major depression in the general population: development and validation.

    PubMed

    Wang, JianLi; Sareen, Jitender; Patten, Scott; Bolton, James; Schmitz, Norbert; Birney, Arden

    2014-05-01

    Prediction algorithms are useful for making clinical decisions and for population health planning. However, such prediction algorithms for first onset of major depression do not exist. The objective of this study was to develop and validate a prediction algorithm for first onset of major depression in the general population. Longitudinal study design with approximately 3-year follow-up. The study was based on data from a nationally representative sample of the US general population. A total of 28 059 individuals who participated in Waves 1 and 2 of the US National Epidemiologic Survey on Alcohol and Related Conditions and who had not had major depression at Wave 1 were included. The prediction algorithm was developed using logistic regression modelling in 21 813 participants from three census regions. The algorithm was validated in participants from the 4th census region (n=6246). Major depression occurring since Wave 1 of the National Epidemiologic Survey on Alcohol and Related Conditions was assessed by the Alcohol Use Disorder and Associated Disabilities Interview Schedule (DSM-IV version). A prediction algorithm containing 17 unique risk factors was developed. The algorithm had good discriminative power (C statistic=0.7538, 95% CI 0.7378 to 0.7699) and excellent calibration (F-adjusted test=1.00, p=0.448) with the weighted data. In the validation sample, the algorithm had a C statistic of 0.7259 and excellent calibration (Hosmer-Lemeshow χ(2)=3.41, p=0.906). The developed prediction algorithm has good discrimination and calibration capacity. It can be used by clinicians, mental health policy-makers and service planners, and the general public to predict future risk of major depression. The application of the algorithm may lead to increased personalisation of treatment, better clinical decisions and more optimal mental health service planning.
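
    The development/validation pattern described above can be sketched with logistic regression and the C statistic (equivalently, the area under the ROC curve). The data here are synthetic stand-ins, not the NESARC sample, and the random split merely mimics the by-region validation:

    ```python
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    # Stand-in cohort: 17 baseline risk factors, imbalanced binary outcome
    X, y = make_classification(n_samples=5000, n_features=17, weights=[0.9],
                               random_state=1)
    # "Development" sample vs. a held-out "validation" sample
    X_dev, X_val, y_dev, y_val = train_test_split(X, y, test_size=0.25,
                                                  random_state=1)
    model = LogisticRegression(max_iter=1000).fit(X_dev, y_dev)
    risk = model.predict_proba(X_val)[:, 1]   # predicted onset probabilities
    print("validation C statistic:", roc_auc_score(y_val, risk))
    ```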

  12. Trade-off decisions in distribution utility management

    NASA Astrophysics Data System (ADS)

    Slavickas, Rimas Anthony

    As a result of the "unbundling" of traditional monopolistic electricity generation and transmission enterprises into a free-market economy, power distribution utilities are faced with very difficult decisions pertaining to electricity supply options and quality of service to the customers. The management of distribution utilities has become increasingly complex, versatile, and dynamic to the extent that conventional, non-automated management tools are almost useless and obsolete. This thesis presents a novel and unified approach to managing electricity supply options and quality of service to customers. The technique formulates the problem in terms of variables, parameters, and constraints. An advanced Mixed Integer Programming (MIP) optimization formulation is developed together with novel, logical, decision-making algorithms. These tools enable the utility management to optimize various cost components and assess their time-trend impacts, taking into account the intangible issues such as customer perception, customer expectation, social pressures, and public response to service deterioration. The above concepts are further generalized and a Logical Proportion Analysis (LPA) methodology and associated software have been developed. Solutions using numbers are replaced with solutions using words (character strings) which more closely emulate the human decision-making process and advance the art of decision-making in the power utility environment. Using practical distribution utility operation data and customer surveys, the developments outlined in this thesis are successfully applied to several important utility management problems. These involve the evaluation of alternative electricity supply options, the impact of rate structures on utility business, and the decision of whether to continue to purchase from a main grid or generate locally (partially or totally) by building Non-Utility Generation (NUG).

  13. Estimation of State Transition Probabilities: A Neural Network Model

    NASA Astrophysics Data System (ADS)

    Saito, Hiroshi; Takiyama, Ken; Okada, Masato

    2015-12-01

    Humans and animals can predict future states on the basis of acquired knowledge. This prediction of the state transition is important for choosing the best action, and the prediction is only possible if the state transition probability has already been learned. However, how our brains learn the state transition probability is unknown. Here, we propose a simple algorithm for estimating the state transition probability by utilizing the state prediction error. We analytically and numerically confirmed that our algorithm is able to learn the probability completely with an appropriate learning rate. Furthermore, our learning rule reproduced experimentally reported psychometric functions and neural activities in the lateral intraparietal area in a decision-making task. Thus, our algorithm might describe the manner in which our brains learn state transition probabilities and predict future states.
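
    One plausible reading of such a prediction-error rule is a delta-rule update that nudges each row of the transition matrix toward the observed one-hot outcome; the learning rate and the toy two-state chain below are assumptions for illustration, not the authors' exact formulation:

    ```python
    import random

    def learn_transition_probs(transitions, n_states, lr=0.1):
        """Delta-rule estimate of P(s'|s) driven by the state prediction error.

        After observing s -> s_next, the predicted probability of every
        successor is moved toward the one-hot outcome; because the errors
        sum to zero, each row stays normalized automatically.
        """
        P = [[1.0 / n_states] * n_states for _ in range(n_states)]
        for s, s_next in transitions:
            for k in range(n_states):
                target = 1.0 if k == s_next else 0.0
                P[s][k] += lr * (target - P[s][k])   # prediction-error update
        return P

    # Toy chain: from state 0, go to state 1 with prob 0.8, else stay
    data = [(0, 1) if random.random() < 0.8 else (0, 0) for _ in range(2000)]
    P = learn_transition_probs(data, n_states=2)
    print([round(p, 2) for p in P[0]])   # approaches [0.2, 0.8]
    ```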

  14. Validation workflow for a clinical Bayesian network model in multidisciplinary decision making in head and neck oncology treatment.

    PubMed

    Cypko, Mario A; Stoehr, Matthaeus; Kozniewski, Marcin; Druzdzel, Marek J; Dietz, Andreas; Berliner, Leonard; Lemke, Heinz U

    2017-11-01

    Oncological treatment is becoming increasingly complex, and therefore decision making in multidisciplinary teams is becoming the key activity in clinical pathways. The increased complexity is related to the number and variability of possible treatment decisions that may be relevant to a patient. In this paper, we describe the validation of a multidisciplinary cancer treatment decision model in the clinical domain of head and neck oncology. Probabilistic graphical models and corresponding inference algorithms, in the form of Bayesian networks (BNs), can support complex decision-making processes by providing mathematically reproducible and transparent advice. The quality of BN-based advice depends on the quality of the model. Therefore, it is vital to validate the model before it is applied in practice. For an example BN subnetwork of laryngeal cancer with 303 variables, we evaluated 66 patient records. To validate the model on this dataset, a validation workflow was applied in combination with quantitative and qualitative analyses. In the subsequent analyses, we observed four sources of imprecise predictions: incorrect data, incomplete patient data, outvoting of relevant observations, and an incorrect model. The four problems were then solved by modifying the data and the model. The validation effort required is related to the model complexity. For simpler models, the validation workflow is the same, although it may require fewer validation methods. The success of validation is related to how well founded the model's knowledge base is. The remaining laryngeal cancer model may disclose additional sources of imprecise predictions.

  15. Operational modelling: the mechanisms influencing TB diagnostic yield in an Xpert® MTB/RIF-based algorithm.

    PubMed

    Dunbar, R; Naidoo, P; Beyers, N; Langley, I

    2017-04-01

    Cape Town, South Africa. To compare the diagnostic yield for smear/culture and Xpert® MTB/RIF algorithms and to investigate the mechanisms influencing tuberculosis (TB) yield. We developed and validated an operational model of the TB diagnostic process, first with the smear/culture algorithm and then with the Xpert algorithm. We modelled scenarios by varying TB prevalence, adherence to diagnostic algorithms and human immunodeficiency virus (HIV) status. This enabled direct comparisons of diagnostic yield in the two algorithms to be made. Routine data showed that diagnostic yield had decreased over the period of the Xpert algorithm roll-out compared to the yield when the smear/culture algorithm was in place. However, modelling yield under identical conditions indicated a 13.3% increase in diagnostic yield from the Xpert algorithm compared to smear/culture. The model demonstrated that the extensive use of culture in the smear/culture algorithm and the decline in TB prevalence are the main factors contributing to not finding an increase in diagnostic yield in the routine data. We demonstrate the benefits of an operational model to determine the effect of scale-up of a new diagnostic algorithm, and recommend that policy makers use operational modelling to make appropriate decisions before new diagnostic algorithms are scaled up.

  16. Performance evaluation of the machine learning algorithms used in inference mechanism of a medical decision support system.

    PubMed

    Bal, Mert; Amasyali, M Fatih; Sever, Hayri; Kose, Guven; Demirhan, Ayse

    2014-01-01

    Decision support systems increasingly assist decision-making processes in cases of uncertainty and lack of information, and they are widely used in various fields such as engineering, finance, and medicine. Medical decision support systems help healthcare personnel select the optimal method for treating patients. Decision support systems are intelligent software systems that support decision makers in their decisions. The design of decision support systems consists of four main components: the inference mechanism, the knowledge base, the explanation module, and the active memory. The inference mechanism constitutes the basis of a decision support system. Various methods can be used in these mechanisms, including decision trees, artificial neural networks, statistical methods, and rule-based methods. In decision support systems, those methods can be used separately or combined into hybrid systems. In this study, synthetic data sets with 10, 100, 1000, and 2000 records were produced to reflect the probabilities of the ALARM network. The accuracy of 11 machine learning methods for the inference mechanism of a medical decision support system is compared on these data sets.

  17. A True-Color Sensor and Suitable Evaluation Algorithm for Plant Recognition

    PubMed Central

    Schmittmann, Oliver; Schulze Lammers, Peter

    2017-01-01

    Plant-specific herbicide application requires sensor systems for plant recognition and differentiation. A literature review reveals a lack of sensor systems capable of recognizing small weeds in early stages of development (the two- or four-leaf stage) as well as crop plants, of making spraying decisions in real time, and that are, in addition, inexpensive and ready for practical use in sprayers. The system described in this work is based on freely cascadable and programmable true-color sensors for real-time recognition and identification of individual weed and crop plants. This type of sensor is suitable for municipal areas and for farmland with and without crops, enabling the site-specific application of herbicides. Initially, databases with the reflection properties of plants and of natural and artificial backgrounds were created. Crop and weed plants are recognized by mathematical algorithms and decision models based on these data, which include the characteristic color spectrum as well as the reflectance characteristics of unvegetated areas and areas with organic material. The CIE-Lab color space was chosen for color matching because it contains information not only about coloration (a- and b-channels) but also about luminance (L-channel), thus increasing accuracy. Four different decision-making algorithms based on different parameters are explained: (i) color similarity (ΔE); (ii) color similarity split into ΔL, Δa and Δb; (iii) a virtual channel 'd'; and (iv) the statistical distribution of the reflection differences between backgrounds and plants. The detection success of the recognition system is then described, and the minimum weed/plant coverage of the measuring spot was calculated by a mathematical model. Plants covering 1–5% of the spot can be recognized, and weeds in the two-leaf stage can be identified with a measuring spot size of 5 cm. By choosing an appropriate decision model in advance, the detection quality can be increased; depending on the characteristics of the background, different models are suitable. Finally, the results of field trials on municipal areas (with models of plants), winter wheat fields (with artificial plants) and grassland (with dock) are shown. In each experimental variant, objects and weeds could be recognized. PMID:28786922
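
    The ΔE decision rule in (i) is the CIE76 Euclidean distance in Lab space. A minimal sketch of a nearest-reference pixel classifier built on it; the reference colors and threshold are hypothetical, not values from the paper:

    ```python
    def delta_e(lab1, lab2):
        """CIE76 color difference between two (L, a, b) triples."""
        dL, da, db = (x - y for x, y in zip(lab1, lab2))
        return (dL * dL + da * da + db * db) ** 0.5

    def classify_pixel(lab, references, threshold=12.0):
        """Label a pixel with the nearest reference class if close enough."""
        label, ref = min(references.items(), key=lambda kv: delta_e(lab, kv[1]))
        return label if delta_e(lab, ref) <= threshold else "unknown"

    # Hypothetical reference colors measured from plants and backgrounds
    references = {"crop": (45.0, -35.0, 30.0),
                  "weed": (55.0, -40.0, 45.0),
                  "soil": (35.0, 10.0, 20.0)}
    print(classify_pixel((52.0, -38.0, 42.0), references))   # -> "weed"
    ```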

  18. A True-Color Sensor and Suitable Evaluation Algorithm for Plant Recognition.

    PubMed

    Schmittmann, Oliver; Schulze Lammers, Peter

    2017-08-08

    Plant-specific herbicide application requires sensor systems for plant recognition and differentiation. A literature review reveals a lack of sensor systems that are capable of recognizing small weeds in early stages of development (in the two- or four-leaf stage) and crop plants, of making spraying decisions in real time and that are, in addition, inexpensive and ready for practical use in sprayers. The system described in this work is based on freely cascadable and programmable true-color sensors for real-time recognition and identification of individual weed and crop plants. The application of this type of sensor is suitable for municipal areas and farmland with and without crops to perform the site-specific application of herbicides. Initially, databases with reflection properties of plants and of natural and artificial backgrounds were created. Crop and weed plants should be recognized by the use of mathematical algorithms and decision models based on these data. They include the characteristic color spectrum, as well as the reflectance characteristics of unvegetated areas and areas with organic material. The CIE-Lab color-space was chosen for color matching because it contains information not only about coloration (a- and b-channel), but also about luminance (L-channel), thus increasing accuracy. Four different decision making algorithms based on different parameters are explained: (i) color similarity (ΔE); (ii) color similarity split in ΔL, Δa and Δb; (iii) a virtual channel 'd' and (iv) statistical distribution of the differences of reflection backgrounds and plants. Afterwards, the detection success of the recognition system is described. Furthermore, the minimum weed/plant coverage of the measuring spot was calculated by a mathematical model. Plants with a size of 1-5% of the spot can be recognized, and weeds in the two-leaf stage can be identified with a measuring spot size of 5 cm. By choosing a suitable decision model beforehand, the detection quality can be increased. Depending on the characteristics of the background, different models are suitable. Finally, the results of field trials on municipal areas (with models of plants), winter wheat fields (with artificial plants) and grassland (with dock) are shown. In each experimental variant, objects and weeds could be recognized.
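
    A minimal sketch of the color-matching arithmetic behind decision models (i) and (ii); the Lab values below are invented for illustration, whereas the study derives its references from measured reflection databases.

      import math

      def delta_e(lab1, lab2):
          """CIE76 color difference: Euclidean distance in Lab space."""
          return math.dist(lab1, lab2)

      def channel_differences(lab1, lab2):
          """Per-channel differences (dL, da, db), as in decision model (ii)."""
          return tuple(c1 - c2 for c1, c2 in zip(lab1, lab2))

      plant = (46.0, -18.0, 22.0)  # hypothetical leaf color in CIE-Lab
      soil = (38.0, 6.0, 14.0)     # hypothetical background color
      measured = (44.0, -15.0, 20.0)

      # Classify the measured spot by its nearest reference color.
      label = min((("plant", plant), ("soil", soil)),
                  key=lambda ref: delta_e(measured, ref[1]))[0]
      print(label, delta_e(measured, plant), channel_differences(measured, plant))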

  19. Using Bayes factors for multi-factor, biometric authentication

    NASA Astrophysics Data System (ADS)

    Giffin, A.; Skufca, J. D.; Lao, P. A.

    2015-01-01

    Multi-factor/multi-modal authentication systems are becoming the de facto industry standard. Traditional methods typically use rates that are point estimates and lack a good measure of uncertainty. Additionally, multiple factors are typically fused together in an ad hoc manner. To be consistent, as well as to establish and make proper use of uncertainties, we use a Bayesian method that will update our estimates and uncertainties as new information presents itself. Our algorithm compares competing classes (such as genuine vs. imposter) using Bayes Factors (BF). The importance of this approach is that we not only accept or reject one model (class), but compare it to others to make a decision. We show using a Receiver Operating Characteristic (ROC) curve that using BF for determining class will always perform at least as well as the traditional combining of factors, such as a voting algorithm. As the uncertainty decreases, the BF result continues to exceed the traditional methods' results.
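
    A toy sketch of the core decision step, assuming scipy; the Gaussian score models below are invented, whereas the paper estimates class models from data and propagates their uncertainty.

      from scipy.stats import norm

      def bayes_factor(score, genuine=(0.8, 0.10), imposter=(0.4, 0.15)):
          """Likelihood ratio of a match score under two competing classes,
          each modeled here as a Gaussian (mean, std)."""
          return norm.pdf(score, *genuine) / norm.pdf(score, *imposter)

      # Independent factors (e.g., face + fingerprint) multiply their BFs.
      combined = bayes_factor(0.70) * bayes_factor(0.65)
      print("accept" if combined > 1 else "reject", combined)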

  20. New algorithms for optimal reduction of technical risks

    NASA Astrophysics Data System (ADS)

    Todinov, M. T.

    2013-06-01

    The article features exact algorithms for reduction of technical risk by (1) optimal allocation of resources in the case where the total potential loss from several sources of risk is a sum of the potential losses from the individual sources; (2) optimal allocation of resources to achieve a maximum reduction of system failure; and (3) making an optimal choice among competing risky prospects. The article demonstrates that the number of activities in a risky prospect is a key consideration in selecting the risky prospect. As a result, the maximum expected profit criterion, widely used for making risk decisions, is fundamentally flawed, because it does not consider the impact of the number of risk-reward activities in the risky prospects. A popular view, that if a single risk-reward bet with positive expected profit is unacceptable then a sequence of such identical risk-reward bets is also unacceptable, has been analysed and proved incorrect.
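
    The last claim can be checked with a small Monte Carlo sketch; the stakes and win probability below are invented, chosen so the expected profit per bet is positive even though a single bet loses more often than not.

      import random

      def prob_overall_loss(n_bets, p_win=0.4, win=100.0, lose=-50.0,
                            trials=100_000):
          """Estimate the probability that a sequence of identical independent
          positive-EV bets ends with a net loss."""
          losses = 0
          for _ in range(trials):
              total = sum(win if random.random() < p_win else lose
                          for _ in range(n_bets))
              losses += total < 0
          return losses / trials

      for n in (1, 5, 20, 50):
          print(f"{n:2d} bets: P(net loss) = {prob_overall_loss(n):.3f}")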

  1. Enabling joined-up decision making with geotemporal information

    NASA Astrophysics Data System (ADS)

    Smith, M. J.; Ahmed, S. E.; Purves, D. W.; Emmott, S.; Joppa, L. N.; Caldararu, S.; Visconti, P.; Newbold, T.; Formica, A. F.

    2015-12-01

    While the use of geospatial data to assist in decision making is becoming increasingly common, the use of geotemporal information (information that can be indexed by both geographical space and time) is much rarer. I will describe our scientific research and software development efforts intended to advance the availability and use of geotemporal information in general. I will show two recent examples of "stacking" geotemporal information to support land use decision making in the Brazilian Amazon and Kenya, involving data-constrained predictive models and empirically derived datasets of road development, deforestation, carbon, agricultural yields, water purification and poverty alleviation services, and will show how we use trade-off analyses and constraint reasoning algorithms to explore the costs and benefits of different decisions. For the Brazilian Amazon we explore tradeoffs involved in different deforestation scenarios, while for Kenya we explore the impacts of conserving forest to support international carbon conservation initiatives (REDD+). I will also illustrate the cloud-based software tools we have developed to enable anyone to access geotemporal information, gridded (e.g. climate) or non-gridded (e.g. protected areas), for the past, present or future and incorporate such information into their analyses (e.g. www.fetchclimate.org), including how we train new predictive models to such data using Bayesian techniques: on this latter point I will show how we combine satellite and ground measured data with predictive models to forecast how crops might respond to climate change.

  2. Value of information analysis for groundwater quality monitoring network design Case study: Eocene Aquifer, Palestine

    NASA Astrophysics Data System (ADS)

    Khader, A.; McKee, M.

    2010-12-01

    Value of information (VOI) analysis evaluates the benefit of collecting additional information to reduce or eliminate uncertainty in a specific decision-making context. It makes explicit any expected potential losses from errors in decision making due to uncertainty and identifies the “best” information collection strategy as one that leads to the greatest expected net benefit to the decision-maker. This study investigates the willingness to pay for groundwater quality monitoring in the Eocene Aquifer, Palestine, which is an unconfined aquifer located in the northern part of the West Bank. The aquifer is being used by 128,000 Palestinians to fulfill domestic and agricultural demands. The study takes into account the consequences of pollution and the options the decision maker might face. Since nitrate is the major pollutant in the aquifer, the consequences of nitrate pollution were analyzed, which mainly consist of the possibility of methemoglobinemia (blue baby syndrome). In this case, the value of monitoring was compared to the costs of treating methemoglobinemia or the costs of other options like water treatment, using bottled water or importing water from outside the aquifer. Finally, an optimal monitoring network that takes into account the uncertainties in recharge (climate), aquifer properties (hydraulic conductivity), pollutant chemical reactions (decay factor), and the value of monitoring is designed using a sparse Bayesian modeling algorithm called the relevance vector machine.
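
    A toy value-of-information calculation in this spirit; all probabilities and costs below are invented, and the study's analysis is far richer, but the structure, prior expected loss minus expected posterior loss, is the same.

      # States: aquifer polluted ("P") or clean ("C"); actions: treat or not.
      prior = {"P": 0.3, "C": 0.7}
      loss = {("treat", "P"): 10, ("treat", "C"): 10,  # treatment cost
              ("none", "P"): 50, ("none", "C"): 0}     # health cost if polluted

      def expected_loss(belief):
          """Loss of the best action under a belief over states."""
          return min(sum(belief[s] * loss[(a, s)] for s in belief)
                     for a in ("treat", "none"))

      # A monitoring signal with 90% sensitivity and specificity.
      like = {("pos", "P"): 0.9, ("pos", "C"): 0.1,
              ("neg", "P"): 0.1, ("neg", "C"): 0.9}

      loss_without = expected_loss(prior)
      loss_with = 0.0
      for sig in ("pos", "neg"):
          p_sig = sum(like[(sig, s)] * prior[s] for s in prior)
          posterior = {s: like[(sig, s)] * prior[s] / p_sig for s in prior}
          loss_with += p_sig * expected_loss(posterior)

      print("value of information =", loss_without - loss_with)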

  3. Accepting or declining non-invasive ventilation or gastrostomy in amyotrophic lateral sclerosis: patients' perspectives.

    PubMed

    Greenaway, L P; Martin, N H; Lawrence, V; Janssen, A; Al-Chalabi, A; Leigh, P N; Goldstein, L H

    2015-01-01

    The objective was to identify factors associated with decisions made by patients with amyotrophic lateral sclerosis (ALS) to accept or decline non-invasive ventilation (NIV) and/or gastrostomy in a prospective population-based study. Twenty-one people with ALS, recruited from the South-East ALS Register who made an intervention decision during the study timeframe underwent a face-to-face in-depth interview, with or without their informal caregiver present. Sixteen had accepted an intervention (11 accepted gastrostomy, four accepted NIV and one accepted both interventions). Five patients had declined gastrostomy. Thematic analysis revealed three main themes: (1) patient-centric factors (including perceptions of control, acceptance and need, and aspects of fear); (2) external factors (including roles played by healthcare professionals, family, and information provision); and (3) the concept of time (including living in the moment and the notion of 'right thing, right time'). Many aspects of these factors were inter-related. Decision-making processes for the patients were found to be complex and multifaceted and reinforce arguments for individualised (rather than 'algorithm-based') approaches to facilitating decision-making by people with ALS who require palliative interventions.

  4. Ensemble Classifiers for Predicting HIV-1 Resistance from Three Rule-Based Genotypic Resistance Interpretation Systems.

    PubMed

    Raposo, Letícia M; Nobre, Flavio F

    2017-08-30

    Resistance to antiretrovirals (ARVs) is a major problem faced by HIV-infected individuals. Different rule-based algorithms were developed to infer HIV-1 susceptibility to antiretrovirals from genotypic data. However, there is discordance between them, resulting in difficulties for clinical decisions about which treatment to use. Here, we developed ensemble classifiers integrating three interpretation algorithms: Agence Nationale de Recherche sur le SIDA (ANRS), Rega, and the genotypic resistance interpretation system from the Stanford HIV Drug Resistance Database (HIVdb). Three approaches were applied to develop a classifier with a single resistance profile: stacked generalization, a simple plurality vote scheme, and the selection of the interpretation system with the best performance. The strategies were compared using Friedman's test, and the performance of the classifiers was evaluated using the F-measure, sensitivity and specificity values. We found that the three strategies had similar performances for the selected antiretrovirals. For some cases, the stacking technique with naïve Bayes as the learning algorithm showed a statistically superior F-measure. This study demonstrates that ensemble classifiers can be an alternative tool for clinical decision-making since they provide a single resistance profile from the most commonly used resistance interpretation systems.
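
    A minimal sketch of the simplest of the three strategies, the plurality vote; the per-system calls below are invented placeholders, since ANRS, Rega and HIVdb are external rule bases not reimplemented here.

      from collections import Counter

      def plurality_vote(calls):
          """Return the most common resistance call among the systems
          ('R' resistant, 'I' intermediate, 'S' susceptible)."""
          return Counter(calls).most_common(1)[0][0]

      per_system = {"ANRS": "R", "Rega": "R", "HIVdb": "I"}  # hypothetical calls
      print(plurality_vote(per_system.values()))             # -> 'R'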

  5. What we talk about when we talk about depression: doctor-patient conversations and treatment decision outcomes

    PubMed Central

    Karasz, Alison; Dowrick, Christopher; Byng, Richard; Buszewicz, Marta; Ferri, Lucia; Hartman, Tim C Olde; van Dulmen, Sandra; van Weel-Baumgarten, Evelyn; Reeve, Joanne

    2011-01-01

    Background Efforts to address depression in primary care settings have focused on the introduction of care guidelines emphasising pharmacological treatment. To date, physician adherence remains low. Little is known of the types of information exchange or other negotiations in doctor-patient consultations about depression that influence physician decision making about treatment. Aim The study sought to understand conversational influences on physician decision making about treatment for depression. Design A secondary analysis of consultation data collected in other studies. Using a maximum variation sampling strategy, 30 transcripts of primary care consultations about distress or depression were selected from datasets collected in three countries. Transcripts were analysed to discover factors associated with prescription of medication. Method The study employed two qualitative analysis strategies: a micro-analysis approach, which examines how conversation partners shape the dialogue towards pragmatic goals; and a narrative analysis approach of the problem presentation. Results Patients communicated their conceptual representations of distress at the outset of each consultation. Concepts of depression were communicated through the narrative form of the problem presentation. Three types of narratives were identified: those emphasising symptoms, those emphasising life situations, and mixed narratives. Physician decision making regarding medication treatment was strongly associated with the form of the patient’s narrative. Physicians made few efforts to persuade patients to accept biomedical attributions or treatments. Conclusion Results of the study provide insight into why adherence to depression guidelines remains low. Data indicate that patient agendas drive the ‘action’ in consultations about depression. Physicians appear to be guided by common-sense decision-making algorithms emphasising patients’ views and preferences. PMID:22520683

  6. Research on Attribute Reduction in Hoisting Motor State Recognition of Quayside Container Crane

    NASA Astrophysics Data System (ADS)

    Li, F.; Tang, G.; Hu, X.

    2017-07-01

    In view of the excessive number of attributes in hoisting motor state recognition for quayside container cranes, an attribute reduction method based on the discernibility matrix is introduced to reduce the lifting motor state information table. An attribute reduction method combining rough sets and a genetic algorithm is then proposed to handle the hoisting motor state decision table. Redundant attributes are deleted while leaving the information system's decision-making ability unchanged, which reduces the complexity and computational cost of the hoisting motor recognition process and makes fast state recognition possible.
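
    A minimal sketch of the discernibility-matrix step; the toy decision table is invented, not the crane motor data. For each pair of objects with different decisions, the condition attributes that distinguish them are recorded; attributes that appear alone are indispensable (the core).

      from itertools import combinations

      attrs = ("vibration", "temperature", "current")
      table = [  # (condition attribute values, decision)
          ((1, 0, 1), "normal"),
          ((1, 1, 1), "fault"),
          ((0, 0, 1), "normal"),
          ((1, 1, 0), "fault"),
      ]

      matrix = []
      for (x, dx), (y, dy) in combinations(table, 2):
          if dx != dy:  # only pairs with different decisions matter
              matrix.append({a for a, xv, yv in zip(attrs, x, y) if xv != yv})

      core = {next(iter(entry)) for entry in matrix if len(entry) == 1}
      print("discernibility entries:", matrix)
      print("core attributes:", core)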

  7. Modeling human decision making behavior in supervisory control

    NASA Technical Reports Server (NTRS)

    Tulga, M. K.; Sheridan, T. B.

    1977-01-01

    An optimal decision control model was developed, based primarily on a dynamic programming algorithm that looks at all the available task possibilities, charts an optimal trajectory, commits itself to the first step (i.e., follows the optimal trajectory during the next time period), and then iterates the calculation. A Bayesian estimator was included that estimates the tasks which might occur in the immediate future and provides this information to the dynamic programming routine. Preliminary trials comparing the human subject's performance to that of the optimal model show great similarity, but indicate that the human skips certain movements that require a quick change in strategy.
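
    The plan-commit-replan loop can be sketched in a few lines; a toy one-dimensional task stands in for the supervisory-control problem, and the Bayesian task estimator is omitted.

      from itertools import product

      def simulate(state, actions_seq, step):
          """Total reward of executing an action sequence from a state."""
          total = 0.0
          for a in actions_seq:
              state, reward = step(state, a)
              total += reward
          return total

      def plan_first_action(state, horizon, actions, step):
          """Exhaustively score short action sequences; commit to the first
          step of the best one (the caller then re-plans)."""
          best = max(product(actions, repeat=horizon),
                     key=lambda seq: simulate(state, seq, step))
          return best[0]

      # Toy task: move along a line toward a target at 5; reward = -distance.
      def step(state, action):  # action in {-1, 0, +1}
          new_state = state + action
          return new_state, -abs(new_state - 5)

      state = 0
      for _ in range(8):  # commit to one step, then iterate the calculation
          state, _ = step(state, plan_first_action(state, 3, (-1, 0, 1), step))
      print("final state:", state)  # settles at the target, 5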

  8. NASA Biomedical Informatics Capabilities and Needs

    NASA Technical Reports Server (NTRS)

    Johnson-Throop, Kathy A.

    2009-01-01

    The objective is to improve on-orbit clinical capabilities by developing and providing operational support for intelligent, robust, reliable, and secure enterprise-wide, comprehensive health care and biomedical informatics systems with increasing levels of autonomy, for use on Earth, in low Earth orbit, and on exploration-class missions. Biomedical Informatics is an emerging discipline that has been defined as the study, invention, and implementation of structures and algorithms to improve communication, understanding and management of medical information. The end objective of biomedical informatics is the coalescing of data, knowledge, and the tools necessary to apply that data and knowledge in the decision-making process, at the time and place that a decision needs to be made.

  9. Mathematical model of design loading vessel

    NASA Astrophysics Data System (ADS)

    Budnik, V. Yu

    2017-10-01

    Ferry transport plays an important role in modern transportation. The paper identifies the factors that affect the operation of the ferry, the constraints of the designed system, and the relevant quality indicators. Efficient functioning of the Kerch strait ferry line can be ensured by improving the decision-making process and the choice of optimal loading options; to this end, an algorithm and a mathematical model were developed.

  10. Proceedings of the Second Joint Technology Workshop on Neural Networks and Fuzzy Logic, volume 2

    NASA Technical Reports Server (NTRS)

    Lea, Robert N. (Editor); Villarreal, James A. (Editor)

    1991-01-01

    Documented here are papers presented at the Neural Networks and Fuzzy Logic Workshop sponsored by NASA and the University of Texas, Houston. Topics addressed included adaptive systems, learning algorithms, network architectures, vision, robotics, neurobiological connections, speech recognition and synthesis, fuzzy set theory and application, control and dynamics processing, space applications, fuzzy logic and neural network computers, approximate reasoning, and multiobject decision making.

  11. Artificial Intelligence (Al) Center of Excellence at the University of Pennsylvania

    DTIC Science & Technology

    1995-07-01

    Approach and repel behaviors were implemented in order to study higher level behavioral simulation. Parallel algorithms for motion planning (as a...of decision-making accuracy can be specified for this graph-reduction process. We have also developed a mixed qualitative/quantitative simulation ... system, called QobiSIM. QobiSIM has been used to develop a cardiovascular simulation to be incorporated into the TraumAID system. This cardiovascular

  12. Development of a Multileaf Collimator for Proton Radiotherapy

    DTIC Science & Technology

    2011-06-01

    to treat shallow depths was also simulated and commissioned in Eclipse. In order to calibrate the number of simulated protons per MU, a reference ... beam technology for proton radiotherapy, and the fourth year of the project to develop image guided treatment protocols for proton therapy. This ... radiotherapy to proton therapy, and to develop a decision-making algorithm to maximize the efficiency of the facility. This report describes the

  13. Using wound care algorithms: a content validation study.

    PubMed

    Beitz, J M; van Rijswijk, L

    1999-09-01

    Valid and reliable heuristic devices facilitating optimal wound care are lacking. The objectives of this study were to establish content validation data for a set of wound care algorithms, to identify their associated strengths and weaknesses, and to gain insight into the wound care decision-making process. Forty-four registered nurse wound care experts were surveyed and interviewed at national and regional educational meetings. Using a cross-sectional study design and an 83-item, 4-point Likert-type scale, this purposive sample was asked to quantify the degree of validity of the algorithms' decisions and components. Participants' comments were tape-recorded, transcribed, and themes were derived. On a scale of 1 to 4, the mean score of the entire instrument was 3.47 (SD +/- 0.87), the instrument's Content Validity Index was 0.86, and the individual Content Validity Index of 34 of 44 participants was > 0.8. Item scores were lower for those related to packing deep wounds (P < .001). No other significant differences were observed. Qualitative data analysis revealed themes of difficulty associated with wound assessment and care issues, that is, the absence of valid and reliable definitions. The wound care algorithms studied proved valid. However, the lack of valid and reliable wound assessment and care definitions hinders optimal use of these instruments. Further research documenting their clinical use is warranted. Research-based practice recommendations should direct the development of future valid and reliable algorithms designed to help nurses provide optimal wound care.
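
    For reference, the item-level Content Validity Index reported above is simply the proportion of experts rating an item 3 or 4 on the 4-point scale; the ratings below are invented.

      def content_validity_index(ratings):
          """Fraction of ratings at 3 or 4 on a 1-4 relevance scale."""
          return sum(r >= 3 for r in ratings) / len(ratings)

      item_ratings = [4, 3, 4, 2, 4, 3, 3, 4]
      print(f"CVI = {content_validity_index(item_ratings):.2f}")  # 0.88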

  14. Reinforcement learning techniques for controlling resources in power networks

    NASA Astrophysics Data System (ADS)

    Kowli, Anupama Sunil

    As power grids transition towards increased reliance on renewable generation, energy storage and demand response resources, an effective control architecture is required to harness the full functionalities of these resources. There is a critical need for control techniques that recognize the unique characteristics of the different resources and exploit the flexibility afforded by them to provide ancillary services to the grid. The work presented in this dissertation addresses these needs. Specifically, new algorithms are proposed, which allow control synthesis in settings wherein the precise distribution of the uncertainty and its temporal statistics are not known. These algorithms are based on recent developments in Markov decision theory, approximate dynamic programming and reinforcement learning. They impose minimal assumptions on the system model and allow the control to be "learned" based on the actual dynamics of the system. Furthermore, they can accommodate complex constraints such as capacity and ramping limits on generation resources, state-of-charge constraints on storage resources, comfort-related limitations on demand response resources and power flow limits on transmission lines. Numerical studies demonstrating applications of these algorithms to practical control problems in power systems are discussed. Results demonstrate how the proposed control algorithms can be used to improve the performance and reduce the computational complexity of the economic dispatch mechanism in a power network. We argue that the proposed algorithms are eminently suitable to develop operational decision-making tools for large power grids with many resources and many sources of uncertainty.
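
    A minimal tabular Q-learning sketch of the model-free, learn-from-actual-dynamics idea described above; the toy environment below, matching generation to a random demand level, is a stand-in, not a power-network model.

      import random
      from collections import defaultdict

      def q_learning(step, states, actions, episodes=2000,
                     alpha=0.1, gamma=0.95, eps=0.1):
          """Learn action values from sampled transitions only."""
          Q = defaultdict(float)
          for _ in range(episodes):
              s = random.choice(states)
              for _ in range(20):
                  a = (random.choice(actions) if random.random() < eps
                       else max(actions, key=lambda a: Q[(s, a)]))
                  s2, r = step(s, a)
                  target = r + gamma * max(Q[(s2, b)] for b in actions)
                  Q[(s, a)] += alpha * (target - Q[(s, a)])
                  s = s2
          return Q

      states, actions = list(range(5)), list(range(5))

      def step(s, a):  # penalize supply-demand mismatch; demand moves randomly
          return random.choice(states), -abs(a - s)

      Q = q_learning(step, states, actions)
      print([max(actions, key=lambda a: Q[(s, a)]) for s in states])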

  15. [GIS and scenario analysis aid to water pollution control planning of river basin].

    PubMed

    Wang, Shao-ping; Cheng, Sheng-tong; Jia, Hai-feng; Ou, Zhi-dan; Tan, Bin

    2004-07-01

    This paper summarizes the forward and backward algorithms for watershed water pollution control planning, together with their advantages and shortcomings. In a case study of the Ganjiang valley, Jiangxi province, spatial databases of water environmental function regions, pollution sources, monitoring sections and sewer outlets were built on the ARCGIS8.1 platform. Based on the principles of the forward algorithm, four scenarios were designed for watershed pollution control. Under these scenarios, ten sets of planning schemes were generated to implement cascaded pollution source control. The investment costs of sewage treatment for these schemes were estimated by means of a series of cost functions; combined with pollution source prediction, the water quality under each planning scheme was modeled with a CSTR model. The modeled results of the different planning schemes were visualized through GIS to aid decision-making. Taking the investment costs and water quality attainment as decision-making criteria, and based on an analysis of the economically bearable capacity for water pollution control in the Ganjiang river basin, two optimized schemes were proposed. The research shows that GIS technology and scenario analysis can provide good guidance on the synthesis, integrity and sustainability aspects of river basin water quality planning.

  16. Modelling of internal architecture of kinesin nanomotor as a machine language.

    PubMed

    Khataee, H R; Ibrahim, M Y

    2012-09-01

    Kinesin is a protein-based natural nanomotor that transports molecular cargoes within cells by walking along microtubules. The kinesin nanomotor is considered a bio-nanoagent that is able to sense the cell through its sensors (i.e. its heads and tail), make decisions internally, and perform actions on the cell through its actuator (i.e. its motor domain). The study maps the agent-based architectural model of the internal decision-making process of the kinesin nanomotor to a machine language using an automata algorithm. The applied automata algorithm receives the internal agent-based architectural model of the kinesin nanomotor as a deterministic finite automaton (DFA) model and generates a regular machine language. The generated regular machine language was accepted by the architectural DFA model of the nanomotor and was also in good agreement with its natural behaviour. The internal agent-based architectural model of the kinesin nanomotor indicates the degree of autonomy and intelligence of the nanomotor's interactions with its cell. Thus, the developed regular machine language can model, as a language, the degree of autonomy and intelligence of the kinesin nanomotor's interactions with its cell. Modelling the internal architectures of autonomous and intelligent bio-nanosystems as machine languages can lay the foundation for the concept of bio-nanoswarms and the next phases of bio-nanorobotic systems development.
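
    A sketch of the underlying formalism, acceptance of a string by a deterministic finite automaton; the states and alphabet below are invented placeholders, not the paper's kinesin model.

      def accepts(transitions, start, accepting, word):
          """Run a DFA over a word; an undefined transition rejects."""
          state = start
          for symbol in word:
              state = transitions.get((state, symbol))
              if state is None:
                  return False
          return state in accepting

      # Toy walking cycle: 'b' = bind ATP, 'h' = hydrolyse, 's' = step forward.
      dfa = {("waiting", "b"): "bound",
             ("bound", "h"): "stepping",
             ("stepping", "s"): "waiting"}
      print(accepts(dfa, "waiting", {"waiting"}, "bhsbhs"))  # True
      print(accepts(dfa, "waiting", {"waiting"}, "bs"))      # False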

  17. Role-playing simulation as an educational tool for health care personnel: developing an embedded assessment framework.

    PubMed

    Libin, Alexander; Lauderdale, Manon; Millo, Yuri; Shamloo, Christine; Spencer, Rachel; Green, Brad; Donnellan, Joyce; Wellesley, Christine; Groah, Suzanne

    2010-04-01

    Simulation- and video game-based role-playing techniques have been proven effective in changing behavior and enhancing positive decision making in a variety of professional settings, including education, the military, and health care. Although the need for developing assessment frameworks for learning outcomes has been clearly defined, there is a significant gap between the variety of existing multimedia-based instruction and technology-mediated learning systems and the number of reliable assessment algorithms. This study, based on a mixed methodology research design, aims to develop an embedded assessment algorithm, a Knowledge Assessment Module (NOTE), to capture both user interaction with the educational tool and knowledge gained from the training. The study is regarded as the first step in developing an assessment framework for a multimedia educational tool for health care professionals, Anatomy of Care (AOC), that utilizes Virtual Experience Immersive Learning Simulation (VEILS) technology. Ninety health care personnel of various backgrounds took part in online AOC training, choosing from five possible scenarios presenting difficult situations of everyday care. The results suggest that although the simulation-based training tool demonstrated partial effectiveness in improving learners' decision-making capacity, a differential learner-oriented approach might be more effective and capable of synchronizing educational efforts with identifiable relevant individual factors such as sociobehavioral profile and professional background.

  18. Combining Multiple Rupture Models in Real-Time for Earthquake Early Warning

    NASA Astrophysics Data System (ADS)

    Minson, S. E.; Wu, S.; Beck, J. L.; Heaton, T. H.

    2015-12-01

    The ShakeAlert earthquake early warning system for the west coast of the United States is designed to combine information from multiple independent earthquake analysis algorithms in order to provide the public with robust predictions of shaking intensity at each user's location before they are affected by strong shaking. The current contributing analyses come from algorithms that determine the origin time, epicenter, and magnitude of an earthquake (On-site, ElarmS, and Virtual Seismologist). A second generation of algorithms will provide seismic line source information (FinDer), as well as geodetically-constrained slip models (BEFORES, GPSlip, G-larmS, G-FAST). These new algorithms will provide more information about the spatial extent of the earthquake rupture and thus improve the quality of the resulting shaking forecasts. Each of the contributing algorithms exploits different features of the observed seismic and geodetic data, and thus each algorithm may perform differently for different data availability and earthquake source characteristics. Thus the ShakeAlert system requires a central mediator, called the Central Decision Module (CDM). The CDM acts to combine disparate earthquake source information into one unified shaking forecast. Here we will present a new design for the CDM that uses a Bayesian framework to combine earthquake reports from multiple analysis algorithms and compares them to observed shaking information in order to both assess the relative plausibility of each earthquake report and to create an improved unified shaking forecast complete with appropriate uncertainties. We will describe how these probabilistic shaking forecasts can be used to provide each user with a personalized decision-making tool that can help decide whether or not to take a protective action (such as opening fire house doors or stopping trains) based on that user's distance to the earthquake, vulnerability to shaking, false alarm tolerance, and time required to act.
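
    One simple flavour of such Bayesian fusion is precision-weighted averaging of independent Gaussian magnitude reports; this is a sketch only, as the actual CDM design combines far richer source descriptions and observed shaking.

      def combine(estimates):
          """estimates: list of (mean, std) magnitude reports assumed
          independent and Gaussian; returns the posterior (mean, std)."""
          weights = [1.0 / s**2 for _, s in estimates]
          mean = sum(w * m for (m, _), w in zip(estimates, weights)) / sum(weights)
          return mean, (1.0 / sum(weights)) ** 0.5

      reports = [(6.1, 0.4), (5.8, 0.3), (6.3, 0.5)]  # hypothetical algorithm reports
      print(combine(reports))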

  19. Design and usability of heuristic-based deliberation tools for women facing amniocentesis.

    PubMed

    Durand, Marie-Anne; Wegwarth, Odette; Boivin, Jacky; Elwyn, Glyn

    2012-03-01

    Evidence suggests that in decision contexts characterized by uncertainty and time constraints (e.g. health-care decisions), fast and frugal decision-making strategies (heuristics) may perform better than complex rules of reasoning. To examine whether it is possible to design deliberation components in decision support interventions using simple models (fast and frugal heuristics). The 'Take The Best' heuristic (i.e. selection of a 'most important reason') and 'The Tallying' integration algorithm (i.e. unitary weighing of pros and cons) were used to develop two deliberation components embedded in a Web-based decision support intervention for women facing amniocentesis testing. Ten researchers (recruited from 15), nine health-care providers (recruited from 28) and ten pregnant women (recruited from 14) who had recently been offered amniocentesis testing appraised evolving versions of 'your most important reason' (Take The Best) and 'weighing it up' (Tallying). Most researchers found the tools useful in facilitating decision making although emphasized the need for simple instructions and clear layouts. Health-care providers however expressed concerns regarding the usability and clarity of the tools. By contrast, 7 out of 10 pregnant women found the tools useful in weighing up the pros and cons of each option, helpful in structuring and clarifying their thoughts and visualizing their decision efforts. Several pregnant women felt that 'weighing it up' and 'your most important reason' were not appropriate when facing such a difficult and emotional decision. Theoretical approaches based on fast and frugal heuristics can be used to develop deliberation tools that provide helpful support to patients facing real-world decisions about amniocentesis. © 2011 Blackwell Publishing Ltd.
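
    The two heuristics can be sketched directly; the reasons and their weights below are invented, and the example shows how the same inputs can lead the rules to different conclusions.

      def take_the_best(reasons):
          """'Your most important reason': scan reasons by importance and let
          the first decisive one (direction +1 pro, -1 con) settle it."""
          for _, direction in sorted(reasons, reverse=True):
              if direction != 0:
                  return "accept" if direction > 0 else "decline"
          return "undecided"

      def tallying(reasons):
          """'Weighing it up': unit-weighted count of pros minus cons."""
          score = sum(direction for _, direction in reasons)
          return "accept" if score > 0 else "decline" if score < 0 else "undecided"

      reasons = [(3, -1), (2, +1), (1, +1)]  # (importance, pro/con)
      print(take_the_best(reasons), tallying(reasons))  # decline accept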

  20. A new framework for modeling decisions about changing information: The Piecewise Linear Ballistic Accumulator model

    PubMed Central

    Heathcote, Andrew

    2016-01-01

    In the real world, decision making processes must be able to integrate non-stationary information that changes systematically while the decision is in progress. Although theories of decision making have traditionally been applied to paradigms with stationary information, non-stationary stimuli are now of increasing theoretical interest. We use a random-dot motion paradigm along with cognitive modeling to investigate how the decision process is updated when a stimulus changes. Participants viewed a cloud of moving dots, where the motion switched directions midway through some trials, and were asked to determine the direction of motion. Behavioral results revealed a strong delay effect: after presentation of the initial motion direction there is a substantial time delay before the changed motion information is integrated into the decision process. To further investigate the underlying changes in the decision process, we developed a Piecewise Linear Ballistic Accumulator model (PLBA). The PLBA is efficient to simulate, enabling it to be fit to participant choice and response-time distribution data in a hierarchical modeling framework using a non-parametric approximate Bayesian algorithm. Consistent with behavioral results, PLBA fits confirmed the presence of a long delay between presentation and integration of new stimulus information, but did not support increased response caution in reaction to the change. We also found the decision process was not veridical, as symmetric stimulus change had an asymmetric effect on the rate of evidence accumulation. Thus, the perceptual decision process was slow to react to, and underestimated, new contrary motion information. PMID:26760448
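
    A single-trial simulator for a two-accumulator ballistic race with one drift switch conveys the basic PLBA mechanism; the parameter values are illustrative, not fits from the paper.

      import random

      def plba_trial(b=1.0, A=0.5, pre=(1.2, 0.8), post=(0.8, 1.2),
                     t_switch=0.3, s=0.3):
          """Two linear accumulators race to threshold b from random start
          points; drift rates change once at t_switch (the stimulus switch)."""
          starts = [random.uniform(0, A) for _ in pre]
          v1 = [random.gauss(m, s) for m in pre]    # rates before the switch
          v2 = [random.gauss(m, s) for m in post]   # rates after the switch
          times = []
          for k in range(2):
              x = starts[k] + v1[k] * t_switch      # position at the switch
              if x >= b:                            # crossed before the switch
                  times.append((b - starts[k]) / v1[k])
              elif v2[k] > 0:
                  times.append(t_switch + (b - x) / v2[k])
              else:
                  times.append(float("inf"))        # never reaches threshold
          choice = min(range(2), key=times.__getitem__)
          return choice, times[choice]

      print(plba_trial())  # (chosen accumulator, decision time)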

  1. Enhancing User Customization through Novel Software Architecture for Utility Scale Solar Siting Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brant Peery; Sam Alessi; Randy Lee

    2014-06-01

    There is a need for a spatial decision support application that allows users to create customized metrics for comparing proposed locations of a new solar installation. This document discusses how PVMapper was designed to overcome the customization problem through the development of loosely coupled spatial and decision components in a JavaScript plugin architecture. This allows the user to easily add functionality and data to the system. The paper also explains how PVMapper provides the user with a dynamic and customizable decision tool that enables them to visually modify the formulas that are used in the decision algorithms that convert data to comparable metrics. The technologies that make up the presentation and calculation software stack are outlined. This document also explains the architecture that allows the tool to grow through custom plugins created by the software users. Some discussion is given on the difficulties encountered while designing the system.

  2. AdaBoost-based on-line signature verifier

    NASA Astrophysics Data System (ADS)

    Hongo, Yasunori; Muramatsu, Daigo; Matsumoto, Takashi

    2005-03-01

    Authentication of individuals is rapidly becoming an important issue. The authors previously proposed a pen-input online signature verification algorithm. The algorithm considers a writer's signature as a trajectory of pen position, pen pressure, pen azimuth, and pen altitude that evolve over time, so that it is dynamic and biometric. Many algorithms have been proposed and reported to achieve accuracy for on-line signature verification, but setting the threshold value for these algorithms is a problem. In this paper, we introduce a user-generic model generated by AdaBoost, which resolves this problem. When user-specific models (one model for each user) are used for signature verification problems, we need to generate the models using only genuine signatures. Forged signatures are not available because imposters do not give forged signatures for training in advance. However, we can make use of another's forged signatures in addition to the genuine signatures for learning by introducing a user-generic model. AdaBoost is a well-known classification algorithm that makes its final decision depending on the sign of the output value; therefore, it is not necessary to set a threshold value. A preliminary experiment is performed on a database consisting of data from 50 individuals. This set consists of western-alphabet-based signatures provided by a European research group. In this experiment, our algorithm gives an FRR of 1.88% and an FAR of 1.60%. Since no fine-tuning was done, this preliminary result looks very promising.
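
    A sketch of the threshold-free decision rule, assuming scikit-learn; the four synthetic features below merely stand in for the pen-trajectory features, and the data are invented.

      import numpy as np
      from sklearn.ensemble import AdaBoostClassifier

      rng = np.random.default_rng(0)
      X = np.vstack([rng.normal(0.7, 0.1, size=(50, 4)),   # genuine signatures
                     rng.normal(0.4, 0.1, size=(50, 4))])  # forgeries
      y = np.array([1] * 50 + [-1] * 50)

      clf = AdaBoostClassifier(n_estimators=50).fit(X, y)
      score = clf.decision_function([[0.65, 0.70, 0.60, 0.70]])[0]
      # The sign of the boosted score decides; no tuned threshold is needed.
      print("genuine" if score > 0 else "forged", score)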

  3. Interactive Genetic Algorithm - An Adaptive and Interactive Decision Support Framework for Design of Optimal Groundwater Monitoring Plans

    NASA Astrophysics Data System (ADS)

    Babbar-Sebens, M.; Minsker, B. S.

    2006-12-01

    In the water resources management field, decision making encompasses many kinds of engineering, social, and economic constraints and objectives. Representing all of these problem-dependent criteria through models (analytical or numerical) and various formulations (e.g., objectives, constraints, etc.) within an optimization-simulation system can be a very non-trivial issue. Most models and formulations utilized for discerning desirable traits in a solution can only approximate the decision maker's (DM) true preference criteria, and they often fail to consider important qualitative and incomputable phenomena related to the management problem. In our research, we have proposed novel decision support frameworks that allow DMs to actively participate in the optimization process. The DMs explicitly indicate their true preferences based on their subjective criteria and the results of various simulation models and formulations. The feedback from the DMs is then used to guide the search process towards solutions that are "all-rounders" from the perspective of the DM. The two main research questions explored in this work are: (a) Does interaction between the optimization algorithm and a DM assist the system in searching for groundwater monitoring designs that are robust from the DM's perspective? and (b) How can an interactive search process be made more effective when human factors, such as human fatigue and cognitive learning processes, affect the performance of the algorithm? The application of these frameworks on a real-world groundwater long-term monitoring (LTM) case study in Michigan highlighted the following salient advantages: (a) in contrast to the non-interactive optimization methodology, the proposed interactive frameworks were able to identify low-cost monitoring designs whose interpolation maps respected the expected spatial distribution of the contaminants; (b) for many same-cost designs, the interactive methodologies were able to propose multiple alternatives that met the DM's preference criteria, therefore allowing the expert to select among several strong candidate designs depending on her/his LTM budget; and (c) two of the methodologies, the Case-Based Micro Interactive Genetic Algorithm (CBMIGA) and the Interactive Genetic Algorithm with Mixed Initiative Interaction (IGAMII), were also able to assist in controlling human fatigue and adapt to the DM's learning process.

  4. Transition-Independent Decentralized Markov Decision Processes

    NASA Technical Reports Server (NTRS)

    Becker, Raphen; Silberstein, Shlomo; Lesser, Victor; Goldman, Claudia V.; Morris, Robert (Technical Monitor)

    2003-01-01

    There has been substantial progress with formal models for sequential decision making by individual agents using the Markov decision process (MDP). However, similar treatment of multi-agent systems is lacking. A recent complexity result, showing that solving decentralized MDPs is NEXP-hard, provides a partial explanation. To overcome this complexity barrier, we identify a general class of transition-independent decentralized MDPs that is widely applicable. The class consists of independent collaborating agents that are tied together by a global reward function that depends on both of their histories. We present a novel algorithm for solving this class of problems and examine its properties. The result is the first effective technique to solve optimally a class of decentralized MDPs. This lays the foundation for further work in this area on both exact and approximate solutions.

  5. Incorporating individual health-protective decisions into disease transmission models: a mathematical framework.

    PubMed

    Durham, David P; Casman, Elizabeth A

    2012-03-07

    It is anticipated that the next generation of computational epidemic models will simulate both infectious disease transmission and dynamic human behaviour change. Individual agents within a simulation will not only infect one another, but will also have situational awareness and a decision algorithm that enables them to modify their behaviour. This paper develops such a model of behavioural response, presenting a mathematical interpretation of a well-known psychological model of individual decision making, the health belief model, suitable for incorporation within an agent-based disease-transmission model. We formalize the health belief model and demonstrate its application in modelling the prevalence of facemask use observed over the course of the 2003 Hong Kong SARS epidemic, a well-documented example of behaviour change in response to a disease outbreak.

  6. Incorporating individual health-protective decisions into disease transmission models: a mathematical framework

    PubMed Central

    Durham, David P.; Casman, Elizabeth A.

    2012-01-01

    It is anticipated that the next generation of computational epidemic models will simulate both infectious disease transmission and dynamic human behaviour change. Individual agents within a simulation will not only infect one another, but will also have situational awareness and a decision algorithm that enables them to modify their behaviour. This paper develops such a model of behavioural response, presenting a mathematical interpretation of a well-known psychological model of individual decision making, the health belief model, suitable for incorporation within an agent-based disease-transmission model. We formalize the health belief model and demonstrate its application in modelling the prevalence of facemask use observed over the course of the 2003 Hong Kong SARS epidemic, a well-documented example of behaviour change in response to a disease outbreak. PMID:21775324
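
    One plausible way to operationalize the health belief model's constructs for an agent, sketched below, maps the four perceptions to an adoption probability through a logistic function; the functional form and weights are illustrative assumptions, not the paper's exact formalization.

      import math

      def p_adopt(susceptibility, severity, benefits, barriers,
                  w=(2.0, 1.5, 1.0, 2.5), bias=-3.0):
          """All perceptions in [0, 1]; returns the probability that an agent
          adopts a protective behaviour such as wearing a facemask."""
          z = (bias + w[0] * susceptibility + w[1] * severity
               + w[2] * benefits - w[3] * barriers)
          return 1.0 / (1.0 + math.exp(-z))

      print(p_adopt(0.8, 0.9, 0.7, 0.2))  # high perceived threat, low barriers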

  7. Decision - making of Direct Customers Based on Available Transfer Capability

    NASA Astrophysics Data System (ADS)

    Quan, Tang; Zhaohang, Lin; Huaqiang, Li

    2017-05-01

    Direct power purchasing by large customers is a hot topic in electricity market reform. In this paper, the authors establish an Available Transfer Capability (ATC) model that takes uncertain factors into account, apply the model to large-customer direct-power-purchasing transactions, and improve the reliability of power supply during direct purchasing by introducing insurance theory. The customers' losses from power interruption are also considered when building the ATC model. A large-customer decision model is then established that takes the quantities of power purchased from different power plants and the reserved capacity insurance as variables, targets minimum power-interruption loss as the optimization goal, and is solved by means of a particle swarm algorithm to produce the optimal power-purchasing decision for large consumers. Finally, a simulation on the IEEE57 system shows that the method is effective.

  8. Alterations in choice behavior by manipulations of world model.

    PubMed

    Green, C S; Benson, C; Kersten, D; Schrater, P

    2010-09-14

    How to compute initially unknown reward values makes up one of the key problems in reinforcement learning theory, with two basic approaches being used. Model-free algorithms rely on the accumulation of substantial amounts of experience to compute the value of actions, whereas in model-based learning, the agent seeks to learn the generative process for outcomes from which the value of actions can be predicted. Here we show that (i) "probability matching"-a consistent example of suboptimal choice behavior seen in humans-occurs in an optimal Bayesian model-based learner using a max decision rule that is initialized with ecologically plausible, but incorrect beliefs about the generative process for outcomes and (ii) human behavior can be strongly and predictably altered by the presence of cues suggestive of various generative processes, despite statistically identical outcome generation. These results suggest human decision making is rational and model based and not consistent with model-free learning.

  9. Alterations in choice behavior by manipulations of world model

    PubMed Central

    Green, C. S.; Benson, C.; Kersten, D.; Schrater, P.

    2010-01-01

    How to compute initially unknown reward values makes up one of the key problems in reinforcement learning theory, with two basic approaches being used. Model-free algorithms rely on the accumulation of substantial amounts of experience to compute the value of actions, whereas in model-based learning, the agent seeks to learn the generative process for outcomes from which the value of actions can be predicted. Here we show that (i) “probability matching”—a consistent example of suboptimal choice behavior seen in humans—occurs in an optimal Bayesian model-based learner using a max decision rule that is initialized with ecologically plausible, but incorrect beliefs about the generative process for outcomes and (ii) human behavior can be strongly and predictably altered by the presence of cues suggestive of various generative processes, despite statistically identical outcome generation. These results suggest human decision making is rational and model based and not consistent with model-free learning. PMID:20805507
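
    The contrast between a max rule and probability matching applied to identical learned beliefs can be sketched with a two-option Bernoulli task; this is a simplification, since the paper's point concerns a max rule combined with incorrect beliefs about the generative process, which is not modeled here.

      import random

      def run(rule, p=(0.7, 0.3), trials=2000):
          counts = [[1, 1], [1, 1]]  # Beta(1, 1) posterior per option
          reward = 0
          for _ in range(trials):
              means = [w / (w + l) for w, l in counts]
              if rule == "max":
                  arm = max((0, 1), key=means.__getitem__)
              else:  # probability matching over posterior means
                  arm = 0 if random.random() < means[0] / sum(means) else 1
              r = random.random() < p[arm]
              counts[arm][0 if r else 1] += 1
              reward += r
          return reward / trials

      print("max rule:", run("max"), " matching:", run("matching"))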

  10. Smart Building: Decision Making Architecture for Thermal Energy Management.

    PubMed

    Uribe, Oscar Hernández; Martin, Juan Pablo San; Garcia-Alegre, María C; Santos, Matilde; Guinea, Domingo

    2015-10-30

    Smart applications of the Internet of Things are improving the performance of buildings, reducing energy demand. Local and smart networks, soft computing methodologies, machine intelligence algorithms and pervasive sensors are some of the basics of energy optimization strategies developed for the benefit of environmental sustainability and user comfort. This work presents a distributed sensor-processor-communication decision-making architecture to improve the acquisition, storage and transfer of thermal energy in buildings. The developed system is implemented in a near Zero-Energy Building (nZEB) prototype equipped with a built-in thermal solar collector, where optical properties are analysed; a low enthalpy geothermal accumulation system, segmented in different temperature zones; and an envelope that includes a dynamic thermal barrier. An intelligent control of this dynamic thermal barrier is applied to reduce the thermal energy demand (heating and cooling) caused by daily and seasonal weather variations. Simulations and experimental results are presented to highlight the nZEB thermal energy reduction.

  11. Automatic identification of high impact articles in PubMed to support clinical decision making.

    PubMed

    Bian, Jiantao; Morid, Mohammad Amin; Jonnalagadda, Siddhartha; Luo, Gang; Del Fiol, Guilherme

    2017-09-01

    The practice of evidence-based medicine involves integrating the latest best available evidence into patient care decisions. Yet, critical barriers exist for clinicians' retrieval of evidence that is relevant for a particular patient from primary sources such as randomized controlled trials and meta-analyses. To help address those barriers, we investigated machine learning algorithms that find clinical studies with high clinical impact from PubMed®. Our machine learning algorithms use a variety of features including bibliometric features (e.g., citation count), social media attention, journal impact factors, and citation metadata. The algorithms were developed and evaluated with a gold standard composed of 502 high impact clinical studies that are referenced in 11 clinical evidence-based guidelines on the treatment of various diseases. We tested the following hypotheses: (1) our high impact classifier outperforms a state-of-the-art classifier based on citation metadata and citation terms, and PubMed's® relevance sort algorithm; and (2) the performance of our high impact classifier does not decrease significantly after removing proprietary features such as citation count. The mean top 20 precision of our high impact classifier was 34% versus 11% for the state-of-the-art classifier and 4% for PubMed's® relevance sort (p=0.009); and the performance of our high impact classifier did not decrease significantly after removing proprietary features (mean top 20 precision=34% vs. 36%; p=0.085). The high impact classifier, using features such as bibliometrics, social media attention and MEDLINE® metadata, outperformed previous approaches and is a promising alternative to identifying high impact studies for clinical decision support. Copyright © 2017 Elsevier Inc. All rights reserved.
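
    For reference, the headline metric is straightforward; this tiny sketch uses invented identifiers, whereas the study's ranking comes from the trained classifier and the gold standard from guideline-cited articles.

      def precision_at_k(ranked_ids, relevant_ids, k=20):
          """Fraction of the k top-ranked articles that are truly high impact."""
          return sum(doc in relevant_ids for doc in ranked_ids[:k]) / k

      ranked = [f"pmid{i}" for i in range(100)]       # classifier output order
      gold = {f"pmid{i}" for i in range(0, 100, 3)}   # hypothetical gold standard
      print(precision_at_k(ranked, gold))             # 7/20 = 0.35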

  12. A New Automated Design Method Based on Machine Learning for CMOS Analog Circuits

    NASA Astrophysics Data System (ADS)

    Moradi, Behzad; Mirzaei, Abdolreza

    2016-11-01

    A new simulation-based automated CMOS analog circuit design method, which applies a multi-objective non-Darwinian-type evolutionary algorithm based on the Learnable Evolution Model (LEM), is proposed in this article. The multi-objective property of this automated design of CMOS analog circuits is governed by a modified Strength Pareto Evolutionary Algorithm (SPEA) incorporated in the LEM algorithm presented here. LEM includes a machine learning method, such as decision trees, that distinguishes between high- and low-fitness areas in the design space. The learning process can detect the right directions of the evolution and lead to large steps in the evolution of the individuals. The learning phase shortens the evolution process and markedly reduces the number of individual evaluations. The expert designer's knowledge of the circuit is applied in the design process in order to reduce the design space as well as the design time. The circuit evaluation is made by the HSPICE simulator. In order to improve the design accuracy, the BSIM3v3 CMOS transistor model is adopted in this design method. The proposed method is tested on three different operational amplifier circuits, and its performance is verified by comparing it with the evolutionary strategy algorithm and other similar methods.

  13. The PANTHER User Experience

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coram, Jamie L.; Morrow, James D.; Perkins, David Nikolaus

    2015-09-01

    This document describes the PANTHER R&D Application, a proof-of-concept user interface application developed under the PANTHER Grand Challenge LDRD. The purpose of the application is to explore interaction models for graph analytics, drive algorithmic improvements from an end-user point of view, and support demonstration of PANTHER technologies to potential customers. The R&D Application implements a graph-centric interaction model that exposes analysts to the algorithms contained within the GeoGraphy graph analytics library. Users define geospatial-temporal semantic graph queries by constructing search templates based on nodes, edges, and the constraints among them. Users then analyze the results of the queries using both geo-spatial and temporal visualizations. Development of this application has made user experience an explicit driver for project and algorithmic level decisions that will affect how analysts one day make use of PANTHER technologies.

  14. Primal-dual techniques for online algorithms and mechanisms

    NASA Astrophysics Data System (ADS)

    Liaghat, Vahid

    An offline algorithm is one that knows the entire input in advance. An online algorithm, however, processes its input in a serial fashion. In contrast to offline algorithms, an online algorithm works in a local fashion and has to make irrevocable decisions without having the entire input. Online algorithms are often not optimal since their irrevocable decisions may turn out to be inefficient after receiving the rest of the input. For a given online problem, the goal is to design algorithms which are competitive against the offline optimal solutions. In a classical offline scenario, it is common to see a dual analysis of problems that can be formulated as a linear or convex program. Primal-dual and dual-fitting techniques have been successfully applied to many such problems. Unfortunately, the usual tricks come up short in an online setting since an online algorithm should make decisions without knowing even the whole program. In this thesis, we study the competitive analysis of fundamental problems in the literature, such as different variants of online matching and online Steiner connectivity, via online dual techniques. Although there are many generic tools for solving an optimization problem in the offline paradigm, in comparison, much less is known for tackling online problems. The main focus of this work is to design generic techniques for solving integral linear optimization problems where the solution space is restricted via a set of linear constraints. A general family of these problems are online packing/covering problems. Our work shows that for several seemingly unrelated problems, primal-dual techniques can be successfully applied as a unifying approach for analyzing these problems. We believe this leads to generic algorithmic frameworks for solving online problems.

    In the first part of the thesis, we show the effectiveness of our techniques in stochastic settings and their applications in Bayesian mechanism design. In particular, we introduce new techniques for solving a fundamental linear optimization problem, namely, the stochastic generalized assignment problem (GAP). This packing problem generalizes various problems such as online matching, ad allocation, bin packing, etc. We furthermore show applications of such results in mechanism design by introducing Prophet Secretary, a novel Bayesian model for online auctions.

    In the second part of the thesis, we focus on covering problems. We develop the framework of "Disk Painting" for a general class of network design problems that can be characterized by proper functions. This class generalizes the node-weighted and edge-weighted variants of several well-known Steiner connectivity problems. We furthermore design a generic technique for solving the prize-collecting variants of these problems when there exists a dual analysis for the non-prize-collecting counterparts. Hence, we solve the online prize-collecting variants of several network design problems for the first time.

    Finally, we focus on designing techniques for online problems with mixed packing/covering constraints. We initiate the study of degree-bounded graph optimization problems in the online setting by designing an online algorithm with a tight competitive ratio for the degree-bounded Steiner forest problem. We hope these techniques establish a starting point for the analysis of the important class of online degree-bounded optimization on graphs.
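
    As a flavour of the dual-friendly machinery involved, here is a textbook online fractional covering update, a standard multiplicative-weights scheme in the Buchbinder-Naor style, not one of the thesis's own algorithms.

      def online_fractional_cover(costs, arriving_constraints):
          """costs: c_i per variable. Each arriving constraint is a set of
          variable indices whose x-values must sum to at least 1; variables
          are raised multiplicatively, cheaper ones faster."""
          x = [0.0] * len(costs)
          for S in arriving_constraints:
              while sum(x[i] for i in S) < 1.0:
                  for i in S:
                      x[i] = (x[i] * (1.0 + 1.0 / costs[i])
                              + 1.0 / (len(S) * costs[i]))
          return x

      print(online_fractional_cover([1.0, 2.0, 4.0], [{0, 1}, {1, 2}, {2}]))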

  15. Data Clustering and Evolving Fuzzy Decision Tree for Data Base Classification Problems

    NASA Astrophysics Data System (ADS)

    Chang, Pei-Chann; Fan, Chin-Yuan; Wang, Yen-Wen

    Database classification suffers from two well-known difficulties, i.e., the high dimensionality and non-stationary variations within large historic data. This paper presents a hybrid classification model that integrates a case-based reasoning technique, a Fuzzy Decision Tree (FDT), and Genetic Algorithms (GA) to construct a decision-making system for data classification in various database applications. The model is mainly based on the idea that the historic database can be transformed into a smaller case base together with a group of fuzzy decision rules. As a result, the model can respond more accurately to the data being classified, using the inductions drawn from these smaller case-based fuzzy decision trees. Hit rate is applied as a performance measure, and the effectiveness of the proposed model is demonstrated by experimental comparison with other approaches on different database classification applications. The average hit rate of the proposed model is the highest among the compared approaches.

  16. Spatio-Temporal Process Variability in Watershed Scale Wetland Restoration Planning

    NASA Astrophysics Data System (ADS)

    Evenson, G. R.

    2012-12-01

    Watershed scale restoration decision making processes are increasingly informed by quantitative methodologies providing site-specific restoration recommendations, sometimes referred to as "systematic planning." The more advanced of these methodologies are characterized by a coupling of search algorithms and ecological models to discover restoration plans that optimize environmental outcomes. Yet while these methods have exhibited clear utility as decision support toolsets, they may be critiqued for flawed evaluations of spatio-temporally variable processes fundamental to watershed scale restoration. Hydrologically and non-hydrologically mediated process connectivity, for example, along with post-restoration habitat dynamics, are commonly ignored yet known to appreciably affect restoration outcomes. This talk will present a methodology to evaluate such spatio-temporally complex processes in the production of watershed scale wetland restoration plans. Using the Tuscarawas Watershed in Eastern Ohio as a case study, a genetic algorithm will be coupled with the Soil and Water Assessment Tool (SWAT) to reveal optimal wetland restoration plans as measured by their capacity to maximize nutrient reductions. Then, a so-called "graphical" representation of the optimization problem will be implemented in parallel to promote hydrologically and non-hydrologically mediated connectivity among existing wetlands and sites selected for restoration. Further, various search algorithm mechanisms will be discussed as a means of accounting for temporal complexities such as post-restoration habitat dynamics. Finally, generalized patterns of restoration plan optimality will be discussed as an alternative and possibly superior decision support toolset, given the complexity and stochastic nature of spatio-temporal process variability.

  17. Emergency Physicians' Perceptions and Decision-making Processes Regarding Patients Presenting with Palpitations.

    PubMed

    Probst, Marc A; Kanzaria, Hemal K; Hoffman, Jerome R; Mower, William R; Moheimani, Roya S; Sun, Benjamin C; Quigley, Denise D

    2015-08-01

    Palpitations are a common emergency department (ED) complaint, yet relatively little research exists on this topic from an emergency care perspective. We sought to describe the perceptions and clinical decision-making processes of emergency physicians (EPs) surrounding patients with palpitations. We conducted 21 semistructured interviews with a convenience sample of EPs, recruiting participants from academic and community practice settings in four regions of the United States. The transcribed interviews were analyzed using a combination of structural coding and grounded theory approaches with ATLAS.ti, a qualitative data analysis software program (version 7; Atlas.ti Scientific Software Development GmbH, Berlin, Germany). EPs perceive palpitations to be a common but generally benign chief complaint. EPs' clinical approach to palpitations, with regard to testing, treatment, and ED management, can be classified as relating to one or more of the following themes: (1) risk stratification, (2) diagnostic categorization, (3) algorithmic management, and (4) case-specific gestalt. With regard to disposition decisions, four main themes emerged: (1) presence of a serious diagnosis, (2) perceived need for further cardiac testing/monitoring, (3) presence of key associated symptoms, and (4) a request by another physician or patient preference. The interrater reliability exercise yielded a Fleiss' kappa of 0.69, indicating substantial agreement between coders. EPs thus rely on one or more of four main clinical approaches to manage these patients. These findings could help guide future efforts at developing risk-stratification tools and clinical algorithms for patients with palpitations. Copyright © 2015 Elsevier Inc. All rights reserved.

  18. Siting and Routing Assessment for Solid Waste Management Under Uncertainty Using the Grey Mini-Max Regret Criterion

    NASA Astrophysics Data System (ADS)

    Chang, Ni-Bin; Davila, Eric

    2006-10-01

    Solid waste management (SWM) is at the forefront of environmental concerns in the Lower Rio Grande Valley (LRGV), South Texas. The complexity of SWM drives area decision makers to look for innovative and forward-looking solutions to address various waste management options. In decision analysis, it is not uncommon for decision makers to choose an option that minimizes the maximum regret when some determinant factors are vague, ambiguous, or unclear. This article presents an innovative optimization model using the grey mini-max regret (GMMR) integer programming algorithm to outline an optimal regional coordination of solid waste routing and possible landfill/incinerator construction under an uncertain environment. The LRGV is an ideal location to apply the GMMR model for SWM planning because of its constant urban expansion, dwindling landfill space, and insufficient data availability, which signify planning uncertainty combined with vagueness in decision-making. The results give local decision makers hedged sets of options that consider various forms of systematic and event-based uncertainty; by extending the dimension of decision-making, this may lead to identifying a variety of beneficial solutions with efficient waste routing and facility siting for the time frame of 2005 through 2010 in the LRGV. The results show the ability of the GMMR model to support insightful scenario planning that can handle situational and data-driven uncertainty in a way that was previously unavailable. Research findings also indicate that the large capital investment required for incineration facilities makes such an option less competitive with municipal landfill options; the investment from a municipal standpoint is out of the question, but public-private partnerships may alleviate this obstacle.
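
    As a toy illustration of the mini-max regret criterion at the core of GMMR (ignoring the grey/interval arithmetic of the actual model; all figures are invented), each option's regret in a scenario is its cost minus the best cost achievable in that scenario, and the option minimizing the worst-case regret is selected:

    ```python
    # rows: hypothetical siting/routing options, columns: demand scenarios
    costs = {
        "expand_landfill": [12, 18, 25],
        "new_incinerator": [20, 21, 22],
        "regional_mix":    [15, 19, 23],
    }
    scenarios = range(3)
    # cheapest achievable cost in each scenario
    best = [min(costs[o][s] for o in costs) for s in scenarios]
    # worst-case regret per option
    max_regret = {o: max(costs[o][s] - best[s] for s in scenarios)
                  for o in costs}
    choice = min(max_regret, key=max_regret.get)
    print(max_regret, "->", choice)
    ```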

  19. Log-Linear Model Based Behavior Selection Method for Artificial Fish Swarm Algorithm

    PubMed Central

    Huang, Zhehuang; Chen, Yidong

    2015-01-01

    The artificial fish swarm algorithm (AFSA) is a population-based optimization technique inspired by the social behavior of fishes. In the past several years, AFSA has been successfully applied in many research and application areas. The behavior of the fishes has a crucial impact on the performance of AFSA, such as its global exploration ability and convergence speed, so how to construct and select fish behaviors is an important task. To address these problems, an improved artificial fish swarm algorithm based on a log-linear model is proposed and implemented in this paper, with three main contributions. First, we propose a new behavior selection algorithm based on a log-linear model, which enhances the decision-making ability of behavior selection. Second, an adaptive movement behavior based on adaptive weights is presented, which can adjust dynamically according to the diversity of the fishes. Finally, some new behaviors are defined and introduced into the artificial fish swarm algorithm for the first time to improve global optimization capability. Experiments on high-dimensional function optimization showed that the improved algorithm has more powerful global exploration ability and reasonable convergence speed compared with the standard artificial fish swarm algorithm. PMID:25691895
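
    A minimal sketch of log-linear behavior selection, assuming it reduces to a softmax over behavior scores (the paper's exact feature functions are not reproduced here): each artificial fish samples its next behavior with probability proportional to the exponentiated score.

    ```python
    import math, random

    def select_behavior(scores, temperature=1.0):
        # softmax over the log-linear scores, then roulette-wheel sampling
        exps = [math.exp(s / temperature) for s in scores.values()]
        total = sum(exps)
        probs = [e / total for e in exps]
        r, acc = random.random(), 0.0
        for behavior, p in zip(scores, probs):
            acc += p
            if r <= acc:
                return behavior
        return list(scores)[-1]

    # illustrative scores for the four classic AFSA behaviors
    scores = {"prey": 1.2, "swarm": 0.4, "follow": 0.9, "random": 0.1}
    print(select_behavior(scores))
    ```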

  20. SU-D-BRB-05: Quantum Learning for Knowledge-Based Response-Adaptive Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    El Naqa, I; Ten, R

    Purpose: There is tremendous excitement in radiotherapy about applying data-driven methods to develop personalized clinical decisions for real-time response-based adaptation. However, classical statistical learning methods lack efficiency and the ability to predict outcomes under conditions of uncertainty and incomplete information. Therefore, we are investigating physics-inspired machine learning approaches that utilize quantum principles to develop a robust framework for dynamically adapting treatments to individual patients' characteristics and optimizing outcomes. Methods: We studied 88 liver SBRT patients, with 35 on non-adaptive and 53 on adaptive protocols. Adaptation was based on liver function, using a split course of 3+2 fractions with a month break. The radiotherapy environment was modeled as a Markov decision process (MDP) with baseline and one-month-into-treatment states. The patient environment was modeled by a 5-variable state representing the patient's clinical and dosimetric covariates. For comparison of classical and quantum learning methods, the decision whether to adapt at one month was considered. The MDP objective was defined by the complication-free tumor control, P+ = TCP × (1 − NTCP). A simple regression model represented the state-action mapping. A single bit in the classical MDP and a qubit of two superimposed states in the quantum MDP represented the decision actions. Classical decision selection was done using reinforcement Q-learning, and quantum searching was performed using Grover's algorithm, which applies uniform superposition over possible states and yields a quadratic speed-up. Results: Classical/quantum MDPs suggested adaptation (probability amplitude ≥0.5) 79% of the time for split courses and 100% for continuous courses. However, the classical MDP had an average adaptation probability of 0.5±0.22, while the quantum algorithm reached 0.76±0.28. In cases where adaptation failed, the classical MDP yielded a 0.31±0.26 average amplitude, while the quantum approach averaged a more optimistic 0.57±0.4, but with high phase fluctuations. Conclusion: Our results demonstrate that quantum machine learning approaches provide a feasible and promising framework for real-time and sequential clinical decision-making in adaptive radiotherapy.
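
    For the classical arm of this comparison, a minimal (non-quantum) Q-learning sketch over the two-stage adapt/continue decision reads as follows; the states, rewards, and parameters are illustrative stand-ins, not the study's data.

    ```python
    import random

    random.seed(0)
    states, actions = ["baseline", "one_month"], ["continue", "adapt"]
    Q = {(s, a): 0.0 for s in states for a in actions}
    alpha, gamma, epsilon = 0.1, 0.9, 0.2

    def reward(s, a):
        # toy stand-in for complication-free tumor control P+ = TCP*(1-NTCP)
        return 0.7 if (s == "one_month" and a == "adapt") else 0.5

    for episode in range(2000):
        s = "baseline"
        for s_next in ["one_month", None]:        # two-stage episode
            a = (random.choice(actions) if random.random() < epsilon
                 else max(actions, key=lambda b: Q[(s, b)]))
            r = reward(s, a)
            future = 0.0 if s_next is None else max(Q[(s_next, b)]
                                                    for b in actions)
            Q[(s, a)] += alpha * (r + gamma * future - Q[(s, a)])
            if s_next is None:
                break
            s = s_next

    # greedy policy at one month -> "adapt" under these toy rewards
    print(max(actions, key=lambda b: Q[("one_month", b)]))
    ```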

  1. Development of a Genetic Algorithm to Automate Clustering of a Dependency Structure Matrix

    NASA Technical Reports Server (NTRS)

    Rogers, James L.; Korte, John J.; Bilardo, Vincent J.

    2006-01-01

    Much technology assessment and organization design data exists in Microsoft Excel spreadsheets. Tools are needed to put this data into a form that can be used by design managers to make design decisions. One need is to cluster data that is highly coupled. Tools such as the Dependency Structure Matrix (DSM) and a Genetic Algorithm (GA) can be of great benefit. However, no tool currently combines the DSM and a GA to solve the clustering problem. This paper describes a new software tool that interfaces a GA written as an Excel macro with a DSM in spreadsheet format. The results of several test cases are included to demonstrate how well this new tool works.
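
    A minimal GA sketch for DSM clustering in the spirit described above (a hypothetical re-implementation, not the paper's Excel macro): a chromosome assigns each DSM element to a cluster, and fitness counts dependencies that cross cluster boundaries.

    ```python
    import random

    random.seed(1)
    DSM = [[0, 1, 1, 0, 0],          # illustrative 5-element dependency matrix
           [1, 0, 1, 0, 0],
           [1, 1, 0, 0, 1],
           [0, 0, 0, 0, 1],
           [0, 0, 1, 1, 0]]
    n, k, pop_size, gens = len(DSM), 2, 30, 100

    def fitness(chrom):  # lower is better: inter-cluster couplings
        return sum(DSM[i][j] for i in range(n) for j in range(n)
                   if chrom[i] != chrom[j])

    pop = [[random.randrange(k) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness)
        parents = pop[:pop_size // 2]          # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, n)       # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.2:          # mutation: reassign one element
                child[random.randrange(n)] = random.randrange(k)
            children.append(child)
        pop = parents + children

    best = min(pop, key=fitness)
    print(best, fitness(best))
    ```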

  2. Coherent controlization using superconducting qubits

    PubMed Central

    Friis, Nicolai; Melnikov, Alexey A.; Kirchmair, Gerhard; Briegel, Hans J.

    2015-01-01

    Coherent controlization, i.e., coherent conditioning of arbitrary single- or multi-qubit operations on the state of one or more control qubits, is an important ingredient for the flexible implementation of many algorithms in quantum computation. This is of particular significance when certain subroutines are changing over time or when they are frequently modified, such as in decision-making algorithms for learning agents. We propose a scheme to realize coherent controlization for any number of superconducting qubits coupled to a microwave resonator. For two and three qubits, we present an explicit construction that is of high relevance for quantum learning agents. We demonstrate the feasibility of our proposal, taking into account loss, dephasing, and the cavity self-Kerr effect. PMID:26667893

  3. Transcultural Endocrinology: Adapting Type-2 Diabetes Guidelines on a Global Scale.

    PubMed

    Nieto-Martínez, Ramfis; González-Rivas, Juan P; Florez, Hermes; Mechanick, Jeffrey I

    2016-12-01

    Type-2 diabetes (T2D) needs to be prevented and treated effectively to reduce its burden and consequences. White papers, such as evidence-based clinical practice guidelines (CPG) and their more portable versions, clinical practice algorithms and clinical checklists, may improve clinical decision-making and diabetes outcomes. However, CPG are underused and poorly validated. Protocols that translate and implement these CPG are needed. This review presents the global dimension of T2D, details the importance of white papers in the transculturalization process, compares relevant international CPG, analyzes cultural variables, and summarizes translation strategies that can improve care. Specific protocols and algorithmic tools are provided. Copyright © 2016 Elsevier Inc. All rights reserved.

  4. Mobility Performance Algorithms for Small Unmanned Ground Vehicles

    DTIC Science & Technology

    2009-05-01

    obstacles need to be developed; specifically, models and data for wheeled vehicle skid steering, interior building floor and roof surfaces, and stair ...an 80-lb SUGV; PackBot® at 50 lb, and GatorTM at 2500 lb. Additionally, the FCS projects that 40% of the military fleet may eventually be robotic ...sensor input analysis and decision-making time. Fields (2002a) discusses representing interaction of humans and robots in the OneSAF Testbed Baseline

  5. The Challenges of Human-Autonomy Teaming

    NASA Technical Reports Server (NTRS)

    Vera, Alonso

    2017-01-01

    Machine intelligence is improving rapidly based on advances in big data analytics, deep learning algorithms, networked operations, and continuing exponential growth in computing power (Moore's Law). This growth in the power and applicability of increasingly intelligent systems will change the roles of humans, shifting them to tasks where adaptive problem solving, reasoning and decision-making are required. This talk will address the challenges involved in engineering autonomous systems that function effectively with humans in aeronautics domains.

  6. Portfolio Management Decision Support Tools Analysis Relating to Management Value Metrics

    DTIC Science & Technology

    2007-03-01

    creative activities that have been labeled “management dark matter ” (Housel and Kanevsky, 2007). Further, this new source of data can be used, not...organization. “The idea of management dark matter is introduced in this literature as the use of manager’s creative insights when they attempt to...we account for the dark matter or intuitive (i.e., non-algorithmically definable) heuristics that allow a manager to make creative management

  7. Science of Decision Making: A Data-Modeling Approach

    DTIC Science & Technology

    2013-10-01

    were separated on a capillary column using the Dionex UltiMate 3000 (Sunnyvale, CA). The resolved peptides were then sprayed into a linear ion trap...database (3–5). These algorithms assign a peptide sequence, along with a matching score of the experimental ion product mass spectrum, to a theoretical ion ...Bacterial Sample Processing Samples were prepared for liquid chromatography (LC) tandem MS (LC– MS/MS) in a similar manner to that previously reported

  8. Automating the design of scientific computing software

    NASA Technical Reports Server (NTRS)

    Kant, Elaine

    1992-01-01

    SINAPSE is a domain-specific software design system that generates code from specifications of equations and algorithm methods. This paper describes the system's design techniques (planning in a space of knowledge-based refinement and optimization rules), user interaction style (user has option to control decision making), and representation of knowledge (rules and objects). It also summarizes how the system knowledge has evolved over time and suggests some issues in building software design systems to facilitate reuse.

  9. A P2P Botnet detection scheme based on decision tree and adaptive multilayer neural networks.

    PubMed

    Alauthaman, Mohammad; Aslam, Nauman; Zhang, Li; Alasem, Rafe; Hossain, M A

    2018-01-01

    In recent years, Botnets have become a popular method for carrying and spreading malicious code on the Internet. This malicious code paves the way for fraudulent activities including spam mail, distributed denial-of-service attacks and click fraud. While many Botnets are set up using a centralized communication architecture, peer-to-peer (P2P) Botnets can adopt a decentralized architecture, using an overlay network for exchanging command-and-control data, which makes their detection even more difficult. This work presents a method of P2P Bot detection based on an adaptive multilayer feed-forward neural network in cooperation with decision trees. A classification and regression tree is applied as a feature selection technique to select relevant features. With these features, a multilayer feed-forward neural network training model is created using a resilient back-propagation learning algorithm. A comparison of feature-set selection based on the decision tree, principal component analysis and the ReliefF algorithm indicated that the neural network model with feature selection based on the decision tree has better identification accuracy along with lower false positive rates. The usefulness of the proposed approach is demonstrated by conducting experiments on real network traffic datasets; in these experiments, an average detection rate of 99.08% with a false positive rate of 0.75% was observed.
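
    A sketch of the pipeline shape in scikit-learn on synthetic data (not the paper's traffic features; scikit-learn's MLP also lacks resilient back-propagation, so the default 'adam' solver stands in): a classification tree selects features, and a feed-forward network is trained on the reduced set.

    ```python
    from sklearn.datasets import make_classification
    from sklearn.feature_selection import SelectFromModel
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier
    from sklearn.tree import DecisionTreeClassifier

    X, y = make_classification(n_samples=2000, n_features=30,
                               n_informative=8, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    # CART as a feature selector: keep features above median importance
    selector = SelectFromModel(DecisionTreeClassifier(random_state=0),
                               threshold="median").fit(X_tr, y_tr)
    X_tr_sel, X_te_sel = selector.transform(X_tr), selector.transform(X_te)

    # feed-forward network trained on the selected features
    clf = MLPClassifier(hidden_layer_sizes=(20,), max_iter=500,
                        random_state=0).fit(X_tr_sel, y_tr)
    print("accuracy:", clf.score(X_te_sel, y_te))
    ```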

  10. The early use of PET-CT alters the management of patients with esophageal cancer.

    PubMed

    Williams, R N; Ubhi, S S; Sutton, C D; Thomas, A L; Entwisle, J J; Bowrey, D J

    2009-05-01

    The routine use of positron emission tomography-computed tomography (PET-CT) in the staging of patients with esophageal carcinoma remains contentious, with conflicting reports of its benefit. In our unit, PET-CT has been used routinely in the staging of all patients considered for radical therapy (surgery or chemoradiotherapy). Our aim was to determine the frequency with which PET-CT influenced decision making in the management of patients with carcinoma of the esophagus or gastroesophageal junction. CT, PET-CT, and outcome information were collected on 38 patients considered for radical therapy. Patient proformas, with and without PET-CT findings, were constructed and each independently reviewed in a randomized and blinded fashion by five multidisciplinary team members (three surgeons, two oncologists) and a treatment strategy determined. PET-CT changed the staging for ten patients (26%). This translated into a change in management decision for seven patients (18%). The concordance between individual management plans and treatment intent was 79% for CT (150 of 190 decisions) and it was 92% for PET-CT (175 of 190 decisions). Full concordance between multidisciplinary team members was 66% with CT staging and 74% with the addition of PET-CT. The use of PET-CT early in the staging algorithm for esophageal carcinoma altered the staging for a quarter of patients and the management for a fifth of patients, supporting its inclusion early in the staging algorithm.

  11. Integrated wetland management for waterfowl and shorebirds at Mattamuskeet National Wildlife Refuge, North Carolina

    USGS Publications Warehouse

    Tavernia, Brian G.; Stanton, John D.; Lyons, James E.

    2017-11-22

    Mattamuskeet National Wildlife Refuge (MNWR) offers a mix of open water, marsh, forest, and cropland habitats on 20,307 hectares in coastal North Carolina. In 1934, Federal legislation (Executive Order 6924) established MNWR to benefit wintering waterfowl and other migratory bird species. On an annual basis, the refuge staff decide how to manage 14 impoundments to benefit not only waterfowl during the nonbreeding season, but also shorebirds during fall and spring migration. In making these decisions, the challenge is to select a portfolio, or collection, of management actions for the impoundments that optimizes use by the three groups of birds while respecting budget constraints. In this study, a decision support tool was developed for these annual management decisions. Within the decision framework, there are three different management objectives: shorebird-use days during fall and spring migrations, and waterfowl-use days during the nonbreeding season. Sixteen potential management actions were identified for impoundments; each action represents a combination of hydroperiod and vegetation manipulation. Example hydroperiods include semi-permanent and seasonal drawdowns, and vegetation manipulations include mechanical-chemical treatment, burning, disking, and no action. Expert elicitation was used to build a Bayesian Belief Network (BBN) model that predicts shorebird- and waterfowl-use days for each potential management action. The BBN was parameterized for a representative impoundment, MI-9, and its predictions were re-scaled on the basis of size to predict outcomes at the other impoundments. Parameter estimates in the BBN model can be updated using observations from ongoing monitoring that is part of the Integrated Waterbird Management and Monitoring (IWMM) program. The optimal portfolio of management actions depends on the importance, that is, the weights, assigned to the three objectives, as well as the budget. Five scenarios with a variety of objective weights and budgets were developed. Given the large number of possible portfolios (16^14), a heuristic genetic algorithm was used to identify a management action portfolio that maximized use-day objectives while respecting budget constraints. The genetic algorithm identified a portfolio of management actions for each of the five scenarios, enabling refuge staff to explore the sensitivity of their management decisions to objective weights and budget constraints. The decision framework developed here provides a transparent, defensible, and testable foundation for decision making at MNWR. The BBN model explicitly structures and parameterizes a mental model previously used by an expert to assign management actions to the impoundments. With ongoing IWMM monitoring, predictions from the model can be tested, and model parameters updated, to reflect empirical observations. This framework is intended to be a living document that can be updated to reflect changes in the decision context (for example, new objectives or constraints, or new models to compete with the current BBN model). Rather than a mandate to refuge staff, this framework is intended to be a decision support tool; tool outputs can become part of the deliberations of refuge staff when making difficult management decisions for multiple objectives.

  12. Three essays on multi-level optimization models and applications

    NASA Astrophysics Data System (ADS)

    Rahdar, Mohammad

    The general form of a multi-level mathematical programming problem is a set of nested optimization problems, in which each level controls a series of decision variables independently, although the values of the decision variables may also impact the objective functions of other levels. A two-level model is called a bilevel model and can be considered a Stackelberg game with a leader and a follower: the leader anticipates the response of the follower and optimizes its objective function, and the follower then reacts to the leader's action. Multi-level decision-making models have many real-world applications, such as government decisions, energy policies, market economics and network design. However, there is a lack of algorithms capable of solving medium- and large-scale problems of these types. The dissertation is devoted to both theoretical research and applications of multi-level mathematical programming models and consists of three parts, each in paper format. The first part studies the renewable energy portfolio under two major renewable energy policies, investigating the potential competition for biomass in the growth of the renewable energy portfolio in the United States and other interactions between the two policies over the next twenty years. This problem mainly has two levels of decision makers: the government/policy makers and the biofuel producers/electricity generators/farmers. We focus on the lower-level problem to predict the amount of capacity expansion, fuel production, and power generation. In the second part, we address uncertainty over demand and lead time in a multi-stage mathematical programming problem and propose a two-stage tri-level optimization model based on a rolling-horizon approach to reduce the dimensionality of the multi-stage problem. In the third part of the dissertation, we introduce a new branch-and-bound algorithm to solve bilevel linear programming problems; the total time is reduced by solving a smaller relaxation problem at each node and decreasing the number of iterations. Computational experiments show that the proposed algorithm is faster than existing ones.

  13. Directed Bee Colony Optimization Algorithm to Solve the Nurse Rostering Problem.

    PubMed

    Rajeswari, M; Amudhavel, J; Pothula, Sujatha; Dhavachelvan, P

    2017-01-01

    The Nurse Rostering Problem (NRP) is an NP-hard combinatorial optimization scheduling problem that assigns a set of nurses to daily shifts while considering both hard and soft constraints, and it calls for a novel metaheuristic technique. This work proposes a metaheuristic called the Directed Bee Colony Optimization Algorithm, using the Modified Nelder-Mead Method, for solving the NRP. To solve the NRP, the authors use a multiobjective mathematical programming model and propose a methodology for the adaptation of a Multiobjective Directed Bee Colony Optimization (MODBCO). MODBCO is successfully used to solve the multiobjective scheduling problem, and it integrates deterministic local search, a multiagent particle-system environment, and the honey bee decision-making process. The performance of the algorithm is assessed using the standard INRC2010 dataset, which reflects many real-world cases varying in size and complexity. The experimental analysis uses statistical tools to show the uniqueness of the algorithm on the assessment criteria.

  14. Directed Bee Colony Optimization Algorithm to Solve the Nurse Rostering Problem

    PubMed Central

    Amudhavel, J.; Pothula, Sujatha; Dhavachelvan, P.

    2017-01-01

    The Nurse Rostering Problem (NRP) is an NP-hard combinatorial optimization scheduling problem that assigns a set of nurses to daily shifts while considering both hard and soft constraints, and it calls for a novel metaheuristic technique. This work proposes a metaheuristic called the Directed Bee Colony Optimization Algorithm, using the Modified Nelder-Mead Method, for solving the NRP. To solve the NRP, the authors use a multiobjective mathematical programming model and propose a methodology for the adaptation of a Multiobjective Directed Bee Colony Optimization (MODBCO). MODBCO is successfully used to solve the multiobjective scheduling problem, and it integrates deterministic local search, a multiagent particle-system environment, and the honey bee decision-making process. The performance of the algorithm is assessed using the standard INRC2010 dataset, which reflects many real-world cases varying in size and complexity. The experimental analysis uses statistical tools to show the uniqueness of the algorithm on the assessment criteria. PMID:28473849

  15. Collaborative en-route and slot allocation algorithm based on fuzzy comprehensive evaluation

    NASA Astrophysics Data System (ADS)

    Yang, Shangwen; Guo, Baohua; Xiao, Xuefei; Gao, Haichao

    2018-01-01

    To allocate en-routes and slots to flights under collaborative decision making, a collaborative en-route and slot allocation algorithm based on fuzzy comprehensive evaluation is proposed. The evaluation indexes are flight delay costs, delay time and the number of turning points, with index weights determined by the analytic hierarchy process. A remark set is established for the current two flights in the schedule that have not yet obtained an en-route and a slot; fuzzy comprehensive evaluation is then performed to determine their en-route and slot, and the procedure repeats for the remaining flights until all have obtained en-routes and slots. Matlab R2007b was used for numerical tests based on simulated data for a civil en-route. Test results show that, compared with the traditional first-come-first-served strategy, the algorithm performs better, verifying its effectiveness.
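
    A worked sketch of the fuzzy comprehensive evaluation step with invented numbers: AHP supplies the index weights (delay cost, delay time, number of turning points), each candidate flight has a membership matrix R over the remark set, and the weighted evaluation vector B = w·R ranks the flights.

    ```python
    import numpy as np

    w = np.array([0.5, 0.3, 0.2])             # hypothetical AHP index weights
    # rows: indexes (delay cost, delay time, turning points);
    # columns: remark set {good, fair, poor}
    R_flight_A = np.array([[0.7, 0.2, 0.1],
                           [0.5, 0.3, 0.2],
                           [0.6, 0.3, 0.1]])
    R_flight_B = np.array([[0.3, 0.4, 0.3],
                           [0.6, 0.3, 0.1],
                           [0.4, 0.4, 0.2]])

    for name, R in [("A", R_flight_A), ("B", R_flight_B)]:
        B = w @ R                              # fuzzy evaluation vector
        print(name, B, "-> grade:", ["good", "fair", "poor"][int(B.argmax())])
    ```

    The flight with the stronger "good" membership would receive the contested en-route and slot first, and the procedure then moves on to the next pair.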

  16. Effective application of improved profit-mining algorithm for the interday trading model.

    PubMed

    Hsieh, Yu-Lung; Yang, Don-Lin; Wu, Jungpin

    2014-01-01

    Many real-world applications of association rule mining from large databases help users make better decisions. However, they do not work well in financial markets at this time. In addition to a high profit, an investor also looks for low-risk trading with a better winning rate. The traditional approach of using minimum confidence and support thresholds needs to be changed. Based on an interday model of trading, we propose effective profit-mining algorithms which provide investors with profit rules including information about profit, risk, and winning rate. Since profit-mining in the financial market is still in its infant stage, it is important to detail the inner workings of mining algorithms and illustrate the best way to apply them. In this paper we detail our improved profit-mining algorithm and showcase effective applications with experiments on real-world trading data. The results show that our approach is practical and effective, with good performance for various datasets.

  17. Effective Application of Improved Profit-Mining Algorithm for the Interday Trading Model

    PubMed Central

    Wu, Jungpin

    2014-01-01

    Many real-world applications of association rule mining from large databases help users make better decisions. However, they do not work well in financial markets at this time. In addition to a high profit, an investor also looks for low-risk trading with a better winning rate. The traditional approach of using minimum confidence and support thresholds needs to be changed. Based on an interday model of trading, we propose effective profit-mining algorithms which provide investors with profit rules including information about profit, risk, and winning rate. Since profit-mining in the financial market is still in its infant stage, it is important to detail the inner workings of mining algorithms and illustrate the best way to apply them. In this paper we detail our improved profit-mining algorithm and showcase effective applications with experiments on real-world trading data. The results show that our approach is practical and effective, with good performance for various datasets. PMID:24688442

  18. Determination of production run time and warranty length under system maintenance and trade credits

    NASA Astrophysics Data System (ADS)

    Tsao, Yu-Chung

    2012-12-01

    Manufacturers offer a warranty period within which they will fix failed products at no cost to customers. Manufacturers also perform system maintenance when a system is in an out-of-control state. Suppliers provide a credit period for manufacturers to settle payment. This study considers a manufacturer's production and warranty decisions for an imperfect production system under system maintenance and trade credit. Specifically, it uses the economic production quantity to model these decisions, which involve how long the production run time and warranty length should be to maximise total profit. The study provides lemmas for the conditions of optimality and develops a theorem and an algorithm for solving the problems described. Numerical examples illustrate the solution procedures and provide a variety of managerial implications. Results show that determining production and warranty decisions simultaneously is superior to determining production alone. The study also discusses the effects of the related parameters on the manufacturer's decisions and profits, and its results are a useful reference for managerial decision-making and administration.

  19. A Comparative Study with RapidMiner and WEKA Tools over some Classification Techniques for SMS Spam

    NASA Astrophysics Data System (ADS)

    Foozy, Cik Feresa Mohd; Ahmad, Rabiah; Faizal Abdollah, M. A.; Chai Wen, Chuah

    2017-08-01

    SMS spamming is a serious attack that abuses the SMS service by spreading advertisements in bulk. Unwanted advertising SMS messages disturb users and violate the privacy of mobile users. To overcome these issues, many studies have proposed detecting SMS spam using data mining tools. This paper presents a comparative study of five machine learning techniques (Naïve Bayes, K-NN (K-Nearest Neighbour), Decision Tree, Random Forest and Decision Stump), comparing the accuracy obtained with RapidMiner and WEKA on the SMS Spam dataset from the UCI Machine Learning Repository.
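
    The same comparison can be sketched in scikit-learn instead of RapidMiner/WEKA, on a toy corpus (the UCI SMS Spam Collection would replace `texts`/`labels`); a depth-1 tree stands in for WEKA's Decision Stump.

    ```python
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.model_selection import cross_val_score
    from sklearn.naive_bayes import MultinomialNB
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.pipeline import make_pipeline
    from sklearn.tree import DecisionTreeClassifier

    texts = ["WIN a free prize now", "are we meeting today",
             "claim cash award", "see you at lunch",
             "free entry to contest", "call me later"] * 20
    labels = [1, 0, 1, 0, 1, 0] * 20   # 1 = spam, 0 = ham

    models = {
        "naive_bayes": MultinomialNB(),
        "knn": KNeighborsClassifier(n_neighbors=3),
        "decision_tree": DecisionTreeClassifier(random_state=0),
        "random_forest": RandomForestClassifier(random_state=0),
        "decision_stump": DecisionTreeClassifier(max_depth=1, random_state=0),
    }
    for name, model in models.items():
        pipe = make_pipeline(TfidfVectorizer(), model)
        scores = cross_val_score(pipe, texts, labels, cv=5)
        print(f"{name}: {scores.mean():.3f}")
    ```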

  20. The use of misclassification costs to learn rule-based decision support models for cost-effective hospital admission strategies.

    PubMed

    Ambrosino, R; Buchanan, B G; Cooper, G F; Fine, M J

    1995-01-01

    Cost-effective health care is at the forefront of today's important health-related issues. A research team at the University of Pittsburgh has been interested in lowering the cost of medical care by attempting to define a subset of patients with community-acquired pneumonia for whom outpatient therapy is appropriate and safe. Sensitivity and specificity requirements for this domain make it difficult to use rule-based learning algorithms with standard, accuracy-based measures of performance. This paper describes the use of misclassification costs to assist a rule-based machine-learning program in deriving a decision-support aid for choosing outpatient therapy for patients with community-acquired pneumonia.
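
    The idea of biasing a learner with misclassification costs can be sketched with a modern stand-in (the paper used a rule-based learner, not scikit-learn; the data and cost ratio below are invented): sending a truly high-risk patient home is made to cost far more than an unnecessary admission.

    ```python
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier

    # y = 1 marks a hypothetical high-risk patient (the minority class)
    X, y = make_classification(n_samples=3000, weights=[0.85, 0.15],
                               random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    plain = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)
    # 10:1 misclassification cost against missing a high-risk patient
    costed = DecisionTreeClassifier(class_weight={0: 1, 1: 10},
                                    random_state=0).fit(X_tr, y_tr)

    for name, m in [("accuracy-driven", plain), ("cost-sensitive", costed)]:
        pred = m.predict(X_te)
        missed = ((pred == 0) & (y_te == 1)).sum()  # high-risk sent home
        print(name, "missed high-risk:", missed)
    ```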

  1. A centre-free approach for resource allocation with lower bounds

    NASA Astrophysics Data System (ADS)

    Obando, Germán; Quijano, Nicanor; Rakoto-Ravalontsalama, Naly

    2017-09-01

    Since the complexity and scale of systems are continuously increasing, there is a growing interest in developing distributed algorithms capable of addressing information constraints, especially for solving optimisation and decision-making problems. In this paper, we propose a novel method to solve distributed resource allocation problems that include lower bound constraints. The optimisation process is carried out by a set of agents that use a communication network to coordinate their decisions. Convergence and optimality of the method are guaranteed under some mild assumptions related to the convexity of the problem and the connectivity of the underlying graph. Finally, we compare our approach with other techniques reported in the literature, and we present some engineering applications.

  2. Decentralized Opportunistic Spectrum Resources Access Model and Algorithm toward Cooperative Ad-Hoc Networks.

    PubMed

    Liu, Ming; Xu, Yang; Mohammed, Abdul-Wahid

    2016-01-01

    Limited communication resources have gradually become a critical factor in the efficiency of decentralized large-scale multi-agent coordination as systems scale up and tasks become more complex. In current research, owing to an agent's limited communication and observational capability, an agent in a decentralized setting can choose only a subset of channels to access and cannot perceive or share global information. Each agent's cooperative decision is based on a partial observation of the system state, and as such, uncertainty in the communication network is unavoidable. In this situation, it is a major challenge to work out cooperative decision-making under uncertainty with only a partial observation of the environment. In this paper, we propose a decentralized approach that allows agents to cooperatively search for and independently choose channels. The key to our design is to build an up-to-date observation of each agent's view so that a local decision model is achievable in large-scale team coordination. We simplify the Dec-POMDP model so that each agent can work out its communication policy to improve its local decision utilities for the choice of communication resources. Finally, we discuss an implicit resource competition game and show that an approximate resource-access tradeoff balance exists between agents. Based on this discovery, the balance between real-time decision-making and the efficiency of cooperation over these channels can be markedly improved.

  3. Scan statistics with local vote for target detection in distributed system

    NASA Astrophysics Data System (ADS)

    Luo, Junhai; Wu, Qi

    2017-12-01

    Target detection occupies a pivotal position in distributed systems. Scan statistics, one of the most efficient detection methods, has been applied to a variety of anomaly detection problems and significantly improves the probability of detection. However, scan statistics cannot achieve the expected performance when the noise intensity is strong or the signal emitted by the target is weak. The local vote algorithm can also achieve a higher target detection rate; after the local vote, the counting rule is usually adopted for decision fusion, but the counting rule does not use information about the contiguity of sensors and simply takes all sensors' data into consideration, which makes the result undesirable. In this paper, we propose a scan statistics with local vote (SSLV) method, which combines scan statistics with a local vote decision. Before the scan statistics, each sensor executes a local vote decision based on its own data and that of its neighbors. By combining the advantages of both, our method can obtain a higher detection rate in low signal-to-noise-ratio environments than scan statistics alone. After the local vote decision, the distribution of sensors that have detected the target becomes more concentrated. To make full use of the local vote decision, we introduce a variable step parameter for the SSLV, which significantly shortens the scan period, especially when the target is absent. Analysis and simulations are presented to demonstrate the performance of our method.
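
    A one-dimensional toy version of the SSLV pipeline (illustrative parameters, not the paper's model): each sensor's noisy binary decision is first replaced by a neighborhood majority vote, then a sliding-window scan statistic flags the target.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n, window, threshold = 100, 10, 7
    truth = np.zeros(n, dtype=int)
    truth[40:55] = 1                      # target covers sensors 40..54
    # each sensor reports correctly with probability 0.75
    noisy = np.where(rng.random(n) < 0.75, truth, 1 - truth)

    # local vote: majority over a 5-sensor neighborhood
    votes = np.array([int(noisy[max(0, i - 2):i + 3].sum() >= 3)
                      for i in range(n)])

    # scan statistic: maximum count over a sliding window
    counts = np.convolve(votes, np.ones(window, dtype=int), mode="valid")
    best = int(counts.argmax())
    print("detected" if counts[best] >= threshold else "absent",
          "at window", best, "count", int(counts[best]))
    ```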

  4. Heuristic thinking makes a chemist smart.

    PubMed

    Graulich, Nicole; Hopf, Henning; Schreiner, Peter R

    2010-05-01

    We focus on the virtually neglected use of heuristic principles in understanding and teaching of organic chemistry. As human thinking is not comparable to computer systems employing factual knowledge and algorithms--people rarely make decisions through careful considerations of every possible event and its probability, risks or usefulness--research in science and teaching must include psychological aspects of the human decision making processes. Intuitive analogical and associative reasoning and the ability to categorize unexpected findings typically demonstrated by experienced chemists should be made accessible to young learners through heuristic concepts. The psychology of cognition defines heuristics as strategies that guide human problem-solving and deciding procedures, for example with patterns, analogies, or prototypes. Since research in the field of artificial intelligence and current studies in the psychology of cognition have provided evidence for the usefulness of heuristics in discovery, the status of heuristics has grown into something useful and teachable. In this tutorial review, we present a heuristic analysis of a familiar fundamental process in organic chemistry--the cyclic six-electron case, and we show that this approach leads to a more conceptual insight in understanding, as well as in teaching and learning.

  5. Quantifying the costs and benefits of privacy-preserving health data publishing.

    PubMed

    Khokhar, Rashid Hussain; Chen, Rui; Fung, Benjamin C M; Lui, Siu Man

    2014-08-01

    Cost-benefit analysis is a prerequisite for making good business decisions. In the business environment, companies aim to profit from maximizing the information utility of published data while having an obligation to protect individual privacy. In this paper, we quantify the trade-off between privacy and data utility in health data publishing in terms of monetary value. We propose an analytical cost model that can help health information custodians (HICs) make better decisions about sharing person-specific health data with other parties. We examine relevant cost factors associated with the value of anonymized data and the possible damage cost due to potential privacy breaches. Our model guides an HIC to find the optimal value of publishing health data and can be utilized for both perturbative and non-perturbative anonymization techniques. We show that our approach can identify the optimal value for different privacy models, including K-anonymity, LKC-privacy, and ε-differential privacy, under various anonymization algorithms and privacy parameters through extensive experiments on real-life data. Copyright © 2014 Elsevier Inc. All rights reserved.

  6. Analytical Algorithms to Quantify the Uncertainty in Remaining Useful Life Prediction

    NASA Technical Reports Server (NTRS)

    Sankararaman, Shankar; Saxena, Abhinav; Daigle, Matthew; Goebel, Kai

    2013-01-01

    This paper investigates the use of analytical algorithms to quantify the uncertainty in the remaining useful life (RUL) estimate of components used in aerospace applications. The prediction of RUL is affected by several sources of uncertainty, and it is important to systematically quantify their combined effect by computing the uncertainty in the RUL prediction in order to aid risk assessment, risk mitigation, and decision-making. While sampling-based algorithms have been conventionally used for quantifying the uncertainty in RUL, analytical algorithms are computationally cheaper and sometimes better suited for online decision-making. While exact analytical algorithms are available only for certain special cases (e.g., linear models with Gaussian variables), effective approximations can be made using the first-order second-moment method (FOSM), the first-order reliability method (FORM), and the inverse first-order reliability method (inverse FORM). These methods can be used not only to calculate the entire probability distribution of RUL but also to obtain probability bounds on RUL. This paper explains these three methods in detail and illustrates them using the state-space model of a lithium-ion battery.
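
    A minimal FOSM sketch on an invented RUL model (not the paper's battery state-space model): the mean and variance of RUL = g(X) are propagated through a first-order Taylor expansion at the mean, giving mean(RUL) ≈ g(mu) and var(RUL) ≈ grad_g(mu)ᵀ Cov grad_g(mu).

    ```python
    import numpy as np

    def g(x):  # toy RUL model: capacity margin over degradation rate
        capacity, rate = x
        return (capacity - 1.0) / rate

    mu = np.array([1.8, 0.02])              # means of the state variables
    cov = np.diag([0.05**2, 0.004**2])      # independent uncertainties

    # numerical gradient of g at the mean
    eps = 1e-6
    grad = np.array([(g(mu + eps * e) - g(mu - eps * e)) / (2 * eps)
                     for e in np.eye(2)])

    rul_mean = g(mu)
    rul_std = float(np.sqrt(grad @ cov @ grad))
    print(f"RUL ~ {rul_mean:.1f} +/- {rul_std:.1f} cycles (1-sigma, FOSM)")
    ```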

  7. Flood Resilient Systems and their Application for Flood Resilient Planning

    NASA Astrophysics Data System (ADS)

    Manojlovic, N.; Gabalda, V.; Antanaskovic, D.; Gershovich, I.; Pasche, E.

    2012-04-01

    Following the paradigm shift in flood management from traditional to more integrated approaches, and considering the uncertainties of future development due to drivers such as climate change, one of the main emerging tasks of flood managers becomes the development of (flood) resilient cities. This can be achieved by the application of non-structural flood resilience measures, summarised in the 4As: assistance, alleviation, awareness and avoidance (FIAC, 2007). As part of this strategy, the key aspect of developing resilient cities, a resilient built environment, can be achieved by efficient application of Flood Resilience Technology (FReT) and its meaningful combination into flood resilient systems (FRS). FRS are defined as "an interconnecting network of FReT which facilitates resilience (including both restorative and adaptive capacity) to flooding, addressing physical and social systems and considering different flood typologies" (SMARTeST, http://www.floodresilience.eu/). Applying the system approach (e.g. Zevenbergen, 2008), FRS can be developed at different scales, from the building to the city level. A method to define and systematise different FRS across those scales, however, remains a matter of research. Further, deciding which resilient system to apply for the given conditions and scale is a complex task, calling for the utilisation of decision support tools. This decision-making process should follow the steps of flood risk assessment (1) and development of a flood resilience plan (2) (Manojlovic et al, 2009). The key problem in (2) is how to match the input parameters that describe the physical and social systems and the flood typology to the appropriate flood resilient system. Additionally, an open issue is how to integrate advances in FReT and findings on its efficiency into decision support tools. This paper presents a way to define, systematise and make decisions on FRS at different scales of an urban system, developed within the 7th FP Project SMARTeST. A web-based three-tier advisory system, FLORETO-KALYPSO (http://floreto.wb.tu-harburg.de/, Manojlovic et al, 2009), devoted to supporting the decision-making process at the building level, has been further developed to support multi-scale decision making on resilient systems, improving the existing data mining algorithms of the Business Logic tier. Further tuning of the algorithms is to be performed based on new developments and findings on the applicability and efficiency of different FReT for different flood typologies. The first results obtained from the case studies in Greater Hamburg, Germany indicate the potential of this approach to contribute to multiscale resilient planning on the road to flood resilient cities. FIAC (2007): "Final report from the Awareness and Assistance Sub-committee", FIAC, Scottish Government. Zevenbergen, C. et al (2008): "Challenges in urban flood management: travelling across spatial and temporal scales", Journal of Flood Risk Management, 1(2), 81-88. Manojlovic, N. et al (2009): "Capacity Building in FRM through a DSS Utilising Data Mining Approach", Proceedings of the 8th HIC, Concepcion, Chile, January 2009.

  8. A step by step selection method for the location and the size of a waste-to-energy facility targeting the maximum output energy and minimization of gate fee.

    PubMed

    Kyriakis, Efstathios; Psomopoulos, Constantinos; Kokkotis, Panagiotis; Bourtsalas, Athanasios; Themelis, Nikolaos

    2017-06-23

    This study develops an algorithm presenting a step-by-step selection method for the location and size of a waste-to-energy facility, targeting maximum energy output while also considering what is, in many cases, the basic obstacle: the gate fee. Various parameters were identified and evaluated in order to formulate the proposed decision-making method in the form of an algorithm. The principal simulation input is the amount of municipal solid waste (MSW) available for incineration, which, along with its net calorific value, is the most important factor for the feasibility of the plant. Moreover, the research focuses both on the parameters that could increase energy production and on those that affect the R1 energy efficiency factor. Estimation of the final gate fee is achieved through an economic analysis of the entire project, investigating both the expenses and the revenues expected according to the selected site and the outputs of the facility; a number of common revenue methods were included in the algorithm at this point. The developed algorithm has been validated using three case studies in Greece: Athens, Thessaloniki, and Central Greece, where the cities of Larisa and Volos were selected for the application of the proposed decision-making tool. These case studies were selected based on a previous publication by two of the authors in which these areas were examined. Results reveal that the development of a «solid» methodological approach to selecting the site and size of a waste-to-energy (WtE) facility is feasible. However, maximization of the energy efficiency factor R1 requires high utilization factors, while minimization of the final gate fee requires a high R1 and high metals recovery from the bottom ash, as well as economic exploitation of recovered raw materials, if any.
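
    For reference, the R1 energy efficiency factor invoked above is the recovery criterion of the EU Waste Framework Directive (2008/98/EC), commonly stated as

    ```latex
    R_1 = \frac{E_p - (E_f + E_i)}{0.97\,(E_w + E_f)}
    ```

    where E_p is the annual energy produced as heat or electricity (electricity weighted by a factor of 2.6 and commercially used heat by 1.1), E_f the annual energy input from fuels contributing to steam production, E_w the energy contained in the treated waste, E_i the annual imported energy excluding E_w and E_f, and 0.97 a factor accounting for losses through bottom ash and radiation. Incineration plants qualify as R1 recovery operations when the factor reaches 0.65 (0.60 for plants permitted and in operation before 2009).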

  9. Efficient frequent pattern mining algorithm based on node sets in cloud computing environment

    NASA Astrophysics Data System (ADS)

    Billa, V. N. Vinay Kumar; Lakshmanna, K.; Rajesh, K.; Reddy, M. Praveen Kumar; Nagaraja, G.; Sudheer, K.

    2017-11-01

    The ultimate goal of data mining is to extract hidden information, useful for decision making, from the large databases collected by an organization. Mining frequent itemsets is one of the most important of these tasks for transactional databases. Transactional databases contain data at a very large scale, and mining them consumes physical memory and time in proportion to database size; a frequent pattern mining algorithm is said to be efficient only if it consumes little memory and time to mine the frequent itemsets from a given large database. With these points in mind, this thesis proposes a system that mines frequent itemsets efficiently in terms of memory and time by using cloud computing to parallelize the process, with the application provided as a service. The complete framework uses a proven, efficient algorithm called FIN, which works on Nodesets and a POC (pre-order coding) tree. To evaluate the performance of the system, we conduct experiments comparing the efficiency of the same algorithm applied in a standalone manner and in a cloud computing environment on a real-world traffic accidents dataset. The results show that the memory consumption and execution time of the proposed system are much lower than those of the standalone system.

  10. Robust Blind Learning Algorithm for Nonlinear Equalization Using Input Decision Information.

    PubMed

    Xu, Lu; Huang, Defeng David; Guo, Yingjie Jay

    2015-12-01

    In this paper, we propose a new blind learning algorithm, namely, the Benveniste-Goursat input-output decision (BG-IOD), to enhance the convergence performance of neural-network-based equalizers for nonlinear channel equalization. In contrast to conventional blind learning algorithms, where only the output of the equalizer is employed for updating system parameters, the BG-IOD exploits a new type of extra information, the input decision information obtained from the input of the equalizer, to mitigate the influence of the nonlinear equalizer structure on parameter learning, thereby leading to improved convergence performance. We prove that, with the input decision information, a desirable convergence property can be achieved: the output symbol error rate (SER) is always less than the input SER whenever the input SER is below a threshold. The BG soft-switching technique is then employed to combine the merits of both input and output decision information, where the former is used to guarantee SER convergence and the latter to improve SER performance. Simulation results show that the proposed algorithm outperforms conventional blind learning algorithms, such as the stochastic quadratic distance and dual-mode constant modulus algorithms, in terms of both convergence performance and SER performance for nonlinear equalization.

  11. Block-Based Connected-Component Labeling Algorithm Using Binary Decision Trees

    PubMed Central

    Chang, Wan-Yu; Chiu, Chung-Cheng; Yang, Jia-Horng

    2015-01-01

    In this paper, we propose a fast labeling algorithm based on block-based concepts. Because the number of memory access points directly affects the time consumption of labeling algorithms, the aim of the proposed algorithm is to minimize neighborhood operations. Our algorithm takes a block-based view and correlates it with a raster scan to select the necessary pixels generated by a block-based scan mask. We analyze the advantages of a sequential raster scan for the block-based scan mask, and integrate the block-connected relationships using two different procedures with binary decision trees to reduce unnecessary memory access. This greatly simplifies the pixel locations of the block-based scan mask. Furthermore, our algorithm significantly reduces the number of leaf nodes and depth levels required in the binary decision tree. We analyze the labeling performance of the proposed algorithm alongside that of other labeling algorithms using high-resolution images and foreground images. The experimental results from synthetic and real image datasets demonstrate that the proposed algorithm is faster than other methods. PMID:26393597

  12. A MAP-based image interpolation method via Viterbi decoding of Markov chains of interpolation functions.

    PubMed

    Vedadi, Farhang; Shirani, Shahram

    2014-01-01

    A new method of image resolution up-conversion (image interpolation) based on maximum a posteriori sequence estimation is proposed. Instead of making a hard decision about the value of each missing pixel, we estimate the missing pixels in groups. At each missing pixel of the high resolution (HR) image, we consider an ensemble of candidate interpolation methods (interpolation functions). The interpolation functions are interpreted as states of a Markov model. In other words, the proposed method undergoes state transitions from one missing pixel position to the next. Accordingly, the interpolation problem is translated to the problem of estimating the optimal sequence of interpolation functions corresponding to the sequence of missing HR pixel positions. We derive a parameter-free probabilistic model for this to-be-estimated sequence of interpolation functions. Then, we solve the estimation problem using a trellis representation and the Viterbi algorithm. Using directional interpolation functions and sequence estimation techniques, we classify the new algorithm as an adaptive directional interpolation using soft-decision estimation techniques. Experimental results show that the proposed algorithm yields images with higher or comparable peak signal-to-noise ratios compared with some benchmark interpolation methods in the literature while being efficient in terms of implementation and complexity considerations.
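
    A generic Viterbi sketch matching this framing (transition and emission probabilities are invented; the paper's model is parameter-free, whereas fixed matrices are used here for brevity): states are candidate directional interpolation functions, and the MAP state sequence over missing-pixel positions is recovered by dynamic programming.

    ```python
    import numpy as np

    states = ["horizontal", "vertical", "diagonal"]
    log_trans = np.log([[0.8, 0.1, 0.1],    # smooth images favor staying
                        [0.1, 0.8, 0.1],    # in the same direction
                        [0.15, 0.15, 0.7]])
    # log-likelihood of each state at 5 missing-pixel positions
    log_emit = np.log([[0.7, 0.2, 0.1],
                       [0.6, 0.3, 0.1],
                       [0.2, 0.6, 0.2],
                       [0.1, 0.8, 0.1],
                       [0.3, 0.5, 0.2]])

    T, S = log_emit.shape
    score = np.full((T, S), -np.inf)
    back = np.zeros((T, S), dtype=int)
    score[0] = np.log(1.0 / S) + log_emit[0]    # uniform prior
    for t in range(1, T):
        cand = score[t - 1][:, None] + log_trans  # S x S predecessor scores
        back[t] = cand.argmax(axis=0)
        score[t] = cand.max(axis=0) + log_emit[t]

    path = [int(score[-1].argmax())]             # backtrack the MAP sequence
    for t in range(T - 1, 0, -1):
        path.append(int(back[t][path[-1]]))
    print([states[s] for s in reversed(path)])
    ```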

  13. Treatment algorithm for anal fissure. Consensus document of the Spanish Association of Coloproctology and the Coloproctology Division of the Spanish Association of Surgeons.

    PubMed

    Arroyo, Antonio; Montes, Elisa; Calderón, Teresa; Blesa, Isabel; Elía, Manuela; Salgado, Gervasio; García-Armengol, Juan; de-la-Portilla, Fernando

    2018-03-08

    The Spanish Association of Coloproctology and the Coloproctology Division of the Spanish Association of Surgeons propose this consensus document with a treatment algorithm for anal fissure that can be used for decision making. Non-surgical therapy and surgical treatment of anal fissure are explained, and the recommended algorithm is provided. The methodology used was: creation of a group of experts; a search in PubMed, MEDLINE and the Cochrane Library for publications from the last 10 years about anal fissure; presentation at the 21st National Meeting of the Spanish Association of Coloproctology Foundation 2017, with voting for/against each conclusion by the attendees; and review by the scientific committee of the Spanish Association of Coloproctology. Copyright © 2018 AEC. Published by Elsevier España, S.L.U. All rights reserved.

  14. Development and experimentation of LQR/APF guidance and control for autonomous proximity maneuvers of multiple spacecraft

    NASA Astrophysics Data System (ADS)

    Bevilacqua, R.; Lehmann, T.; Romano, M.

    2011-04-01

    This work introduces a novel control algorithm for autonomous close-proximity maneuvers of multiple spacecraft, based on a hybrid linear quadratic regulator/artificial potential function (LQR/APF) approach, for applications including autonomous docking, on-orbit assembly and spacecraft servicing. Both theoretical developments and experimental validation of the proposed approach are presented. Fuel consumption is sub-optimized in real time through re-computation of the LQR at each sample time, while collision avoidance is performed through the APF and a high-level decisional logic. The underlying LQR/APF controller is integrated with a customized wall-following technique and a decisional logic, overcoming problems such as local minima. The algorithm is experimentally tested on a four-spacecraft-simulator test bed at the Spacecraft Robotics Laboratory of the Naval Postgraduate School. The metrics used to evaluate the control algorithm are: autonomy of the system in making decisions, successful completion of the maneuver, required time, and propellant consumption.
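
    A minimal discrete-time LQR sketch of the kind of controller recomputed at each sample time (the APF collision-avoidance term, wall-following, and decisional logic are omitted; double-integrator dynamics and weights are illustrative, not the paper's relative-motion model): the Riccati recursion is iterated to a fixed point and the state feedback u = -Kx is applied.

    ```python
    import numpy as np

    dt = 0.1
    A = np.array([[1, dt], [0, 1]])          # position, velocity
    B = np.array([[0.5 * dt**2], [dt]])
    Q = np.diag([10.0, 1.0])                 # penalize state error
    R = np.array([[0.1]])                    # penalize thrust (fuel proxy)

    P = Q.copy()
    for _ in range(500):                     # iterate the Riccati recursion
        K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
        P = Q + A.T @ P @ (A - B @ K)

    x = np.array([[5.0], [0.0]])             # 5 m offset, at rest
    for _ in range(100):
        u = -K @ x                           # recomputable at each sample time
        x = A @ x + B @ u
    print("final state:", x.ravel())
    ```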

  15. Adaptive process control using fuzzy logic and genetic algorithms

    NASA Technical Reports Server (NTRS)

    Karr, C. L.

    1993-01-01

    Researchers at the U.S. Bureau of Mines have developed adaptive process control systems in which genetic algorithms (GA's) are used to augment fuzzy logic controllers (FLC's). GA's are search algorithms that rapidly locate near-optimum solutions to a wide spectrum of problems by modeling the search procedures of natural genetics. FLC's are rule based systems that efficiently manipulate a problem environment by modeling the 'rule-of-thumb' strategy used in human decision making. Together, GA's and FLC's possess the capabilities necessary to produce powerful, efficient, and robust adaptive control systems. To perform efficiently, such control systems require a control element to manipulate the problem environment, and a learning element to adjust to the changes in the problem environment. Details of an overall adaptive control system are discussed. A specific laboratory acid-base pH system is used to demonstrate the ideas presented.
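
    As a rough illustration of this GA-plus-FLC architecture (which also underlies the two following records), the sketch below evolves the membership-function widths of a toy one-input fuzzy controller on an invented first-order plant; the rule base, plant, and GA settings are assumptions, not the Bureau of Mines pH system.

```python
import numpy as np

rng = np.random.default_rng(0)

def tri(x, center, width):
    """Triangular membership function."""
    return np.maximum(0.0, 1.0 - abs(x - center) / width)

def fuzzy_controller(error, widths):
    # three assumed rules: negative/zero/positive error -> actions -1 / 0 / +1
    centers, actions = np.array([-1.0, 0.0, 1.0]), np.array([-1.0, 0.0, 1.0])
    mu = tri(error, centers, widths)
    return float(mu @ actions / (mu.sum() + 1e-9))  # weighted-average defuzzification

def fitness(widths, setpoint=1.0, steps=50):
    y, cost = 0.0, 0.0
    for _ in range(steps):
        u = fuzzy_controller(setpoint - y, widths)
        y += 0.2 * (u - 0.1 * y)   # toy first-order plant
        cost += (setpoint - y) ** 2
    return -cost                    # GA maximizes fitness = -tracking cost

pop = rng.uniform(0.1, 3.0, size=(30, 3))      # each genome: three MF widths
for gen in range(40):
    scores = np.array([fitness(w) for w in pop])
    parents = pop[np.argsort(scores)[-10:]]    # truncation selection
    children = parents[rng.integers(0, 10, 30)] + rng.normal(0, 0.1, (30, 3))
    pop = np.clip(children, 0.05, 5.0)         # mutation + bounds
best = pop[np.argmax([fitness(w) for w in pop])]
print("evolved membership widths:", best)
```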

  16. Adaptive Process Control with Fuzzy Logic and Genetic Algorithms

    NASA Technical Reports Server (NTRS)

    Karr, C. L.

    1993-01-01

    Researchers at the U.S. Bureau of Mines have developed adaptive process control systems in which genetic algorithms (GA's) are used to augment fuzzy logic controllers (FLC's). GA's are search algorithms that rapidly locate near-optimum solutions to a wide spectrum of problems by modeling the search procedures of natural genetics. FLC's are rule based systems that efficiently manipulate a problem environment by modeling the 'rule-of-thumb' strategy used in human decision-making. Together, GA's and FLC's possess the capabilities necessary to produce powerful, efficient, and robust adaptive control systems. To perform efficiently, such control systems require a control element to manipulate the problem environment, an analysis element to recognize changes in the problem environment, and a learning element to adjust to the changes in the problem environment. Details of an overall adaptive control system are discussed. A specific laboratory acid-base pH system is used to demonstrate the ideas presented.

  17. Genetic algorithms in adaptive fuzzy control

    NASA Technical Reports Server (NTRS)

    Karr, C. Lucas; Harper, Tony R.

    1992-01-01

    Researchers at the U.S. Bureau of Mines have developed adaptive process control systems in which genetic algorithms (GA's) are used to augment fuzzy logic controllers (FLC's). GA's are search algorithms that rapidly locate near-optimum solutions to a wide spectrum of problems by modeling the search procedures of natural genetics. FLC's are rule based systems that efficiently manipulate a problem environment by modeling the 'rule-of-thumb' strategy used in human decision making. Together, GA's and FLC's possess the capabilities necessary to produce powerful, efficient, and robust adaptive control systems. To perform efficiently, such control systems require a control element to manipulate the problem environment, an analysis element to recognize changes in the problem environment, and a learning element to adjust fuzzy membership functions in response to the changes in the problem environment. Details of an overall adaptive control system are discussed. A specific computer-simulated chemical system is used to demonstrate the ideas presented.

  18. On a Game of Large-Scale Projects Competition

    NASA Astrophysics Data System (ADS)

    Nikonov, Oleg I.; Medvedeva, Marina A.

    2009-09-01

    The paper is devoted to game-theoretical control problems motivated by economic decision-making situations that arise in the realization of large-scale projects, such as designing and putting into operation new gas or oil pipelines. A non-cooperative two-player game is considered, with payoff functions of a special type for which standard existence theorems and algorithms for finding Nash equilibrium solutions are not applicable. The paper is based on and develops the results obtained in [1]-[5].

  19. A Relative Ranking Approach for Nano-Enabled Applications to Improve Risk-Based Decision Making: A Case Study of Army Materiel

    DTIC Science & Technology

    2014-12-24

    scenarios. The USACEHR has been conducting research and development efforts on the incorporation of various ENMs into Army materiel, ranging from food ... materiel characteristics, and (3) apply the algorithm and associated risk ranking tool to prioritize additional assessments based on the human health risk ... online correspondence to confirm, edit, and supplement the inventory with additional information (See Section 1 in Supplementary Information (SI) for ...

  20. Kernel-Based Approximate Dynamic Programming Using Bellman Residual Elimination

    DTIC Science & Technology

    2010-02-01

    framework is the ability to utilize stochastic system models, thereby allowing the system to make sound decisions even if there is randomness in the system ... approximate policy when a system model is unavailable. We present theoretical analysis of all BRE algorithms proving convergence to the optimal policy in ... policies based on MDPs is that there may be parameters of the system model that are poorly known and/or vary with time as the system operates. System ...
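
    The record above is a truncated DTIC snippet, so only the general technique is recoverable. As a generic illustration of Bellman residual minimization for policy evaluation (with linear features standing in for the report's kernel machinery), consider:

```python
import numpy as np

# Tiny assumed MDP: two states, a fixed policy, known transitions and rewards.
gamma = 0.9
P = np.array([[0.8, 0.2], [0.3, 0.7]])   # transition matrix under the policy
r = np.array([1.0, 0.0])                 # expected one-step rewards
Phi = np.array([[1.0, 0.0], [1.0, 1.0]]) # feature matrix, one row per state

# Bellman residual for V = Phi w is (Phi - gamma P Phi) w - r; minimize it
A = Phi - gamma * P @ Phi
w = np.linalg.lstsq(A, r, rcond=None)[0]  # least-squares residual minimization
V = Phi @ w
print("value estimate:", V)
print("residual norm:", np.linalg.norm(A @ w - r))  # ~0: residual eliminated
```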

  1. Intelligent Local Avoided Collision (iLAC) MAC Protocol for Very High Speed Wireless Network

    NASA Astrophysics Data System (ADS)

    Hieu, Dinh Chi; Masuda, Akeo; Rabarijaona, Verotiana Hanitriniala; Shimamoto, Shigeru

    Future wireless communication systems aim at very high data rates. As the medium access control (MAC) protocol plays the central role in determining the overall performance of the wireless system, designing a suitable MAC protocol is critical to fully exploit the benefit of the high-speed transmission that the physical layer (PHY) offers. In the latest 802.11n standard [2], the problem of long overhead has been addressed adequately, but the issue of excessive colliding transmissions, especially in congested situations, remains untouched. The procedure for setting the backoff value is the heart of the 802.11 distributed coordination function (DCF) for collision avoidance, in which each station makes its own decision on how to avoid collision in the next transmission. However, collision avoidance is a problem that cannot be solved by a single station. In this paper, we introduce a new MAC protocol called Intelligent Local Avoided Collision (iLAC) that redefines individual rationality in choosing the backoff counter value to avoid a colliding transmission. The distinguishing feature of iLAC is that it fundamentally changes this decision-making process from collision avoidance to collaborative collision prevention. As a result, stations can avoid colliding transmissions with much greater precision. The analytical solution confirms the validity of this proposal, and simulation results show that the proposed algorithm outperforms the conventional algorithms by a large margin.
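
    A toy Monte Carlo contrast of the two decision rules at issue: purely independent backoff choices versus a coordinated choice in which stations avoid counters already claimed by neighbors. The window size and the idealized coordination rule are assumptions for illustration, not the iLAC specification.

```python
import random

def collision(n_stations, cw, coordinated, rng):
    if not coordinated:
        # DCF-style: each station picks its backoff counter independently
        picks = [rng.randrange(cw) for _ in range(n_stations)]
    else:
        # idealized local coordination: stations end up with distinct counters
        free = list(range(cw))
        rng.shuffle(free)
        picks = free[:n_stations]
    winner = min(picks)                 # smallest counter transmits first
    return picks.count(winner) > 1      # >1 station firing in the same slot

rng = random.Random(1)
trials = 100_000
for mode in (False, True):
    hits = sum(collision(8, 16, mode, rng) for _ in range(trials))
    print("coordinated" if mode else "independent", hits / trials)
```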

  2. Associative memory or algorithmic search: a comparative study on learning strategies of bats and shrews.

    PubMed

    Page, Rachel A; von Merten, Sophie; Siemers, Björn M

    2012-07-01

    Two common strategies for successful foraging are learning to associate specific sensory cues with patches of prey ("associative learning") and using set decision-making rules to systematically scan for prey ("algorithmic search"). We investigated whether an animal's life history affects which of these two foraging strategies it is likely to use. Natterer's bats (Myotis nattereri) have slow life-history traits and we predicted they would be more likely to use associative learning. Common shrews (Sorex araneus) have fast life-history traits and we predicted that they would rely more heavily on routine-based search. Apart from their marked differences in life-history traits, these two mammals are similar in body size, brain weight, habitat, and diet. We assessed foraging strategy, associative learning ability, and retention time with a four-arm maze; one arm contained a food reward and was marked with four sensory stimuli. Bats and shrews differed significantly in their foraging strategies. Most bats learned to associate the sensory stimuli with the reward and remembered this association over time. Most shrews searched the maze using consistent decision-making rules, but did not learn or remember the association. We discuss these results in terms of life-history traits and other key differences between these species. Our results suggest a link between an animal's life-history strategy and its use of associative learning.

  3. A Cross-Layer User Centric Vertical Handover Decision Approach Based on MIH Local Triggers

    NASA Astrophysics Data System (ADS)

    Rehan, Maaz; Yousaf, Muhammad; Qayyum, Amir; Malik, Shahzad

    Vertical handover decision algorithms that are based on user preferences and coupled with Media Independent Handover (MIH) local triggers have not been explored much in the literature. We have developed a comprehensive cross-layer solution, called the Vertical Handover Decision (VHOD) approach, which consists of three parts: a mechanism for collecting and storing user preferences, the Vertical Handover Decision (VHOD) algorithm, and the MIH Function (MIHF). The MIHF triggers the VHOD algorithm, which operates on user preferences to issue handover commands to the mobility management protocol. The VHOD algorithm is an MIH User and therefore needs to subscribe to events and configure thresholds for receiving triggers from the MIHF. In this regard, we have performed experiments in WLAN to suggest thresholds for the Link Going Down trigger. We have also critically evaluated the handover decision process, proposed a Just-in-time interface activation technique, compared our proposed approach with prominent user-centric approaches, and analyzed our approach from different aspects.
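
    As an illustration of the kind of MIH local trigger the VHOD algorithm subscribes to, the sketch below fires a Link Going Down event when a short linear extrapolation of recent RSSI samples crosses a threshold; the window, horizon, and threshold values are assumptions, not the thresholds suggested by the authors' WLAN experiments.

```python
import numpy as np

def link_going_down(rssi_window, threshold_dbm=-85.0, horizon=5):
    """Fire LGD when the extrapolated RSSI will cross the threshold."""
    t = np.arange(len(rssi_window))
    slope, intercept = np.polyfit(t, rssi_window, 1)  # linear RSSI trend
    predicted = intercept + slope * (len(rssi_window) - 1 + horizon)
    return predicted < threshold_dbm

samples = [-70, -72, -75, -79, -82]   # recent dBm readings from the serving WLAN
print("LGD trigger:", link_going_down(samples))
```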

  4. Extending Wireless Rechargeable Sensor Network Life without Full Knowledge.

    PubMed

    Najeeb, Najeeb W; Detweiler, Carrick

    2017-07-17

    When extending the life of Wireless Rechargeable Sensor Networks (WRSN), one challenge is charging networks as they grow larger. Overcoming this limitation will render a WRSN more practical and highly adaptable to growth in the real world. Most charging algorithms require a priori full knowledge of sensor nodes' power levels in order to determine the nodes that require charging. In this work, we present a probabilistic algorithm that extends the life of a scalable WRSN without a priori power knowledge and without full network exploration. We develop a probability bound on the power level of the sensor nodes and utilize this bound to make decisions while exploring a WRSN. We verify the algorithm by simulating a wireless power transfer unmanned aerial vehicle charging a WRSN to extend its life. Our results show that, without such knowledge, our proposed algorithm extends the life of a WRSN to, on average, 90% of what an optimal full-knowledge algorithm can achieve. This means that the charging robot does not need to explore the whole network, which enables the scaling of WRSN. We analyze the impact of network parameters on our algorithm and show that it is insensitive to a large range of parameter values.
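
    A minimal sketch of deciding without full knowledge, in the spirit of the paper: bound the probability that a node's battery has fallen below a critical level given only the time since it was last observed, assuming a Gaussian per-hour drain model. The model and all numbers are illustrative assumptions, not the authors' actual bound.

```python
import math

def prob_depleted(last_known_level, hours_elapsed, mean_drain=1.0,
                  sd_drain=0.3, critical=10.0):
    """P(level_now < critical) when drain per hour ~ N(mean_drain, sd_drain^2)."""
    mu = last_known_level - mean_drain * hours_elapsed
    sigma = sd_drain * math.sqrt(hours_elapsed)
    z = (critical - mu) / sigma
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))   # Gaussian CDF

# visit (and charge) a node only when its depletion risk passes a bound
for hours in (30, 48, 60):
    p = prob_depleted(last_known_level=60.0, hours_elapsed=hours)
    print(f"{hours:3d} h since visit: P(depleted) = {p:.3f}",
          "-> charge" if p > 0.1 else "-> skip")
```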

  5. Extending Wireless Rechargeable Sensor Network Life without Full Knowledge

    PubMed Central

    Najeeb, Najeeb W.; Detweiler, Carrick

    2017-01-01

    When extending the life of Wireless Rechargeable Sensor Networks (WRSN), one challenge is charging networks as they grow larger. Overcoming this limitation will render a WRSN more practical and highly adaptable to growth in the real world. Most charging algorithms require a priori full knowledge of sensor nodes’ power levels in order to determine the nodes that require charging. In this work, we present a probabilistic algorithm that extends the life of a scalable WRSN without a priori power knowledge and without full network exploration. We develop a probability bound on the power level of the sensor nodes and utilize this bound to make decisions while exploring a WRSN. We verify the algorithm by simulating a wireless power transfer unmanned aerial vehicle charging a WRSN to extend its life. Our results show that, without such knowledge, our proposed algorithm extends the life of a WRSN to, on average, 90% of what an optimal full-knowledge algorithm can achieve. This means that the charging robot does not need to explore the whole network, which enables the scaling of WRSN. We analyze the impact of network parameters on our algorithm and show that it is insensitive to a large range of parameter values. PMID:28714936

  6. Optimal strategies for electric energy contract decision making

    NASA Astrophysics Data System (ADS)

    Song, Haili

    2000-10-01

    The power industry restructuring in various countries in recent years has created an environment in which electric energy is traded in a market setting. In such an environment, electric power companies compete for market share through spot and bilateral markets. Being profit driven, electric power companies need to make decisions on spot market bidding, contract evaluation, and risk management. New methods and software tools are required to meet these upcoming needs. In this research, bidding strategy and contract pricing are studied from a market participant's viewpoint; new methods are developed to guide a market participant in spot and bilateral market operation. A supplier's spot market bidding decision is studied. A stochastic optimization problem is formulated to calculate a supplier's optimal bids in a single time period. This decision-making problem is also formulated as a Markov Decision Process. All the competitors are represented by their bidding parameters with corresponding probabilities. A systematic method is developed to calculate transition probabilities and rewards. The optimal strategy is calculated to maximize the expected reward over a planning horizon. Besides the spot market, a power producer can also trade in the bilateral markets. Bidding strategies in a bilateral market are studied with game-theoretic techniques. Necessary and sufficient conditions for a Nash Equilibrium (NE) bidding strategy are derived based on the generators' costs and the loads' willingness to pay. The study shows that in any NE, market efficiency is achieved. Furthermore, all Nash equilibria are revenue equivalent for the generators. The pricing of "Flexible" contracts, which allow delivery flexibility over a period of time with a fixed total amount of electricity to be delivered, is analyzed based on the no-arbitrage pricing principle. The proposed algorithm calculates the price based on the optimality condition of the stochastic optimization formulation. Simulation examples illustrate the tradeoffs between prices and scheduling flexibility. Spot bidding and contract pricing are not independent decision processes. The interaction between spot bidding and contract evaluation is demonstrated with a game-theoretic equilibrium model and market simulation results. This leads to the conclusion that a market participant's contract decision making needs to be further investigated as an integrated optimization formulation.
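
    The Markov Decision Process formulation sketched above can be illustrated with a toy value iteration in which states are market-price regimes and actions are bid levels; the transition probabilities and rewards below are invented for illustration only.

```python
import numpy as np

states, n_actions = 2, 2         # {low-price, high-price} x {bid low, bid high}
P = np.array([                   # P[a, s, s']: assumed price-regime dynamics
    [[0.7, 0.3], [0.4, 0.6]],    # after a low bid
    [[0.5, 0.5], [0.2, 0.8]],    # after a high bid
])
R = np.array([[1.0, 0.0],        # R[a, s]: assumed expected profit of bid a
              [2.0, 0.5]])       # in price regime s
gamma = 0.95

V = np.zeros(states)
for _ in range(500):             # value iteration to a fixed point
    Q = R + gamma * (P @ V)      # Q[a, s] = reward + discounted future value
    V = Q.max(axis=0)
policy = Q.argmax(axis=0)        # optimal bid in each price regime
print("optimal bid per price regime:", policy, "values:", V.round(2))
```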

  7. Unified method of knowledge representation in the evolutionary artificial intelligence systems

    NASA Astrophysics Data System (ADS)

    Bykov, Nickolay M.; Bykova, Katherina N.

    2003-03-01

    The evolution of artificial intelligence systems, driven by the growing complexity of their application domains and by scientific progress, has resulted in a diversification of the methods and algorithms for representing and using knowledge in these systems. For this reason it is often difficult to design effective methods of knowledge discovery and manipulation for such systems. In this work, the authors propose a method for the unified representation of a system's knowledge about objects of the external world by rank transformation of their descriptions, made in different feature spaces: deterministic, probabilistic, fuzzy and others. A proof is presented that information about the rank configuration of the object states in the feature space is sufficient for decision making. It is shown that the geometrical and combinatorial models of the set of rank configurations introduce a group structure on a certain incidence system, which allows the information about them to be stored in a compressed form. A method for describing a rank configuration by a DRP code (distance rank preserving code) is proposed, and the questions of its completeness, information capacity, noise immunity and privacy are reviewed. It is shown that the capacity of a transmission channel for such a representation is greater than unity, since the code words carry information both about the object states and about the distance ranks between them. An efficient data clustering algorithm for identifying object states, based on this code, is described. Knowledge representation with the help of rank configurations allows decision-making algorithms to be unified and simplified by performing logic operations on DRP code words. Examples of the proposed clustering technique operating on a given sample set, the rank configurations of the resulting clusters and their DRP codes are presented.
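
    A small sketch of the rank-transformation step at the heart of this proposal: raw feature values from heterogeneous (deterministic, probabilistic, fuzzy) spaces are replaced by their per-feature ranks, so all descriptions map to one ordinal representation. The data are invented, and the DRP-code construction itself is not reproduced here.

```python
import numpy as np

objects = np.array([
    [0.82, 3.1, 0.40],   # each row: one object state,
    [0.10, 7.9, 0.95],   # columns: features from heterogeneous spaces
    [0.55, 1.2, 0.70],
])

# per-feature rank transform: argsort of argsort yields ranks 0..n-1
ranks = objects.argsort(axis=0).argsort(axis=0)
print(ranks)
# decisions can now be based on the rank configuration alone, and two
# descriptions can be compared by rank vectors instead of raw values
```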

  8. Communication for end-of-life care planning among Korean patients with terminal cancer: A context-oriented model.

    PubMed

    Koh, Su Jin; Kim, Shinmi; Kim, Jinshil

    2016-02-01

    In Korea, patients with terminal cancer are often left out of the loop in end-of-life (EoL) care discussions. Healthcare professionals also have difficulty engaging in such communication in a variety of healthcare contexts. Therefore, the objective of our study was to develop a communication model for EoL care decision making compatible with the clinical environment in Korea. Using focus-group interview methodology, participants included eight doctors and five nurses who provide EoL care for terminal cancer patients in acute hospital settings or hospice care facilities in various provinces of Korea. Five themes emerged regarding EoL care discussion: (1) timing, (2) responsible professionals, (3) disclosure of bad news, (4) content areas of EoL care discussion, and (5) implementing strategies for EoL care discussions. These themes formed the basis for the development of a communication algorithm for EoL discussion among patients with terminal cancer. A structured communication step for delivery of a terminal prognosis was specified for the disclosure-of-bad-news phase: beginning with determination of a patient's decision-making capability, followed by the patient's perception of his/her condition, the patient's wish to know, family dynamics, and the patient's and/or family's readiness for EoL discussions. The proposed context-oriented communication algorithm could provide a helpful guideline for EoL communication and, accordingly, facilitate meaningful improvements in EoL care in Korean clinical practice. The feasibility of this algorithm has not yet been determined, and its validation in a larger sample of patients with terminal cancer, using a quantitative research methodology, is a research priority.

  9. Modeling of geoelectric parameters for assessing groundwater potentiality in a multifaceted geologic terrain, Ipinsa Southwest, Nigeria - A GIS-based GODT approach

    NASA Astrophysics Data System (ADS)

    Mogaji, Kehinde Anthony; Omobude, Osayande Bright

    2017-12-01

    Modeling of groundwater potentiality zones is a vital scheme for effective management of groundwater resources. This study developed a new multi-criteria decision-making algorithm for groundwater potentiality modeling by modifying the standard GOD model. The developed model, christened the GODT model, was applied to assess groundwater potential in a multifaceted crystalline geologic terrain in southwestern Nigeria using four unified groundwater potential conditioning factors derived from the interpreted geophysical data acquired in the area, namely: Groundwater hydraulic confinement (G), aquifer Overlying strata resistivity (O), Depth to water table (D), and Thickness of aquifer (T). With the developed model algorithm, the GIS-produced G, O, D and T maps were synthesized to estimate groundwater potential index (GWPI) values for the area. The estimated GWPI values were processed in a GIS environment to produce a groundwater potential prediction index (GPPI) map, which demarcates the area into four potential zones. The produced GODT model-based GPPI map was validated by applying both a correlation technique and a spatial attribute comparative scheme (SACS). The performance of the GODT model was compared with that of the standard analytic hierarchy process (AHP) model. The correlation technique yielded a regression coefficient of 89% for the GODT modeling algorithm, compared with 84% for the AHP model. The SACS validation results for the GODT and AHP models were 72.5% and 65%, respectively. The overall results indicate that both models have good capability for predicting groundwater potential zones, with the GIS-based GODT model a good alternative. The GPPI maps produced in this study can form part of a decision-making model for environmental planning and groundwater management in the area.
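
    A condensed sketch of a GIS-style multi-criteria GWPI computation in the GODT spirit: normalize each conditioning-factor raster, weight it, sum, and cut the result into four zones. The toy rasters and the weights are placeholders, not the paper's calibrated values.

```python
import numpy as np

def normalize(raster):
    lo, hi = raster.min(), raster.max()
    return (raster - lo) / (hi - lo + 1e-12)

# toy 3x3 rasters for the four conditioning factors G, O, D, T
G = np.array([[1, 2, 3], [2, 3, 3], [1, 1, 2]], float)
O = np.array([[40, 80, 120], [60, 90, 150], [30, 50, 70]], float)
D = np.array([[5, 8, 12], [6, 10, 14], [4, 7, 9]], float)
T = np.array([[10, 20, 35], [15, 25, 40], [8, 12, 18]], float)

weights = {"G": 0.3, "O": 0.2, "D": 0.2, "T": 0.3}   # assumed factor weights
gwpi = (weights["G"] * normalize(G) + weights["O"] * normalize(O)
        + weights["D"] * normalize(D) + weights["T"] * normalize(T))

# demarcate potential zones by quartile, mirroring the four-zone GPPI map
zones = np.digitize(gwpi, np.quantile(gwpi, [0.25, 0.5, 0.75]))
print(zones)   # 0 = low ... 3 = high groundwater potential
```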

  10. Analytical and CASE study on Limited Search, ID3, CHAID, C4.5, Improved C4.5 and OVA Decision Tree Algorithms to design Decision Support System

    NASA Astrophysics Data System (ADS)

    Kaur, Parneet; Singh, Sukhwinder; Garg, Sushil; Harmanpreet

    2010-11-01

    In this paper we study classification algorithms for farm decision support systems (DSS). Results are obtained by applying the classification algorithms (Limited Search, ID3, CHAID, C4.5, Improved C4.5 and One-vs-All Decision Tree) to a common crop data set with a specified class attribute. The tool used to derive the results is SPINA. The graphical results obtained from the tool are compared to suggest the best technique for developing a farm decision support system. This analysis should help researchers design effective and fast DSS that let farmers make decisions to enhance their yield.
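
    For context, the split criterion shared by ID3-family trees used in such DSS studies is information gain; the sketch below computes it on an invented two-attribute crop dataset.

```python
import math
from collections import Counter

def entropy(labels):
    n = len(labels)
    return -sum(c / n * math.log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, attr_index):
    """Entropy of the class minus the entropy remaining after the split."""
    total = entropy(labels)
    by_value = {}
    for row, y in zip(rows, labels):
        by_value.setdefault(row[attr_index], []).append(y)
    remainder = sum(len(ys) / len(labels) * entropy(ys)
                    for ys in by_value.values())
    return total - remainder

# invented crop records: (soil type, rainfall level) -> yield class
rows = [("loam", "high"), ("sand", "low"), ("loam", "low"), ("clay", "high")]
labels = ["good", "poor", "good", "good"]
for i, name in enumerate(["soil", "rainfall"]):
    print(name, round(information_gain(rows, labels, i), 3))
# ID3 would split on the attribute with the larger gain (here: soil)
```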

  11. Clinical decision making in hypotonia and gross motor delay: a case report of type 1 spinal muscular atrophy in an infant.

    PubMed

    Malerba, Kirsten Hawkins; Tecklin, Jan Stephen

    2013-06-01

    Children often are referred for physical therapy with the diagnosis of hypotonia when the definitive cause of hypotonia is unknown. The purpose of this case report is to describe the clinical decision-making process using the Hypothesis-Oriented Algorithm for Clinicians II (HOAC II) for an infant with hypotonia and gross motor delay. The patient was a 5-month-old infant who had been evaluated by a neurologist and then referred for physical therapy by his pediatrician. Physical therapist evaluation results and clinical observations of marked hypotonia, significant gross motor delay, tongue fasciculations, feeding difficulties, and respiratory abnormalities prompted necessary referral to specialists. Recognition of developmental, neurologic, and respiratory abnormalities facilitated clinical decision making for determining the appropriate physical therapy plan of care. During the brief episode of physical therapy care, the patient was referred to a feeding specialist and diagnosed with pharyngeal-phase dysphagia and mild aspiration. Continued global weakness, signs and symptoms of type 1 spinal muscular atrophy (SMA), and concerns about increased work of breathing and respiratory compromise were discussed with the referring physician. After inconclusive laboratory testing for metabolic etiologies of hypotonia, a genetics consult was recommended and confirmed the diagnosis of type 1 SMA at 9 months of age. Physical therapists use clinical decision making to determine whether to treat patients or to refer them to other medical professionals. Accurate and timely referral to appropriate specialists may assist families in obtaining a diagnosis for their child and guide necessary interventions. In the case of type 1 SMA, early diagnosis may affect outcomes and survival rate in this pediatric population.

  12. Expected p-values in light of an ROC curve analysis applied to optimal multiple testing procedures.

    PubMed

    Vexler, Albert; Yu, Jihnhee; Zhao, Yang; Hutson, Alan D; Gurevich, Gregory

    2017-01-01

    Many statistical studies report p-values for inferential purposes. In several scenarios, the stochastic aspect of p-values is neglected, which may contribute to drawing wrong conclusions in real data experiments. The stochastic nature of p-values makes it difficult to use them to examine the performance of given testing procedures or the associations between investigated factors. We turn our focus to the modern statistical literature to address the expected p-value (EPV) as a measure of the performance of decision-making rules. During the course of our study, we prove that the EPV can be considered in the context of receiver operating characteristic (ROC) curve analysis, a well-established biostatistical methodology. The ROC-based framework provides a new and efficient methodology for investigating and constructing statistical decision-making procedures, including: (1) evaluation and visualization of properties of the testing mechanisms, considering, e.g., partial EPVs; (2) development of optimal tests via the minimization of EPVs; (3) creation of novel methods for optimally combining multiple test statistics. We demonstrate that the proposed EPV-based approach allows us to maximize the integrated power of testing algorithms with respect to various significance levels. In an application, we use the proposed method to construct the optimal test and analyze a myocardial infarction disease dataset. We outline the usefulness of the "EPV/ROC" technique for evaluating different decision-making procedures, their construction, and their properties with an eye towards practical applications.
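
    The EPV/ROC connection can be checked numerically for a one-sided z-test: the expected p-value under the alternative equals P(T0 >= T1) for independent null and alternative copies of the test statistic, i.e. one minus the ROC AUC. The effect size and sample counts below are illustrative.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(7)
n = 200_000
t_null = rng.normal(0.0, 1.0, n)   # test statistic under H0
t_alt = rng.normal(1.5, 1.0, n)    # test statistic under H1 (effect size 1.5)

epv_direct = norm.sf(t_alt).mean()          # E[p-value] under the alternative
one_minus_auc = np.mean(t_null >= t_alt)    # P(T0 >= T1) from independent pairs
print(round(epv_direct, 3), round(one_minus_auc, 3))  # the two estimates agree
```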

  13. Solving fuzzy shortest path problem by genetic algorithm

    NASA Astrophysics Data System (ADS)

    Syarif, A.; Muludi, K.; Adrian, R.; Gen, M.

    2018-03-01

    The Shortest Path Problem (SPP) is one of the well-studied fields in the area of Operations Research and Mathematical Optimization, and it has been applied to many engineering and management designs. The objective is usually to determine path(s) in a network with minimum total cost or traveling time. In the past, the cost value for each arc was usually assigned or estimated as a deterministic value. For some real-world applications, however, it is often difficult to determine the cost value properly. One way of handling such uncertainty in decision making is by introducing a fuzzy approach, although this makes it difficult to solve the problem optimally. This paper presents investigations on the application of a Genetic Algorithm (GA) to a new SPP model in which the cost values are represented as Triangular Fuzzy Numbers (TFN). We adopt the concept of ranking fuzzy numbers to determine how good the solutions are. Here, by giving his/her degree value, the decision maker can determine the range of the objective value. This would be very valuable for decision support systems in real-world applications. Simulation experiments were carried out by modifying several test problems with 10-25 nodes. The proposed approach is capable of attaining good solutions with different degrees of optimism for the tested problems.
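
    The fuzzy ingredient described above can be made concrete with triangular fuzzy numbers (a, b, c) for arc costs and a crisp ranking index parameterized by the decision maker's degree of optimism; the lambda-weighted ranking below is one common choice, assumed here rather than taken from the paper.

```python
def tfn_add(x, y):
    """Sum of two triangular fuzzy numbers, componentwise."""
    return tuple(xi + yi for xi, yi in zip(x, y))

def rank_value(tfn, optimism=0.5):
    """Crisp ranking of TFN (a, b, c); optimism in [0, 1] leans toward a or c."""
    a, b, c = tfn
    return (1 - optimism) * (a + b) / 2 + optimism * (b + c) / 2

# fuzzy total costs of two candidate paths, each the sum of two fuzzy arcs
path1 = tfn_add((0, 2, 5), (1, 4, 6))   # -> (1, 6, 11): wide, uncertain
path2 = tfn_add((2, 2, 3), (2, 3, 4))   # -> (4, 5, 7): narrow, predictable
for lam in (0.0, 0.5, 1.0):
    best = min((path1, path2), key=lambda p: rank_value(p, lam))
    print(f"optimism={lam}: choose path with fuzzy cost {best}")
# the preferred path flips with the degree of optimism, as in the paper's
# decision-maker-dependent objective range
```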

  14. Measurement-based care for refractory depression: a clinical decision support model for clinical research and practice.

    PubMed

    Trivedi, Madhukar H; Daly, Ella J

    2007-05-01

    Despite years of antidepressant drug development and patient and provider education, suboptimal medication dosing and duration of exposure resulting in incomplete remission of symptoms remains the norm in the treatment of depression. Additionally, since no one treatment is effective for all patients, optimal implementation focusing on the measurement of symptoms, side effects, and function is essential to determine effective sequential treatment approaches. There is a need for a paradigm shift in how clinical decision making is incorporated into clinical practice and for a move away from the trial-and-error approach that currently determines the "next best" treatment. This paper describes how our experience with the Texas Medication Algorithm Project (TMAP) and the Sequenced Treatment Alternatives to Relieve Depression (STAR*D) trial has confirmed the need for easy-to-use clinical support systems to ensure fidelity to guidelines. To further enhance guideline fidelity, we have developed an electronic decision support system that provides critical feedback and guidance at the point of patient care. We believe that a measurement-based care (MBC) approach is essential to any decision support system, allowing physicians to individualize and adapt decisions about patient care based on symptom progress, tolerability of medication, and dose optimization. We also believe that successful integration of sequential algorithms with MBC into real-world clinics will facilitate change that will endure and improve patient outcomes. Although we use major depression to illustrate our approach, the issues addressed are applicable to other chronic psychiatric conditions including comorbid depression and substance use disorder as well as other medical illnesses.

  15. Measurement-Based Care for Refractory Depression: A Clinical Decision Support Model for Clinical Research and Practice

    PubMed Central

    Trivedi, Madhukar H.; Daly, Ella J.

    2009-01-01

    Despite years of antidepressant drug development and patient and provider education, suboptimal medication dosing and duration of exposure resulting in incomplete remission of symptoms remains the norm in the treatment of depression. Additionally, since no one treatment is effective for all patients, optimal implementation focusing on the measurement of symptoms, side effects, and function is essential to determine effective sequential treatment approaches. There is a need for a paradigm shift in how clinical decision making is incorporated into clinical practice and for a move away from the trial-and-error approach that currently determines the “next best” treatment. This paper describes how our experience with the Texas Medication Algorithm Project (TMAP) and the Sequenced Treatment Alternatives to Relieve Depression (STAR*D) trial has confirmed the need for easy-to-use clinical support systems to ensure fidelity to guidelines. To further enhance guideline fidelity, we have developed an electronic decision support system that provides critical feedback and guidance at the point of patient care. We believe that a measurement-based care (MBC) approach is essential to any decision support system, allowing physicians to individualize and adapt decisions about patient care based on symptom progress, tolerability of medication, and dose optimization. We also believe that successful integration of sequential algorithms with MBC into real-world clinics will facilitate change that will endure and improve patient outcomes. Although we use major depression to illustrate our approach, the issues addressed are applicable to other chronic psychiatric conditions including comorbid depression and substance use disorder as well as other medical illnesses. PMID:17320312

  16. Results of analysis of archive MSG data in the context of MCS prediction system development for economic decisions assistance - case studies

    NASA Astrophysics Data System (ADS)

    Szafranek, K.; Jakubiak, B.; Lech, R.; Tomczuk, M.

    2012-04-01

    PROZA (Operational decision-making based on atmospheric conditions) is a project co-financed by the European Union through the European Regional Development Fund. One of its tasks is to develop an operational forecast system intended to support different branches of the economy, such as forestry or fruit farming, by reducing the risk of economic decisions that depend on weather conditions. Within the framework of this study, a system for predicting sudden convective phenomena (storms or tornadoes) is being built. The authors' main purpose is to predict MCSs (Mesoscale Convective Systems) based on MSG (Meteosat Second Generation) real-time data. Several tests have been performed so far. Meteosat satellite images in selected spectral channels, collected for the Central Europe region for May and August 2010, were used to detect and track cloud systems related to MCSs. In the proposed tracking method, cloud objects are first defined using a temperature threshold, and the selected cells are then tracked using the principle of overlapping positions on consecutive images. The main benefit of using temperature thresholding to define cells is its simplicity. During the tracking process, the algorithm links the cells of the image at time t to those of the following image at time t+dt that correspond to the same cloud system (the Morel-Senesi algorithm). Automated detection and elimination of some instabilities present in the tracking algorithm were developed. The poster presents an analysis of exemplary MCSs in the context of the development of a near-real-time prediction system.
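
    A compact sketch of the threshold-and-overlap tracking described above: cells are connected components below a brightness-temperature threshold, and cells in consecutive images are linked when they overlap. The threshold and the synthetic fields are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import label

THRESH_K = 221.0   # assumed cold-cloud-top threshold in kelvin

def detect_cells(tb_field):
    mask = tb_field < THRESH_K
    labels, n = label(mask)          # connected components = convective cells
    return labels, n

def link_cells(labels_t, labels_t1):
    """Map each cell at time t to the most-overlapping cell at time t+dt."""
    links = {}
    for cell in range(1, labels_t.max() + 1):
        overlap = labels_t1[labels_t == cell]
        overlap = overlap[overlap > 0]
        if overlap.size:             # same cloud system if the cells overlap
            links[cell] = int(np.bincount(overlap).argmax())
    return links

rng = np.random.default_rng(3)
tb0 = 230 + 15 * rng.standard_normal((50, 50))   # synthetic brightness temps
tb1 = np.roll(tb0, shift=2, axis=1)              # crude advection between frames
l0, _ = detect_cells(tb0)
l1, _ = detect_cells(tb1)
print(link_cells(l0, l1))
```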

  17. Study on bi-directional pedestrian movement using ant algorithms

    NASA Astrophysics Data System (ADS)

    Sibel, Gokce; Ozhan, Kayacan

    2016-01-01

    A cellular automata model is proposed to simulate bi-directional pedestrian flow. Pedestrian movement is investigated using ant algorithms. Ants communicate with each other by dropping a chemical, called a pheromone, on the substrate while crawling forward. Similarly, it is assumed that oppositely moving pedestrians drop ‘visual pheromones’ on their way, and that these visual pheromones may cause attractive or repulsive interactions. This phenomenon is introduced into the modelling of pedestrians’ walking preferences. In this way, the decision-making process of pedestrians is based on ‘the instinct of following’. The velocity-density and flux-density relationships are analyzed at various densities for different evaporation rates of the visual pheromones. Lane formation and phase transition are observed for certain evaporation rates of the visual pheromones.
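
    The pheromone bookkeeping in such models reduces to deposition plus exponential evaporation, with movement probabilities biased by the local field. The one-dimensional sketch below uses invented rates and a simple follow-the-trail rule, not the paper's two-dimensional CA update.

```python
import numpy as np

rng = np.random.default_rng(5)
L, steps, evaporation, deposit = 30, 200, 0.05, 1.0
field = np.zeros(L)                # 1-D corridor of pheromone concentrations
walkers = rng.integers(0, L, 20)   # walker positions on the ring

for _ in range(steps):
    field *= (1.0 - evaporation)   # evaporation of the visual pheromones
    for i, x in enumerate(walkers):
        field[x] += deposit        # drop a pheromone while moving
        left, right = field[(x - 1) % L], field[(x + 1) % L]
        p_right = (1.0 + right) / (2.0 + left + right)  # follow-the-trail bias
        walkers[i] = (x + 1) % L if rng.random() < p_right else (x - 1) % L

print("pheromone peaks at cells:", np.argsort(field)[-3:])
```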

  18. Spectrum sensing based on cumulative power spectral density

    NASA Astrophysics Data System (ADS)

    Nasser, A.; Mansour, A.; Yao, K. C.; Abdallah, H.; Charara, H.

    2017-12-01

    This paper presents new spectrum sensing algorithms based on the cumulative power spectral density (CPSD). The proposed detectors examine the CPSD of the received signal to make a decision on the absence/presence of the primary user (PU) signal. These detectors require the whiteness of the noise in the band of interest. The false alarm and detection probabilities are derived analytically and simulated under Gaussian and Rayleigh fading channels. Our proposed detectors perform better than the energy detector (ED) or the cyclostationary detector (CSD). Moreover, in the presence of noise uncertainty (NU), they are shown to be more robust than the ED, with less performance loss. To remove the dependence on the NU, we modify our algorithms to be independent of the noise variance.
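
    A bare-bones sketch consistent with the description: estimate the PSD, form its normalized cumulative sum, and flag a primary user when the CPSD departs from the straight line that white noise alone would trace. The statistic and threshold are assumptions, not the paper's detectors.

```python
import numpy as np

def cpsd_statistic(x):
    psd = np.abs(np.fft.rfft(x)) ** 2          # periodogram estimate
    cpsd = np.cumsum(psd) / psd.sum()          # normalized cumulative PSD
    uniform = np.linspace(0, 1, len(cpsd))     # white-noise reference line
    return np.max(np.abs(cpsd - uniform))      # KS-like deviation statistic

rng = np.random.default_rng(2)
n = 4096
noise = rng.standard_normal(n)
t = np.arange(n)
pu_signal = noise + 0.5 * np.cos(2 * np.pi * 0.12 * t)   # narrowband PU

threshold = 0.05   # would normally be set from a target false-alarm probability
for name, x in (("noise only", noise), ("PU present", pu_signal)):
    print(name, cpsd_statistic(x) > threshold, round(cpsd_statistic(x), 3))
```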

  19. Microeconomics-based resource allocation in overlay networks by using non-strategic behavior modeling

    NASA Astrophysics Data System (ADS)

    Analoui, Morteza; Rezvani, Mohammad Hossein

    2011-01-01

    Behavior modeling has recently been investigated for designing self-organizing mechanisms in the context of communication networks, in order to exploit the natural selfishness of the users with the goal of maximizing the overall utility. In strategic behavior modeling, the users of the network are assumed to be game players who seek to maximize their utility while taking into account the decisions that the other players might make. The essential difference between that line of research and this work is that here non-strategic decisions are incorporated in order to design the mechanism for the overlay network. In this solution concept, the decisions that a peer makes do not affect the actions of the other peers at all. The consumer-firm theory developed in microeconomics is the model of non-strategic behavior that we have adopted in our research. Based on it, we present distributed algorithms for peers' "joining" and "leaving" operations. We model the overlay network as a competitive economy in which the content provided by an origin server can be viewed as a commodity, and the origin server and the peers who multicast the content to their downstream neighbors are considered the firms. On the other hand, due to the dual role of the peers in the overlay network, they can be considered consumers as well. On joining the overlay economy, each peer is provided with an income and tries to obtain the service regardless of the behavior of the other peers. We have designed the scalable algorithms in such a way that the existence of an equilibrium price (known as the Walrasian equilibrium price) is guaranteed.
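
    The competitive-economy intuition can be illustrated with a tiny tatonnement loop: a price for the content is adjusted until peer demand meets supply from the origin server and relaying peers. The demand and supply curves are invented placeholders, not the paper's algorithms.

```python
def demand(price, incomes=(10.0, 8.0, 6.0)):
    """Each peer spends its income on content, so demand falls with price."""
    return sum(income / price for income in incomes)

def supply(price, capacity=12.0):
    """Origin server plus relaying peers offer more units at higher prices."""
    return capacity * price / (1.0 + price)

price, step = 1.0, 0.05
for _ in range(2000):                         # tatonnement price adjustment
    excess = demand(price) - supply(price)
    price = max(1e-6, price + step * excess)  # raise price on excess demand
print(f"approximate Walrasian equilibrium price: {price:.3f}")
```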

  20. Personalized treatment planning with a model of radiation therapy outcomes for use in multiobjective optimization of IMRT plans for prostate cancer.

    PubMed

    Smith, Wade P; Kim, Minsun; Holdsworth, Clay; Liao, Jay; Phillips, Mark H

    2016-03-11

    To build a new treatment planning approach that extends beyond radiation transport and IMRT optimization by modeling the radiation therapy process and prognostic indicators for more outcome-focused decision making. An in-house treatment planning system was modified to include multiobjective inverse planning, a probabilistic outcome model, and a multi-attribute decision aid. A genetic algorithm generated a set of plans embodying trade-offs between the separate objectives. An influence diagram network modeled the radiation therapy process of prostate cancer using expert opinion, results of clinical trials, and published research. A Markov model calculated a quality adjusted life expectancy (QALE), which was the endpoint for ranking plans. The Multiobjective Evolutionary Algorithm (MOEA) was designed to produce an approximation of the Pareto Front representing optimal tradeoffs for IMRT plans. Prognostic information from the dosimetrics of the plans, and from patient-specific clinical variables were combined by the influence diagram. QALEs were calculated for each plan for each set of patient characteristics. Sensitivity analyses were conducted to explore changes in outcomes for variations in patient characteristics and dosimetric variables. The model calculated life expectancies that were in agreement with an independent clinical study. The radiation therapy model proposed has integrated a number of different physical, biological and clinical models into a more comprehensive model. It illustrates a number of the critical aspects of treatment planning that can be improved and represents a more detailed description of the therapy process. A Markov model was implemented to provide a stronger connection between dosimetric variables and clinical outcomes and could provide a practical, quantitative method for making difficult clinical decisions.
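
    As an illustration of the QALE endpoint used to rank plans, the sketch below runs a three-state Markov cohort model (disease-free, recurrence, dead) over monthly cycles; the transition probabilities and quality weights are invented placeholders, not the paper's calibrated values.

```python
import numpy as np

P = np.array([                  # assumed monthly transition matrix
    [0.995, 0.004, 0.001],      # disease-free
    [0.000, 0.980, 0.020],      # recurrence
    [0.000, 0.000, 1.000],      # dead (absorbing)
])
utility = np.array([1.0, 0.6, 0.0])   # assumed quality weight per state

state = np.array([1.0, 0.0, 0.0])     # cohort starts disease-free
qale_months = 0.0
for _ in range(12 * 40):              # 40-year horizon, monthly cycles
    qale_months += state @ utility    # accumulate quality-adjusted time
    state = state @ P                 # advance the cohort one month
print(f"QALE ~ {qale_months / 12:.1f} quality-adjusted life-years")
```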
