An Analysis of Graduate Nursing Students' Innovation-Decision Process
Kacynski, Kathryn A.; Roy, Katrina D.
1984-01-01
This study's purpose was to examine the innovation-decision process used by graduate nursing students when deciding to use computer applications. Graduate nursing students enrolled in a mandatory research class were surveyed, before and after their use of a mainframe computer for beginning data analysis, about their general attitudes towards computers, individual characteristics such as “cosmopoliteness”, and their desire to learn more about a computer application. It was expected that an experimental intervention (a videotaped demonstration of interactive video instruction of cardiopulmonary resuscitation (CPR)), previous computer experience, and the subject's “cosmopoliteness” would influence attitudes towards computers and the desire to learn more about a computer application.
ERIC Educational Resources Information Center
Djang, Philipp A.
1993-01-01
Describes a Multiple Criteria Decision Analysis Approach for the selection of personal computers that combines the capabilities of the Analytic Hierarchy Process and Integer Goal Programming. An example of how decision makers can use this approach to determine what kind of personal computers, and how many of each type, to purchase is given. (nine references)
NASA Astrophysics Data System (ADS)
Roy, Jean; Breton, Richard; Paradis, Stephane
2001-08-01
Situation Awareness (SAW) is essential for commanders to conduct decision-making (DM) activities. Situation Analysis (SA) is defined as a process, the examination of a situation, its elements, and their relations, to provide and maintain a product, i.e., a state of SAW for the decision maker. Operational trends in warfare put the situation analysis process under pressure. This emphasizes the need for a real-time computer-based Situation Analysis Support System (SASS) to aid commanders in achieving the appropriate situation awareness, thereby supporting their response to actual or anticipated threats. Data fusion is clearly a key enabler for SA and a SASS. Since data fusion is used for SA in support of dynamic human decision-making, the exploration of the SA concepts and the design of data fusion techniques must take into account human factor aspects in order to ensure a cognitive fit of the fusion system with the decision-maker. Indeed, the tight integration of the human element with the SA technology is essential. Regarding these issues, this paper provides a description of CODSI (Command Decision Support Interface), an operational-like human-machine interface prototype for investigations in computer-based SA and command decision support. With CODSI, one objective was to apply recent developments in SA theory and information display technology to the problem of enhancing SAW quality. It thus provides a capability to adequately convey tactical information to command decision makers. It also supports the study of human-computer interactions for SA, and methodologies for SAW measurement.
Effectiveness of a Case-Based Computer Program on Students' Ethical Decision Making.
Park, Eun-Jun; Park, Mihyun
2015-11-01
The aim of this study was to test the effectiveness of a case-based computer program, using an integrative ethical decision-making model, on the ethical decision-making competency of nursing students in South Korea. This study used a pre- and posttest comparison design. Students in the intervention group used a computer program for case analysis assignments, whereas students in the standard group used a traditional paper assignment for case analysis. The findings showed that using the case-based computer program as a complementary tool for the ethics courses offered at the university enhanced students' ethical preparedness and satisfaction with the course. On the basis of the findings, it is recommended that nurse educators use a case-based computer program as a complementary self-study tool in ethics courses to supplement student learning without an increase in course hours, particularly in terms of analyzing ethics cases with dilemma scenarios and exercising ethical decision making. Copyright 2015, SLACK Incorporated.
On-line confidence monitoring during decision making.
Dotan, Dror; Meyniel, Florent; Dehaene, Stanislas
2018-02-01
Humans can readily assess their degree of confidence in their decisions. Two models of confidence computation have been proposed: post hoc computation using post-decision variables and heuristics, versus online computation using continuous assessment of evidence throughout the decision-making process. Here, we arbitrate between these theories by continuously monitoring finger movements during a manual sequential decision-making task. Analysis of finger kinematics indicated that subjects kept separate online records of evidence and confidence: finger deviation continuously reflected the ongoing accumulation of evidence, whereas finger speed continuously reflected the momentary degree of confidence. Furthermore, end-of-trial finger speed predicted the post-decisional subjective confidence rating. These data indicate that confidence is computed on-line, throughout the decision process. Speed-confidence correlations were previously interpreted as a post-decision heuristic, whereby slow decisions decrease subjective confidence, but our results suggest an adaptive mechanism that involves the opposite causality: by slowing down when unconfident, participants gain time to improve their decisions. Copyright © 2017 Elsevier B.V. All rights reserved.
Naturalistic Decision Making: Implications for Design
1993-04-01
Report documentation fragments: keywords — Cognitive Task Analysis, Decision Making, Design Engineer, Design System, Human-Computer Interface, System Development; 182 pages. Abstract fragments: "…people use to select a course of action. The SOAR explains how stress affects the decision making of both individuals and teams." "…procedures for Cognitive Task Analysis, contrasting the strengths and weaknesses of each, and showing how a Cognitive Task Analysis…"
Fast Image Texture Classification Using Decision Trees
NASA Technical Reports Server (NTRS)
Thompson, David R.
2011-01-01
Texture analysis would permit improved autonomous, onboard science data interpretation for adaptive navigation, sampling, and downlink decisions. These analyses would assist with terrain analysis and instrument placement in both macroscopic and microscopic image data products. Unfortunately, most state-of-the-art texture analysis demands computationally expensive convolutions of filters involving many floating-point operations. This makes them infeasible for radiation-hardened computers and spaceflight hardware. A new method approximates traditional texture classification of each image pixel with a fast decision-tree classifier. The classifier uses image features derived from simple filtering operations involving integer arithmetic. The texture analysis method is therefore amenable to implementation on FPGA (field-programmable gate array) hardware. Image features based on the "integral image" transform produce descriptive and efficient texture descriptors. Training the decision tree on a set of training data yields a classification scheme that produces reasonable approximations of optimal "texton" analysis at a fraction of the computational cost. A decision-tree learning algorithm employing the traditional k-means criterion of inter-cluster variance is used to learn tree structure from training data. The result is an efficient and accurate summary of surface morphology in images. This work is an evolutionary advance that unites several previous algorithms (k-means clustering, integral images, decision trees) and applies them to a new problem domain (morphology analysis for autonomous science during remote exploration). Advantages include order-of-magnitude improvements in runtime, feasibility for FPGA hardware, and significant improvements in texture classification accuracy.
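To make the integer-arithmetic approach concrete, the sketch below shows an integral-image box feature feeding a single decision-tree node. This is a minimal illustration of the general technique, not the authors' flight implementation; the image, threshold, and labels are invented.

```python
import numpy as np

def integral_image(img):
    """ii[r, c] holds the sum of img[:r, :c], so any box sum costs O(1)."""
    ii = np.zeros((img.shape[0] + 1, img.shape[1] + 1), dtype=np.int64)
    ii[1:, 1:] = np.cumsum(np.cumsum(img, axis=0), axis=1)
    return ii

def box_sum(ii, r0, c0, r1, c1):
    """Sum of pixels in the half-open box [r0, r1) x [c0, c1)."""
    return ii[r1, c1] - ii[r0, c1] - ii[r1, c0] + ii[r0, c0]

# A box-difference texture feature computed with integer arithmetic only:
img = np.random.randint(0, 256, size=(64, 64))
ii = integral_image(img)
feature = box_sum(ii, 0, 0, 8, 8) - box_sum(ii, 0, 8, 8, 16)

# One node of a decision tree: compare the integer feature against a
# threshold (the threshold and labels are placeholders, not trained values).
label = "textured" if feature > 0 else "smooth"
```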
A novel computer based expert decision making model for prostate cancer disease management.
Richman, Martin B; Forman, Ernest H; Bayazit, Yildirim; Einstein, Douglas B; Resnick, Martin I; Stovsky, Mark D
2005-12-01
We propose a strategic, computer based, prostate cancer decision making model based on the analytic hierarchy process. We developed a model that improves physician-patient joint decision making and enhances the treatment selection process by making this critical decision rational and evidence based. Two groups (patient and physician-expert) completed a clinical study comparing an initial disease management choice with the highest ranked option generated by the computer model. Participants made pairwise comparisons to derive priorities for the objectives and subobjectives related to the disease management decision. The weighted comparisons were then applied to treatment options to yield prioritized rank lists that reflect the likelihood that a given alternative will achieve the participant treatment goal. Aggregate data were evaluated by inconsistency ratio analysis and sensitivity analysis, which assessed the influence of individual objectives and subobjectives on the final rank list of treatment options. Inconsistency ratios less than 0.05 were reliably generated, indicating that judgments made within the model were mathematically rational. The aggregate prioritized list of treatment options was tabulated for the patient and physician groups with similar outcomes for the 2 groups. Analysis of the major defining objectives in the treatment selection decision demonstrated the same rank order for the patient and physician groups with cure, survival and quality of life being more important than controlling cancer, preventing major complications of treatment, preventing blood transfusion complications and limiting treatment cost. Analysis of subobjectives, including quality of life and sexual dysfunction, produced similar priority rankings for the patient and physician groups. Concordance between initial treatment choice and the highest weighted model option differed between the groups with the patient group having 59% concordance and the physician group having only 42% concordance. This study successfully validated the usefulness of a computer based prostate cancer management decision making model to produce individualized, rational, clinically appropriate disease management decisions without physician bias.
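The priority derivation and inconsistency check described here follow standard analytic hierarchy process computations, which can be sketched as follows; the comparison matrix below is an invented example, not data from the study.

```python
import numpy as np

# Saaty's random consistency index, indexed by matrix size.
RI = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}

def ahp_priorities(A):
    """Priority vector (principal eigenvector) and consistency ratio of a
    reciprocal pairwise-comparison matrix A."""
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                            # normalized priorities
    n = A.shape[0]
    ci = (eigvals[k].real - n) / (n - 1)    # consistency index
    return w, ci / RI[n]                    # (priorities, consistency ratio)

# Invented judgments for three objectives, e.g. cure vs. quality of life
# vs. treatment cost; A[0, 1] = 3 means the first is moderately more important.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
w, cr = ahp_priorities(A)
# w ~ [0.65, 0.23, 0.12]; cr falls well below the 0.05 bound the study reports.
```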
An Investment Behavior Analysis Using a Brain-Computer Interface
NASA Astrophysics Data System (ADS)
Suzuki, Kyoko; Kinoshita, Kanta; Miyagawa, Kazuhiro; Shiomi, Shinichi; Misawa, Tadanobu; Shimokawa, Tetsuya
In this paper, we construct a new Brain-Computer Interface (BCI) for the purpose of analyzing human investment decision making. The BCI is made up of three functional parts which take the roles of measuring brain information, determining market prices in an artificial market, and specifying the investment decision model, respectively. When subjects make decisions, their brain information is conveyed to the part specifying the investment decision model through the part measuring brain information, whereas their investment orders are sent to the artificial market to form market prices. Both a support vector machine and a 3-layer perceptron are used to assess the investment decision model. In order to evaluate our BCI, we conduct an experiment in which subjects and a computer trader agent trade shares of stock in the artificial market, and test how well the computer trader agent can forecast market price formation and investment decisions from the subjects' brain information. The result of the experiment shows that the brain information improves the accuracy of forecasts, so the computer trader agent can supply market liquidity to stabilize market volatility without incurring losses.
Studying Parental Decision Making with Micro-Computers: The CPSI Technique.
ERIC Educational Resources Information Center
Holden, George W.
A technique for studying how parents think, make decisions, and solve childrearing problems, Computer-Presented Social Interactions (CPSI), is described. Two studies involving CPSI are presented. The first study concerns a common parental cognitive task: causal analysis of an undesired behavior. The task was to diagnose the cause of non-contingent…
A new decision sciences for complex systems.
Lempert, Robert J
2002-05-14
Models of complex systems can capture much useful information but can be difficult to apply to real-world decision-making because the type of information they contain is often inconsistent with that required for traditional decision analysis. New approaches, which use inductive reasoning over large ensembles of computational experiments, now make possible systematic comparison of alternative policy options using models of complex systems. This article describes Computer-Assisted Reasoning, an approach to decision-making under conditions of deep uncertainty that is ideally suited to applying complex systems to policy analysis. The article demonstrates the approach on the policy problem of global climate change, with a particular focus on the role of technology policies in a robust, adaptive strategy for greenhouse gas abatement.
Energy-Aware Computation Offloading of IoT Sensors in Cloudlet-Based Mobile Edge Computing.
Ma, Xiao; Lin, Chuang; Zhang, Han; Liu, Jianwei
2018-06-15
Mobile edge computing is proposed as a promising computing paradigm to relieve the excessive burden of data centers and mobile networks, which is induced by the rapid growth of Internet of Things (IoT). This work introduces the cloud-assisted multi-cloudlet framework to provision scalable services in cloudlet-based mobile edge computing. Due to the constrained computation resources of cloudlets and limited communication resources of wireless access points (APs), IoT sensors with identical computation offloading decisions interact with each other. To optimize the processing delay and energy consumption of computation tasks, theoretic analysis of the computation offloading decision problem of IoT sensors is presented in this paper. In more detail, the computation offloading decision problem of IoT sensors is formulated as a computation offloading game and the condition of Nash equilibrium is derived by introducing the tool of a potential game. By exploiting the finite improvement property of the game, the Computation Offloading Decision (COD) algorithm is designed to provide decentralized computation offloading strategies for IoT sensors. Simulation results demonstrate that the COD algorithm can significantly reduce the system cost compared with the random-selection algorithm and the cloud-first algorithm. Furthermore, the COD algorithm can scale well with increasing IoT sensors.
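A minimal sketch of the best-response dynamics that the finite improvement property justifies is given below. The congestion cost model, parameters, and function names are illustrative stand-ins, not the paper's exact formulation.

```python
import random

def cost(i, d, decisions, local_cost, offload_base):
    """Cost of sensor i choosing decision d (0 = compute locally,
    1..K = offload to cloudlet d). Offloading cost grows with the number
    of sensors sharing a cloudlet -- a stand-in congestion model."""
    if d == 0:
        return local_cost[i]
    load = 1 + sum(1 for j, dj in enumerate(decisions) if j != i and dj == d)
    return offload_base[i] * load

def cod(n_sensors, n_cloudlets, local_cost, offload_base, max_rounds=1000):
    """Best-response dynamics: each sensor in turn switches to its cheapest
    decision. In a potential game, the finite improvement property
    guarantees termination at a Nash equilibrium."""
    decisions = [0] * n_sensors
    for _ in range(max_rounds):
        improved = False
        for i in random.sample(range(n_sensors), n_sensors):
            best = min(range(n_cloudlets + 1),
                       key=lambda d: cost(i, d, decisions, local_cost, offload_base))
            if best != decisions[i]:
                decisions[i], improved = best, True
        if not improved:
            return decisions            # no sensor can improve: equilibrium
    return decisions

eq = cod(6, 2, local_cost=[5, 5, 6, 7, 4, 6], offload_base=[2, 3, 2, 2, 3, 2])
```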
Decision Analysis Using Spreadsheets.
ERIC Educational Resources Information Center
Sounderpandian, Jayavel
1989-01-01
Discussion of decision analysis and its importance in a business curriculum focuses on the use of spreadsheets instead of commercial software packages for computer assisted instruction. A hypothetical example is given of a company drilling for oil, and suggestions are provided for classroom exercises using spreadsheets. (seven references) (LRW)
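The spreadsheet version of such an exercise reduces to an expected monetary value (EMV) computation, sketched here with invented numbers:

```python
# Expected monetary value of drilling vs. not drilling -- the computation a
# single spreadsheet row would hold. All figures are hypothetical.
p_oil = 0.25                 # assumed probability the well strikes oil
payoff_oil = 1_500_000       # net payoff if oil is found
payoff_dry = -400_000        # drilling cost written off on a dry hole

emv_drill = p_oil * payoff_oil + (1 - p_oil) * payoff_dry
emv_skip = 0.0               # do nothing, gain nothing

decision = "drill" if emv_drill > emv_skip else "don't drill"
print(f"EMV(drill) = {emv_drill:,.0f} -> {decision}")  # EMV(drill) = 75,000 -> drill
```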
Signal Detection Analysis of Computer Enhanced Group Decision Making Strategies
2007-11-01
Use of handheld computers in clinical practice: a systematic review.
Mickan, Sharon; Atherton, Helen; Roberts, Nia Wyn; Heneghan, Carl; Tilson, Julie K
2014-07-06
Many healthcare professionals use smartphones and tablets to inform patient care. Contemporary research suggests that handheld computers may support aspects of clinical diagnosis and management. This systematic review was designed to synthesise high quality evidence to answer the question: Does healthcare professionals' use of handheld computers improve their access to information and support clinical decision making at the point of care? A detailed search was conducted using Cochrane, MEDLINE, EMBASE, PsycINFO, Science and Social Science Citation Indices since 2001. Interventions promoting healthcare professionals seeking information or making clinical decisions using handheld computers were included. Classroom learning and the use of laptop computers were excluded. Two authors independently selected studies, assessed quality using the Cochrane Risk of Bias tool and extracted data. High levels of data heterogeneity negated statistical synthesis. Instead, evidence for effectiveness was summarised narratively, according to each study's aim for assessing the impact of handheld computer use. We included seven randomised trials investigating medical or nursing staffs' use of Personal Digital Assistants. Effectiveness was demonstrated across three distinct functions that emerged from the data: accessing information for clinical knowledge, adherence to guidelines and diagnostic decision making. When healthcare professionals used handheld computers to access clinical information, their knowledge improved significantly more than peers who used paper resources. When clinical guideline recommendations were presented on handheld computers, clinicians made significantly safer prescribing decisions and adhered more closely to recommendations than peers using paper resources. Finally, healthcare professionals made significantly more appropriate diagnostic decisions using clinical decision making tools on handheld computers compared to colleagues who did not have access to these tools. For these clinical decisions, the numbers needed to test/screen were all less than 11. Healthcare professionals' use of handheld computers may improve their information seeking, adherence to guidelines and clinical decision making. Handheld computers can provide real time access to and analysis of clinical information. The integration of clinical decision support systems within handheld computers offers clinicians the highest level of synthesised evidence at the point of care. Future research is needed to replicate these early results and to identify beneficial clinical outcomes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jiang, Wei; Reddy, T. A.; Gurian, Patrick
2007-01-31
A companion paper to Jiang and Reddy that presents a general and computationally efficient methodology for dynamic scheduling and optimal control of complex primary HVAC&R plants using a deterministic engineering optimization approach.
NASA Technical Reports Server (NTRS)
Torres-Pomales, Wilfredo
2014-01-01
This report presents an example of the application of multi-criteria decision analysis to the selection of an architecture for a safety-critical distributed computer system. The design problem includes constraints on minimum system availability and integrity, and the decision is based on the optimal balance of power, weight and cost. The analysis process includes the generation of alternative architectures, evaluation of individual decision criteria, and the selection of an alternative based on overall value. In the example presented here, iterative application of the quantitative evaluation process made it possible to deliberately generate an alternative architecture that is superior to all others regardless of the relative importance of cost.
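The report's exact value model is not reproduced here; the following weighted-sum sketch, with invented scores and weights, illustrates the kind of overall-value comparison described, including a dominant alternative that wins under any weighting.

```python
# Normalized criterion scores in [0, 1] (1 = best) for three candidate
# architectures assumed to already satisfy the availability/integrity
# constraints; all numbers are illustrative, not from the report.
candidates = {
    "arch_A": {"power": 0.8, "weight": 0.6, "cost": 0.4},
    "arch_B": {"power": 0.5, "weight": 0.8, "cost": 0.7},
    "arch_C": {"power": 0.9, "weight": 0.9, "cost": 0.8},
}
weights = {"power": 0.5, "weight": 0.3, "cost": 0.2}  # relative importance

def overall_value(scores):
    return sum(weights[c] * scores[c] for c in weights)

best = max(candidates, key=lambda a: overall_value(candidates[a]))
# arch_C scores highest on every criterion, so it wins under any weighting --
# the kind of dominant alternative the iterative process above produced.
```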
2015-10-01
Abstract fragments: "…capability to meet the task to the standard under the condition, nothing more or less, else the funding is wasted…" "…bin to segregate gaps qualitatively before the gap value model determined preference among gaps within the bins. Computation of a gap's…" "…for communication, interpretation, or processing by humans or by automatic means (as it pertains to modeling and simulation). Delphi Method -- a…"
Modeling Human-Computer Decision Making with Covariance Structure Analysis.
ERIC Educational Resources Information Center
Coovert, Michael D.; And Others
Arguing that sufficient theory exists about the interplay between human information processing, computer systems, and the demands of various tasks to construct useful theories of human-computer interaction, this study presents a structural model of human-computer interaction and reports the results of various statistical analyses of this model.…
A Multi-Objective Decision-Making Model for Resources Allocation in Humanitarian Relief
2007-03-01
Abstract and reference fragments: Applied Mathematics and Computation 163, 2005, p. 756; Malczewski, J., GIS and Multicriteria Decision Analysis, John Wiley and Sons, New York; "…used when interpreting the results of the analysis (Raimo et al. 2002)…" "(7) Sensitivity analysis: sensitivity analysis in a DA process answers…" "Budget Scenario Analysis: the MILP is solved (using LINDO 6.1) for high, medium and low budget scenarios in both damage degree levels. Tables 17 and…"
A Computational Model of Reasoning from the Clinical Literature
Rennels, Glenn D.
1986-01-01
This paper explores the premise that a formalized representation of empirical studies can play a central role in computer-based decision support. The specific motivations underlying this research include the following propositions: 1. Reasoning from experimental evidence contained in the clinical literature is central to the decisions physicians make in patient care. 2. A computational model, based upon a declarative representation for published reports of clinical studies, can drive a computer program that selectively tailors knowledge of the clinical literature as it is applied to a particular case. 3. The development of such a computational model is an important first step toward filling a void in computer-based decision support systems. Furthermore, the model may help us better understand the general principles of reasoning from experimental evidence both in medicine and other domains. Roundsman is a developmental computer system which draws upon structured representations of the clinical literature in order to critique plans for the management of primary breast cancer. Roundsman is able to produce patient-specific analyses of breast cancer management options based on the 24 clinical studies currently encoded in its knowledge base. The Roundsman system is a first step in exploring how the computer can help to bring a critical analysis of the relevant literature to the physician, structured around a particular patient and treatment decision.
Mass Conflagration: An Analysis and Adaptation of the Shipboard Damage Control Organization
1991-03-01
Abstract fragments: "…the span of control narrows, as each supervisor is able to better monitor the actions and environment of his subordinates. (6) Communication and…" "…computed decision is reached by the decision makers, often based on a prior formal doctrine or methodology. [Ref. 4:p. 364] While no decision process…"
Natural Resource Information System, design analysis
NASA Technical Reports Server (NTRS)
1972-01-01
The computer-based system stores, processes, and displays map data relating to natural resources. The system was designed on the basis of requirements established in a user survey and an analysis of decision flow. The design analysis effort is described, and the rationale behind major design decisions, including map processing, cell vs. polygon, choice of classification systems, mapping accuracy, system hardware, and software language is summarized.
Enabling drug discovery project decisions with integrated computational chemistry and informatics
NASA Astrophysics Data System (ADS)
Tsui, Vickie; Ortwine, Daniel F.; Blaney, Jeffrey M.
2017-03-01
Computational chemistry/informatics scientists and software engineers in Genentech Small Molecule Drug Discovery collaborate with experimental scientists in a therapeutic project-centric environment. Our mission is to enable and improve pre-clinical drug discovery design and decisions. Our goal is to deliver timely data, analysis, and modeling to our therapeutic project teams using best-in-class software tools. We describe our strategy, the organization of our group, and our approaches to reach this goal. We conclude with a summary of the interdisciplinary skills required for computational scientists and recommendations for their training.
The Use of Geoprocessing in Educational Research and Decision Support.
ERIC Educational Resources Information Center
Sexton, Porter
1982-01-01
Discusses geoprocessing, a computer mapping technique used by the Portland (Oregon) School District in which geographic analysis and data processing are combined. Several applications for administrative decision-making are discussed, including bus routing and redistricting. (JJD)
A stochastic multicriteria model for evidence-based decision making in drug benefit-risk analysis.
Tervonen, Tommi; van Valkenhoef, Gert; Buskens, Erik; Hillege, Hans L; Postmus, Douwe
2011-05-30
Drug benefit-risk (BR) analysis is based on firm clinical evidence regarding various safety and efficacy outcomes. In this paper, we propose a new and more formal approach for constructing a supporting multi-criteria model that fully takes into account the evidence on efficacy and adverse drug reactions. Our approach is based on the stochastic multi-criteria acceptability analysis methodology, which allows us to compute the typical value judgments that support a decision, to quantify decision uncertainty, and to compute a comprehensive BR profile. We construct a multi-criteria model for the therapeutic group of second-generation antidepressants. We assess fluoxetine and venlafaxine together with placebo according to incidence of treatment response and three common adverse drug reactions by using data from a published study. Our model shows that there are clear trade-offs among the treatment alternatives. Copyright © 2011 John Wiley & Sons, Ltd.
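The core SMAA computation can be sketched as a Monte Carlo estimate of rank acceptability, i.e., how often each alternative comes out best under sampled criterion weights and sampled clinical evidence. All distributions below are invented placeholders, not the published data.

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples = 10_000
alternatives = ["fluoxetine", "venlafaxine", "placebo"]

def sample_criteria():
    """One joint sample of four criteria per alternative: treatment response
    (a benefit) and three adverse drug reactions (harms). The beta
    parameters are invented for illustration."""
    return {
        "fluoxetine":  [rng.beta(60, 40), rng.beta(20, 80), rng.beta(15, 85), rng.beta(10, 90)],
        "venlafaxine": [rng.beta(65, 35), rng.beta(25, 75), rng.beta(20, 80), rng.beta(12, 88)],
        "placebo":     [rng.beta(40, 60), rng.beta(8, 92),  rng.beta(5, 95),  rng.beta(4, 96)],
    }

rank1 = {a: 0 for a in alternatives}
for _ in range(n_samples):
    w = rng.dirichlet(np.ones(4))       # uniform weights: no preference info
    crit = sample_criteria()
    # Response counts positively, adverse reactions negatively.
    value = {a: w[0]*c[0] - w[1]*c[1] - w[2]*c[2] - w[3]*c[3]
             for a, c in crit.items()}
    rank1[max(value, key=value.get)] += 1

# Rank-1 acceptability: fraction of sampled weight/evidence combinations
# in which each alternative is the best choice.
acceptability = {a: n / n_samples for a, n in rank1.items()}
```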
Theoretical basis of the DOE-2 building energy use analysis program
NASA Astrophysics Data System (ADS)
Curtis, R. B.
1981-04-01
A user-oriented, public-domain computer program was developed that enables architects and engineers to perform design and retrofit studies of the energy use of buildings under realistic weather conditions. DOE-2.1A has been named by the US DOE as the standard evaluation technique for the Congressionally mandated building energy performance standards (BEPS). A number of program design decisions were made that determine the breadth of applicability of DOE-2.1. Such design decisions are intrinsic to all building energy use analysis computer programs and determine the types of buildings or the kinds of HVAC systems that can be modeled. In particular, the weighting-factor method used in DOE-2 has both advantages and disadvantages relative to other computer programs.
An Overview of R in Health Decision Sciences.
Jalal, Hawre; Pechlivanoglou, Petros; Krijkamp, Eline; Alarid-Escudero, Fernando; Enns, Eva; Hunink, M G Myriam
2017-10-01
As the complexity of health decision science applications increases, high-level programming languages are increasingly adopted for statistical analyses and numerical computations. These programming languages facilitate sophisticated modeling, model documentation, and analysis reproducibility. Among the high-level programming languages, the statistical programming framework R is gaining increased recognition. R is freely available, cross-platform compatible, and open source. A large community of users who have generated an extensive collection of well-documented packages and functions supports it. These functions facilitate applications of health decision science methodology as well as the visualization and communication of results. Although R's popularity is increasing among health decision scientists, methodological extensions of R in the field of decision analysis remain isolated. The purpose of this article is to provide an overview of existing R functionality that is applicable to the various stages of decision analysis, including model design, input parameter estimation, and analysis of model outputs.
Microeconomic Analysis with BASIC.
ERIC Educational Resources Information Center
Tom, C. F. Joseph
Computer programs written in BASIC for the study of microeconomic analysis with special emphasis in economic decisions on price, output, and profit of a business firm are described. A very brief overview of the content of each of the 28 computer programs comprising the course is provided; four of the programs are then discussed in greater detail.…
The Use of Computer Networks in Data Gathering and Data Analysis.
ERIC Educational Resources Information Center
Yost, Michael; Bremner, Fred
This document describes the review, analysis, and decision-making process that Trinity University, Texas, went through to develop the three-part computer network that they use to gather and analyze EEG (electroencephalography) and EKG (electrocardiogram) data. The data are gathered in the laboratory on a PDP-1124, an analog minicomputer. Once…
Common Sense Planning for a Computer, or, What's It Worth to You?
ERIC Educational Resources Information Center
Crawford, Walt
1984-01-01
Suggests factors to be considered in planning for the purchase of a microcomputer, including budgets, benefits, costs, and decisions. Major uses of a personal computer are described--word processing, financial analysis, file and database management, programming and computer literacy, education, entertainment, and thrill of high technology. (EJS)
Collaborative Brain-Computer Interface for Aiding Decision-Making
Poli, Riccardo; Valeriani, Davide; Cinel, Caterina
2014-01-01
We look at the possibility of integrating the percepts from multiple non-communicating observers as a means of achieving better joint perception and better group decisions. Our approach involves the combination of a brain-computer interface with human behavioural responses. To test ideas in controlled conditions, we asked observers to perform a simple matching task involving the rapid sequential presentation of pairs of visual patterns and the subsequent decision as to whether the two patterns in a pair were the same or different. We recorded the response times of observers as well as a neural feature which predicts incorrect decisions and, thus, indirectly indicates the confidence of the decisions made by the observers. We then built a composite neuro-behavioural feature which optimally combines the two measures. For group decisions, we used a majority rule and three rules which weigh the decisions of each observer based on response times and our neural and neuro-behavioural features. Results indicate that the integration of behavioural responses and neural features can significantly improve accuracy when compared with the majority rule. An analysis of event-related potentials indicates that substantial differences are present in the proximity of the response for correct and incorrect trials, further corroborating the idea of using hybrids of brain-computer interfaces and traditional strategies for improving decision making. PMID:25072739
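A minimal sketch of the confidence-weighted voting idea follows; the votes and confidence values are invented, and the actual study derived its weights from response times and neural features rather than the numbers shown here.

```python
def weighted_group_decision(decisions, confidences):
    """Each observer votes +1 ('same') or -1 ('different'); each vote is
    scaled by a per-trial confidence estimate (e.g., derived from response
    time and/or a neural error-prediction feature)."""
    score = sum(d * c for d, c in zip(decisions, confidences))
    return +1 if score >= 0 else -1

# Three observers: two low-confidence 'same' votes vs. one confident 'different'.
votes = [+1, +1, -1]
conf = [0.3, 0.2, 0.9]
group = weighted_group_decision(votes, conf)   # -1: the confident observer prevails
# A plain majority rule would instead return +1 here.
```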
Computational Support for Technology- Investment Decisions
NASA Technical Reports Server (NTRS)
Adumitroaie, Virgil; Hua, Hook; Lincoln, William; Block, Gary; Mrozinski, Joseph; Shelton, Kacie; Weisbin, Charles; Elfes, Alberto; Smith, Jeffrey
2007-01-01
Strategic Assessment of Risk and Technology (START) is a user-friendly computer program that assists human managers in making decisions regarding research-and-development investment portfolios in the presence of uncertainties and of non-technological constraints that include budgetary and time limits, restrictions related to infrastructure, and programmatic and institutional priorities. START facilitates quantitative analysis of technologies, capabilities, missions, scenarios and programs, and thereby enables the selection and scheduling of value-optimal development efforts. START incorporates features that, variously, perform or support a unique combination of functions, most of which are not systematically performed or supported by prior decision-support software. These functions include the following: Optimal portfolio selection using an expected-utility-based assessment of capabilities and technologies; Temporal investment recommendations; Distinctions between enhancing and enabling capabilities; Analysis of partial funding for enhancing capabilities; and Sensitivity and uncertainty analysis. START can run on almost any computing hardware, within Linux and related operating systems that include Mac OS X versions 10.3 and later, and can run in Windows under the Cygwin environment. START can be distributed in binary code form. START calls, as external libraries, several open-source software packages. Output is in Excel (.xls) file format.
Montgomery, Alan A; Emmett, Clare L; Fahey, Tom; Jones, Claire; Ricketts, Ian; Patel, Roshni R; Peters, Tim J; Murphy, Deirdre J
2007-06-23
To determine the effects of two computer based decision aids on decisional conflict and mode of delivery among pregnant women with a previous caesarean section. Randomised trial conducted from May 2004 to August 2006 in four maternity units in south west England and Scotland. Participants were 742 pregnant women with one previous lower segment caesarean section and delivery expected at ≥37 weeks; non-English speakers were excluded. Usual care: standard care given by obstetric and midwifery staff. Information programme: women navigated through descriptions and probabilities of clinical outcomes for mother and baby associated with planned vaginal birth, elective caesarean section, and emergency caesarean section. Decision analysis: mode of delivery was recommended based on utility assessments performed by the woman, combined with probabilities of clinical outcomes within a concealed decision tree. Both interventions were delivered via a laptop computer after brief instructions from a researcher. Main outcome measures were total score on the decisional conflict scale and mode of delivery. Women in the information programme (adjusted difference -6.2, 95% confidence interval -8.7 to -3.7) and decision analysis (-4.0, -6.5 to -1.5) groups had reduced decisional conflict compared with women in the usual care group. The rate of vaginal birth was higher for women in the decision analysis group compared with the usual care group (37% v 30%, adjusted odds ratio 1.42, 0.94 to 2.14), but the rates were similar in the information programme and usual care groups. Decision aids can help women who have had a previous caesarean section to decide on mode of delivery in a subsequent pregnancy. The decision analysis approach might substantially affect national rates of caesarean section. Trial registration: Current Controlled Trials ISRCTN84367722.
Systems Analysis and Design for Decision Support Systems on Economic Feasibility of Projects
NASA Astrophysics Data System (ADS)
Balaji, S. Arun
2010-11-01
This paper discusses the need to develop Decision Support System (DSS) software for assessing the economic feasibility of projects in Rwanda, Africa. The economic theories needed and the corresponding formulae to compute the payback period, internal rate of return, and benefit-cost ratio of projects are given. The paper also presents the systems flow chart needed to build the system in any high-level computing language, and describes the input requirements from the projects, the output needed by decision makers, and the data dictionary used for the input and output data structures.
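The three feasibility measures named above reduce to short numerical routines. The sketch below uses an invented cash-flow profile and bisection root-finding in place of whatever method the DSS actually employs.

```python
def payback_period(cashflows):
    """Years until cumulative net cash flow turns non-negative;
    cashflows[0] is the (negative) initial investment."""
    total = 0.0
    for year, cf in enumerate(cashflows):
        total += cf
        if total >= 0:
            return year
    return None                      # never pays back

def npv(rate, cashflows):
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def irr(cashflows, lo=-0.99, hi=10.0, tol=1e-6):
    """Internal rate of return by bisection on NPV(rate) = 0.
    Assumes a conventional cash-flow profile (one sign change)."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid, cashflows) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def benefit_cost_ratio(benefits, costs, rate):
    """Discounted benefits divided by discounted costs."""
    return npv(rate, benefits) / npv(rate, costs)

flows = [-1000.0, 300.0, 400.0, 500.0, 400.0]    # illustrative project
print(payback_period(flows), irr(flows))          # 3 years, IRR ~ 0.205
bcr = benefit_cost_ratio([0.0, 450.0, 450.0, 450.0],
                         [1000.0, 20.0, 20.0, 20.0], rate=0.1)  # ~1.07
```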
Instrumentation and computational modeling for evaluation of bridges substructures across waterways.
DOT National Transportation Integrated Search
2013-12-01
This State Study 229 was proposed as the Phase I study for implementing sensing technologies and computational analysis to assess bridge conditions and support decision-making for bridge maintenance in Mississippi. The objectives of the study are to:...
[Medical expert systems and clinical needs].
Buscher, H P
1991-10-18
The rapid expansion of computer-based systems for problem solving or decision making in medicine, the so-called medical expert systems, emphasizes the need for a reappraisal of their indications and value. Where specialist knowledge is required, and in particular where medical decisions are susceptible to error, these systems will probably serve as valuable support. In the near future, computer-based systems should be able to aid the interpretation of findings from technical investigations and the control of treatment, especially where rapid reactions are necessary despite the need for complex analysis of the investigated parameters. In the more distant future, complete support of the diagnostic procedure, from history taking to final diagnosis, is possible. It promises to be particularly attractive for the diagnosis of rare diseases, for difficult differential diagnoses, and for decision making involving expensive, risky, or new diagnostic or therapeutic methods. The physician needs to be aware of certain dangers, ranging from misleading information to outright abuse. Patient information often depends on subjective reports and error-prone observations, and although they may be based on problematic knowledge, computer-generated decisions can have an imperative effect on medical decision making. It must also be borne in mind that medical decisions should always combine the rational with a consideration of human motives.
Wei, Wei; Larrey-Lassalle, Pyrène; Faure, Thierry; Dumoulin, Nicolas; Roux, Philippe; Mathias, Jean-Denis
2016-03-01
Comparative decision making is widely used to identify which option (system, product, service, etc.) has the smaller environmental footprint and to provide recommendations that help stakeholders take future decisions. However, uncertainty complicates the comparison and the decision making. Probability-based decision support in LCA is a way to help stakeholders in their decision-making process. It calculates the decision confidence probability, which expresses the probability that one option has a smaller environmental impact than another. Here we apply reliability theory to approximate the decision confidence probability, comparing the traditional Monte Carlo method with a reliability method called the FORM method. The Monte Carlo method needs high computational time to calculate the decision confidence probability. The FORM method enables us to approximate the decision confidence probability with fewer simulations than the Monte Carlo method by approximating the response surface. Moreover, the FORM method calculates associated importance factors that correspond to a sensitivity analysis in relation to the probability; these importance factors allow stakeholders to determine which factors influence their decision. Our results clearly show that the reliability method provides additional useful information to stakeholders and reduces the computational time.
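The baseline Monte Carlo estimate of the decision confidence probability is a one-line computation once impact distributions are sampled; the sketch below uses invented distributions. FORM would instead approximate P(impact_A − impact_B < 0) by linearizing the limit-state surface at the most probable point, requiring far fewer model evaluations.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000

# Uncertain life-cycle impact scores of two options (distributions invented):
impact_a = rng.lognormal(mean=2.0, sigma=0.3, size=N)
impact_b = rng.lognormal(mean=2.2, sigma=0.4, size=N)

# Decision confidence probability: P(option A has the smaller impact).
p_a_better = np.mean(impact_a < impact_b)
```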
JPRS Report, Science & Technology, USSR: Computers
1987-09-29
Contents fragments: Reliability of Protected Systems (L.S. Stoykova, O.A. Yushchenko; KIBERNETIKA, No 5, Sep-Oct 86); Decision Making Based on Analysis of a Decision…; "…published by the Central Scientific Research Institute for Information and Technoeconomic Research on Material and Technical Supply (TsNIITEIMS) of…" "…what was said becomes clear after a subconscious analysis of the context. We have built our device according to the same pattern. In contrast to its…"
Tools of the Future: How Decision Tree Analysis Will Impact Mission Planning
NASA Technical Reports Server (NTRS)
Otterstatter, Matthew R.
2005-01-01
The universe is infinitely complex; however, the human mind has a finite capacity. The multitude of possible variables, metrics, and procedures in mission planning are far too many to address exhaustively. This is unfortunate because, in general, considering more possibilities leads to more accurate and more powerful results. To compensate, we can get more insightful results by employing our greatest tool, the computer. The power of the computer will be utilized through a technology that considers every possibility, decision tree analysis. Although decision trees have been used in many other fields, this is innovative for space mission planning. Because this is a new strategy, no existing software is able to completely accommodate all of the requirements. This was determined through extensive research and testing of current technologies. It was necessary to create original software, for which a short-term model was finished this summer. The model was built into Microsoft Excel to take advantage of the familiar graphical interface for user input, computation, and viewing output. Macros were written to automate the process of tree construction, optimization, and presentation. The results are useful and promising. If this tool is successfully implemented in mission planning, our reliance on old-fashioned heuristics, an error-prone shortcut for handling complexity, will be reduced. The computer algorithms involved in decision trees will revolutionize mission planning. The planning will be faster and smarter, leading to optimized missions with the potential for more valuable data.
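The core of such a tool is expected-value rollback: average over chance nodes, maximize over decision nodes. A minimal sketch with an invented toy mission-planning tree follows; it is not the Excel/macro implementation described above.

```python
def expected_value(node):
    """Roll a decision tree back to its expected value.
    Leaf: numeric payoff. Chance node: {'chance': [(prob, subtree), ...]}.
    Decision node: {'decide': {option_name: subtree, ...}}."""
    if isinstance(node, (int, float)):
        return node
    if "chance" in node:
        return sum(p * expected_value(sub) for p, sub in node["chance"])
    return max(expected_value(sub) for sub in node["decide"].values())

# Toy mission-planning choice with invented payoffs (science-value units):
tree = {"decide": {
    "risky_flyby": {"chance": [(0.6, 100.0), (0.4, 10.0)]},
    "safe_orbit": 55.0,
}}
best = max(tree["decide"], key=lambda o: expected_value(tree["decide"][o]))
# expected_value(tree) == 64.0 and best == 'risky_flyby'
```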
A Cross-National CAI Tool To Support Learning Operations Decision-Making and Market Analysis.
ERIC Educational Resources Information Center
Mockler, Robert J.; Afanasiev, Mikhail Y.; Dologite, Dorothy G.
1999-01-01
Describes bicultural (United States and Russia) development of a computer-aided instruction (CAI) tool to learn management decision-making using information systems technologies. The program has been used with undergraduate and graduate students in both countries; it integrates free and controlled market concepts and combines traditional computer…
NASA Technical Reports Server (NTRS)
Hale, Mark A.; Craig, James I.; Mistree, Farrokh; Schrage, Daniel P.
1995-01-01
Integrated Product and Process Development (IPPD) embodies the simultaneous application of both system and quality engineering methods throughout an iterative design process. The use of IPPD results in the time-conscious, cost-saving development of engineering systems. Georgia Tech has proposed the development of an Integrated Design Engineering Simulator that will merge Integrated Product and Process Development with interdisciplinary analysis techniques and state-of-the-art computational technologies. To implement IPPD, a Decision-Based Design perspective is encapsulated in an approach that focuses on the role of the human designer in product development. The approach has two parts and is outlined in this paper. First, an architecture, called DREAMS, is being developed that facilitates design from a decision-based perspective. Second, a supporting computing infrastructure, called IMAGE, is being designed. The current status of development is given and future directions are outlined.
Monitoring and decision making by people in man machine systems
NASA Technical Reports Server (NTRS)
Johannsen, G.
1979-01-01
The analysis of human monitoring and decision making behavior, as well as its modeling, is described. Classical and optimal-control-theoretic monitoring models are surveyed. The relationship between attention allocation and eye movements is discussed. As an example of applications, the evaluation of predictor displays by means of the optimal control model is explained. Fault detection involving continuous signals and the decision making behavior of a human operator engaged in fault diagnosis during different operation and maintenance situations are illustrated. Computer aided decision making is considered as a queueing problem. It is shown to what extent computer aids can be based on the state of human activity as measured by psychophysiological quantities. Finally, management information systems for different application areas are mentioned. The possibilities of mathematical modeling of human behavior in complex man machine systems are also critically assessed.
The Modeling of Human Intelligence in the Computer as Demonstrated in the Game of DIPLOMAT.
ERIC Educational Resources Information Center
Collins, James Edward; Paulsen, Thomas Dean
An attempt was made to develop human-like behavior in the computer. A theory of the human learning process was described. A computer game was presented which simulated the human capabilities of reasoning and learning. The program was required to make intelligent decisions based on past experiences and critical analysis of the present situation.…
Doubly Bayesian Analysis of Confidence in Perceptual Decision-Making.
Aitchison, Laurence; Bang, Dan; Bahrami, Bahador; Latham, Peter E
2015-10-01
Humans stand out from other animals in that they are able to explicitly report on the reliability of their internal operations. This ability, which is known as metacognition, is typically studied by asking people to report their confidence in the correctness of some decision. However, the computations underlying confidence reports remain unclear. In this paper, we present a fully Bayesian method for directly comparing models of confidence. Using a visual two-interval forced-choice task, we tested whether confidence reports reflect heuristic computations (e.g. the magnitude of sensory data) or Bayes optimal ones (i.e. how likely a decision is to be correct given the sensory data). In a standard design in which subjects were first asked to make a decision, and only then gave their confidence, subjects were mostly Bayes optimal. In contrast, in a less-commonly used design in which subjects indicated their confidence and decision simultaneously, they were roughly equally likely to use the Bayes optimal strategy or to use a heuristic but suboptimal strategy. Our results suggest that, while people's confidence reports can reflect Bayes optimal computations, even a small unusual twist or additional element of complexity can prevent optimality.
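The contrast between heuristic and Bayes-optimal confidence can be made concrete for a two-interval task with Gaussian evidence. The sketch below assumes equal priors and a known signal strength, both simplifications of the paper's full model comparison; all parameter values are invented.

```python
import math

MU, SIGMA = 1.0, 1.0   # assumed signal strength and sensory noise

def confidences(x1, x2):
    """Evidence x1, x2 from the two intervals; the subject chooses the
    interval with larger evidence. Returns (heuristic, Bayes-optimal)
    confidence under Gaussian noise and equal priors."""
    d = x1 - x2
    heuristic = abs(d)                     # magnitude of the sensory data
    log_lr = MU * d / SIGMA**2             # log likelihood ratio for interval 1
    p1 = 1.0 / (1.0 + math.exp(-log_lr))   # P(signal in interval 1 | data)
    bayes = max(p1, 1.0 - p1)              # P(chosen interval correct | data)
    return heuristic, bayes

h, b = confidences(1.3, 0.4)   # h = 0.9, b ~ 0.71
```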
Interfacing Computer Aided Parallelization and Performance Analysis
NASA Technical Reports Server (NTRS)
Jost, Gabriele; Jin, Haoqiang; Labarta, Jesus; Gimenez, Judit; Biegel, Bryan A. (Technical Monitor)
2003-01-01
When porting sequential applications to parallel computer architectures, the program developer will typically go through several cycles of source code optimization and performance analysis. We have started a project to develop an environment where the user can jointly navigate through program structure and performance data information in order to make efficient optimization decisions. In a prototype implementation we have interfaced the CAPO computer aided parallelization tool with the Paraver performance analysis tool. We describe both tools and their interface and give an example for how the interface helps within the program development cycle of a benchmark code.
NASA Astrophysics Data System (ADS)
Rajabzadeh-Oghaz, Hamidreza; Varble, Nicole; Davies, Jason M.; Mowla, Ashkan; Shakir, Hakeem J.; Sonig, Ashish; Shallwani, Hussain; Snyder, Kenneth V.; Levy, Elad I.; Siddiqui, Adnan H.; Meng, Hui
2017-03-01
Neurosurgeons currently base most of their treatment decisions for intracranial aneurysms (IAs) on morphological measurements made manually from 2D angiographic images. These measurements tend to be inaccurate because 2D measurements cannot capture the complex geometry of IAs and because manual measurements vary depending on the clinician's experience and opinion. Incorrect morphological measurements may lead to inappropriate treatment strategies. In order to improve the accuracy and consistency of morphological analysis of IAs, we have developed an image-based computational tool, AView. In this study, we quantified the accuracy of the computer-assisted adjuncts of AView for aneurysmal morphologic assessment by performing measurements on spheres of known size and on anatomical IA models. AView has an average morphological error of 0.56% in size and 2.1% in volume measurement. We also investigated the clinical utility of this tool on a retrospective clinical dataset, comparing size and neck diameter measurements between 2D manual and 3D computer-assisted measurement. The average error was 22% and 30% in the manual measurement of size and aneurysm neck diameter, respectively. Inaccuracies due to manual measurements could therefore lead to wrong treatment decisions in 44% and inappropriate treatment strategies in 33% of the IAs. Furthermore, computer-assisted analysis of IAs improves the consistency in measurement among clinicians by 62% in size and 82% in neck diameter measurement. We conclude that AView dramatically improves accuracy for morphological analysis. These results illustrate the necessity of a computer-assisted approach for the morphological analysis of IAs.
Bioinformatics in proteomics: application, terminology, and pitfalls.
Wiemer, Jan C; Prokudin, Alexander
2004-01-01
Bioinformatics applies data mining, i.e., modern computer-based statistics, to biomedical data. It leverages machine learning approaches, such as artificial neural networks, decision trees, and clustering algorithms, and is ideally suited for handling huge amounts of data. In this article, we review the analysis of mass spectrometry data in proteomics, starting with common pre-processing steps and using single decision trees and decision tree ensembles for classification. Special emphasis is put on the pitfall of overfitting, i.e., of generating overly complex single decision trees. Finally, we discuss the pros and cons of the two different decision tree usages.
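The overfitting pitfall is easy to demonstrate: an unconstrained tree memorizes the training set, while a depth-limited tree or an ensemble generalizes better. The sketch below uses scikit-learn and synthetic data as stand-ins for the article's actual proteomics pipeline.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier

# Synthetic stand-in for pre-processed mass-spectrometry peak intensities.
X, y = make_classification(n_samples=300, n_features=50, n_informative=5,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

deep = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)      # unconstrained
shallow = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_tr, y_tr)
forest = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)

# The unconstrained tree typically scores ~1.0 on training data but drops on
# held-out data; limiting depth or averaging an ensemble narrows the gap.
for name, m in [("deep", deep), ("shallow", shallow), ("forest", forest)]:
    print(name, m.score(X_tr, y_tr), m.score(X_te, y_te))
```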
Effects of computer-based training on procedural modifications to standard functional analyses.
Schnell, Lauren K; Sidener, Tina M; DeBar, Ruth M; Vladescu, Jason C; Kahng, SungWoo
2018-01-01
Few studies have evaluated methods for training decision-making when functional analysis data are undifferentiated. The current study evaluated computer-based training to teach 20 graduate students to arrange functional analysis conditions, analyze functional analysis data, and implement procedural modifications. Participants were exposed to training materials using interactive software during a 1-day session. Following the training, mean scores on the posttest, novel cases probe, and maintenance probe increased for all participants. These results replicate previous findings during a 1-day session and include a measure of participant acceptability of the training. Recommendations for future research on computer-based training and functional analysis are discussed. © 2017 Society for the Experimental Analysis of Behavior.
A Computer String-Grammar of English.
ERIC Educational Resources Information Center
Sager, Naomi
This volume is the fourth in a series of detailed reports on a working computer program for the syntactic analysis of English sentences into their component strings. The report (1) records the considerations involved in various decisions among alternative grammatical formulations and presents the word-subclasses, the linguistic strings, etc., for…
Trusted Advisors, Decision Models and Other Keys to Communicating Science to Decision Makers
NASA Astrophysics Data System (ADS)
Webb, E.
2006-12-01
Water resource management decisions often involve multiple parties engaged in contentious negotiations that try to navigate through complex combinations of legal, social, hydrologic, financial, and engineering considerations. The standard approach for resolving these issues is some form of multi-party negotiation, a formal court decision, or a combination of the two. In all these cases, the role of the decision maker(s) is to choose and implement the best option that fits the needs and wants of the community. However, each path to a decision carries the risk of technical and/or financial infeasibility as well as the possibility of unintended consequences. To help reduce this risk, decision makers often rely on some type of predictive analysis from which they can evaluate the projected consequences of their decisions. Typically, decision makers are supported in the analysis process by trusted advisors who engage in the analysis as well as the day-to-day tasks associated with multi-party negotiations. In the case of water resource management, the analysis is frequently a numerical model or set of models that can simulate various management decisions across multiple systems and output results that illustrate the impact on areas of concern. Thus, for scientific knowledge to reach the decision makers, the communication between the analysts, the trusted advisor, and the decision maker must be clear and direct. To illustrate this concept, a multi-attribute decision analysis matrix will be used to outline the value of computer model-based collaborative negotiation approaches to guide water resources decision making and communication with decision makers. In addition, the critical role of the trusted advisor and other secondary participants in the decision process will be discussed using examples from recent water negotiations.
A Gaussian Approximation Approach for Value of Information Analysis.
Jalal, Hawre; Alarid-Escudero, Fernando
2018-02-01
Most decisions are associated with uncertainty. Value of information (VOI) analysis quantifies the opportunity loss associated with choosing a suboptimal intervention based on current imperfect information. VOI can inform the value of collecting additional information, resource allocation, research prioritization, and future research designs. However, in practice, VOI remains underused due to many conceptual and computational challenges associated with its application. Expected value of sample information (EVSI) is rooted in Bayesian statistical decision theory and measures the value of information from a finite sample. The past few years have witnessed a dramatic growth in computationally efficient methods to calculate EVSI, including metamodeling. However, little research has been done to simplify the experimental data collection step inherent to all EVSI computations, especially for correlated model parameters. This article proposes a general Gaussian approximation (GA) of the traditional Bayesian updating approach based on the original work by Raiffa and Schlaifer to compute EVSI. The proposed approach uses a single probabilistic sensitivity analysis (PSA) data set and involves 2 steps: 1) a linear metamodel step to compute the EVSI on the preposterior distributions and 2) a GA step to compute the preposterior distribution of the parameters of interest. The proposed approach is efficient and can be applied for a wide range of data collection designs involving multiple non-Gaussian parameters and unbalanced study designs. Our approach is particularly useful when the parameters of an economic evaluation are correlated or interact.
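As a rough sketch of the two ingredients described above (not the authors' implementation), the following fragment fits a linear metamodel of incremental net benefit on a single PSA sample and applies a Gaussian preposterior shrinkage; the prior effective sample size n0, the linear INB model, and all numbers are assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical PSA: one uncertain parameter theta and the incremental net
# benefit (INB) of intervention B vs A; both are illustrative stand-ins.
K = 10_000
theta = rng.normal(0.6, 0.2, K)                      # prior PSA draws
inb = 5_000 * (theta - 0.5) + rng.normal(0, 400, K)  # simulated INB per draw

# Linear metamodel of INB as a function of theta.
beta1, beta0 = np.polyfit(theta, inb, 1)

# Gaussian approximation of the preposterior mean of theta for a study of
# size n; n0 is an assumed prior "effective sample size".
n, n0 = 100, 25
shrink = n / (n + n0)
theta_prepost = theta.mean() + np.sqrt(shrink) * (theta - theta.mean())

inb_prepost = beta0 + beta1 * theta_prepost
evsi = np.mean(np.maximum(inb_prepost, 0)) - max(np.mean(inb_prepost), 0)
print(f"EVSI ~ {evsi:.1f} (monetary units per person)")
```

The EVSI here is the expected gain from deciding after seeing the sample (mean of the per-draw best option) over deciding now (best option at the current mean), the standard preposterior logic traced back to Raiffa and Schlaifer.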
Growth Dynamics of Information Search Services
ERIC Educational Resources Information Center
Lindquist, Mats G.
1978-01-01
An analysis of computer-based information search services (ISSs) from a systems viewpoint is presented, using a continuous simulation model to reveal the growth and stagnation of a typical system, together with an analysis of decision making for an ISS. (Author/MBR)
Colas, Jaron T
2017-01-01
In principle, formal dynamical models of decision making hold the potential to represent fundamental computations underpinning value-based (i.e., preferential) decisions in addition to perceptual decisions. Sequential-sampling models such as the race model and the drift-diffusion model that are grounded in simplicity, analytical tractability, and optimality remain popular, but some of their more recent counterparts have instead been designed with an aim for more feasibility as architectures to be implemented by actual neural systems. Connectionist models are proposed herein at an intermediate level of analysis that bridges mental phenomena and underlying neurophysiological mechanisms. Several such models drawing elements from the established race, drift-diffusion, feedforward-inhibition, divisive-normalization, and competing-accumulator models were tested with respect to fitting empirical data from human participants making choices between foods on the basis of hedonic value rather than a traditional perceptual attribute. Even when considering performance at emulating behavior alone, more neurally plausible models were set apart from more normative race or drift-diffusion models both quantitatively and qualitatively despite remaining parsimonious. To best capture the paradigm, a novel six-parameter computational model was formulated with features including hierarchical levels of competition via mutual inhibition as well as a static approximation of attentional modulation, which promotes "winner-take-all" processing. Moreover, a meta-analysis encompassing several related experiments validated the robustness of model-predicted trends in humans' value-based choices and concomitant reaction times. These findings have yet further implications for analysis of neurophysiological data in accordance with computational modeling, which is also discussed in this new light.
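For intuition about the sequential-sampling baseline the abstract discusses, here is a minimal drift-diffusion simulation; the mapping of drift rate to hedonic-value difference and all parameter values are assumptions, not the paper's fitted models:

```python
import numpy as np

rng = np.random.default_rng(0)

def ddm_trial(drift, threshold=1.0, dt=0.001, noise=1.0, max_t=5.0):
    """Simulate one drift-diffusion trial; returns (choice, reaction time)."""
    x, t = 0.0, 0.0
    while abs(x) < threshold and t < max_t:
        x += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
        t += dt
    return (1 if x >= threshold else 0), t

# Drift proportional to the hedonic-value difference between two foods
# (an assumed linkage standing in for the fitted models in the paper).
value_diff = 0.8
results = [ddm_trial(drift=2.0 * value_diff) for _ in range(500)]
choices, rts = zip(*results)
print(f"P(choose higher-valued item) = {np.mean(choices):.2f}, "
      f"mean RT = {np.mean(rts):.2f} s")
```

The more neurally plausible connectionist variants in the article replace this single accumulator with mutually inhibiting units, but the choice-probability and reaction-time outputs being fitted are of the same kind.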
Happenstance and compromise: a gendered analysis of students' computing degree course selection
NASA Astrophysics Data System (ADS)
Lang, Catherine
2010-12-01
The number of students choosing to study computing at university has continued to decline this century, with an even sharper decline among female students. This article presents the results of a series of interviews with university students studying computing courses in Australia, which uncovered the influence of happenstance and compromise on course choice. The investigation provides insight into the factors contributing to the continued downturn in student diversity in computing bachelor degree courses. Many of the females interviewed had made decisions based on happenstance, many of the males interviewed had chosen computing as a compromise course, and family played a large role in the decision-making of both genders. The major implication of this investigation is the finding that students of both genders appear to be socialised away from the discipline, which is perceived as a support or insurance skill rather than a career in itself, by all but the most technically oriented (usually male) students.
Kostopoulos, Spiros; Ravazoula, Panagiota; Asvestas, Pantelis; Kalatzis, Ioannis; Xenogiannopoulos, George; Cavouras, Dionisis; Glotsos, Dimitris
2017-06-01
Histopathology image processing, analysis and computer-aided diagnosis have been shown to be effective assisting tools towards reliable and intra-/inter-observer invariant decisions in traditional pathology. Especially for cancer patients, decisions need to be as accurate as possible in order to increase the probability of optimal treatment planning. In this study, we propose a new image collection library (HICL-Histology Image Collection Library) comprising 3831 histological images of three different diseases, for fostering research in histopathology image processing, analysis and computer-aided diagnosis. Raw data comprised 93, 116 and 55 cases of brain, breast and laryngeal cancer respectively, collected from the archives of the University Hospital of Patras, Greece. The 3831 images were generated from the most representative regions of the pathology, specified by an experienced histopathologist. The HICL Image Collection is free for access under an academic license at http://medisp.bme.teiath.gr/hicl/ . Potential exploitations of the proposed library may span a broad spectrum, such as image processing to improve visualization, segmentation for nuclei detection, decision support systems for second opinion consultations, statistical analysis for investigation of potential correlations between clinical annotations and imaging findings and, generally, fostering research on histopathology image processing and analysis. To the best of our knowledge, the HICL constitutes the first attempt towards the creation of a reference image collection library in the field of traditional histopathology that is publicly and freely available to the scientific community.
Chaisangmongkon, Warasinee; Swaminathan, Sruthi K.; Freedman, David J.; Wang, Xiao-Jing
2017-01-01
Decision making involves dynamic interplay between internal judgements and external perception, which has been investigated in delayed match-to-category (DMC) experiments. Our analysis of neural recordings shows that, during DMC tasks, LIP and PFC neurons demonstrate mixed, time-varying, and heterogeneous selectivity, but previous theoretical work has not established the link between these neural characteristics and population-level computations. We trained a recurrent network model to perform DMC tasks and found that the model can remarkably reproduce key features of neuronal selectivity at the single-neuron and population levels. Analysis of the trained networks elucidates that robust transient trajectories of the neural population are the key driver of sequential categorical decisions. The directions of trajectories are governed by the network's self-organized connectivity, defining a 'neural landscape' consisting of a task-tailored arrangement of slow states and dynamical tunnels. With this model, we can identify functionally relevant circuit motifs and generalize the framework to solve other categorization tasks.
PRO-Elicere: A Study for Create a New Process of Dependability Analysis of Space Computer Systems
NASA Astrophysics Data System (ADS)
da Silva, Glauco; Netto Lahoz, Carlos Henrique
2013-09-01
This paper presents a new approach to computer system dependability analysis, called PRO-ELICERE, which introduces data mining concepts and intelligent decision-support mechanisms for analyzing the potential hazards and failures of a critical computer system. Some techniques and tools that support traditional dependability analysis are also presented, and the concepts of knowledge discovery and intelligent databases for critical computer systems are briefly discussed. The paper then introduces the PRO-ELICERE process, an intelligent approach to automating ELICERE, a process created to extract non-functional requirements for critical computer systems. PRO-ELICERE can be used in the V&V activities of projects at the Institute of Aeronautics and Space, such as the Brazilian Satellite Launcher (VLS-1).
Computational Complexity and Human Decision-Making.
Bossaerts, Peter; Murawski, Carsten
2017-12-01
The rationality principle postulates that decision-makers always choose the best action available to them. It underlies most modern theories of decision-making. The principle does not take into account the difficulty of finding the best option. Here, we propose that computational complexity theory (CCT) provides a framework for defining and quantifying the difficulty of decisions. We review evidence showing that human decision-making is affected by computational complexity. Building on this evidence, we argue that most models of decision-making, and metacognition, are intractable from a computational perspective. To be plausible, future theories of decision-making will need to take into account both the resources required for implementing the computations implied by the theory, and the resource constraints imposed on the decision-maker by biology. Copyright © 2017 Elsevier Ltd. All rights reserved.
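A toy illustration of the underlying point (ours, not the authors'): even a small combinatorial choice problem, solved by the exhaustive search that strict rationality would seem to demand, exhibits the exponential blow-up that makes exact optimization implausible as a model of human decision-making:

```python
from itertools import combinations
import time

def best_subset(values, costs, budget):
    """Exhaustive search over all 2^n subsets -- tractable only for tiny n."""
    n, best = len(values), 0.0
    for r in range(n + 1):
        for idx in combinations(range(n), r):
            if sum(costs[i] for i in idx) <= budget:
                best = max(best, sum(values[i] for i in idx))
    return best

# Runtime grows roughly as 2^n: each +4 items multiplies the work by 16.
for n in (10, 14, 18):
    values, costs = list(range(1, n + 1)), [2] * n
    t0 = time.perf_counter()
    best_subset(values, costs, budget=n)
    print(f"n={n:2d}: {time.perf_counter() - t0:.3f} s  ({2**n} subsets)")
```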
CHAMPION: Intelligent Hierarchical Reasoning Agents for Enhanced Decision Support
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hohimer, Ryan E.; Greitzer, Frank L.; Noonan, Christine F.
2011-11-15
We describe the design and development of an advanced reasoning framework employing semantic technologies, organized within a hierarchy of computational reasoning agents that interpret domain-specific information. Designed around a metaphor inspired by the pattern-recognition functions performed by the human neocortex, the CHAMPION reasoning framework represents a new computational modeling approach that derives invariant knowledge representations through memory-prediction belief propagation processes driven by formal ontological language specification and semantic technologies. The CHAMPION framework shows promise for enhancing complex decision making in diverse problem domains including cyber security, nonproliferation and energy consumption analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Webb, Erik Karl; Tidwell, Vincent Carroll
2009-10-01
This document outlines ways to more effectively communicate with U.S. Federal decision makers by outlining the structure, authority, and motivations of various Federal groups, how to find the trusted advisors, and how to structure communication. All three branches of the Federal government have decision makers engaged in resolving major policy issues. The Legislative Branch (Congress) negotiates the authority and the resources that can be used by the Executive Branch. The Executive Branch has some latitude in implementation and prioritizing resources. The Judicial Branch resolves disputes. The goal of all decision makers is to choose and implement the option that best fits the needs and wants of the community. However, understanding the risk of technical, political and/or financial infeasibility and possible unintended consequences is extremely difficult. Primarily, decision makers are supported in their deliberations by trusted advisors who engage in the analysis of options as well as the day-to-day tasks associated with multi-party negotiations. In the best case, the trusted advisors use many sources of information to inform the process, including the opinion of experts and, if possible, predictive analysis from which they can evaluate the projected consequences of their decisions. The paper covers the following: (1) Understanding Executive and Legislative decision makers - What can these decision makers do? (2) Finding the target audience - Who are the internal and external trusted advisors? (3) Packaging the message - How do we parse and integrate information, and how do we use computer simulation or models in policy communication?
NASA Technical Reports Server (NTRS)
Barth, Tim; Zapata, Edgar; Benjamin, Perakath; Graul, Mike; Jones, Doug
2005-01-01
Portfolio Analysis Tool (PAT) is a Web-based, client/server computer program that helps managers of multiple projects funded by different customers make decisions regarding investments in those projects. PAT facilitates analysis on a macroscopic level, without distraction by parochial concerns or tactical details of individual projects, so that managers' decisions can reflect the broad strategy of their organization. PAT is accessible via almost any Web-browser software. Experts in specific projects can contribute to a broad database that managers can use in analyzing the costs and benefits of all projects, but they do not have access to modify the criteria used for analyzing projects: such access is limited to managers according to levels of administrative privilege. PAT affords flexibility for modifying criteria for particular "focus areas" so as to enable standardization of criteria among similar projects, thereby making it possible to improve assessments without the need to rewrite computer code or rehire experts, further reducing the cost of maintaining and upgrading the software. Information in the PAT database and results of PAT analyses can be incorporated into a variety of ready-made or customizable tabular or graphical displays.
Computational Fluid Dynamics Analysis of Thoracic Aortic Dissection
NASA Astrophysics Data System (ADS)
Tang, Yik; Fan, Yi; Cheng, Stephen; Chow, Kwok
2011-11-01
Thoracic Aortic Dissection (TAD) is a cardiovascular disease with high mortality. An aortic dissection is formed when blood infiltrates the layers of the vascular wall, and a new artificial channel, the false lumen, is created. The expansion of the blood vessel due to the weakened wall enhances the risk of rupture. Computational fluid dynamics analysis is performed to study the hemodynamics of this pathological condition. Both idealized geometries and realistic patient configurations from computed tomography (CT) images are investigated. Physiological boundary conditions from in vivo measurements are employed. Flow configuration and biomechanical forces are studied. Quantitative analysis allows clinicians to assess the risk of rupture in making decisions regarding surgical intervention.
NASA Astrophysics Data System (ADS)
Chen, Ting-Yu
2012-06-01
This article presents a useful method for relating anchor dependency and accuracy functions to multiple attribute decision-making (MADM) problems in the context of Atanassov intuitionistic fuzzy sets (A-IFSs). Considering anchored judgement with displaced ideals and solution precision with minimal hesitation, several auxiliary optimisation models have been proposed to obtain the optimal weights of the attributes and to acquire the corresponding TOPSIS (the technique for order preference by similarity to the ideal solution) index for alternative rankings. Aside from the TOPSIS index, as a decision-maker's personal characteristics and perception of self may also influence the direction of choice, the evaluation of alternatives is conducted based on the distances of each alternative from the positive and negative ideal alternatives, respectively. This article originates from Li's [Li, D.-F. (2005), 'Multiattribute Decision Making Models and Methods Using Intuitionistic Fuzzy Sets', Journal of Computer and System Sciences, 70, 73-85] work, a seminal study of intuitionistic fuzzy decision analysis using deduced auxiliary programming models, and takes it as a benchmark method for comparative studies on anchor dependency and accuracy functions. The feasibility and effectiveness of the proposed methods are illustrated by a numerical example. Finally, a comparative analysis is conducted with computational experiments on averaging accuracy functions, TOPSIS indices, separation measures from positive and negative ideal alternatives, consistency rates of ranking orders, contradiction rates of the top alternative and average Spearman correlation coefficients.
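For readers unfamiliar with the TOPSIS index itself, a crisp (non-fuzzy) sketch follows; the A-IFS machinery, anchor dependency, and accuracy functions of the article are omitted, and the decision matrix and weights are invented:

```python
import numpy as np

# Crisp TOPSIS sketch: rows are alternatives, columns are benefit-type
# attributes; the numbers and weights are illustrative assumptions.
X = np.array([[7.0, 9.0, 6.0],
              [8.0, 6.0, 8.0],
              [6.0, 8.0, 9.0]])
w = np.array([0.5, 0.3, 0.2])               # attribute weights (assumed)

R = X / np.linalg.norm(X, axis=0)           # vector-normalize each column
V = R * w                                   # weighted normalized matrix
v_pos, v_neg = V.max(axis=0), V.min(axis=0) # ideal and anti-ideal points

d_pos = np.linalg.norm(V - v_pos, axis=1)   # separation from the ideal
d_neg = np.linalg.norm(V - v_neg, axis=1)   # separation from the anti-ideal
closeness = d_neg / (d_pos + d_neg)         # TOPSIS index; higher is better

print("ranking (best first):", np.argsort(-closeness))
```

The article's contribution lies in how the weights w and the separation measures are derived under intuitionistic fuzziness; the ranking logic above is the shared skeleton.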
Pasta, D J; Taylor, J L; Henning, J M
1999-01-01
Decision-analytic models are frequently used to evaluate the relative costs and benefits of alternative therapeutic strategies for health care. Various types of sensitivity analysis are used to evaluate the uncertainty inherent in the models. Although probabilistic sensitivity analysis is more difficult theoretically and computationally, the results can be much more powerful and useful than deterministic sensitivity analysis. The authors show how a Monte Carlo simulation can be implemented using standard software to perform a probabilistic sensitivity analysis incorporating the bootstrap. The method is applied to a decision-analytic model evaluating the cost-effectiveness of Helicobacter pylori eradication. The necessary steps are straightforward and are described in detail. The use of the bootstrap avoids certain difficulties encountered with theoretical distributions. The probabilistic sensitivity analysis provided insights into the decision-analytic model beyond the traditional base-case and deterministic sensitivity analyses and should become the standard method for assessing sensitivity.
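A minimal sketch of the described procedure, with an invented data set and a deliberately toy decision model in place of the H. pylori model:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical patient-level data behind two model inputs (the actual study
# used H. pylori eradication data); cure indicators and costs are invented.
cure = rng.binomial(1, 0.85, 200)      # observed cure outcomes
cost = rng.gamma(2.0, 150.0, 200)      # observed per-patient costs

n_sim, icers = 5_000, []
for _ in range(n_sim):
    # Bootstrap step: resample the data rather than assume a distribution.
    idx = rng.integers(0, len(cure), len(cure))
    p_cure, mean_cost = cure[idx].mean(), cost[idx].mean()
    # Toy decision model: incremental cost per additional cure vs no therapy.
    delta_effect = p_cure - 0.40       # assumed spontaneous-cure rate
    icers.append(mean_cost / delta_effect)

lo, hi = np.percentile(icers, [2.5, 97.5])
print(f"ICER 95% interval: {lo:.0f} to {hi:.0f} per additional cure")
```

As the authors note, resampling the observed data sidesteps the difficulty of choosing theoretical distributions for correlated or skewed inputs.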
Financial Analysis for R&D Decisions.
ERIC Educational Resources Information Center
Carter, Robert
1997-01-01
Using personal computer spreadsheet software, standard corporate financial analysis can help university research administrators communicate the value of research and development to sponsors and other stakeholders; balance projects, technologies, or categories of research; and continually assess the value of investing in ongoing projects. It also…
DOT National Transportation Integrated Search
1996-11-01
The Highway Economic Requirements System (HERS) is a computer model designed to simulate improvement selection decisions based on the relative benefit-cost merits of alternative improvement options. HERS is intended to estimate national level investm...
The analysis of the pilot's cognitive and decision processes
NASA Technical Reports Server (NTRS)
Curry, R. E.
1975-01-01
Articles are presented on pilot performance in zero-visibility precision approach, failure detection by pilots during automatic landing, experiments in pilot decision-making during simulated low visibility approaches, a multinomial maximum likelihood program, and a random search algorithm for laboratory computers. Other topics discussed include detection of system failures in multi-axis tasks and changes in pilot workload during an instrument landing.
NASA Technical Reports Server (NTRS)
Nez, G. (Principal Investigator); Mutter, D.
1977-01-01
The author has identified the following significant results. New LANDSAT analysis software and linkages with other computer mapping software were developed. Significant results were also achieved in training, communication, and identification of needs for developing the LANDSAT/computer mapping technologies into operational tools for use by decision makers.
Desktop microsimulation: a tool to improve efficiency in the medical office practice.
Montgomery, James B; Linville, Beth A; Slonim, Anthony D
2013-01-01
Because the economic crisis in the United States continues to have an impact on healthcare organizations, industry leaders must optimize their decision making. Discrete-event computer simulation is a quality tool with a demonstrated track record of improving the precision of analysis for process redesign. However, the use of simulation to consolidate practices and design efficiencies into an unfinished medical office building was a unique task. A discrete-event computer simulation package was used to model the operations and forecast future results for four orthopedic surgery practices. The scenarios were created to allow an evaluation of the impact of process change on the output variables of exam room utilization, patient queue size, and staff utilization. The model helped with decisions regarding space allocation and efficient exam room use by demonstrating the impact of process changes in patient queues at check-in/out, x-ray, and cast room locations when compared to the status quo model. The analysis impacted decisions on facility layout, patient flow, and staff functions in this newly consolidated practice. Simulation was found to be a useful tool for process redesign and decision making even prior to building occupancy. © 2011 National Association for Healthcare Quality.
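A bare-bones version of such a model can be written as a discrete-event loop in plain Python (the study used a commercial simulation package); the room count, arrival rate, and visit times below are invented:

```python
import heapq, random

random.seed(7)

def simulate_clinic(n_rooms=4, n_patients=200, mean_arrival=5.0, mean_visit=18.0):
    """Minimal discrete-event model of exam-room use (all rates invented)."""
    t = 0.0
    free_at = [0.0] * n_rooms          # next-free time for each exam room
    heapq.heapify(free_at)
    waits, busy = [], 0.0
    for _ in range(n_patients):
        t += random.expovariate(1.0 / mean_arrival)   # next patient arrives
        room_free = heapq.heappop(free_at)            # earliest-free room
        start = max(t, room_free)                     # wait if room occupied
        visit = random.expovariate(1.0 / mean_visit)
        waits.append(start - t)
        busy += visit
        heapq.heappush(free_at, start + visit)
    makespan = max(max(free_at), t)
    return sum(waits) / len(waits), busy / (n_rooms * makespan)

wait, util = simulate_clinic()
print(f"mean wait = {wait:.1f} min, room utilization = {util:.0%}")
```

Rerunning with different n_rooms or process changes (e.g., shorter check-in) is the kind of what-if comparison that drove the facility-layout decisions described above.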
Bures, Vladimír; Otcenásková, Tereza; Cech, Pavel; Antos, Karel
2012-11-01
Biological incidents jeopardising public health require decision-making characterized by one dominant feature: complexity. Public health decision-makers therefore require appropriate support. Based on an analogy with business intelligence (BI) principles, a contextual analysis of the environment and available data resources, and conceptual modelling within systems and knowledge engineering, this paper proposes a general framework for computer-based decision support in the case of a biological incident. At the outset, an analysis of potential inputs to the framework is conducted, and several resources such as demographic information, strategic documents, environmental characteristics, agent descriptors and surveillance systems are considered. Consequently, three prototypes were developed, tested and evaluated by a group of experts; their selection was based on the overall framework scheme. Subsequently, an ontology prototype linked with an inference engine, a multi-agent-based model focusing on the simulation of an environment, and expert-system prototypes were created. All prototypes proved to be utilisable support tools for decision-making in the field of public health. Nevertheless, the research revealed further issues and challenges that might be investigated by both public health focused researchers and practitioners.
Discharge Chamber Primary Electron Modeling Activities in Three-Dimensions
NASA Technical Reports Server (NTRS)
Steuber, Thomas J.
2004-01-01
Designing discharge chambers for ion thrusters involves many geometric configuration decisions. These decisions impact discharge chamber performance with respect to propellant utilization efficiency, ion production costs, and grid lifetime, and they can benefit from the assistance of computational modeling. Computational modeling for discharge chambers has so far been limited to two-dimensional codes that leveraged symmetry for interpretation into three-dimensional analysis. This paper presents model development activities towards a three-dimensional discharge chamber simulation to aid discharge chamber design decisions. Specifically, among the many geometric configuration decisions involved in attaining a high-performance discharge chamber, this paper focuses on magnetic circuit considerations, using a three-dimensional discharge chamber simulation as a tool. With this tool, candidate discharge chamber magnetic circuit designs can be analyzed computationally to gain insight into factors that may influence discharge chamber performance, such as primary electron loss width in magnetic cusps, cathode tip position with respect to the low magnetic field volume, definition of a low magnetic field region, and maintenance of a low magnetic field region across the grid span. Corroborating experimental data will be obtained from mockup hardware tests. Initially, simulated candidate magnetic circuit designs will resemble previous successful thruster designs. To provide an opportunity to improve beyond previous performance benchmarks, off-design modifications will be simulated and experimentally tested.
Preaching What We Practice: Teaching Ethical Decision-Making to Computer Security Professionals
NASA Astrophysics Data System (ADS)
Fleischmann, Kenneth R.
The biggest challenge facing computer security researchers and professionals is not learning how to make ethical decisions; rather it is learning how to recognize ethical decisions. All too often, technology development suffers from what Langdon Winner terms technological somnambulism - we sleepwalk through our technology design, following past precedents without a second thought, and fail to consider the perspectives of other stakeholders [1]. Computer security research and practice involves a number of opportunities for ethical decisions. For example, decisions about whether or not to automatically provide security updates involve tradeoffs related to caring versus user autonomy. Decisions about online voting include tradeoffs between convenience and security. Finally, decisions about routinely screening e-mails for spam involve tradeoffs of efficiency and privacy. It is critical that these and other decisions facing computer security researchers and professionals are confronted head on as value-laden design decisions, and that computer security researchers and professionals consider the perspectives of various stakeholders in making these decisions.
Cost-Effectiveness and Cost-Benefit Analysis: Confronting the Problem of Choice.
ERIC Educational Resources Information Center
Clardy, Alan
Cost-effectiveness analysis and cost-benefit analysis are two related yet distinct methods to help decision makers choose the best course of action from among competing alternatives. For both types of analysis, costs are computed similarly. Costs may be reduced to present value amounts for multi-year programs, and parameters may be altered to show…
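The present-value reduction mentioned above is a one-line formula; here is a hedged sketch with invented cash flows and an assumed 5% discount rate:

```python
# Present-value reduction of multi-year program costs, as described above;
# the cash flows and the 5% discount rate are illustrative assumptions.
def present_value(costs_by_year, rate=0.05):
    return sum(c / (1 + rate) ** t for t, c in enumerate(costs_by_year))

program_a = [120_000, 40_000, 40_000, 40_000]   # heavy up-front investment
program_b = [60_000, 65_000, 65_000, 65_000]    # spread-out spending
print(f"PV(A) = {present_value(program_a):,.0f}")
print(f"PV(B) = {present_value(program_b):,.0f}")
```

Varying the rate parameter is exactly the kind of parameter alteration the abstract refers to: it can change which alternative looks cheaper.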
DOT National Transportation Integrated Search
1978-01-01
A system analysis was completed of the general deterrence of driving while intoxicated (DWI). Elements which influence DWI decisions were identified and interrelated in a system model; then, potential countermeasures which might be employed in DWI ge...
Hospital site selection using fuzzy AHP and its derivatives.
Vahidnia, Mohammad H; Alesheikh, Ali A; Alimohammadi, Abbas
2009-07-01
Environmental managers are commonly faced with sophisticated decisions, such as choosing the location of a new facility subject to multiple conflicting criteria. This paper considers the specific problem of creating a well-distributed network of hospitals that delivers its services to the target population with minimal time, pollution and cost. We develop a Multi-Criteria Decision Analysis process that combines Geographical Information System (GIS) analysis with the Fuzzy Analytical Hierarchy Process (FAHP), and use this process to determine the optimum site for a new hospital in the Tehran urban area. The GIS was used to calculate and classify governing criteria, while FAHP was used to evaluate the decision factors and their impacts on alternative sites. Three methods were used to estimate the total weights and priorities of the candidate sites: fuzzy extent analysis, center-of-area defuzzification, and the alpha-cut method. The three methods yield identical priorities for the five alternatives considered. Fuzzy extent analysis provides less discriminating power, but is simpler to implement and compute than the other two methods. The alpha-cut method is more complicated, but integrates the uncertainty and overall attitude of the decision-maker. The usefulness of the new hospital site is evaluated by computing an accessibility index for each pixel in the GIS, defined as the ratio of population density to travel time. With the addition of a new hospital at the optimum site, this index improved over about 6.5 percent of the geographical area.
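The paper's accessibility index (population density divided by travel time) is straightforward to compute on a raster; in the sketch below the 5x5 test grids and the assumed travel-time effect of the new site are invented:

```python
import numpy as np

# Accessibility index from the paper: population density / travel time,
# evaluated per GIS pixel; the 5x5 rasters here are invented test data.
rng = np.random.default_rng(3)
pop_density = rng.uniform(50, 500, size=(5, 5))   # people per hectare
travel_time = rng.uniform(5, 60, size=(5, 5))     # minutes to nearest hospital

access_before = pop_density / travel_time
# Crude stand-in for the new site: assume it caps travel times at 25 min.
travel_time_after = np.minimum(travel_time, 25)
access_after = pop_density / travel_time_after

improved = (access_after > access_before).mean()
print(f"share of pixels with improved accessibility: {improved:.1%}")
```

In the study, the FAHP step selects the site and the index then quantifies its benefit; here only the index computation is illustrated.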
Attributes Affecting Computer-Aided Decision Making--A Literature Survey.
ERIC Educational Resources Information Center
Moldafsky, Neil I; Kwon, Ik-Whan
1994-01-01
Reviews current literature about personal, demographic, situational, and cognitive attributes that affect computer-aided decision making. The effectiveness of computer-aided decision making is explored in relation to decision quality, effectiveness, and confidence. Studies of the effects of age, anxiety, cognitive type, attitude, gender, and prior…
ERIC Educational Resources Information Center
Von Der Linn, Robert Christopher
A needs assessment of the Grumman E-Beam Systems Group identified the requirement for additional skill mastery for the engineers who assemble, integrate, and maintain devices used to manufacture integrated circuits. Further analysis of the tasks involved led to the decision to develop interactive videodisc, computer-based job aids to enable…
How Do the Different Types of Computer Use Affect Math Achievement?
ERIC Educational Resources Information Center
Flores, Raymond; Inan, Fethi; Lin, Zhangxi
2013-01-01
In this study, the National Educational Longitudinal Study (ELS:2002) dataset was used and a predictive data mining technique, decision tree analysis, was implemented in order to examine which factors, in conjunction to computer use, can be used to predict high or low probability of success in high school mathematics. Specifically, this study…
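As a sketch of the analysis type (decision tree classification), the following uses scikit-learn on synthetic stand-in data; the variables and effect sizes are invented and do not reflect ELS:2002 findings:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# Synthetic stand-in for the survey data: school-related computer use,
# recreational/gaming use, and SES quintile predicting high/low math
# achievement; all relationships here are invented.
n = 1_000
X = np.column_stack([rng.uniform(0, 10, n),   # school-related computer use
                     rng.uniform(0, 10, n),   # recreational/gaming use
                     rng.integers(1, 6, n)])  # SES quintile
y = ((0.4 * X[:, 0] - 0.2 * X[:, 1] + 0.5 * X[:, 2]
      + rng.normal(0, 1, n)) > 2.0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_tr, y_tr)
print(f"holdout accuracy = {tree.score(X_te, y_te):.2f}")
```

The appeal of the technique for a study like this is interpretability: the fitted tree's splits read directly as "which factors, at which cutoffs, separate high from low probability of success".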
NASA Astrophysics Data System (ADS)
Chu, J.; Zhang, C.; Fu, G.; Li, Y.; Zhou, H.
2015-08-01
This study investigates the effectiveness of a sensitivity-informed method for multi-objective operation of reservoir systems, which uses global sensitivity analysis as a screening tool to reduce computational demands. Sobol's method is used to screen insensitive decision variables and guide the formulation of the optimization problems with a significantly reduced number of decision variables. This sensitivity-informed method dramatically reduces the computational demands required for attaining high-quality approximations of optimal trade-off relationships between conflicting design objectives. The search results obtained from the reduced complexity multi-objective reservoir operation problems are then used to pre-condition the full search of the original optimization problem. In two case studies, the Dahuofang reservoir and the inter-basin multi-reservoir system in Liaoning province, China, sensitivity analysis results show that reservoir performance is strongly controlled by a small proportion of decision variables. Sensitivity-informed dimension reduction and pre-conditioning are evaluated in their ability to improve the efficiency and effectiveness of multi-objective evolutionary optimization. Overall, this study illustrates the efficiency and effectiveness of the sensitivity-informed method and the use of global sensitivity analysis to inform dimension reduction of optimization problems when solving complex multi-objective reservoir operation problems.
NASA Astrophysics Data System (ADS)
Chu, J. G.; Zhang, C.; Fu, G. T.; Li, Y.; Zhou, H. C.
2015-04-01
This study investigates the effectiveness of a sensitivity-informed method for multi-objective operation of reservoir systems, which uses global sensitivity analysis as a screening tool to reduce the computational demands. Sobol's method is used to screen insensitive decision variables and guide the formulation of the optimization problems with a significantly reduced number of decision variables. This sensitivity-informed problem decomposition dramatically reduces the computational demands required for attaining high quality approximations of optimal tradeoff relationships between conflicting design objectives. The search results obtained from the reduced complexity multi-objective reservoir operation problems are then used to pre-condition the full search of the original optimization problem. In two case studies, the Dahuofang reservoir and the inter-basin multi-reservoir system in Liaoning province, China, sensitivity analysis results show that reservoir performance is strongly controlled by a small proportion of decision variables. Sensitivity-informed problem decomposition and pre-conditioning are evaluated in their ability to improve the efficiency and effectiveness of multi-objective evolutionary optimization. Overall, this study illustrates the efficiency and effectiveness of the sensitivity-informed method and the use of global sensitivity analysis to inform problem decomposition when solving the complex multi-objective reservoir operation problems.
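A minimal Sobol screening sketch, assuming the SALib package; the three "decision variables" and the performance function below are invented stand-ins for the reservoir model:

```python
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

# Toy stand-in for a reservoir operating rule with three decision variables;
# the performance function is invented, not the Dahuofang model.
problem = {
    "num_vars": 3,
    "names": ["release_frac", "hedging_trigger", "flood_buffer"],
    "bounds": [[0.0, 1.0], [0.0, 1.0], [0.0, 1.0]],
}

X = saltelli.sample(problem, 1024)
Y = 3.0 * X[:, 0] + 0.3 * X[:, 1] + 0.01 * X[:, 2] + X[:, 0] * X[:, 1]

Si = sobol.analyze(problem, Y)
for name, st in zip(problem["names"], Si["ST"]):
    flag = "keep" if st > 0.05 else "screen out"
    print(f"{name:16s} total-order index = {st:.3f} -> {flag}")
```

Screening out variables with negligible total-order indices, then optimizing only over the sensitive ones, is the dimension-reduction step both versions of the study describe; the 0.05 cutoff here is an assumption.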
NASA Technical Reports Server (NTRS)
Fertis, D. G.; Simon, A. L.
1981-01-01
The requisite methodology is developed to solve linear and nonlinear problems associated with the static and dynamic analysis of rotating machinery, their static and dynamic behavior, and the interaction between the rotating and nonrotating parts of an engine. Linear and nonlinear structural engine problems are investigated by developing solution strategies and interactive computational methods whereby the man and computer can communicate directly in making analysis decisions. Representative examples include modifying structural models, changing material parameters, selecting analysis options, and coupling with interactive graphical display for pre- and postprocessing capability.
Elements of an integrated health monitoring framework
NASA Astrophysics Data System (ADS)
Fraser, Michael; Elgamal, Ahmed; Conte, Joel P.; Masri, Sami; Fountain, Tony; Gupta, Amarnath; Trivedi, Mohan; El Zarki, Magda
2003-07-01
Internet technologies are increasingly facilitating real-time monitoring of bridges and highways. The advances in wireless communications, for instance, are allowing practical deployments for large extended systems. Sensor data, including video signals, can be used for long-term condition assessment, traffic-load regulation, emergency response, and seismic safety applications. Computer-based automated signal-analysis algorithms routinely process the incoming data and determine anomalies based on pre-defined response thresholds and more involved signal analysis techniques. Upon authentication, appropriate action may be authorized for maintenance, early warning, and/or emergency response. In such a strategy, data from thousands of sensors can be analyzed with near real-time and long-term assessment and decision-making implications. Addressing the above, a flexible and scalable (e.g., for an entire highway system, or a portfolio of networked civil infrastructure) software architecture/framework is being developed and implemented. This framework will network and integrate real-time heterogeneous sensor data, database and archiving systems, computer vision, data analysis and interpretation, physics-based numerical simulation of complex structural systems, visualization, reliability and risk analysis, and rational statistical decision-making procedures. Thus, within this framework, data is converted into information, information into knowledge, and knowledge into decision at the end of the pipeline. Such a decision-support system contributes to the vitality of our economy, as rehabilitation, renewal, replacement, and/or maintenance of this infrastructure are estimated to require expenditures in the trillion-dollar range nationwide, including issues of homeland security and natural disaster mitigation. A pilot website (http://bridge.ucsd.edu/compositedeck.html) currently depicts some basic elements of the envisioned integrated health monitoring analysis framework.
Multicriteria meta-heuristics for AGV dispatching control based on computational intelligence.
Naso, David; Turchiano, Biagio
2005-04-01
In many manufacturing environments, automated guided vehicles are used to move the processed materials between various pickup and delivery points. The assignment of vehicles to unit loads is a complex problem that is often solved in real time with simple dispatching rules. This paper proposes an automated guided vehicle dispatching approach based on computational intelligence. We adopt a fuzzy multicriteria decision strategy to simultaneously take into account multiple aspects in every dispatching decision. Since the typical short-term view of dispatching rules is one of the main limitations of such real-time assignment heuristics, we also incorporate in the multicriteria algorithm a specific heuristic rule that takes into account empty-vehicle travel over a longer time horizon. Moreover, we adopt a genetic algorithm to tune the weights associated with each decision criterion in the global decision algorithm. The proposed approach is validated by means of a comparison with other dispatching rules, and with other recently proposed multicriteria dispatching strategies also based on computational intelligence. The analysis of the results obtained by the proposed dispatching approach in both nominal and perturbed operating conditions (congestions, faults) confirms its effectiveness.
Bai, Xiao-ping; Zhang, Xi-wei
2013-01-01
Selecting construction schemes for a building engineering project is a complex multiobjective optimization decision process in which many indexes must be weighed to find the optimum scheme. Addressing this problem, this paper selects cost, progress, quality, and safety as the four first-order evaluation indexes; uses a quantitative method for the cost index; uses integrated qualitative and quantitative methodologies for the progress, quality, and safety indexes; and integrates engineering economics, reliability theory, and information entropy theory to present a new evaluation method for building construction projects. Combined with a practical case, the paper also presents the detailed computing process and steps, including selecting all order indexes, establishing the index matrix, computing the score values of all order indexes, computing the synthesis score, sorting the selected schemes, and making the analysis and decision. The presented method offers a valuable reference for risk computation in building construction projects.
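The information-entropy portion of such a method can be sketched as follows; the score matrix for three candidate schemes is invented, and only the entropy-weighting and synthesis-score steps of the paper's fuller procedure are shown:

```python
import numpy as np

# Entropy-weight sketch for the four first-order indexes; rows are candidate
# construction schemes, columns are cost, progress, quality, safety scores
# (all values invented, already normalized so that higher is better).
X = np.array([[0.82, 0.70, 0.90, 0.75],
              [0.76, 0.85, 0.80, 0.80],
              [0.90, 0.65, 0.70, 0.85]])
m = X.shape[0]

P = X / X.sum(axis=0)                           # each scheme's share per index
E = -(P * np.log(P)).sum(axis=0) / np.log(m)    # information entropy per index
w = (1 - E) / (1 - E).sum()                     # more spread -> more weight

synthesis = X @ w                               # synthesis score per scheme
print("entropy weights:", np.round(w, 3))
print("best scheme:", int(np.argmax(synthesis)) + 1)
```

The intuition is that an index on which all schemes score alike carries little decision information, so entropy weighting automatically de-emphasizes it.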
A study on spatial decision support systems for HIV/AIDS prevention based on COM GIS technology
NASA Astrophysics Data System (ADS)
Yang, Kun; Luo, Huasong; Peng, Shungyun; Xu, Quanli
2007-06-01
Based on an in-depth analysis of the current status and existing problems of GIS technology applications in epidemiology, this paper proposes a method and process for establishing a spatial decision support system for AIDS epidemic prevention by integrating COM GIS, spatial database, GPS, remote sensing, and communication technologies, as well as ASP and ActiveX software development technologies. One of the most important issues in constructing such a system is how to integrate AIDS spreading models with GIS. The capabilities of GIS applications in AIDS epidemic prevention are described first, and some mature epidemic spreading models are then discussed in order to extract the computation parameters. Furthermore, a technical schema is proposed for integrating the AIDS spreading models with GIS and relevant geospatial technologies, in which the GIS and model-running platforms share a common spatial database and the computing results can be spatially visualized on desktop or Web GIS clients. Finally, a complete solution for establishing the decision support system for AIDS epidemic prevention is offered based on the model-integration methods and ESRI COM GIS software packages. The overall decision support system is composed of sub-systems for data acquisition, network communication, model integration, the AIDS epidemic information spatial database, AIDS epidemic information querying and statistical analysis, AIDS epidemic dynamic surveillance, AIDS epidemic information spatial analysis and decision support, as well as AIDS epidemic information publishing based on Web GIS.
Risk analysis of computer system designs
NASA Technical Reports Server (NTRS)
Vallone, A.
1981-01-01
Adverse events during implementation can affect the final capabilities, schedule, and cost of a computer system even when the system was accurately designed and evaluated. Risk analysis enables the manager to forecast the impact of those events and to request design revisions or contingency plans in time, before making any decision. This paper presents a structured procedure for an effective risk analysis. The procedure identifies the required activities, separates subjective assessments from objective evaluations, and defines a risk measure to determine the analysis results. The procedure is consistent with system design evaluation and enables a meaningful comparison among alternative designs.
Development of Analysis Tools for Certification of Flight Control Laws
2009-03-31
Wright, Adam; Sittig, Dean F; Ash, Joan S; Erickson, Jessica L; Hickman, Trang T; Paterno, Marilyn; Gebhardt, Eric; McMullen, Carmit; Tsurikova, Ruslana; Dixon, Brian E; Fraser, Greg; Simonaitis, Linas; Sonnenberg, Frank A; Middleton, Blackford
2015-11-01
To identify challenges, lessons learned and best practices for service-oriented clinical decision support, based on the results of the Clinical Decision Support Consortium, a multi-site study which developed, implemented and evaluated clinical decision support services in a diverse range of electronic health records. Ethnographic investigation using the rapid assessment process, a procedure for agile qualitative data collection and analysis, including clinical observation, system demonstrations and analysis and 91 interviews. We identified challenges and lessons learned in eight dimensions: (1) hardware and software computing infrastructure, (2) clinical content, (3) human-computer interface, (4) people, (5) workflow and communication, (6) internal organizational policies, procedures, environment and culture, (7) external rules, regulations, and pressures and (8) system measurement and monitoring. Key challenges included performance issues (particularly related to data retrieval), differences in terminologies used across sites, workflow variability and the need for a legal framework. Based on the challenges and lessons learned, we identified eight best practices for developers and implementers of service-oriented clinical decision support: (1) optimize performance, or make asynchronous calls, (2) be liberal in what you accept (particularly for terminology), (3) foster clinical transparency, (4) develop a legal framework, (5) support a flexible front-end, (6) dedicate human resources, (7) support peer-to-peer communication, (8) improve standards. The Clinical Decision Support Consortium successfully developed a clinical decision support service and implemented it in four different electronic health records and four diverse clinical sites; however, the process was arduous. The lessons identified by the Consortium may be useful for other developers and implementers of clinical decision support services. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Hayes, Kathryn J; Eljiz, Kathy; Dadich, Ann; Fitzgerald, Janna-Anneke; Sloan, Terry
2015-01-01
The purpose of this paper is to provide a retrospective analysis of computer simulation's role in accelerating individual innovation adoption decisions. The process innovation examined is Lean Systems Thinking, and the organizational context is the imaging department of an Australian public hospital. Intrinsic case study methods including observation, interviews with radiology and emergency personnel about scheduling procedures, mapping patient appointment processes and document analysis were used over three years and then complemented with retrospective interviews with key hospital staff. The multiple data sources and methods were combined in a pragmatic and reflexive manner to explore an extreme case that provides potential to act as an instructive template for effective change. Computer simulation of process change ideas offered by staff to improve patient-flow accelerated the adoption of the process changes, largely because animated computer simulation permitted experimentation (trialability), provided observable predictions of change results (observability) and minimized perceived risk. The difficulty of making accurate comparisons between time periods in a health care setting is acknowledged. This work has implications for policy, practice and theory, particularly for inducing the rapid diffusion of process innovations to address challenges facing health service organizations and national health systems. The research demonstrates the value of animated computer simulation in presenting the need for change, identifying options, and predicting change outcomes, and is the first work to indicate the importance of trialability, observability and risk reduction in individual adoption decisions in health services.
Modelling technological process of ion-exchange filtration of fluids in porous media
NASA Astrophysics Data System (ADS)
Ravshanov, N.; Saidov, U. M.
2018-05-01
The paper considers a practical problem related to the filtration of liquid and ionic solutions and their purification from gel particles and heavy ionic compounds. This technological process is carried out during the preparation and cleaning of chemical solutions, drinking water, pharmaceuticals, liquid fuels, products for public use, etc. A mathematical model is developed for the analysis and study of the process, for determining the main parameters of the technological process and the operating modes of filter units, and for supporting managerial decision-making. Using the developed model, a series of computational experiments was carried out, and the results of the numerical calculations are illustrated in the form of graphs. Based on the analysis of the numerical experiments, conclusions are formulated that serve as the basis for making appropriate managerial decisions.
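A drastically simplified stand-in for such a filtration model, a 1-D advection-diffusion equation with first-order capture solved by explicit finite differences, is sketched below; all coefficients are invented, and the paper's actual model is richer:

```python
import numpy as np

# Explicit finite-difference sketch of 1-D transport of an ionic impurity
# through a filter bed: upwind advection + diffusion + first-order capture.
L, nx, nt = 1.0, 101, 8000
dx, dt = L / (nx - 1), 2.5e-4
v, D, k = 0.8, 1e-3, 1.5          # flow speed, diffusion coeff., capture rate

c = np.zeros(nx)                   # impurity concentration along the bed
for _ in range(nt):
    c_new = c.copy()
    c_new[1:-1] = (c[1:-1]
                   - v * dt / dx * (c[1:-1] - c[:-2])                  # advection (upwind)
                   + D * dt / dx**2 * (c[2:] - 2 * c[1:-1] + c[:-2])   # diffusion
                   - k * dt * c[1:-1])                                 # capture by resin
    c_new[0] = 1.0                 # inlet held at feed concentration
    c_new[-1] = c_new[-2]          # zero-gradient outflow boundary
    c = c_new

print(f"outlet concentration after {nt * dt:.2f} s: {c[-1]:.4f}")
```

Sweeping parameters such as the capture rate k or flow speed v against outlet purity is the kind of computational experiment the paper uses to inform operating-mode decisions.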
Reiter, Andrea M F; Heinze, Hans-Jochen; Schlagenhauf, Florian; Deserno, Lorenz
2017-02-01
Despite its clinical relevance and its recent recognition as a diagnostic category in the DSM-5, binge eating disorder (BED) has rarely been investigated from a cognitive neuroscientific perspective targeting a more precise neurocognitive profiling of the disorder. BED patients suffer from a lack of behavioral control during recurrent binge eating episodes and thus fail to adapt their behavior in the face of negative consequences, e.g., high risk for obesity. To examine impairments in flexible reward-based decision-making, we exposed BED patients (n=22) and matched healthy individuals (n=22) to a reward-guided decision-making task during functional magnetic resonance imaging (fMRI). Performing fMRI analysis informed via computational modeling of choice behavior, we were able to identify specific signatures of altered decision-making in BED. On the behavioral level, we observed impaired behavioral adaptation in BED, which was due to enhanced switching behavior, a putative deficit in striking a balance between exploration and exploitation appropriately. This was accompanied by diminished activation related to exploratory decisions in the anterior insula/ventro-lateral prefrontal cortex. Moreover, although so-called model-free reward prediction errors remained intact, the representation of ventro-medial prefrontal learning signatures, incorporating inference on unchosen options, was reduced in BED, which was associated with successful decision-making in the task. On the basis of a computational psychiatry account, the presented findings contribute to defining a neurocognitive phenotype of BED.
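The model-free learning signature referred to above can be illustrated with a toy softmax Q-learning agent on a two-armed bandit; the parameters and payoffs are invented and this is not the authors' fitted model. A lower inverse temperature beta would produce the kind of enhanced switching described for BED patients:

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy two-armed bandit with a softmax policy and reward-prediction-error
# updates; reward probabilities and parameters are invented.
p_reward = [0.8, 0.2]          # true (unknown) reward probabilities
alpha, beta = 0.3, 3.0         # learning rate, inverse temperature
Q, switches, prev = np.zeros(2), 0, None

for t in range(300):
    probs = np.exp(beta * Q) / np.exp(beta * Q).sum()   # softmax choice rule
    a = rng.choice(2, p=probs)
    r = rng.random() < p_reward[a]
    Q[a] += alpha * (r - Q[a])                          # reward prediction error
    if prev is not None and a != prev:
        switches += 1                                   # switching metric
    prev = a

print(f"Q-values: {np.round(Q, 2)}, switch rate = {switches / 299:.2f}")
```

In model-based fMRI analyses of this kind, the trial-by-trial prediction errors and choice probabilities from a fitted model are regressed against the imaging data to localize the corresponding neural signatures.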
Carney, Timothy Jay; Morgan, Geoffrey P.; Jones, Josette; McDaniel, Anna M.; Weaver, Michael; Weiner, Bryan; Haggstrom, David A.
2014-01-01
Our conceptual model demonstrates our goal to investigate the impact of clinical decision support (CDS) utilization on cancer screening improvement strategies in the community health care (CHC) setting. We employed a dual modeling technique using both statistical and computational modeling to evaluate impact. Our statistical model used the Spearman's Rho test to evaluate the strength of the relationship between our proximal outcome measures (CDS utilization) and our distal outcome measure (provider self-reported cancer screening improvement). Our computational model relied on network evolution theory and made use of a tool called Construct-TM to model the use of CDS as measured by the rate of organizational learning. We employed previously collected survey data from the community health centers of the Cancer Health Disparities Collaborative (HDCC). Our intent is to demonstrate the added value gained by using a computational modeling tool in conjunction with a statistical analysis when evaluating the impact of a health information technology, in the form of CDS, on health care quality process outcomes such as facility-level screening improvement. Significant simulated disparities in organizational learning over time were observed between community health centers beginning the simulation with high and low clinical decision support capability.
An Amphibious Ship-To-Shore Simulation for Use on an IBM PC (Personal Computer)
1984-09-01
... This research, for instance, will be geared toward a technically oriented person who is familiar with computers, programming, and the associated logic. ... a problem, often vaguely stated by the decision maker, into precise and operational terms [Ref. Hz p.51]. The analysis begins with specification of the ...
Fusing Symbolic and Numerical Diagnostic Computations
NASA Technical Reports Server (NTRS)
James, Mark
2007-01-01
X-2000 Anomaly Detection Language denotes a developmental computing language, and the software that establishes and utilizes the language, for fusing two diagnostic computer programs, one implementing a numerical analysis method and the other a symbolic analysis method, into a unified event-based decision analysis software system for real-time detection of events (e.g., failures) in a spacecraft, aircraft, or other complex engineering system. The numerical analysis method is performed by beacon-based exception analysis for multi-missions (BEAM), which has been discussed in several previous NASA Tech Briefs articles. The symbolic analysis method is, more specifically, an artificial-intelligence method of the knowledge-based, inference-engine type, and its implementation is exemplified by the Spacecraft Health Inference Engine (SHINE) software. The goal in developing the capability to fuse numerical and symbolic diagnostic components is to increase the depth of analysis beyond that previously attainable, thereby increasing the degree of confidence in the computed results. In practical terms, the sought improvement is to enable detection of all or most events, with no or few false alarms.
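A minimal sketch of the fusion idea, pairing a numeric outlier detector with symbolic rules in the spirit of BEAM and SHINE; the sensor names, thresholds, and rules below are invented:

```python
# Numeric step: flag a sensor whose latest value deviates strongly from its
# recent history (a stand-in for BEAM-style exception analysis).
def numeric_anomaly(history, value, k=3.0):
    """Flag value if it deviates more than k sigma from the history."""
    mean = sum(history) / len(history)
    std = (sum((x - mean) ** 2 for x in history) / len(history)) ** 0.5
    return std > 0 and abs(value - mean) > k * std

RULES = [
    # (required numeric flags, inferred event) -- invented example rules.
    ({"bus_voltage", "battery_temp"}, "power-subsystem fault"),
    ({"bus_voltage"}, "transient load spike"),
]

# Symbolic step: the first rule whose conditions are all flagged fires
# (a stand-in for SHINE-style knowledge-based inference).
def fuse(flags):
    for conditions, event in RULES:
        if conditions <= flags:
            return event
    return "nominal"

history = {"bus_voltage": [28.1, 28.0, 28.2, 27.9],
           "battery_temp": [21.0, 21.2, 20.9, 21.1]}
latest = {"bus_voltage": 24.5, "battery_temp": 26.0}
flags = {s for s, v in latest.items() if numeric_anomaly(history[s], v)}
print(fuse(flags))   # both sensors flag -> "power-subsystem fault"
```

Feeding numeric exceptions into a rule base, rather than alarming on each in isolation, is what yields the deeper, higher-confidence event diagnosis the abstract describes.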
NASA Astrophysics Data System (ADS)
Pierce, S. A.
2017-12-01
Decision making for groundwater systems is becoming increasingly important as shifting water demands increasingly impact aquifers. As buffer systems, aquifers provide room for resilient responses and extend the actual timeframe for hydrological response. Yet the pace of impacts, climate shifts, and degradation of water resources is accelerating. To meet these new drivers, groundwater science is transitioning toward the emerging field of Integrated Water Resources Management, or IWRM. IWRM incorporates a broad array of dimensions, methods, and tools to address problems that tend to be complex. Computational tools and accessible cyberinfrastructure (CI) are needed to cross the chasm between science and society. Fortunately, cloud computing environments, such as the new Jetstream system, are evolving rapidly. While still targeting scientific user groups, systems such as Jetstream offer configurable cyberinfrastructure to enable interactive computing and data analysis resources on demand. The web-based interfaces allow researchers to rapidly customize virtual machines, modify computing architecture, and increase the usability of and access to advanced compute environments for broader audiences. The result enables flexible configurations, opening up opportunities for IWRM modelers to expand the reach of analyses, the number of case studies, and the quality of engagement with stakeholders and decision makers. The acute need to identify improved IWRM solutions, paired with advanced computational resources, refocuses the attention of IWRM researchers on applications, workflows, and intelligent systems that are capable of accelerating progress. IWRM must address key drivers of community concern, implement transdisciplinary methodologies, and adapt and apply decision support tools in order to effectively support decisions about groundwater resource management. This presentation will provide an overview of advanced computing services in the cloud, using integrated groundwater management case studies to highlight how cloud CI streamlines the process of setting up an interactive decision support system. Moreover, advances in artificial intelligence offer new techniques for old problems, from integrating data to adaptive sensing and from interactive dashboards to optimizing multi-attribute problems. The combination of scientific expertise, flexible cloud computing solutions, and intelligent systems opens new research horizons.
NASA Astrophysics Data System (ADS)
Gresch, Helge; Hasselhorn, Marcus; Bögeholz, Susanne
2013-10-01
Dealing with socio-scientific issues in science classes enables students to participate productively in controversial discussions concerning ethical topics, such as sustainable development. In this respect, well-structured decision-making processes are essential for elaborate reasoning. To foster decision-making competence, a computer-based programme was developed that trains secondary school students (grades 11-13) in decision-making strategies. The main research question is: does training students to use these strategies foster decision-making competence? In addition, the influence of meta-decision aids was examined. Students conducted a task analysis to select an appropriate strategy prior to the decision-making process. Hence, the second research question is: does combining decision-making training with a task analysis enhance decision-making competence at a higher rate? To answer these questions, 386 students were tested in a pre-post-follow-up control-group design that included two training groups (decision-making strategies/decision-making strategies combined with a task analysis) and a control group (decision-making with additional ecological information instead of strategic training). An open-ended questionnaire was used to assess decision-making competence in situations related to sustainable development. The decision-making training led to a significant improvement in the post-test and the follow-up, which was administered three months after the training. Long-term effects on the quality of the students' decisions were evident for both training groups. Gains in competence when reflecting upon the decision-making processes of others were found, to a lesser extent, in the training group that received the additional meta-decision training. In conclusion, training in decision-making strategies is a promising approach to deal with socio-scientific issues related to sustainable development.
An Efficiency Analysis of U.S. Business Schools
ERIC Educational Resources Information Center
Sexton, Thomas R.
2010-01-01
In the current economic climate, business schools face crucial decisions. As resources become scarcer, schools must either streamline operations or limit them. An efficiency analysis of U.S. business schools is presented that computes, for each business school, an overall efficiency score and provides separate factor efficiency scores, indicating…
Use of cloud computing technology in natural hazard assessment and emergency management
NASA Astrophysics Data System (ADS)
Webley, P. W.; Dehn, J.
2015-12-01
During a natural hazard event, the most up-to-date data needs to be in the hands of those on the front line. Decision support system tools can be developed to provide access to pre-made outputs to quickly assess the hazard and potential risk. However, with the ever-growing availability of new satellite data, as well as ground and airborne data generated in real-time, there is a need to analyze the large volumes of data in an easy-to-access and effective environment. With the growth in the use of cloud computing, where the analysis and visualization system can grow with the needs of the user, these facilities can be used to provide this real-time analysis. Think of a central command center uploading the data to the cloud compute system and then researchers in the field connecting to a web-based tool to view the newly acquired data. New data can be added by any user and then viewed instantly by anyone else in the organization through the cloud computing interface. This provides the ideal tool for collaborative data analysis, hazard assessment, and decision making. We present the rationale for developing a cloud computing system and illustrate how this tool can be developed for use in real-time environments. Users would have access to an interactive online image analysis tool without the need for specific remote sensing software on their local system, thereby increasing their understanding of the ongoing hazard and helping to mitigate its impact on the surrounding region.
An Analysis for Capital Expenditure Decisions at a Naval Regional Medical Center.
1981-12-01
Priority rankings (Service / Equipment Review Committee):
1. Portable defibrillator and cardioscope / Computed tomographic scanner
2. ECG cart / Automated blood cell counter
3. Gas system sterilizer / Gas system sterilizer
4. Automated blood cell counter / Portable defibrillator and cardioscope
5. Computed tomographic scanner / ECG cart
...dictating and automated typing) systems. e. Filing equipment. f. Automatic data processing equipment, including data communications equipment. g...
ERIC Educational Resources Information Center
Shubik, Martin
The main problem in computer gaming research is the initial decision of choosing the type of gaming method to be used. Free-form games lead to exciting open-ended confrontations that generate much information. However, they do not easily lend themselves to analysis because they generate far too much information and their results are seldom…
Using medical knowledge sources on handheld computers--a qualitative study among junior doctors.
Axelson, Christian; Wårdh, Inger; Strender, Lars-Erik; Nilsson, Gunnar
2007-09-01
The emergence of mobile computing could have an impact on how junior doctors learn. To exploit this opportunity, it is essential to understand their information-seeking process. To explore junior doctors' experiences of using medical knowledge sources on handheld computers, interviews were conducted with five Swedish junior doctors: a qualitative manifest content analysis of a focus group interview, followed by a qualitative latent content analysis of two individual interviews. The focus group interview showed that users were satisfied with access to handheld medical knowledge sources, but there was concern about contents, reliability, and device dependency. Four categories emerged from the individual interviews: (1) a feeling of uncertainty about using handheld technology in medical care; (2) a sense of security that handhelds can provide; (3) a need for contents to be personalized; and (4) a degree of adaptability to make the handheld a versatile information tool. A theme was established to link the four categories together, as expressed in the Conclusion section. Junior doctors' experiences of using medical knowledge sources on handheld computers shed light on the need to decrease uncertainty about clinical decisions during medical internship, and to find ways to influence the level of self-confidence in the junior doctor's process of decision-making.
2001-10-25
The Use of Major Risk Factors for Computer-Based Distinction of Diabetic Patients with Ischemic Stroke and Without Stroke. Sibel Oge Merey ...highlighting the major risk factors of diabetic patients with non-embolic stroke and without stroke by performing dependency analysis and decision making...
An Approach to Experimental Design for the Computer Analysis of Complex Phenomenon
NASA Technical Reports Server (NTRS)
Rutherford, Brian
2000-01-01
The ability to make credible system assessments, predictions and design decisions related to engineered systems and other complex phenomena is key to a successful program for many large-scale investigations in government and industry. Recently, many of these large-scale analyses have turned to computational simulation to provide much of the required information. Addressing specific goals in the computer analysis of these complex phenomena is often accomplished through the use of performance measures that are based on system response models. The response models are constructed using computer-generated responses together with physical test results where possible. They are often based on probabilistically defined inputs and generally require estimation of a set of response modeling parameters. As a consequence, the performance measures are themselves distributed quantities reflecting these variabilities and uncertainties. Uncertainty in the values of the performance measures leads to uncertainties in predicted performance and can cloud the decisions required of the analysis. A specific goal of this research has been to develop methodology that will reduce this uncertainty in an analysis environment where limited resources and system complexity together restrict the number of simulations that can be performed. An approach has been developed that is based on evaluation of the potential information provided by each "intelligently selected" candidate set of computer runs. Each candidate is evaluated by partitioning the performance measure uncertainty into two components - one component that could be explained through the additional computational simulation runs and a second that would remain uncertain. The portion explained is estimated using a probabilistic evaluation of likely results for the additional computational analyses based on what is currently known about the system. The set of runs indicating the largest potential reduction in uncertainty is then selected and the computational simulations are performed. Examples are provided to demonstrate this approach on small-scale problems. These examples give encouraging results. Directions for further research are indicated.
Cloud service selection using multicriteria decision analysis.
Whaiduzzaman, Md; Gani, Abdullah; Anuar, Nor Badrul; Shiraz, Muhammad; Haque, Mohammad Nazmul; Haque, Israat Tanzeena
2014-01-01
Cloud computing (CC) has recently been receiving tremendous attention from the IT industry and academic researchers. CC leverages its unique services to cloud customers in a pay-as-you-go, anytime, anywhere manner. Cloud services provide dynamically scalable services through the Internet on demand. Therefore, service provisioning plays a key role in CC. The cloud customer must be able to select appropriate services according to his or her needs. Several approaches have been proposed to solve the service selection problem, including multicriteria decision analysis (MCDA). MCDA enables the user to choose from among a number of available choices. In this paper, we analyze the application of MCDA to service selection in CC. We identify and synthesize several MCDA techniques and provide a comprehensive analysis of this technology for general readers. In addition, we present a taxonomy derived from a survey of the current literature. Finally, we highlight several state-of-the-art practical aspects of MCDA implementation in cloud computing service selection. The contributions of this study are four-fold: (a) focusing on the state-of-the-art MCDA techniques, (b) highlighting the comparative analysis and suitability of several MCDA methods, (c) presenting a taxonomy through extensive literature review, and (d) analyzing and summarizing the cloud computing service selections in different scenarios.
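As a concrete illustration of the simplest family of MCDA techniques surveyed here (simple additive weighting), the following sketch ranks candidate cloud services; the services, criteria values, and weights are all invented for the example, and real MCDA deployments would typically use richer methods such as AHP or TOPSIS.

```python
import numpy as np

# Rows: candidate services; columns: criteria
# [cost ($/h), latency (ms), availability (%), throughput (MB/s)]
scores = np.array([
    [0.50, 120, 99.90, 80],   # Service A
    [0.35, 200, 99.50, 60],   # Service B
    [0.80,  90, 99.99, 95],   # Service C
])

benefit = np.array([False, False, True, True])  # cost and latency: lower is better
weights = np.array([0.3, 0.2, 0.3, 0.2])        # importance weights, sum to 1

# Min-max normalise each criterion to [0, 1], flipping the cost criteria
lo, hi = scores.min(axis=0), scores.max(axis=0)
norm = (scores - lo) / (hi - lo)
norm[:, ~benefit] = 1.0 - norm[:, ~benefit]

overall = norm @ weights                         # simple additive weighting
for idx in np.argsort(-overall):
    print(f"Service {'ABC'[idx]}: score {overall[idx]:.3f}")
```

The normalisation step is what makes heterogeneous units (dollars, milliseconds, percentages) commensurable before weighting, which is the central idea shared across the weighted-sum MCDA methods.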
Resource Analysis of Cognitive Process Flow Used to Achieve Autonomy
2016-03-01
to be used as a decision-making aid to guide system designers and program managers not necessarily familiar with cognitive processing, or resource...implementing end-to-end cognitive processing flows multiplies and the impact of these design decisions on efficiency and effectiveness increases [1]. The...end-to-end cognitive systems and alternative computing technologies, then system design and acquisition personnel could make systematic analyses and
1988-09-01
Institute of Technology Air University In Partial Fulfillment of the Requirements for the Degree of Master of Science in Systems Management Dexter R... management system software; Diag/Prob: diagnosis and problem solving or problem finding; GR: graphics software; Int/Transp: interoperability and...language software; Plan/D.S.: planning and decision support or decision making; PM: program management software; SC: systems for Command, Control, Communications
Impact of model-based risk analysis for liver surgery planning.
Hansen, C; Zidowitz, S; Preim, B; Stavrou, G; Oldhafer, K J; Hahn, H K
2014-05-01
A model-based risk analysis for oncologic liver surgery was described in previous work (Preim et al. in Proceedings of international symposium on computer assisted radiology and surgery (CARS), Elsevier, Amsterdam, pp. 353–358, 2002; Hansen et al. Int J Comput Assist Radiol Surg 4(5):469–474, 2009). In this paper, we present an evaluation of this method. To assess whether and how the risk analysis facilitates the process of liver surgery planning, an explorative user study with 10 liver experts was conducted. The purpose was to compare and analyze their decision-making. The results of the study show that model-based risk analysis enhances the awareness of surgical risk in the planning stage. Participants preferred smaller resection volumes and agreed more on the safety margins' width when the risk analysis was available. In addition, time to complete the planning task and the confidence of participants were not increased when using the risk analysis. This work shows that the applied model-based risk analysis may influence important planning decisions in liver surgery. It lays a basis for further clinical evaluations and points out important fields for future research.
Human/Automation Trade Methodology for the Moon, Mars and Beyond
NASA Technical Reports Server (NTRS)
Korsmeyer, David J.
2009-01-01
It is possible to create a consistent trade methodology that can characterize operations model alternatives for crewed exploration missions. For example, a trade-space organized around the objective of maximizing Crew Exploration Vehicle (CEV) independence would take as input a classification of the category of analysis to be conducted or decision to be made, together with a commitment to a specific point in a mission profile during which the analysis or decision is to be made. For example, does the decision have to do with crew activity planning, or life support? Is the mission phase trans-Earth injection, cruise, or lunar descent? Different kinds of decision analysis of the trade-space between human and automated decisions will occur at different points in a mission's profile. The necessary objectives at a given point in time during a mission will call for different kinds of response with respect to where and how computers and automation are expected to help provide an accurate, safe, and timely response. In this paper, a consistent methodology for assessing the trades between human and automated decisions on-board is presented and various examples are discussed.
CAD system for automatic analysis of CT perfusion maps
NASA Astrophysics Data System (ADS)
Hachaj, T.; Ogiela, M. R.
2011-03-01
In this article, the authors present novel algorithms developed for a computer-assisted diagnosis (CAD) system for the analysis of dynamic brain perfusion computed tomography (CT) maps: cerebral blood flow (CBF) and cerebral blood volume (CBV). These methods perform both quantitative analysis (detection, measurement, and description, with a brain anatomy atlas (AA), of potential asymmetries/lesions) and qualitative analysis (semantic interpretation of visualized symptoms). The semantic interpretation of visualized symptoms (deciding the type of lesion: ischemic or hemorrhagic, and whether or not the brain tissue is at risk of infarction) is done by so-called cognitive inference processes that allow reasoning about the character of pathological regions based on specialist image knowledge. The whole system is implemented on the .NET platform (C# programming language) and can be used on any standard PC with the .NET Framework installed.
Outline of cost-benefit analysis and a case study
NASA Technical Reports Server (NTRS)
Kellizy, A.
1978-01-01
The methodology of cost-benefit analysis is reviewed and a case study involving solar cell technology is presented. Emphasis is placed on simplifying the technique in order to permit a technical person not trained in economics to undertake a cost-benefit study comparing alternative approaches to a given problem. The role of economic analysis in management decision making is discussed. In simplifying the methodology it was necessary to restrict the scope and applicability of this report. Additional considerations and constraints are outlined. Examples are worked out to demonstrate the principles. A computer program which performs the computational aspects appears in the appendix.
NASA Astrophysics Data System (ADS)
Pianosi, Francesca
2015-04-01
Sustainable water resource management in a quickly changing world poses new challenges to hydrology and decision sciences. Systems analysis can contribute to promoting sustainable practices by providing the theoretical background and the operational tools for an objective and transparent appraisal of policy options for water resource systems (WRS) management. Traditionally, limited availability of data and computing resources forced the use of oversimplified WRS models, with little consideration of modeling uncertainties or of the non-stationarity of, and feedbacks between, WRS drivers, and with a priori aggregation of costs and benefits. Nowadays we increasingly recognize the inadequacy of these simplifications, and consider them among the reasons for the limited use of model-generated information in actual decision-making processes. On the other hand, the fast-growing availability of data and computing resources is opening up unprecedented possibilities in the way we build and apply numerical models. In this talk I will discuss my experiences and ideas on how we can exploit this potential to improve model-informed decision-making while facing the challenges of uncertainty, non-stationarity, feedbacks, and conflicting objectives. In particular, through practical examples of WRS design and operation problems, my talk will aim at stimulating discussion about the impact of uncertainty on decisions: can inaccurate and imprecise predictions still carry valuable information for decision-making? Does uncertainty in predictions necessarily limit our ability to make 'good' decisions? Or can uncertainty even be of help for decision-making, for instance by reducing the projected conflict between competing water uses? Finally, I will also discuss how the traditionally separate disciplines of numerical modelling, optimization, and uncertainty and sensitivity analysis have in my experience been just different facets of the same 'systems approach'.
Dopamine Receptor-Specific Contributions to the Computation of Value.
Burke, Christopher J; Soutschek, Alexander; Weber, Susanna; Raja Beharelle, Anjali; Fehr, Ernst; Haker, Helene; Tobler, Philippe N
2018-05-01
Dopamine is thought to play a crucial role in value-based decision making. However, the specific contributions of different dopamine receptor subtypes to the computation of subjective value remain unknown. Here we demonstrate how the balance between D1 and D2 dopamine receptor subtypes shapes subjective value computation during risky decision making. We administered the D2 receptor antagonist amisulpride or placebo before participants made choices between risky options. Compared with placebo, D2 receptor blockade resulted in more frequent choice of higher risk and higher expected value options. Using a novel model fitting procedure, we concurrently estimated the three parameters that define individual risk attitude according to an influential theoretical account of risky decision making (prospect theory). This analysis revealed that the observed reduction in risk aversion under amisulpride was driven by increased sensitivity to reward magnitude and decreased distortion of outcome probability, resulting in more linear value coding. Our data suggest that different components that govern individual risk attitude are under dopaminergic control, such that D2 receptor blockade facilitates risk taking and expected value processing.
Conflicts of interest improve collective computation of adaptive social structures
Brush, Eleanor R.; Krakauer, David C.; Flack, Jessica C.
2018-01-01
In many biological systems, the functional behavior of a group is collectively computed by the system’s individual components. An example is the brain’s ability to make decisions via the activity of billions of neurons. A long-standing puzzle is how the components’ decisions combine to produce beneficial group-level outputs, despite conflicts of interest and imperfect information. We derive a theoretical model of collective computation from mechanistic first principles, using results from previous work on the computation of power structure in a primate model system. Collective computation has two phases: an information accumulation phase, in which (in this study) pairs of individuals gather information about their fighting abilities and make decisions about their dominance relationships, and an information aggregation phase, in which these decisions are combined to produce a collective computation. To model information accumulation, we extend a stochastic decision-making model—the leaky integrator model used to study neural decision-making—to a multiagent game-theoretic framework. We then test alternative algorithms for aggregating information—in this study, decisions about dominance resulting from the stochastic model—and measure the mutual information between the resultant power structure and the “true” fighting abilities. We find that conflicts of interest can improve accuracy to the benefit of all agents. We also find that the computation can be tuned to produce different power structures by changing the cost of waiting for a decision. The successful application of a similar stochastic decision-making model in neural and social contexts suggests general principles of collective computation across substrates and scales. PMID:29376116
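To make the information-accumulation phase concrete, here is a minimal sketch of the leaky integrator (leaky accumulator) model referenced above, simulating a single pairwise contest; the parameter values are illustrative, not the ones fitted in the study.

```python
import numpy as np

rng = np.random.default_rng(0)

def leaky_accumulator(drift, leak=0.5, threshold=1.0,
                      noise_sd=0.3, dt=0.01, max_t=50.0):
    """Accumulate noisy evidence x until it crosses +/- threshold.
    drift: mean evidence per unit time favouring individual A over B."""
    x, t = 0.0, 0.0
    while abs(x) < threshold and t < max_t:
        x += (drift - leak * x) * dt + noise_sd * np.sqrt(dt) * rng.normal()
        t += dt
    return ("A dominant" if x > 0 else "B dominant"), t

decision, wait = leaky_accumulator(drift=0.4)
print(decision, round(wait, 2))
```

Raising the threshold lengthens the expected decision time while making the outcome less noise-driven, which is the "cost of waiting" tuning knob the abstract describes for producing different power structures.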
Program Your Computer to Make Tough Decisions Easy.
ERIC Educational Resources Information Center
DiGiammarino, Frank P.
1981-01-01
Describes the data management and analysis system of the Lexington (Massachusetts) Public Schools. Discusses the system's database, data dictionary, and end user language and gives examples of the system's use in answering questions about school closings. (RW)
Towards a computer-aided diagnosis system for vocal cord diseases.
Verikas, A; Gelzinis, A; Bacauskiene, M; Uloza, V
2006-01-01
The objective of this work is to investigate a possibility of creating a computer-aided decision support system for an automated analysis of vocal cord images aiming to categorize diseases of vocal cords. The problem is treated as a pattern recognition task. To obtain a concise and informative representation of a vocal cord image, colour, texture, and geometrical features are used. The representation is further analyzed by a pattern classifier categorizing the image into healthy, diffuse, and nodular classes. The approach developed was tested on 785 vocal cord images collected at the Department of Otolaryngology, Kaunas University of Medicine, Lithuania. A correct classification rate of over 87% was obtained when categorizing a set of unseen images into the aforementioned three classes. Bearing in mind the high similarity of the decision classes, the results obtained are rather encouraging and the developed tools could be very helpful for assuring objective analysis of the images of laryngeal diseases.
Adaptive neural coding: from biological to behavioral decision-making
Louie, Kenway; Glimcher, Paul W.; Webb, Ryan
2015-01-01
Empirical decision-making in diverse species deviates from the predictions of normative choice theory, but why such suboptimal behavior occurs is unknown. Here, we propose that deviations from optimality arise from biological decision mechanisms that have evolved to maximize choice performance within intrinsic biophysical constraints. Sensory processing utilizes specific computations such as divisive normalization to maximize information coding in constrained neural circuits, and recent evidence suggests that analogous computations operate in decision-related brain areas. These adaptive computations implement a relative value code that may explain the characteristic context-dependent nature of behavioral violations of classical normative theory. Examining decision-making at the computational level thus provides a crucial link between the architecture of biological decision circuits and the form of empirical choice behavior. PMID:26722666
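A minimal sketch of the divisive normalization computation mentioned above, in which each option's value is rescaled by the summed value of the whole choice set; the parameter values are illustrative.

```python
import numpy as np

def divisive_normalization(values, sigma=1.0, w=1.0):
    """Relative value code: v_i -> v_i / (sigma + w * sum_j v_j)."""
    v = np.asarray(values, dtype=float)
    return v / (sigma + w * v.sum())

# Adding a third option changes the normalized values of the first two,
# illustrating the context dependence discussed above.
print(divisive_normalization([10, 8]))      # two-option set
print(divisive_normalization([10, 8, 6]))   # same options plus a distractor
```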
ERIC Educational Resources Information Center
Ballantine, R. Malcolm
Decision Support Systems (DSSs) are computer-based decision aids to use when making decisions which are partially amenable to rational decision-making procedures but contain elements where intuitive judgment is an essential component. In such situations, DSSs are used to improve the quality of decision-making. The DSS approach is based on Simon's…
Error Ratio Analysis: Alternate Mathematics Assessment for General and Special Educators.
ERIC Educational Resources Information Center
Miller, James H.; Carr, Sonya C.
1997-01-01
Eighty-seven elementary students in grades four, five, and six were administered a 30-item multiplication instrument to assess performance in computation across grade levels. An interpretation of student performance using error ratio analysis is provided, and the use of this method with groups of students for instructional decision making is…
The US EPA's ToxCast™ program seeks to combine advances in high-throughput screening technology with methodologies from statistics and computer science to develop high-throughput decision support tools for assessing chemical hazard and risk. To develop new methods of analysis of...
2015-07-14
AFRL-OSR-VA-TR-2015-0202: Robust Decision Making: The Cognitive and Computational Modeling of Team Problem Solving for Decision Making under Complex and Dynamic Conditions (Grant FA9550-12-1...). ...functioning as they solve complex problems, and propose the means to improve the performance of teams, under changing or adversarial conditions. By
Erol Sarigul; A. Lynn Abbott; Daniel L. Schmoldt; Philip A. Araman
2005-01-01
This paper describes recent progress in the analysis of computed tomography (CT) images of hardwood logs. The long-term goal of the work is to develop a system that is capable of autonomous (or semiautonomous) detection of internal defects, so that log breakdown decisions can be optimized based on defect locations. The problem is difficult because wood exhibits large...
DOE Office of Scientific and Technical Information (OSTI.GOV)
None, None
The Second SIAM Conference on Computational Science and Engineering was held in San Diego from February 10-12, 2003. Total conference attendance was 553. This is a 23% increase in attendance over the first conference. The focus of this conference was to draw attention to the tremendous range of major computational efforts on large problems in science and engineering, to promote the interdisciplinary culture required to meet these large-scale challenges, and to encourage the training of the next generation of computational scientists. Computational Science & Engineering (CS&E) is now widely accepted, along with theory and experiment, as a crucial third mode of scientific investigation and engineering design. Aerospace, automotive, biological, chemical, semiconductor, and other industrial sectors now rely on simulation for technical decision support. For federal agencies also, CS&E has become an essential support for decisions on resources, transportation, and defense. CS&E is, by nature, interdisciplinary. It grows out of physical applications and it depends on computer architecture, but at its heart are powerful numerical algorithms and sophisticated computer science techniques. From an applied mathematics perspective, much of CS&E has involved analysis, but the future surely includes optimization and design, especially in the presence of uncertainty. Another mathematical frontier is the assimilation of very large data sets through such techniques as adaptive multi-resolution, automated feature search, and low-dimensional parameterization. The themes of the 2003 conference included, but were not limited to: Advanced Discretization Methods; Computational Biology and Bioinformatics; Computational Chemistry and Chemical Engineering; Computational Earth and Atmospheric Sciences; Computational Electromagnetics; Computational Fluid Dynamics; Computational Medicine and Bioengineering; Computational Physics and Astrophysics; Computational Solid Mechanics and Materials; CS&E Education; Meshing and Adaptivity; Multiscale and Multiphysics Problems; Numerical Algorithms for CS&E; Discrete and Combinatorial Algorithms for CS&E; Inverse Problems; Optimal Design, Optimal Control, and Inverse Problems; Parallel and Distributed Computing; Problem-Solving Environments; Software and Middleware Systems; Uncertainty Estimation and Sensitivity Analysis; and Visualization and Computer Graphics.
An Investigation Into the Navy Public Works Centers Specific Work Service Processing Problems.
1980-12-01
demonstrated. These computations are from Navy Area Audit Service reports or PWC and NAVFACENGCOM reports. Number One-time Annual Personnel 3,553...study, all of the endorsements, and a Navy Audit Service audit of the cost analysis, the CNO makes the final consolidation decision. With a decision to...organizations to which local activities turn for environmental issue assistance such as noise, water and air pollution, airfield encroachment, local
Systematic Analysis of the Decision Rules of Traditional Chinese Medicine
Bin-Rong, Ma; Xi-Yuan, Jiang; Su-Ming, Liso; Huai-ning, Zhu; Xiu-ru, Lin
1981-01-01
Chinese traditional medicine has evolved over many centuries, and has accumulated a body of observed relationships between symptoms, signs and prognoses, and the efficacy of alternative treatments and prescriptions. With the assistance of a computer-based clinical data base for recording the diagnostic and therapeutic practice of skilled practitioners of Chinese traditional medicine, a systematic program is being conducted to identify and define the clinical decision-making rules that underlie current practice.
Features of Computer-Based Decision Aids: Systematic Review, Thematic Synthesis, and Meta-Analyses.
Syrowatka, Ania; Krömker, Dörthe; Meguerditchian, Ari N; Tamblyn, Robyn
2016-01-26
Patient information and education, such as decision aids, are gradually moving toward online, computer-based environments. Considerable research has been conducted to guide content and presentation of decision aids. However, given the relatively new shift to computer-based support, little attention has been given to how multimedia and interactivity can improve upon paper-based decision aids. The first objective of this review was to summarize published literature into a proposed classification of features that have been integrated into computer-based decision aids. Building on this classification, the second objective was to assess whether integration of specific features was associated with higher-quality decision making. Relevant studies were located by searching MEDLINE, Embase, CINAHL, and CENTRAL databases. The review identified studies that evaluated computer-based decision aids for adults faced with preference-sensitive medical decisions and reported quality of decision-making outcomes. A thematic synthesis was conducted to develop the classification of features. Subsequently, meta-analyses were conducted based on standardized mean differences (SMD) from randomized controlled trials (RCTs) that reported knowledge or decisional conflict. Further subgroup analyses compared pooled SMDs for decision aids that incorporated a specific feature to other computer-based decision aids that did not incorporate the feature, to assess whether specific features improved quality of decision making. Of 3541 unique publications, 58 studies met the target criteria and were included in the thematic synthesis. The synthesis identified six features: content control, tailoring, patient narratives, explicit values clarification, feedback, and social support. A subset of 26 RCTs from the thematic synthesis was used to conduct the meta-analyses. As expected, computer-based decision aids performed better than usual care or alternative aids; however, some features performed better than others. Integration of content control improved quality of decision making (SMD 0.59 vs 0.23 for knowledge; SMD 0.39 vs 0.29 for decisional conflict). In contrast, tailoring reduced quality of decision making (SMD 0.40 vs 0.71 for knowledge; SMD 0.25 vs 0.52 for decisional conflict). Similarly, patient narratives also reduced quality of decision making (SMD 0.43 vs 0.65 for knowledge; SMD 0.17 vs 0.46 for decisional conflict). Results were varied for different types of explicit values clarification, feedback, and social support. Integration of media rich or interactive features into computer-based decision aids can improve quality of preference-sensitive decision making. However, this is an emerging field with limited evidence to guide use. The systematic review and thematic synthesis identified features that have been integrated into available computer-based decision aids, in an effort to facilitate reporting of these features and to promote integration of such features into decision aids. The meta-analyses and associated subgroup analyses provide preliminary evidence to support integration of specific features into future decision aids. Further research can focus on clarifying independent contributions of specific features through experimental designs and refining the designs of features to improve effectiveness.
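The subgroup comparisons above rest on pooling standardized mean differences across trials; a minimal sketch of inverse-variance pooling (fixed-effect for simplicity, whereas a review of this kind may use random-effects models; all study values below are invented) is:

```python
import numpy as np

def pooled_smd(smds, ses):
    """Fixed-effect inverse-variance pooling of standardized mean differences."""
    smds, ses = np.asarray(smds), np.asarray(ses)
    w = 1.0 / ses**2                          # weight = 1 / variance
    est = np.sum(w * smds) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))
    return est, (est - 1.96 * se, est + 1.96 * se)

# Hypothetical knowledge SMDs: aids with vs. without a given feature
with_feat, ci1 = pooled_smd([0.62, 0.55, 0.60], [0.12, 0.15, 0.10])
without_feat, ci2 = pooled_smd([0.25, 0.20], [0.11, 0.14])
print(f"with: {with_feat:.2f} {ci1}, without: {without_feat:.2f} {ci2}")
```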
Mogol, Burçe Ataç; Gökmen, Vural
2014-05-01
Computer vision-based image analysis has been widely used in food industry to monitor food quality. It allows low-cost and non-contact measurements of colour to be performed. In this paper, two computer vision-based image analysis approaches are discussed to extract mean colour or featured colour information from the digital images of foods. These types of information may be of particular importance as colour indicates certain chemical changes or physical properties in foods. As exemplified here, the mean CIE a* value or browning ratio determined by means of computer vision-based image analysis algorithms can be correlated with acrylamide content of potato chips or cookies. Or, porosity index as an important physical property of breadcrumb can be calculated easily. In this respect, computer vision-based image analysis provides a useful tool for automatic inspection of food products in a manufacturing line, and it can be actively involved in the decision-making process where rapid quality/safety evaluation is needed. © 2013 Society of Chemical Industry.
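A minimal sketch of the mean-colour approach described above, extracting the mean CIE a* value from a digital image; it assumes scikit-image is available, the file name is hypothetical, and the brown-pixel threshold is an invented simplification rather than the browning-ratio algorithm used in the paper.

```python
import numpy as np
from skimage import io, color

img = io.imread("chips.png")[..., :3] / 255.0   # assume an 8-bit RGB(A) image
lab = color.rgb2lab(img)                        # sRGB -> CIE L*a*b*
mean_a = lab[..., 1].mean()                     # mean a* (green-red axis)

# Crude browning proxy (invented threshold): fraction of pixels with high a*
browning_ratio = float((lab[..., 1] > 15.0).mean())
print(f"mean a* = {mean_a:.2f}, browning ratio = {browning_ratio:.3f}")
```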
ERIC Educational Resources Information Center
May, Donald M.; And Others
The minicomputer-based Computerized Diagnostic and Decision Training (CDDT) system described combines the principles of artificial intelligence, decision theory, and adaptive computer assisted instruction for training in electronic troubleshooting. The system incorporates an adaptive computer program which learns the student's diagnostic and…
Goldstein, M. K.; Miller, D. E.; Davies, S.; Garber, A. M.
2002-01-01
Functional status as measured by dependencies in the Activities of Daily Living (ADLs) is an important indicator of overall health for older adults. Methodologies for outcomes-based medical decision-making for public policy, such as decision modeling and cost-effectiveness analysis, require utilities for outcome health states. Utilities have been reported for many disease states, but have not been indexed by functional status, which is a strong predictor of outcome in geriatrics. We describe here a utility elicitation program developed specifically for use with computer-inexperienced older adults: the Functional Limitation And Independence Rating (FLAIR). FLAIR design features address common physical problems of the aged and the computer attitudes of inexperienced users that could impede computer acceptance. We interviewed 400 adults ages 65 years and older with FLAIR. In exit interviews with 154 respondents, 118 (76%) found FLAIR easy to use. Design features in FLAIR can be applied to other software for older adults. PMID:12463834
NASA Astrophysics Data System (ADS)
Shiju, S.; Sumitra, S.
2017-12-01
In this paper, multiple kernel learning (MKL) is formulated as a supervised classification problem. We deal with binary classification data, and hence the data modelling problem involves the computation of two decision boundaries, one related to kernel learning and the other to the input data. In our approach, they are found with the aid of a single cost function by constructing a global reproducing kernel Hilbert space (RKHS) as the direct sum of the RKHSs corresponding to the decision boundaries of kernel learning and input data, and searching the global RKHS for the function that can be represented as the direct sum of the decision boundaries under consideration. In our experimental analysis, the proposed model showed superior performance in comparison with the existing two-stage function approximation formulation of MKL, where the decision functions of kernel learning and input data are found separately using two different cost functions. This is due to the fact that the single-stage representation helps the knowledge transfer between the computation procedures for finding the decision boundaries of kernel learning and input data, which in turn boosts the generalisation capacity of the model.
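For contrast, a minimal sketch of the two-stage style of MKL that the paper compares against: kernel weights are fixed first (here by a kernel-target alignment heuristic, one common choice) and a classifier is then trained on the combined kernel. The data, gammas, and heuristic are all invented for illustration, not the authors' formulation.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.metrics.pairwise import rbf_kernel

def alignment_weights(X, y, gammas):
    """Stage 1: weight each base kernel by its alignment with the labels."""
    yy = np.outer(y, y)
    scores = []
    for g in gammas:
        K = rbf_kernel(X, X, gamma=g)
        scores.append(np.sum(K * yy) / (np.linalg.norm(K) * np.linalg.norm(yy)))
    w = np.clip(np.array(scores), 0.0, None)
    return w / w.sum()

def combined_kernel(X1, X2, gammas, betas):
    """Weighted sum of RBF base kernels evaluated between X1 and X2."""
    return sum(b * rbf_kernel(X1, X2, gamma=g) for g, b in zip(gammas, betas))

# Stage 2: train an SVM on the combined (precomputed) kernel
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = np.where(X[:, 0] + 0.5 * X[:, 1] > 0, 1, -1)
gammas = [0.1, 1.0, 10.0]
betas = alignment_weights(X, y, gammas)
clf = SVC(kernel="precomputed").fit(combined_kernel(X, X, gammas, betas), y)
```

The paper's point is that the two stages above use two unrelated cost functions; its single-stage alternative couples them in one RKHS so the kernel weights and the classifier inform each other.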
DOT National Transportation Integrated Search
2014-07-01
Pavement condition surveys are carried out periodically to gather information on pavement distresses that will guide decision-making for maintenance and preservation. Traditional methods involve manual pavement inspections, which are time-consuming ...
Soft System Analysis to Integrate Technology & Human in Controller Workstation
DOT National Transportation Integrated Search
2011-10-16
Computer-based decision support tools (DST), shared information, and other forms of automation are increasingly being planned for use by controllers and pilots to support Air Traffic Management (ATM) and Air Traffic Control (ATC) in the Next ...
Ahn, Woo-Young; Haines, Nathaniel; Zhang, Lei
2017-01-01
Reinforcement learning and decision-making (RLDM) provide a quantitative framework and computational theories with which we can disentangle psychiatric conditions into the basic dimensions of neurocognitive functioning. RLDM offer a novel approach to assessing and potentially diagnosing psychiatric patients, and there is growing enthusiasm for both RLDM and computational psychiatry among clinical researchers. Such a framework can also provide insights into the brain substrates of particular RLDM processes, as exemplified by model-based analysis of data from functional magnetic resonance imaging (fMRI) or electroencephalography (EEG). However, researchers often find the approach too technical and have difficulty adopting it for their research. Thus, a critical need remains to develop a user-friendly tool for the wide dissemination of computational psychiatric methods. We introduce an R package called hBayesDM (hierarchical Bayesian modeling of Decision-Making tasks), which offers computational modeling of an array of RLDM tasks and social exchange games. The hBayesDM package offers state-of-the-art hierarchical Bayesian modeling, in which both individual and group parameters (i.e., posterior distributions) are estimated simultaneously in a mutually constraining fashion. At the same time, the package is extremely user-friendly: users can perform computational modeling, output visualization, and Bayesian model comparisons, each with a single line of coding. Users can also extract the trial-by-trial latent variables (e.g., prediction errors) required for model-based fMRI/EEG. With the hBayesDM package, we anticipate that anyone with minimal knowledge of programming can take advantage of cutting-edge computational-modeling approaches to investigate the underlying processes of and interactions between multiple decision-making (e.g., goal-directed, habitual, and Pavlovian) systems. In this way, we expect that the hBayesDM package will contribute to the dissemination of advanced modeling approaches and enable a wide range of researchers to easily perform computational psychiatric research within different populations. PMID:29601060
Modeling Opponents in Adversarial Risk Analysis.
Rios Insua, David; Banks, David; Rios, Jesus
2016-04-01
Adversarial risk analysis has been introduced as a framework to deal with risks derived from intentional actions of adversaries. The analysis supports one of the decision makers, who must forecast the actions of the other agents. Typically, this forecast must take account of random consequences resulting from the set of selected actions. The solution requires one to model the behavior of the opponents, which entails strategic thinking. The supported agent may face different kinds of opponents, who may use different rationality paradigms; for example, the opponent may behave randomly, or seek a Nash equilibrium, or perform level-k thinking, or use mirroring, or employ prospect theory, among many other possibilities. We describe the appropriate analysis for these situations, and also show how to model the uncertainty about the rationality paradigm used by the opponent through a Bayesian model averaging approach, enabling a fully decision-theoretic solution. We also show how, as we observe an opponent's decision behavior, this approach allows learning about the validity of each of the rationality models used to predict his decisions by computing the models' (posterior) probabilities, which can be understood as a measure of their validity. We focus on simultaneous decision making by two agents. © 2015 Society for Risk Analysis.
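A minimal sketch of the Bayesian model averaging step described above: each candidate rationality model supplies a predictive distribution over the opponent's actions, and observed decisions update the models' posterior probabilities. The action probabilities below are invented placeholders, not outputs of real Nash or level-k analyses.

```python
import numpy as np

# Predicted probability of each of three opponent actions, per rationality model
models = {
    "uniform_random": np.array([1/3, 1/3, 1/3]),
    "nash":           np.array([0.50, 0.30, 0.20]),
    "level_k":        np.array([0.10, 0.70, 0.20]),
}
posterior = {m: 1.0 / len(models) for m in models}  # uniform prior over models

def observe(posterior, action):
    """Bayes update of model probabilities after seeing one opponent action."""
    post = {m: posterior[m] * models[m][action] for m in posterior}
    z = sum(post.values())
    return {m: p / z for m, p in post.items()}

def bma_forecast(posterior):
    """Model-averaged predictive distribution for the opponent's next action."""
    return sum(posterior[m] * models[m] for m in posterior)

for a in [1, 1, 0, 1]:          # a stream of observed opponent decisions
    posterior = observe(posterior, a)
print(posterior, bma_forecast(posterior))
```

The posterior weights play exactly the role the abstract assigns them: a running measure of each rationality paradigm's validity as more opponent behavior is observed.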
2008-06-01
capacity planning; electrical generation capacity planning; machine scheduling; freight scheduling; dairy farm expansion planning... Decision Support Systems and Multi Criteria Decision Analysis Products: ELECTRE IS. ELECTRE IS is a generalization of ELECTRE I. It is a...criteria, ELECTRE IS supports the user in the process of selecting one alternative or a subset of alternatives. The method consists of two parts
Is chess the drosophila of artificial intelligence? A social history of an algorithm.
Ensmenger, Nathan
2012-02-01
Since the mid 1960s, researchers in computer science have famously referred to chess as the 'drosophila' of artificial intelligence (AI). What they seem to mean by this is that chess, like the common fruit fly, is an accessible, familiar, and relatively simple experimental technology that nonetheless can be used productively to produce valid knowledge about other, more complex systems. But for historians of science and technology, the analogy between chess and drosophila assumes a larger significance. As Robert Kohler has ably described, the decision to adopt drosophila as the organism of choice for genetics research had far-reaching implications for the development of 20th century biology. In a similar manner, the decision to focus on chess as the measure of both human and computer intelligence had important and unintended consequences for AI research. This paper explores the emergence of chess as an experimental technology, its significance in the developing research practices of the AI community, and the unique ways in which the decision to focus on chess shaped the program of AI research in the decade of the 1970s. More broadly, it attempts to open up the virtual black box of computer software--and of computer games in particular--to the scrutiny of historical and sociological analysis.
Exploring Effective Decision Making through Human-Centered and Computational Intelligence Methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Han, Kyungsik; Cook, Kristin A.; Shih, Patrick C.
Decision-making has long been studied to understand the psychological, cognitive, and social process of selecting an effective choice from alternative options. Its study has been extended from the personal level to the group and collaborative level, and many computer-aided decision-making systems have been developed to help people make right decisions. There has been significant research growth in computational aspects of decision-making systems, yet comparatively little effort has gone into identifying and articulating user needs and requirements in assessing system outputs and the extent to which human judgments could be utilized for making accurate and reliable decisions. Our research focus is decision-making through human-centered and computational intelligence methods in a collaborative environment, and the objectives of this position paper are to bring our research ideas to the workshop, and to share and discuss ideas.
Zero-block mode decision algorithm for H.264/AVC.
Lee, Yu-Ming; Lin, Yinyi
2009-03-01
In the previous paper, we proposed a zero-block intermode decision algorithm for H.264 video coding based upon the number of zero-blocks of 4 x 4 DCT coefficients between the current macroblock and the co-located macroblock. The proposed algorithm achieves a significant reduction in computation, but the gain is limited for high bit-rate coding. To improve computation efficiency, in this paper we propose an enhanced zero-block decision algorithm, which uses an early zero-block detection method to compute the number of zero-blocks instead of direct DCT and quantization (DCT/Q) calculation, and which incorporates two adequate decision methods for the semi-stationary and nonstationary regions of a video sequence. In addition, the zero-block decision algorithm is also applied to intramode prediction in P frames. The enhanced zero-block decision algorithm yields an average reduction of 27% in total encoding time compared to the zero-block decision algorithm.
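The core quantity in this family of algorithms is the number of all-zero 4 x 4 coefficient blocks in a macroblock after transform and quantization. A simplified sketch follows; it uses a true floating-point DCT and a uniform quantizer in place of H.264's integer transform and QP-dependent quantization, and it performs the direct DCT/Q computation rather than the early-detection shortcut the paper proposes.

```python
import numpy as np
from scipy.fft import dctn

def count_zero_blocks(residual_mb, qstep=20.0):
    """Count all-zero 4x4 coefficient blocks in a 16x16 residual macroblock."""
    count = 0
    for i in range(0, 16, 4):
        for j in range(0, 16, 4):
            block = residual_mb[i:i+4, j:j+4]
            coeffs = dctn(block, norm="ortho")   # 2-D 4x4 transform
            q = np.round(coeffs / qstep)         # uniform quantiser stand-in
            count += int(not np.any(q))          # all coefficients zero?
    return count

# The intermode decision then compares this count against that of the
# co-located macroblock in the reference frame.
```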
The integration of social influence and reward: Computational approaches and neural evidence.
Tomlin, Damon; Nedic, Andrea; Prentice, Deborah A; Holmes, Philip; Cohen, Jonathan D
2017-08-01
Decades of research have established that decision-making is dramatically impacted by both the rewards an individual receives and the behavior of others. How do these distinct factors exert their influence on an individual's actions, and can the resulting behavior be effectively captured in a computational model? To address this question, we employed a novel spatial foraging game in which groups of three participants sought to find the most rewarding location in an unfamiliar two-dimensional space. As the game transitioned from one block to the next, the availability of information regarding other group members was varied systematically, revealing the relative impacts of feedback from the environment and information from other group members on individual decision-making. Both reward-based and socially-based sources of information exerted a significant influence on behavior, and a computational model incorporating these effects was able to recapitulate several key trends in the behavioral data. In addition, our findings suggest how these sources were processed and combined during decision-making. Analysis of reaction time, location of gaze, and functional magnetic resonance imaging (fMRI) data indicated that these distinct sources of information were integrated simultaneously for each decision, rather than exerting their influence in a separate, all-or-none fashion across separate subsets of trials. These findings add to our understanding of how the separate influences of reward from the environment and information derived from other social agents are combined to produce decisions.
Application of a computational decision model to examine acute drug effects on human risk taking.
Lane, Scott D; Yechiam, Eldad; Busemeyer, Jerome R
2006-05-01
In 3 previous experiments, high doses of alcohol, marijuana, and alprazolam acutely increased risky decision making by adult humans in a 2-choice (risky vs. nonrisky) laboratory task. In this study, a computational modeling analysis known as the expectancy valence model (J. R. Busemeyer & J. C. Stout, 2002) was applied to individual-participant data from these studies, for the highest administered dose of all 3 drugs and corresponding placebo doses, to determine changes in decision-making processes that may be uniquely engendered by each drug. The model includes 3 parameters: responsiveness to rewards and losses (valence or motivation); the rate of updating expectancies about the value of risky alternatives (learning/memory); and the consistency with which trial-by-trial choices match expected outcomes (sensitivity). Parameter estimates revealed 3 key outcomes: Alcohol increased responsiveness to risky rewards and decreased responsiveness to risky losses (motivation) but did not alter expectancy updating (learning/memory); both marijuana and alprazolam produced increases in risk taking that were related to learning/memory but not motivation; and alcohol and marijuana (but not alprazolam) produced more random response patterns that were less consistently related to expected outcomes on the 2 choices. No significant main effects of gender or dose by gender interactions were obtained, but 2 dose by gender interactions approached significance. These outcomes underscore the utility of using a computational modeling approach to deconstruct decision-making processes and thus better understand drug effects on risky decision making in humans.
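For reference, a minimal sketch of the expectancy valence model's three-parameter structure (loss attention w, learning rate a, consistency c), computing trial-by-trial choice probabilities for a two-option task; the parameter values shown are arbitrary, not the fitted estimates from these studies.

```python
import numpy as np

def ev_choice_probs(choices, wins, losses, w=0.4, a=0.2, c=0.5):
    """Expectancy valence model: probability of each observed choice.
    w: attention to losses; a: expectancy-updating (learning) rate;
    c: response consistency. Options: 0 = nonrisky, 1 = risky."""
    ev = np.zeros(2)
    probs = []
    for t, (j, win, loss) in enumerate(zip(choices, wins, losses), start=1):
        theta = (t / 10.0) ** c                 # choice sensitivity grows with t
        p = np.exp(theta * ev)
        p /= p.sum()
        probs.append(p[j])
        u = (1 - w) * win - w * abs(loss)       # valence of this trial's outcome
        ev[j] += a * (u - ev[j])                # delta-rule expectancy update
    return np.array(probs)
```

Fitting w, a, and c separately for drug and placebo sessions is what lets the analysis attribute a drug's effect to motivation (w), learning/memory (a), or choice consistency (c).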
Paraconsistent Annotated Logic in Viability Analysis: an Approach to Product Launching
NASA Astrophysics Data System (ADS)
Romeu de Carvalho, Fábio; Brunstein, Israel; Abe, Jair Minoro
2004-08-01
In this paper we present an application of the Para-analyzer, a logical analyzer based on the Paraconsistent Annotated Logic Pτ introduced by Da Silva Filho and Abe, in decision-making systems. An example is analyzed in detail, showing how uncertainty, inconsistency, and paracompleteness can be elegantly handled with this logical system. As an application of the Para-analyzer in decision-making, we developed the BAM (Baricenter Analysis Method). To make the presentation easier, we present the BAM applied to the viability analysis of product launching. Some of the techniques of Paraconsistent Annotated Logic have been applied in Artificial Intelligence, Robotics, Information Technology (Computer Science), etc.
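A minimal sketch of the Para-analyzer's core computation on an annotation (mu, lambda), where mu is the degree of favourable evidence and lambda the degree of unfavourable evidence for a proposition; the 0.6 decision threshold is an arbitrary illustration.

```python
def para_analyzer(mu, lam, c=0.6):
    """Classify an annotated proposition given favourable evidence mu and
    unfavourable evidence lam, both in [0, 1]."""
    certainty = mu - lam            # degree of certainty
    contradiction = mu + lam - 1    # degree of contradiction
    if certainty >= c:
        return "true"
    if certainty <= -c:
        return "false"
    if contradiction >= c:
        return "inconsistent"       # both evidences high (uncertainty handled)
    if contradiction <= -c:
        return "paracomplete"       # both evidences low (lack of information)
    return "undefined"

print(para_analyzer(0.9, 0.1))   # strong favourable evidence -> "true"
print(para_analyzer(0.9, 0.8))   # conflicting evidence -> "inconsistent"
```

Mapping conflicting or missing evidence to explicit states, rather than forcing a true/false verdict, is what lets this logic handle inconsistency and paracompleteness in a decision system.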
Petri-net-based 2D design of DNA walker circuits.
Gilbert, David; Heiner, Monika; Rohr, Christian
2018-01-01
We consider localised DNA computation, where a DNA strand walks along a binary decision graph to compute a binary function. One of the challenges for the design of reliable walker circuits consists in leakage transitions, which occur when a walker jumps into another branch of the decision graph. We automatically identify leakage transitions, which allows for a detailed qualitative and quantitative assessment of circuit designs, design comparison, and design optimisation. The ability to identify leakage transitions is an important step in the process of optimising DNA circuit layouts where the aim is to minimise the computational error inherent in a circuit while minimising the area of the circuit. Our 2D modelling approach of DNA walker circuits relies on coloured stochastic Petri nets which enable functionality, topology and dimensionality all to be integrated in one two-dimensional model. Our modelling and analysis approach can be easily extended to 3-dimensional walker systems.
Optimal and Nonoptimal Computer-Based Test Designs for Making Pass-Fail Decisions
ERIC Educational Resources Information Center
Hambleton, Ronald K.; Xing, Dehui
2006-01-01
Now that many credentialing exams are being routinely administered by computer, new computer-based test designs, along with item response theory models, are being aggressively researched to identify specific designs that can increase the decision consistency and accuracy of pass-fail decisions. The purpose of this study was to investigate the…
NASA Technical Reports Server (NTRS)
Chu, Y. Y.
1978-01-01
A unified formulation of computer-aided, multi-task decision making is presented. A strategy for the allocation of decision-making responsibility between human and computer is developed. The plans of a flight management system are studied. A model based on queueing theory was implemented.
Pope, Catherine; Halford, Susan; Turnbull, Joanne; Prichard, Jane
2014-06-01
This article draws on data collected during a 2-year project examining the deployment of a computerised decision support system. This computerised decision support system was designed to be used by non-clinical staff for dealing with calls to emergency (999) and urgent care (out-of-hours) services. One of the promises of computerised decision support technologies is that they can 'hold' vast amounts of sophisticated clinical knowledge and combine it with decision algorithms to enable standardised decision-making by non-clinical (clerical) staff. This article draws on our ethnographic study of this computerised decision support system in use, and we use our analysis to question the 'automated' vision of decision-making in healthcare call-handling. We show that embodied and experiential (human) expertise remains central and highly salient in this work, and we propose that the deployment of the computerised decision support system creates something new: this conjunction of computer and human creates a cyborg practice.
NASA Astrophysics Data System (ADS)
Kacprzyk, Janusz; Zadrożny, Sławomir
2010-05-01
We show how the conceptually and numerically simple concept of a fuzzy linguistic database summary can be a very powerful tool for gaining insight into the very essence of data. The use of linguistic summaries provides tools for the verbalisation of data analysis (mining) results which, in addition to the more commonly used visualisation, e.g. via a graphical user interface, can contribute to increased human consistency and ease of use, notably for supporting decision makers via the data-driven decision support system paradigm. Two new relevant aspects of the analysis, both first initiated by the authors, are also outlined. First, following Kacprzyk and Zadrożny, it is further considered how linguistic data summarisation is closely related to some types of solutions used in natural language generation (NLG); this can make it possible to use increasingly effective and efficient tools and techniques developed in NLG. Second, similar remarks are given on relations to systemic functional linguistics. Moreover, following Kacprzyk and Zadrożny, comments are given on an extremely relevant aspect of the scalability of linguistic summarisation of data, using a new concept of conceptual scalability.
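Linguistic summaries in this tradition are commonly scored with Yager's truth degree, in which a fuzzy quantifier is applied to the mean membership of the records in the summarizer; a minimal sketch with invented membership functions follows.

```python
import numpy as np

def truth_of_summary(values, summarizer, quantifier):
    """Yager's truth degree of 'Q objects are S': T = mu_Q(mean_i mu_S(x_i))."""
    r = np.mean([summarizer(v) for v in values])
    return quantifier(r)

# Hypothetical summary: "most employees are young"
ages = [23, 31, 27, 45, 29, 52, 24]
young = lambda a: max(0.0, min(1.0, (35 - a) / 10))   # fuzzy set 'young'
most  = lambda r: max(0.0, min(1.0, 2 * r - 0.6))     # fuzzy quantifier 'most'
print(truth_of_summary(ages, young, most))
```

A data-to-text pipeline can then rank many candidate (quantifier, summarizer) pairs by this truth degree and verbalise the best ones, which is where the connection to NLG arises.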
Computation and measurement of cell decision making errors using single cell data.
Habibi, Iman; Cheong, Raymond; Lipniacki, Tomasz; Levchenko, Andre; Emamian, Effat S; Abdi, Ali
2017-04-01
In this study a new computational method is developed to quantify decision making errors in cells, caused by noise and signaling failures. Analysis of tumor necrosis factor (TNF) signaling pathway which regulates the transcription factor Nuclear Factor κB (NF-κB) using this method identifies two types of incorrect cell decisions called false alarm and miss. These two events represent, respectively, declaring a signal which is not present and missing a signal that does exist. Using single cell experimental data and the developed method, we compute false alarm and miss error probabilities in wild-type cells and provide a formulation which shows how these metrics depend on the signal transduction noise level. We also show that in the presence of abnormalities in a cell, decision making processes can be significantly affected, compared to a wild-type cell, and the method is able to model and measure such effects. In the TNF-NF-κB pathway, the method computes and reveals changes in false alarm and miss probabilities in A20-deficient cells, caused by cell's inability to inhibit TNF-induced NF-κB response. In biological terms, a higher false alarm metric in this abnormal TNF signaling system indicates perceiving more cytokine signals which in fact do not exist at the system input, whereas a higher miss metric indicates that it is highly likely to miss signals that actually exist. Overall, this study demonstrates the ability of the developed method for modeling cell decision making errors under normal and abnormal conditions, and in the presence of transduction noise uncertainty. Compared to the previously reported pathway capacity metric, our results suggest that the introduced decision error metrics characterize signaling failures more accurately. This is mainly because while capacity is a useful metric to study information transmission in signaling pathways, it does not capture the overlap between TNF-induced noisy response curves.
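The false-alarm/miss decomposition can be illustrated with two overlapping response distributions and a decision threshold, as in the sketch below. The Gaussian parameters and threshold are assumptions for illustration, not the paper's fitted single-cell models.

```python
from math import erf, sqrt

def norm_cdf(x, mu, sigma):
    return 0.5 * (1.0 + erf((x - mu) / (sigma * sqrt(2.0))))

# Illustrative response distributions: cell output without / with TNF input.
MU_OFF, SIGMA_OFF = 0.0, 1.0   # signal absent
MU_ON,  SIGMA_ON  = 2.0, 1.2   # signal present
THRESHOLD = 1.0                # cell "declares" a signal above this level

p_false_alarm = 1.0 - norm_cdf(THRESHOLD, MU_OFF, SIGMA_OFF)  # declare an absent signal
p_miss        = norm_cdf(THRESHOLD, MU_ON, SIGMA_ON)          # miss a present signal
print(f"P(false alarm) = {p_false_alarm:.3f}, P(miss) = {p_miss:.3f}")
```

Raising the threshold trades false alarms for misses; noisier (wider) distributions raise both, which is the overlap that a capacity metric alone does not capture.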
Outlook Bright for Computers in Chemistry.
ERIC Educational Resources Information Center
Baum, Rudy M.
1981-01-01
Discusses the recent decision to close down the National Resource for Computation in Chemistry (NRCC), implications of that decision, and various alternatives in the field of computational chemistry. (CS)
2013-01-01
Analyzing and storing data and results from next-generation sequencing (NGS) experiments is a challenging task, hampered by ever-increasing data volumes and frequent updates of analysis methods and tools. Storage and computation have grown beyond the capacity of personal computers and there is a need for suitable e-infrastructures for processing. Here we describe UPPNEX, an implementation of such an infrastructure, tailored to the needs of data storage and analysis of NGS data in Sweden serving various labs and multiple instruments from the major sequencing technology platforms. UPPNEX comprises resources for high-performance computing, large-scale and high-availability storage, an extensive bioinformatics software suite, up-to-date reference genomes and annotations, a support function with system and application experts as well as a web portal and support ticket system. UPPNEX applications are numerous and diverse, and include whole genome-, de novo- and exome sequencing, targeted resequencing, SNP discovery, RNASeq, and methylation analysis. There are over 300 projects that utilize UPPNEX and include large undertakings such as the sequencing of the flycatcher and Norwegian spruce. We describe the strategic decisions made when investing in hardware, setting up maintenance and support, allocating resources, and illustrate major challenges such as managing data growth. We conclude with summarizing our experiences and observations with UPPNEX to date, providing insights into the successful and less successful decisions made. PMID:23800020
1995-03-01
advisory system provides a decision framework for selecting an appropriate model from the numerous available transport models conditioned on... Keywords: groundwater modeling, contaminant transport, optimization, reliability, remediation. Even with the choice of an appropriate transport model, considerable uncertainty is likely to be present in the analysis of
Features of Computer-Based Decision Aids: Systematic Review, Thematic Synthesis, and Meta-Analyses
Krömker, Dörthe; Meguerditchian, Ari N; Tamblyn, Robyn
2016-01-01
Background: Patient information and education, such as decision aids, are gradually moving toward online, computer-based environments. Considerable research has been conducted to guide content and presentation of decision aids. However, given the relatively new shift to computer-based support, little attention has been given to how multimedia and interactivity can improve upon paper-based decision aids. Objective: The first objective of this review was to summarize published literature into a proposed classification of features that have been integrated into computer-based decision aids. Building on this classification, the second objective was to assess whether integration of specific features was associated with higher-quality decision making. Methods: Relevant studies were located by searching MEDLINE, Embase, CINAHL, and CENTRAL databases. The review identified studies that evaluated computer-based decision aids for adults faced with preference-sensitive medical decisions and reported quality of decision-making outcomes. A thematic synthesis was conducted to develop the classification of features. Subsequently, meta-analyses were conducted based on standardized mean differences (SMD) from randomized controlled trials (RCTs) that reported knowledge or decisional conflict. Further subgroup analyses compared pooled SMDs for decision aids that incorporated a specific feature to other computer-based decision aids that did not incorporate the feature, to assess whether specific features improved quality of decision making. Results: Of 3541 unique publications, 58 studies met the target criteria and were included in the thematic synthesis. The synthesis identified six features: content control, tailoring, patient narratives, explicit values clarification, feedback, and social support. A subset of 26 RCTs from the thematic synthesis was used to conduct the meta-analyses. As expected, computer-based decision aids performed better than usual care or alternative aids; however, some features performed better than others. Integration of content control improved quality of decision making (SMD 0.59 vs 0.23 for knowledge; SMD 0.39 vs 0.29 for decisional conflict). In contrast, tailoring reduced quality of decision making (SMD 0.40 vs 0.71 for knowledge; SMD 0.25 vs 0.52 for decisional conflict). Similarly, patient narratives also reduced quality of decision making (SMD 0.43 vs 0.65 for knowledge; SMD 0.17 vs 0.46 for decisional conflict). Results were varied for different types of explicit values clarification, feedback, and social support. Conclusions: Integration of media-rich or interactive features into computer-based decision aids can improve quality of preference-sensitive decision making. However, this is an emerging field with limited evidence to guide use. The systematic review and thematic synthesis identified features that have been integrated into available computer-based decision aids, in an effort to facilitate reporting of these features and to promote integration of such features into decision aids. The meta-analyses and associated subgroup analyses provide preliminary evidence to support integration of specific features into future decision aids. Further research can focus on clarifying independent contributions of specific features through experimental designs and refining the designs of features to improve effectiveness. PMID:26813512
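The subgroup comparisons above rest on inverse-variance pooling of per-study SMDs. A minimal fixed-effect sketch follows; the study values are made up for illustration and the review's actual meta-analysis may use random-effects models.

```python
# Inverse-variance pooling of standardized mean differences (SMDs);
# the study values below are invented for illustration only.
studies = [  # (smd, standard_error, has_content_control)
    (0.55, 0.20, True), (0.62, 0.25, True),
    (0.20, 0.15, False), (0.28, 0.18, False),
]

def pooled_smd(subset):
    weights = [1.0 / se**2 for _, se, _ in subset]
    return sum(w * smd for w, (smd, _, _) in zip(weights, subset)) / sum(weights)

with_feature = [s for s in studies if s[2]]
without_feature = [s for s in studies if not s[2]]
print(f"pooled SMD with content control:    {pooled_smd(with_feature):.2f}")
print(f"pooled SMD without content control: {pooled_smd(without_feature):.2f}")
```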
ERIC Educational Resources Information Center
Kert, Serhat Bahadir; Uz, Cigdem; Gecu, Zeynep
2014-01-01
This study examined the effectiveness of an electronic performance support system (EPSS) on computer ethics education and the ethical decision-making processes. There were five different phases to this ten-month study: (1) Writing computer ethics scenarios, (2) Designing a decision-making framework, (3) Developing EPSS software, (4) Using EPSS in a…
Melnick, Edward R.; Lopez, Kevin; Hess, Erik P.; Abujarad, Fuad; Brandt, Cynthia A.; Shiffman, Richard N.; Post, Lori A.
2015-01-01
Context: Current information-rich electronic health record (EHR) interfaces require large, high-resolution screens running on desktop computers. This interface compromises the provider’s already limited time at the bedside by physically separating the patient from the doctor. The case study presented here describes a patient-centered clinical decision support (CDS) design process that aims to bring the physician back to the bedside by integrating a patient decision aid with CDS for shared use by the patient and provider on a touchscreen tablet computer for deciding whether or not to obtain a CT scan for minor head injury in the emergency department, a clinical scenario that could benefit from CDS but has failed previous implementation attempts. Case Description: This case study follows the user-centered design (UCD) approach to build a bedside aid that is useful and usable, and that promotes shared decision-making between patients and their providers using a tablet computer at the bedside. The patient-centered decision support design process focuses on the prototype build using agile software development, but also describes the following: (1) the requirement gathering phase including triangulated qualitative research (focus groups and cognitive task analysis) to understand current challenges, (2) features for patient education, the physician, and shared decision-making, (3) system architecture and technical requirements, and (4) future plans for formative usability testing and field testing. Lessons Learned: We share specific lessons learned and general recommendations from critical insights gained in the patient-centered decision support design process about early stakeholder engagement, EHR integration, external expert feedback, challenges to two users on a single device, project management, and accessibility. Conclusions: Successful implementation of this tool will require seamless integration into the provider’s workflow. This protocol can create an effective interface for shared decision-making and safe resource reduction at the bedside in the austere and dynamic clinical environment of the ED and is generalizable for these purposes in other clinical environments as well. PMID:26290885
Threat evaluation for impact assessment in situation analysis systems
NASA Astrophysics Data System (ADS)
Roy, Jean; Paradis, Stephane; Allouche, Mohamad
2002-07-01
Situation analysis is defined as a process, the examination of a situation, its elements, and their relations, to provide and maintain a product, i.e., a state of situation awareness, for the decision maker. Data fusion is a key enabler to meeting the demanding requirements of military situation analysis support systems. According to the data fusion model maintained by the Joint Directors of Laboratories' Data Fusion Group, impact assessment estimates the effects on situations of planned or estimated/predicted actions by the participants, including interactions between action plans of multiple players. In this framework, the appraisal of actual or potential threats is a necessary capability for impact assessment. This paper reviews and discusses in detail the fundamental concepts of threat analysis. In particular, threat analysis generally attempts to compute a threat value for each track that estimates the degree of severity with which engagement events will potentially occur. Presenting relevant tracks to the decision maker in a threat list, sorted from the most threatening to the least, is clearly in line with the cognitive demands associated with threat evaluation. A key parameter in many threat value evaluation techniques is the Closest Point of Approach (CPA). Along this line of thought, threatening tracks are often prioritized based upon which ones will reach their CPA first. Hence, the Time-to-CPA (TCPA), i.e., the time it will take for a track to reach its CPA, is also a key factor. Unfortunately, a typical assumption in the computation of the CPA/TCPA parameters is that the track velocity will remain constant. When a track is maneuvering, the CPA/TCPA values will change accordingly. These changes will in turn impact the threat value computations and, ultimately, the resulting threat list. This is clearly undesirable from a command decision-making perspective. In this regard, the paper briefly discusses threat value stabilization approaches based on neural networks and other mathematical techniques.
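The constant-velocity CPA/TCPA computation the abstract refers to is a short dot-product calculation. The sketch below is a generic version; the function name and the sample track are illustrative, not the paper's implementation.

```python
import numpy as np

def cpa_tcpa(rel_pos, rel_vel):
    """Closest Point of Approach for a constant-velocity track.
    rel_pos: track position minus ownship position (m)
    rel_vel: track velocity minus ownship velocity (m/s)"""
    v2 = float(np.dot(rel_vel, rel_vel))
    if v2 == 0.0:                      # no relative motion: range is constant
        return float(np.linalg.norm(rel_pos)), 0.0
    t_cpa = max(0.0, -float(np.dot(rel_pos, rel_vel)) / v2)
    d_cpa = float(np.linalg.norm(rel_pos + t_cpa * rel_vel))
    return d_cpa, t_cpa

# Illustrative inbound track: 10 km east, closing at about 200 m/s.
d, t = cpa_tcpa(np.array([10_000.0, 0.0]), np.array([-200.0, 20.0]))
print(f"CPA = {d:.0f} m, TCPA = {t:.0f} s")
```

A maneuvering track invalidates the constant-velocity assumption, so d and t jump between updates; this is exactly the instability in the threat list that the paper's stabilization approaches target.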
Capalbo, Susan M; Antle, John M; Seavert, Clark
2017-07-01
Research on next generation agricultural systems models shows that the most important current limitation is data, both for on-farm decision support and for research investment and policy decision making. One of the greatest data challenges is to obtain reliable data on farm management decision making, both for current conditions and under scenarios of changed bio-physical and socio-economic conditions. This paper presents a framework for the use of farm-level and landscape-scale models and data to provide analysis that could be used in NextGen knowledge products, such as mobile applications or personal computer data analysis and visualization software. We describe two analytical tools - AgBiz Logic and TOA-MD - that demonstrate the current capability of farm-level and landscape-scale models. The use of these tools is explored with a case study of an oilseed crop, Camelina sativa, which could be used to produce jet aviation fuel. We conclude with a discussion of innovations needed to facilitate the use of farm and policy-level models to generate data and analysis for improved knowledge products.
Understanding the Hows and Whys of Decision-Making: From Expected Utility to Divisive Normalization.
Glimcher, Paul
2014-01-01
Over the course of the last century, economists and ethologists have built detailed models from first principles of how humans and animals should make decisions. Over the course of the last few decades, psychologists and behavioral economists have gathered a wealth of data at variance with the predictions of these economic models. This has led to the development of highly descriptive models that can often predict what choices people or animals will make but without offering any insight into why people make the choices that they do, especially when those choices reduce a decision-maker's well-being. Over the course of the last two decades, neurobiologists working with economists and psychologists have begun to use our growing understanding of how the nervous system works to develop new models of how the nervous system makes decisions. The result, a growing revolution at the interdisciplinary border of neuroscience, psychology, and economics, is a new field called Neuroeconomics. Emerging neuroeconomic models stand to revolutionize our understanding of human and animal choice behavior by combining fundamental properties of neurobiological representation with decision-theoretic analyses. In this overview, one class of these models, based on the widely observed neural computation known as divisive normalization, is presented in detail. The work demonstrates not only that a discrete class of computation widely observed in the nervous system is fundamentally ubiquitous, but how that computation shapes behaviors ranging from visual perception to financial decision-making. It also offers the hope of reconciling economic analysis of what choices we should make with psychological observations of the choices we actually do make. Copyright © 2014 Cold Spring Harbor Laboratory Press; all rights reserved.
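Divisive normalization itself is a one-line computation: each option's value is divided by a saturating function of the summed value of the whole choice set. The sketch below uses the standard textbook form with illustrative parameters; it is not Glimcher's specific fitted model.

```python
import numpy as np

def normalized_values(v, sigma=1.0, kappa=1.0):
    """Divisive normalization: each option's value is scaled by the
    summed value of all options (plus a saturation constant sigma)."""
    v = np.asarray(v, dtype=float)
    return v / (sigma + kappa * v.sum())

# Illustrative choice set: adding a third option compresses the
# represented difference between the two best options.
print(normalized_values([10.0, 8.0]))        # two options
print(normalized_values([10.0, 8.0, 6.0]))   # context changes the coding
```

This context dependence is what lets such models explain choice anomalies, like irrelevant alternatives shifting preferences, that expected-utility models cannot.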
ERIC Educational Resources Information Center
Clements, Douglas H., Ed.; And Others
1988-01-01
Presents reviews of three software packages. Includes "Cube Builder: A 3-D Geometry Tool," which allows students to build three-dimensional shapes; "Number Master," a multipurpose practice program for whole number computation; and "Safari Search: Problem Solving and Inference," which focuses on decision making in mathematical analysis. (PK)
ERIC Educational Resources Information Center
Harris, Philip R.
1985-01-01
Looks at changes in the manager's role due to technological advancement in the workplace. Discusses wider range of uses for computers (analysis, decision making, communications, planning, tracking trends), importance of supervisor training, cyberphobia (fear of new technology), cyberphrenia (addiction to new technology), and the effect of a work…
A bayesian approach to classification criteria for spectacled eiders
Taylor, B.L.; Wade, P.R.; Stehn, R.A.; Cochrane, J.F.
1996-01-01
To facilitate decisions to classify species according to risk of extinction, we used Bayesian methods to analyze trend data for the Spectacled Eider, an arctic sea duck. Trend data from three independent surveys of the Yukon-Kuskokwim Delta were analyzed individually and in combination to yield posterior distributions for population growth rates. We used classification criteria developed by the recovery team for Spectacled Eiders that seek to equalize errors of under- or overprotecting the species. We conducted both a Bayesian decision analysis and a frequentist (classical statistical inference) decision analysis. Bayesian decision analyses are computationally easier, yield essentially the same results, and produce results that are easier to explain to nonscientists. With the exception of the aerial survey analysis of the 10 most recent years, both Bayesian and frequentist methods indicated that an endangered classification is warranted. The discrepancy between surveys warrants further research. Although the trend data are abundance indices, we used a preliminary estimate of absolute abundance to demonstrate how to calculate extinction distributions using the joint probability distributions for population growth rate and variance in growth rate generated by the Bayesian analysis. Recent apparent increases in abundance highlight the need for models that apply to declining and then recovering species.
Spacelab experiment computer study. Volume 1: Executive summary (presentation)
NASA Technical Reports Server (NTRS)
Lewis, J. L.; Hodges, B. C.; Christy, J. O.
1976-01-01
A quantitative cost for various Spacelab flight hardware configurations is provided, along with varied software development options. A cost analysis of Spacelab computer hardware and software is presented. The cost study is discussed based on utilization of a central experiment computer with optional auxiliary equipment. Ground rules and assumptions used in deriving the costing methods for all options in the Spacelab experiment study are presented and analysed, and the options, along with their cost considerations, are discussed. It is concluded that Spacelab program cost for software development and maintenance is independent of experimental hardware and software options, that the distributed standard computer concept simplifies software integration without a significant increase in cost, and that decisions on flight computer hardware configurations should not be made until payload selection for a given mission and a detailed analysis of the mission requirements are completed.
Hajizadeh, Negin; Perez Figueroa, Rafael E; Uhler, Lauren M; Chiou, Erin; Perchonok, Jennifer E; Montague, Enid
2013-03-06
Computerized decision aids could facilitate shared decision-making at the point of outpatient clinical care. The objective of this study was to investigate whether a computerized shared decision aid would be feasible to implement in an inner-city clinic by evaluating the current practices in shared decision-making, clinicians' use of computers, patient and clinicians' attitudes and beliefs toward computerized decision aids, and the influence of time on shared decision-making. Qualitative data analysis of observations and semi-structured interviews with patients and clinicians at an inner-city outpatient clinic. The findings provided an exploratory look at the prevalence of shared decision-making and attitudes about health information technology and decision aids. A prominent barrier to clinicians engaging in shared decision-making was a lack of perceived patient understanding of medical information. Some patients preferred their clinicians make recommendations for them rather than engage in formal shared decision-making. Health information technology was an integral part of the clinic visit and welcomed by most clinicians and patients. Some patients expressed the desire to engage with health information technology such as viewing their medical information on the computer screen with their clinicians. All participants were receptive to the idea of a decision aid integrated within the clinic visit although some clinicians were concerned about the accuracy of prognostic estimates for complex medical problems. We identified several important considerations for the design and implementation of a computerized decision aid including opportunities to: bridge clinician-patient communication about medical information while taking into account individual patients' decision-making preferences, complement expert clinician judgment with prognostic estimates, take advantage of patient waiting times, and make tasks involved during the clinic visit more efficient. These findings should be incorporated into the design and implementation of a computerized shared decision aid at an inner-city hospital.
Informatic parcellation of the network involved in the computation of subjective value
Rangel, Antonio
2014-01-01
Understanding how the brain computes value is a basic question in neuroscience. Although individual studies have driven this progress, meta-analyses provide an opportunity to test hypotheses that require large collections of data. We carry out a meta-analysis of a large set of functional magnetic resonance imaging studies of value computation to address several key questions. First, what is the full set of brain areas that reliably correlate with stimulus values when they need to be computed? Second, is this set of areas organized into dissociable functional networks? Third, is a distinct network of regions involved in the computation of stimulus values at decision and outcome? Finally, are different brain areas involved in the computation of stimulus values for different reward modalities? Our results demonstrate the centrality of ventromedial prefrontal cortex (VMPFC), ventral striatum and posterior cingulate cortex (PCC) in the computation of value across tasks, reward modalities and stages of the decision-making process. We also find evidence of distinct subnetworks of co-activation within VMPFC, one involving central VMPFC and dorsal PCC and another involving more anterior VMPFC, left angular gyrus and ventral PCC. Finally, we identify a posterior-to-anterior gradient of value representations corresponding to concrete-to-abstract rewards. PMID:23887811
Pitcher, Brandon; Alaqla, Ali; Noujeim, Marcel; Wealleans, James A; Kotsakis, Georgios; Chrepa, Vanessa
2017-03-01
Cone-beam computed tomographic (CBCT) analysis allows for 3-dimensional assessment of periradicular lesions and may facilitate preoperative periapical cyst screening. The purpose of this study was to develop and assess the predictive validity of a cyst screening method based on CBCT volumetric analysis alone or combined with designated radiologic criteria. Three independent examiners evaluated 118 presurgical CBCT scans from cases that underwent apicoectomies and had an accompanying gold standard histopathological diagnosis of either a cyst or granuloma. Lesion volume, density, and specific radiologic characteristics were assessed using specialized software. Logistic regression models with histopathological diagnosis as the dependent variable were constructed for cyst prediction, and receiver operating characteristic curves were used to assess the predictive validity of the models. A conditional inference binary decision tree based on a recursive partitioning algorithm was constructed to facilitate preoperative screening. Interobserver agreement was excellent for volume and density, but it varied from poor to good for the radiologic criteria. Volume and root displacement were strong predictors for cyst screening in all analyses. The binary decision tree classifier determined that if the volume of the lesion was >247 mm³, there was an 80% probability of a cyst. If the volume was <247 mm³ and root displacement was present, cyst probability was 60% (78% accuracy). The good accuracy and high specificity of the decision tree classifier render it a useful preoperative cyst screening tool that can aid in clinical decision making, but it is not a substitute for definitive histopathological diagnosis after biopsy. Confirmatory studies are required to validate the present findings. Published by Elsevier Inc.
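The reported tree is simple enough to state directly in code. The two probabilities and the 247 mm³ split come from the abstract; the probability of the remaining leaf (small lesion, no displacement) is not reported, so it is left unspecified here.

```python
def cyst_probability(volume_mm3, root_displacement):
    """Binary decision tree as reported in the abstract: volume first,
    then root displacement. Returns an approximate cyst probability."""
    if volume_mm3 > 247.0:
        return 0.80            # reported: 80% probability of a cyst
    if root_displacement:
        return 0.60            # reported: 60% probability (78% accuracy)
    return None                # probability for this leaf not reported

print(cyst_probability(310.0, False))  # large lesion -> 0.80
print(cyst_probability(120.0, True))   # small lesion, displaced root -> 0.60
```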
EEG feature selection method based on decision tree.
Duan, Lijuan; Ge, Hui; Ma, Wei; Miao, Jun
2015-01-01
This paper aims to solve the automated feature selection problem in brain computer interfaces (BCI). In order to automate the feature selection process, we propose a novel EEG feature selection method based on a decision tree (DT). During electroencephalogram (EEG) signal processing, a feature extraction method based on principal component analysis (PCA) was used, and the selection process based on the decision tree was performed by searching the feature space and automatically selecting optimal features. Considering that EEG signals are non-linear, a generalized linear classifier, the support vector machine (SVM), was chosen. In order to test the validity of the proposed method, we applied the EEG feature selection method based on the decision tree to BCI Competition II dataset Ia, and the experiment showed encouraging results.
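A pipeline in the spirit of the abstract (PCA features, tree-based selection, SVM classification) can be sketched with scikit-learn. The synthetic data, component counts and top-3 selection rule below are assumptions standing in for the BCI Competition II Ia recordings and the paper's actual search procedure.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.tree import DecisionTreeClassifier
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 32))                 # 200 trials x 32 raw EEG features
y = (X[:, 0] + 0.5 * X[:, 3] > 0).astype(int)  # synthetic labels

X_pca = PCA(n_components=10).fit_transform(X)            # PCA feature extraction
tree = DecisionTreeClassifier(random_state=0).fit(X_pca, y)
keep = np.argsort(tree.feature_importances_)[-3:]        # tree-selected components

X_tr, X_te, y_tr, y_te = train_test_split(X_pca[:, keep], y, random_state=0)
svm = SVC().fit(X_tr, y_tr)                              # final SVM classifier
print(f"held-out accuracy: {svm.score(X_te, y_te):.2f}")
```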
Factors Influencing the Adoption of Cloud Computing by Decision Making Managers
ERIC Educational Resources Information Center
Ross, Virginia Watson
2010-01-01
Cloud computing is a growing field, addressing the market need for access to computing resources to meet organizational computing requirements. The purpose of this research is to evaluate the factors that influence an organization in their decision whether to adopt cloud computing as a part of their strategic information technology planning.…
Søgaard, Rikke; Fischer, Barbara Malene B; Mortensen, Jann; Rasmussen, Torben R; Lassen, Ulrik
2013-01-01
To assess the expected costs and outcomes of alternative strategies for the staging of lung cancer, from a Danish National Health Service perspective, in order to identify the most cost-effective strategy. A decision tree was specified for patients with a confirmed diagnosis of non-small-cell lung cancer. Six strategies were defined from relevant combinations of mediastinoscopy, endoscopic or endobronchial ultrasound with needle aspiration, and combined positron emission tomography-computed tomography with F18-fluorodeoxyglucose. Patients without distant metastases and without central or contralateral nodal involvement (N2/N3) were considered candidates for surgical resection. Diagnostic accuracies were informed by literature reviews, prevalence and survival by the Danish Lung Cancer Registry, and procedure costs by national average tariffs. All parameters were specified probabilistically to determine the joint decision uncertainty. The cost-effectiveness analysis was based on the net present value of expected costs and life years accrued over a time horizon of 5 years. At threshold values of around €30,000 for cost-effectiveness, it was found to be cost-effective to send all patients to positron emission tomography-computed tomography, with confirmation of positive findings on nodal involvement by endobronchial ultrasound. This result appeared robust in deterministic sensitivity analysis. The expected value of perfect information was estimated at €52 per patient, indicating that further research might be worthwhile. The policy recommendation is to make combined positron emission tomography-computed tomography and endobronchial ultrasound available for supplemental staging of patients with non-small-cell lung cancer. The effects of alternative strategies on patients' quality of life, however, should be examined in future studies. Copyright © 2013 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
Brohet, C R; Richman, H G
1979-06-01
Automated processing of electrocardiograms by the Veterans Administration program was evaluated for both agreement with physician interpretation and interpretative accuracy as assessed with nonelectrocardiographic criteria. One thousand unselected electrocardiograms were analyzed by two reviewer groups, one familiar and the other unfamiliar with the computer program. A significant number of measurement errors involving repolarization changes and left axis deviation occurred; however, interpretative disagreements related to statistical decision were largely language-related. Use of a printout with a more traditional format resulted in agreement with physician interpretation by both reviewer groups in more than 80 percent of cases. Overall sensitivity based on agreement with nonelectrocardiographic criteria was significantly greater with use of the computer program than with use of the conventional criteria utilized by the reviewers. This difference was particularly evident in the subgroup analysis of myocardial infarction and left ventricular hypertrophy. The degree of overdiagnosis of left ventricular hypertrophy and posteroinferior infarction was initially unacceptable, but this difficulty was corrected by adjustment of probabilities. Clinical acceptability of the Veterans Administration program appears to require greater physician education than that needed for other computer programs of electrocardiographic analysis; the flexibility of interpretation by statistical decision offers the potential for better diagnostic accuracy.
NASA Technical Reports Server (NTRS)
Nez, G. (Principal Investigator); Mutter, D.
1977-01-01
The author has identified the following significant results. The project mapped land use/cover classifications from LANDSAT computer compatible tape data and combined those results with other multisource data via computer mapping/compositing techniques to analyze various land use planning/natural resource management problems. Data were analyzed on 1:24,000 scale maps at 1.1 acre resolution. LANDSAT analysis software and linkages with other computer mapping software were developed. Significant results were also achieved in training, communication, and identification of needs for developing the LANDSAT/computer mapping technologies into operational tools for use by decision makers.
Enabling Real-time Water Decision Support Services Using Model as a Service
NASA Astrophysics Data System (ADS)
Zhao, T.; Minsker, B. S.; Lee, J. S.; Salas, F. R.; Maidment, D. R.; David, C. H.
2014-12-01
Through application of computational methods and an integrated information system, data and river modeling services can help researchers and decision makers more rapidly understand river conditions under alternative scenarios. To enable this capability, workflows (i.e., analysis and model steps) are created and published as Web services delivered through an internet browser, including model inputs, a published workflow service, and visualized outputs. The RAPID model, which is a river routing model developed at University of Texas Austin for parallel computation of river discharge, has been implemented as a workflow and published as a Web application. This allows non-technical users to remotely execute the model and visualize results as a service through a simple Web interface. The model service and Web application has been prototyped in the San Antonio and Guadalupe River Basin in Texas, with input from university and agency partners. In the future, optimization model workflows will be developed to link with the RAPID model workflow to provide real-time water allocation decision support services.
NASA Astrophysics Data System (ADS)
Broderick, Scott R.; Santhanam, Ganesh Ram; Rajan, Krishna
2016-08-01
As the size of databases has significantly increased, whether through high throughput computation or through informatics-based modeling, the challenge of selecting the optimal material for specific design requirements has also arisen. Given the multiple, and often conflicting, design requirements, this selection process is not as trivial as sorting the database for a given property value. We suggest that the materials selection process should minimize selector bias, as well as take data uncertainty into account. For this reason, we discuss and apply decision theory for identifying chemical additions to Ni-base alloys. We demonstrate and compare results for both a computational array of chemistries and standard commercial superalloys. We demonstrate how we can use decision theory to select the best chemical additions for enhancing both property and processing, which would not otherwise be easily identifiable. This work is one of the first examples of introducing the mathematical framework of set theory and decision analysis into the domain of the materials selection process.
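One simple way to make multi-criterion selection concrete, under stated assumptions, is a Pareto (non-dominated) filter: discard any candidate beaten on every criterion by another. The alloy additions and scores below are invented; the paper's decision-theoretic treatment also handles preference structure and data uncertainty, which this sketch does not.

```python
# Pareto filter: keep candidates not dominated on all criteria
# (here: maximize a property score and a processability score).
candidates = {
    "Ni+Ta": (1.20, 0.70),
    "Ni+Re": (1.35, 0.40),
    "Ni+Cr": (0.95, 0.90),
    "Ni+Mo": (1.10, 0.55),   # dominated by Ni+Ta on both criteria
}

def dominates(a, b):
    return all(x >= y for x, y in zip(a, b)) and a != b

pareto = [name for name, score in candidates.items()
          if not any(dominates(other, score)
                     for other in candidates.values())]
print("non-dominated additions:", pareto)
```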
Collaborative Mining and Interpretation of Large-Scale Data for Biomedical Research Insights
Tsiliki, Georgia; Karacapilidis, Nikos; Christodoulou, Spyros; Tzagarakis, Manolis
2014-01-01
Biomedical research becomes increasingly interdisciplinary and collaborative in nature. Researchers need to efficiently and effectively collaborate and make decisions by meaningfully assembling, mining and analyzing available large-scale volumes of complex multi-faceted data residing in different sources. In line with related research directives revealing that, in spite of the recent advances in data mining and computational analysis, humans can easily detect patterns which computer algorithms may have difficulty in finding, this paper reports on the practical use of an innovative web-based collaboration support platform in a biomedical research context. Arguing that dealing with data-intensive and cognitively complex settings is not a technical problem alone, the proposed platform adopts a hybrid approach that builds on the synergy between machine and human intelligence to facilitate the underlying sense-making and decision making processes. User experience shows that the platform enables more informed and quicker decisions, by displaying the aggregated information according to their needs, while also exploiting the associated human intelligence. PMID:25268270
Insect pest management for raw commodities during storage
USDA-ARS?s Scientific Manuscript database
This book chapter provides an overview of the pest management decision-making process during grain storage. An in-depth discussion of sampling methods, cost-benefit analysis, expert systems, consultants and the use of computer simulation models is provided. Sampling is essential to determine if pest...
A computational engine for bringing environmental consequence analysis into aviation decision-making
DOT National Transportation Integrated Search
2010-04-21
This presentation looks at the methods for ambient masking of non-natural sounds. The masking of sounds is most effective when the masker spectrum overlaps the signal spectrum; more likely to occur if the masker is broadband in nature. Land vehicles ...
Eye-gaze control of the computer interface: Discrimination of zoom intent
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goldberg, J.H.; Schryver, J.C.
1993-10-01
An analysis methodology and associated experiment were developed to assess whether definable and repeatable signatures of eye-gaze characteristics are evident, preceding a decision to zoom-in, zoom-out, or not to zoom at a computer interface. This user intent discrimination procedure can have broad application in disability aids and telerobotic control. Eye-gaze was collected from 10 subjects in a controlled experiment, requiring zoom decisions. The eye-gaze data were clustered, then fed into a multiple discriminant analysis (MDA) for optimal definition of heuristics separating the zoom-in, zoom-out, and no-zoom conditions. Confusion matrix analyses showed that a number of variable combinations classified at a statistically significant level, but practical significance was more difficult to establish. Composite contour plots demonstrated the regions in parameter space consistently assigned by the MDA to unique zoom conditions. Peak classification occurred at about 1200-1600 msec. Improvements in the methodology to achieve practical real-time zoom control are considered.
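The discrimination step can be sketched with linear discriminant analysis over clustered gaze features. The feature choices and synthetic values below are stand-ins for the study's gaze data, not its actual variables.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(1)
# Assumed features per fixation cluster: (dwell time ms, dispersion px)
zoom_in  = rng.normal([1400, 20], [200, 5], size=(50, 2))
zoom_out = rng.normal([1200, 60], [200, 10], size=(50, 2))
no_zoom  = rng.normal([600, 40], [150, 10], size=(50, 2))

X = np.vstack([zoom_in, zoom_out, no_zoom])
y = np.array([0] * 50 + [1] * 50 + [2] * 50)   # 0=in, 1=out, 2=none

lda = LinearDiscriminantAnalysis().fit(X, y)
print(lda.predict([[1450.0, 18.0]]))           # expect class 0 (zoom-in)
```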
Computer models for economic and silvicultural decisions
Rosalie J. Ingram
1989-01-01
Computer systems can help simplify decision-making to manage forest ecosystems. We now have computer models to help make forest management decisions by predicting changes associated with a particular management action. Models also help you evaluate alternatives. To be effective, the computer models must be reliable and appropriate for your situation.
Gunay, Osman; Toreyin, Behçet Ugur; Kose, Kivanc; Cetin, A Enis
2012-05-01
In this paper, an entropy-functional-based online adaptive decision fusion (EADF) framework is developed for image analysis and computer vision applications. In this framework, it is assumed that the compound algorithm consists of several subalgorithms, each of which yields its own decision as a real number centered around zero, representing the confidence level of that particular subalgorithm. Decision values are linearly combined with weights that are updated online according to an active fusion method based on performing entropic projections onto convex sets describing subalgorithms. It is assumed that there is an oracle, who is usually a human operator, providing feedback to the decision fusion method. A video-based wildfire detection system was developed to evaluate the performance of the decision fusion algorithm. In this case, image data arrive sequentially, and the oracle is the security guard of the forest lookout tower, verifying the decision of the combined algorithm. The simulation results are presented.
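A much-simplified version of online decision fusion is sketched below: per-algorithm confidences are linearly combined, and the weights are nudged toward the oracle's feedback and renormalized. This is a gradient-style update for illustration only, not the paper's entropic projections onto convex sets.

```python
import numpy as np

def fusion_update(weights, decisions, oracle, lr=0.2):
    """Simplified online fusion: combine subalgorithm confidences
    (real numbers around zero), then adapt weights toward the
    oracle's feedback and renormalize."""
    fused = float(np.dot(weights, decisions))
    error = oracle - fused
    weights = weights + lr * error * np.asarray(decisions)
    return weights / np.abs(weights).sum(), fused

w = np.array([0.34, 0.33, 0.33])                 # three subalgorithms
for d, y in [([0.8, -0.2, 0.1], 1.0), ([0.6, -0.5, 0.0], 1.0)]:
    w, fused = fusion_update(w, d, y)            # y: oracle says "fire"
    print(f"fused={fused:+.2f}, weights={np.round(w, 2)}")
```

Subalgorithms that agree with the oracle gain weight over time, which is the core behaviour the wildfire-detection evaluation exercises.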
Prescott, Jeffrey William
2013-02-01
The importance of medical imaging for clinical decision making has been steadily increasing over the last four decades. Recently, there has also been an emphasis on medical imaging for preclinical decision making, i.e., for use in pharmaceutical and medical device development. There is also a drive towards quantification of imaging findings by using quantitative imaging biomarkers, which can improve sensitivity, specificity, accuracy and reproducibility of imaged characteristics used for diagnostic and therapeutic decisions. An important component of the discovery, characterization, validation and application of quantitative imaging biomarkers is the extraction of information and meaning from images through image processing and subsequent analysis. However, many advanced image processing and analysis methods are not applied directly to questions of clinical interest, i.e., for diagnostic and therapeutic decision making, which is a consideration that should be closely linked to the development of such algorithms. This article is meant to address these concerns. First, quantitative imaging biomarkers are introduced by providing definitions and concepts. Then, potential applications of advanced image processing and analysis to areas of quantitative imaging biomarker research are described; specifically, research into osteoarthritis (OA), Alzheimer's disease (AD) and cancer is presented. Then, challenges in quantitative imaging biomarker research are discussed. Finally, a conceptual framework for integrating clinical and preclinical considerations into the development of quantitative imaging biomarkers and their computer-assisted methods of extraction is presented.
Computer-aided decision support systems for endoscopy in the gastrointestinal tract: a review.
Liedlgruber, Michael; Uhl, Andreas
2011-01-01
Today, medical endoscopy is a widely used procedure to inspect the inner cavities of the human body. The advent of endoscopic imaging techniques, allowing the acquisition of images or videos, created the possibility for the development of a whole new branch of computer-aided decision support systems. Such systems aim at helping physicians to identify possibly malignant abnormalities more accurately. At the beginning of this paper, we give a brief introduction to the history of endoscopy, followed by introducing the main types of endoscopes which have emerged so far (flexible endoscope, wireless capsule endoscope, and confocal laser endomicroscope). We then give a brief introduction to computer-aided decision support systems specifically targeted at endoscopy in the gastrointestinal tract. Then we present general facts and figures concerning computer-aided decision support systems and summarize work specifically targeted at computer-aided decision support in the gastrointestinal tract. This summary is followed by a discussion of some common issues concerning the approaches reviewed and suggestions of possible ways to resolve them.
Real-time emergency forecasting technique for situation management systems
NASA Astrophysics Data System (ADS)
Kopytov, V. V.; Kharechkin, P. V.; Naumenko, V. V.; Tretyak, R. S.; Tebueva, F. B.
2018-05-01
The article describes a real-time emergency forecasting technique that increases the accuracy and reliability of the forecasting results of any emergency computational model applied for decision making in situation management systems. The computational models are improved by the Improved Brown's method, which applies fractal dimension to forecast short time series received from sensors and control systems. The reliability of the emergency forecasting results is ensured by filtering invalid sensed data using methods of correlation analysis.
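Classic Brown's double exponential smoothing, which the article builds on, is a few lines of code. In this sketch the smoothing constant alpha is a fixed assumption; the article instead derives it from the fractal dimension of the series, which is not reproduced here.

```python
def brown_forecast(series, alpha=0.4, horizon=1):
    """Brown's double exponential smoothing; alpha would normally be
    tuned (the article ties it to the series' fractal dimension)."""
    s1 = s2 = series[0]
    for x in series[1:]:
        s1 = alpha * x + (1 - alpha) * s1   # first smoothing
        s2 = alpha * s1 + (1 - alpha) * s2  # second smoothing
    level = 2 * s1 - s2
    trend = alpha / (1 - alpha) * (s1 - s2)
    return level + horizon * trend

readings = [10.0, 10.4, 10.9, 11.7, 12.6, 13.8]  # illustrative sensor data
print(f"next-step forecast: {brown_forecast(readings):.2f}")
```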
A Computer Simulation of Organizational Decision-Making.
1979-12-01
future research into one class of manpower models. In choosing the voting scenario I was more interested in the long-term process of political socialization, rather than the prediction of the outcome of a particular election. Successive elections are like successive learning trials. The analysis did
The Watershed Health Assessment Tools-Investigating Fisheries (WHAT-IF) is a decision-analysis modeling toolkit for personal computers that supports watershed and fisheries management. The WHAT-IF toolkit includes a relational database, help-system functions and documentation, a...
Robust MOE Detector for DS-CDMA Systems with Signature Waveform Mismatch
NASA Astrophysics Data System (ADS)
Lin, Tsui-Tsai
In this letter, a decision-directed MOE detector with excellent robustness against signature waveform mismatch is proposed for DS-CDMA systems. Both the theoretical analysis and computer simulation results demonstrate that the proposed detector provides better SINR performance than conventional detectors.
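For orientation, a textbook minimum-output-energy receiver constrains unit response to the nominal signature while minimizing total output energy, w = R⁻¹s / (sᴴR⁻¹s). The sketch below is this baseline form with invented signatures, not the letter's decision-directed, mismatch-robust variant.

```python
import numpy as np

rng = np.random.default_rng(2)
N = 16                                              # processing gain
s = rng.choice([-1.0, 1.0], size=N) / np.sqrt(N)    # desired signature
i = rng.choice([-1.0, 1.0], size=N) / np.sqrt(N)    # interferer signature

# Received-signal covariance: desired user + strong interferer + noise.
R = np.outer(s, s) + 10.0 * np.outer(i, i) + 0.1 * np.eye(N)

w = np.linalg.solve(R, s)       # w proportional to R^-1 s
w /= s @ w                      # enforce unit response: w^T s = 1

print(f"residual interferer gain: {abs(w @ i)**2:.4f}")
```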
Chi, Chia-Fen; Tseng, Li-Kai; Jang, Yuh
2012-07-01
Many disabled individuals lack extensive knowledge about assistive technology, which could help them use computers. In 1997, Denis Anson developed a decision tree of 49 evaluative questions designed to evaluate the functional capabilities of a disabled user and choose, from a selection of 26, an appropriate combination of assistive devices that enables the individual to use a computer. In general, occupational therapists guide disabled users through this process, and they often have to go over repetitive questions in order to find an appropriate device. A disabled user may require an alphanumeric entry device, a pointing device, an output device, a performance enhancement device, or some combination of these. Therefore, the current research eliminates redundant questions and divides Anson's decision tree into multiple independent subtrees to meet the actual demands of computer users with disabilities. The modified decision tree was tested by six disabled users to verify that it can determine a complete set of assistive devices with a smaller number of evaluative questions. A means to insert new categories of computer-related assistive devices was included to ensure the decision tree can be expanded and updated. The current decision tree can help disabled users and assistive technology practitioners find appropriate computer-related assistive devices that meet clients' individual needs in an efficient manner.
Ideal AFROC and FROC observers.
Khurd, Parmeshwar; Liu, Bin; Gindi, Gene
2010-02-01
Detection of multiple lesions in images is a medically important task, and free-response receiver operating characteristic (FROC) analysis and its variants, such as alternative FROC (AFROC) analysis, are commonly used to quantify performance in such tasks. However, ideal observers that optimize FROC or AFROC performance metrics have not yet been formulated in the general case. If available, such ideal observers may turn out to be valuable for imaging system optimization and in the design of computer-aided diagnosis techniques for lesion detection in medical images. In this paper, we derive ideal AFROC and FROC observers. They are ideal in that they maximize, amongst all decision strategies, the area, or any partial area, under the associated AFROC or FROC curve. Calculation of observer performance for these ideal observers is computationally quite complex. We can reduce this complexity by considering forms of these observers that use false positive reports derived from signal-absent images only. We also consider a Bayes risk analysis for the multiple-signal detection task with an appropriate definition of costs. A general decision strategy that minimizes Bayes risk is derived. With particular cost constraints, this general decision strategy reduces to the decision strategy associated with the ideal AFROC or FROC observer.
Multicriteria decision model for retrofitting existing buildings
NASA Astrophysics Data System (ADS)
Bostenaru Dan, B.
2003-04-01
In this paper a model for deciding which buildings in an urban area should be retrofitted is presented. The model has been positioned among existing ones by choosing the decision rule, criterion-weighting and decision-support-system types most suitable for the spatial problem of reducing earthquake risk in urban areas, considering existing spatial multiattributive and multiobjective decision methods and especially collaborative issues. Due to the participative character of the group decision problem "retrofitting existing buildings", the decision-making model is based on interactivity. Buildings have been modelled following the criteria of spatial decision support systems. This includes identifying the corresponding spatial elements of buildings according to the information needs of actors from different spheres, such as architects, construction engineers and economists. The decision model aims to facilitate collaboration between these actors. The way of setting priorities interactively will be shown by detailing the two phases, judgemental and computational: in this case, site analysis, collection and evaluation of the unmodified data, and conversion of survey data to information with computational methods using additional expert support. Buildings have been divided into spatial elements which are characteristic for the survey, present typical damage in case of an earthquake, and are decisive for better seismic behaviour in case of retrofitting. The paper describes the architectural and engineering characteristics as well as the structural damage for constructions of different building ages, using the example of building types in Bucharest, Romania, in comprehensible and interdependent charts, based on field observation, reports from the 1977 earthquake, and detailed studies made by the author together with a local engineer for the EERI Web Housing Encyclopedia. On this basis, criteria for setting priorities flow into the expert information contained in the system.
epiDMS: Data Management and Analytics for Decision-Making From Epidemic Spread Simulation Ensembles.
Liu, Sicong; Poccia, Silvestro; Candan, K Selçuk; Chowell, Gerardo; Sapino, Maria Luisa
2016-12-01
Carefully calibrated large-scale computational models of epidemic spread represent a powerful tool to support the decision-making process during epidemic emergencies. Epidemic models are being increasingly used for generating forecasts of the spatial-temporal progression of epidemics at different spatial scales and for assessing the likely impact of different intervention strategies. However, the management and analysis of simulation ensembles stemming from large-scale computational models pose challenges, particularly when dealing with multiple interdependent parameters, spanning multiple layers and geospatial frames, affected by complex dynamic processes operating at different resolutions. We describe and illustrate with examples a novel epidemic simulation data management system, epiDMS, that was developed to address the challenges that arise from the need to generate, search, visualize, and analyze, in a scalable manner, large volumes of epidemic simulation ensembles and observations during the progression of an epidemic. epiDMS is a publicly available system that facilitates management and analysis of large epidemic simulation ensembles. epiDMS aims to fill an important hole in decision-making during healthcare emergencies by enabling critical services with significant economic and health impact. © The Author 2016. Published by Oxford University Press for the Infectious Diseases Society of America. All rights reserved. For permissions, e-mail journals.permissions@oup.com.
A Decision Analysis Tool for Climate Impacts, Adaptations, and Vulnerabilities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Omitaomu, Olufemi A; Parish, Esther S; Nugent, Philip J
Climate change related extreme events (such as flooding, storms, and drought) are already impacting millions of people globally at a cost of billions of dollars annually. Hence, there are urgent needs for urban areas to develop adaptation strategies that will alleviate the impacts of these extreme events. However, lack of appropriate decision support tools that match local applications is limiting local planning efforts. In this paper, we present a quantitative analysis and optimization system with customized decision support modules built on a geographic information system (GIS) platform to bridge this gap. This platform is called the Urban Climate Adaptation Tool (Urban-CAT). For all Urban-CAT models, we divide a city into a grid with tens of thousands of cells; then compute a list of metrics for each cell from the GIS data. These metrics are used as independent variables to predict climate impacts, compute vulnerability scores, and evaluate adaptation options. Overall, the Urban-CAT system has three layers: a data layer (containing spatial data, socio-economic and environmental data, and analytic data), a middle layer (handling data processing, model management, and GIS operations), and an application layer (providing climate impact forecasts, adaptation optimization, and site evaluation). The Urban-CAT platform can guide city and county governments in identifying and planning for effective climate change adaptation strategies.
Determining the optimal forensic DNA analysis procedure following investigation of sample quality.
Hedell, Ronny; Hedman, Johannes; Mostad, Petter
2018-07-01
Crime scene traces of various types are routinely sent to forensic laboratories for analysis, generally with the aim of addressing questions about the source of the trace. The laboratory may choose to analyse the samples in different ways depending on the type and quality of the sample, the importance of the case and the cost and performance of the available analysis methods. Theoretically well-founded guidelines for the choice of analysis method are, however, lacking in most situations. In this paper, it is shown how such guidelines can be created using Bayesian decision theory. The theory is applied to forensic DNA analysis, showing how the information from the initial qPCR analysis can be utilized. It is assumed the alternatives for analysis are using a standard short tandem repeat (STR) DNA analysis assay, using the standard assay and a complementary assay, or the analysis may be cancelled following quantification. The decision is based on information about the DNA amount and level of DNA degradation of the forensic sample, as well as case circumstances and the cost for analysis. Semi-continuous electropherogram models are used for simulation of DNA profiles and for computation of likelihood ratios. It is shown how tables and graphs, prepared beforehand, can be used to quickly find the optimal decision in forensic casework.
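The core of such a decision rule can be sketched as an expected-utility comparison among the three options; all probabilities, values, and costs below are hypothetical placeholders rather than figures from the paper.

```python
# Minimal expected-utility sketch of the decision described above: choose among
# cancelling, a standard STR assay, or standard + complementary assays, given a
# qPCR-informed probability of obtaining a usable profile. All numbers are
# hypothetical placeholders, not values from the paper.

def expected_utility(p_profile: float, value_profile: float, cost: float) -> float:
    return p_profile * value_profile - cost

def choose_analysis(p_std: float, p_both: float,
                    value_profile: float = 100.0,
                    cost_std: float = 10.0, cost_both: float = 25.0) -> str:
    options = {
        "cancel": 0.0,
        "standard STR": expected_utility(p_std, value_profile, cost_std),
        "standard + complementary": expected_utility(p_both, value_profile, cost_both),
    }
    return max(options, key=options.get)

# Degraded, low-template sample: the complementary assay may pay off.
print(choose_analysis(p_std=0.2, p_both=0.5))  # -> 'standard + complementary'
```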
Menychtas, Andreas; Tsanakas, Panayiotis; Maglogiannis, Ilias
2016-03-01
The proper acquisition of biosignals data from various biosensor devices and their remote accessibility are still issues that prevent the wide adoption of point-of-care systems in the routine of monitoring chronic patients. This Letter presents an advanced framework for enabling patient monitoring that utilises a cloud computing infrastructure for data management and analysis. The framework introduces also a local mechanism for uniform biosignals collection from wearables and biosignal sensors, and decision support modules, in order to enable prompt and essential decisions. A prototype smartphone application and the related cloud modules have been implemented for demonstrating the value of the proposed framework. Initial results regarding the performance of the system and the effectiveness in data management and decision-making have been quite encouraging.
Structural analysis at aircraft conceptual design stage
NASA Astrophysics Data System (ADS)
Mansouri, Reza
In the past 50 years, computers have augmented human efforts at a tremendous pace, and the aircraft industry is no exception. The industry is more than ever dependent on computing because of its high level of complexity and the increasing need for excellence to survive a highly competitive marketplace. Designers choose computers to perform almost every analysis task. But in doing so, effective, accurate, and easy-to-use classical analytical methods are often forgotten, even though they can be very useful, especially in the early phases of aircraft design, where concept generation and evaluation demand physical visibility of design parameters to make decisions [39, 2004]. Structural analysis methods have been used since the earliest civilizations. Centuries before computers were invented, the pyramids were designed and constructed by the Egyptians around 2000 B.C., the Parthenon was built by the Greeks, and Dujiangyan was built by the Chinese around 240 B.C. Persepolis, Hagia Sophia, the Taj Mahal, and the Eiffel Tower are only a few more examples of historical buildings, bridges, and monuments constructed before any advances were made in computer-aided engineering. The aircraft industry is no exception either: in the first half of the 20th century, engineers used classical methods to design civil transport aircraft such as the Ford Tri-Motor (1926), Lockheed Vega (1927), Lockheed 9 Orion (1931), Douglas DC-3 (1935), Douglas DC-4/C-54 Skymaster (1938), Boeing 307 (1938), and Boeing 314 Clipper (1939), which became airborne without difficulty. While advanced numerical methods such as finite element analysis are among the most effective structural analysis methods, classical structural analysis methods can be just as useful, especially during the early phase of fixed-wing aircraft design, where major decisions are made and concept generation and evaluation demand physical visibility of design parameters. Considering the strengths and limitations of both methodologies, the questions to be answered in this thesis are: How valuable and compatible are the classical analytical methods in today's conceptual design environment? And can these methods complement each other? To answer these questions, this thesis investigates the pros and cons of classical analytical structural analysis methods during the conceptual design stage through the following objectives: illustrate the structural design methodology of these methods within the framework of the Aerospace Vehicle Design (AVD) lab's design lifecycle, and demonstrate the effectiveness of the moment distribution method through four case studies, considering and evaluating the strengths and limitations of these methods. In order to objectively quantify the limitations and capabilities of the analytical method at the conceptual design stage, each case study becomes more complex than the one before.
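As a flavor of the moment distribution method evaluated in the thesis, here is a minimal sketch for a two-span continuous beam; the loads, lengths, and sign convention are assumed for illustration and are not taken from the case studies.

```python
# A minimal moment-distribution (Hardy Cross) sketch for a two-span continuous
# beam A-B-C: both spans of length L with equal EI, far ends A and C fixed, and
# a uniform load w on span AB only. Values and the sign convention (clockwise
# end moments positive) are illustrative assumptions.

w, L = 10.0, 4.0                        # load (kN/m) and span length (m), assumed
fem = {                                 # fixed-end moments (kN*m)
    ("A", "B"): -w * L**2 / 12,
    ("B", "A"): +w * L**2 / 12,
    ("B", "C"): 0.0,
    ("C", "B"): 0.0,
}
df = {("B", "A"): 0.5, ("B", "C"): 0.5}  # equal-stiffness distribution factors at B

for _ in range(10):                      # distribute and carry over until balanced
    unbalanced = fem[("B", "A")] + fem[("B", "C")]
    if abs(unbalanced) < 1e-9:
        break
    for near, far in [("B", "A"), ("B", "C")]:
        correction = -df[(near, far)] * unbalanced
        fem[(near, far)] += correction           # balance joint B
        fem[(far, near)] += correction / 2       # carry over half to the fixed end

print({k: round(v, 2) for k, v in fem.items()})
# For these assumed values, M_BA and M_BC balance to +/-6.67 kN*m.
```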
Decision Accuracy in Computer-Mediated versus Face-to-Face Decision-Making Teams.
Hedlund; Ilgen; Hollenbeck
1998-10-01
Changes in the way organizations are structured and advances in communication technologies are two factors that have altered the conditions under which group decisions are made. Decisions are increasingly made by teams that have a hierarchical structure and whose members have different areas of expertise. In addition, many decisions are no longer made via strictly face-to-face interaction. The present study examines the effects of two modes of communication, face-to-face (FtF) or computer-mediated (CM), on the accuracy of teams' decisions. The teams are characterized by a hierarchical structure and their members differ in expertise, consistent with the framework outlined in the Multilevel Theory of team decision making presented by Hollenbeck, Ilgen, Sego, Hedlund, Major, and Phillips (1995). Sixty-four four-person teams worked for 3 h on a computer simulation, interacting either face-to-face or over a computer network. The communication mode had mixed effects on team processes: members of FtF teams were better informed and made recommendations that were more predictive of the correct team decision, but leaders of CM teams were better able to differentiate staff members on the quality of their decisions. Controlling for the negative impact of FtF communication on staff member differentiation increased the beneficial effect of the FtF mode on overall decision-making accuracy. Copyright 1998 Academic Press.
Norman, Luke J; Carlisi, Christina O; Christakou, Anastasia; Murphy, Clodagh M; Chantiluke, Kaylita; Giampietro, Vincent; Simmons, Andrew; Brammer, Michael; Mataix-Cols, David; Rubia, Katya
2018-03-24
The aim of the current paper is to provide the first comparison of computational mechanisms and neurofunctional substrates in adolescents with attention-deficit/hyperactivity disorder (ADHD) and adolescents with obsessive-compulsive disorder (OCD) during decision making under ambiguity. Sixteen boys with ADHD, 20 boys with OCD, and 20 matched control subjects (12-18 years of age) completed a functional magnetic resonance imaging version of the Iowa Gambling Task. Brain activation was compared between groups using three-way analysis of covariance. Hierarchical Bayesian analysis was used to compare computational modeling parameters between groups. Patient groups shared reduced choice consistency and relied less on reinforcement learning during decision making relative to control subjects, while adolescents with ADHD alone demonstrated increased reward sensitivity. During advantageous choices, both disorders shared underactivation in ventral striatum, while OCD patients showed disorder-specific underactivation in the ventromedial orbitofrontal cortex. During outcome evaluation, shared underactivation to losses in patients relative to control subjects was found in the medial prefrontal cortex and shared underactivation to wins was found in the left putamen/caudate. ADHD boys showed disorder-specific dysfunction in the right putamen/caudate, which was activated more to losses in patients with ADHD but more to wins in control subjects. The findings suggest shared deficits in using learned reward expectancies to guide decision making, as well as shared dysfunction in medio-fronto-striato-limbic brain regions. However, findings of unique dysfunction in the ventromedial orbitofrontal cortex in OCD and in the right putamen in ADHD indicate additional, disorder-specific abnormalities and extend similar findings from inhibitory control tasks in the disorders to the domain of decision making under ambiguity. Copyright © 2018 Society of Biological Psychiatry. Published by Elsevier Inc. All rights reserved.
Heuristic and optimal policy computations in the human brain during sequential decision-making.
Korn, Christoph W; Bach, Dominik R
2018-01-23
Optimal decisions across extended time horizons require value calculations over multiple probabilistic future states. Humans may circumvent such complex computations by resorting to easy-to-compute heuristics that approximate optimal solutions. To probe the potential interplay between heuristic and optimal computations, we develop a novel sequential decision-making task, framed as virtual foraging in which participants have to avoid virtual starvation. Rewards depend only on final outcomes over five-trial blocks, necessitating planning over five sequential decisions and probabilistic outcomes. Here, we report model comparisons demonstrating that participants primarily rely on the best available heuristic but also use the normatively optimal policy. FMRI signals in medial prefrontal cortex (MPFC) relate to heuristic and optimal policies and associated choice uncertainties. Crucially, reaction times and dorsal MPFC activity scale with discrepancies between heuristic and optimal policies. Thus, sequential decision-making in humans may emerge from integration between heuristic and optimal policies, implemented by controllers in MPFC.
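The optimal-policy computation for such a task can be sketched with finite-horizon backward induction; the energy states, actions, and outcome probabilities below are assumptions for illustration, not the study's task parameters.

```python
# A small backward-induction sketch of the optimal-policy computation in a
# five-trial foraging task: energy evolves stochastically, and the reward
# depends only on surviving (energy > 0) after five choices. States, actions,
# and probabilities are assumptions for illustration.
from functools import lru_cache

GAIN = {                                  # action -> [(prob, energy change)]
    "safe": [(0.8, 1), (0.2, -1)],
    "risky": [(0.5, 3), (0.5, -2)],
}
HORIZON, MAX_E = 5, 10

@lru_cache(maxsize=None)
def value(energy: int, t: int) -> float:
    """Probability of ending the block alive under the optimal policy."""
    if energy <= 0:
        return 0.0                        # virtual starvation: block failed
    if t == HORIZON:
        return 1.0                        # survived all five decisions
    return max(
        sum(p * value(min(energy + d, MAX_E), t + 1) for p, d in GAIN[a])
        for a in GAIN
    )

print(value(2, 0))  # optimal survival probability from a low-energy start
```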
Graphics; For Regional Policy Making, a Preliminary Study.
ERIC Educational Resources Information Center
Ewald, William R., Jr.
The use of graphics (maps, charts, diagrams, renderings, photographs) for regional policy formulation and decision making is discussed at length. The report identifies the capabilities of a number of tools for analysis/synthesis/communication, especially computer assisted graphics to assist in community self-education and the management of change.…
Probability, Problem Solving, and "The Price is Right."
ERIC Educational Resources Information Center
Wood, Eric
1992-01-01
This article discusses the analysis of a decision-making process faced by contestants on the television game show "The Price is Right". The included analyses of the original and related problems concern pattern searching, inductive reasoning, quadratic functions, and graphing. Computer simulation programs in BASIC and tables of…
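In the same spirit as the article's BASIC simulations, a hedged Monte Carlo sketch (here in Python) can estimate a spin-again threshold for a Showcase-Showdown-style wheel game; the exact game and objective analyzed in the article may differ.

```python
# A hedged Monte Carlo sketch in the spirit of the article: when should a
# contestant spin again on a wheel game where exceeding 100 busts? The game
# rules and the expected-score objective are simplifying assumptions.
import random

def final_score(threshold: int) -> int:
    total = random.randrange(5, 105, 5)        # first spin: 5, 10, ..., 100
    if total < threshold:
        total += random.randrange(5, 105, 5)   # spin again
    return total if total <= 100 else 0        # busting scores zero

def mean_score(threshold: int, trials: int = 100_000) -> float:
    return sum(final_score(threshold) for _ in range(trials)) / trials

best = max(range(5, 105, 5), key=mean_score)
print(best)  # lands near 45 under this toy expected-score objective
```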
Developing and Assessing E-Learning Techniques for Teaching Forecasting
ERIC Educational Resources Information Center
Gel, Yulia R.; O'Hara Hines, R. Jeanette; Chen, He; Noguchi, Kimihiro; Schoner, Vivian
2014-01-01
In the modern business environment, managers are increasingly required to perform decision making and evaluate related risks based on quantitative information in the face of uncertainty, which in turn increases demand for business professionals with sound skills and hands-on experience with statistical data analysis. Computer-based training…
Models-3 is a flexible system designed to simplify the development and use of air quality models and other environmental decision support tools. It is designed for applications ranging from regulatory and policy analysis to understanding the complex interactions of atmospheric...
MATREX: A Unifying Modeling and Simulation Architecture for Live-Virtual-Constructive Applications
2007-05-23
A Decision Support System for Energy Policy Analysis.
1980-07-01
new realities or hypothesized realities to the modeling system. Lack of a PDL would make the system inflexible and accessible only to a patient ... expert. Certainly, given the present ratio of costs of personnel to costs of computers, the alternative of presenting data in its raw form is acceptable
A queueing model of pilot decision making in a multi-task flight management situation
NASA Technical Reports Server (NTRS)
Walden, R. S.; Rouse, W. B.
1977-01-01
Allocation of decision making responsibility between pilot and computer is considered and a flight management task, designed for the study of pilot-computer interaction, is discussed. A queueing theory model of pilot decision making in this multi-task, control and monitoring situation is presented. An experimental investigation of pilot decision making and the resulting model parameters are discussed.
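Treating the pilot as a single server gives the simplest queueing-theory reading of such a model; the M/M/1 formulas below are the textbook special case rather than the paper's richer multi-task model, and the rates are hypothetical.

```python
# Illustrative M/M/1 computations for a pilot-as-server view of multi-task
# flight management: events arrive at rate lam and the pilot "serves" (decides
# on) them at rate mu. The original model is richer (multiple task classes,
# control and monitoring); this is only the textbook special case.

def mm1_metrics(lam: float, mu: float) -> dict:
    assert lam < mu, "pilot is overloaded; the queue grows without bound"
    rho = lam / mu                        # pilot utilization
    L = rho / (1 - rho)                   # mean number of tasks pending
    W = 1 / (mu - lam)                    # mean time a task spends in the system
    return {"utilization": rho, "mean_tasks": L, "mean_time_in_system": W}

print(mm1_metrics(lam=0.5, mu=0.8))       # tasks/sec, hypothetical rates
```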
Design and implementation of spatial knowledge grid for integrated spatial analysis
NASA Astrophysics Data System (ADS)
Liu, Xiangnan; Guan, Li; Wang, Ping
2006-10-01
Supported by spatial information grid(SIG), the spatial knowledge grid (SKG) for integrated spatial analysis utilizes the middleware technology in constructing the spatial information grid computation environment and spatial information service system, develops spatial entity oriented spatial data organization technology, carries out the profound computation of the spatial structure and spatial process pattern on the basis of Grid GIS infrastructure, spatial data grid and spatial information grid (specialized definition). At the same time, it realizes the complex spatial pattern expression and the spatial function process simulation by taking the spatial intelligent agent as the core to establish space initiative computation. Moreover through the establishment of virtual geographical environment with man-machine interactivity and blending, complex spatial modeling, network cooperation work and spatial community decision knowledge driven are achieved. The framework of SKG is discussed systematically in this paper. Its implement flow and the key technology with examples of overlay analysis are proposed as well.
Effects of automation of information-processing functions on teamwork.
Wright, Melanie C; Kaber, David B
2005-01-01
We investigated the effects of automation as applied to different stages of information processing on team performance in a complex decision-making task. Forty teams of 2 individuals performed a simulated Theater Defense Task. Four automation conditions were simulated with computer assistance applied to realistic combinations of information acquisition, information analysis, and decision selection functions across two levels of task difficulty. Multiple measures of team effectiveness and team coordination were used. Results indicated different forms of automation have different effects on teamwork. Compared with a baseline condition, an increase in automation of information acquisition led to an increase in the ratio of information transferred to information requested; an increase in automation of information analysis resulted in higher team coordination ratings; and automation of decision selection led to better team effectiveness under low levels of task difficulty but at the cost of higher workload. The results support the use of early and intermediate forms of automation related to acquisition and analysis of information in the design of team tasks. Decision-making automation may provide benefits in more limited contexts. Applications of this research include the design and evaluation of automation in team environments.
Decision problems in management of construction projects
NASA Astrophysics Data System (ADS)
Szafranko, E.
2017-10-01
In a construction business, one must often make decisions at all stages of the building process, from planning a new construction project through its execution to the use of the finished structure. As a rule, the decision-making process is complicated by conditions specific to civil engineering. Given such diverse decision situations, it is advisable to apply various decision-making support methods. Both the literature and hands-on experience suggest several methods based on analytical and computational procedures, some less and some more complex. This article presents methods which can be helpful in supporting decision-making processes in the management of civil engineering projects: multi-criteria methods such as MCE and AHP, and indicator methods. Because the methods have different advantages and disadvantages, and decision situations have their own specific nature, a brief summary of the methods, alongside some recommendations regarding their practical application, is given at the end of the paper. The main aim of this article is to review methods of decision support and analyse their possible use in the construction industry.
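One of the methods mentioned, AHP, derives criterion weights from a pairwise-comparison matrix; a minimal sketch follows, with an invented matrix for three hypothetical construction-management criteria.

```python
# A small AHP sketch for the kind of construction-management choice described
# above: derive criterion weights from a Saaty-style pairwise-comparison matrix
# via its principal eigenvector. The matrix entries are invented for illustration.
import numpy as np

# Reciprocal comparison matrix for three criteria, e.g. cost, time, safety.
A = np.array([
    [1.0, 3.0, 0.50],
    [1/3, 1.0, 0.25],
    [2.0, 4.0, 1.00],
])

eigvals, eigvecs = np.linalg.eig(A)
principal = eigvecs[:, np.argmax(eigvals.real)].real
weights = principal / principal.sum()     # normalized priority vector
print(weights.round(3))                   # e.g. the third criterion dominates here
```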
Decision Making about Computer Acquisition and Use in American Schools.
ERIC Educational Resources Information Center
Becker, Henry Jay
1993-01-01
Discusses the centralization and decentralization of decision making about computer use in elementary and secondary schools based on results of a 1989 national survey. Results unexpectedly indicate that more successful programs are the result of districtwide planning than individual teacher or school-level decision making. (LRW)
Decision support systems for ecosystem management: An evaluation of existing systems
H. Todd Mowrer; Klaus Barber; Joe Campbell; Nick Crookston; Cathy Dahms; John Day; Jim Laacke; Jim Merzenich; Steve Mighton; Mike Rauscher; Rick Sojda; Joyce Thompson; Peter Trenchi; Mark Twery
1997-01-01
This report evaluated 24 computer-aided decision support systems (DSS) that can support management decision-making in forest ecosystems. It compares the scope of each system, spatial capabilities, computational methods, development status, input and output requirements, user support availability, and system performance. Questionnaire responses from the DSS developers (...
The Computer as Adaptive Instructional Decision Maker.
ERIC Educational Resources Information Center
Kopstein, Felix F.; Seidel, Robert J.
The computer's potential for education, and most particularly for instruction, is contingent on the development of a class of instructional decision models (formal instructional strategies) that interact with the student through appropriate peripheral equipment (man-machine interfaces). Computer hardware and software by themselves should not be…
Kaner, Eileen; Heaven, Ben; Rapley, Tim; Murtagh, Madeleine; Graham, Ruth; Thomson, Richard; May, Carl
2007-01-10
Much of the research on decision-making in health care has focused on consultation outcomes. Less is known about the process by which clinicians and patients come to a treatment decision. This study aimed to quantitatively describe the behaviour shown by doctors and patients during primary care consultations when three types of decision aids were used to promote treatment decision-making in a randomised controlled trial. A video-based study set in an efficacy trial which compared the use of paper-based guidelines (control) with two forms of computer-based decision aids (implicit and explicit versions of DARTS II). The treatment decision concerned warfarin anti-coagulation to reduce the risk of stroke in older patients with atrial fibrillation. Twenty-nine consultations were video-recorded. A ten-minute 'slice' of each consultation was sampled for detailed content analysis using existing interaction analysis protocols for verbal behaviour and ethological techniques for non-verbal behaviour. Median consultation times (quartiles) differed significantly depending on the technology used. Paper-based guidelines took 21 (19-26) minutes to work through, compared to 31 (16-41) minutes for the implicit tool and 44 (39-55) minutes for the explicit tool. In the ten minutes immediately preceding the decision point, GPs dominated the conversation, accounting for 64% (58-66%) of all utterances, and this trend was similar across all three arms of the trial. Information-giving was the most frequent activity for both GPs and patients, although GPs did this at twice the rate of patients and at higher rates in consultations involving computerised decision aids. GPs' language was highly technically focused and just 7% of their conversation was socio-emotional in content; this was half the socio-emotional content shown by patients (15%). However, frequent head nodding and close mirroring in the direction of eye-gaze suggested that both parties were active participants in the conversation. Irrespective of the arm of the trial, both patients' and GPs' behaviour showed that they were reciprocally engaged in these consultations. However, even in consultations aimed at promoting shared decision-making, GPs were verbally dominant and worked primarily as information providers for patients. In addition, computer-based decision aids significantly prolonged the consultations, particularly in their later phases. These data suggest that decision aids may not lead to more 'sharing' in treatment decision-making and that, in their current form, they may take too long to negotiate for use in routine primary care.
Cai, Wenli; Lee, June-Goo; Fikry, Karim; Yoshida, Hiroyuki; Novelline, Robert; de Moya, Marc
2013-01-01
It is commonly believed that the size of a pneumothorax is an important determinant of treatment decision, in particular regarding whether chest tube drainage (CTD) is required. However, volumetric quantification of pneumothoraces has not routinely been performed in clinics. In this paper, we introduce an automated computer-aided volumetry (CAV) scheme for quantifying the volume of pneumothoraces in chest multi-detector CT (MDCT) images. Moreover, we investigate the impact of accurate pneumothorax volume on improving the performance of decision-making regarding CTD in the management of traumatic pneumothoraces. For this purpose, an occurrence frequency map was calculated for quantitative analysis of the importance of each clinical parameter in the decision-making regarding CTD, using a computer simulation of decision-making based on a genetic algorithm (GA) and a support vector machine (SVM). A total of 14 clinical parameters, including the volume of pneumothorax calculated by our CAV scheme, was collected as parameters available for decision-making. The results showed that volume was the dominant parameter in decision-making regarding CTD, with an occurrence frequency value of 1.00. The results also indicated that the inclusion of volume provided the best performance, statistically significantly better than the tests in which volume was excluded from the clinical parameters. This study provides scientific evidence for the application of the CAV scheme in MDCT volumetric quantification of pneumothoraces in the management of clinically stable chest trauma patients with traumatic pneumothorax. PMID:22560899
[Breast cancer and pregnancy: decision making and the point of view of the mother].
Eisinger, François; Noizet, Agnès
2002-09-01
For the treatment of breast cancer, modifications of decision making related to pregnancy can be assessed through three questions. First, why was a given decision chosen? Here, the hypothesis is that decisions are based on expected utility. The theory assumes weighting and computation of the complete set of possibilities with their associated probabilities and values. However, values exhibit a wide range of inter-individual variation, so the predictability of choice based on this model is very low. Furthermore, it is likely that the desire for pregnancy after breast cancer contains, besides the classic appeals of motherhood, a specific meaning of recovery of both health and femininity. The second question: who is in charge of the decision? Under the paradigm of autonomy, the woman's decision is, merely by itself, the right decision. The last question is how? In some situations where foreseeing outcomes is quite complex, the value of the process itself is increased and can support an end-oriented or self-determined decision. Casuistic analysis could therefore improve women's decisions. The issue is not only about the decision itself but also about the patient-physician relationship, concerning a problem that is not solely biomedical.
Evidence Accumulation and Choice Maintenance Are Dissociated in Human Perceptual Decision Making
Pedersen, Mads Lund; Endestad, Tor; Biele, Guido
2015-01-01
Perceptual decision making in monkeys relies on decision neurons, which accumulate evidence and maintain choices until a response is given. In humans, several brain regions have been proposed to accumulate evidence, but it is unknown if these regions also maintain choices. To test if accumulator regions in humans also maintain decisions we compared delayed and self-paced responses during a face/house discrimination decision making task. Computational modeling and fMRI results revealed dissociated processes of evidence accumulation and decision maintenance, with potential accumulator activations found in the dorsomedial prefrontal cortex, right inferior frontal gyrus and bilateral insula. Potential maintenance activation spanned the frontal pole, temporal gyri, precuneus and the lateral occipital and frontal orbital cortices. Results of a quantitative reverse inference meta-analysis performed to differentiate the functions associated with the identified regions did not narrow down potential accumulation regions, but suggested that response-maintenance might rely on a verbalization of the response. PMID:26510176
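Evidence accumulation of the kind modeled here is often captured by drift-diffusion dynamics; the toy simulation below uses arbitrary parameters, not values fitted to the study's data.

```python
# A toy drift-diffusion simulation of evidence accumulation to a decision
# bound, in the spirit of the accumulator models referred to above. Parameters
# are arbitrary assumptions, not fitted values from the study.
import random

def ddm_trial(drift=0.1, bound=1.0, noise=0.3, dt=0.01, max_t=5.0):
    """Return (choice, reaction_time) for one simulated face/house decision."""
    x, t = 0.0, 0.0
    while abs(x) < bound and t < max_t:
        x += drift * dt + noise * random.gauss(0.0, dt ** 0.5)
        t += dt
    return ("face" if x > 0 else "house"), t   # timeouts resolve by sign

trials = [ddm_trial() for _ in range(1000)]
accuracy = sum(choice == "face" for choice, _ in trials) / len(trials)
print(accuracy)  # positive drift biases evidence toward the "face" bound
```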
Multicriteria analysis of ontologically represented information
NASA Astrophysics Data System (ADS)
Wasielewska, K.; Ganzha, M.; Paprzycki, M.; Bǎdicǎ, C.; Ivanovic, M.; Lirkov, I.
2014-11-01
Our current work concerns the development of a decision support system for the software selection problem. The main idea is to utilize expert knowledge to help the user in selecting the best software / method / computational resource to solve a computational problem. Obviously, this involves multicriterial decision making and the key open question is: which method to choose. The context of the work is provided by the Agents in Grid (AiG) project, where the software selection (and thus multicriterial analysis) is to be realized when all information concerning the problem, the hardware and the software is ontologically represented. Initially, we have considered the Analytical Hierarchy Process (AHP), which is well suited for the hierarchical data structures (e.g., such that have been formulated in terms of ontologies). However, due to its well-known shortcomings, we have decided to extend our search for the multicriterial analysis method best suited for the problem in question. In this paper we report results of our search, which involved: (i) TOPSIS (Technique for Order Preference by Similarity to Ideal Solution), (ii) PROMETHEE, and (iii) GRIP (Generalized Regression with Intensities of Preference). We also briefly argue why other methods have not been considered as valuable candidates.
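As a concrete flavor of one candidate method, here is a compact TOPSIS sketch; the alternatives, criteria, and weights are invented, and all criteria are treated as benefit-type for simplicity.

```python
# A compact TOPSIS sketch (one of the methods compared above): rank software /
# resource alternatives by closeness to an ideal solution. Criteria values and
# weights are invented for illustration; all criteria are benefit-type.
import numpy as np

X = np.array([            # rows: alternatives, cols: criteria
    [250.0, 16, 12],
    [200.0, 16, 8],
    [300.0, 32, 16],
])
w = np.array([0.2, 0.4, 0.4])

R = X / np.linalg.norm(X, axis=0)        # vector-normalize each criterion
V = R * w                                # weighted normalized matrix
ideal, anti = V.max(axis=0), V.min(axis=0)
d_pos = np.linalg.norm(V - ideal, axis=1)
d_neg = np.linalg.norm(V - anti, axis=1)
closeness = d_neg / (d_pos + d_neg)      # 1 = ideal, 0 = anti-ideal
print(np.argsort(closeness)[::-1])       # alternatives ranked best-first
```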
Groundwater Remediation using Bayesian Information-Gap Decision Theory
NASA Astrophysics Data System (ADS)
O'Malley, D.; Vesselinov, V. V.
2016-12-01
Probabilistic analyses of groundwater remediation scenarios frequently fail because the probability of an adverse, unanticipated event occurring is often high. In general, models of flow and transport in contaminated aquifers are always simpler than reality. Further, when a probabilistic analysis is performed, probability distributions are usually chosen more for convenience than correctness. The Bayesian Information-Gap Decision Theory (BIGDT) was designed to mitigate the shortcomings of the models and probabilistic decision analyses by leveraging a non-probabilistic decision theory, information-gap decision theory. BIGDT considers possible models that have not been explicitly enumerated and does not require us to commit to a particular probability distribution for model and remediation-design parameters. Both the set of possible models and the set of possible probability distributions grow as the degree of uncertainty increases. The fundamental question that BIGDT asks is "How large can these sets be before a particular decision results in an undesirable outcome?". The decision that allows these sets to be the largest is considered to be the best option. In this way, BIGDT enables robust decision support for groundwater remediation problems. Here we apply BIGDT to a representative groundwater remediation scenario in which different options for hydraulic containment and pump-and-treat are being considered. BIGDT requires many model runs, and for complex models high-performance computing resources are needed. These analyses are carried out on synthetic problems but are applicable to real-world problems such as contamination at the LANL site. BIGDT is implemented in Julia (a high-level, high-performance dynamic programming language for technical computing) and is part of the MADS framework (http://mads.lanl.gov/ and https://github.com/madsjulia/Mads.jl).
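The non-probabilistic core of info-gap analysis can be sketched as a search for the largest uncertainty horizon a design tolerates; the toy transport model and all numbers below are assumptions, and BIGDT itself adds a Bayesian treatment on top of this idea.

```python
# An info-gap-style robustness sketch (BIGDT itself is richer and Bayesian):
# how large can the uncertainty horizon h around a nominal parameter be before
# a remediation design stops meeting its performance requirement? The "model"
# and all numbers are hypothetical stand-ins.

def worst_case_concentration(pump_rate: float, k_nominal: float, h: float) -> float:
    """Worst concentration for k in [k_nominal*(1-h), k_nominal*(1+h)] (toy model)."""
    k_worst = k_nominal * (1 + h)          # higher conductivity spreads the plume more
    return 100.0 * k_worst / pump_rate     # stand-in for a transport model

def robustness(pump_rate: float, limit: float = 50.0, k_nominal: float = 1.0) -> float:
    """Largest h (on a grid) for which the design still satisfies the limit."""
    h = 0.0
    while worst_case_concentration(pump_rate, k_nominal, h + 0.01) <= limit:
        h += 0.01
    return h

# Under info-gap theory, the design with the larger robustness is preferred.
print(robustness(pump_rate=4.0), robustness(pump_rate=8.0))
```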
Parallel computing in enterprise modeling.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goldsby, Michael E.; Armstrong, Robert C.; Shneider, Max S.
2008-08-01
This report presents the results of our efforts to apply high-performance computing to entity-based simulations with a multi-use plugin for parallel computing. We use the term 'Entity-based simulation' to describe a class of simulation which includes both discrete event simulation and agent based simulation. What simulations of this class share, and what differs from more traditional models, is that the result sought is emergent from a large number of contributing entities. Logistic, economic and social simulations are members of this class where things or people are organized or self-organize to produce a solution. Entity-based problems never have an a priori ergodic principle that will greatly simplify calculations. Because the results of entity-based simulations can only be realized at scale, scalable computing is de rigueur for large problems. Having said that, the absence of a spatial organizing principal makes the decomposition of the problem onto processors problematic. In addition, practitioners in this domain commonly use the Java programming language which presents its own problems in a high-performance setting. The plugin we have developed, called the Parallel Particle Data Model, overcomes both of these obstacles and is now being used by two Sandia frameworks: the Decision Analysis Center, and the Seldon social simulation facility. While the ability to engage U.S.-sized problems is now available to the Decision Analysis Center, this plugin is central to the success of Seldon. Because Seldon relies on computationally intensive cognitive sub-models, this work is necessary to achieve the scale necessary for realistic results. With the recent upheavals in the financial markets, and the inscrutability of terrorist activity, this simulation domain will likely need a capability with ever greater fidelity. High-performance computing will play an important part in enabling that greater fidelity.
Akkas, Oguz; Lee, Cheng Hsien; Hu, Yu Hen; Harris Adamson, Carisa; Rempel, David; Radwin, Robert G
2017-12-01
Two computer vision algorithms were developed to automatically estimate exertion time, duty cycle (DC) and hand activity level (HAL) from videos of workers performing 50 industrial tasks. The average DC difference between manual frame-by-frame analysis and the computer vision DC was -5.8% for the Decision Tree (DT) algorithm, and 1.4% for the Feature Vector Training (FVT) algorithm. The average HAL difference was 0.5 for the DT algorithm and 0.3 for the FVT algorithm. A sensitivity analysis, conducted to examine the influence that deviations in DC have on HAL, found it remained unaffected when DC error was less than 5%. Thus, a DC error less than 10% will impact HAL less than 0.5 HAL, which is negligible. Automatic computer vision HAL estimates were therefore comparable to manual frame-by-frame estimates. Practitioner Summary: Computer vision was used to automatically estimate exertion time, duty cycle and hand activity level from videos of workers performing industrial tasks.
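The duty-cycle computation underlying both the manual and automated estimates is straightforward once frame-level exertion labels exist; the labels and frame rate below are invented for illustration.

```python
# A simple sketch of the duty-cycle (DC) computation underlying both the manual
# frame-by-frame and computer-vision estimates: the fraction of the work cycle
# spent exerting. The frame labels and frame rate here are invented; in the
# study they come from video analysis of industrial tasks.

frames = [0, 1, 1, 1, 0, 0, 1, 1, 0, 0]   # 1 = exertion, 0 = rest, per video frame
fps = 30                                   # frames per second (assumed)

exertion_time = sum(frames) / fps          # seconds of exertion
cycle_time = len(frames) / fps             # total cycle time in seconds
duty_cycle = 100.0 * exertion_time / cycle_time
print(f"DC = {duty_cycle:.1f}%")           # 50.0% for this toy label sequence
```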
Vision Based Autonomous Robotic Control for Advanced Inspection and Repair
NASA Technical Reports Server (NTRS)
Wehner, Walter S.
2014-01-01
The advanced inspection system is an autonomous control and analysis system that improves the inspection and remediation operations for ground and surface systems. It uses optical imaging technology with intelligent computer vision algorithms to analyze physical features of the real-world environment to make decisions and learn from experience. The advanced inspection system plans to control a robotic manipulator arm, an unmanned ground vehicle and cameras remotely, automatically and autonomously. There are many computer vision, image processing and machine learning techniques available as open source for using vision as a sensory feedback in decision-making and autonomous robotic movement. My responsibilities for the advanced inspection system are to create a software architecture that integrates and provides a framework for all the different subsystem components; identify open-source algorithms and techniques; and integrate robot hardware.
NASA Astrophysics Data System (ADS)
Rosenberg, D. E.
2008-12-01
Designing and implementing a hydro-economic computer model to support or facilitate collaborative decision making among multiple stakeholders or users can be challenging and daunting. Collaborative modeling is distinguished from, and more difficult than, non-collaborative efforts because of a large number of users with different backgrounds; disagreement or conflict among stakeholders regarding problem definitions, modeling roles, and analysis methods; plus evolving ideas of model scope and scale and needs for information and analysis as stakeholders interact, use the model, and learn about the underlying water system. This presentation reviews the lifecycle for collaborative model making and identifies some key design decisions that stakeholders and model developers must make to develop robust and trusted, verifiable and transparent, integrated and flexible, and ultimately useful models. It advances some best practices to implement and program these decisions. Among these best practices are 1) modular development of data-aware input, storage, manipulation, results recording and presentation components, plus ways to couple and link to other models and tools, 2) explicitly structuring both the input data and the metadata that describes data sources, who acquired the data, gaps, and modifications or translations made to put the data in a form usable by the model, 3) providing in-line documentation on model inputs, assumptions, calculations, and results, plus ways for stakeholders to document their own model use and share results with others, and 4) flexible programming with graphical object-oriented properties and elements that allow users or the model maintainers to easily see and modify the spatial, temporal, or analysis scope as the collaborative process moves forward. We draw on examples of these best practices from the existing literature, the author's prior work, and some new applications just underway. The presentation concludes by identifying some future directions for collaborative modeling, including geo-spatial display and analysis, real-time operations, and internet-based tools, plus the design and programming needed to implement these capabilities.
Teaching Advance Care Planning to Medical Students with a Computer-Based Decision Aid
Levi, Benjamin H.
2013-01-01
Discussing end-of-life decisions with cancer patients is a crucial skill for physicians. This article reports findings from a pilot study evaluating the effectiveness of a computer-based decision aid for teaching medical students about advance care planning. Second-year medical students at a single medical school were randomized to use a standard advance directive or a computer-based decision aid to help patients with advance care planning. Students' knowledge, skills, and satisfaction were measured by self-report; their performance was rated by patients. 121/133 (91%) of students participated. The Decision-Aid Group (n=60) outperformed the Standard Group (n=61) in terms of students' knowledge (p<0.01), confidence in helping patients with advance care planning (p<0.01), knowledge of what matters to patients (p=0.05), and satisfaction with their learning experience (p<0.01). Likewise, patients in the Decision-Aid Group were more satisfied with the advance care planning method (p<0.01) and with several aspects of student performance. Use of a computer-based decision aid may be an effective way to teach medical students how to discuss advance care planning with cancer patients. PMID:20632222
Elsawah, Sondoss; Guillaume, Joseph H A; Filatova, Tatiana; Rook, Josefine; Jakeman, Anthony J
2015-03-15
This paper aims to contribute to developing better ways of incorporating essential human elements in decision-making processes for the modelling of complex socio-ecological systems. It presents a step-wise methodology for integrating perceptions of stakeholders (qualitative) into formal simulation models (quantitative), with the ultimate goal of improving understanding and communication about decision making in complex socio-ecological systems. The methodology integrates cognitive mapping and agent-based modelling. It cascades through a sequence of qualitative/soft and numerical methods comprising: (1) interviews to elicit mental models; (2) cognitive maps to represent and analyse individual and group mental models; (3) time-sequence diagrams to chronologically structure the decision-making process; (4) an all-encompassing conceptual model of decision making; and (5) a computational (in this case agent-based) model. We apply the proposed methodology (labelled ICTAM) in a case study of viticulture irrigation in South Australia. Finally, we use strengths-weaknesses-opportunities-threats (SWOT) analysis to reflect on the methodology. Results show that the methodology leverages the use of cognitive mapping to capture the richness of decision making and mental models, and provides a combination of divergent and convergent analysis methods leading to the construction of an agent-based model. Copyright © 2014 Elsevier Ltd. All rights reserved.
A Flexible and Non-instrusive Approach for Computing Complex Structural Coverage Metrics
NASA Technical Reports Server (NTRS)
Whalen, Michael W.; Person, Suzette J.; Rungta, Neha; Staats, Matt; Grijincu, Daniela
2015-01-01
Software analysis tools and techniques often leverage structural code coverage information to reason about the dynamic behavior of software. Existing techniques instrument the code with the required structural obligations and then monitor the execution of the compiled code to report coverage. Instrumentation-based approaches often incur considerable runtime overhead for complex structural coverage metrics such as Modified Condition/Decision Coverage (MC/DC). Code instrumentation, in general, has to be approached with great care to ensure it does not modify the behavior of the original code. Furthermore, instrumented code cannot be used in conjunction with other analyses that reason about the structure and semantics of the code under test. In this work, we introduce a non-intrusive preprocessing approach for computing structural coverage information. It uses a static partial evaluation of the decisions in the source code and a source-to-bytecode mapping to generate the information necessary to efficiently track structural coverage metrics during execution. Our technique is flexible; the results of the preprocessing can be used by a variety of coverage-driven software analysis tasks, including automated analyses that are not possible for instrumented code. Experimental results in the context of symbolic execution show the efficiency and flexibility of our non-intrusive approach for computing code coverage information.
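To see what MC/DC demands, the small sketch below enumerates, for each condition of a sample decision, the test pairs in which that condition independently flips the outcome; this only illustrates the metric itself, not the paper's bytecode-level technique.

```python
# A tiny illustration of the Modified Condition/Decision Coverage (MC/DC)
# obligations such tooling tracks: for the sample decision `a and (b or c)`,
# find, for each condition, test pairs that flip only that condition and also
# flip the decision outcome. This shows the metric, not the paper's technique.
from itertools import product

decision = lambda a, b, c: a and (b or c)
tests = list(product([False, True], repeat=3))

for i in range(3):  # condition index: 0 -> a, 1 -> b, 2 -> c
    pairs = [
        (t, u) for t in tests for u in tests
        if t[i] != u[i]
        and all(t[j] == u[j] for j in range(3) if j != i)
        and decision(*t) != decision(*u)
    ]
    # each unordered pair is counted twice above, hence the // 2
    print("abc"[i], "independently affects the outcome in", len(pairs) // 2, "test pair(s)")
```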
Alves-Pinto, A.; Sollini, J.; Sumner, C.J.
2012-01-01
Signal detection theory (SDT) provides a framework for interpreting psychophysical experiments, separating the putative internal sensory representation and the decision process. SDT was used to analyse ferret behavioural responses in a (yes–no) tone-in-noise detection task. Instead of measuring the receiver-operating characteristic (ROC), we tested SDT by comparing responses collected using two common psychophysical data collection methods. These methods (Constant Stimuli and Limits) differ in the set of signal levels presented within and across behavioural sessions. The results support the use of SDT as a method of analysis: the SDT sensory component was unchanged between the two methods, even though decisions depended on the stimuli presented within a behavioural session. The decision criterion varied trial-by-trial: a ‘yes’ response was more likely after a correct rejection trial than after a hit trial. Simulation using an SDT model with several decision components reproduced the experimental observations accurately, leaving only ∼10% of the variance unaccounted for. The model also showed that trial-by-trial dependencies were unlikely to influence measured psychometric functions or thresholds. An additional model component suggested that inattention did not contribute substantially. Further analysis showed that ferrets were changing their decision criteria, almost optimally, to maximise the reward obtained in a session. The data suggest trial-by-trial reward-driven optimization of the decision process. Understanding the factors determining behavioural responses is important for correlating neural activity and behaviour. SDT provides a good account of animal psychoacoustics, and can be validated using standard psychophysical methods and computer simulations, without recourse to ROC measurements. PMID:22698686
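The separation of sensory and decision components rests on standard yes-no SDT computations; the sketch below derives d' and the criterion from toy response counts, not from the ferret data.

```python
# Standard yes-no SDT computations of the kind used in such an analysis:
# sensitivity d' and criterion c from hit and false-alarm rates. The counts
# below are toy numbers, not the ferret data.
from statistics import NormalDist

z = NormalDist().inv_cdf

hits, misses = 80, 20            # responses on signal (tone) trials
fas, crs = 30, 70                # responses on noise-only trials

hit_rate = hits / (hits + misses)
fa_rate = fas / (fas + crs)

d_prime = z(hit_rate) - z(fa_rate)             # sensory sensitivity
criterion = -0.5 * (z(hit_rate) + z(fa_rate))  # response bias (decision component)
print(round(d_prime, 2), round(criterion, 2))  # e.g. 1.37 and -0.16
```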
Computer modeling of human decision making
NASA Technical Reports Server (NTRS)
Gevarter, William B.
1991-01-01
Models of human decision making are reviewed. Models which treat just the cognitive aspects of human behavior are included, as well as models which include motivation. Both models that have associated computer programs and those that do not are considered. Since flow diagrams that assist in constructing computer simulations of such models were not generally available, such diagrams were constructed and are presented. The result provides a rich source of information, which can aid in the construction of more realistic future simulations of human decision making.
42 CFR 412.278 - Administrator's review.
Code of Federal Regulations, 2014 CFR
2014-10-01
... or computational errors, or to correct the decision if the evidence that was considered in making the... discretion, may amend the decision to correct mathematical or computational errors, or to correct the...
42 CFR 412.278 - Administrator's review.
Code of Federal Regulations, 2011 CFR
2011-10-01
... or computational errors, or to correct the decision if the evidence that was considered in making the... discretion, may amend the decision to correct mathematical or computational errors, or to correct the...
42 CFR 412.278 - Administrator's review.
Code of Federal Regulations, 2012 CFR
2012-10-01
... or computational errors, or to correct the decision if the evidence that was considered in making the... discretion, may amend the decision to correct mathematical or computational errors, or to correct the...
42 CFR 412.278 - Administrator's review.
Code of Federal Regulations, 2013 CFR
2013-10-01
... or computational errors, or to correct the decision if the evidence that was considered in making the... discretion, may amend the decision to correct mathematical or computational errors, or to correct the...
System analysis in rotorcraft design: The past decade
NASA Technical Reports Server (NTRS)
Galloway, Thomas L.
1988-01-01
Rapid advances in the technology of electronic digital computers and the need for an integrated synthesis approach in developing future rotorcraft programs have led to increased emphasis on system analysis techniques in rotorcraft design. The task in systems analysis is to deal with complex, interdependent, and conflicting requirements in a structured manner so that rational and objective decisions can be made. Whether the results are wisdom or rubbish depends upon the validity and, sometimes more importantly, the consistency of the inputs, the correctness of the analysis, and a sensible choice of measures of effectiveness for drawing conclusions. In rotorcraft design this means combining design requirements, technology assessment, and sensitivity analysis with review techniques currently in use by NASA and Army organizations in developing research programs and vehicle specifications for rotorcraft. These procedures span simple graphical approaches to comprehensive analysis on large mainframe computers. Examples of recent applications to military and civil missions are highlighted.
Safety Guided Design Based on Stamp/STPA for Manned Vehicle in Concept Design Phase
NASA Astrophysics Data System (ADS)
Ujiie, Ryo; Katahira, Masafumi; Miyamoto, Yuko; Umeda, Hiroki; Leveson, Nancy; Hoshino, Nobuyuki
2013-09-01
In manned vehicles, such as the Soyuz and the Space Shuttle, the crew and the computer system cooperate to return safely to Earth. While computers increase the functionality of a system, they also increase the complexity of the interaction between the controllers (human and computer) and the target dynamics. In some cases, this complexity can produce a serious accident. To prevent such losses, traditional hazard analyses such as FTA have been applied to system development; however, they can only be used after a detailed system has been created, because they focus on detailed component failures. As a result, it is more difficult to eliminate hazard causes early in the process, when doing so is most feasible. STAMP/STPA is a new hazard analysis that can be applied from the early development phase, with the analysis being refined as more detailed decisions are made. In essence, the analysis and design decisions are intertwined and go hand-in-hand. We have applied STAMP/STPA to a concept design of a new JAXA manned vehicle and carried out safety guided design of the vehicle. As a result of this trial, it has been shown that STAMP/STPA can be accepted easily by system engineers and that the design has been made more sophisticated from a safety viewpoint. The result also shows that the consequences of human errors on system safety can be analysed in the early development phase and the system designed to prevent them. Finally, the paper discusses an effective way to harmonize this safety guided design approach with the systems engineering process, based on the experience gained in this project.
Web-services-based spatial decision support system to facilitate nuclear waste siting
NASA Astrophysics Data System (ADS)
Huang, L. Xinglai; Sheng, Grant
2006-10-01
The availability of spatial web services enables data sharing among managers, decision and policy makers and other stakeholders in much simpler ways than before and subsequently has created completely new opportunities in the process of spatial decision making. Though generally designed for a certain problem domain, web-services-based spatial decision support systems (WSDSS) can provide a flexible problem-solving environment to explore the decision problem, understand and refine problem definition, and generate and evaluate multiple alternatives for decision. This paper presents a new framework for the development of a web-services-based spatial decision support system. The WSDSS is comprised of distributed web services that either have their own functions or provide different geospatial data and may reside in different computers and locations. WSDSS includes six key components, namely: database management system, catalog, analysis functions and models, GIS viewers and editors, report generators, and graphical user interfaces. In this study, the architecture of a web-services-based spatial decision support system to facilitate nuclear waste siting is described as an example. The theoretical, conceptual and methodological challenges and issues associated with developing web services-based spatial decision support system are described.
Analysis of Multi-State Systems with Multi-State Components Using EVMDDs
2012-05-01
Bridging the Gap between Human Judgment and Automated Reasoning in Predictive Analytics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sanfilippo, Antonio P.; Riensche, Roderick M.; Unwin, Stephen D.
2010-06-07
Events occur daily that impact the health, security and sustainable growth of our society. If we are to address the challenges that emerge from these events, anticipatory reasoning has to become an everyday activity. Strong advances have been made in using integrated modeling for analysis and decision making. However, a wider impact of predictive analytics is currently hindered by the lack of systematic methods for integrating predictive inferences from computer models with human judgment. In this paper, we present a predictive analytics approach that supports anticipatory analysis and decision-making through a concerted reasoning effort that interleaves human judgment and automated inferences. We describe a systematic methodology for integrating modeling algorithms within a serious gaming environment in which role-playing by human agents provides updates to model nodes and the ensuing model outcomes in turn influence the behavior of the human players. The approach ensures a strong functional partnership between human players and computer models while maintaining a high degree of independence and greatly facilitating the connection between model and game structures.
An approach to quality and performance control in a computer-assisted clinical chemistry laboratory.
Undrill, P E; Frazer, S C
1979-01-01
A locally developed, computer-based clinical chemistry laboratory system has been in operation since 1970. This utilises a Digital Equipment Co Ltd PDP 12 and an interconnected PDP 8/F computer. Details are presented of the performance and quality control techniques incorporated into the system. Laboratory performance is assessed through analysis of results from fixed-level control sera as well as from cumulative sum methods. At a simple level the presentation may be considered purely indicative, while at a more sophisticated level statistical concepts have been introduced to aid the laboratory controller in decision-making processes. PMID:438340
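The cumulative sum (CUSUM) idea mentioned above can be sketched as a tabular CUSUM on control-serum results; the target, allowance, decision limit, and data below are invented for illustration.

```python
# A minimal tabular CUSUM sketch of the quality-control idea mentioned above:
# flag a sustained shift in control-serum results. The target, allowance,
# decision limit, and data are invented for illustration.

target, k, h = 100.0, 0.5, 4.0      # target mean, allowance, decision limit (in SDs)
sd = 2.0
results = [100.1, 99.8, 101.2, 102.5, 103.0, 102.8, 104.0]  # daily control sera

hi = lo = 0.0
for i, x in enumerate(results, 1):
    zscore = (x - target) / sd
    hi = max(0.0, hi + zscore - k)   # accumulates upward drift
    lo = max(0.0, lo - zscore - k)   # accumulates downward drift
    if hi > h or lo > h:
        print(f"out-of-control signal at result {i}")  # fires at result 7 here
        break
```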
Decision making and problem solving with computer assistance
NASA Technical Reports Server (NTRS)
Kraiss, F.
1980-01-01
In modern guidance and control systems, the human as manager, supervisor, decision maker, problem solver, and trouble shooter often has to cope with a marginal mental workload. To improve this situation, computers should be used to relieve the operator of mental stress. This should not be done solely by increased automation, but by a reasonable sharing of tasks in a human-computer team, where the computer supports the human intelligence. Recent developments in this area are summarized. It is shown that interactive support of the operator by an intelligent computer is feasible during information evaluation, decision making, and problem solving. The applied artificial intelligence algorithms comprise pattern recognition and classification, adaptation and machine learning, as well as dynamic and heuristic programming. Elementary examples are presented to explain basic principles.
Pest management in Douglas-fir seed orchards: a microcomputer decision method
James B. Hoy; Michael I. Haverty
1988-01-01
The computer program described provides a Douglas-fir seed orchard manager (user) with a quantitative method for making insect pest management decisions on a desk-top computer. The decision system uses site-specific information such as estimates of seed crop size, insect attack rates, insecticide efficacy and application costs, weather, and crop value. At sites where...
Using the weighted area under the net benefit curve for decision curve analysis.
Talluri, Rajesh; Shete, Sanjay
2016-07-18
Risk prediction models have been proposed for various diseases and are being improved as new predictors are identified. A major challenge is to determine whether the newly discovered predictors improve risk prediction. Decision curve analysis has been proposed as an alternative to the area under the curve and net reclassification index to evaluate the performance of prediction models in clinical scenarios. The decision curve computed using the net benefit can evaluate the predictive performance of risk models at a given or range of threshold probabilities. However, when the decision curves for 2 competing models cross in the range of interest, it is difficult to identify the best model as there is no readily available summary measure for evaluating the predictive performance. The key deterrent for using simple measures such as the area under the net benefit curve is the assumption that the threshold probabilities are uniformly distributed among patients. We propose a novel measure for performing decision curve analysis. The approach estimates the distribution of threshold probabilities without the need of additional data. Using the estimated distribution of threshold probabilities, the weighted area under the net benefit curve serves as the summary measure to compare risk prediction models in a range of interest. We compared 3 different approaches, the standard method, the area under the net benefit curve, and the weighted area under the net benefit curve. Type 1 error and power comparisons demonstrate that the weighted area under the net benefit curve has higher power compared to the other methods. Several simulation studies are presented to demonstrate the improvement in model comparison using the weighted area under the net benefit curve compared to the standard method. The proposed measure improves decision curve analysis by using the weighted area under the curve and thereby improves the power of the decision curve analysis to compare risk prediction models in a clinical scenario.
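The net benefit underlying the curve is a one-line formula; the sketch below computes it on toy predictions across a grid of threshold probabilities (a weighted area would then integrate this curve against an estimated threshold distribution).

```python
# Net-benefit computation at a threshold probability p_t, the quantity whose
# curve (and proposed weighted area) the abstract discusses. The outcomes and
# predicted risks below are toy data.

def net_benefit(y_true, risk, p_t):
    """NB(p_t) = TP/n - FP/n * p_t/(1 - p_t), treating risk >= p_t as 'treat'."""
    n = len(y_true)
    tp = sum(1 for y, r in zip(y_true, risk) if r >= p_t and y == 1)
    fp = sum(1 for y, r in zip(y_true, risk) if r >= p_t and y == 0)
    return tp / n - fp / n * p_t / (1 - p_t)

y = [1, 0, 1, 1, 0, 0, 0, 1]
risk = [0.9, 0.4, 0.7, 0.2, 0.3, 0.6, 0.1, 0.8]

# Print the decision curve on a grid; the weighted-area summary would weight
# these values by an estimated distribution of threshold probabilities.
for p_t in [0.1, 0.2, 0.3, 0.4, 0.5]:
    print(p_t, round(net_benefit(y, risk, p_t), 3))
```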
Smart algorithms and adaptive methods in computational fluid dynamics
NASA Astrophysics Data System (ADS)
Tinsley Oden, J.
1989-05-01
A review is presented of the use of smart algorithms that employ adaptive methods in processing large amounts of data in computational fluid dynamics (CFD). Smart algorithms use a rationally based set of criteria for automatic decision making in an attempt to produce optimal simulations of complex fluid dynamics problems. The information needed to make these decisions is not known beforehand and evolves in structure and form during the numerical solution of flow problems. Once the code makes a decision based on the available data, the structure of the data may change, and the criteria may be reapplied to direct the analysis toward an acceptable end. The basic components of adaptive methods and their application to complex problems of fluid dynamics are reviewed: (1) data structures, i.e., what approaches are available for modifying the data structures of an approximation so as to reduce errors; (2) error estimation, i.e., what techniques exist for estimating error evolution in a CFD calculation; and (3) solvers, i.e., what algorithms are available that can function on changing meshes. Numerical examples that demonstrate the viability of these approaches are presented.
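To make the error-estimation and data-structure components concrete, here is a minimal, hypothetical sketch of one adaptive cycle in 1-D: a crude second-difference error indicator flags cells, and flagged cells are bisected. Real CFD adaptivity uses rigorous error estimators and far richer mesh data structures; the function, tolerance, and indicator below are invented for illustration.

```python
import numpy as np

def refine_once(x, f, tol=0.05):
    """One adaptive cycle: estimate error per interior node, bisect flagged cells."""
    u = f(x)
    # crude error indicator: magnitude of the second difference (curvature proxy)
    indicator = np.abs(u[2:] - 2 * u[1:-1] + u[:-2])
    new_nodes = []
    for i, err in enumerate(indicator, start=1):
        if err > tol:                                   # decision criterion per cell
            new_nodes.append(0.5 * (x[i] + x[i + 1]))   # bisect the cell to the right
    return np.sort(np.concatenate([x, new_nodes]))

f = lambda x: np.tanh(20 * (x - 0.5))   # steep front, a stand-in for a shock
x = np.linspace(0.0, 1.0, 21)
for _ in range(4):                       # re-apply criteria as the data evolve
    x = refine_once(x, f)
print(len(x), "nodes after adaptation")  # nodes cluster near the front
```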
Decision Making and Reward in Frontal Cortex
Kennerley, Steven W.; Walton, Mark E.
2011-01-01
Patients with damage to the prefrontal cortex (PFC)—especially the ventral and medial parts of PFC—often show a marked inability to make choices that meet their needs and goals. These decision-making impairments often reflect both a deficit in learning about the consequences of a choice and a deficit in the ability to adapt future choices based on the experienced value of the current choice. Thus, areas of PFC must support value computations that are necessary for optimal choice. However, recent frameworks of decision making have highlighted that optimal and adaptive decision making does not rest on a single computation; rather, a number of different value computations may be necessary. Using this framework as a guide, we summarize evidence from both lesion studies and single-neuron physiology for the representation of different value computations across PFC areas. PMID:21534649
The conscious mind and its emergent properties; an analysis based on decision theory.
Morris, James A
2011-08-01
The process of conscious and unconscious decision making is analyzed using decision theory. An essential part of an optimum decision strategy is the assessment of values and costs associated with correct and incorrect decisions. In the case of unconscious decisions this involves an automatic process akin to computation using numerical values. But for conscious decisions the conscious mind must experience the outcome of the decision as pleasure or pain. It is suggested that the rules of behavior are programmed in our genes but modified by experience of the society in which we are reared. Our unconscious then uses the rules to reward or punish our conscious mind for the decisions it makes. This is relevant to concepts of altruism and religion in society. It is consistent with the observation that we prefer beauty to utility. The decision theory equations also explain the paradox that a single index of happiness can be applied in society. The symptoms of mental illness can be due to appropriate or inappropriate action by the unconscious. The former indicates a psychological conflict between conscious and unconscious decision making. Inappropriate action indicates that a pathological process has switched on genetic networks that should be switched off.
Semi-supervised Machine Learning for Analysis of Hydrogeochemical Data and Models
NASA Astrophysics Data System (ADS)
Vesselinov, Velimir; O'Malley, Daniel; Alexandrov, Boian; Moore, Bryan
2017-04-01
Data- and model-based analyses such as uncertainty quantification, sensitivity analysis, and decision support using complex physics models with numerous model parameters typically require a huge number of model evaluations (on the order of 10^6). Furthermore, model simulations of complex physics may require substantial computational time. For example, accounting for simultaneously occurring physical processes such as fluid flow and biogeochemical reactions in a heterogeneous porous medium may require several hours of wall-clock computational time. To address these issues, we have developed a novel methodology for semi-supervised machine learning based on Non-negative Matrix Factorization (NMF) coupled with customized k-means clustering. The algorithm allows for automated, robust Blind Source Separation (BSS) of groundwater types (contamination sources) based on model-free analyses of observed hydrogeochemical data. We have also developed reduced order modeling tools that couple support vector regression (SVR), genetic algorithms (GA), and artificial and convolutional neural networks (ANN/CNN). SVR is applied to predict the model behavior within prior uncertainty ranges associated with the model parameters. ANN and CNN procedures are applied to upscale heterogeneity of the porous medium. In the upscaling process, fine-scale high-resolution models of heterogeneity are applied to inform coarse-resolution models, which have improved computational efficiency while capturing the impact of fine-scale effects at the coarse scale of interest. These techniques are tested independently on a series of synthetic problems. We also present a decision analysis related to contaminant remediation where the developed reduced order models are applied to reproduce groundwater flow and contaminant transport in a synthetic heterogeneous aquifer. The tools are coded in Julia and are part of the MADS high-performance computational framework (https://github.com/madsjulia/Mads.jl).
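The flavor of the NMF-plus-clustering step can be sketched as follows: run NMF from many random restarts, pool the candidate source signatures, and cluster them with k-means; tight, reproducible clusters suggest a robust number of sources. This is a hypothetical simplification in Python (the authors' tools are in Julia), not the MADS implementation, and the synthetic data are invented.

```python
import numpy as np
from sklearn.decomposition import NMF
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(0)
# Synthetic stand-in: 30 wells x 8 chemical species, mixed from 3 hidden sources
true_sources = rng.random((3, 8))
mixing = rng.random((30, 3))
X = mixing @ true_sources + 0.01 * rng.random((30, 8))

for k in range(2, 6):                      # candidate number of sources
    sigs = []
    for seed in range(10):                 # random restarts of NMF
        model = NMF(n_components=k, init="random", random_state=seed, max_iter=2000)
        model.fit(X)
        sigs.append(model.components_)     # k candidate source signatures per run
    S = np.vstack(sigs)
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(S)
    score = silhouette_score(S, labels)    # cluster tightness ~ reproducibility
    print(k, round(score, 3))              # favor k with well-separated clusters
```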
Lindblom, Katrina; Gregory, Tess; Flight, Ingrid H K; Zajac, Ian
2011-01-01
Objective: This study investigated the efficacy of an internet-based personalized decision support (PDS) tool designed to aid in the decision to screen for colorectal cancer (CRC) using a fecal occult blood test. We tested whether the efficacy of the tool in influencing attitudes to screening was mediated by perceived usability and acceptability, and considered the role of computer self-efficacy and computer anxiety in these relationships. Methods: Eighty-one participants aged 50–76 years worked through the on-line PDS tool and completed questionnaires on computer self-efficacy, computer anxiety, attitudes to and beliefs about CRC screening before and after exposure to the PDS, and perceived usability and acceptability of the tool. Results: Repeated measures ANOVA found that PDS exposure led to a significant increase in knowledge about CRC and screening, and more positive attitudes to CRC screening as measured by factors from the Preventive Health Model. Perceived usability and acceptability of the PDS mediated changes in attitudes toward CRC screening (but not CRC knowledge), and computer self-efficacy and computer anxiety were significant predictors of individuals' perceptions of the tool. Conclusion: Interventions designed to decrease computer anxiety, such as computer courses and internet training, may improve the acceptability of new health information technologies including internet-based decision support tools, increasing their impact on behavior change. PMID:21857024
ERIC Educational Resources Information Center
Dulaney, Malik H.
2013-01-01
Emerging technologies challenge the management of information technology in organizations. Paradigm changing technologies, such as cloud computing, have the ability to reverse the norms in organizational management, decision making, and information technology governance. This study explores the effects of cloud computing on information technology…
Making Informed Decisions: Management Issues Influencing Computers in the Classroom.
ERIC Educational Resources Information Center
Strickland, James
A number of noninstructional factors appear to determine the extent to which computers make a difference in writing instruction. Once computers have been purchased and installed, it is generally school administrators who make management decisions, often from an uninformed pedagogical orientation. Issues such as what hardware and software to buy,…
78 FR 39233 - Data Practices, Computer III Further Remand: BOC Provision of Enhanced Services
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-01
... additional information on the rulemaking process, see the SUPPLEMENTARY INFORMATION section of this document....702 of the Commission's rules and regulations (Computer II Final Decision), 77 FCC 2d 384 (1980... Commission's decision to lift structural separation in Computer III and the implementation of ONA. In light...
Computer-Assisted Diagnostic Decision Support: History, Challenges, and Possible Paths Forward
ERIC Educational Resources Information Center
Miller, Randolph A.
2009-01-01
This paper presents a brief history of computer-assisted diagnosis, including challenges and future directions. Some ideas presented in this article on computer-assisted diagnostic decision support systems (CDDSS) derive from prior work by the author and his colleagues (see list in Acknowledgments) on the INTERNIST-1 and QMR projects. References…
How robotics programs influence young women's career choices : a grounded theory model
NASA Astrophysics Data System (ADS)
Craig, Cecilia Dosh-Bluhm
The fields of engineering, computer science, and physics have a paucity of women despite decades of intervention by universities and organizations. Women's graduation rates in these fields continue to stagnate, posing a critical problem for society. This qualitative grounded theory (GT) study sought to understand how robotics programs influenced young women's career decisions and the program's effect on engineering, physics, and computer science career interests. To this end, a study was mounted to explore how the FIRST (For Inspiration and Recognition of Science and Technology) Robotics Competition (FRC) program influenced young women's college major and career choices. Career theories suggested that experiential programs coupled with supportive relationships strongly influence career decisions, especially for science, technology, engineering, and mathematics careers. The study explored how and when young women made career decisions and how the experiential program and its mentors and role models influenced career choice. Online focus groups and interviews (online and face-to-face) with 10 female FRC alumnae and GT processes (inductive analysis, open coding, categorizations using mind maps and content clouds) were used to generate a general systems theory style model of the career decision process for these young women. The study identified gender stereotypes and other career obstacles for women. The study's conclusions include recommendations to foster connections to real-world challenges, to develop training programs for mentors, and to nurture social cohesion, a mostly untapped area. Implementing these recommendations could help grow a critical mass of women in engineering, physics, and computer science careers, a social change worth pursuing.
Huk, Alexander C.; Meister, Miriam L. R.
2012-01-01
A recent line of work has found remarkable success in relating perceptual decision-making and the spiking activity in the macaque lateral intraparietal area (LIP). In this review, we focus on questions about the neural computations in LIP that are not answered by demonstrations of neural correlates of psychological processes. We highlight three areas of limitations in our current understanding of the precise neural computations that might underlie neural correlates of decisions: (1) empirical questions not yet answered by existing data; (2) implementation issues related to how neural circuits could actually implement the mechanisms suggested by both extracellular neurophysiology and psychophysics; and (3) ecological constraints related to the use of well-controlled laboratory tasks and whether they provide an accurate window on sensorimotor computation. These issues motivate the adoption of a more general “encoding-decoding framework” that will be fruitful for more detailed contemplation of how neural computations in LIP relate to the formation of perceptual decisions. PMID:23087623
Principles of Experimental Design for Big Data Analysis.
Drovandi, Christopher C; Holmes, Christopher; McGree, James M; Mengersen, Kerrie; Richardson, Sylvia; Ryan, Elizabeth G
2017-08-01
Big Datasets are endemic, but are often notoriously difficult to analyse because of their size, heterogeneity and quality. The purpose of this paper is to open a discourse on the potential for modern decision theoretic optimal experimental design methods, which by their very nature have traditionally been applied prospectively, to improve the analysis of Big Data through retrospective designed sampling in order to answer particular questions of interest. By appealing to a range of examples, it is suggested that this perspective on Big Data modelling and analysis has the potential for wide generality and advantageous inferential and computational properties. We highlight current hurdles and open research questions surrounding efficient computational optimisation in using retrospective designs, and in part this paper is a call to the optimisation and experimental design communities to work together in the field of Big Data analysis.
The Impact of Computed Tomography on Decision Making in Tibial Plateau Fractures.
Castiglia, Marcello Teixeira; Nogueira-Barbosa, Marcello Henrique; Messias, Andre Marcio Vieira; Salim, Rodrigo; Fogagnolo, Fabricio; Schatzker, Joseph; Kfuri, Mauricio
2018-02-14
Schatzker introduced one of the most used classification systems for tibial plateau fractures, based on plain radiographs. Computed tomography brought to attention the importance of coronal plane-oriented fractures. The goal of our study was to determine whether the addition of computed tomography would affect the decision making of surgeons who usually use the Schatzker classification to assess tibial plateau fractures. Image studies of 70 patients who sustained tibial plateau fractures were uploaded to a dedicated homepage. Every patient was linked to a folder that contained two radiographic projections (anteroposterior and lateral), three interactive videos of computed tomography (axial, sagittal, and coronal), and eight pictures depicting tridimensional reconstructions of the tibial plateau. Ten attending orthopaedic surgeons, who were blinded to the cases, were granted access to the homepage and assessed each set of images in two different rounds, separated by an interval of 2 weeks. Each case was evaluated in three steps, in which surgeons had access, respectively, to radiographs, two-dimensional videos of computed tomography, and three-dimensional reconstruction images. After every step, surgeons were asked to indicate how they would classify the case using the Schatzker system and which surgical approaches would be appropriate. We evaluated the inter- and intraobserver reliability of the Schatzker classification using the Kappa concordance coefficient, as well as the impact of computed tomography on decision making regarding the surgical approach for each case, using the chi-square test and likelihood ratio. The interobserver concordance kappa coefficients after each assessment step were, respectively, 0.58, 0.62, and 0.64. For the intraobserver analysis, the coefficients were, respectively, 0.76, 0.75, and 0.78. Computed tomography changed the surgical approach selection for Schatzker types II, V, and VI (p < 0.01). The addition of computed tomography scans to plain radiographs improved the interobserver reliability of the Schatzker classification. Computed tomography had a statistically significant impact on the selection of surgical approaches for the lateral tibial plateau.
Cheung, Steven W; Aranda, Derick; Driscoll, Colin L W; Parsa, Andrew T
2010-02-01
Complex medical decision making obligates tradeoff assessments among treatment outcomes expectations, but an accessible tool to perform the necessary analysis is conspicuously absent. We aimed to demonstrate the methodology and feasibility of adapting conjoint analysis for mapping clinical outcomes expectations to treatment decisions in vestibular schwannoma (VS) management. Prospective. Tertiary medical center and US-based otologists/neurotologists. Treatment preference profiles among VS stakeholders (61 younger and 74 older prospective patients, 61 observation patients, and 60 surgeons) were assessed for the synthetic VS case scenario of a 10-mm tumor in association with useful hearing and normal facial function. Treatment attribute utility. Conjoint analysis attribute levels were set in accordance with the results of a meta-analysis. Forty-five case series were disaggregated to formulate microsurgery facial nerve and hearing preservation outcomes expectations models. Attribute utilities were computed and mapped to the realistic treatment choices of translabyrinthine craniotomy, middle fossa craniotomy, and gamma knife radiosurgery. Among the treatment attributes of likelihoods of causing deafness, temporary facial weakness for 2 months, and incurable cancer within 20 years, and recovery time, permanent deafness was less important to tumor surgeons, and temporary facial weakness was more important to tumor surgeons and observation patients (Wilcoxon rank-sum, p < 0.001). Inverse mapping of preference profiles to realistic treatment choices showed all study cohorts were inclined to choose gamma knife radiosurgery. Mapping clinical outcomes expectations to treatment decisions for a synthetic clinical scenario revealed inhomogeneous drivers of choice selection among study cohorts. Medical decision engines that analyze personal preferences of outcomes expectations for VS and many other diseases may be developed to promote shared decision making among health care stakeholders and transparency in the informed consent process.
Norris, Gareth
2015-01-01
The increasing use of multi-media applications, trial presentation software and computer generated exhibits (CGE) has raised questions as to the potential impact of presentation technology on juror decision making. Much of the commentary on the manner in which CGE exerts legal influence is anecdotal; empirical examinations, too, are often devoid of established theoretical rationalisations. This paper will examine a range of established judgement heuristics (for example, the attribution error, representativeness, simulation) in order to establish their appropriate application for comprehending legal decisions. Analysis of both past cases and empirical studies will highlight the potential for heuristics and biases to be restricted or confounded by the use of CGE. The paper will conclude with some wider discussion on admissibility, access to justice, and emerging issues in the use of multi-media in court.
Derivative Trade Optimizing Model Utilizing GP Based on Behavioral Finance Theory
NASA Astrophysics Data System (ADS)
Matsumura, Koki; Kawamoto, Masaru
This paper proposes a new technique that builds strategy trees for derivative (option) trading investment decisions based on behavioral finance theory and optimizes them using evolutionary computation in order to achieve high profitability. The strategy tree uses technical analysis based on statistical, experience-based techniques for the investment decision. The trading model is represented by various technical indexes, and the strategy tree is optimized by genetic programming (GP), one of the evolutionary computation methods. Moreover, this paper proposes a method using prospect theory, from behavioral finance, to set the psychological bias for profit and deficit, and attempts to select the appropriate strike price of the option for higher investment efficiency. This technique produced good results, demonstrating the effectiveness of the trading model with the optimized dealing strategy.
The use of decision analysis to examine ethical decision making by critical care nurses.
Hughes, K K; Dvorak, E M
1997-01-01
To examine the extent to which critical care staff nurses make ethical decisions that coincide with those recommended by a decision analytic model. Nonexperimental, ex post facto. Midwestern university-affiliated 500-bed tertiary care medical center. One hundred critical care staff nurses randomly selected from seven critical care units. Complete responses were obtained from 82 nurses (for a final response rate of 82%). The dependent variable--consistent decision making--was measured as staff nurses' abilities to make ethical decisions that coincided with those prescribed by the decision model. Subjects completed two instruments: the Ethical Decision Analytic Model, a computer-administered instrument designed to measure staff nurses' abilities to make consistent decisions about a chemically impaired colleague; and a Background Inventory. The results indicate marked consensus among nurses when informal methods were used. However, there was little consistency between the nurses' informal decisions and those recommended by the decision analytic model. Although 50% (n = 41) of all nurses chose a course of action that coincided with the model's least optimal alternative, few nurses agreed with the model as to the most optimal course of action. The findings also suggest that consistency was unrelated (p > 0.05) to the nurses' educational background or years of clinical experience; that most subjects reported receiving little or no education in decision making during their basic nursing education programs; but that exposure to decision-making strategies was related to years of nursing experience (p < 0.05). The findings differ from related studies that have found a moderate degree of consistency between nurses and decision analytic models for strictly clinical decision tasks, especially when those tasks were less complex. However, the findings partially coincide with other findings that decision analysis may not be particularly well-suited to the critical care environment. Additional research is needed to determine whether critical care nurses use the same decision-making methods as do other nurses; and to clarify the effects of decision task (clinical versus ethical) on nurses' decision making. It should not be assumed that methods used to study nurses' clinical decision making are applicable for all nurses or all types of decisions, including ethical decisions.
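For readers unfamiliar with decision analytic models, the core computation is an expected-utility ranking of alternatives. The sketch below is purely hypothetical: the action names, probabilities, and utilities are invented for illustration and are not taken from the study's Ethical Decision Analytic Model.

```python
# Hypothetical expected-utility model for courses of action regarding a
# chemically impaired colleague; the numbers are illustrative, not the study's.
alternatives = {
    "report through formal channels": [(0.70, 0.90), (0.30, 0.40)],
    "confront colleague privately":   [(0.50, 0.80), (0.50, 0.30)],
    "take no action":                 [(0.20, 0.60), (0.80, 0.10)],
}  # each alternative: list of (probability, utility) pairs over its outcomes

def expected_utility(outcomes):
    """Sum of probability-weighted utilities for one alternative."""
    return sum(p * u for p, u in outcomes)

ranking = sorted(alternatives, key=lambda a: expected_utility(alternatives[a]),
                 reverse=True)
for a in ranking:
    print(f"{a}: EU = {expected_utility(alternatives[a]):.2f}")
```

A "consistent" decision, in the study's sense, is one that agrees with the alternative this ranking places first.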
ERIC Educational Resources Information Center
Vos, Hans J.
As part of a project formulating optimal rules for decision making in computer assisted instructional systems in which the computer is used as a decision support tool, an approach that simultaneously optimizes classification of students into two treatments, each followed by a mastery decision, is presented using the framework of Bayesian decision…
Emerging Themes in Image Informatics and Molecular Analysis for Digital Pathology.
Bhargava, Rohit; Madabhushi, Anant
2016-07-11
Pathology is essential for research in disease and development, as well as for clinical decision making. For more than 100 years, pathology practice has involved analyzing images of stained, thin tissue sections by a trained human using an optical microscope. Technological advances are now driving major changes in this paradigm toward digital pathology (DP). The digital transformation of pathology goes beyond recording, archiving, and retrieving images, providing new computational tools to inform better decision making for precision medicine. First, we discuss some emerging innovations in both computational image analytics and imaging instrumentation in DP. Second, we discuss molecular contrast in pathology. Molecular DP has traditionally been an extension of pathology with molecularly specific dyes. Label-free, spectroscopic images are rapidly emerging as another important information source, and we describe the benefits and potential of this evolution. Third, we describe multimodal DP, which is enabled by computational algorithms and combines the best characteristics of structural and molecular pathology. Finally, we provide examples of application areas in telepathology, education, and precision medicine. We conclude by discussing challenges and emerging opportunities in this area.
Enabling computer decisions based on EEG input.
Culpepper, Benjamin J; Keller, Robert M
2003-12-01
Multilayer neural networks were successfully trained to classify segments of 12-channel electroencephalogram (EEG) data into one of five classes corresponding to five cognitive tasks performed by a subject. Independent component analysis (ICA) was used to segregate obvious artifact EEG components from other sources, and a frequency-band representation was used to represent the sources computed by ICA. Examples of results include an 85% accuracy rate on differentiation between two tasks, using a segment of EEG only 0.05 s long and a 95% accuracy rate using a 0.5-s-long segment.
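A pipeline of this shape (ICA unmixing, band-power features, multilayer network) can be sketched with standard tools. The sampling rate, band edges, and synthetic data below are assumptions, and real use would drop artifact components by inspection rather than keep all ICA sources.

```python
import numpy as np
from scipy.signal import welch
from sklearn.decomposition import FastICA
from sklearn.neural_network import MLPClassifier

fs = 250  # sampling rate in Hz (assumed)
bands = [(4, 8), (8, 12), (12, 30), (30, 45)]  # theta/alpha/beta/gamma

def band_features(segment):
    """segment: (channels, samples) -> band-power feature vector."""
    f, pxx = welch(segment, fs=fs, nperseg=min(segment.shape[1], 128), axis=1)
    return np.concatenate(
        [pxx[:, (f >= lo) & (f < hi)].mean(axis=1) for lo, hi in bands])

# Synthetic stand-in: 200 segments of 12-channel EEG, 5 task labels
rng = np.random.default_rng(0)
X_raw = rng.standard_normal((200, 12, 125))
y = rng.integers(0, 5, 200)

# ICA unmixes sources; artifact components would be identified and dropped
n_seg, n_ch, n_s = X_raw.shape
ica = FastICA(n_components=n_ch, random_state=0)
sources = ica.fit_transform(X_raw.transpose(0, 2, 1).reshape(-1, n_ch))
S = sources.reshape(n_seg, n_s, n_ch).transpose(0, 2, 1)

X = np.array([band_features(s) for s in S])   # frequency-band representation
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500).fit(X, y)
```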
The decision tree approach to classification
NASA Technical Reports Server (NTRS)
Wu, C.; Landgrebe, D. A.; Swain, P. H.
1975-01-01
A class of multistage decision tree classifiers is proposed and studied relative to the classification of multispectral remotely sensed data. The decision tree classifiers are shown to have the potential for improving both the classification accuracy and the computation efficiency. Dimensionality in pattern recognition is discussed and two theorems on the lower bound of logic computation for multiclass classification are derived. The automatic or optimization approach is emphasized. Experimental results on real data are reported, which clearly demonstrate the usefulness of decision tree classifiers.
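In modern terms, the core idea maps directly onto off-the-shelf tree classifiers. A minimal sketch on synthetic "multispectral" pixels (four bands, three land-cover classes; the data and depth limit are invented, and this single CART tree is a simplification of the paper's multistage designs):

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-in for multispectral pixels: 4 bands, 3 classes
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(m, 1.0, (200, 4)) for m in (0, 3, 6)])
y = np.repeat([0, 1, 2], 200)

Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
tree = DecisionTreeClassifier(max_depth=4).fit(Xtr, ytr)  # shallow tree: cheap decisions
print("accuracy:", tree.score(Xte, yte))
```

The computational-efficiency argument in the abstract is visible here: classifying a pixel costs at most `max_depth` comparisons rather than a full multiclass discriminant evaluation.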
Conditioned associations and economic decision biases.
Guitart-Masip, Marc; Talmi, Deborah; Dolan, Ray
2010-10-15
Humans show substantial deviation from rationality during economic decision making under uncertainty. A computational perspective suggests these deviations arise out of an interaction between distinct valuation systems in the brain. Here, we provide behavioural data showing that the incidental presentation of aversive and appetitive conditioned stimuli can alter subjects' preferences in an economic task, involving a choice between a safe or gamble option. These behavioural effects informed a model-based analysis of a functional magnetic resonance imaging (fMRI) experiment, involving an identical paradigm, where we demonstrate that this conditioned behavioral bias engages the amygdala, a brain structure associated with acquisition and expression of conditioned associations. Our findings suggest that a well known bias in human economic choice can arise from an influence of conditioned associations on goal-directed decision making, consistent with an architecture of choice that invokes distinct decision-making systems.
NASA Astrophysics Data System (ADS)
Augustine, Kurt E.; Camp, Jon J.; Holmes, David R.; Huddleston, Paul M.; Lu, Lichun; Yaszemski, Michael J.; Robb, Richard A.
2012-03-01
Failure of the spine's structural integrity from metastatic disease can lead to both pain and neurologic deficit. Fractures that require treatment occur in over 30% of bony metastases. Our objective is to use computed tomography (CT) in conjunction with previously developed analytic techniques to predict fracture risk in cancer patients with metastatic disease to the spine. Current clinical practice for cancer patients with spine metastasis often requires an empirical decision regarding spinal reconstructive surgery. Early image-based software systems used for CT analysis are time consuming and poorly suited for clinical application. The Biomedical Image Resource (BIR) at Mayo Clinic, Rochester, has developed an image analysis computer program that calculates, from CT scans, the residual load-bearing capacity of a vertebra with metastatic cancer. The Spine Cancer Assessment (SCA) program is built on a platform designed for clinical practice, with a workflow format that allows for rapid selection of patient CT exams, followed by guided image analysis tasks, resulting in a fracture risk report. The analysis features allow the surgeon to quickly isolate a single vertebra and obtain an immediate pre-surgical multiple parallel section composite beam fracture risk analysis based on algorithms developed at Mayo Clinic. The analysis software is undergoing clinical validation studies. We expect this approach will facilitate patient management and the use of reliable guidelines for selecting among various treatment options based on fracture risk.
Tu, Samson W; Hrabak, Karen M; Campbell, James R; Glasgow, Julie; Nyman, Mark A; McClure, Robert; McClay, James; Abarbanel, Robert; Mansfield, James G; Martins, Susana M; Goldstein, Mary K; Musen, Mark A
2006-01-01
Developing computer-interpretable clinical practice guidelines (CPGs) to provide decision support for guideline-based care is an extremely labor-intensive task. In the EON/ATHENA and SAGE projects, we formulated substantial portions of CPGs as computable statements that express declarative relationships between patient conditions and possible interventions. We developed query and expression languages that allow a decision-support system (DSS) to evaluate these statements in specific patient situations. A DSS can use these guideline statements in multiple ways, including: (1) as inputs for determining preferred alternatives in decision-making, and (2) as a way to provide targeted commentaries in the clinical information system. The use of these declarative statements significantly reduces the modeling expertise and effort required to create and maintain computer-interpretable knowledge bases for decision-support purposes. We discuss possible implications for sharing such knowledge bases.
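A toy illustration of such declarative condition-to-intervention statements, with a trivial evaluator standing in for the query and expression languages. The patient fields, thresholds, and rule are invented; this is not the EON/ATHENA or SAGE formalism.

```python
# Hypothetical declarative statements: each relates patient conditions to a
# possible intervention plus a commentary, evaluated against a patient record.
patient = {"sbp": 162, "diabetes": True, "on_ace_inhibitor": False}

rules = [
    {"condition": lambda p: p["sbp"] >= 140 and p["diabetes"]
                            and not p["on_ace_inhibitor"],
     "intervention": "consider ACE inhibitor",
     "commentary": "elevated BP with diabetes; preferred first-line agent"},
]

for rule in rules:
    if rule["condition"](patient):          # DSS evaluates statement on patient data
        print(rule["intervention"], "--", rule["commentary"])
```

The point of the declarative form is that the same statement can drive both uses named in the abstract: ranking alternatives and emitting targeted commentaries.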
Self-evaluation of decision-making: A general Bayesian framework for metacognitive computation.
Fleming, Stephen M; Daw, Nathaniel D
2017-01-01
People are often aware of their mistakes, and report levels of confidence in their choices that correlate with objective performance. These metacognitive assessments of decision quality are important for the guidance of behavior, particularly when external feedback is absent or sporadic. However, a computational framework that accounts for both confidence and error detection is lacking. In addition, accounts of dissociations between performance and metacognition have often relied on ad hoc assumptions, precluding a unified account of intact and impaired self-evaluation. Here we present a general Bayesian framework in which self-evaluation is cast as a "second-order" inference on a coupled but distinct decision system, computationally equivalent to inferring the performance of another actor. Second-order computation may ensue whenever there is a separation between internal states supporting decisions and confidence estimates over space and/or time. We contrast second-order computation against simpler first-order models in which the same internal state supports both decisions and confidence estimates. Through simulations we show that second-order computation provides a unified account of different types of self-evaluation often considered in separate literatures, such as confidence and error detection, and generates novel predictions about the contribution of one's own actions to metacognitive judgments. In addition, the model provides insight into why subjects' metacognition may sometimes be better or worse than task performance. We suggest that second-order computation may underpin self-evaluative judgments across a range of domains.
MoCog1: A computer simulation of recognition-primed human decision making, considering emotions
NASA Technical Reports Server (NTRS)
Gevarter, William B.
1992-01-01
The successful results of the first stage of a research effort to develop a versatile computer model of motivated human cognitive behavior are reported. Most human decision making appears to be an experience-based, relatively straightforward, largely automatic response to situations, utilizing cues and opportunities perceived from the current environment. The development of the architecture and computer program associated with such 'recognition-primed' decision making, with emotions taken into account, is described. The resultant computer program (MoCog1) was successfully utilized as a vehicle to simulate earlier findings that relate how an individual's implicit theories orient the individual toward particular goals, with resultant cognitions, affects, and behavior in response to their environment.
MoCog1: A computer simulation of recognition-primed human decision making
NASA Technical Reports Server (NTRS)
Gevarter, William B.
1991-01-01
The results of the first stage of a research effort to develop a 'sophisticated' computer model of human cognitive behavior are described. Most human decision making is an experience-based, relatively straight-forward, largely automatic response to internal goals and drives, utilizing cues and opportunities perceived from the current environment. The development of the architecture and computer program (MoCog1) associated with such 'recognition-primed' decision making is discussed. The resultant computer program was successfully utilized as a vehicle to simulate earlier findings that relate how an individual's implicit theories orient the individual toward particular goals, with resultant cognitions, affects, and behavior in response to their environment.
An automated approach to the design of decision tree classifiers
NASA Technical Reports Server (NTRS)
Argentiero, P.; Chin, R.; Beaudet, P.
1982-01-01
An automated technique is presented for designing effective decision tree classifiers predicated only on a priori class statistics. The procedure relies on linear feature extractions and Bayes table look-up decision rules. Associated error matrices are computed and utilized to provide an optimal design of the decision tree at each so-called 'node'. A by-product of this procedure is a simple algorithm for computing the global probability of correct classification assuming the statistical independence of the decision rules. Attention is given to a more precise definition of decision tree classification, the mathematical details on the technique for automated decision tree design, and an example of a simple application of the procedure using class statistics acquired from an actual Landsat scene.
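The global probability of correct classification under the independence assumption is simply a prior-weighted product of per-node correct-branch probabilities along each class's path through the tree. A hypothetical three-class sketch (the tree shape, node probabilities, and priors are invented):

```python
import numpy as np

# Hypothetical tree: node 1 splits {A} vs {B, C}; node 2 splits {B} vs {C}.
# Each entry is P(correct branch at that node | true class), from node error matrices.
p_node1 = {"A": 0.95, "B": 0.90, "C": 0.92}
p_node2 = {"B": 0.88, "C": 0.91}            # node 2 is reached only by B and C
priors = {"A": 0.4, "B": 0.3, "C": 0.3}

paths = {"A": [p_node1], "B": [p_node1, p_node2], "C": [p_node1, p_node2]}
p_correct = sum(priors[c] * np.prod([node[c] for node in paths[c]])
                for c in priors)            # independence of node decisions assumed
print(f"global P(correct) = {p_correct:.3f}")
```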
NASA Astrophysics Data System (ADS)
Ren, Lixia; He, Li; Lu, Hongwei; Chen, Yizhong
2016-08-01
A new Monte Carlo-based interval transformation analysis (MCITA) is used in this study for multi-criteria decision analysis (MCDA) of naphthalene-contaminated groundwater management strategies. The analysis can be conducted when input data such as total cost, contaminant concentration, and health risk are represented as intervals. Compared to traditional MCDA methods, MCITA-MCDA has the advantages of (1) dealing with the inexactness of input data represented as intervals, (2) mitigating computational time through the Monte Carlo sampling method, and (3) identifying the most desirable management strategies under data uncertainty. A real-world case study is employed to demonstrate the performance of this method. A set of inexact management alternatives is considered for each duration on the basis of four criteria. Results indicated that the most desirable management strategy lay in action 15 for the 5-year, action 8 for the 10-year, action 12 for the 15-year, and action 2 for the 20-year management horizon.
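A stripped-down version of interval-input MCDA via Monte Carlo sampling might look like the following. The actions, intervals, and criterion weights are invented, and the actual MCITA transformation is more involved than this direct sampling.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical intervals [lo, hi] per action for (cost, concentration, risk);
# all criteria oriented so that lower is better.
actions = {
    "action 2":  [(1.0, 1.4), (0.20, 0.50), (0.10, 0.20)],
    "action 8":  [(0.8, 1.6), (0.30, 0.40), (0.08, 0.25)],
    "action 15": [(0.9, 1.2), (0.25, 0.45), (0.12, 0.18)],
}
weights = np.array([0.4, 0.3, 0.3])   # invented criterion weights

wins = {a: 0 for a in actions}
for _ in range(10000):
    # sample each interval uniformly, score each action by the weighted sum
    scores = {a: weights @ [rng.uniform(lo, hi) for lo, hi in ivals]
              for a, ivals in actions.items()}
    wins[min(scores, key=scores.get)] += 1   # lowest score is most desirable
print(wins)   # frequency each action is best under interval uncertainty
```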
ERIC Educational Resources Information Center
Agasisti, Tommaso; Johnes, Geraint
2009-01-01
We employ Data Envelopment Analysis to compute the technical efficiency of Italian and English higher education institutions. Our results show that, in relation to the country-specific frontier, institutions in both countries are typically very efficient. However, institutions in England are more efficient than those in Italy when we compare…
Capital Budgeting Guidelines: How to Decide Whether to Fund a New Dorm or an Upgraded Computer Lab.
ERIC Educational Resources Information Center
Swiger, John; Klaus, Allen
1996-01-01
A process for college and university decision making and budgeting for capital outlays that focuses on evaluating the qualitative and quantitative benefits of each proposed project is described and illustrated. The process provides a means to solicit suggestions from those involved and provide detailed information for cost-benefit analysis. (MSE)
Finding P-Values for F Tests of Hypothesis on a Spreadsheet.
ERIC Educational Resources Information Center
Rochowicz, John A., Jr.
The calculation of the F statistic for a one-factor analysis of variance (ANOVA) and the construction of an ANOVA table are easily implemented on a spreadsheet. This paper describes how to compute the p-value (observed significance level) for a particular F statistic on a spreadsheet. Decision making on a spreadsheet and applications to the…
Application of wildfire simulation models for risk analysis
Alan A. Ager; Mark A. Finney
2009-01-01
Wildfire simulation models are being widely used by fire and fuels specialists in the U.S. to support tactical and strategic decisions related to the mitigation of wildfire risk. Much of this application has resulted from the development of a minimum travel time (MTT) fire spread algorithm (M. Finney) that makes it computationally feasible to simulate thousands of...
William H. McWilliams; Carol L. Alerich; William A. Bechtold; Mark Hansen; Christopher M. Oswalt; Mike Thompson; Jeff Turner
2012-01-01
The U.S. Department of Agriculture, Forest Service, Forest Inventory and Analysis (FIA) program maintains the National Information Management System (NIMS) that provides the computational framework for the annual forest inventory of the United States. Questions regarding the impact of key elements of programming logic, processing criteria, and estimation procedures...
The Automated Logistics Element Planning System (ALEPS)
NASA Technical Reports Server (NTRS)
Schwaab, Douglas G.
1991-01-01
The design and functions of ALEPS (Automated Logistics Element Planning System), a computer system that will automate planning and decision support for Space Station Freedom Logistical Elements (LEs) resupply and return operations, are described. ALEPS provides data management, planning, analysis, monitoring, interfacing, and flight certification in support of LE flight load planning activities. The prototype ALEPS algorithm development is described.
Reinforcement learning and decision making in monkeys during a competitive game.
Lee, Daeyeol; Conroy, Michelle L; McGreevy, Benjamin P; Barraclough, Dominic J
2004-12-01
Animals living in a dynamic environment must adjust their decision-making strategies through experience. To gain insights into the neural basis of such adaptive decision-making processes, we trained monkeys to play a competitive game against a computer in an oculomotor free-choice task. The animal selected one of two visual targets in each trial and was rewarded only when it selected the same target as the computer opponent. To determine how the animal's decision-making strategy can be affected by the opponent's strategy, the computer opponent was programmed with three different algorithms that exploited different aspects of the animal's choice and reward history. When the computer selected its targets randomly with equal probabilities, animals selected one of the targets more often, violating the prediction of probability matching, and their choices were systematically influenced by the choice history of the two players. When the computer exploited only the animal's choice history but not its reward history, the animal's choices became more independent of its own choice history but were still related to the choice history of the opponent. This bias was substantially reduced, but not completely eliminated, when the computer used the choice history of both players in making its predictions. These biases were consistent with the predictions of reinforcement learning, suggesting that the animals sought optimal decision-making strategies using reinforcement learning algorithms.
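The logic of such a competitive task is easy to simulate: a reinforcement learner updates action values from reward and chooses via softmax, while the computer opponent predicts the learner's next choice from recent history and plays to deny reward. Everything below (learning rate, temperature, the 5-trial predictor) is an invented illustration, not the study's actual algorithms.

```python
import numpy as np

rng = np.random.default_rng(0)
q = np.zeros(2)            # action values for the two targets
alpha, beta = 0.2, 3.0     # learning rate, softmax inverse temperature
history = []

for t in range(1000):
    p = np.exp(beta * q)
    p /= p.sum()                         # softmax choice probabilities
    a = rng.choice(2, p=p)
    # opponent: predict the learner's choice from its recent choice frequency,
    # then pick the other target so a predictable learner goes unrewarded
    if len(history) >= 5:
        pred = int(np.mean(history[-5:]) > 0.5)
    else:
        pred = int(rng.integers(2))
    comp = 1 - pred                      # competitor avoids the predicted choice
    r = 1.0 if a == comp else 0.0        # reward only when matching the computer
    q[a] += alpha * (r - q[a])           # Rescorla-Wagner style value update
    history.append(int(a))
```

Against such an exploiting opponent, the learner's value updates push its choice frequencies toward unpredictability, mirroring the shift away from history-dependent biases described in the abstract.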
Development of Fuzzy Logic and Soft Computing Methodologies
NASA Technical Reports Server (NTRS)
Zadeh, L. A.; Yager, R.
1999-01-01
Our earlier research on computing with words (CW) has led to a new direction in fuzzy logic which points to a major enlargement of the role of natural languages in information processing, decision analysis and control. This direction is based on the methodology of computing with words and embodies a new theory which is referred to as the computational theory of perceptions (CTP). An important feature of this theory is that it can be added to any existing theory - especially to probability theory, decision analysis, and control - and enhance the ability of the theory to deal with real-world problems in which the decision-relevant information is a mixture of measurements and perceptions. The new direction is centered on an old concept - the concept of a perception - a concept which plays a central role in human cognition. The ability to reason with perceptions - perceptions of time, distance, force, direction, shape, intent, likelihood, truth and other attributes of physical and mental objects - underlies the remarkable human capability to perform a wide variety of physical and mental tasks without any measurements and any computations. Everyday examples of such tasks are parking a car, driving in city traffic, cooking a meal, playing golf and summarizing a story. Perceptions are intrinsically imprecise. Imprecision of perceptions reflects the finite ability of sensory organs and, ultimately, the brain to resolve detail and store information. More concretely, perceptions are both fuzzy and granular, or, for short, f-granular. Perceptions are f-granular in the sense that: (a) the boundaries of perceived classes are not sharply defined; and (b) the elements of classes are grouped into granules, with a granule being a clump of elements drawn together by indistinguishability, similarity, proximity or functionality. F-granularity of perceptions may be viewed as a human way of achieving data compression. In large measure, scientific progress has been, and continues to be, driven by a quest to progress from perceptions to measurements. Pursuit of this aim has led to brilliant successes. But alongside the successes stand problems whose solutions are not in sight. Representative of such problems is the problem of automation of driving in city traffic. In this case, as in many others, what can be done with ease by humans - without any measurements and any computations - is an intractable task for machines.
Computer Applications in Social Studies.
ERIC Educational Resources Information Center
White, Charles S.
1988-01-01
Examines "Decisions, Decisions-Revolutionary Wars: Choosing Sides," an Apple II software package that emphasizes student decision-making about the nature of revolutions. Targeted at grades 5-12, the product covers a broad range of issues. Concludes that "Decisions, Decisions" models an effective decision-making process and has…
Data Mining and Knowledge Discover - IBM Cognitive Alternatives for NASA KSC
NASA Technical Reports Server (NTRS)
Velez, Victor Hugo
2016-01-01
Skillful tools in cognitive computing that transform industries have been found favorable and profitable for different Directorates at NASA KSC. This study shows how cognitive computing systems can be useful for NASA when computers are trained in the same way humans are, gaining knowledge over time. Increasing knowledge through senses, learning, and a summation of events is how the applications created by IBM empower artificial intelligence in a cognitive computing system. Over the last decades, NASA has explored and applied the artificial intelligence approach, specifically cognitive computing, in a few projects adopting models similar to those proposed by IBM Watson. However, the use of semantic technologies by IBM's dedicated business unit allows these cognitive computing applications to outperform the functionality of in-house tools and to present outstanding analysis that facilitates decision making for managers and leads in a management information system.
Research on AHP decision algorithms based on BP algorithm
NASA Astrophysics Data System (ADS)
Ma, Ning; Guan, Jianhe
2017-10-01
Decision making is the thinking activity by which people choose or judge, and scientific decision making has always been a hot issue in the field of research. The Analytic Hierarchy Process (AHP) is a simple and practical multi-criteria, multi-objective decision-making method that combines quantitative and qualitative analysis and can express and calculate subjective judgments in numerical form. In decision analysis using the AHP method, the rationality of the pairwise comparison judgment matrix has a great influence on the decision result. However, in dealing with real problems, the judgment matrix produced by pairwise comparison is often inconsistent, that is, it does not meet the consistency requirements. The BP neural network algorithm is an adaptive nonlinear dynamic system. It has powerful collective computing ability and learning ability, and it can refine the data by constantly modifying the weights and thresholds of the network to minimize the mean square error. In this paper, the BP algorithm is used to deal with the consistency of the pairwise comparison judgment matrix of the AHP.
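For reference, the standard AHP computation that the consistency check concerns: derive the priority vector from the principal eigenvector of the pairwise comparison matrix and form the consistency ratio (CR < 0.1 is conventionally acceptable). The 3x3 matrix below is invented, and the paper's BP-network repair step is not shown.

```python
import numpy as np

# Hypothetical reciprocal pairwise comparison matrix over 3 criteria
A = np.array([[1.0, 3.0, 0.50],
              [1/3, 1.0, 0.25],
              [2.0, 4.0, 1.00]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                                 # priority vector (criterion weights)

n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)         # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]          # Saaty's random index
cr = ci / ri                                 # consistency ratio
print("weights:", np.round(w, 3), "CR:", round(cr, 3))
```

When CR exceeds 0.1, the matrix entries must be revised; the paper's contribution is to let a BP network perform that revision automatically.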
Aguirre-Junco, Angel-Ricardo; Colombet, Isabelle; Zunino, Sylvain; Jaulent, Marie-Christine; Leneveut, Laurence; Chatellier, Gilles
2004-01-01
The initial step in the computerization of guidelines is knowledge specification from the prose text of the guidelines. We describe a method of knowledge specification based on a structured and systematic analysis of text, allowing detailed specification of a decision tree. We use decision tables to validate the decision algorithm and decision trees to specify and represent this algorithm, along with elementary messages of recommendation. Editing tools are also necessary to facilitate the process of validation and the workflow between the expert physicians who validate the specified knowledge and the computer scientists who encode it in a guideline model. Applied to eleven different guidelines issued by an official agency, the method allows quick and valid computerization and integration into a larger decision support system called EsPeR (Personalized Estimate of Risks). However, the quality of the source guideline text still needs further development.
Reason, emotion and decision-making: risk and reward computation with feeling.
Quartz, Steven R
2009-05-01
Many models of judgment and decision-making posit distinct cognitive and emotional contributions to decision-making under uncertainty. Cognitive processes typically involve exact computations according to a cost-benefit calculus, whereas emotional processes typically involve approximate, heuristic processes that deliver rapid evaluations without mental effort. However, it remains largely unknown which specific parameters of uncertain decisions the brain encodes, the extent to which these parameters correspond to various decision-making frameworks, and their correspondence to emotional and rational processes. Here, I review research suggesting that emotional processes encode in a precise quantitative manner the basic parameters of financial decision theory, indicating a reorientation of emotional and cognitive contributions to risky choice.
A Neural Information Field Approach to Computational Cognition
2016-11-18
We have extended our perceptual decision making model to account for the effects of context in flexible decision making; developed a new perceptual decision making model; and demonstrated adaptive motor control in a large-scale cognitive simulation with spiking neurons (Spaun). Report subject terms: EOARD, Computational Cognition, Mixed-initiative decision making.
NASA Astrophysics Data System (ADS)
Iacobucci, Joseph V.
The research objective for this manuscript is to develop a Rapid Architecture Alternative Modeling (RAAM) methodology to enable traceable Pre-Milestone A decision making during the conceptual phase of design of a system of systems. Rather than following current trends that place an emphasis on adding more analysis, which tends to increase the complexity of the decision making problem, RAAM improves on current methods by reducing both runtime and model creation complexity. RAAM draws upon principles from computer science, system architecting, and domain specific languages to enable the automatic generation and evaluation of architecture alternatives. For example, both mission dependent and mission independent metrics are considered. Mission dependent metrics are determined by the performance of systems accomplishing a task, such as Probability of Success. In contrast, mission independent metrics, such as acquisition cost, are solely determined and influenced by the other systems in the portfolio. RAAM also leverages advances in parallel computing to significantly reduce runtime by defining executable models that are readily amenable to parallelization. This allows the use of cloud computing infrastructures such as Amazon's Elastic Compute Cloud and the PASTEC cluster operated by the Georgia Institute of Technology Research Institute (GTRI). Also, the amount of data that can be generated when fully exploring the design space can quickly exceed the typical capacity of computational resources at the analyst's disposal. To counter this, specific algorithms and techniques are employed. Streaming algorithms and recursive architecture alternative evaluation algorithms are used that reduce computer memory requirements. Lastly, a domain specific language is created to provide a reduction in the computational time of executing the system of systems models. A domain specific language is a small, usually declarative language that offers expressive power focused on a particular problem domain by establishing an effective means to communicate the semantics from the RAAM framework. These techniques make it possible to include diverse multi-metric models within the RAAM framework in addition to system and operational level trades. A canonical example was used to explore the uses of the methodology. The canonical example contains all of the features of a full system of systems architecture analysis study but uses fewer tasks and systems. Using RAAM with the canonical example it was possible to consider both system and operational level trades in the same analysis. Once the methodology had been tested with the canonical example, a Suppression of Enemy Air Defenses (SEAD) capability model was developed. Due to the sensitive nature of analyses on that subject, notional data was developed. The notional data has similar trends and properties to realistic Suppression of Enemy Air Defenses data. RAAM was shown to be traceable and provided a mechanism for a unified treatment of a variety of metrics. The SEAD capability model demonstrated lower computer runtimes and reduced model creation complexity as compared to methods currently in use. To determine the usefulness of the implementation of the methodology on current computing hardware, RAAM was tested with system of systems architecture studies of different sizes. This was necessary since a system of systems may be called upon to accomplish thousands of tasks.
It has been clearly demonstrated that RAAM is able to enumerate and evaluate the types of large, complex design spaces usually encountered in capability based design, oftentimes providing the ability to efficiently search the entire decision space. The core algorithms for generation and evaluation of alternatives scale linearly with expected problem sizes. The SEAD capability model outputs prompted the discovery of a new issue: the data storage and manipulation requirements for an analysis. Two strategies were developed to counter large data sizes, the use of portfolio views and top 'n' analysis. This proved the usefulness of the RAAM framework and methodology during Pre-Milestone A capability based analysis.
Efficient Computation of Info-Gap Robustness for Finite Element Models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stull, Christopher J.; Hemez, Francois M.; Williams, Brian J.
2012-07-05
A recent research effort at LANL proposed info-gap decision theory as a framework by which to measure the predictive maturity of numerical models. Info-gap theory explores the trade-offs between accuracy, that is, the extent to which predictions reproduce the physical measurements, and robustness, that is, the extent to which predictions are insensitive to modeling assumptions. Both accuracy and robustness are necessary to demonstrate predictive maturity. However, conducting an info-gap analysis can present a formidable challenge, from the standpoint of the required computational resources. This is because a robustness function requires the resolution of multiple optimization problems. This report offers an alternative, adjoint methodology to assess the info-gap robustness of Ax = b-like numerical models solved for a solution x. Two situations that can arise in structural analysis and design are briefly described and contextualized within the info-gap decision theory framework. The treatments of the info-gap problems, using the adjoint methodology, are outlined in detail, and the latter problem is solved for four separate finite element models. As compared to statistical sampling, the proposed methodology offers highly accurate approximations of info-gap robustness functions for the finite element models considered in the report, at a small fraction of the computational cost. It is noted that this report considers only linear systems; a natural follow-on study would extend the methodologies described herein to include nonlinear systems.
Robust averaging protects decisions from noise in neural computations
Herce Castañón, Santiago; Solomon, Joshua A.; Vandormael, Hildward
2017-01-01
An ideal observer will give equivalent weight to sources of information that are equally reliable. However, when averaging visual information, human observers tend to downweight or discount features that are relatively outlying or deviant (‘robust averaging’). Why humans adopt an integration policy that discards important decision information remains unknown. Here, observers were asked to judge the average tilt in a circular array of high-contrast gratings, relative to an orientation boundary defined by a central reference grating. Observers showed robust averaging of orientation, and the extent to which they did so was a positive predictor of their overall performance. Using computational simulations, we show that although robust averaging is suboptimal for a perfect integrator, it paradoxically enhances performance in the presence of “late” noise, i.e., noise that corrupts decisions during integration. In other words, robust decision strategies increase the brain’s resilience to noise arising in neural computations during decision-making. PMID:28841644
Computer-assisted diagnostic decision support: history, challenges, and possible paths forward.
Miller, Randolph A
2009-09-01
This paper presents a brief history of computer-assisted diagnosis, including challenges and future directions. Some ideas presented in this article on computer-assisted diagnostic decision support systems (CDDSS) derive from prior work by the author and his colleagues (see list in Acknowledgments) on the INTERNIST-1 and QMR projects. References indicate the original sources of many of these ideas.
Guidi, G; Pettenati, M C; Miniati, R; Iadanza, E
2012-01-01
In this paper we describe a Heart Failure analysis Dashboard that, combined with a handy device for the automatic acquisition of a set of the patient's clinical parameters, supports telemonitoring functions. The Dashboard's intelligent core is a Computer Decision Support System designed to assist the clinical decisions of non-specialist caring personnel, and it is based on three functional parts: Diagnosis, Prognosis, and Follow-up management. Four Artificial Intelligence-based techniques are compared for providing the diagnosis function: a Neural Network, a Support Vector Machine, a Classification Tree, and a Fuzzy Expert System whose rules are produced by a Genetic Algorithm. State of the art algorithms are used to support a score-based prognosis function. The patient's Follow-up is used to refine the diagnosis.
Group Augmentation in Realistic Visual-Search Decisions via a Hybrid Brain-Computer Interface.
Valeriani, Davide; Cinel, Caterina; Poli, Riccardo
2017-08-10
Groups have increased sensing and cognition capabilities that typically allow them to make better decisions. However, factors such as communication biases and time constraints can lead to less-than-optimal group decisions. In this study, we use a hybrid Brain-Computer Interface (hBCI) to improve the performance of groups undertaking a realistic visual-search task. Our hBCI extracts neural information from EEG signals and combines it with response times to build an estimate of the decision confidence. This is used to weigh individual responses, resulting in improved group decisions. We compare the performance of hBCI-assisted groups with the performance of non-BCI groups using standard majority voting, and non-BCI groups using weighted voting based on reported decision confidence. We also investigate the impact on group performance of a computer-mediated form of communication between members. Results across three experiments suggest that the hBCI provides significant advantages over non-BCI decision methods in all cases. We also found that our form of communication increases individual error rates by almost 50% compared to non-communicating observers, which also results in worse group performance. Communication also makes reported confidence uncorrelated with the decision correctness, thereby nullifying its value in weighing votes. In summary, best decisions are achieved by hBCI-assisted, non-communicating groups.
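The weighting scheme described above can be illustrated with a minimal sketch (not the authors' actual EEG pipeline): each observer contributes a binary decision plus a confidence estimate, and the group decision is a confidence-weighted vote. The observer accuracies and the log-odds weighting below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_observer(truth, accuracy):
    """Return an observer's decision (+/-1) and a noisy confidence estimate."""
    correct = rng.random() < accuracy
    decision = truth if correct else -truth
    # Confidence is informative but imperfect: higher on correct trials.
    confidence = np.clip(rng.normal(0.7 if correct else 0.4, 0.15), 0.01, 0.99)
    return decision, confidence

def group_decision(decisions, confidences, weighted=True):
    # Log-odds weighting: w_i = log(c_i / (1 - c_i)); plain majority if unweighted.
    w = np.log(confidences / (1 - confidences)) if weighted else np.ones_like(confidences)
    return int(np.sign(np.sum(w * decisions))) or 1  # break ties toward +1

n_trials, accuracies = 5000, [0.6, 0.65, 0.7, 0.75, 0.8]
for weighted in (False, True):
    hits = 0
    for _ in range(n_trials):
        truth = int(rng.choice([-1, 1]))
        d, c = zip(*(simulate_observer(truth, a) for a in accuracies))
        hits += group_decision(np.array(d), np.array(c), weighted) == truth
    print("weighted" if weighted else "majority", hits / n_trials)
```

As long as confidence is even loosely correlated with correctness, the weighted vote reliably beats the simple majority, which is the effect the hBCI exploits.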
Type-2 fuzzy set extension of DEMATEL method combined with perceptual computing for decision making
NASA Astrophysics Data System (ADS)
Hosseini, Mitra Bokaei; Tarokh, Mohammad Jafar
2013-05-01
Most decision making methods used to evaluate a system or to identify its strong and weak points are based on fuzzy sets: they evaluate the criteria with words that are modeled as fuzzy sets. The ambiguity and vagueness of the words, and the different perceptions of a word, are not considered in these methods. For this reason, decision making methods that consider the perceptions of decision makers are desirable. Perceptual computing is a subjective judgment method built on the premise that words mean different things to different people. This method models words with interval type-2 fuzzy sets, which capture the uncertainty of the words. Also, there are interrelations and dependencies between decision making criteria in the real world; therefore, decision making methods that cannot consider these relations are not feasible in some situations. The Decision-Making Trial and Evaluation Laboratory (DEMATEL) method considers the interrelations between decision making criteria. The current study used the combination of DEMATEL and perceptual computing in order to improve decision making methods. To this end, the fuzzy DEMATEL method was extended to type-2 fuzzy sets in order to obtain the weights of dependent criteria based on words. The application of the proposed method is presented for knowledge management evaluation criteria.
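For reference, the crisp DEMATEL core that the paper extends to interval type-2 fuzzy sets reduces to a few matrix operations. A minimal sketch, using an invented 4-criterion direct-influence matrix:

```python
import numpy as np

# Direct-influence matrix: entry [i][j] = expert-rated influence of criterion i
# on criterion j (0-4 scale). Values here are illustrative placeholders.
D = np.array([[0, 3, 2, 1],
              [1, 0, 3, 2],
              [2, 1, 0, 3],
              [1, 2, 1, 0]], dtype=float)

# Step 1: normalize by the largest row sum.
N = D / D.sum(axis=1).max()

# Step 2: total-relation matrix T = N (I - N)^-1, summing direct and all
# indirect influence paths.
T = N @ np.linalg.inv(np.eye(len(N)) - N)

r = T.sum(axis=1)   # total influence dispatched by each criterion
c = T.sum(axis=0)   # total influence received by each criterion
print("prominence (r + c):", r + c)   # overall importance of each criterion
print("relation   (r - c):", r - c)   # > 0: net cause, < 0: net effect
```

The type-2 fuzzy extension replaces the crisp entries of D with word models and carries interval uncertainty through these same steps.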
A Monte-Carlo game theoretic approach for Multi-Criteria Decision Making under uncertainty
NASA Astrophysics Data System (ADS)
Madani, Kaveh; Lund, Jay R.
2011-05-01
Game theory provides a useful framework for studying Multi-Criteria Decision Making problems. This paper suggests modeling Multi-Criteria Decision Making problems as strategic games and solving them using non-cooperative game theory concepts. The suggested method can be used to prescribe non-dominated solutions and also to predict the outcome of a decision making problem. Non-cooperative stability definitions for solving the games allow consideration of non-cooperative behaviors, often neglected by other methods which assume perfect cooperation among decision makers. To deal with the uncertainty in input variables, a Monte-Carlo Game Theory (MCGT) approach is suggested which maps the stochastic problem into many deterministic strategic games. The games are solved using non-cooperative stability definitions, and the results include possible effects of uncertainty in input variables on outcomes. The method can handle multi-criteria, multi-decision-maker problems with uncertainty. The suggested method does not require criteria weighting, a compound decision objective, or accurate quantitative (cardinal) information, as it simplifies the decision analysis by solving problems based on qualitative (ordinal) information, reducing the computational burden substantially. The MCGT method is applied to analyze California's Sacramento-San Joaquin Delta problem. The suggested method provides insights, identifies non-dominated alternatives, and predicts likely decision outcomes.
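The MCGT idea, mapping a stochastic multi-criteria problem onto many deterministic games and tallying their stable outcomes, can be sketched as follows. The random payoff draws below stand in for the paper's uncertain input variables, and only pure-strategy Nash stability is checked:

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(1)
n_alt = 3          # alternatives available to each of two decision makers
n_samples = 2000   # Monte-Carlo draws over uncertain performance inputs

def pure_nash(u1, u2):
    """Pure-strategy Nash equilibria of a bimatrix game given payoff matrices."""
    eq = []
    for i, j in product(range(n_alt), repeat=2):
        # (i, j) is stable if neither player can gain by unilateral deviation.
        if u1[i, j] >= u1[:, j].max() and u2[i, j] >= u2[i, :].max():
            eq.append((i, j))
    return eq

counts = np.zeros((n_alt, n_alt))
for _ in range(n_samples):
    # Each Monte-Carlo draw yields one deterministic game (placeholder payoffs).
    u1, u2 = rng.normal(size=(n_alt, n_alt)), rng.normal(size=(n_alt, n_alt))
    for i, j in pure_nash(u1, u2):
        counts[i, j] += 1

print("fraction of samples in which each joint outcome is stable:")
print(counts / n_samples)
```

The outcome frequencies play the role of the paper's uncertainty-aware prediction: outcomes that remain stable across most sampled games are the likely decision outcomes.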
ERIC Educational Resources Information Center
Rukeyser, William L.; Kuersten, Joan
1998-01-01
Parents must be well-informed about computer technology in order to make appropriate decisions for their schools. The paper discusses pros and cons of computers in the schools, the importance of dispelling myths about education and technology, and parents' roles in making decisions. (SM)
Automated Detection of Events of Scientific Interest
NASA Technical Reports Server (NTRS)
James, Mark
2007-01-01
A report presents a slightly different perspective of the subject matter of Fusing Symbolic and Numerical Diagnostic Computations (NPO-42512), which appears elsewhere in this issue of NASA Tech Briefs. Briefly, the subject matter is the X-2000 Anomaly Detection Language, which is a developmental computing language for fusing two diagnostic computer programs (one implementing a numerical analysis method, the other a symbolic analysis method) into a unified event-based decision analysis software system for real-time detection of events. In the case of the cited companion NASA Tech Briefs article, the contemplated events that one seeks to detect would be primarily failures or other changes that could adversely affect the safety or success of a spacecraft mission. In the case of the instant report, the events to be detected could also include natural phenomena that could be of scientific interest. Hence, the use of the X-2000 Anomaly Detection Language could contribute to a capability for automated, coordinated use of multiple sensors and sensor-output-data-processing hardware and software to effect opportunistic collection and analysis of scientific data.
A neuromorphic network for generic multivariate data classification
Schmuker, Michael; Pfeil, Thomas; Nawrot, Martin Paul
2014-01-01
Computational neuroscience has uncovered a number of computational principles used by nervous systems. At the same time, neuromorphic hardware has matured to a state where fast silicon implementations of complex neural networks have become feasible. En route to future technical applications of neuromorphic computing the current challenge lies in the identification and implementation of functional brain algorithms. Taking inspiration from the olfactory system of insects, we constructed a spiking neural network for the classification of multivariate data, a common problem in signal and data analysis. In this model, real-valued multivariate data are converted into spike trains using “virtual receptors” (VRs). Their output is processed by lateral inhibition and drives a winner-take-all circuit that supports supervised learning. VRs are conveniently implemented in software, whereas the lateral inhibition and classification stages run on accelerated neuromorphic hardware. When trained and tested on real-world datasets, we find that the classification performance is on par with a naïve Bayes classifier. An analysis of the network dynamics shows that stable decisions in output neuron populations are reached within less than 100 ms of biological time, matching the time-to-decision reported for the insect nervous system. Through leveraging a population code, the network tolerates the variability of neuronal transfer functions and trial-to-trial variation that is inevitably present on the hardware system. Our work provides a proof of principle for the successful implementation of a functional spiking neural network on a configurable neuromorphic hardware system that can readily be applied to real-world computing problems. PMID:24469794
Bayesian design of decision rules for failure detection
NASA Technical Reports Server (NTRS)
Chow, E. Y.; Willsky, A. S.
1984-01-01
The formulation of the decision making process of a failure detection algorithm as a Bayes sequential decision problem provides a simple conceptualization of the decision rule design problem. As the optimal Bayes rule is not computable, a methodology that is based on the Bayesian approach and aimed at a reduced computational requirement is developed for designing suboptimal rules. A numerical algorithm is constructed to facilitate the design and performance evaluation of these suboptimal rules. The result of applying this design methodology to an example shows that this approach is potentially a useful one.
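One classical computable rule in this Bayesian sequential spirit is Wald's sequential probability ratio test (SPRT) applied to measurement residuals; the sketch below is an illustration of the idea, not the authors' specific design. A failure is declared once the accumulated log-likelihood ratio crosses a threshold set by the desired error rates:

```python
import numpy as np

rng = np.random.default_rng(7)

def sprt(samples, mu0=0.0, mu1=1.0, sigma=1.0, alpha=0.01, beta=0.01):
    """Sequential probability ratio test on Gaussian residuals.

    H0: residual mean mu0 (no failure); H1: mean mu1 (failure mode).
    alpha/beta are the target false-alarm and missed-detection rates.
    Returns (decision, number of samples used)."""
    A, B = np.log((1 - beta) / alpha), np.log(beta / (1 - alpha))
    llr = 0.0
    for n, x in enumerate(samples, start=1):
        # Per-sample Gaussian log-likelihood ratio log N(x; mu1) / N(x; mu0).
        llr += ((x - mu0) ** 2 - (x - mu1) ** 2) / (2 * sigma ** 2)
        if llr >= A:
            return "failure", n
        if llr <= B:
            return "no failure", n
    return "undecided", len(samples)

print(sprt(rng.normal(0.0, 1.0, 200)))   # healthy residual stream
print(sprt(rng.normal(1.0, 1.0, 200)))   # failed-mode residual stream
```

The SPRT is optimal only for the simplest two-hypothesis case; the suboptimal rules designed in the report generalize this thresholding idea to richer failure models.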
A Method for Aircraft Concept Selection Using Multicriteria Interactive Genetic Algorithms
NASA Technical Reports Server (NTRS)
Buonanno, Michael; Mavris, Dimitri
2005-01-01
The problem of aircraft concept selection has become increasingly difficult in recent years as a result of a change from performance as the primary evaluation criteria of aircraft concepts to the current situation in which environmental effects, economics, and aesthetics must also be evaluated and considered in the earliest stages of the decision-making process. This has prompted a shift from design using historical data regression techniques for metric prediction to the use of physics-based analysis tools that are capable of analyzing designs outside of the historical database. The use of optimization methods with these physics-based tools, however, has proven difficult because of the tendency of optimizers to exploit assumptions present in the models and drive the design towards a solution which, while promising to the computer, may be infeasible due to factors not considered by the computer codes. In addition to this difficulty, the number of discrete options available at this stage may be unmanageable due to the combinatorial nature of the concept selection problem, leading the analyst to arbitrarily choose a sub-optimum baseline vehicle. These concept decisions such as the type of control surface scheme to use, though extremely important, are frequently made without sufficient understanding of their impact on the important system metrics because of a lack of computational resources or analysis tools. This paper describes a hybrid subjective/quantitative optimization method and its application to the concept selection of a Small Supersonic Transport. The method uses Genetic Algorithms to operate on a population of designs and promote improvement by varying more than sixty parameters governing the vehicle geometry, mission, and requirements. In addition to using computer codes for evaluation of quantitative criteria such as gross weight, expert input is also considered to account for criteria such as aeroelasticity or manufacturability which may be impossible or too computationally expensive to consider explicitly in the analysis. Results indicate that concepts resulting from the use of this method represent designs which are promising to both the computer and the analyst, and that a mapping between concepts and requirements that would not otherwise be apparent is revealed.
Falat, Lukas; Marcek, Dusan; Durisova, Maria
2016-01-01
This paper deals with application of quantitative soft computing prediction models into financial area as reliable and accurate prediction models can be very helpful in management decision-making process. The authors suggest a new hybrid neural network which is a combination of the standard RBF neural network, a genetic algorithm, and a moving average. The moving average is supposed to enhance the outputs of the network using the error part of the original neural network. Authors test the suggested model on high-frequency time series data of USD/CAD and examine the ability to forecast exchange rate values for the horizon of one day. To determine the forecasting efficiency, they perform a comparative statistical out-of-sample analysis of the tested model with autoregressive models and the standard neural network. They also incorporate genetic algorithm as an optimizing technique for adapting parameters of ANN which is then compared with standard backpropagation and backpropagation combined with K-means clustering algorithm. Finally, the authors find out that their suggested hybrid neural network is able to produce more accurate forecasts than the standard models and can be helpful in eliminating the risk of making the bad decision in decision-making process. PMID:26977450
Verification of Decision-Analytic Models for Health Economic Evaluations: An Overview.
Dasbach, Erik J; Elbasha, Elamin H
2017-07-01
Decision-analytic models for cost-effectiveness analysis are developed in a variety of software packages where the accuracy of the computer code is seldom verified. Although modeling guidelines recommend using state-of-the-art quality assurance and control methods for software engineering to verify models, the fields of pharmacoeconomics and health technology assessment (HTA) have yet to establish and adopt guidance on how to verify health and economic models. The objective of this paper is to introduce to our field the variety of methods the software engineering field uses to verify that software performs as expected. We identify how many of these methods can be incorporated in the development process of decision-analytic models in order to reduce errors and increase transparency. Given the breadth of methods used in software engineering, we recommend a more in-depth initiative to be undertaken (e.g., by an ISPOR-SMDM Task Force) to define the best practices for model verification in our field and to accelerate adoption. Establishing a general guidance for verifying models will benefit the pharmacoeconomics and HTA communities by increasing accuracy of computer programming, transparency, accessibility, sharing, understandability, and trust of models.
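As a flavor of what software-engineering verification looks like for a decision-analytic model, here is a minimal sketch of unit tests for a hypothetical three-state Markov cohort model (states, transition probabilities, and the checks themselves are all illustrative): rows must be probability distributions, the cohort must be conserved over cycles, and absorbing states must stay absorbing.

```python
import numpy as np

def markov_cohort_trace(P, start, n_cycles):
    """Cohort trace for a discrete-time Markov model; rows of P must sum to 1."""
    trace = [np.asarray(start, dtype=float)]
    for _ in range(n_cycles):
        trace.append(trace[-1] @ P)
    return np.array(trace)

# Hypothetical 3-state model: Well, Sick, Dead (absorbing).
P = np.array([[0.90, 0.08, 0.02],
              [0.00, 0.85, 0.15],
              [0.00, 0.00, 1.00]])

def test_rows_are_distributions():
    assert np.allclose(P.sum(axis=1), 1.0) and (P >= 0).all()

def test_cohort_is_conserved():
    trace = markov_cohort_trace(P, [1.0, 0.0, 0.0], 50)
    assert np.allclose(trace.sum(axis=1), 1.0)

def test_death_is_absorbing():
    trace = markov_cohort_trace(P, [0.0, 0.0, 1.0], 10)
    assert np.allclose(trace[-1], [0.0, 0.0, 1.0])

for t in (test_rows_are_distributions, test_cohort_is_conserved, test_death_is_absorbing):
    t()
print("all verification checks passed")
```

Checks of this kind are cheap to automate in any of the software packages the paper mentions, and they catch exactly the silent coding errors the authors argue our field seldom looks for.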
Classification of large-scale fundus image data sets: a cloud-computing framework.
Roychowdhury, Sohini
2016-08-01
Large medical image data sets with high dimensionality require a substantial amount of computation time for data creation and data processing. This paper presents a novel generalized method that finds optimal image-based feature sets that reduce computational time complexity while maximizing overall classification accuracy for detection of diabetic retinopathy (DR). First, region-based and pixel-based features are extracted from fundus images for classification of DR lesions and vessel-like structures. Next, feature ranking strategies are used to distinguish the optimal classification feature sets. DR lesion and vessel classification accuracies are computed using the boosted decision tree and decision forest classifiers in the Microsoft Azure Machine Learning Studio platform, respectively. For images from the DIARETDB1 data set, 40 of its highest-ranked features are used to classify four DR lesion types with an average classification accuracy of 90.1% in 792 seconds. Also, for classification of red lesion regions and hemorrhages from microaneurysms, accuracies of 85% and 72% are observed, respectively. For images from the STARE data set, 40 high-ranked features can classify minor blood vessels with an accuracy of 83.5% in 326 seconds. Such cloud-based fundus image analysis systems can significantly enhance the borderline classification performances in automated screening systems.
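The rank-features-then-classify pattern can be sketched with scikit-learn in place of the Azure platform used in the paper; the synthetic data below merely stands in for the extracted fundus features, and the feature counts echo the paper's top-40 selection.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

# Stand-in for extracted fundus features: 300 region/pixel features, 4 lesion classes.
X, y = make_classification(n_samples=1000, n_features=300, n_informative=40,
                           n_classes=4, n_clusters_per_class=1, random_state=0)

for k in (10, 40, 300):
    # Rank features univariately, keep the top k, then fit boosted trees.
    clf = make_pipeline(SelectKBest(f_classif, k=k),
                        GradientBoostingClassifier(random_state=0))
    score = cross_val_score(clf, X, y, cv=5).mean()
    print(f"top {k:3d} features: accuracy = {score:.3f}")
```

The trade-off the paper optimizes is visible here in miniature: a well-chosen subset preserves most of the accuracy of the full feature set while cutting training time substantially.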
M.S.L.A.P. Modular Spectral Line Analysis Program documentation
NASA Technical Reports Server (NTRS)
Joseph, Charles L.; Jenkins, Edward B.
1991-01-01
MSLAP is a software package for analyzing spectra, providing the basic structure to identify spectral features, make quantitative measurements of these features, and store the measurements for convenient access. MSLAP can be used to measure not only the zeroth moment (equivalent width) of a profile, but also the first and second moments. Optical depths and the corresponding column densities across the profile can be measured as well for sufficiently high resolution data. The software was developed for an interactive, graphical analysis in which the computer carries most of the computational and data-organizational burden and the investigator is responsible only for judgement decisions. It employs sophisticated statistical techniques for determining the best polynomial fit to the continuum and for calculating the uncertainties.
Ubiquitous computing in sports: A review and analysis.
Baca, Arnold; Dabnichki, Peter; Heller, Mario; Kornfeind, Philipp
2009-10-01
Ubiquitous (pervasive) computing is a term for the synergetic use of sensing, communication, and computing. Pervasive use of computing has seen a rapid increase in the current decade, and this development has propagated into applied sport science and everyday life. The work presents a survey of recent developments in sport and leisure with emphasis on technology and computational techniques. A detailed analysis of new technological developments is performed. Sensors for position and motion detection, as well as sensors for equipment and physiological monitoring, are discussed. Aspects of novel trends in communication technologies and data processing are outlined. Computational advancements have started a new trend: the development of smart and intelligent systems for a wide range of applications, from model-based posture recognition to context awareness algorithms for nutrition monitoring. Examples particular to coaching and training are discussed. Selected tools for monitoring rule compliance and automatic decision-making are outlined. Finally, applications in leisure and entertainment are presented, from systems supporting physical activity to systems providing motivation. It is concluded that the emphasis in future will shift from technologies to intelligent systems that allow for enhanced social interaction, as efforts need to be made to improve user-friendliness and standardisation of measurement and transmission protocols.
An overview of the mathematical and statistical analysis component of RICIS
NASA Technical Reports Server (NTRS)
Hallum, Cecil R.
1987-01-01
Mathematical and statistical analysis components of RICIS (Research Institute for Computing and Information Systems) can be used in the following problem areas: (1) quantification and measurement of software reliability; (2) assessment of changes in software reliability over time (reliability growth); (3) analysis of software-failure data; and (4) decision logic for whether to continue or stop testing software. Other areas of interest to NASA/JSC where mathematical and statistical analysis can be successfully employed include: math modeling of physical systems, simulation, statistical data reduction, evaluation methods, optimization, algorithm development, and mathematical methods in signal processing.
Optimal routing of hazardous substances in time-varying, stochastic transportation networks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Woods, A.L.; Miller-Hooks, E.; Mahmassani, H.S.
This report is concerned with the selection of routes in a network along which to transport hazardous substances, taking into consideration several key factors pertaining to the cost of transport and the risk of population exposure in the event of an accident. Furthermore, the fact that travel time and the risk measures are not constant over time is explicitly recognized in the routing decisions. Existing approaches typically assume static conditions, possibly resulting in inefficient route selection and unnecessary risk exposure. The report describes the application of recent advances in network analysis methodologies to the problem of routing hazardous substances. Several specific problem formulations are presented, reflecting different degrees of risk aversion on the part of the decision-maker, as well as different possible operational scenarios. All procedures explicitly consider travel times and travel costs (including risk measures) to be stochastic time-varying quantities. The procedures include both exact algorithms, which may require extensive computational effort in some situations, as well as more efficient heuristics that may not guarantee a Pareto-optimal solution. All procedures are systematically illustrated for an example application using the Texas highway network, for both normal and incident condition scenarios. The application illustrates the trade-offs between the information obtained in the solution and computational efficiency, and highlights the benefits of incorporating these procedures in a decision-support system for hazardous substance shipment routing decisions.
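A minimal sketch of the time-varying ingredient, assuming deterministic expected costs rather than the report's full stochastic, multi-criteria treatment: search the time-expanded graph of (node, time) states, where edge costs depend on departure time and waiting at a node is permitted. The three-node network, its cost tables, and the coarse time discretization are all invented for illustration.

```python
import heapq

# travel[(u, v)][t] = expected traversal cost (travel time plus risk penalty)
# when leaving node u at discrete time t. Horizon of 4 time steps.
travel = {
    ("A", "B"): {0: 4, 1: 4, 2: 9, 3: 9},   # congestion after t = 1
    ("A", "C"): {0: 2, 1: 2, 2: 2, 3: 2},
    ("C", "B"): {0: 3, 1: 8, 2: 3, 3: 3},   # transient incident at t = 1
}

def td_dijkstra(source, target, t0, horizon):
    """Least expected-cost path over the time-expanded (node, time) graph."""
    best, heap = {}, [(0.0, source, t0, [source])]
    while heap:
        cost, node, t, path = heapq.heappop(heap)
        if node == target:
            return cost, path
        if best.get((node, t), float("inf")) <= cost or t >= horizon:
            continue
        best[(node, t)] = cost
        for (u, v), costs in travel.items():
            if u == node and t in costs:
                dt = costs[t]
                heapq.heappush(heap, (cost + dt, v,
                                      min(int(t + dt), horizon), path + [v]))
        # Waiting one step (at no cost here) can dodge transient incidents.
        heapq.heappush(heap, (cost, node, t + 1, path))
    return float("inf"), []

print(td_dijkstra("A", "B", 0, 4))
```

The report's algorithms extend this basic search to stochastic, multi-criteria labels, which is where the exact-versus-heuristic trade-off it discusses arises.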
NASA Astrophysics Data System (ADS)
Lei, Ted Chih-Wei; Tseng, Fan-Shuo
2017-07-01
This paper addresses the problem of high-complexity decoding in traditional Wyner-Ziv video coding (WZVC). The key focus is the migration of two traditionally compute-intensive encoder algorithms, namely motion estimation and mode decision, to the decoder. In order to reduce the computational burden in this process, the proposed architecture adopts the partial boundary matching algorithm and four flexible types of block mode decision at the decoder. This approach does away with the need for motion estimation and mode decision at the encoder. The experimental results show that the proposed padding block-based WZVC not only decreases decoder complexity to approximately one hundredth that of state-of-the-art DISCOVER decoding but also outperforms the DISCOVER codec by up to 3 to 4 dB.
Decision theory, reinforcement learning, and the brain.
Dayan, Peter; Daw, Nathaniel D
2008-12-01
Decision making is a core competence for animals and humans acting and surviving in environments they only partially comprehend, gaining rewards and punishments for their troubles. Decision-theoretic concepts permeate experiments and computational models in ethology, psychology, and neuroscience. Here, we review a well-known, coherent Bayesian approach to decision making, showing how it unifies issues in Markovian decision problems, signal detection psychophysics, sequential sampling, and optimal exploration and discuss paradigmatic psychological and neural examples of each problem. We discuss computational issues concerning what subjects know about their task and how ambitious they are in seeking optimal solutions; we address algorithmic topics concerning model-based and model-free methods for making choices; and we highlight key aspects of the neural implementation of decision making.
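On the model-free side reviewed here, the canonical algorithmic building block is the delta-rule value update with softmax action selection. A minimal sketch on a two-armed bandit with a mid-session reversal (all parameters illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)

# Two-armed bandit with a reward-probability reversal: a minimal model-free learner.
p_reward = np.array([0.8, 0.2])
Q, alpha, beta = np.zeros(2), 0.1, 3.0   # values, learning rate, softmax inverse temp.

choices = []
for t in range(1000):
    probs = np.exp(beta * Q) / np.exp(beta * Q).sum()   # softmax action selection
    a = rng.choice(2, p=probs)
    r = float(rng.random() < p_reward[a])
    Q[a] += alpha * (r - Q[a])                          # prediction-error update
    choices.append(a)
    if t == 499:
        p_reward = p_reward[::-1]                       # reversal at mid-session

print("P(choose arm 0) before reversal:", 1 - np.mean(choices[:500]))
print("P(choose arm 0) after  reversal:", 1 - np.mean(choices[500:]))
```

Model-based methods, by contrast, would learn the reward probabilities themselves and replan, which is the algorithmic distinction the review develops at length.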
Offodile, Anaeze C; Chatterjee, Abhishek; Vallejo, Sergio; Fisher, Carla S; Tchou, Julia C; Guo, Lifei
2015-04-01
Computed tomographic angiography is a diagnostic tool increasingly used for preoperative vascular mapping in abdomen-based perforator flap breast reconstruction. This study compared the use of computed tomographic angiography and the conventional practice of Doppler ultrasonography only in postmastectomy reconstruction using a cost-utility model. Following a comprehensive literature review, a decision analytic model was created using the three most clinically relevant health outcomes in free autologous breast reconstruction with computed tomographic angiography versus Doppler ultrasonography only. Cost and utility estimates for each health outcome were used to derive the quality-adjusted life-years and incremental cost-utility ratio. One-way sensitivity analysis was performed to scrutinize the robustness of the authors' results. Six studies and 782 patients were identified. Cost-utility analysis revealed a baseline cost savings of $3179 and a gain in quality-adjusted life-years of 0.25. This yielded an incremental cost-utility ratio of -$12,716, implying a dominant choice favoring preoperative computed tomographic angiography. Sensitivity analysis revealed that computed tomographic angiography was costlier when the operative time difference between the two techniques was less than 21.3 minutes. However, the clinical advantage of computed tomographic angiography over Doppler ultrasonography only showed that computed tomographic angiography would still remain the cost-effective option even if it offered no additional operating time advantage. The authors' results show that computed tomographic angiography is a cost-effective technology for identifying lower abdominal perforators for autologous breast reconstruction. Although the perfect study would be a randomized controlled trial of the two approaches with true cost accrual, the authors' results represent the best available evidence.
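The headline numbers combine into the incremental cost-utility ratio directly:

```python
# Incremental cost-utility ratio (ICUR) from the reported point estimates:
# CTA saves money and gains QALYs relative to Doppler-only mapping.
delta_cost = -3179.0   # USD; negative = cost saving with CTA
delta_qaly = 0.25      # quality-adjusted life-years gained with CTA

icur = delta_cost / delta_qaly
print(f"ICUR = ${icur:,.0f} per QALY")   # -> -$12,716/QALY: CTA dominates
```

A negative ratio with both lower cost and higher effectiveness is what the authors mean by a dominant choice.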
Design Analysis Kit for Optimization and Terascale Applications 6.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
2015-10-19
Sandia's Dakota software (available at http://dakota.sandia.gov) supports science and engineering transformation through advanced exploration of simulations. Specifically it manages and analyzes ensembles of simulations to provide broader and deeper perspective for analysts and decision makers. This enables them to: (1) enhance understanding of risk, (2) improve products, and (3) assess simulation credibility. In its simplest mode, Dakota can automate typical parameter variation studies through a generic interface to a computational model. However, Dakota also delivers advanced parametric analysis techniques enabling design exploration, optimization, model calibration, risk analysis, and quantification of margins and uncertainty with such models. It directly supports verification and validation activities. The algorithms implemented in Dakota aim to address challenges in performing these analyses with complex science and engineering models from desktop to high performance computers.
Automated labeling of log features in CT imagery of multiple hardwood species
Daniel L. Schmoldt; Jing He; A. Lynn Abbott
2000-01-01
Before noninvasive scanning, e.g., computed tomography (CT), becomes feasible in industrial saw-mill operations, we need a procedure that can automatically interpret scan information in order to provide the saw operator with information necessary to make proper sawing decisions. To this end, we have worked to develop an approach for automatic analysis of CT images of...
Supporting NASA Facilities Through GIS
NASA Technical Reports Server (NTRS)
Ingham, Mary E.
2000-01-01
The NASA GIS Team supports NASA facilities and partners in the analysis of spatial data. A Geographic Information System (GIS) is an integration of computer hardware, software, and personnel linking topographic, demographic, utility, facility, image, and other geo-referenced data. The system provides a graphic interface to relational databases and supports decision making processes such as planning, design, maintenance and repair, and emergency response.
Economics of cutting hardwood dimension parts with an automated system
Henry A. Huber; Steve Ruddell; Kalinath Mukherjee; Charles W. McMillin
1989-01-01
A financial analysis using discounted cash-flow decision methods was completed to determine the economic feasibility of replacing a conventional roughmill crosscut and rip operation with a proposed automated computer vision and laser cutting system. Red oak and soft maple lumber were cut at production levels of 30 thousand board feet (MBF)/day and 5 MBF/day to produce...
NASA Technical Reports Server (NTRS)
Kriegler, F.; Marshall, R.; Lampert, S.; Gordon, M.; Cornell, C.; Kistler, R.
1973-01-01
The MIDAS system is a prototype, multiple-pipeline digital processor mechanizing the multivariate-Gaussian, maximum-likelihood decision algorithm operating at 200,000 pixels/second. It incorporates displays and film printer equipment under control of a general purpose midi-computer and possesses sufficient flexibility that operational versions of the equipment may be subsequently specified as subsets of the system.
Cook, David A; Sorensen, Kristi J; Wilkinson, John M; Berger, Richard A
2013-11-25
Answering clinical questions affects patient-care decisions and is important to continuous professional development. The process of point-of-care learning is incompletely understood. To understand what barriers and enabling factors influence physician point-of-care learning and what decisions physicians face during this process. Focus groups with grounded theory analysis. Focus group discussions were transcribed and then analyzed using a constant comparative approach to identify barriers, enabling factors, and key decisions related to physician information-seeking activities. Academic medical center and outlying community sites. Purposive sample of 50 primary care and subspecialist internal medicine and family medicine physicians, interviewed in 11 focus groups. Insufficient time was the main barrier to point-of-care learning. Other barriers included patient comorbidities and contexts, the volume of available information, not knowing which resource to search, doubt that the search would yield an answer, difficulty remembering questions for later study, and inconvenient access to computers. Key decisions were whether to search (reasons to search included infrequently seen conditions, practice updates, complex questions, and patient education), when to search (before, during, or after the clinical encounter), where to search (with the patient present or in a separate room), what type of resource to use (colleague or computer), what specific resource to use (influenced first by efficiency and second by credibility), and when to stop. Participants noted that key features of efficiency (completeness, brevity, and searchability) are often in conflict. Physicians perceive that insufficient time is the greatest barrier to point-of-care learning, and efficiency is the most important determinant in selecting an information source. Designing knowledge resources and systems to target key decisions may improve learning and patient care.
NASA Astrophysics Data System (ADS)
Liou, Cheng-Dar
2015-09-01
This study investigates an infinite capacity Markovian queue with a single unreliable service station, in which customers may balk (not enter) and renege (leave the queue after entering). The unreliable service station is subject to working breakdowns even when no customers are in the system. The matrix-analytic method is used to compute the steady-state probabilities for the number of customers, the rate matrix, and the stability condition of the system. A single-objective model for cost and a bi-objective model for cost and expected waiting time are derived for the system to fit practical applications. The particle swarm optimisation algorithm is implemented to find the optimal combinations of parameters in pursuit of minimum cost. Two different approaches to identifying the Pareto optimal set are used and compared: the epsilon-constraint method and the non-dominated sorting genetic algorithm. The comparison supports using the traditional epsilon-constraint method, which is computationally faster and permits a direct sensitivity analysis of the solution under constraint or parameter perturbation. The Pareto front and the set of non-dominated solutions are obtained and illustrated. Decision makers can use these to improve their decision-making quality.
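The matrix-analytic machinery referenced above centers on the rate matrix R of a quasi-birth-death (QBD) process. A minimal sketch for an M/M/1-type queue with working breakdowns (two phases: normal and degraded service; all rates invented for illustration, not taken from the paper) solves A0 + R A1 + R^2 A2 = 0 by fixed-point iteration:

```python
import numpy as np

# Level-independent QBD generator blocks. Phases: 0 = normal, 1 = degraded
# ("working breakdown": service continues at a slower rate).
lam, mu, mu_b = 1.0, 3.0, 0.8    # arrival, normal service, degraded service
alpha, beta = 0.2, 1.0           # breakdown and repair rates

A0 = lam * np.eye(2)                                   # level up (arrival)
A2 = np.diag([mu, mu_b])                               # level down (service)
A1 = np.array([[-(lam + alpha + mu), alpha],           # local rates + phase switching
               [beta, -(lam + beta + mu_b)]])

# Minimal nonnegative solution of A0 + R A1 + R^2 A2 = 0 by fixed-point iteration.
R = np.zeros((2, 2))
for _ in range(1000):
    R_new = -(A0 + R @ R @ A2) @ np.linalg.inv(A1)
    if np.max(np.abs(R_new - R)) < 1e-12:
        R = R_new
        break
    R = R_new

print("rate matrix R:\n", R)
print("stable (sp(R) < 1):", max(abs(np.linalg.eigvals(R))) < 1)
```

Once R is in hand, the steady-state level probabilities follow geometrically (pi_{n+1} = pi_n R), which is what makes the cost and waiting-time objectives in the paper computable.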
Decision making in recurrent neuronal circuits.
Wang, Xiao-Jing
2008-10-23
Decision making has recently emerged as a central theme in neurophysiological studies of cognition, and experimental and computational work has led to the proposal of a cortical circuit mechanism of elemental decision computations. This mechanism depends on slow recurrent synaptic excitation balanced by fast feedback inhibition, which not only instantiates attractor states for forming categorical choices but also long transients for gradually accumulating evidence in favor of or against alternative options. Such a circuit endowed with reward-dependent synaptic plasticity is able to produce adaptive choice behavior. While decision threshold is a core concept for reaction time tasks, it can be dissociated from a general decision rule. Moreover, perceptual decisions and value-based economic choices are described within a unified framework in which probabilistic choices result from irregular neuronal activity as well as iterative interactions of a decision maker with an uncertain environment or other unpredictable decision makers in a social group.
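A drastically reduced rate-model caricature of such a circuit already shows the key behavior: two selective populations with self-excitation and shared (effectively mutual) inhibition produce noise-driven categorical choices that sharpen with stimulus coherence. All parameters below are invented, the integration is plain Euler, and no biophysical fidelity is claimed.

```python
import numpy as np

rng = np.random.default_rng(11)

def f(x):                       # threshold-linear rate function
    return np.maximum(x, 0.0)

def run_trial(coherence, T=2.0, dt=1e-3):
    """Two selective populations; winner-take-all via mutual inhibition."""
    r = np.zeros(2)
    w_self, w_inh, tau = 0.5, 1.0, 0.1
    I = 1.0 + coherence * np.array([1.0, -1.0])        # stimulus-biased drive
    for _ in range(int(T / dt)):
        noise = rng.normal(0.0, 0.3, 2)
        drive = I + w_self * r - w_inh * r[::-1] + noise
        r += dt / tau * (-r + f(drive))
    return r

for c in (0.0, 0.1, 0.3):
    wins = sum(np.argmax(run_trial(c)) == 0 for _ in range(50))
    print(f"coherence {c:.1f}: population 1 wins {wins}/50 trials")
```

At zero coherence the symmetric state is unstable and noise picks the winner at random; as coherence grows, the biased drive dominates, mirroring the attractor dynamics the review describes.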
NASA Technical Reports Server (NTRS)
Tompkins, F. G.
1984-01-01
Guidance is presented to NASA Computer Security Officials for determining the acceptability or unacceptability of ADP security risks based on the technical, operational and economic feasibility of potential safeguards. The risk management process is reviewed as a specialized application of the systems approach to problem solving and information systems analysis and design. Reporting the results of the risk reduction analysis to management is considered. Report formats for the risk reduction study are provided.
Decision making for pancreatic resection in patients with intraductal papillary mucinous neoplasms.
Xu, Bin; Ding, Wei-Xing; Jin, Da-Yong; Wang, Dan-Song; Lou, Wen-Hui
2013-03-07
To identify a practical approach for preoperative decision-making in patients with intraductal papillary mucinous neoplasms (IPMNs) of the pancreas. Between March 1999 and November 2006, the clinical characteristics, pathological data and computed tomography/magnetic resonance imaging (CT/MRI) of 54 IPMN cases were retrieved and analyzed. The relationships between the above data and decision-making for pancreatic resection were analyzed using SPSS 13.0 software. Univariate analysis of risk factors for malignant or invasive IPMNs was performed with regard to the following variables: carcinoembryonic antigen, carbohydrate antigen 19-9 (CA19-9) and the characteristics from CT/MRI images. Receiver operating characteristic (ROC) curve analysis for pancreatic resection was performed using significant factors from the univariate analysis. CT/MRI findings (main duct or mixed type IPMN, tumor size > 30 mm, or a solid component in the lesion) and preoperative serum CA19-9 > 37 U/mL had good predictive value for determining pancreatic resection (P < 0.05), but with limitations. Combining the above factors (CT/MRI images and CA19-9) improved the accuracy and sensitivity for determining pancreatic resection in IPMNs. Using ROC analysis, the area under the curve reached 0.893 (P < 0.01, 95%CI: 0.763-1.023), with a sensitivity, specificity, positive predictive value and negative predictive value of 95.2%, 83.3%, 95.2% and 83.3%, respectively. Combining preoperative CT/MRI images and CA19-9 level may provide useful information for surgical decision-making in IPMNs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vesselinov, Velimir; O'Malley, Daniel; Lin, Youzuo
2016-07-01
Mads.jl (Model analysis and decision support in Julia) is a code that streamlines the process of using data and models for analysis and decision support. It is based on another open-source code developed at LANL and written in C/C++ (MADS; http://mads.lanl.gov; LA-CC-11-035). Mads.jl can work with external models of arbitrary complexity as well as built-in models of flow and transport in porous media. It enables a number of data- and model-based analyses including model calibration, sensitivity analysis, uncertainty quantification, and decision analysis. The code can also use a series of alternative adaptive computational techniques for Bayesian sampling, Monte Carlo, and Bayesian Information-Gap Decision Theory. The code is implemented in the Julia programming language, and has high-performance (parallel) and memory management capabilities. The code uses a series of third-party modules developed by others. The code development will also include contributions to the existing third-party modules written in Julia; these contributions will be important for the efficient implementation of the algorithms used by Mads.jl. The code also uses a series of LANL modules developed by Dan O'Malley; these modules will also be part of the Mads.jl release. Mads.jl will be released under the GPL V3 license. The code will be distributed as a Git repo at gitlab.com and github.com. The Mads.jl manual and documentation will be posted at madsjulia.lanl.gov.
Achieving realistic performance and decision-making capabilities in computer-generated air forces
NASA Astrophysics Data System (ADS)
Banks, Sheila B.; Stytz, Martin R.; Santos, Eugene, Jr.; Zurita, Vincent B.; Benslay, James L., Jr.
1997-07-01
For a computer-generated force (CGF) system to be useful in training environments, it must be able to operate at multiple skill levels, exhibit competency at assigned missions, and comply with current doctrine. Because of the rapid rate of change in distributed interactive simulation (DIS) and the expanding set of performance objectives for any computer- generated force, the system must also be modifiable at reasonable cost and incorporate mechanisms for learning. Therefore, CGF applications must have adaptable decision mechanisms and behaviors and perform automated incorporation of past reasoning and experience into its decision process. The CGF must also possess multiple skill levels for classes of entities, gracefully degrade its reasoning capability in response to system stress, possess an expandable modular knowledge structure, and perform adaptive mission planning. Furthermore, correctly performing individual entity behaviors is not sufficient. Issues related to complex inter-entity behavioral interactions, such as the need to maintain formation and share information, must also be considered. The CGF must also be able to acceptably respond to unforeseen circumstances and be able to make decisions in spite of uncertain information. Because of the need for increased complexity in the virtual battlespace, the CGF should exhibit complex, realistic behavior patterns within the battlespace. To achieve these necessary capabilities, an extensible software architecture, an expandable knowledge base, and an adaptable decision making mechanism are required. Our lab has addressed these issues in detail. The resulting DIS-compliant system is called the automated wingman (AW). The AW is based on fuzzy logic, the common object database (CODB) software architecture, and a hierarchical knowledge structure. We describe the techniques we used to enable us to make progress toward a CGF entity that satisfies the requirements presented above. We present our design and implementation of an adaptable decision making mechanism that uses multi-layered, fuzzy logic controlled situational analysis. Because our research indicates that fuzzy logic can perform poorly under certain circumstances, we combine fuzzy logic inferencing with adversarial game tree techniques for decision making in strategic and tactical engagements. We describe the approach we employed to achieve this fusion. We also describe the automated wingman's system architecture and knowledge base architecture.
Tsalatsanis, Athanasios; Barnes, Laura E; Hozo, Iztok; Djulbegovic, Benjamin
2011-12-23
Despite the well documented advantages of hospice care, most terminally ill patients do not reap the maximum benefit from hospice services, with the majority of them receiving hospice care either prematurely or delayed. Decision systems to improve the hospice referral process are sorely needed. We present a novel theoretical framework that is based on well-established methodologies of prognostication and decision analysis to assist with the hospice referral process for terminally ill patients. We linked the SUPPORT statistical model, widely regarded as one of the most accurate models for prognostication of terminally ill patients, with the recently developed regret based decision curve analysis (regret DCA). We extend the regret DCA methodology to consider harms associated with the prognostication test as well as harms and effects of the management strategies. In order to enable patients and physicians in making these complex decisions in real-time, we developed an easily accessible web-based decision support system available at the point of care. The web-based decision support system facilitates the hospice referral process in three steps. First, the patient or surrogate is interviewed to elicit his/her personal preferences regarding the continuation of life-sustaining treatment vs. palliative care. Then, regret DCA is employed to identify the best strategy for the particular patient in terms of threshold probability at which he/she is indifferent between continuation of treatment and of hospice referral. Finally, if necessary, the probabilities of survival and death for the particular patient are computed based on the SUPPORT prognostication model and contrasted with the patient's threshold probability. The web-based design of the CDSS enables patients, physicians, and family members to participate in the decision process from anywhere internet access is available. We present a theoretical framework to facilitate the hospice referral process. Further rigorous clinical evaluation including testing in a prospective randomized controlled trial is required and planned. PMID:22196308
A Bayesian Attractor Model for Perceptual Decision Making
Bitzer, Sebastian; Bruineberg, Jelle; Kiebel, Stefan J.
2015-01-01
Even for simple perceptual decisions, the mechanisms that the brain employs are still under debate. Although current consensus states that the brain accumulates evidence extracted from noisy sensory information, open questions remain about how this simple model relates to other perceptual phenomena such as flexibility in decisions, decision-dependent modulation of sensory gain, or confidence about a decision. We propose a novel approach of how perceptual decisions are made by combining two influential formalisms into a new model. Specifically, we embed an attractor model of decision making into a probabilistic framework that models decision making as Bayesian inference. We show that the new model can explain decision making behaviour by fitting it to experimental data. In addition, the new model combines for the first time three important features: First, the model can update decisions in response to switches in the underlying stimulus. Second, the probabilistic formulation accounts for top-down effects that may explain recent experimental findings of decision-related gain modulation of sensory neurons. Finally, the model computes an explicit measure of confidence which we relate to recent experimental evidence for confidence computations in perceptual decision tasks. PMID:26267143
Self-organized service negotiation for collaborative decision making.
Zhang, Bo; Huang, Zhenhua; Zheng, Ziming
2014-01-01
This paper proposes a self-organized service negotiation method for CDM in intelligent and automatic manners. It mainly includes three phases: semantic-based capacity evaluation for the CDM sponsor, trust computation of the CDM organization, and negotiation selection of the decision-making service provider (DMSP). In the first phase, the CDM sponsor produces the formal semantic description of the complex decision task for DMSP and computes the capacity evaluation values according to participator instructions from different DMSPs. In the second phase, a novel trust computation approach is presented to compute the subjective belief value, the objective reputation value, and the recommended trust value. And in the third phase, based on the capacity evaluation and trust computation, a negotiation mechanism is given to efficiently implement the service selection. The simulation experiment results show that our self-organized service negotiation method is feasible and effective for CDM. PMID:25243228
Dorazio, R.M.; Johnson, F.A.
2003-01-01
Bayesian inference and decision theory may be used in the solution of relatively complex problems of natural resource management, owing to recent advances in statistical theory and computing. In particular, Markov chain Monte Carlo algorithms provide a computational framework for fitting models of adequate complexity and for evaluating the expected consequences of alternative management actions. We illustrate these features using an example based on management of waterfowl habitat.
An Artificial Neural Network-Based Decision-Support System for Integrated Network Security
2014-09-01
group that they need to know in order to make team-based decisions in real-time environments, (c) Employ secure cloud computing services to host mobile... out-of-the-loop syndrome and create complexity creep. As a result, full automation efforts can lead to inappropriate decision-making despite a
Tsalatsanis, Athanasios; Hozo, Iztok; Vickers, Andrew; Djulbegovic, Benjamin
2010-09-16
Decision curve analysis (DCA) has been proposed as an alternative method for evaluation of diagnostic tests, prediction models, and molecular markers. However, DCA is based on expected utility theory, which has been routinely violated by decision makers. Decision-making is governed by intuition (system 1) and an analytical, deliberative process (system 2); thus, rational decision-making should reflect both formal principles of rationality and intuition about good decisions. We use the cognitive emotion of regret to serve as a link between systems 1 and 2 and to reformulate DCA. First, we analysed a classic decision tree describing three decision alternatives: treat, do not treat, and treat or do not treat based on a predictive model. We then computed the expected regret for each of these alternatives as the difference between the utility of the action taken and the utility of the action that, in retrospect, should have been taken. For any pair of strategies, we measure the difference in net expected regret. Finally, we employ the concept of acceptable regret to identify the circumstances under which a potentially wrong strategy is tolerable to a decision-maker. We developed a novel dual visual analog scale to describe the relationship between regret associated with "omissions" (e.g., failure to treat) vs. "commissions" (e.g., treating unnecessarily) and decision maker's preferences as expressed in terms of threshold probability. We then proved that the Net Expected Regret Difference, first presented in this paper, is equivalent to net benefits as described in the original DCA. Based on the concept of acceptable regret we identified the circumstances under which a decision maker tolerates a potentially wrong decision and expressed it in terms of probability of disease. We present a novel method for eliciting decision maker's preferences and an alternative derivation of DCA based on regret theory. Our approach may be intuitively more appealing to a decision-maker, particularly in those clinical situations when the best management option is the one associated with the least amount of regret (e.g., diagnosis and treatment of advanced cancer). PMID:20846413
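Since the paper proves its Net Expected Regret Difference is equivalent to net benefit from the original DCA, the quantity being reformulated can be sketched with the standard net-benefit computation, NB(pt) = TP/n - (FP/n) x pt/(1 - pt), on simulated data (the cohort and risk model below are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(9)

# Simulated cohort: true disease status and a predictive model's risk estimates.
n = 10000
disease = rng.random(n) < 0.2
risk = np.clip(0.2 + 0.3 * (disease - 0.2) + rng.normal(0, 0.15, n), 0.001, 0.999)

def net_benefit(treat, pt):
    """Net benefit at threshold probability pt: TP/n - FP/n * pt/(1-pt)."""
    tp = np.sum(treat & disease) / n
    fp = np.sum(treat & ~disease) / n
    return tp - fp * pt / (1 - pt)

for pt in (0.1, 0.2, 0.3):
    print(f"pt={pt:.1f}  model: {net_benefit(risk >= pt, pt):+.3f}  "
          f"treat-all: {net_benefit(np.ones(n, bool), pt):+.3f}  "
          f"treat-none: {0.0:+.3f}")
```

The threshold probability pt encodes how the decision maker trades omissions against commissions, which is exactly the preference the paper elicits via its dual regret scale.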
Strategies for Efficient Computation of the Expected Value of Partial Perfect Information
Madan, Jason; Ades, Anthony E.; Price, Malcolm; Maitland, Kathryn; Jemutai, Julie; Revill, Paul; Welton, Nicky J.
2014-01-01
Expected value of information methods evaluate the potential health benefits that can be obtained from conducting new research to reduce uncertainty in the parameters of a cost-effectiveness analysis model, hence reducing decision uncertainty. Expected value of partial perfect information (EVPPI) provides an upper limit on the health gains that can be obtained from conducting a new study on a subset of parameters in the cost-effectiveness analysis, and can therefore be used as a sensitivity analysis to identify the parameters that contribute most to decision uncertainty and to help guide decisions about which types of study are most valuable to prioritize for funding. A common general approach is to use nested Monte Carlo simulation to obtain an estimate of EVPPI. This approach is computationally intensive, can lead to significant sampling bias if an inadequate number of inner samples is obtained, and can produce incorrect results if correlations between parameters are not dealt with appropriately. In this article, we set out a range of methods for estimating EVPPI that avoid the need for nested simulation: reparameterization of the net benefit function, Taylor series approximations, and restricted cubic spline estimation of conditional expectations. For each method, we set out the generalized functional form that net benefit must take for the method to be valid. By specifying this functional form, our methods are able to focus on the components of the model in which approximation is required, avoiding the complexities involved in developing statistical approximations for the model as a whole. Our methods also allow for any correlations that might exist between model parameters. We illustrate the methods using an example of fluid resuscitation in African children with severe malaria. PMID:24449434
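For contrast with the methods proposed here, the baseline nested Monte Carlo estimator reads roughly as follows in Python; the two-treatment net benefit function and the parameter distributions are toy assumptions.

```python
import random

# Baseline nested Monte Carlo EVPPI for one parameter theta: the
# computationally intensive approach the article's methods avoid.
def net_benefit(treatment, theta, psi):
    # Made-up two-treatment toy model.
    return 10 * theta - 2 if treatment == 1 else 6 * psi

def evppi(outer=200, inner=200):
    gain = 0.0
    baseline = {0: 0.0, 1: 0.0}
    for _ in range(outer):
        theta = random.gauss(0.5, 0.2)       # parameter of interest
        nb = {0: 0.0, 1: 0.0}
        for _ in range(inner):               # remaining uncertainty psi
            psi = random.gauss(1.0, 0.3)
            for t in (0, 1):
                nb[t] += net_benefit(t, theta, psi) / inner
                baseline[t] += net_benefit(t, theta, psi) / (outer * inner)
        gain += max(nb.values()) / outer     # best choice knowing theta
    return gain - max(baseline.values())     # minus best a-priori choice

print(f"EVPPI estimate: {evppi():.3f}")
```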
Modeling Biodegradation and Reactive Transport: Analytical and Numerical Models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sun, Y; Glascoe, L
The computational modeling of the biodegradation of contaminated groundwater systems, accounting for biochemical reactions coupled to contaminant transport, is a valuable tool both for the field engineer/planner with limited computational resources and for the expert computational researcher less constrained by time and computer power. Several analytical and numerical computer models have been and are being developed to cover the practical needs put forth by users across this spectrum of computational demands. Generally, analytical models provide rapid and convenient screening tools running on very limited computational power, while numerical models can provide more detailed information, with consequent requirements of greater computational time and effort. While these analytical and numerical computer models can provide accurate and adequate information to produce defensible remediation strategies, decisions based on inadequate modeling output or on over-analysis can have costly and risky consequences. In this chapter we consider both analytical and numerical modeling approaches to biodegradation and reactive transport. Both approaches are discussed and analyzed in terms of achieving bioremediation goals, recognizing that there is always a tradeoff between computational cost and the resolution of simulated systems.
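As a minimal example of this analytical/numerical trade-off, first-order biodegradation C' = -kC has a closed-form screening solution that a numerical integrator only approximates; the rate constant and time grid below are arbitrary assumptions.

```python
import math

# First-order contaminant decay: analytical answer vs. explicit Euler.
k, c0, t_end, dt = 0.05, 100.0, 50.0, 1.0   # 1/day, mg/L, days, days

analytical = c0 * math.exp(-k * t_end)       # closed-form screening result

c, t = c0, 0.0                               # numerical integration
while t < t_end:
    c -= k * c * dt
    t += dt

print(f"analytical: {analytical:.2f} mg/L, numerical: {c:.2f} mg/L")
```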
"Sugar-Ray" School-Based Decision Groups.
ERIC Educational Resources Information Center
Hunt, John J.; And Others
1992-01-01
Investigates differences between high-achieving and low-achieving school-based decision groups in decision making. Decision groups (207 groups of 3 members each) used computer simulations to address problems facing principals concerning fourth grade academic achievement. Higher-achieving groups made more decisions and made a combination of related…
Jibaja-Weiss, Maria L; Volk, Robert J
2007-01-01
Decision aids have been developed using various delivery methods, including interactive computer programs. Such programs, however, still rely heavily on written information and hence on health and digital literacy and reading ease. We describe an approach to overcoming these potential barriers for low-literate, underserved populations by making design considerations for poor readers and naïve computer users and by using concepts from entertainment education to engage the user and to contextualize the content for the user. The system design goals are to make the program both didactic and entertaining and the navigation and graphical user interface as simple as possible. One entertainment education strategy, the soap opera, is linked seamlessly to interactive learning modules that enhance the content of the soap opera episodes. The edutainment decision aid model (EDAM) guides developers through the design process. Although designing patient decision aids that are educational, entertaining, and targeted toward poor readers and those with limited computer skills is a complex task, it is a promising strategy for aiding this population. Entertainment education may be a highly effective approach to promoting informed decision making for patients with low health literacy.
Digital avionics design and reliability analyzer
NASA Technical Reports Server (NTRS)
1981-01-01
The description and specifications for a digital avionics design and reliability analyzer are given. Its basic function is to provide for the simulation and emulation of the various fault-tolerant digital avionic computer designs that are developed. It has been established that hardware emulation at the gate level will be utilized. The primary benefit of emulation to reliability analysis is that it provides the capability to model a system at a very detailed level. Emulation allows the direct insertion of faults into the system, rather than waiting for actual hardware failures to occur. This allows for controlled and accelerated testing of system reaction to hardware failures. A trade study led to the decision to specify a two-machine system, with an emulation computer connected to a general-purpose computer. Potential computers to serve as the emulation computer were also evaluated.
A systematic mapping study of process mining
NASA Astrophysics Data System (ADS)
Maita, Ana Rocío Cárdenas; Martins, Lucas Corrêa; López Paz, Carlos Ramón; Rafferty, Laura; Hung, Patrick C. K.; Peres, Sarajane Marques; Fantinato, Marcelo
2018-05-01
This study systematically assesses the process mining scenario from 2005 to 2014. The analysis of 705 papers evidenced 'discovery' (71%) as the main type of process mining addressed and 'categorical prediction' (25%) as the main mining task solved. The most applied traditional techniques are 'graph structure-based' ones (38%). Concerning computational intelligence and machine learning techniques specifically, we concluded that they have received little attention; the most applied are 'evolutionary computation' (9%) and 'decision tree' (6%). Process mining challenges, such as balancing among robustness, simplicity, accuracy and generalization, could benefit from greater use of such techniques.
MoCog1: A computer simulation of recognition-primed human decision making
NASA Technical Reports Server (NTRS)
Gevarter, William B.
1991-01-01
This report describes the successful results of the first stage of a research effort to develop a 'sophisticated' computer model of human cognitive behavior. Most human decision-making is of the experience-based, relatively straightforward, largely automatic type: a response to internal goals and drives, utilizing cues and opportunities perceived in the current environment. This report describes the development of the architecture and computer program associated with such 'recognition-primed' decision-making. The resultant computer program was successfully utilized as a vehicle to simulate findings on how an individual's implicit theories orient them toward particular goals, with resultant cognitions, affects, and behavior in response to their environment. The present work is an expanded version of research reported while the author was an employee of NASA ARC.
Automatic Data Filter Customization Using a Genetic Algorithm
NASA Technical Reports Server (NTRS)
Mandrake, Lukas
2013-01-01
This work predicts whether a retrieval algorithm will usefully determine CO2 concentration from an input spectrum of GOSAT (Greenhouse Gases Observing Satellite). This was done to eliminate needless runtime on atmospheric soundings that would never yield useful results. A space of 50 dimensions was examined for predictive power on the final CO2 results. Retrieval algorithms are frequently expensive to run, and wasted effort defeats requirements and expends needless resources. This algorithm could be used to help predict and filter unneeded runs in any computationally expensive regime. Traditional methods such as Fisher discriminant analysis and decision trees can attempt to predict whether a sounding will be properly processed. However, this work sought to detect a subsection of the dimensional space that can simply be filtered out to eliminate unwanted runs. LDAs (linear discriminant analyses) and other systems examine the entire dataset and judge a "best fit," giving equal weight to complex and problematic regions as well as simple, clear-cut regions. In this implementation, a genetic space of "left" and "right" thresholds outside of which all data are rejected was defined. These left/right pairs are created for each of the 50 input dimensions. A genetic algorithm then runs through countless potential filter settings using a JPL computer cluster, optimizing the tossed-out data's yield (proper vs. improper run removal) and the number of points tossed. This solution is robust to an arbitrary decision boundary within the data and avoids the global optimization problem of whole-dataset fitting using LDA or decision trees. It filters out runs that would not have produced useful CO2 values, saving needless computation. This would be an algorithmic preprocessing improvement to any computationally expensive system.
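A stripped-down Python sketch of the left/right threshold-filter idea; the data are random stand-ins (so the filter can only learn base rates here), and the fitness weighting and evolutionary loop are assumptions, not JPL's implementation.

```python
import random

DIMS = 5
# (features, retrieval_ok): fabricated soundings, ~70% would process properly.
data = [([random.random() for _ in range(DIMS)], random.random() < 0.7)
        for _ in range(1000)]

def fitness(filt):
    """Reward keeping good runs and rejecting bad ones."""
    kept_good = rejected_bad = 0
    for x, ok in data:
        keep = all(lo <= v <= hi for v, (lo, hi) in zip(x, filt))
        if keep and ok:
            kept_good += 1
        elif not keep and not ok:
            rejected_bad += 1
    return kept_good + 2 * rejected_bad      # assumed weighting

def random_pair():
    return tuple(sorted((random.random(), random.random())))

# Crude evolutionary loop: mutate the left/right pairs, keep improvements.
best = [random_pair() for _ in range(DIMS)]
for _ in range(200):
    challenger = [pair if random.random() < 0.8 else random_pair()
                  for pair in best]
    if fitness(challenger) > fitness(best):
        best = challenger
print("best fitness:", fitness(best))
```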
Wall, Stephen P; Mayorga, Oliver; Banfield, Christine E; Wall, Mark E; Aisic, Ilan; Auerbach, Carl; Gennis, Paul
2006-11-01
To develop software that categorizes electronic head computed tomography (CT) reports into groups useful for clinical decision rule research. Data were obtained from the Second National Emergency X-Radiography Utilization Study, a cohort of head injury patients having received head CT. CT reports were reviewed manually for presence or absence of clinically important subdural or epidural hematoma, defined as greater than 1.0 cm in width or causing mass effect. Manual categorization was done by 2 independent researchers blinded to each other's results. A third researcher adjudicated discrepancies. A random sample of 300 reports with radiologic abnormalities was selected for software development. After excluding reports categorized manually or by software as indeterminate (neither positive nor negative), we calculated sensitivity and specificity by using manual categorization as the standard. System efficiency was defined as the percentage of reports categorized as positive or negative, regardless of accuracy. Software was refined until analysis of the training data yielded sensitivity and specificity approximating 95% and efficiency exceeding 75%. To test the system, we calculated sensitivity, specificity, and efficiency, using the remaining 1,911 reports. Of the 1,911 reports, 160 had clinically important subdural or epidural hematoma. The software exhibited good agreement with manual categorization of all reports, including indeterminate ones (weighted kappa 0.62; 95% confidence interval [CI] 0.58 to 0.65). Sensitivity, specificity, and efficiency of the computerized system for identifying manual positives and negatives were 96% (95% CI 91% to 98%), 98% (95% CI 98% to 99%), and 79% (95% CI 77% to 80%), respectively. Categorizing head CT reports by computer for clinical decision rule research is feasible.
de Achaval, Sofia; Fraenkel, Liana; Volk, Robert J.; Cox, Vanessa; Suarez-Almazor, Maria E.
2012-01-01
Our objective was to examine the impact of a videobooklet patient decision aid, supplemented by an interactive values clarification exercise, on decisional conflict in patients with knee osteoarthritis (OA) considering total knee arthroplasty. 208 patients participated in the study (mean age 63 years; 68% female; 66% White). Participants were randomized to 1 of 3 groups: (1) educational booklet on OA management (control); (2) patient decision aid (videobooklet) on OA management; and (3) patient decision aid (videobooklet) plus an adaptive conjoint analysis (ACA) tool. The ACA tool enables patients to consider competing attributes (i.e., specific risks/benefits) by asking them to rate a series of paired comparisons. The primary outcome was the decisional conflict scale, ranging from 0 to 100. Differences between groups were analyzed using analysis of variance (ANOVA) and Tukey's honestly significant difference tests. Overall, decisional conflict decreased significantly in all groups (p<0.05). The largest reduction in decisional conflict was observed for participants in the videobooklet decision aid group (21 points). Statistically significant differences in pre- vs. post-intervention total scores favored the videobooklet group compared to the control group (21 vs. 10) and to the videobooklet plus ACA group (21 vs. 14; p<0.001). Changes in the decisional conflict score for the control group compared to the videobooklet decision aid plus ACA group were not significantly different. In our study, an audiovisual patient decision aid decreased decisional conflict more than printed material alone, or than the addition of a more complex computer-based ACA tool requiring more intense cognitive involvement and explicit value choices. PMID:21954198
Probabilistic Structural Analysis Program
NASA Technical Reports Server (NTRS)
Pai, Shantaram S.; Chamis, Christos C.; Murthy, Pappu L. N.; Stefko, George L.; Riha, David S.; Thacker, Ben H.; Nagpal, Vinod K.; Mital, Subodh K.
2010-01-01
NASA/NESSUS 6.2c is a general-purpose, probabilistic analysis program that computes probability of failure and probabilistic sensitivity measures of engineered systems. Because NASA/NESSUS uses highly computationally efficient and accurate analysis techniques, probabilistic solutions can be obtained even for extremely large and complex models. Once the probabilistic response is quantified, the results can be used to support risk-informed decisions regarding reliability for safety-critical and one-of-a-kind systems, as well as for maintaining a level of quality while reducing manufacturing costs for larger-quantity products. NASA/NESSUS has been successfully applied to a diverse range of problems in aerospace, gas turbine engines, biomechanics, pipelines, defense, weaponry, and infrastructure. The program combines state-of-the-art probabilistic algorithms with general-purpose structural analysis and lifing methods to compute the probabilistic response and reliability of engineered structures. Uncertainties in load, material properties, geometry, boundary conditions, and initial conditions can be simulated. The structural analysis methods include nonlinear finite-element methods, heat-transfer analysis, polymer/ceramic matrix composite analysis, monolithic (conventional metallic) materials life-prediction methodologies, boundary element methods, and user-written subroutines. Several probabilistic algorithms are available, such as the advanced mean value method and the adaptive importance sampling method. NASA/NESSUS 6.2c is structured in a modular format with 15 elements.
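Conceptually, the probability-of-failure question such a program answers can be posed as a brute-force Monte Carlo experiment like the sketch below (NESSUS itself uses far more efficient algorithms, such as the advanced mean value method); the load and strength distributions here are invented.

```python
import random

def prob_failure(n=100_000):
    """Count how often a sampled load exceeds a sampled strength."""
    fails = 0
    for _ in range(n):
        load = random.gauss(100.0, 15.0)      # assumed load distribution
        strength = random.gauss(140.0, 20.0)  # assumed strength distribution
        fails += load > strength
    return fails / n

print(f"P(failure) ~ {prob_failure():.4f}")
```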
Emotion-affected decision making in human simulation.
Zhao, Y; Kang, J; Wright, D K
2006-01-01
Human modelling is an interdisciplinary research field. The topic, emotion-affected decision making, was originally a cognitive psychology issue, but is now recognized as an important research direction for both computer science and biomedical modelling. The main aim of this paper is to attempt to bridge the gap between psychology and bioengineering in emotion-affected decision making. The work is based on Ortony's theory of emotions and bounded rationality theory, and attempts to connect the emotion process with decision making. A computational emotion model is proposed, and the initial framework of this model in virtual human simulation within the platform of Virtools is presented.
Biomedical Informatics for Computer-Aided Decision Support Systems: A Survey
Belle, Ashwin; Kon, Mark A.; Najarian, Kayvan
2013-01-01
The volumes of current patient data as well as their complexity make clinical decision making more challenging than ever for physicians and other care givers. This situation calls for the use of biomedical informatics methods to process data and form recommendations and/or predictions to assist such decision makers. The design, implementation, and use of biomedical informatics systems in the form of computer-aided decision support have become essential and widely used over the last two decades. This paper provides a brief review of such systems, their application protocols and methodologies, and the future challenges and directions they suggest. PMID:23431259
A decision-based perspective for the design of methods for systems design
NASA Technical Reports Server (NTRS)
Mistree, Farrokh; Muster, Douglas; Shupe, Jon A.; Allen, Janet K.
1989-01-01
Organization of material, a definition of decision based design, a hierarchy of decision based design, the decision support problem technique, a conceptual model design that can be manufactured and maintained, meta-design, computer-based design, action learning, and the characteristics of decisions are among the topics covered.
Visual saliency-based fast intracoding algorithm for high efficiency video coding
NASA Astrophysics Data System (ADS)
Zhou, Xin; Shi, Guangming; Zhou, Wei; Duan, Zhemin
2017-01-01
Intraprediction has been significantly improved in high efficiency video coding (HEVC) over H.264/AVC, with a quad-tree-based coding unit (CU) structure from size 64×64 down to 8×8 and more prediction modes. However, these techniques cause a dramatic increase in computational complexity. An intracoding algorithm is proposed that consists of a perceptual fast CU size decision algorithm and a fast intraprediction mode decision algorithm. First, based on visual saliency detection, an adaptive and fast CU size decision method is proposed to alleviate intraencoding complexity. Furthermore, a fast intraprediction mode decision algorithm with a step-halving rough mode decision method and an early mode-pruning algorithm is presented to selectively check the potential modes and effectively reduce the computational complexity. Experimental results show that the proposed fast method reduces the computational complexity of the current HM by about 57% in encoding time, with only a 0.37% increase in BD rate. Meanwhile, the proposed fast algorithm incurs reasonable peak signal-to-noise ratio losses and nearly the same subjective perceptual quality.
Two Quantum Protocols for Oblivious Set-member Decision Problem
NASA Astrophysics Data System (ADS)
Shi, Run-Hua; Mu, Yi; Zhong, Hong; Cui, Jie; Zhang, Shun
2015-10-01
In this paper, we define a new secure multi-party computation problem, called the Oblivious Set-member Decision problem, which allows one party to decide whether a secret of another party belongs to his private set in an oblivious manner. The Oblivious Set-member Decision problem has many important applications in privacy-preserving multi-party collaborative computation, such as private set intersection and union, anonymous authentication, electronic voting and electronic auction. Furthermore, we present two quantum protocols to solve the Oblivious Set-member Decision problem. Protocol I takes advantage of powerful quantum oracle operations, so it has lower communication and computation complexity; Protocol II takes photons as quantum resources and performs only simple single-particle projective measurements, so it is more feasible with present technology. PMID:26514668
NASA Technical Reports Server (NTRS)
Schenker, Paul S. (Editor)
1992-01-01
Various papers on control paradigms and data structures in sensor fusion are presented. The general topics addressed include: decision models and computational methods, sensor modeling and data representation, active sensing strategies, geometric planning and visualization, task-driven sensing, motion analysis, models motivated by biology and psychology, decentralized detection and distributed decision-making, data fusion architectures, robust estimation of shapes and features, and application and implementation. Some of the individual subjects considered are: the Firefly experiment on neural networks for distributed sensor data fusion, manifold traversing as a model for learning control of autonomous robots, choice of coordinate systems for multiple sensor fusion, continuous motion using task-directed stereo vision, interactive and cooperative sensing and control for advanced teleoperation, knowledge-based imaging for terrain analysis, and physical and digital simulations for IVA robotics.
Women's decision to major in STEM fields
NASA Astrophysics Data System (ADS)
Conklin, Stephanie
This paper explores the lived experiences of high school female students who choose to enter STEM fields, and describes the influencing factors that steered these women towards majors in computer science, engineering and biology. Utilizing phenomenological methodology, this study seeks to understand the essence of women's decisions to enter STEM fields and further describes how the decision-making process varies for women in high-female-enrollment fields, like biology, as compared with low-enrollment fields, like computer science and engineering. Using Bloom's 3-Stage Theory, this study analyzes how relationships, experiences and barriers influenced women towards, and possibly away from, STEM fields. An analysis of women's experiences highlights that support of family, sustained experience in a STEM program during high school, and the presence of an influential teacher were all salient factors in steering women towards STEM fields. Participants explained that influential teachers worked individually with them, modified and extended assignments, and steered participants towards coursework and experiences. This study also identifies factors, like guidance counselors as well as personal challenges, that inhibited participants' paths to STEM fields. Further, in analyzing all six participants' experiences, it became clear that a linear model like Bloom's 3-Stage Model, with its limited ability to include potential barriers, could not capture the essence of each participant's decision-making process. Therefore, a revised, non-linear model that allows for emerging factors, like personal challenges, is proposed; this model focuses on how interest in STEM fields begins to develop and is then honed and mastered. This study also sought to identify key differences in the paths of female students pursuing different majors. The findings suggest that the path to computer science and engineering is narrow. Computer science majors faced few, if any, challenges, hoped to use computers as a tool to innovate, and had all participated in the same computer science program. For female engineering students, the essence of their experience centered on interaction at a young age with an expert in an engineering-related field and a strong desire to help solve world problems using engineering. These participants were able to articulate clear future career plans. In contrast, biology majors faced more challenges and were undecided about their future career goals. These results suggest that a longitudinal study focused on women pursuing engineering and computer science is warranted, to substantiate these findings and to refine the revised theoretical model.
Glassman, E Katelyn; Hughes, Michelle L
2013-01-01
Current cochlear implants (CIs) have telemetry capabilities for measuring the electrically evoked compound action potential (ECAP). Neural Response Telemetry (Cochlear) and Neural Response Imaging (Advanced Bionics [AB]) can measure ECAP responses across a range of stimulus levels to obtain an amplitude-growth function. Software-specific algorithms automatically mark the leading negative peak, N1, and the following positive peak/plateau, P2, and apply linear regression to estimate ECAP threshold. Alternatively, clinicians may apply expert judgment to modify the peak markers placed by the software algorithms, or use visual detection to identify the lowest level yielding a measurable ECAP response. The goals of this study were: (1) to assess the variability between human and computer decisions for (a) marking N1 and P2 and (b) determining linear-regression threshold (LRT) and visual-detection threshold (VDT); and (2) to compare LRT and VDT methods within and across human- and computer-decision methods. ECAP amplitude-growth functions were measured for three electrodes in each of 20 ears (10 Cochlear Nucleus® 24RE/CI512, and 10 AB CII/90K). LRT, defined as the current level yielding an ECAP with zero amplitude, was calculated for both computer- (C-LRT) and human-picked peaks (H-LRT). VDT, defined as the lowest level resulting in a measurable ECAP response, was also calculated for both computer- (C-VDT) and human-picked peaks (H-VDT). Because Neural Response Imaging assigns peak markers to all waveforms but does not include waveforms with amplitudes less than 20 μV in its regression calculation, C-VDT for AB subjects was defined as the lowest current level yielding an amplitude of 20 μV or more. Overall, there were significant correlations between human and computer decisions for peak-marker placement, LRT, and VDT for both manufacturers (r = 0.78-1.00, p < 0.001). For Cochlear devices, LRT and VDT correlated equally well for both computer- and human-picked peaks (r = 0.98-0.99, p < 0.001), which likely reflects the well-defined Neural Response Telemetry algorithm and the lower noise floor in the 24RE and CI512 devices. For AB devices, correlations between LRT and VDT for both peak-picking methods were weaker than for Cochlear devices (r = 0.69-0.85, p < 0.001), which likely reflects the higher noise floor of the system. Disagreement between computer and human decisions regarding the presence of an ECAP response occurred for 5% of traces for Cochlear devices and 2.1% of traces for AB devices. Results indicate that human and computer peak-picking methods can be used with similar accuracy for both Cochlear and AB devices. Either C-VDT or C-LRT can be used with equal confidence for Cochlear 24RE and CI512 recipients because both methods are strongly correlated with human decisions. However, for AB devices, greater variability exists between different threshold-determination methods. This finding should be considered in the context of using ECAP measures to assist with programming CIs.
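To make the two threshold definitions concrete, here is a small Python sketch that fits a line to the measurable points of a fabricated amplitude-growth function and extrapolates to zero amplitude (LRT), alongside the visual-detection threshold (VDT):

```python
# Fabricated amplitude-growth function: stimulus level vs. ECAP amplitude.
levels = [180, 190, 200, 210, 220]          # current level (device units)
amps = [0.0, 25.0, 60.0, 95.0, 130.0]       # ECAP amplitude (uV)

# Use only measurable (nonzero) responses for the regression.
pts = [(x, y) for x, y in zip(levels, amps) if y > 0]
n = len(pts)
mx = sum(x for x, _ in pts) / n
my = sum(y for _, y in pts) / n
slope = (sum((x - mx) * (y - my) for x, y in pts)
         / sum((x - mx) ** 2 for x, _ in pts))
intercept = my - slope * mx

lrt = -intercept / slope            # level where the fitted line hits 0 uV
vdt = min(x for x, _ in pts)        # lowest level with a measurable ECAP
print(f"LRT ~ {lrt:.1f}, VDT = {vdt}")
```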
Virtual microscopy and digital pathology in training and education.
Hamilton, Peter W; Wang, Yinhai; McCullough, Stephen J
2012-04-01
Traditionally, education and training in pathology has been delivered using textbooks, glass slides and conventional microscopy. Over the last two decades, the number of web-based pathology resources has expanded dramatically with centralized pathological resources being delivered to many students simultaneously. Recently, whole slide imaging technology allows glass slides to be scanned and viewed on a computer screen via dedicated software. This technology is referred to as virtual microscopy and has created enormous opportunities in pathological training and education. Students are able to learn key histopathological skills, e.g. to identify areas of diagnostic relevance from an entire slide, via a web-based computer environment. Students no longer need to be in the same room as the slides. New human-computer interfaces are also being developed using more natural touch technology to enhance the manipulation of digitized slides. Several major initiatives are also underway introducing online competency and diagnostic decision analysis using virtual microscopy and have important future roles in accreditation and recertification. Finally, researchers are investigating how pathological decision-making is achieved using virtual microscopy and modern eye-tracking devices. Virtual microscopy and digital pathology will continue to improve how pathology training and education is delivered. © 2012 The Authors APMIS © 2012 APMIS.
Devaluation and sequential decisions: linking goal-directed and model-based behavior
Friedel, Eva; Koch, Stefan P.; Wendt, Jean; Heinz, Andreas; Deserno, Lorenz; Schlagenhauf, Florian
2014-01-01
In experimental psychology, different experiments have been developed to assess goal-directed as compared with habitual control over instrumental decisions. As in animal studies, selective devaluation procedures have been used. More recently, sequential decision-making tasks have been designed to assess the degree of goal-directed vs. habitual choice behavior in terms of an influential computational theory of model-based as compared with model-free behavioral control. As recently suggested, the different measurements are thought to reflect the same construct. Yet there has been no attempt to directly assess the construct validity of these different measurements. In the present study, we used a devaluation paradigm and a sequential decision-making task to address this question of construct validity in a sample of 18 healthy male human participants. Correlational analysis revealed a positive association between model-based choices during sequential decisions and goal-directed behavior after devaluation, suggesting a single framework underlying both operationalizations and speaking in favor of the construct validity of both measurement approaches. Up to now, this had been merely assumed but never directly tested in humans. PMID:25136310
NASA Astrophysics Data System (ADS)
Gil, Y.; Duffy, C.
2015-12-01
This paper proposes the concept of a "Computable Catchment", which is used to develop a collaborative platform for watershed modeling and data analysis. The object of the research is a sharable, executable document, similar to a PDF, that includes documentation of the underlying theoretical concepts, interactive computational/numerical resources, linkage to essential data repositories, and the ability for interactive model-data visualization and analysis. The executable document for each catchment is stored in the cloud with automatic provisioning and a unique identifier, allowing collaborative model and data enhancements for historical hydroclimatic reconstruction and/or future land-use or climate change scenarios to be easily reconstructed or extended. The Computable Catchment adopts metadata standards for naming all variables in the model and the data. The a priori (initial) data are derived from national data sources for soils, hydrogeology, climate, and land cover available from the www.hydroterre.psu.edu data service (Leonard and Duffy, 2015). The executable document is based on Wolfram CDF (Computable Document Format), with an interactive open-source reader accessible on any modern computing platform. The CDF file and contents can be uploaded to a website or simply shared as a normal document, maintaining all interactive features of the model and data. The Computable Catchment concept represents one application of Geoscience Papers of the Future: an extensible document that combines theory, models, data and analysis that are digitally shared, documented and reused among research collaborators, students, educators and decision makers.
Decision Making in Computer-Simulated Experiments.
ERIC Educational Resources Information Center
Suits, J. P.; Lagowski, J. J.
A set of interactive, computer-simulated experiments was designed to respond to the large range of individual differences in aptitude and reasoning ability generally exhibited by students enrolled in first-semester general chemistry. These experiments give students direct experience in the type of decision making needed in an experimental setting.…
Factors Influencing Cloud-Computing Technology Adoption in Developing Countries
ERIC Educational Resources Information Center
Hailu, Alemayehu
2012-01-01
Adoption of new technology has complicating components both from the selection, as well as decision-making criteria and process. Although new technology such as cloud computing provides great benefits especially to the developing countries, it has challenges that may complicate the selection decision and subsequent adoption process. This study…
Computer-Assisted Community Planning and Decision Making.
ERIC Educational Resources Information Center
College of the Atlantic, Bar Harbor, ME.
The College of the Atlantic (COA) developed a broad-based, interdisciplinary curriculum in ecological policy and community planning and decision-making that incorporates two primary computer-based tools: ARC/INFO Geographic Information System (GIS) and STELLA, a systems-dynamics modeling tool. Students learn how to use and apply these tools…
Using a Group Decision Support System for Creativity.
ERIC Educational Resources Information Center
Aiken, Milam; Riggs, Mary
1993-01-01
A computer-based group decision support system (GDSS) to increase collaborative group productivity and creativity is explained. Various roles for the computer are identified, and implementation of GDSS systems at the University of Mississippi and International Business Machines are described. The GDSS is seen as fostering productivity through…
Solving subsurface structural problems using a computer
DOE Office of Scientific and Technical Information (OSTI.GOV)
Witte, D.M.
1987-02-01
Until recently, the solution of subsurface structural problems has required a combination of graphical construction, trigonometry, time, and patience. Recent advances in software available for both mainframe and microcomputers now reduce the time and potential error of these calculations by an order of magnitude. Software for analysis of deviated wells, three-point problems, apparent dip, apparent thickness, and the intersection of two planes, as well as the plotting and interpretation of these data, can be used to allow timely and accurate exploration or operational decisions. The available computer software provides a set of utilities, or tools, rather than a comprehensive, intelligent system. The burden for selection of appropriate techniques, computation methods, and interpretations still lies with the explorationist user.
A conceptual and computational model of moral decision making in human and artificial agents.
Wallach, Wendell; Franklin, Stan; Allen, Colin
2010-07-01
Recently, there has been a resurgence of interest in general, comprehensive models of human cognition. Such models aim to explain higher-order cognitive faculties, such as deliberation and planning. Given a computational representation, the validity of these models can be tested in computer simulations such as software agents or embodied robots. The push to implement computational models of this kind has created the field of artificial general intelligence (AGI). Moral decision making is arguably one of the most challenging tasks for computational approaches to higher-order cognition. The need for increasingly autonomous artificial agents to factor moral considerations into their choices and actions has given rise to another new field of inquiry variously known as Machine Morality, Machine Ethics, Roboethics, or Friendly AI. In this study, we discuss how LIDA, an AGI model of human cognition, can be adapted to model both affective and rational features of moral decision making. Using the LIDA model, we will demonstrate how moral decisions can be made in many domains using the same mechanisms that enable general decision making. Comprehensive models of human cognition typically aim for compatibility with recent research in the cognitive and neural sciences. Global workspace theory, proposed by the neuropsychologist Bernard Baars (1988), is a highly regarded model of human cognition that is currently being computationally instantiated in several software implementations. LIDA (Franklin, Baars, Ramamurthy, & Ventura, 2005) is one such computational implementation. LIDA is both a set of computational tools and an underlying model of human cognition, which provides mechanisms that are capable of explaining how an agent's selection of its next action arises from bottom-up collection of sensory data and top-down processes for making sense of its current situation. We will describe how the LIDA model helps integrate emotions into the human decision-making process, and we will elucidate a process whereby an agent can work through an ethical problem to reach a solution that takes account of ethically relevant factors. Copyright © 2010 Cognitive Science Society, Inc.
Simulation of human decision making
Forsythe, J Chris [Sandia Park, NM; Speed, Ann E [Albuquerque, NM; Jordan, Sabina E [Albuquerque, NM; Xavier, Patrick G [Albuquerque, NM
2008-05-06
A method for computer emulation of human decision making defines a plurality of concepts related to a domain and a plurality of situations related to the domain, where each situation is a combination of at least two of the concepts. Each concept and situation is represented in the computer as an oscillator output, and each situation and concept oscillator output is distinguishable from all other oscillator outputs. Information is input to the computer representative of detected concepts, and the computer compares the detected concepts with the stored situations to determine if a situation has occurred.
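A toy Python rendering of the patented scheme, with invented concepts, frequencies, and situations: each concept maps to a distinguishable oscillator, and a situation is recognized when all of its component concepts are detected.

```python
import math

# Each concept gets a distinguishable oscillator frequency (Hz, invented).
CONCEPTS = {"threat": 3.0, "asset": 5.0, "proximity": 7.0}
# A situation is a combination of at least two concepts (invented table).
SITUATIONS = {"attack_warning": {"threat", "asset", "proximity"},
              "patrol": {"asset", "proximity"}}

def signal(active, t):
    """Superposed output of the oscillators for currently detected concepts."""
    return sum(math.sin(2 * math.pi * CONCEPTS[c] * t) for c in active)

detected = {"threat", "asset", "proximity"}   # concepts input to the system
print(f"oscillator bus output at t=0.01: {signal(detected, 0.01):.3f}")
for name, needed in SITUATIONS.items():
    if needed <= detected:                    # all component concepts present
        print(f"situation recognized: {name}")
```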
Stefanović, Stefica Cerjan; Bolanča, Tomislav; Luša, Melita; Ukić, Sime; Rogošić, Marko
2012-02-24
This paper describes the development of an ad hoc methodology for determination of inorganic anions in oilfield water, since its composition often differs significantly from the average (in component concentrations and/or matrix). Fast and reliable method development therefore has to be performed in order to ensure the monitoring of desired properties under new conditions. The method development was based on a computer-assisted multi-criteria decision-making strategy. The criteria used were: maximal value of the objective functions used, maximal robustness of the separation method, minimal analysis time, and maximal retention distance between the two nearest components. Artificial neural networks were used for modeling of anion retention. The reliability of the developed method was extensively tested through validation of its performance characteristics. Based on the validation results, the developed method shows satisfactory performance characteristics, proving the successful application of computer-assisted methodology in the described case study. Copyright © 2011 Elsevier B.V. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
The model is designed to enable decision makers to compare the economics of geothermal projects with the economics of alternative energy systems at an early stage in the decision process. The geothermal engineering and economic feasibility computer model (GEEF) is written in FORTRAN IV and can be run on a mainframe or a mini-computer system. An abbreviated version of the model is being developed for use with a programmable desk calculator. The GEEF model has two main segments, namely (i) the engineering design/cost segment and (ii) the economic analysis segment. In the engineering segment, the model determines the numbers of production and injection wells, heat exchanger design, operating parameters for the system, the requirement for a supplementary system (to augment the working fluid temperature if the resource temperature is not sufficiently high), and the fluid flow rates. The model can handle single-stage systems as well as two-stage cascaded systems in which the second stage may involve a space heating application after a process heat application in the first stage.
Hanrahan, Kirsten; McCarthy, Ann Marie; Kleiber, Charmaine; Ataman, Kaan; Street, W Nick; Zimmerman, M Bridget; Ersig, Anne L
2012-10-01
This secondary data analysis used data mining methods to develop predictive models of child risk for distress during a healthcare procedure. The data came from a study that identified factors associated with children's responses to an intravenous catheter insertion while parents provided distraction coaching. From the 255 items used in the primary study, 44 predictive items were identified through automatic feature selection and used to build support vector machine regression models. Models were validated using multiple cross-validation tests and by comparing variables identified as explanatory in traditional versus support vector machine regression. Rule-based approaches were applied to the model outputs to identify overall risk for distress. A decision tree was then applied to evidence-based instructions for tailoring distraction to the characteristics and preferences of the parent and child. The resulting decision support computer application, titled Children, Parents and Distraction, is being used in research. Future use will support practitioners in deciding the level and type of distraction intervention needed by a child undergoing a healthcare procedure.
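A minimal scikit-learn sketch of a pipeline in the spirit of the one described above: automatic feature selection down to 44 items followed by support vector regression, on synthetic stand-in data (the real study's items and outcome are not reproduced here).

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, f_regression
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVR

# Synthetic stand-ins for 255 survey items and a distress score.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 255))
y = X[:, :5].sum(axis=1) + rng.normal(scale=0.5, size=100)  # toy signal

# Select the 44 most predictive items, then fit an SVM regression model.
model = make_pipeline(SelectKBest(f_regression, k=44), SVR())
model.fit(X, y)
print(model.predict(X[:3]))   # predicted distress risk for three children
```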
Breast Cancer Detection with Reduced Feature Set.
Mert, Ahmet; Kılıç, Niyazi; Bilgili, Erdem; Akan, Aydin
2015-01-01
This paper explores the feature-reduction properties of independent component analysis (ICA) in a breast cancer decision support system. The Wisconsin diagnostic breast cancer (WDBC) dataset is reduced to a one-dimensional feature vector by computing an independent component (IC). The original data with 30 features and the reduced single feature (IC) are used to evaluate the diagnostic accuracy of classifiers such as k-nearest neighbor (k-NN), artificial neural network (ANN), radial basis function neural network (RBFNN), and support vector machine (SVM). The proposed classification using the IC is also compared against the original feature set under different validation (5/10-fold cross-validation) and partitioning (20%-40%) methods. The classifiers are evaluated on how effectively they categorize tumors as benign or malignant in terms of specificity, sensitivity, accuracy, F-score, Youden's index, discriminant power, and the receiver operating characteristic (ROC) curve with its criterion values, including area under the curve (AUC) and 95% confidence interval (CI). This represents an improvement in diagnostic decision support systems, while reducing computational complexity.
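The core experiment is easy to approximate with scikit-learn, which ships the WDBC dataset; the sketch below compares k-NN accuracy on all 30 features against a single independent component (hyperparameters are library defaults, not necessarily the paper's).

```python
from sklearn.datasets import load_breast_cancer
from sklearn.decomposition import FastICA
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

X, y = load_breast_cancer(return_X_y=True)      # the WDBC dataset
ica = FastICA(n_components=1, random_state=0)   # reduce 30 features to 1 IC
X_ic = ica.fit_transform(X)

knn = KNeighborsClassifier(n_neighbors=5)
full = cross_val_score(knn, X, y, cv=10).mean()
reduced = cross_val_score(knn, X_ic, y, cv=10).mean()
print(f"30 features: {full:.3f}, 1 IC: {reduced:.3f}")
```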
Applicability of aquifer impact models to support decisions at CO 2 sequestration sites
Keating, Elizabeth; Bacon, Diana; Carroll, Susan; ...
2016-07-25
The National Risk Assessment Partnership has developed a suite of tools to assess and manage risk at CO2 sequestration sites. This capability includes polynomial or look-up-table based reduced-order models (ROMs) that predict the impact of CO2 and brine leaks on overlying aquifers. The development of these computationally efficient models and the underlying reactive transport simulations they emulate has been documented elsewhere (Carroll et al., 2014a; Carroll et al., 2014b; Dai et al., 2014; Keating et al., 2016). In this paper, we seek to demonstrate the applicability of ROM-based analysis by considering what types of decisions and aquifer types would benefit from it. We present four hypothetical examples where applying ROMs, in ensemble mode, could support decisions during a geologic CO2 sequestration project. These decisions pertain to site selection, site characterization, monitoring network evaluation, and health impacts. In all cases, we consider potential brine/CO2 leak rates at the base of the aquifer to be uncertain. We show that the derived probabilities provide information relevant to the decision at hand. Although the ROMs were developed using site-specific data from two aquifers (High Plains and Edwards), the models accept aquifer characteristics as variable inputs and so may have broader applicability. We conclude that pH and TDS predictions are the most transferable to other aquifers, based on analysis of the nine water quality metrics (pH, TDS, 4 trace metals, 3 organic compounds). Guidelines are presented for determining the aquifer types for which the ROMs should be applicable.
2017-10-01
hypothesis that a computer machine learning algorithm can analyze and classify burn injuries using multispectral imaging within 5% of an expert clinician...morbidity. In response to these challenges, the USAISR developed and obtained FDA 510(k) clearance of the Burn Navigator™, a computer decision support... computer decision support software (CDSS), can significantly change the CDSS algorithm's recommendations and thus the total fluid administered to a
High-Fidelity Design of Multimodal Restorative Interventions in Gulf War Illness
2017-10-01
ERIC Educational Resources Information Center
Burns, Matthew K.; Taylor, Crystal N.; Warmbold-Brann, Kristy L.; Preast, June L.; Hosp, John L.; Ford, Jeremy W.
2017-01-01
Intervention researchers often use curriculum-based measurement of reading fluency (CBM-R) with a brief experimental analysis (BEA) to identify an effective intervention for individual students. The current study synthesized data from 22 studies that used CBM-R data within a BEA by computing the standard error of measurement (SEM) for the median data…
Implementing Computer Technology in the Rehabilitation Process.
ERIC Educational Resources Information Center
McCollum, Paul S., Ed.; Chan, Fong, Ed.
1985-01-01
This special issue contains seven articles, addressing rehabilitation in the information age, computer-assisted rehabilitation services, computer technology in rehabilitation counseling, computer-assisted career exploration and vocational decision making, computer-assisted assessment, computer enhanced employment opportunities for persons with…
An Intelligent Polar Cyberinfrastrucuture to Support Spatiotemporal Decision Making
NASA Astrophysics Data System (ADS)
Song, M.; Li, W.; Zhou, X.
2014-12-01
In the era of big data, polar sciences face an urgent demand for intelligent approaches to support precise and effective spatiotemporal decision-making. Service-oriented cyberinfrastructure has the advantage of seamlessly integrating distributed computing resources and aggregating a variety of geospatial data derived from Earth observation networks. This paper focuses on building a smart service-oriented cyberinfrastructure to support intelligent question answering over polar datasets. The innovations of this polar cyberinfrastructure include: (1) a problem-solving environment that parses geospatial questions in natural language, builds geoprocessing rules, composes atomic processing services, and executes the entire workflow; (2) a self-adaptive spatiotemporal filter that is capable of refining query constraints through semantic analysis; (3) a dynamic visualization strategy to support results animation and statistics in multiple spatial reference systems; and (4) a user-friendly online portal to support collaborative decision-making. By means of this polar cyberinfrastructure, we intend to facilitate the integration of distributed and heterogeneous Arctic datasets and comprehensive analysis of multiple environmental elements (e.g., snow, ice, permafrost) to provide a better understanding of environmental variation in circumpolar regions.
Doyle, Richard J; Wang, Nina; Anthony, David; Borkan, Jeffrey; Shield, Renee R; Goldman, Roberta E
2012-10-01
We compared physicians' self-reported attitudes and behaviours regarding electronic health record (EHR) use before and after installation of computers in patient examination rooms and transition to full implementation of an EHR in a family medicine training practice, to identify the anticipated and observed effects of these changes on physicians' practices and clinical encounters. We conducted two individual qualitative interviews with family physicians: the first before, and the second 8 months after, full implementation of an EHR and installation of computers in the examination rooms. Data were analysed through project team discussions and subsequent coding with qualitative analysis software. In the first interviews, physicians frequently expressed concerns about the potential negative effect of the EHR on quality of care and physician-patient interaction, the adequacy of their skills in EHR use, and privacy and confidentiality. Nevertheless, most physicians also anticipated multiple benefits, including improved accessibility of patient data and online health information. In the second interviews, physicians reported that their concerns had not persisted. Many anticipated benefits were realized, appearing to facilitate collaborative physician-patient relationships. Physicians reported a greater teaching role with patients, sharing online medical information and treatment plan decisions. Before computer installation and full EHR implementation, physicians expressed concerns about the impact of computer use on patient care. After installation and implementation, however, many concerns were mitigated. Using computers in the examination rooms to document and access patients' records, along with online medical information and decision-making tools, appears to contribute to improved physician-patient communication and collaboration.
The Generation Challenge Programme Platform: Semantic Standards and Workbench for Crop Science
Bruskiewich, Richard; Senger, Martin; Davenport, Guy; Ruiz, Manuel; Rouard, Mathieu; Hazekamp, Tom; Takeya, Masaru; Doi, Koji; Satoh, Kouji; Costa, Marcos; Simon, Reinhard; Balaji, Jayashree; Akintunde, Akinnola; Mauleon, Ramil; Wanchana, Samart; Shah, Trushar; Anacleto, Mylah; Portugal, Arllet; Ulat, Victor Jun; Thongjuea, Supat; Braak, Kyle; Ritter, Sebastian; Dereeper, Alexis; Skofic, Milko; Rojas, Edwin; Martins, Natalia; Pappas, Georgios; Alamban, Ryan; Almodiel, Roque; Barboza, Lord Hendrix; Detras, Jeffrey; Manansala, Kevin; Mendoza, Michael Jonathan; Morales, Jeffrey; Peralta, Barry; Valerio, Rowena; Zhang, Yi; Gregorio, Sergio; Hermocilla, Joseph; Echavez, Michael; Yap, Jan Michael; Farmer, Andrew; Schiltz, Gary; Lee, Jennifer; Casstevens, Terry; Jaiswal, Pankaj; Meintjes, Ayton; Wilkinson, Mark; Good, Benjamin; Wagner, James; Morris, Jane; Marshall, David; Collins, Anthony; Kikuchi, Shoshi; Metz, Thomas; McLaren, Graham; van Hintum, Theo
2008-01-01
The Generation Challenge programme (GCP) is a global crop research consortium directed toward crop improvement through the application of comparative biology and genetic resources characterization to plant breeding. A key consortium research activity is the development of a GCP crop bioinformatics platform to support GCP research. This platform includes the following: (i) shared, public platform-independent domain models, ontology, and data formats to enable interoperability of data and analysis flows within the platform; (ii) web service and registry technologies to identify, share, and integrate information across diverse, globally dispersed data sources, as well as to access high-performance computational (HPC) facilities for computationally intensive, high-throughput analyses of project data; (iii) platform-specific middleware reference implementations of the domain model integrating a suite of public (largely open-access/-source) databases and software tools into a workbench to facilitate biodiversity analysis, comparative analysis of crop genomic data, and plant breeding decision making. PMID:18483570
Computer-assisted image analysis to quantify daily growth rates of broiler chickens.
De Wet, L; Vranken, E; Chedad, A; Aerts, J M; Ceunen, J; Berckmans, D
2003-09-01
1. The objective was to investigate the possibility of detecting daily body weight changes of broiler chickens with computer-assisted image analysis. 2. The experiment included 50 broiler chickens reared under commercial conditions. Ten out of 50 chickens were randomly selected and video recorded (upper view) 18 times during the 42-d growing period. The number of surface and periphery pixels from the images was used to derive a relationship between body dimension and live weight. 3. The relative error in weight estimation, expressed in terms of the standard deviation of the residuals from image surface data was 10%, while it was found to be 15% for the image periphery data. 4. Image-processing systems could be developed to assist the farmer in making important management and marketing decisions.
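The calibration the abstract describes (pixel counts mapped to live weight) is not published in detail above; the following is a minimal sketch of one plausible form, a power-law fit, using invented numbers rather than the study's data.

```python
# Illustrative sketch only: fits a power-law relationship between image
# pixel counts and live weight, in the spirit of the study above. The
# data and coefficients are synthetic placeholders, not the paper's.
import numpy as np

# Hypothetical calibration data: surface-pixel counts and measured weights (g)
surface_pixels = np.array([1200, 2500, 4800, 8100, 12500, 18000])
weights_g = np.array([80, 250, 600, 1100, 1750, 2400])

# Fit log(weight) = a + b * log(pixels) by ordinary least squares
X = np.column_stack([np.ones_like(surface_pixels, dtype=float),
                     np.log(surface_pixels)])
a, b = np.linalg.lstsq(X, np.log(weights_g), rcond=None)[0]

predicted = np.exp(a + b * np.log(surface_pixels))
relative_sd = np.std((weights_g - predicted) / weights_g)
print(f"weight ≈ exp({a:.2f}) * pixels^{b:.2f}, relative error ≈ {relative_sd:.1%}")
```

The paper's 10% (surface) and 15% (periphery) relative errors correspond to the standard deviation of residuals from fits of this general kind.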
The Contribution of a Decision Support System to Educational Decision-Making Processes
ERIC Educational Resources Information Center
Klein, Joseph; Ronen, Herman
2003-01-01
In the light of reports of bias, the present study investigated the hypothesis that administrative educational decisions assisted by Decision Support Systems (DSS) are characterized by different pedagogical and organizational orientation than decisions made without computer assistance. One hundred and ten high school teachers were asked to suggest…
The role of moral utility in decision making: an interdisciplinary framework.
Tobler, Philippe N; Kalis, Annemarie; Kalenscher, Tobias
2008-12-01
What decisions should we make? Moral values, rules, and virtues provide standards for morally acceptable decisions, without prescribing how we should reach them. However, moral theories do assume that we are, at least in principle, capable of making the right decisions. Consequently, an empirical investigation of the methods and resources we use for making moral decisions becomes relevant. We consider theoretical parallels of economic decision theory and moral utilitarianism and suggest that moral decision making may tap into mechanisms and processes that have originally evolved for nonmoral decision making. For example, the computation of reward value occurs through the combination of probability and magnitude; similar computation might also be used for determining utilitarian moral value. Both nonmoral and moral decisions may resort to intuitions and heuristics. Learning mechanisms implicated in the assignment of reward value to stimuli, actions, and outcomes may also enable us to determine moral value and assign it to stimuli, actions, and outcomes. In conclusion, we suggest that moral capabilities can employ and benefit from a variety of nonmoral decision-making and learning mechanisms.
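The probability-magnitude combination mentioned above is just an expected-value computation; a minimal sketch follows, with invented welfare numbers, to make the parallel between reward valuation and utilitarian moral valuation concrete.

```python
# Minimal sketch of the probability-magnitude combination described above:
# the same expected-value computation used for reward could, in principle,
# score the utilitarian moral value of candidate actions. Numbers are invented.
def expected_value(outcomes):
    """outcomes: list of (probability, magnitude) pairs for one action."""
    return sum(p * m for p, m in outcomes)

# Hypothetical actions with probabilistic outcomes (magnitude = welfare units)
actions = {
    "act_A": [(0.8, +10), (0.2, -50)],  # likely small benefit, rare large harm
    "act_B": [(0.5, +4), (0.5, +2)],    # modest but safe benefits
}
for name, outs in actions.items():
    print(name, expected_value(outs))
best = max(actions, key=lambda a: expected_value(actions[a]))
print("utilitarian choice:", best)
```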
Image Analysis via Soft Computing: Prototype Applications at NASA KSC and Product Commercialization
NASA Technical Reports Server (NTRS)
Dominguez, Jesus A.; Klinko, Steve
2011-01-01
This slide presentation reviews the use of "soft computing", which differs from "hard computing" in that it is more tolerant of imprecision, partial truth, uncertainty, and approximation, and its use in image analysis. Soft computing provides flexible information processing to handle real-life ambiguous situations and to achieve tractability, robustness, low solution cost, and a closer resemblance to human decision making. Several systems are or have been developed: Fuzzy Reasoning Edge Detection (FRED), Fuzzy Reasoning Adaptive Thresholding (FRAT), image enhancement techniques, and visual/pattern recognition. These systems are compared with examples that show the effectiveness of each. NASA applications that are reviewed are: Real-Time (RT) Anomaly Detection, Real-Time (RT) Moving Debris Detection, and the Columbia investigation. The RT anomaly detection reviewed the case of a damaged cable for the emergency egress system. The use of these techniques is further illustrated in the Columbia investigation with the location and detection of foam debris. There are several applications in commercial usage: image enhancement, human screening and privacy protection, visual inspection, 3D heart visualization, tumor detection, and X-ray image enhancement.
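The internals of FRED/FRAT are not given above; the sketch below is a generic illustration of the fuzzy-thresholding idea they build on: pixels receive graded memberships rather than hard labels, so ambiguity stays visible until a decision is forced.

```python
# Not NASA's FRAT itself (its internals aren't described above); just a
# minimal illustration of fuzzy thresholding with a sigmoid membership.
import numpy as np

def fuzzy_bright_membership(img, center, width):
    """Sigmoid membership in the fuzzy set 'bright' for each pixel."""
    return 1.0 / (1.0 + np.exp(-(img.astype(float) - center) / width))

rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(4, 4))        # stand-in for a real image
mu = fuzzy_bright_membership(img, center=128, width=20)

segmented = mu > 0.5                  # crisp decision only at the very end
ambiguous = (mu > 0.3) & (mu < 0.7)   # pixels flagged for closer inspection
print(mu.round(2))
print("ambiguous pixels:", int(ambiguous.sum()))
```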
A data analysis expert system for large established distributed databases
NASA Technical Reports Server (NTRS)
Gnacek, Anne-Marie; An, Y. Kim; Ryan, J. Patrick
1987-01-01
A design for a natural language database interface system, called the Deductively Augmented NASA Management Decision support System (DANMDS), is presented. The DANMDS system components have been chosen on the basis of the following considerations: maximal employment of the existing NASA IBM-PC computers and supporting software; local structuring and storing of external data via the entity-relationship model; a natural, easy-to-use, error-free database query language; user ability to alter the query-language vocabulary and data-analysis heuristics; and significant artificial-intelligence data-analysis heuristic techniques that allow the system to become progressively and automatically more useful.
Sensitivity analysis of a ground-water-flow model
Torak, Lynn J.; ,
1991-01-01
A sensitivity analysis was performed on 18 hydrological factors affecting steady-state groundwater flow in the Upper Floridan aquifer near Albany, southwestern Georgia. Computations were based on a calibrated, two-dimensional, finite-element digital model of the stream-aquifer system and the corresponding data inputs. Flow-system sensitivity was analyzed by computing water-level residuals obtained from simulations involving individual changes to each hydrological factor. Hydrological factors to which computed water levels were most sensitive were those that produced the largest change in the sum-of-squares of residuals for the smallest change in factor value. Plots of the sum-of-squares of residuals against multiplier or additive values that effect change in the hydrological factors are used to evaluate the influence of each factor on the simulated flow system. The shapes of these 'sensitivity curves' indicate the importance of each hydrological factor to the flow system. Because the sensitivity analysis can be performed during the preliminary phase of a water-resource investigation, it can be used to identify the types of hydrological data required to accurately characterize the flow system prior to collecting additional data or making management decisions.
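The 'sensitivity curve' procedure above is easy to sketch: sweep a multiplier on one hydrological factor, re-run the model, and record the sum of squared water-level residuals. The model below is a toy stand-in, not the calibrated finite-element model of the study.

```python
# Sketch of the sensitivity-curve procedure described above, with a toy
# stand-in for the groundwater model; all numbers are invented.
import numpy as np

observed = np.array([10.0, 12.0, 9.5, 11.0])   # hypothetical water levels

def toy_model(transmissivity_mult):
    # Placeholder for a run of the calibrated flow model
    base = np.array([10.2, 11.8, 9.6, 11.1])
    return base - 2.0 * np.log(transmissivity_mult)

multipliers = np.linspace(0.5, 2.0, 7)
sse = [float(np.sum((observed - toy_model(m)) ** 2)) for m in multipliers]

for m, s in zip(multipliers, sse):
    print(f"multiplier={m:.2f}  sum-of-squares={s:.3f}")
# A steep curve around m=1 marks a factor the flow system is sensitive to.
```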
NASA Technical Reports Server (NTRS)
Hale, Mark A.; Craig, James I.; Mistree, Farrokh; Schrage, Daniel P.
1995-01-01
Computing architectures are being assembled that extend concurrent engineering practices by providing more efficient execution and collaboration on distributed, heterogeneous computing networks. Built on the successes of initial architectures, requirements for a next-generation design computing infrastructure can be developed. These requirements concentrate on those needed by a designer in decision-making processes from product conception to recycling and can be categorized in two areas: design process and design information management. A designer both designs and executes design processes throughout design time to achieve better product and process capabilities while expending fewer resources. In order to accomplish this, information, or more appropriately design knowledge, needs to be adequately managed during product and process decomposition as well as recomposition. A foundation has been laid that captures these requirements in a design architecture called DREAMS (Developing Robust Engineering Analysis Models and Specifications). In addition, a computing infrastructure, called IMAGE (Intelligent Multidisciplinary Aircraft Generation Environment), is being developed that satisfies design requirements defined in DREAMS and incorporates enabling computational technologies.
Reinforcement learning in computer vision
NASA Astrophysics Data System (ADS)
Bernstein, A. V.; Burnaev, E. V.
2018-04-01
Nowadays, machine learning has become one of the basic technologies used in solving various computer vision tasks such as feature detection, image segmentation, object recognition, and tracking. In many applications, complex systems such as robots are equipped with visual sensors from which they learn the state of the surrounding environment by solving corresponding computer vision tasks. Solutions of these tasks are used for making decisions about possible future actions. It is not surprising that when solving computer vision tasks we should take into account special aspects of their subsequent application in model-based predictive control. Reinforcement learning is one of the modern machine learning technologies in which learning is carried out through interaction with the environment. In recent years, reinforcement learning has been used both for solving applied tasks such as processing and analysis of visual information, and for solving specific computer vision problems such as filtering, extracting image features, localizing objects in scenes, and many others. The paper briefly describes the reinforcement learning technology and its use for solving computer vision problems.
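To make the 'learning by interaction' idea concrete, here is a generic tabular Q-learning sketch (not any specific vision system from the paper); real vision applications would replace the state index with features extracted from images.

```python
# Generic tabular Q-learning on a 5-state chain; purely illustrative.
import random

n_states, n_actions = 5, 2          # actions: 0 = left, 1 = right
Q = [[0.0] * n_actions for _ in range(n_states)]
alpha, gamma, eps = 0.5, 0.9, 0.1

def step(s, a):
    s2 = max(0, min(n_states - 1, s + (1 if a == 1 else -1)))
    reward = 1.0 if s2 == n_states - 1 else 0.0   # goal at the right end
    return s2, reward

for episode in range(200):
    s = 0
    while s != n_states - 1:
        a = random.randrange(n_actions) if random.random() < eps \
            else max(range(n_actions), key=lambda x: Q[s][x])
        s2, r = step(s, a)
        Q[s][a] += alpha * (r + gamma * max(Q[s2]) - Q[s][a])
        s = s2

print([[round(q, 2) for q in row] for row in Q])  # 'right' should dominate
```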
Decision tree and ensemble learning algorithms with their applications in bioinformatics.
Che, Dongsheng; Liu, Qi; Rasheed, Khaled; Tao, Xiuping
2011-01-01
Machine learning approaches have wide applications in bioinformatics, and decision tree is one of the successful approaches applied in this field. In this chapter, we briefly review decision tree and related ensemble algorithms and show the successful applications of such approaches on solving biological problems. We hope that by learning the algorithms of decision trees and ensemble classifiers, biologists can get the basic ideas of how machine learning algorithms work. On the other hand, by being exposed to the applications of decision trees and ensemble algorithms in bioinformatics, computer scientists can get better ideas of which bioinformatics topics they may work on in their future research directions. We aim to provide a platform to bridge the gap between biologists and computer scientists.
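A minimal scikit-learn illustration of the chapter's two ingredients, a single decision tree and an ensemble (random forest), follows; the data here are a synthetic stand-in for, say, gene-expression features, not a real bioinformatics data set.

```python
# Decision tree vs. random forest on synthetic stand-in data.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
X = rng.normal(size=(200, 10))            # e.g., 10 expression features
y = (X[:, 0] + X[:, 1] > 0).astype(int)   # hypothetical class label

tree = DecisionTreeClassifier(max_depth=3, random_state=0)
forest = RandomForestClassifier(n_estimators=100, random_state=0)

print("tree  :", cross_val_score(tree, X, y, cv=5).mean())
print("forest:", cross_val_score(forest, X, y, cv=5).mean())
```

The ensemble typically outperforms the single tree, which is the chapter's central practical point.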
Semantic Clinical Guideline Documents
Eriksson, Henrik; Tu, Samson W.; Musen, Mark
2005-01-01
Decision-support systems based on clinical practice guidelines can support physicians and other health-care personnel in the process of following best practice consistently. A knowledge-based approach to represent guidelines makes it possible to encode computer-interpretable guidelines in a formal manner, perform consistency checks, and use the guidelines directly in decision-support systems. Decision-support authors and guideline users require guidelines in human-readable formats in addition to computer-interpretable ones (e.g., for guideline review and quality assurance). We propose a new document-oriented information architecture that combines knowledge-representation models with electronic and paper documents. The approach integrates decision-support modes with standard document formats to create a combined clinical-guideline model that supports on-line viewing, printing, and decision support. PMID:16779037
Dissociable Neural Processes Underlying Risky Decisions for Self Versus Other
Jung, Daehyun; Sul, Sunhae; Kim, Hackjin
2013-01-01
Previous neuroimaging studies on decision making have mainly focused on decisions on behalf of oneself. Considering that people often make decisions on behalf of others, it is intriguing that there is little neurobiological evidence on how decisions for others differ from those for oneself. The present study directly compared risky decisions for self with those for another person using functional magnetic resonance imaging (fMRI). Participants were asked to perform a gambling task on behalf of themselves (decision-for-self condition) or another person (decision-for-other condition) while in the scanner. Their task was to choose between a low-risk option (i.e., win or lose 10 points) and a high-risk option (i.e., win or lose 90 points) with variable levels of winning probability. Compared with choices regarding others, those regarding oneself were more risk-averse at lower winning probabilities and more risk-seeking at higher winning probabilities, perhaps due to stronger affective processes during risky decisions for oneself compared with those for others. The brain-activation pattern changed according to the target, such that reward-related regions were more active in the decision-for-self condition than in the decision-for-other condition, whereas brain regions related to the theory of mind (ToM) showed greater activation in the decision-for-other condition than in the decision-for-self condition. Parametric modulation analysis using individual decision models revealed that activation of the amygdala and the dorsomedial prefrontal cortex (DMPFC) were associated with value computations for oneself and for another, respectively, during risky financial decisions. The results of the present study suggest that decisions for oneself and for others may recruit fundamentally distinct neural processes, which can be mainly characterized as dominant affective/impulsive and cognitive/regulatory processes, respectively. PMID:23519016
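The task above is fully specified, so a risk-neutral benchmark is easy to write down: pick the option with the higher expected value. Deviations from this rule across winning probabilities are what the study interprets as risk aversion or risk seeking.

```python
# Risk-neutral benchmark for the gamble structure described above.
def expected_value(p_win, stake):
    return p_win * stake + (1 - p_win) * (-stake)

for p in (0.1, 0.3, 0.5, 0.7, 0.9):
    ev_low, ev_high = expected_value(p, 10), expected_value(p, 90)
    neutral = "high-risk" if ev_high > ev_low else "low-risk"
    print(f"p={p:.1f}  EV(low)={ev_low:+.0f}  EV(high)={ev_high:+.0f}  -> {neutral}")
```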
Emotions and Decisions: Beyond Conceptual Vagueness and the Rationality Muddle.
Volz, Kirsten G; Hertwig, Ralph
2016-01-01
For centuries, decision scholars paid little attention to emotions: Decisions were modeled in normative and descriptive frameworks with little regard for affective processes. Recently, however, an "emotions revolution" has taken place, particularly in the neuroscientific study of decision making, putting emotional processes on an equal footing with cognitive ones. Yet disappointingly little theoretical progress has been made. The concepts and processes discussed often remain vague, and conclusions about the implications of emotions for rationality are contradictory and muddled. We discuss three complementary ways to move the neuroscientific study of emotion and decision making from agenda setting to theory building. The first is to use reverse inference as a hypothesis-discovery rather than a hypothesis-testing tool, unless its utility can be systematically quantified (e.g., through meta-analysis). The second is to capitalize on the conceptual inventory advanced by the behavioral science of emotions, testing those concepts and unveiling the underlying processes. The third is to model the interplay between emotions and decisions, harnessing existing cognitive frameworks of decision making and mapping emotions onto the postulated computational processes. To conclude, emotions (like cognitive strategies) are not rational or irrational per se: How (un)reasonable their influence is depends on their fit with the environment. © The Author(s) 2015.
Zakane, S Alphonse; Gustafsson, Lars L; Tomson, Göran; Loukanova, Svetla; Sié, Ali; Nasiell, Josefine; Bastholm-Rahmner, Pia
2014-06-01
In 2010, 245,000 women died due to pregnancy-related causes in sub-Saharan Africa and southern Asia. Our study is nested within the QUALMAT project and seeks to improve the quality of maternal care services through the introduction of a computerized clinical decision support system (CDSS) to help healthcare workers in rural areas. Healthcare information technology applications in low-income countries may improve healthcare provision, but recent studies demonstrate unintended consequences, such as underuse of or resistance to CDSS, and show that the fit between the system and clinical needs presents challenges. To explore and describe perceived needs and attitudes among healthcare workers to access WHO guidelines using CDSS in maternal and neonatal care in rural Burkina Faso. Data were collected with semi-structured interviews in two rural districts in Burkina Faso with 45 informants. Descriptive statistics were used for the analysis of the quantitative part of the interview corresponding to informants' background. Qualitative data were analyzed using manifest content analysis. Four main findings emerged: (a) an appreciable willingness and great interest among healthcare workers to adapt and use modern technologies like computers to learn more in the workplace, (b) a positive attitude to easy access of guidelines and implementation of decision-support using computers in the workplace, (c) a fear that the CDSS would require more working time and lead to double-work, and (d) that the CDSS is complicated and requires substantial computer training and extensive instructions to fully implement. The findings can be divided into aspects of motivators and barriers in relation to how the CDSS is perceived and to be used. These aspects are closely connected to each other, as the motivating aspects can easily be turned into barriers if not taken care of properly in the final design, and during implementation and maintenance of the CDSS at point of care. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Computer simulation models of pre-diabetes populations: a systematic review protocol
Khurshid, Waqar; Pagano, Eva; Feenstra, Talitha
2017-01-01
Introduction: Diabetes is a major public health problem and prediabetes (intermediate hyperglycaemia) is associated with a high risk of developing diabetes. With evidence supporting the use of preventive interventions for prediabetes populations and the discovery of novel biomarkers stratifying the risk of progression, there is a need to evaluate their cost-effectiveness across jurisdictions. In diabetes and prediabetes, it is relevant to inform cost-effectiveness analysis using decision models due to their ability to forecast long-term health outcomes and costs beyond the time frame of clinical trials. To support good implementation and reimbursement decisions of interventions in these populations, models should be clinically credible, based on best available evidence, reproducible and validated against clinical data. Our aim is to identify recent studies on computer simulation models and model-based economic evaluations of populations of individuals with prediabetes, qualify them and discuss the knowledge gaps, challenges and opportunities that need to be addressed for future evaluations. Methods and analysis: A systematic review will be conducted in MEDLINE, Embase, EconLit and National Health Service Economic Evaluation Database. We will extract peer-reviewed studies published between 2000 and 2016 that describe computer simulation models of the natural history of individuals with prediabetes and/or decision models to evaluate the impact of interventions, risk stratification and/or screening on these populations. Two reviewers will independently assess each study for inclusion. Data will be extracted using a predefined pro forma developed using best practice. Study quality will be assessed using a modelling checklist. A narrative synthesis of all studies will be presented, focussing on model structure, quality of models and input data, and validation status. Ethics and dissemination: This systematic review is exempt from ethics approval because the work is carried out on published documents. The findings of the review will be disseminated in a related peer-reviewed journal and presented at conferences. Review registration number: CRD42016047228. PMID:28982807
Yang, Qian; Zimmerman, John; Steinfeld, Aaron; Carey, Lisa; Antaki, James F.
2016-01-01
Clinical decision support tools (DSTs) are computational systems that aid healthcare decision-making. While effective in labs, almost all of these systems failed when they moved into clinical practice. Healthcare researchers speculated that this is most likely due to a lack of user-centered HCI considerations in the design of these systems. This paper describes a field study investigating how clinicians make a heart pump implant decision, with a focus on how to best integrate an intelligent DST into their work process. Our findings reveal a lack of perceived need for and trust of machine intelligence, as well as many barriers to computer use at the point of clinical decision-making. These findings suggest an alternative perspective to the traditional use models, in which clinicians engage with DSTs at the point of making a decision. We identify situations across patients' healthcare trajectories when decision supports would help, and we discuss new forms it might take in these situations. PMID:27833397
Utility and Usability as Factors Influencing Teacher Decisions about Software Integration
ERIC Educational Resources Information Center
Okumus, Samet; Lewis, Lindsey; Wiebe, Eric; Hollebrands, Karen
2016-01-01
Given the importance of teachers in the implementation of computer technology in classrooms, the technology acceptance model and TPACK model were used to better understand the decision-making process teachers use in determining how, when, and where computer software is used in mathematics classrooms. Thirty-four (34) teachers implementing…
Value encoding in single neurons in the human amygdala during decision making.
Jenison, Rick L; Rangel, Antonio; Oya, Hiroyuki; Kawasaki, Hiroto; Howard, Matthew A
2011-01-05
A growing consensus suggests that the brain makes simple choices by assigning values to the stimuli under consideration and then comparing these values to make a decision. However, the network involved in computing the values has not yet been fully characterized. Here, we investigated whether the human amygdala plays a role in the computation of stimulus values at the time of decision making. We recorded single neuron activity from the amygdala of awake patients while they made simple purchase decisions over food items. We found 16 amygdala neurons, located primarily in the basolateral nucleus, that responded linearly to the values assigned to individual items.
NASA Astrophysics Data System (ADS)
Safaei, S.; Haghnegahdar, A.; Razavi, S.
2016-12-01
Complex environmental models are now the primary tool to inform decision makers for the current or future management of environmental resources under climate and environmental changes. These complex models often contain a large number of parameters that need to be determined by a computationally intensive calibration procedure. Sensitivity analysis (SA) is a very useful tool that not only allows for understanding the model behavior, but also helps in reducing the number of calibration parameters by identifying unimportant ones. The issue is that most global sensitivity techniques are themselves highly computationally demanding when generating robust and stable sensitivity metrics over the entire model response surface. Recently, a novel global sensitivity analysis method, Variogram Analysis of Response Surfaces (VARS), was introduced that can efficiently provide a comprehensive assessment of global sensitivity using the variogram concept. In this work, we aim to evaluate the effectiveness of this highly efficient GSA method in saving computational burden when applied to systems with an extra-large number of input factors (on the order of 100). We use a test function and a hydrological modelling case study to demonstrate the capability of the VARS method in reducing problem dimensionality by identifying important vs. unimportant input factors.
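The variogram idea underlying VARS can be sketched very simply for one input direction (this is a drastic simplification, not the full VARS algorithm): gamma(h) = 0.5 * E[(f(x+h) - f(x))^2], computed from model evaluations spaced a lag h apart.

```python
# Highly simplified directional-variogram sketch (not the full VARS method).
import numpy as np

def model(x):                      # toy stand-in for an environmental model
    return np.sin(3 * x[0]) + 0.1 * x[1]

grid = np.linspace(0.0, 1.0, 21)
x2_fixed = 0.5
f_vals = np.array([model((x1, x2_fixed)) for x1 in grid])

for lag in (1, 2, 4):              # lags in grid steps
    diffs = f_vals[lag:] - f_vals[:-lag]
    gamma = 0.5 * np.mean(diffs ** 2)
    print(f"h={grid[lag]:.2f}  gamma={gamma:.4f}")
# Large gamma at small h signals a factor the response varies strongly with.
```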
NASA Astrophysics Data System (ADS)
Vesselinov, V. V.; Harp, D.
2010-12-01
The process of decision making to protect groundwater resources requires a detailed estimation of uncertainties in model predictions. Various uncertainties associated with modeling a natural system, such as: (1) measurement and computational errors; (2) uncertainties in the conceptual model and model-parameter estimates; (3) simplifications in model setup and numerical representation of governing processes, contribute to the uncertainties in the model predictions. Due to this combination of factors, the sources of predictive uncertainties are generally difficult to quantify individually. Decision support related to optimal design of monitoring networks requires (1) detailed analyses of existing uncertainties related to model predictions of groundwater flow and contaminant transport, and (2) optimization of the proposed monitoring network locations in terms of their efficiency to detect contaminants and provide early warning. We apply existing and newly-proposed methods to quantify predictive uncertainties and to optimize well locations. An important aspect of the analysis is the application of a newly-developed optimization technique based on the coupling of Particle Swarm and Levenberg-Marquardt optimization methods, which proved to be robust and computationally efficient. These techniques and algorithms are bundled in a software package called MADS. MADS (Model Analyses for Decision Support) is an object-oriented code that is capable of performing various types of model analyses and supporting model-based decision making. The code can be executed under different computational modes, which include (1) sensitivity analyses (global and local), (2) Monte Carlo analysis, (3) model calibration, (4) parameter estimation, (5) uncertainty quantification, and (6) model selection. The code can be externally coupled with any existing model simulator through integrated modules that read/write input and output files using a set of template and instruction files (consistent with the PEST I/O protocol). MADS can also be internally coupled with a series of built-in analytical simulators. MADS provides functionality to work directly with existing control files developed for the code PEST (Doherty 2009). To perform the computational modes mentioned above, the code utilizes (1) advanced Latin-Hypercube sampling techniques (including Improved Distributed Sampling), (2) various gradient-based Levenberg-Marquardt optimization methods, (3) advanced global optimization methods (including Particle Swarm Optimization), and (4) a selection of alternative objective functions. The code has been successfully applied to perform various model analyses related to environmental management of real contamination sites. Examples include source identification problems, quantification of uncertainty, model calibration, and optimization of monitoring networks. The methodology and software codes are demonstrated using synthetic and real case studies where monitoring networks are optimized taking into account the uncertainty in model predictions of contaminant transport.
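One of the ingredients listed above, Latin-hypercube sampling, is simple enough to sketch from scratch (MADS itself uses more advanced variants such as Improved Distributed Sampling): each parameter's range is split into n strata and each stratum is sampled exactly once.

```python
# Plain Latin-hypercube sampling, written out for illustration.
import numpy as np

def latin_hypercube(n_samples, n_params, rng=None):
    rng = rng or np.random.default_rng()
    # One point per stratum in each dimension, jittered within the stratum
    u = (np.arange(n_samples)[:, None]
         + rng.random((n_samples, n_params))) / n_samples
    for j in range(n_params):      # shuffle strata independently per parameter
        rng.shuffle(u[:, j])
    return u                        # points in the unit hypercube

samples = latin_hypercube(5, 2, np.random.default_rng(1))
print(samples.round(3))             # scale columns to real parameter ranges
```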
Janssen, Terry
2000-01-01
A system and method for facilitating decision-making comprising a computer program causing linkage of data representing a plurality of argument structure units into a hierarchical argument structure. Each argument structure unit comprises data corresponding to a hypothesis and its corresponding counter-hypothesis, data corresponding to grounds that provide a basis for inference of the hypothesis or its corresponding counter-hypothesis, data corresponding to a warrant linking the grounds to the hypothesis or its corresponding counter-hypothesis, and data corresponding to backing that certifies the warrant. The hierarchical argument structure comprises a top level argument structure unit and a plurality of subordinate level argument structure units. Each of the plurality of subordinate argument structure units comprises at least a portion of the grounds of the argument structure unit to which it is subordinate. Program code located on each of a plurality of remote computers accepts input from one of a plurality of contributors. Each input comprises data corresponding to an argument structure unit in the hierarchical argument structure and supports the hypothesis or its corresponding counter-hypothesis. A second programming code is adapted to combine the inputs into a single hierarchical argument structure. A third computer program code is responsive to the second computer program code and is adapted to represent a degree of support for the hypothesis and its corresponding counter-hypothesis in the single hierarchical argument structure.
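The patent's components (hypothesis/counter-hypothesis, grounds, warrant, backing, hierarchical nesting) translate naturally into a recursive data structure. The sketch below is one possible rendering; the support-aggregation rule (a simple mean) is a placeholder, not the patented method.

```python
# One possible in-memory form of the argument-structure unit described above.
from dataclasses import dataclass, field
from typing import List

@dataclass
class ArgumentUnit:
    hypothesis: str
    counter_hypothesis: str
    warrant: str
    backing: str
    grounds: List["ArgumentUnit"] = field(default_factory=list)
    local_support: float = 0.0          # contributor input in [-1, 1]

    def support(self) -> float:
        """Aggregate support up the hierarchy (simple mean, as a placeholder)."""
        values = [self.local_support] + [g.support() for g in self.grounds]
        return sum(values) / len(values)

leaf = ArgumentUnit("Sensor reading is valid", "Sensor is faulty",
                    "Calibrated last week", "Calibration log", local_support=0.8)
top = ArgumentUnit("System is safe to launch", "System is unsafe",
                   "All subsystems nominal", "Review board", [leaf], 0.4)
print(f"net support for top hypothesis: {top.support():+.2f}")
```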
Evolutionary image simplification for lung nodule classification with convolutional neural networks.
Lückehe, Daniel; von Voigt, Gabriele
2018-05-29
Understanding decisions of deep learning techniques is important. Especially in the medical field, the reasons for a decision in a classification task are as crucial as the pure classification results. In this article, we propose a new approach to compute relevant parts of a medical image. Knowing the relevant parts makes it easier to understand decisions. In our approach, a convolutional neural network is employed to learn structures of images of lung nodules. Then, an evolutionary algorithm is applied to compute a simplified version of an unknown image based on the learned structures by the convolutional neural network. In the simplified version, irrelevant parts are removed from the original image. In the results, we show simplified images which allow the observer to focus on the relevant parts. In these images, more than 50% of the pixels are simplified. The simplified pixels do not change the meaning of the images based on the learned structures by the convolutional neural network. An experimental analysis shows the potential of the approach. Besides the examples of simplified images, we analyze the run time development. Simplified images make it easier to focus on relevant parts and to find reasons for a decision. The combination of an evolutionary algorithm employing a learned convolutional neural network is well suited for the simplification task. From a research perspective, it is interesting which areas of the images are simplified and which parts are taken as relevant.
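The article's loop (mutate the image, keep the change if the network's decision is unchanged) can be sketched with a stub scoring function standing in for the trained CNN; this is an illustration of the general scheme, not the authors' implementation.

```python
# (1+1)-style evolutionary simplification sketch with a stub 'classifier'.
import random

W = H = 8
image = [[random.random() for _ in range(W)] for _ in range(H)]

def predict(img, mask):
    # Stub for CNN output: here, mean intensity of unmasked pixels
    vals = [img[i][j] for i in range(H) for j in range(W) if not mask[i][j]]
    return (sum(vals) / len(vals) > 0.5) if vals else False

mask = [[False] * W for _ in range(H)]   # True = pixel simplified away
target = predict(image, mask)

for _ in range(500):
    i, j = random.randrange(H), random.randrange(W)
    trial = [row[:] for row in mask]
    trial[i][j] = not trial[i][j]
    more_simplified = sum(map(sum, trial)) > sum(map(sum, mask))
    # Accept only mutations that simplify more without changing the decision
    if more_simplified and predict(image, trial) == target:
        mask = trial

print("pixels simplified:", sum(map(sum, mask)), "of", W * H)
```

The surviving (unmasked) pixels play the role of the "relevant parts" the article describes.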
Expertise transfer for expert system design
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boose, J.H.
This book is about the Expertise Transfer System-a computer program which interviews experts and helps them build expert systems, i.e. computer programs that use knowledge from experts to make decisions and judgements under conditions of uncertainty. The techniques are useful to anyone who uses decision-making information based on the expertise of others. The methods can also be applied to personal decision-making. The interviewing methodology is borrowed from a branch of psychology called Personal Construct Theory. It is not necessary to use a computer to take advantage of the techniques from Personal Construct Theory; the fundamental procedures used by the Expertise Transfer System can be performed using paper and pencil. It is not necessary that the reader understand very much about computers to understand the ideas in this book. The few relevant concepts from computer science and expert systems that are needed are explained in a straightforward manner. Ideas from Personal Construct Psychology are also introduced as needed.
Big data and high-performance analytics in structural health monitoring for bridge management
NASA Astrophysics Data System (ADS)
Alampalli, Sharada; Alampalli, Sandeep; Ettouney, Mohammed
2016-04-01
Structural Health Monitoring (SHM) can be a vital tool for effective bridge management. Combining large data sets from multiple sources to create a data-driven decision-making framework is crucial for the success of SHM. This paper presents a big data analytics framework that combines multiple data sets correlated with functional relatedness to convert data into actionable information that empowers risk-based decision-making. The integrated data environment incorporates near real-time streams of semi-structured data from remote sensors, historical visual inspection data, and observations from structural analysis models to monitor, assess, and manage risks associated with aging bridge inventories. Accelerated processing of the data sets is made possible by four technologies: cloud computing, relational database processing, support from NOSQL databases, and in-memory analytics. The framework is being validated on a railroad corridor that can be subjected to multiple hazards. The framework enables the computation of reliability indices for critical bridge components and individual bridge spans. In addition, the framework includes a risk-based decision-making process that enumerates costs and consequences of poor bridge performance at span- and network-levels when rail networks are exposed to natural hazard events such as floods and earthquakes. Big data and high-performance analytics enable insights that assist bridge owners in addressing problems faster.
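The paper's exact reliability formulation is not given above; the standard first-order reliability index, shown here for reference, is the usual starting point for component-level indices of this kind. With resistance R and load effect S treated as normal variables:

```latex
\beta = \frac{\mu_R - \mu_S}{\sqrt{\sigma_R^2 + \sigma_S^2}},
\qquad P_f \approx \Phi(-\beta)
```

where \Phi is the standard normal CDF and P_f the component failure probability.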
Bansback, Nick; Li, Linda C; Lynd, Larry; Bryan, Stirling
2014-08-01
Patient decision aids (PtDA) are developed to facilitate informed, value-based decisions about health. Research suggests that even when informed with necessary evidence and information, cognitive errors can prevent patients from choosing the option that is most congruent with their own values. We sought to utilize principles of behavioural economics to develop a computer application that presents information from conventional decision aids in a way that reduces these errors, subsequently promoting higher quality decisions. The Dynamic Computer Interactive Decision Application (DCIDA) was developed to target four common errors that can impede quality decision making with PtDAs: unstable values, order effects, overweighting of rare events, and information overload. Healthy volunteers were recruited to an interview to use three PtDAs converted to the DCIDA on a computer equipped with an eye tracker. Participants first used a conventional PtDA and subsequently used the DCIDA version. User testing assessed whether respondents found the software both usable, evaluated using a) eye-tracking, b) the system usability scale, and c) user verbal responses from a 'think aloud' protocol; and useful, evaluated using a) eye-tracking, b) whether preferences for options were changed, and c) the decisional conflict scale. Of the 20 participants recruited to the study, 11 were male (55%), the mean age was 35, 18 had at least a high school education (90%), and 8 (40%) had a college or university degree. Eye-tracking results, alongside a mean system usability scale score of 73 (range 68-85), indicated a reasonable degree of usability for the DCIDA. The think aloud study suggested areas for further improvement. The DCIDA also appeared to be useful to participants, in that subjects focused more on the features of the decision that were most important to them (21% increase in time spent focusing on the most important feature). Seven subjects (25%) changed their preferred option when using DCIDA. Preliminary results suggest that DCIDA has potential to improve the quality of patient decision-making. Next steps include larger studies to test individual components of DCIDA and feasibility testing with patients making real decisions.
Summerfield, Christopher; Tsetsos, Konstantinos
2012-01-01
Investigation into the neural and computational bases of decision-making has proceeded in two parallel but distinct streams. Perceptual decision-making (PDM) is concerned with how observers detect, discriminate, and categorize noisy sensory information. Economic decision-making (EDM) explores how options are selected on the basis of their reinforcement history. Traditionally, the sub-fields of PDM and EDM have employed different paradigms, proposed different mechanistic models, explored different brain regions, and disagreed about whether decisions approach optimality. Nevertheless, we argue that there is a common framework for understanding decisions made in both tasks, under which an agent has to combine sensory information (what is the stimulus) with value information (what is it worth). We review computational models of the decision process typically used in PDM, based around the idea that decisions involve a serial integration of evidence, and assess their applicability to decisions between goods and gambles. Subsequently, we consider the contribution of three key brain regions – the parietal cortex, the basal ganglia, and the orbitofrontal cortex (OFC) – to perceptual and EDM, with a focus on the mechanisms by which sensory and reward information are integrated during choice. We find that although the parietal cortex is often implicated in the integration of sensory evidence, there is evidence for its role in encoding the expected value of a decision. Similarly, although much research has emphasized the role of the striatum and OFC in value-guided choices, they may play an important role in categorization of perceptual information. In conclusion, we consider how findings from the two fields might be brought together, in order to move toward a general framework for understanding decision-making in humans and other primates. PMID:22654730
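The 'serial integration of evidence' idea the review centers on is easy to make concrete with a minimal sequential-sampling (diffusion-style) simulation; the parameters below are illustrative, not fitted to any data.

```python
# Minimal drift-diffusion sketch: noisy evidence accumulates to a bound.
import random

def diffusion_trial(drift=0.1, noise=1.0, bound=10.0, dt=1.0):
    x, t = 0.0, 0
    while abs(x) < bound:
        x += drift * dt + noise * random.gauss(0.0, dt ** 0.5)
        t += 1
    return (x >= bound), t   # (upper/correct choice?, decision time)

trials = [diffusion_trial() for _ in range(2000)]
accuracy = sum(c for c, _ in trials) / len(trials)
mean_rt = sum(t for _, t in trials) / len(trials)
print(f"accuracy={accuracy:.2f}, mean decision time={mean_rt:.0f} steps")
```

Raising the bound trades speed for accuracy, which is the usual bridge between such models and choice behavior.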
Probabilistic/Fracture-Mechanics Model For Service Life
NASA Technical Reports Server (NTRS)
Watkins, T., Jr.; Annis, C. G., Jr.
1991-01-01
Computer program makes probabilistic estimates of lifetime of engine and components thereof. Developed to fill need for more accurate life-assessment technique that avoids errors in estimated lives and provides for statistical assessment of levels of risk created by engineering decisions in designing system. Implements mathematical model combining techniques of statistics, fatigue, fracture mechanics, nondestructive analysis, life-cycle cost analysis, and management of engine parts. Used to investigate effects of such engine-component life-controlling parameters as return-to-service intervals, stresses, capabilities for nondestructive evaluation, and qualities of materials.
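The program itself is not public in detail above; the following toy Monte Carlo conveys the general probabilistic life-assessment idea it combines (statistics plus fracture mechanics). The Paris-law-style model, units, and distributions are invented for illustration.

```python
# Toy Monte Carlo life assessment (illustrative only, not the NASA program):
# sample material scatter and initial flaw size, integrate a Paris-law-style
# crack-growth model, and report the risk of failure before an inspection.
import random, math

def cycles_to_failure(a0, C, m=3.0, a_crit=0.02, stress=150.0):
    # Closed-form Paris-law integration for m != 2 (illustrative units)
    Y = 1.12
    k = C * (Y * stress * math.pi ** 0.5) ** m
    e = 1 - m / 2
    return (a_crit ** e - a0 ** e) / (k * e)

lives = []
for _ in range(10000):
    a0 = random.lognormvariate(math.log(1e-3), 0.3)   # initial crack size
    C = random.lognormvariate(math.log(1e-11), 0.4)   # material scatter
    lives.append(cycles_to_failure(a0, C))

inspection = 50000.0  # hypothetical return-to-service interval, in cycles
risk = sum(1 for n in lives if n < inspection) / len(lives)
print(f"P(failure before {inspection:.0f} cycles) ≈ {risk:.3f}")
```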
IEEE 1982. Proceedings of the international conference on cybernetics and society
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1982-01-01
The following topics were dealt with: knowledge-based systems; risk analysis; man-machine interactions; human information processing; metaphor, analogy and problem-solving; manual control modelling; transportation systems; simulation; adaptive and learning systems; biocybernetics; cybernetics; mathematical programming; robotics; decision support systems; analysis, design and validation of models; computer vision; systems science; energy systems; environmental modelling and policy; pattern recognition; nuclear warfare; technological forecasting; artificial intelligence; the Turin shroud; optimisation; workloads. Abstracts of individual papers can be found under the relevant classification codes in this or future issues.
Identifying environmental features for land management decisions
NASA Technical Reports Server (NTRS)
1981-01-01
The benefits of changes in management organization and facilities for the Center for Remote Sensing and Cartography in Utah are reported, as well as interactions with and outreach to state and local agencies. Completed projects are described which studied (1) Uinta Basin wetland/land use; (2) Davis County foothill development; (3) Farmington Bay shoreline fluctuation; (4) irrigation detection; and (5) satellite investigation of snow cover/mule deer relationships. Techniques developed for composite computer mapping, contrast enhancement, U-2 CIR/LANDSAT digital interface, factor analysis, and multivariate statistical analysis are described.
Hydrodynamics Analysis and CFD Simulation of Portal Venous System by TIPS and LS.
Wang, Meng; Zhou, Hongyu; Huang, Yaozhen; Gong, Piyun; Peng, Bing; Zhou, Shichun
2015-06-01
In cirrhotic patients, portal hypertension is often associated with hyperdynamic changes. Transjugular Intrahepatic Portosystemic Shunt (TIPS) and laparoscopic splenectomy are both treatments for liver cirrhosis with portal hypertension. However, the two interventions have different postoperative effects on hemodynamics, and the possibilities of triggering portal vein thrombosis (PVT) differ. How the hemodynamics of the portal venous system evolve after the two different operations remains unknown. Based on ultrasound data and established numerical methods, CFD techniques are applied to analyze hemodynamic changes after TIPS and laparoscopic splenectomy. In this paper, we applied two 3-D flow models to the hemodynamic analysis for two patients who received a TIPS and a laparoscopic splenectomy, both therapies for treating portal hypertension induced diseases. The current computer simulations give a quantitative analysis of the interplay between hemodynamics and TIPS or splenectomy. In conclusion, the presented computational model can be used for the theoretical analysis of TIPS and laparoscopic splenectomy, and clinical decisions could be made based on the simulation results with properly personalized treatment.
Hough transform for human action recognition
NASA Astrophysics Data System (ADS)
Siemon, Mia S. N.
2016-09-01
Nowadays, the demand for computer analysis, especially regarding team sports, continues to grow drastically. More and more decisions are made by electronic devices so that life becomes `easier' in a certain context. There already exist application areas in sports in which critical situations are handled by means of digital software. This paper aims at the evaluation of, and introduction to, the necessary foundation that would make it possible to develop a concept similar to that of `hawk-eye', a decision-making program to evaluate the impact of a ball with respect to a target line, and to apply it to the sport of volleyball. The pattern recognition process is in this case performed by means of the mathematical model of the Hough transform, which is capable of identifying relevant lines and circles in the image in order to later use them for the necessary evaluation of the image and the decision-making process.
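The Hough machinery the paper builds on is standard in OpenCV; the sketch below detects a circle (the ball) and a line (the court marking) in a synthetic frame. This is generic OpenCV usage, not the paper's volleyball system, and the in/out rule at the end is a deliberately crude placeholder.

```python
# Generic OpenCV Hough-transform sketch on a synthetic frame.
import cv2
import numpy as np

frame = np.zeros((200, 300), dtype=np.uint8)
cv2.circle(frame, (150, 80), 20, 255, 2)          # synthetic ball outline
cv2.line(frame, (0, 150), (299, 150), 255, 2)     # synthetic target line

circles = cv2.HoughCircles(frame, cv2.HOUGH_GRADIENT, dp=1, minDist=50,
                           param1=100, param2=20, minRadius=10, maxRadius=40)
lines = cv2.HoughLinesP(frame, 1, np.pi / 180, threshold=80,
                        minLineLength=100, maxLineGap=5)

if circles is not None and lines is not None:
    cx, cy, r = circles[0][0]
    x1, y1, x2, y2 = lines[0][0]
    # Crude placeholder rule: ball entirely above the detected line
    print(f"ball at ({cx:.0f},{cy:.0f}), line y≈{y1}; call:",
          "in" if cy + r < y1 else "out")
```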
The speed-accuracy tradeoff: history, physiology, methodology, and behavior
Heitz, Richard P.
2014-01-01
There are few behavioral effects as ubiquitous as the speed-accuracy tradeoff (SAT). From insects to rodents to primates, the tendency for decision speed to covary with decision accuracy seems an inescapable property of choice behavior. Recently, the SAT has received renewed interest, as neuroscience approaches begin to uncover its neural underpinnings and computational models are compelled to incorporate it as a necessary benchmark. The present work provides a comprehensive overview of SAT. First, I trace its history as a tractable behavioral phenomenon and the role it has played in shaping mathematical descriptions of the decision process. Second, I present a “users guide” of SAT methodology, including a critical review of common experimental manipulations and analysis techniques and a treatment of the typical behavioral patterns that emerge when SAT is manipulated directly. Finally, I review applications of this methodology in several domains. PMID:24966810
Skill-related differences between athletes and nonathletes in speed discrimination.
Thomson, Kaivo; Watt, Anthony; Liukkonen, Jarmo
2008-12-01
This study examined differences in decision-making time and accuracy as attributes of speed discrimination between participants skilled and less skilled in ball games. A total of 130 men, ages 18 to 28 years (M=21.2, SD=2.6), participated. The athlete sample (skilled group) comprised Estonian National League volleyball (n=26) and basketball players (n=27). The nonathlete sample (less skilled group) included 77 soldiers of the Estonian Defence Force with no reported top-level experience in ball games. Speed-discrimination stimuli were images of red square shapes presented moving along the sagittal axis at four different virtual velocities on a computer (PC) screen, which represented the frontal plane. Analysis indicated that only decision-making time was significantly different between the elite athlete and nonathlete groups. This finding suggests a possible effect of ball-game skills on decision-making time in speed discrimination.
Evaluation of ilmenite serpentine concrete and ordinary concrete as nuclear reactor shielding
NASA Astrophysics Data System (ADS)
Abulfaraj, Waleed H.; Kamal, Salah M.
1994-07-01
The present study involves adapting a formal decision methodology to the selection among alternative nuclear reactor shielding concretes. Multiattribute utility theory (MAU) is selected to accommodate decision makers' preferences and is employed to evaluate two appropriate nuclear reactor shielding concretes in terms of effectiveness, in order to determine the optimal choice that meets the radiation protection regulations. These concretes are Ordinary concrete (O.C.) and Ilmenite Serpentine concrete (I.S.C.), a normal-weight concrete and a heavy, heat-resistive concrete, respectively. The effectiveness objective of the nuclear reactor shielding is defined and structured into definite attributes and subattributes to evaluate the best alternative. Factors affecting the decision are the dose received by the reactor's workers, the material properties, and the cost of the concrete shield. A computer program is employed to assist in performing the utility analysis. Based upon the data, the result shows the superiority of Ordinary concrete over Ilmenite Serpentine concrete.
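A bare-bones additive MAU scoring step, in the spirit of the analysis above, looks as follows; the attribute weights and single-attribute utilities here are invented for illustration, not the study's elicited values.

```python
# Additive multiattribute utility sketch with invented weights and scores.
weights = {"dose_to_workers": 0.5, "material_properties": 0.3, "cost": 0.2}

# Single-attribute utilities scaled to [0, 1] (1 = best), hypothetical
utilities = {
    "ordinary_concrete":   {"dose_to_workers": 0.9, "material_properties": 0.6, "cost": 0.8},
    "ilmenite_serpentine": {"dose_to_workers": 0.7, "material_properties": 0.9, "cost": 0.4},
}

for option, u in utilities.items():
    score = sum(weights[a] * u[a] for a in weights)
    print(f"{option}: U = {score:.2f}")
# The additive form assumes preferential independence among the attributes.
```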
A Wireless Sensor Network-Based Approach with Decision Support for Monitoring Lake Water Quality.
Huang, Xiaoci; Yi, Jianjun; Chen, Shaoli; Zhu, Xiaomin
2015-11-19
Online monitoring and water quality analysis of lakes are urgently needed. A feasible and effective approach is to use a Wireless Sensor Network (WSN). Lake water environments, like other real world environments, present many changing and unpredictable situations. To ensure flexibility in such an environment, the WSN node has to be prepared to deal with varying situations. This paper presents a WSN self-configuration approach for lake water quality monitoring. The approach is based on the integration of a semantic framework, where a reasoner can make decisions on the configuration of WSN services. We present a WSN ontology and the relevant water quality monitoring context information, which considers its suitability in a pervasive computing environment. We also propose a rule-based reasoning engine that is used to conduct decision support through reasoning techniques and context-awareness. To evaluate the approach, we conduct usability experiments and performance benchmarks.
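The ontology-driven reasoning the paper describes ultimately bottoms out in rule firings over sensor context; the toy step below illustrates that final stage only, with invented thresholds and actions rather than the authors' rule base.

```python
# Toy rule-based reconfiguration step (placeholder thresholds and actions).
readings = {"dissolved_oxygen_mg_l": 4.2, "ph": 9.1, "turbidity_ntu": 30}

rules = [
    (lambda r: r["dissolved_oxygen_mg_l"] < 5.0, "increase DO sampling rate"),
    (lambda r: not (6.5 <= r["ph"] <= 8.5),      "raise pH alert"),
    (lambda r: r["turbidity_ntu"] > 50,          "activate turbidity sensor cluster"),
]

actions = [action for condition, action in rules if condition(readings)]
print("reconfiguration decisions:", actions or ["no change"])
```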
Postmus, Douwe; Tervonen, Tommi; van Valkenhoef, Gert; Hillege, Hans L; Buskens, Erik
2014-09-01
A standard practice in health economic evaluation is to monetize health effects by assuming a certain societal willingness-to-pay per unit of health gain. Although the resulting net monetary benefit (NMB) is easy to compute, the use of a single willingness-to-pay threshold assumes expressibility of the health effects on a single non-monetary scale. To relax this assumption, this article proves that the NMB framework is a special case of the more general stochastic multi-criteria acceptability analysis (SMAA) method. Specifically, as SMAA does not restrict the number of criteria to two and also does not require the marginal rates of substitution to be constant, there are problem instances for which the use of this more general method may result in a better understanding of the trade-offs underlying the reimbursement decision-making problem. This is illustrated by applying both methods in a case study related to infertility treatment.
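For reference, the standard net-monetary-benefit form the article starts from: with willingness-to-pay threshold \lambda per unit of health gain (e.g., per QALY), incremental effects \Delta E, and incremental costs \Delta C,

```latex
\mathrm{NMB} = \lambda \,\Delta E - \Delta C ,
```

and an intervention is deemed cost-effective when \mathrm{NMB} > 0. The article's point is that this single-threshold, two-criteria structure is a special case of SMAA's more general multi-criteria weighting.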
An interactive modular design for computerized photometry in spectrochemical analysis
NASA Technical Reports Server (NTRS)
Bair, V. L.
1980-01-01
A general functional description of totally automatic photometry of emission spectra is not available for an operating environment in which the sample compositions and analysis procedures are low-volume and non-routine. The advantages of using an interactive approach to computer control in such an operating environment are demonstrated. This approach includes modular subroutines selected at multiple-option, menu-style decision points. This style of programming is used to trace elemental determinations, including the automated reading of spectrographic plates produced by a 3.4 m Ebert mount spectrograph using a dc-arc in an argon atmosphere. The simplified control logic and modular subroutine approach facilitates innovative research and program development, yet is easily adapted to routine tasks. Operator confidence and control are increased by the built-in options including degree of automation, amount of intermediate data printed out, amount of user prompting, and multidirectional decision points.
Ag2S atomic switch-based `tug of war' for decision making
NASA Astrophysics Data System (ADS)
Lutz, C.; Hasegawa, T.; Chikyow, T.
2016-07-01
For a computing process such as making a decision, a software controlled chip of several transistors is necessary. Inspired by how a single cell amoeba decides its movements, the theoretical `tug of war' computing model was proposed but not yet implemented in an analogue device suitable for integrated circuits. Based on this model, we now developed a new electronic element for decision making processes, which will have no need for prior programming. The devices are based on the growth and shrinkage of Ag filaments in α-Ag2+δS gap-type atomic switches. Here we present the adapted device design and the new materials. We demonstrate the basic `tug of war' operation by IV-measurements and Scanning Electron Microscopy (SEM) observation. These devices could be the base for a CMOS-free new computer architecture.
Enhancing point of care vigilance using computers.
St Jacques, Paul; Rothman, Brian
2011-09-01
Information technology has the potential to provide a tremendous step forward in perioperative patient safety. Through automated delivery of information through fixed and portable computer resources, clinicians may achieve improved situational awareness of the overall operation of the operating room suite and the state of individual patients in various stages of surgical care. Coupling the raw, but integrated, information with decision support and alerting algorithms enables clinicians to achieve high reliability in documentation compliance and response to care protocols. Future studies and outcomes analysis are needed to quantify the degree of benefit of these new components of perioperative information systems. Copyright © 2011 Elsevier Inc. All rights reserved.
International Instrumentation Symposium, 38th, Las Vegas, NV, Apr. 26-30, 1992, Proceedings
NASA Astrophysics Data System (ADS)
The present volume on aerospace instrumentation discusses computer applications, blast and shock, implementation of the Clean Air Act amendments, and thermal systems. Attention is given to measurement uncertainty/flow measurement, data acquisition and processing, force/acceleration/motion measurements, and hypersonics/reentry vehicle systems. Topics addressed include wind tunnels, real-time systems, and pressure effects. Also discussed are a distributed data and control system for space simulation and thermal testing, a stepwise shockwave velocity determinator, computer tracking and decision making, the use of silicon diodes for detecting the liquid-vapor interface in hydrogen, and practical methods for analysis of uncertainty propagation.
Spaulding, William; Deogun, Jitender
2011-09-01
Personalization of treatment is a current strategic goal for improving health care. Integrated treatment approaches such as psychiatric rehabilitation benefit from personalization because they involve matching diverse arrays of treatment options to individually unique profiles of need. The need for personalization is evident in the heterogeneity of people with severe mental illness and in the findings of experimental psychopathology. One pathway to personalization lies in analysis of the judgments and decision making of human experts and other participants as they respond to complex circumstances in pursuit of treatment and rehabilitation goals. Such analysis is aided by computer simulation of human decision making, which in turn informs development of computerized clinical decision support systems. This inspires a research program involving concurrent development of databases, domain ontology, and problem-solving algorithms, toward the goal of personalizing psychiatric rehabilitation through human collaboration with intelligent cyber systems. The immediate hurdle is to demonstrate that clinical decisions beyond diagnosis really do affect outcome. This can be done by supporting the hypothesis that a human treatment team with access to a reasonably comprehensive clinical database that tracks patient status and treatment response over time achieves better outcome than a treatment team without such access, in a controlled experimental trial. Provided the hypothesis can be supported, the near future will see prototype systems that can construct an integrated assessment, formulation, and rehabilitation plan from clinical assessment data and contextual information. This will lead to advanced systems that collaborate with human decision makers to personalize psychiatric rehabilitation and optimize outcome.
A Holistic Approach to Networked Information Systems Design and Analysis
2016-04-15
attain quite substantial savings. 11. Optimal algorithms for energy harvesting in wireless networks. We use a Markov-decision-process (MDP)-based ... approach to obtain optimal policies for transmissions. The key advantage of our approach is that it holistically considers information and energy in a ... coding technique to minimize delays and the number of transmissions in wireless systems. As we approach an era of ubiquitous computing with information
Optimizing Force Deployment and Force Structure for the Rapid Deployment Force
1984-03-01
Analysis ... Experimental Design ... IX. Use of a Flexible Response Surface ... Selection of a ... programming methodology, where the required system ... is input and the model optimizes the number, type, cargo, and ... "to obtain new computer outputs" (Ref 38:23). The methodology can be used with any decision model, linear or nonlinear. Experimental Design: Since the
Rebuilding the NAVSEA Early Stage Ship Design Environment
2010-04-01
rules-of-thumb to base these crucial decisions upon. With High Performance Computing (HPC) as an enabler, the vision is to explore all downstream... the results of the analysis back into LEAPS. Another software development worthy of discussion here is Intelligent Ship Arrangements (ISA), which... constraints and rules set by the users ahead of time. When used in a systematic and stochastic way, and when integrated using LEAPS, having this
Near-term hybrid vehicle program, phase 1. Appendix C: Preliminary design data package
NASA Technical Reports Server (NTRS)
1979-01-01
The design methodology, the design decision rationale, the vehicle preliminary design summary, and the advanced technology developments are presented. The detailed vehicle design, the vehicle ride and handling, and front structural crashworthiness analysis, the microcomputer control of the propulsion system, the design study of the battery switching circuit, the field chopper, and the battery charger, and the recent program refinements and computer results are presented.
2010-01-01
Comparative Effectiveness Research, or other efforts to determine best practices and to develop guidelines based on meta-analysis and evidence-based medicine. An... authoritative reviews or other evidence-based medicine sources, but they have been made unambiguous and computable – a process which sounds... best practice recommendation created through an evidence-based medicine (EBM) development process. The lifecycle envisions four stages of refinement
2007-04-01
judgmental self-doubt, depression, and causal uncertainty, tend to take fewer risks, and have lower self-esteem. Results from two studies (Nygren, 2000)... U.S. Army Research Institute for the Behavioral and Social Sciences, Research Report 1869: Assessment of Two Desk-Top Computer Simulations Used to Train Tactical Decision Making (TDM) of Small Unit
FORBEEF: A Forage-Livestock System Computer Model Used as a Teaching Aid for Decision Making.
ERIC Educational Resources Information Center
Stringer, W. C.; And Others
1987-01-01
Describes the development of a computer simulation model of forage-beef production systems, which is intended to incorporate soil, forage, and animal decisions into an enterprise scenario. Produces a summary of forage production and livestock needs. Cites positive assessment of the program's value by participants in inservice training workshops.…
ERIC Educational Resources Information Center
Hopf-Weichel, Rosemarie; And Others
This report describes results of the first year of a three-year program to develop and evaluate a new Adaptive Computerized Training System (ACTS) for electronics maintenance training. (ACTS incorporates an adaptive computer program that learns the student's diagnostic and decision value structure, compares it to that of an expert, and adapts the…
Computer Simulation of the Alonso Household Location Model in the Microeconomics Course
ERIC Educational Resources Information Center
Bolton, Roger E.
2005-01-01
Computer simulation of the Alonso household location model can enrich the intermediate microeconomics course. The model includes decisions on location, land space, and other goods and is a valuable complement to the usual textbook model of household consumption. It has three decision variables, one of which is a "bad," and one good's price is a…
Computer Decision Support to Improve Autism Screening and Care in Community Pediatric Clinics
ERIC Educational Resources Information Center
Bauer, Nerissa S.; Sturm, Lynne A.; Carroll, Aaron E.; Downs, Stephen M.
2013-01-01
An autism module was added to an existing computer decision support system (CDSS) to facilitate adherence to recommended guidelines for screening for autism spectrum disorders in primary care pediatric clinics. User satisfaction was assessed by survey and informal feedback at monthly meetings between clinical staff and the software team. To assess…
Making the Right Decisions: Leadership in 1-to-1 Computing in Education
ERIC Educational Resources Information Center
Towndrow, Phillip A.; Vallance, Michael
2013-01-01
Purpose: The purpose of this paper is to detail the necessity for more informed decision making and leadership in the implementation of 1-to-1 computing in education. Design/methodology/approach: The contexts of high-tech countries of Singapore and Japan are used as case studies to contextualize and support four evidence-based recommendations for…
Pilot interaction with automated airborne decision making systems
NASA Technical Reports Server (NTRS)
Rouse, W. B.; Chu, Y. Y.; Greenstein, J. S.; Walden, R. S.
1976-01-01
An investigation was made of interaction between a human pilot and automated on-board decision making systems. Research was initiated on the topic of pilot problem solving in automated and semi-automated flight management systems and attempts were made to develop a model of human decision making in a multi-task situation. A study was made of allocation of responsibility between human and computer, and discussed were various pilot performance parameters with varying degrees of automation. Optimal allocation of responsibility between human and computer was considered and some theoretical results found in the literature were presented. The pilot as a problem solver was discussed. Finally the design of displays, controls, procedures, and computer aids for problem solving tasks in automated and semi-automated systems was considered.
Improving Site-Specific Radiological Performance Assessments - 13431
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tauxe, John; Black, Paul; Catlett, Kate
2013-07-01
An improved approach is presented for conducting complete and defensible radiological site-specific performance assessments (PAs) to support radioactive waste disposal decisions. The basic tenets of PA were initiated some thirty years ago, focusing on geologic disposals and evaluating compliance with regulations. Some of these regulations were inherently probabilistic (i.e., addressing uncertainty in a quantitative fashion), such as the containment requirements of the U.S. Environmental Protection Agency's (EPA's) 40 CFR 191, Environmental Radiation Protection Standards for Management and Disposal of Spent Nuclear Fuel, High-Level and Transuranic Radioactive Wastes, Chap. 191.13 [1]. Methods of analysis were developed to meet those requirements, but at their core early PAs used 'conservative' parameter values and modeling approaches. This limited the utility of such PAs to compliance evaluation, and did little to inform decisions about optimizing disposal, closure and long-term monitoring and maintenance, or, in general, maintaining doses 'as low as reasonably achievable' (ALARA). This basic approach to PA development in the United States was employed essentially unchanged through the end of the 20th century, principally by the U.S. Department of Energy (DOE). Performance assessments developed in support of private radioactive waste disposal operations, regulated by the U.S. Nuclear Regulatory Commission (NRC) and its agreement states, were typically not as sophisticated. Discussion of new approaches to PA is timely, since at the time of this writing, the DOE is in the midst of revising its Order 435.1, Radioactive Waste Management [2], and the NRC is revising 10 CFR 61, Licensing Requirements for Land Disposal of Radioactive Waste [3]. Over the previous decade, theoretical developments and improved computational technology have provided the foundation for integrating decision analysis (DA) concepts and objective-focused thinking, plus a Bayesian approach to probabilistic modeling and risk analysis, to guide improvements in PA. This decision-making approach [4, 5, 6] provides a transparent formal framework for using a value- or objective-focused approach to decision-making. DA, as an analytical means to implement structured decision making, provides a context both for understanding how uncertainty affects decisions and for targeting uncertainty reduction. The proposed DA approach improves the defensibility and transparency of decision-making. The DA approach is fully consistent with the need to perform realistic modeling (rather than conservative modeling), including evaluation of site-specific factors. Instead of using generic stylized scenarios for radionuclide fate and transport and for human exposures to radionuclides, site-specific scenarios better represent the advantages and disadvantages of alternative disposal sites or engineered designs, thus clarifying their differences as well as providing a sound basis for evaluation of site performance. The full DA approach to PA is described, from explicitly incorporating societal values through stakeholder involvement to model building. Model building involves scoping by considering features, events, processes, and exposure scenarios (FEPSs), development of a conceptual site model (CSM), translation into numerical models and subsequent computation, and model evaluation.
These are implemented in a cycle of uncertainty analysis, sensitivity analysis and value of information analysis so that uncertainty can be reduced until sufficient confidence is gained in the decisions to be made. This includes the traditional focus on hydrogeological processes, but also places emphasis on other FEPSs such as biotically-induced transport and human exposure phenomena. The significance of human exposure scenarios is emphasized by modifying the traditional acronym 'FEPs' to include them, hence 'FEPSs'. The radioactive waste community is also recognizing that disposal sites are to be considered a national (or even global) resource. As such, there is a pressing need to optimize their utility within the constraints of protecting human health and the environment. Failing to do so will result in the need for additional sites or options for storing radioactive waste temporarily, assuming a continued need for radioactive waste disposal. Optimization should be performed using DA, including economic analysis, invoked if necessary through the ALARA process. The economic analysis must recognize the cost of implementation (disposal design, closure, maintenance, etc.), and intra- and inter-generational equity in order to ensure that the best possible radioactive waste management decisions are made for the protection of both current and future generations. In most cases this requires consideration of population or collective risk. (authors)
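As an illustration of the probabilistic, Monte Carlo style of PA modeling advocated above, the following Python sketch propagates parameter uncertainty through a deliberately simplistic dose model and ranks inputs by rank correlation. The model structure, distributions, and the screening level are all invented for illustration; a real PA would rest on FEPSs-based conceptual models, not a three-factor product.

    import numpy as np

    rng = np.random.default_rng(42)
    N = 100_000                          # Monte Carlo realizations

    # Hypothetical uncertain inputs for a deliberately simple dose model
    # (illustrative distributions only, not from any real PA):
    inventory = rng.lognormal(np.log(1e3), 0.5, size=N)     # source term
    transport = rng.uniform(1e-6, 1e-4, size=N)             # transfer factor
    dose_coef = rng.triangular(1e-8, 5e-8, 1e-7, size=N)    # dose per unit

    dose = inventory * transport * dose_coef                # Sv/yr (toy)
    limit = 1e-8                                            # illustrative screening level

    print(f"mean dose      : {dose.mean():.2e} Sv/yr")
    print(f"95th percentile: {np.percentile(dose, 95):.2e} Sv/yr")
    print(f"P(dose > limit): {(dose > limit).mean():.3f}")

    def spearman(x, y):
        # rank correlation via double argsort (no ties in continuous samples)
        return np.corrcoef(x.argsort().argsort(), y.argsort().argsort())[0, 1]

    # Simple sensitivity analysis: which input drives the dose uncertainty?
    for name, x in [("inventory", inventory), ("transport", transport),
                    ("dose_coef", dose_coef)]:
        print(f"{name}: rho = {spearman(x, dose):+.2f}")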
Development Optimization and Uncertainty Analysis Methods for Oil and Gas Reservoirs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ettehadtavakkol, Amin, E-mail: amin.ettehadtavakkol@ttu.edu; Jablonowski, Christopher; Lake, Larry
Uncertainty complicates the development optimization of oil and gas exploration and production projects, but methods have been devised to analyze uncertainty and its impact on optimal decision-making. This paper compares two methods for development optimization and uncertainty analysis: Monte Carlo (MC) simulation and stochastic programming. Two example problems for a gas field development and an oilfield development are solved and discussed to elaborate the advantages and disadvantages of each method. Development optimization involves decisions regarding the configuration of initial capital investment and subsequent operational decisions. Uncertainty analysis involves the quantification of the impact of uncertain parameters on the optimum design concept. The gas field development problem is designed to highlight the differences in the implementation of the two methods and to show that both methods yield the exact same optimum design. The results show that both MC optimization and stochastic programming provide unique benefits, and that the choice of method depends on the goal of the analysis. While the MC method generates more useful information, along with the optimum design configuration, the stochastic programming method is more computationally efficient in determining the optimal solution. Reservoirs comprise multiple compartments and layers with multiphase flow of oil, water, and gas. We present a workflow for development optimization under uncertainty for these reservoirs, and solve an example on the design optimization of a multicompartment, multilayer oilfield development.
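The Monte Carlo side of such a comparison can be conveyed with a toy design problem: pick the number of wells that maximizes expected net present value when recoverable volume is uncertain. The economics and distributions below are entirely made up; the point is only that MC optimization yields the full outcome distribution along with the optimum.

    import numpy as np

    rng = np.random.default_rng(0)
    volume = rng.lognormal(np.log(50.0), 0.4, size=20_000)  # uncertain reserves

    def npv(n_wells, vol):
        # Toy economics: each well costs 10, recovery saturates with well count
        recovered = vol * (1.0 - np.exp(-0.3 * n_wells))
        return 5.0 * recovered - 10.0 * n_wells

    # Monte Carlo optimization: expected NPV for each candidate design
    expected = {n: npv(n, volume).mean() for n in range(1, 15)}
    best = max(expected, key=expected.get)
    print("optimal well count   :", best)
    print("expected NPV at best :", round(expected[best], 1))
    # the MC method also yields the outcome distribution, e.g. downside risk:
    print("P(NPV < 0 at optimum):", round(float((npv(best, volume) < 0).mean()), 3))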
Neural decoding of collective wisdom with multi-brain computing.
Eckstein, Miguel P; Das, Koel; Pham, Binh T; Peterson, Matthew F; Abbey, Craig K; Sy, Jocelyn L; Giesbrecht, Barry
2012-01-02
Group decisions and even aggregation of multiple opinions lead to greater decision accuracy, a phenomenon known as collective wisdom. Little is known about the neural basis of collective wisdom and whether its benefits arise in late decision stages or in early sensory coding. Here, we use electroencephalography and multi-brain computing with twenty humans making perceptual decisions to show that combining neural activity across brains increases decision accuracy paralleling the improvements shown by aggregating the observers' opinions. Although the largest gains result from an optimal linear combination of neural decision variables across brains, a simpler neural majority decision rule, ubiquitous in human behavior, results in substantial benefits. In contrast, an extreme neural response rule, akin to a group following the most extreme opinion, results in the least improvement with group size. Analyses controlling for number of electrodes and time-points while increasing number of brains demonstrate unique benefits arising from integrating neural activity across different brains. The benefits of multi-brain integration are present in neural activity as early as 200 ms after stimulus presentation in lateral occipital sites and no additional benefits arise in decision related neural activity. Sensory-related neural activity can predict collective choices reached by aggregating individual opinions, voting results, and decision confidence as accurately as neural activity related to decision components. Estimation of the potential for the collective to execute fast decisions by combining information across numerous brains, a strategy prevalent in many animals, shows large time-savings. Together, the findings suggest that for perceptual decisions the neural activity supporting collective wisdom and decisions arises in early sensory stages and that many properties of collective cognition are explainable by the neural coding of information across multiple brains. Finally, our methods highlight the potential of multi-brain computing as a technique to rapidly and in parallel gather increased information about the environment as well as to access collective perceptual/cognitive choices and mental states. Copyright © 2011 Elsevier Inc. All rights reserved.
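The three combination rules compared in the study (optimal linear combination, neural majority vote, most-extreme response) can be mimicked with a simple signal-detection simulation. The sketch below is schematic, with a made-up per-brain sensitivity, and is not a reanalysis of the EEG data.

    import numpy as np

    rng = np.random.default_rng(1)
    n_trials, d_prime = 20_000, 0.8   # per-brain sensitivity (made up)

    def accuracy(n_brains, rule):
        # each brain contributes a noisy decision variable; >0 means "signal"
        x = rng.normal(d_prime, 1.0, size=(n_trials, n_brains))
        if rule == "average":         # optimal linear combination (equal noise)
            return (x.mean(axis=1) > 0).mean()
        if rule == "majority":        # each brain votes, majority wins (odd n)
            return ((x > 0).sum(axis=1) > n_brains / 2).mean()
        if rule == "extreme":         # follow the most extreme response
            idx = np.abs(x).argmax(axis=1)
            return (x[np.arange(n_trials), idx] > 0).mean()

    for n in (1, 3, 7, 15):
        print(n, {r: round(accuracy(n, r), 3)
                  for r in ("average", "majority", "extreme")})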
Nihonsugi, Tsuyoshi; Ihara, Aya; Haruno, Masahiko
2015-02-25
The intention behind another's action and the impact of the outcome are major determinants of human economic behavior. It is poorly understood, however, whether the two systems share a core neural computation. Here, we investigated whether the two systems are causally dissociable in the brain by integrating computational modeling, functional magnetic resonance imaging, and transcranial direct current stimulation experiments in a newly developed trust game task. We show not only that right dorsolateral prefrontal cortex (DLPFC) activity is correlated with intention-based economic decisions and that ventral striatum and amygdala activity are correlated with outcome-based decisions, but also that stimulation to the DLPFC selectively enhances intention-based decisions. These findings suggest that the right DLPFC is involved in the implementation of intention-based decisions in the processing of cooperative decisions. This causal dissociation of cortical and subcortical backgrounds may indicate evolutionary and developmental differences in the two decision systems. Copyright © 2015 the authors 0270-6474/15/53412-08$15.00/0.
Computer-Aided Diagnosis of Breast Cancer: A Multi-Center Demonstrator
1998-10-01
Artificial Neural Network (ANN) approach to computer aided diagnosis of breast cancer from mammographic findings. An ANN has been developed to provide support for the clinical decision to perform breast biopsy. The system is designed to aid in the decision to biopsy those patients who have suspicious mammographic findings. The decision to biopsy can be viewed as a two stage process: 1)the mammographer views the mammogram and determines the presence or absence of image features such as calcifications and masses, 2) the presence and description of these features
Griffey, Richard T; Jeffe, Donna B; Bailey, Thomas
2014-07-01
Although computerized decision support for imaging is often recommended for optimizing computed tomography (CT) use, no studies have evaluated emergency physicians' (EPs') preferences regarding computerized decision support in the emergency department (ED). In this needs assessment, the authors sought to determine if EPs view overutilization as a problem, if they want decision support, and if so, the kinds of support they prefer. A 42-item, Web-based survey of EPs was developed and used to measure EPs' attitudes, preferences, and knowledge. Key contacts at local EDs sent letters describing the study to their physicians. Exploratory principal components analysis (PCA) was used to determine the underlying factor structure of multi-item scales, Cronbach's alpha was used to measure internal consistency of items on a scale, Spearman correlations were used to describe bivariate associations, and multivariable linear regression analysis was used to identify variables independently associated with physician interest in decision support. Of 235 surveys sent, 155 (66%) EPs responded. Five factors emerged from the PCA. EPs felt that: 1) CT overutilization is a problem in the ED (α = 0.75); 2) a patient's cumulative CT study count affects decisions of whether and what type of imaging study to order only some of the time (α = 0.75); 3) knowledge that a patient has had prior CT imaging for the same indication makes EPs less likely to order a CT (α = 0.42); 4) concerns about malpractice, patient satisfaction, or insistence on CTs affect CT ordering decisions (α = 0.62); and 5) EPs want decision support before ordering CTs (α = 0.85). Performance on knowledge questions was poor, with only 18% to 39% correctly responding to each of the three multiple-choice items about effective radiation doses of chest radiograph and single-pass abdominopelvic CT, as well as estimated increased risk of cancer from a 10-mSv exposure. Although EPs wanted information on patients' cumulative exposures, they feel inadequately familiar with this information to make use of it clinically. If provided with patients' cumulative radiation exposures from CT, 87% of EPs said that they would use this information to discuss imaging options with their patients. In the multiple regression model, which included all variables associated with interest in decision support at p < 0.10 in bivariate tests, items independently associated with EPs' greater interest in all types of decision support proposed included lower total knowledge scores, greater frequency that cumulative CT study count affects EP's decision to order CTs, and greater agreement that overutilization of CT is a problem and that awareness of multiple prior CTs for a given indication affects CT ordering decisions. Emergency physicians view overutilization of CT scans as a problem with potential for improvement in the ED and would like to have more information to discuss risks with their patients. EPs are interested in all types of imaging decision support proposed to help optimize imaging ordering in the ED and to reduce radiation to their patients. Findings reveal several opportunities that could potentially affect CT utilization. © 2014 by the Society for Academic Emergency Medicine.
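For reference, the internal-consistency statistic reported throughout the survey above (Cronbach's alpha) is straightforward to compute. The sketch below applies the standard formula, alpha = k/(k-1) * (1 - sum of item variances / variance of totals), to made-up Likert responses; it is not the study's data.

    import numpy as np

    def cronbach_alpha(items):
        """items: (n_respondents, n_items) array of item scores."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1.0 - item_vars / total_var)

    # Made-up 5-point Likert responses for a 3-item scale
    scores = [[4, 5, 4], [2, 2, 3], [5, 4, 5], [3, 3, 2], [4, 4, 4], [1, 2, 2]]
    print(round(cronbach_alpha(scores), 2))   # high value = internally consistent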
RECOVER: An Automated Cloud-Based Decision Support System for Post-fire Rehabilitation Planning
NASA Technical Reports Server (NTRS)
Schnase, John L.; Carroll, Mark; Weber, K. T.; Brown, Molly E.; Gill, Roger L.; Wooten, Margaret; May J.; Serr, K.; Smith, E.; Goldsby, R.;
2014-01-01
RECOVER is a site-specific decision support system that automatically brings together in a single analysis environment the information necessary for post-fire rehabilitation decision-making. After a major wildfire, law requires that the federal land management agencies certify a comprehensive plan for public safety, burned area stabilization, resource protection, and site recovery. These burned area emergency response (BAER) plans are a crucial part of our national response to wildfire disasters and depend heavily on data acquired from a variety of sources. Final plans are due within 21 days of control of a major wildfire and become the guiding document for managing the activities and budgets for all subsequent remediation efforts. There are few instances in the federal government where plans of such wide-ranging scope and importance are assembled on such short notice and translated into action more quickly. RECOVER has been designed in close collaboration with our agency partners and directly addresses their high-priority decision-making requirements. In response to a fire detection event, RECOVER uses the rapid resource allocation capabilities of cloud computing to automatically collect Earth observational data, derived decision products, and historic biophysical data so that when the fire is contained, BAER teams will have a complete and ready-to-use RECOVER dataset and GIS analysis environment customized for the target wildfire. Initial studies suggest that RECOVER can transform this information-intensive process by reducing from days to a matter of minutes the time required to assemble and deliver crucial wildfire-related data.
Ciplak, Nesli
2015-08-01
The aim of this paper is to identify the best possible health care waste management option in the West Black Sea Region by taking into account economic, social, environmental, and technical aspects in the concept of multi-criteria decision analysis. In the scope of this research, three different health care waste management scenarios that consist of different technology alternatives were developed and compared using a decision-making computer software package, called Right Choice, by identifying various criteria, measuring them, and ranking their relative importance from the point of view of key stakeholders. The results of the study show that the decentralized autoclave technology option coupled with disposal through landfilling with energy recovery has the potential to be an optimum option for a health care waste management system, and that an efficient health care waste segregation scheme should be given more attention by the authorities in the region. Furthermore, the discussion of the results points to the multidisciplinary approach required and the equilibrium between social, environmental, economic, and technical criteria. The methodology used in this research was developed in order to enable decision makers to gain an increased perception of a decision problem. In general, the results and remarks of this study can be used as a basis for future planning and anticipation of needs for investment in the area of health care waste management in the region and also in developing countries that are dealing with similar waste management problems.
Assessing Professional Decision-Making Abilities.
ERIC Educational Resources Information Center
McNergney, Robert; Hinson, Stephanie
1985-01-01
Describes Teacher Development Decision Exercises, a computer-based method of diagnosing the abilities of elementary and secondary school supervisors (principals, staff developers, curriculum coordinators) to make professional preactive or planning decisions. This approach simulates assessment of supervisors' abilities to use professional knowledge to…
Making Market Decisions in the Classroom.
ERIC Educational Resources Information Center
Rose, Stephen A.
1986-01-01
Computer software that will help intermediate and secondary social studies students learn to make rational decisions about personal and societal concerns is described. The courseware places students in the roles of business managers who make decisions about operating their firms. (RM)
Edwards, W; Fasolo, B
2001-01-01
This review is about decision technology-the rules and tools that help us make wiser decisions. First, we review the three rules that are at the heart of most traditional decision technology-multi-attribute utility, Bayes' theorem, and subjective expected utility maximization. Since the inception of decision research, these rules have prescribed how we should infer values and probabilities and how we should combine them to make better decisions. We suggest how to make best use of all three rules in a comprehensive 19-step model. The remainder of the review explores recently developed tools of decision technology. It examines the characteristics and problems of decision-facilitating sites on the World Wide Web. Such sites now provide anyone who can use a personal computer with access to very sophisticated decision-aiding tools structured mainly to facilitate consumer decision making. It seems likely that the Web will be the mode by means of which decision tools will be distributed to lay users. But methods for doing such apparently simple things as winnowing 3000 options down to a more reasonable number, like 10, contain traps for unwary decision technologists. The review briefly examines Bayes nets and influence diagrams-judgment and decision-making tools that are available as computer programs. It very briefly summarizes the state of the art of eliciting probabilities from experts. It concludes that decision tools will be as important in the 21st century as spreadsheets were in the 20th.
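A worked toy example of the subjective-expected-utility rule reviewed above: each option is a set of (probability, utility) outcomes, and the rule picks the option with the highest probability-weighted utility sum. The option names and numbers below are invented, not drawn from the review.

    # Toy subjective-expected-utility comparison of two options; names and
    # numbers are invented
    options = {
        "surgery":    [(0.70, 0.9), (0.30, 0.2)],   # (probability, utility) pairs
        "medication": [(0.95, 0.6), (0.05, 0.3)],
    }

    def seu(outcomes):
        # subjective expected utility: probability-weighted sum of utilities
        return sum(p * u for p, u in outcomes)

    for name, outcomes in options.items():
        print(f"{name:10s} SEU = {seu(outcomes):.3f}")
    print("choose:", max(options, key=lambda o: seu(options[o])))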
ERIC Educational Resources Information Center
Birken, Marvin N.
1967-01-01
Numerous decisions must be made in the design of computer air conditioning, each determined by a combination of economics, physical, and esthetic characteristics, and computer requirements. Several computer air conditioning systems are analyzed--(1) underfloor supply and overhead return, (2) underfloor plenum and overhead supply with computer unit…
A computational framework for the study of confidence in humans and animals
Kepecs, Adam; Mainen, Zachary F.
2012-01-01
Confidence judgements, self-assessments about the quality of a subject's knowledge, are considered a central example of metacognition. Prima facie, introspection and self-report appear the only way to access the subjective sense of confidence or uncertainty. Contrary to this notion, overt behavioural measures can be used to study confidence judgements by animals trained in decision-making tasks with perceptual or mnemonic uncertainty. Here, we suggest that a computational approach can clarify the issues involved in interpreting these tasks and provide a much needed springboard for advancing the scientific understanding of confidence. We first review relevant theories of probabilistic inference and decision-making. We then critically discuss behavioural tasks employed to measure confidence in animals and show how quantitative models can help to constrain the computational strategies underlying confidence-reporting behaviours. In our view, post-decision wagering tasks with continuous measures of confidence appear to offer the best available metrics of confidence. Since behavioural reports alone provide a limited window into mechanism, we argue that progress calls for measuring the neural representations and identifying the computations underlying confidence reports. We present a case study using such a computational approach to study the neural correlates of decision confidence in rats. This work shows that confidence assessments may be considered higher order, but can be generated using elementary neural computations that are available to a wide range of species. Finally, we discuss the relationship of confidence judgements to the wider behavioural uses of confidence and uncertainty. PMID:22492750
ERIC Educational Resources Information Center
Snipes, Katherine H.
2009-01-01
A set of computer-based recreation choice experiments was run to examine the effect of expected congestion and social interactions on the decision-making process. MouseTrace is a process-tracing program that recorded individual subjects' information acquisitions and provided the necessary information to determine if subjects used attribute-based…
Distinct Roles of Dopamine and Subthalamic Nucleus in Learning and Probabilistic Decision Making
ERIC Educational Resources Information Center
Coulthard, Elizabeth J.; Bogacz, Rafal; Javed, Shazia; Mooney, Lucy K.; Murphy, Gillian; Keeley, Sophie; Whone, Alan L.
2012-01-01
Even simple behaviour requires us to make decisions based on combining multiple pieces of learned and new information. Making such decisions requires both learning the optimal response to each given stimulus as well as combining probabilistic information from multiple stimuli before selecting a response. Computational theories of decision making…
Ma, Ning; Yu, Angela J
2015-01-01
Response time (RT) is an oft-reported behavioral measure in psychological and neurocognitive experiments, but the high level of observed trial-to-trial variability in this measure has often limited its usefulness. Here, we combine computational modeling and psychophysics to examine the hypothesis that fluctuations in this noisy measure reflect dynamic computations in human statistical learning and corresponding cognitive adjustments. We present data from the stop-signal task (SST), in which subjects respond to a go stimulus on each trial, unless instructed not to by a subsequent, infrequently presented stop signal. We model across-trial learning of stop signal frequency, P(stop), and stop-signal onset time, SSD (stop-signal delay), with a Bayesian hidden Markov model, and within-trial decision-making with an optimal stochastic control model. The combined model predicts that RT should increase with both expected P(stop) and SSD. The human behavioral data (n = 20) bear out this prediction, showing P(stop) and SSD both to be significant, independent predictors of RT, with P(stop) being a more prominent predictor in 75% of the subjects, and SSD being more prominent in the remaining 25%. The results demonstrate that humans indeed readily internalize environmental statistics and adjust their cognitive/behavioral strategy accordingly, and that subtle patterns in RT variability can serve as a valuable tool for validating models of statistical learning and decision-making. More broadly, the modeling tools presented in this work can be generalized to a large body of behavioral paradigms, in order to extract insights about cognitive and neural processing from apparently quite noisy behavioral measures. We also discuss how this behaviorally validated model can then be used to conduct model-based analysis of neural data, in order to help identify specific brain areas for representing and encoding key computational quantities in learning and decision-making.
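The across-trial learning component can be approximated, far more simply than the Bayesian hidden Markov model actually used in the study, with an exponentially forgetful Beta estimate of stop-signal frequency. The sketch below shows only the qualitative prediction that RT tracks the estimated P(stop); all numbers are illustrative, and the toy RTs are generated to follow the model's own linear prediction.

    import numpy as np

    rng = np.random.default_rng(7)
    true_p = np.r_[np.full(200, 0.2), np.full(200, 0.4)]   # P(stop) shifts mid-session
    stop = rng.random(400) < true_p

    # Exponentially forgetful Beta estimate of stop frequency -- a much
    # simpler stand-in for the paper's Bayesian hidden Markov model.
    a, b, decay = 1.0, 1.0, 0.98
    p_hat = []
    for s in stop:
        p_hat.append(a / (a + b))        # model's prediction before the trial
        a = decay * a + float(s)         # update with forgetting
        b = decay * b + float(not s)

    p_hat = np.asarray(p_hat)
    # Qualitative prediction: go RTs should rise with the estimated P(stop).
    rt = 300 + 200 * p_hat + rng.normal(0, 15, size=400)   # toy RTs (ms)
    print("mean estimated P(stop), 1st vs 2nd half:",
          p_hat[:200].mean().round(2), p_hat[200:].mean().round(2))
    print("corr(RT, estimated P(stop)):", np.corrcoef(rt, p_hat)[0, 1].round(2))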
Chronic Exposure to Methamphetamine Disrupts Reinforcement-Based Decision Making in Rats.
Groman, Stephanie M; Rich, Katherine M; Smith, Nathaniel J; Lee, Daeyeol; Taylor, Jane R
2018-03-01
The persistent use of psychostimulant drugs, despite the detrimental outcomes associated with continued drug use, may be because of disruptions in reinforcement-learning processes that enable behavior to remain flexible and goal directed in dynamic environments. To identify the reinforcement-learning processes that are affected by chronic exposure to the psychostimulant methamphetamine (MA), the current study sought to use computational and biochemical analyses to characterize decision-making processes, assessed by probabilistic reversal learning, in rats before and after they were exposed to an escalating dose regimen of MA (or saline control). The ability of rats to use flexible and adaptive decision-making strategies following changes in stimulus-reward contingencies was significantly impaired following exposure to MA. Computational analyses of parameters that track choice and outcome behavior indicated that exposure to MA significantly impaired the ability of rats to use negative outcomes effectively. These MA-induced changes in decision making were similar to those observed in rats following administration of a dopamine D2/3 receptor antagonist. These data use computational models to provide insight into drug-induced maladaptive decision making that may ultimately identify novel targets for the treatment of psychostimulant addiction. We suggest that the disruption in utilization of negative outcomes to adaptively guide dynamic decision making is a new behavioral mechanism by which MA rigidly biases choice behavior.
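A schematic reinforcement-learning sketch of the reported result: in a probabilistic reversal task, lowering the learning rate applied to negative outcomes (one way to model the MA-exposed rats' deficit) degrades adaptive choice after the reversal. The model is a generic Q-learner with arbitrary parameters, not the authors' fitted model.

    import numpy as np

    rng = np.random.default_rng(3)

    def simulate(alpha_pos, alpha_neg, n_trials=400):
        """Probabilistic reversal learning with asymmetric learning rates."""
        p_reward = np.array([0.8, 0.2])      # option 0 starts as the good option
        q = np.zeros(2)
        correct = 0
        for t in range(n_trials):
            if t == n_trials // 2:
                p_reward = p_reward[::-1]    # reversal of contingencies
            p0 = 1.0 / (1.0 + np.exp(-5.0 * (q[0] - q[1])))   # softmax choice
            choice = 0 if rng.random() < p0 else 1
            reward = float(rng.random() < p_reward[choice])
            alpha = alpha_pos if reward else alpha_neg        # asymmetric update
            q[choice] += alpha * (reward - q[choice])
            correct += int(choice == p_reward.argmax())
        return correct / n_trials

    print("symmetric learner            :", round(simulate(0.4, 0.40), 2))
    print("blunted negative-outcome rate:", round(simulate(0.4, 0.05), 2))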
Overcoming Learning Aversion in Evaluating and Managing Uncertain Risks.
Cox, Louis Anthony Tony
2015-10-01
Decision biases can distort cost-benefit evaluations of uncertain risks, leading to risk management policy decisions with predictably high retrospective regret. We argue that well-documented decision biases encourage learning aversion, or predictably suboptimal learning and premature decision making in the face of high uncertainty about the costs, risks, and benefits of proposed changes. Biases such as narrow framing, overconfidence, confirmation bias, optimism bias, ambiguity aversion, and hyperbolic discounting of the immediate costs and delayed benefits of learning contribute to deficient individual and group learning, avoidance of information seeking, underestimation of the value of further information, and hence needlessly inaccurate risk-cost-benefit estimates and suboptimal risk management decisions. In practice, such biases can create predictable regret in the selection of potential risk-reducing regulations. Low-regret learning strategies based on computational reinforcement learning models can potentially overcome some of these suboptimal decision processes by replacing aversion to uncertain probabilities with actions calculated to balance exploration (deliberate experimentation and uncertainty reduction) and exploitation (taking actions to maximize the sum of expected immediate reward, expected discounted future reward, and value of information). We discuss the proposed framework for understanding and overcoming learning aversion and for implementing low-regret learning strategies, using regulation of air pollutants with uncertain health effects as an example. © 2015 Society for Risk Analysis.
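The exploration-exploitation balance described above is the core of bandit-style reinforcement learning. As one concrete example of a low-regret learning strategy, the sketch below applies the standard UCB1 rule to three hypothetical policy options with unknown payoffs; the payoff values are invented.

    import numpy as np

    rng = np.random.default_rng(11)
    true_means = [0.40, 0.50, 0.65]   # unknown payoffs of three candidate policies
    counts = np.zeros(3)
    values = np.zeros(3)

    for t in range(1, 2001):
        if 0 in counts:                              # try every option once first
            arm = int(np.argmin(counts))
        else:                                        # UCB1: optimism under uncertainty
            ucb = values + np.sqrt(2.0 * np.log(t) / counts)
            arm = int(np.argmax(ucb))
        reward = float(rng.random() < true_means[arm])
        counts[arm] += 1
        values[arm] += (reward - values[arm]) / counts[arm]   # running mean

    print("pulls per option:", counts.astype(int))
    print("estimated means :", values.round(2))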
ERIC Educational Resources Information Center
Solway, Alec; Botvinick, Matthew M.
2012-01-01
Recent work has given rise to the view that reward-based decision making is governed by two key controllers: a habit system, which stores stimulus-response associations shaped by past reward, and a goal-oriented system that selects actions based on their anticipated outcomes. The current literature provides a rich body of computational theory…
ERIC Educational Resources Information Center
Ramsey, Gregory W.
2010-01-01
This dissertation proposes and tests a theory explaining how people make decisions to achieve a goal in a specific task environment. The theory is represented as a computational model and implemented as a computer program. The task studied was primary care physicians treating patients with type 2 diabetes. Some physicians succeed in achieving…
Gregory Elmes; Thomas Millette; Charles B. Yuill
1991-01-01
GypsES, a decision-support and expert system for the management of Gypsy Moth, addresses five related research problems in a modular, computer-based project. The modules are hazard rating, monitoring, prediction, treatment decision, and treatment implementation. One common component is a geographic information system designed to function intelligently. We refer to this...
An oculomotor and computational study of a patient with diagonistic dyspraxia.
Pouget, Pierre; Pradat-Diehl, Pascale; Rivaud-Péchoux, Sophie; Wattiez, Nicolas; Gaymard, Bertrand
2011-04-01
Diagonistic dyspraxia (DD) is a behavioural disorder encountered in split-brain subjects in which the left arm acts against the subject's will, deliberately counteracting what the right arm does. We report here an oculomotor and computational study of a patient with a long-lasting form of DD. A first series of oculomotor paradigms revealed marked and unprecedented saccade impairments. We used a computational model in order to provide information about the impaired decision-making process: the analysis of saccade latencies revealed that variations of decision times were explained by adjustments of the response criterion. This result, together with paradoxical impairments observed in additional oculomotor paradigms, allowed us to propose that this adjustment of the criterion level resulted from the co-existence of counteracting oculomotor programs, consistent with the existence of antagonist programs in homotopic cortical areas. In the intact brain, trans-hemispheric inhibition would allow suppression of these counter programs. Depending on the topography of the disconnected areas, various motor and/or behavioural impairments would arise in split-brain subjects. In motor systems, such conflict would result in increased criteria for desired movement execution (oculomotor system) or in simultaneous execution of counteracting movements (skeletal motor system). At higher cognitive levels, it may result in conflict of intentions. Copyright © 2010 Elsevier Srl. All rights reserved.
Effectiveness of diagnostic strategies in suspected delayed cerebral ischemia: a decision analysis.
Rawal, Sapna; Barnett, Carolina; John-Baptiste, Ava; Thein, Hla-Hla; Krings, Timo; Rinkel, Gabriel J E
2015-01-01
Delayed cerebral ischemia (DCI) is a serious complication after aneurysmal subarachnoid hemorrhage. If DCI is suspected clinically, imaging methods designed to detect angiographic vasospasm or regional hypoperfusion are often used before instituting therapy. Uncertainty in the strength of the relationship between imaged vasospasm or perfusion deficits and DCI-related outcomes raises the question of whether imaging to select patients for therapy improves outcomes in clinical DCI. Decision analysis was performed using Markov models. Strategies were either to treat all patients immediately or to first undergo diagnostic testing by digital subtraction angiography or computed tomography angiography to assess for angiographic vasospasm, or computed tomography perfusion to assess for perfusion deficits. According to current practice guidelines, treatment consisted of induced hypertension. Outcomes were survival in terms of life-years and quality-adjusted life-years. When treatment was assumed to be ineffective in nonvasospasm patients, Treat All and digital subtraction angiography were equivalent strategies; when a moderate treatment effect was assumed in nonvasospasm patients, Treat All became the superior strategy. Treating all patients was also superior to selecting patients for treatment via computed tomography perfusion. One-way sensitivity analyses demonstrated that the models were robust; 2- and 3-way sensitivity analyses with variation of disease and treatment parameters reinforced dominance of the Treat All strategy. Imaging studies to test for the presence of angiographic vasospasm or perfusion deficits in patients with clinical DCI do not seem helpful in selecting which patients should undergo treatment and may not improve outcomes. Future directions include validating these results in prospective cohort studies. © 2014 American Heart Association, Inc.
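A minimal Markov cohort model in the spirit of the analysis above: two strategies differ only in their annual transition matrices, and expected QALYs are accumulated over a fixed horizon. The states, transition probabilities, and utilities below are invented for illustration and do not reproduce the paper's model.

    import numpy as np

    # Toy 3-state Markov cohort model (Well, Disabled, Dead); annual cycle.
    # Transition probabilities and utilities are illustrative only.
    P_treat_all = np.array([[0.85, 0.10, 0.05],
                            [0.00, 0.90, 0.10],
                            [0.00, 0.00, 1.00]])
    P_test_first = np.array([[0.82, 0.12, 0.06],   # slightly worse if treatment
                             [0.00, 0.90, 0.10],   # is delayed by testing
                             [0.00, 0.00, 1.00]])
    utilities = np.array([1.0, 0.6, 0.0])          # QALY weight per state

    def qalys(P, horizon=30):
        state = np.array([1.0, 0.0, 0.0])          # cohort starts Well
        total = 0.0
        for _ in range(horizon):
            total += state @ utilities             # QALYs accrued this cycle
            state = state @ P                      # advance the cohort one year
        return total

    print("Treat All :", round(qalys(P_treat_all), 2), "QALYs")
    print("Test first:", round(qalys(P_test_first), 2), "QALYs")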
Fault trees for decision making in systems analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lambert, Howard E.
1975-10-09
The application of fault tree analysis (FTA) to system safety and reliability is presented within the framework of system safety analysis. The concepts and techniques involved in manual and automated fault tree construction are described and their differences noted. The theory of mathematical reliability pertinent to FTA is presented with emphasis on engineering applications. An outline of the quantitative reliability techniques of the Reactor Safety Study is given. Concepts of probabilistic importance are presented within the fault tree framework and applied to the areas of system design, diagnosis and simulation. The computer code IMPORTANCE ranks basic events and cut sets according to a sensitivity analysis. A useful feature of the IMPORTANCE code is that it can accept relative failure data as input. The output of the IMPORTANCE code can assist an analyst in finding weaknesses in system design and operation, suggest the most optimal course of system upgrade, and determine the optimal location of sensors within a system. A general simulation model of system failure in terms of fault tree logic is described. The model is intended for efficient diagnosis of the causes of system failure in the event of a system breakdown. It can also be used to assist an operator in making decisions under a time constraint regarding the future course of operations. The model is well suited for computer implementation. New results incorporated in the simulation model include an algorithm to generate repair checklists on the basis of fault tree logic and a one-step-ahead optimization procedure that minimizes the expected time to diagnose system failure.
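The flavor of an importance ranking like the one the IMPORTANCE code produces can be conveyed with a toy fault tree. The sketch below enumerates basic-event states for TOP = A OR (B AND C), whose minimal cut sets are {A} and {B, C}, computes the top-event probability exactly, and ranks events by Birnbaum importance (the sensitivity of P(TOP) to each event's failure probability). Event names and probabilities are made up.

    from itertools import product

    # Toy fault tree: TOP = A OR (B AND C); minimal cut sets {A}, {B, C}.
    p = {"A": 0.01, "B": 0.05, "C": 0.10}

    def top(state):
        # fault tree logic for the top event
        return state["A"] or (state["B"] and state["C"])

    def p_top(probs):
        # exact top-event probability by enumeration (fine for small trees)
        total = 0.0
        for bits in product([0, 1], repeat=len(probs)):
            s = dict(zip(probs, bits))
            pr = 1.0
            for e, v in s.items():
                pr *= probs[e] if v else 1 - probs[e]
            if top(s):
                total += pr
        return total

    print(f"P(TOP) = {p_top(p):.5f}")

    # Birnbaum importance: P(TOP | event failed) - P(TOP | event working)
    for e in p:
        print(f"Birnbaum importance of {e}: "
              f"{p_top({**p, e: 1.0}) - p_top({**p, e: 0.0}):.4f}")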
NASA Technical Reports Server (NTRS)
Kriegler, F. J.; Gordon, M. F.; Mclaughlin, R. H.; Marshall, R. E.
1975-01-01
The MIDAS (Multivariate Interactive Digital Analysis System) processor is a high-speed processor designed to process multispectral scanner data (from Landsat, EOS, aircraft, etc.) quickly and cost-effectively to meet the requirements of users of remote sensor data, especially from very large areas. MIDAS consists of a fast multipipeline preprocessor and classifier, an interactive color display and color printer, and a medium scale computer system for analysis and control. The system is designed to process data having as many as 16 spectral bands per picture element at rates of 200,000 picture elements per second into as many as 17 classes using a maximum likelihood decision rule.
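The maximum likelihood decision rule used by MIDAS can be sketched for synthetic multispectral pixels: each class is modeled as a Gaussian over band values, and a pixel is assigned to the class with the highest log-likelihood. The class signatures below are invented, and a shared covariance is assumed purely for brevity; this is not the MIDAS implementation.

    import numpy as np

    rng = np.random.default_rng(5)

    # Synthetic pixels: 4 spectral bands, 3 classes with Gaussian signatures
    means = np.array([[30, 50, 40, 20],
                      [60, 80, 70, 50],
                      [90, 40, 30, 80]], dtype=float)
    cov = 25.0 * np.eye(4)               # shared covariance, purely for brevity
    X = np.vstack([rng.multivariate_normal(m, cov, size=200) for m in means])
    y = np.repeat([0, 1, 2], 200)

    inv = np.linalg.inv(cov)
    logdet = np.linalg.slogdet(cov)[1]

    def classify(x):
        # maximum likelihood rule: the class whose Gaussian gives the pixel's
        # band vector the highest log-likelihood wins
        ll = [-0.5 * (logdet + (x - m) @ inv @ (x - m)) for m in means]
        return int(np.argmax(ll))

    pred = np.array([classify(x) for x in X])
    print("training accuracy:", round(float((pred == y).mean()), 3))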
Castelló Castañeda, Coral; Ríos Santos, Jose Vicente; Bullón, Pedro
2008-01-01
Dentists are currently required to make multiple diagnoses and treatment decisions every day, and the information necessary to achieve this satisfactorily doubles in volume every five years. Knowledge therefore rapidly becomes out of date, so that it is often impossible to remember established information and assimilate new concepts. This may result in a significant lack of knowledge in the future, which would jeopardize the success of treatments. To remedy and prevent this situation, we nowadays have access to modern computing systems, with extensive databases, which help us to retain the information necessary for daily practice and access it instantaneously. The objectives of this study are therefore to determine how widespread the use of computing is in this environment and to determine the opinion of students and qualified dentists as regards its use in dentistry. Ninety people were chosen to take part in the study, divided into the following groups: students, newly qualified dentists, and experts. It was found that a high percentage (93.30%) use a computer, but that their level of computing knowledge is predominantly moderate. The place where a computer is used most is the home, which suggests that the majority own a computer. Analysis of the results obtained for the evaluation of computers in teaching showed that the participants thought that computing saved a great deal of time and had great potential for providing an image (in terms of marketing), and they considered it a very innovative and stimulating tool.
DOE Office of Scientific and Technical Information (OSTI.GOV)
RAYBOURN,ELAINE M.; FORSYTHE,JAMES C.
2001-08-01
This report documents an exploratory FY 00 LDRD project that sought to demonstrate the first steps toward a realistic computational representation of the variability encountered in individual human behavior. Realism, as conceptualized in this project, required that the human representation address the underlying psychological, cultural, physiological, and environmental stressors. The present report outlines the researchers' approach to representing cognitive, cultural, and physiological variability of an individual in an ambiguous situation while faced with a high-consequence decision that would greatly impact subsequent events. The present project was framed around a sensor-shooter scenario as a soldier interacts with an unexpected target (two young Iraqi girls). A software model of the "Sensor Shooter" scenario from Desert Storm was developed in which the framework consisted of a computational instantiation of Recognition Primed Decision Making in the context of a Naturalistic Decision Making model [1]. Recognition Primed Decision Making was augmented with an underlying foundation based on our current understanding of human neurophysiology and its relationship to human cognitive processes. While the Gulf War scenario that constitutes the framework for the Sensor Shooter prototype is highly specific, the human decision architecture and the subsequent simulation are applicable to other problems similar in concept, intensity, and degree of uncertainty. The goal was to provide initial steps toward a computational representation of human variability in cultural, cognitive, and physiological state in order to attain a better understanding of the full depth of human decision-making processes in the context of ambiguity, novelty, and heightened arousal.
Dynamic remapping decisions in multi-phase parallel computations
NASA Technical Reports Server (NTRS)
Nicol, D. M.; Reynolds, P. F., Jr.
1986-01-01
The effectiveness of any given mapping of workload to processors in a parallel system is dependent on the stochastic behavior of the workload. Program behavior is often characterized by a sequence of phases, with phase changes occurring unpredictably. During a phase, the behavior is fairly stable, but may become quite different during the next phase. Thus a workload assignment generated for one phase may hinder performance during the next phase. We consider the problem of deciding whether to remap a parallel computation in the face of uncertainty in remapping's utility. Fundamentally, it is necessary to balance the expected remapping performance gain against the delay cost of remapping. This paper treats this problem formally by constructing a probabilistic model of a computation with at most two phases. We use stochastic dynamic programming to show that the remapping decision policy which minimizes the expected running time of the computation has an extremely simple structure: the optimal decision at any step is found by comparing the probability of remapping gain against a threshold. This theoretical result stresses the importance of detecting a phase change, and assessing the possibility of gain from remapping. We also empirically study the sensitivity of optimal performance to an imprecise decision threshold. Under a wide range of model parameter values, we find nearly optimal performance if remapping is chosen simply when the gain probability is high. These results strongly suggest that except in extreme cases, the remapping decision problem is essentially that of dynamically determining whether gain can be achieved by remapping after a phase change; precise quantification of the decision model parameters is not necessary.
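The threshold structure of such a policy can be illustrated with a small simulation: at each step the runtime observes a noisy estimate of the probability that remapping would help, and remaps when that estimate exceeds a threshold, paying a fixed delay. The cost numbers and noise model below are arbitrary, not the paper's; the sketch only shows that a suitably high threshold beats both eager and reluctant remapping.

    import numpy as np

    rng = np.random.default_rng(9)

    def mean_completion_time(threshold, n_runs=2000, n_steps=100):
        times = []
        for _ in range(n_runs):
            t, mapped_well, changed = 0.0, True, False
            for _ in range(n_steps):
                if not changed and rng.random() < 0.05:
                    changed = True                 # phase change: mapping degrades
                    mapped_well = False
                # noisy runtime estimate of the probability remapping helps
                p_gain = (0.1 if mapped_well else 0.8) + rng.normal(0.0, 0.1)
                if p_gain > threshold:
                    t += 5.0                       # remapping delay cost
                    mapped_well = True             # remapping restores a good mapping
                t += 1.0 if mapped_well else 2.0   # per-step execution time
            times.append(t)
        return np.mean(times)

    for th in (0.3, 0.5, 0.7, 0.9):
        print(f"threshold {th:.1f}: mean completion time {mean_completion_time(th):6.1f}")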
Health decision making: lynchpin of evidence-based practice.
Spring, Bonnie
2008-01-01
Health decision making is both the lynchpin and the least developed aspect of evidence-based practice. The evidence-based practice process requires integrating the evidence with consideration of practical resources and patient preferences and doing so via a process that is genuinely collaborative. Yet, the literature is largely silent about how to accomplish integrative, shared decision making. Implications for evidence-based practice are discussed for 2 theories of clinician decision making (expected utility and fuzzy trace) and 2 theories of patient health decision making (transtheoretical model and reasoned action). Three suggestions are offered. First, it would be advantageous to have theory-based algorithms that weight and integrate the 3 data strands (evidence, resources, preferences) in different decisional contexts. Second, patients, not providers, make the decisions of greatest impact on public health, and those decisions are behavioral. Consequently, theory explicating how provider-patient collaboration can influence patient lifestyle decisions made miles from the provider's office is greatly needed. Third, although the preponderance of data on complex decisions supports a computational approach, such an approach to evidence-based practice is too impractical to be widely applied at present. More troublesomely, until patients come to trust decisions made computationally more than they trust their providers' intuitions, patient adherence will remain problematic. A good theory of integrative, collaborative health decision making remains needed.
Puskaric, Marin; von Helversen, Bettina; Rieskamp, Jörg
2017-08-01
Social information such as observing others can improve performance in decision making. In particular, social information has been shown to be useful when finding the best solution on one's own is difficult, costly, or dangerous. However, past research suggests that when making decisions people do not always consider other people's behaviour when it is at odds with their own experiences. Furthermore, the cognitive processes guiding the integration of social information with individual experiences are still under debate. Here, we conducted two experiments to test whether information about other persons' behaviour influenced people's decisions in a classification task. Furthermore, we examined how social information is integrated with individual learning experiences by testing different computational models. Our results show that social information had a small but reliable influence on people's classifications. The best computational model suggests that in categorization people first make up their own mind based on the non-social information, which is then updated by the social information.
Pilot Study of a Point-of-use Decision Support Tool for Cancer Clinical Trials Eligibility
Breitfeld, Philip P.; Weisburd, Marina; Overhage, J. Marc; Sledge, George; Tierney, William M.
1999-01-01
Many adults with cancer are not enrolled in clinical trials because caregivers do not have the time to match the patient's clinical findings with varying eligibility criteria associated with multiple trials for which the patient might be eligible. The authors developed a point-of-use portable decision support tool (DS-TRIEL) to automate this matching process. The support tool consists of a hand-held computer with a programmable relational database. A two-level hierarchic decision framework was used for the identification of eligible subjects for two open breast cancer clinical trials. The hand-held computer also provides protocol consent forms and schemas to further help the busy oncologist. This decision support tool and the decision framework on which it is based could be used for multiple trials and different cancer sites. PMID:10579605
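As a hedged sketch of the two-level hierarchic matching idea (not the actual DS-TRIEL relational database), the following Python fragment screens a patient record against invented trial criteria: a coarse level-1 filter runs first, and only candidates that pass it are checked against detailed level-2 criteria.

    TRIALS = {
        "BC-01": {  # trial names and criteria invented for illustration
            "level1": lambda p: p["diagnosis"] == "breast cancer",
            "level2": lambda p: p["stage"] in (2, 3) and p["age"] >= 18,
        },
        "BC-02": {
            "level1": lambda p: p["diagnosis"] == "breast cancer",
            "level2": lambda p: p["stage"] == 4 and p["her2_positive"],
        },
    }

    def eligible_trials(patient):
        """Level-1 screening first; only survivors get the detailed check."""
        return [name for name, t in TRIALS.items()
                if t["level1"](patient) and t["level2"](patient)]

    patient = {"diagnosis": "breast cancer", "stage": 3, "age": 54,
               "her2_positive": False}
    print(eligible_trials(patient))  # -> ['BC-01']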
A qualitative analysis of how advanced practice nurses use clinical decision support systems.
Weber, Scott
2007-12-01
The purpose of this study was to generate a grounded theory that will reflect the experiences of advanced practice nurses (APNs) working as critical care nurse practitioners (NPs) and clinical nurse specialists (CNS) with computer-based decision-making systems. A study design using grounded theory qualitative research methods and convenience sampling was employed in this study. Twenty-three APNs (13 CNS and 10 NPs) were recruited from 16 critical care units located in six large urban medical centers in the U.S. Midwest. Single-structured in-depth interviews with open-ended audio-taped questions were conducted with each APN. Through this process, APNs defined what they consider to be relevant themes and patterns of clinical decision system use in their critical care practices, and they identified the interrelatedness of the conceptual categories that emerged from the results. Data were analyzed using the constant comparative analysis method of qualitative research. APN participants were predominantly female, white/non-Hispanic, had a history of access to the clinical decision system used in their critical care settings for an average of 14 months, and had attended a formal training program to learn how to use clinical decision systems. "Forecasting decision outcomes," which was defined as the voluntary process employed to forecast the outcomes of patient care decisions in critical care prior to actual decision making, was the core variable describing system use that emerged from the responses. This variable consisted of four user constructs or components: (a) users' perceptions of their initial system learning experience, (b) users' sense of how well they understand how system technology works, (c) users' understanding of how system inferences are created or derived, and (d) users' relative trust of system-derived data. Each of these categories was further described through the grounded theory research process, and the relationships between the categories were identified. The findings of this study suggest that the main reason critical care APNs choose to integrate clinical decision systems into their practices is to provide an objective, scientifically derived, technology-based backup for human forecasting of the outcomes of patient care decisions prior to their actual decision making. Implications for nursing, health care, and technology research are presented.
van der Krieke, Lian; Emerencia, Ando C; Boonstra, Nynke; Wunderink, Lex; de Jonge, Peter; Sytema, Sjoerd
2013-10-07
Mental health policy makers encourage the development of electronic decision aids to increase patient participation in medical decision making. Evidence is needed to determine whether these decision aids are helpful in clinical practice and whether they lead to increased patient involvement and better outcomes. This study reports the outcome of a randomized controlled trial and process evaluation of a Web-based intervention to facilitate shared decision making for people with psychotic disorders. The study was carried out in a Dutch mental health institution. Patients were recruited from 2 outpatient teams for patients with psychosis (N=250). Patients in the intervention condition (n=124) were provided an account to access a Web-based information and decision tool aimed to support patients in acquiring an overview of their needs and appropriate treatment options provided by their mental health care organization. Patients were given the opportunity to use the Web-based tool either on their own (at their home computer or at a computer of the service) or with the support of an assistant. Patients in the control group received care as usual (n=126). Half of the patients in the sample were patients experiencing a first episode of psychosis; the other half were patients with a chronic psychosis. Primary outcome was patient-perceived involvement in medical decision making, measured with the Combined Outcome Measure for Risk Communication and Treatment Decision-making Effectiveness (COMRADE). Process evaluation consisted of questionnaire-based surveys, open interviews, and researcher observation. In all, 73 patients completed the follow-up measurement and were included in the final analysis (response rate 29.2%). More than one-third (48/124, 38.7%) of the patients who were provided access to the Web-based decision aid used it, and most used its full functionality. No differences were found between the intervention and control conditions on perceived involvement in medical decision making (COMRADE satisfaction with communication: F1,68=0.422, P=.52; COMRADE confidence in decision: F1,67=0.086, P=.77). In addition, results of the process evaluation suggest that the intervention did not optimally fit in with routine practice of the participating teams. The development of electronic decision aids to facilitate shared medical decision making is encouraged and many people with a psychotic disorder can work with them. This holds for both first-episode patients and long-term care patients, although the latter group might need more assistance. However, results of this paper could not support the assumption that the use of electronic decision aids increases patient involvement in medical decision making. This may be because of weak implementation of the study protocol and a low response rate.
How to Say "No" to a Nonword: A Leaky Competing Accumulator Model of Lexical Decision
ERIC Educational Resources Information Center
Dufau, Stephane; Grainger, Jonathan; Ziegler, Johannes C.
2012-01-01
We describe a leaky competing accumulator (LCA) model of the lexical decision task that can be used as a response/decision module for any computational model of word recognition. The LCA model uses evidence for a word, operationalized as some measure of lexical activity, as input to the "YES" decision node. Input to the "NO" decision node is…
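A rough, self-contained Python rendering of a leaky competing accumulator for lexical decision follows. Driving the "NO" node with the complement of lexical activity is our simplifying assumption (the paper's actual NO input is more elaborate), and the leak, inhibition, and noise values are illustrative.

    import random

    def lca_trial(lexical_activity, threshold=1.0, leak=0.1,
                  inhibition=0.1, dt=0.01, noise_sd=0.05, max_steps=5000):
        """One simulated trial: two leaky, mutually inhibiting accumulators."""
        yes = no = 0.0
        input_yes = lexical_activity        # evidence for "word"
        input_no = 1.0 - lexical_activity   # assumed complement drives "NO"
        for step in range(max_steps):
            d_yes = (input_yes - leak * yes - inhibition * no) * dt
            d_no = (input_no - leak * no - inhibition * yes) * dt
            yes = max(0.0, yes + d_yes + random.gauss(0, noise_sd) * dt ** 0.5)
            no = max(0.0, no + d_no + random.gauss(0, noise_sd) * dt ** 0.5)
            if yes >= threshold:
                return "YES", step * dt     # response and decision time
            if no >= threshold:
                return "NO", step * dt
        return "timeout", max_steps * dt

    print(lca_trial(0.8))  # high lexical activity: usually a fast "YES"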
Sequential decision making in computational sustainability via adaptive submodularity
Krause, Andreas; Golovin, Daniel; Converse, Sarah J.
2015-01-01
Many problems in computational sustainability require making a sequence of decisions in complex, uncertain environments. Such problems are notoriously difficult. In this article, we review the recently discovered notion of adaptive submodularity, an intuitive diminishing returns condition that generalizes the classical notion of submodular set functions to sequential decision problems. Problems exhibiting the adaptive submodularity property can be efficiently and provably near-optimally solved using simple myopic policies. We illustrate this concept in several case studies of interest in computational sustainability: First, we demonstrate how it can be used to efficiently plan for resolving uncertainty in adaptive management scenarios. Second, we show how it applies to dynamic conservation planning for protecting endangered species, a case study carried out in collaboration with the US Geological Survey and the US Fish and Wildlife Service.
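The "simple myopic policies" mentioned here are greedy: at every step, pick the action with the largest expected marginal gain given everything observed so far. Below is a toy Python sketch for an invented species-survey setting; the site names and detection probabilities are placeholders, and the conditioning on past observations is deliberately trivial.

    import random

    def greedy_survey(sites, detect_prob, budget, observe):
        """Adaptively pick `budget` sites, observing outcomes as we go."""
        observed = {}
        for _ in range(budget):
            remaining = [s for s in sites if s not in observed]
            if not remaining:
                break
            # Myopic step: maximize expected marginal detection gain.
            best = max(remaining, key=lambda s: detect_prob[s])
            observed[best] = observe(best)  # uncertainty resolves adaptively
        return observed

    p = {"A": 0.9, "B": 0.4, "C": 0.7, "D": 0.2}
    result = greedy_survey(list(p), p, budget=2,
                           observe=lambda s: random.random() < p[s])
    print(result)  # e.g. {'A': True, 'C': False}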
NASA Astrophysics Data System (ADS)
Bhardwaj, Jyotirmoy; Gupta, Karunesh K.; Gupta, Rajiv
2018-02-01
New concepts and techniques are replacing traditional methods of measuring water quality parameters. This paper introduces a cyber-physical system (CPS) approach for water quality assessment in a distribution network. Cyber-physical systems with embedded sensors, processors and actuators can be designed to sense and interact with the water environment. The proposed CPS comprises a sensing framework integrating five different water quality parameter sensor nodes with a soft computing framework for computational modelling. The soft computing framework uses Python for the user interface and fuzzy sciences for decision making. Introducing multiple sensors in a water distribution network generates a huge number of data matrices, which are sometimes highly complex, difficult to understand, and too convoluted for effective decision making. Therefore, the proposed system framework also intends to simplify the complexity of the obtained sensor data matrices and to support decision making for water engineers through a soft computing framework. The target of this proposed research is to provide a simple and efficient method to identify and detect the presence of contamination in a water distribution network using applications of CPS.
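As a loose illustration of the fuzzy decision-making layer (the real system fuses five parameters; this toy uses one, with invented thresholds), a triangular-membership rule in Python might look like:

    def tri(x, a, b, c):
        """Triangular membership function peaking at b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x < b else (c - x) / (c - b)

    def contamination_alert(ph):
        """Degree of alert in [0, 1] from a single pH reading."""
        safe = tri(ph, 6.0, 7.0, 8.5)        # membership in "safe"
        acidic = tri(ph, 3.0, 5.0, 6.5)      # membership in "too acidic"
        alkaline = tri(ph, 8.0, 10.0, 12.0)  # membership in "too alkaline"
        return max(acidic, alkaline) * (1 - safe)

    print(round(contamination_alert(6.2), 2))  # mildly acidic sample -> 0.16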
Memory states influence value-based decisions.
Duncan, Katherine D; Shohamy, Daphna
2016-11-01
Using memory to guide decisions allows past experience to improve future outcomes. However, the circumstances that modulate how and when memory influences decisions are not well understood. Here, we report that the use of memories to guide decisions depends on the context in which these decisions are made. We show that decisions made in the context of familiar images are more likely to be influenced by past events than are decisions made in the context of novel images (Experiment 1), that this bias persists even when a temporal gap is introduced between the image presentation and the decision (Experiment 2), and that contextual novelty facilitates value learning whereas familiarity facilitates the retrieval and use of previously learned values (Experiment 3). These effects are consistent with neurobiological and computational models of memory, which propose that familiar images evoke a lingering "retrieval state" that facilitates the recollection of other episodic memories. Together, these experiments highlight the importance of episodic memory for decision-making and provide an example of how computational and neurobiological theories can lead to new insights into how and when different types of memories guide our choices. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
[Hardware for graphics systems].
Goetz, C
1991-02-01
In all personal computer applications, be it for private or professional use, the decision of which "brand" of computer to buy is of central importance. In the USA Apple computers are mainly used in universities, while in Europe computers of the so-called "industry standard" by IBM (or clones thereof) have been increasingly used for many years. Independently of any brand name considerations, the computer components purchased must meet the current (and projected) needs of the user. Graphic capabilities and standards, processor speed, the use of co-processors, as well as input and output devices such as "mouse", printers and scanners are discussed. This overview is meant to serve as a decision aid. Potential users are given a short but detailed summary of current technical features.
Forest management and economics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Buongiorno, J.; Gilless, J.K.
1987-01-01
This volume provides a survey of quantitative methods, guiding the reader through formulation and analysis of models that address forest management problems. The authors use simple mathematics, graphics, and short computer programs to explain each method. Emphasizing applications, they discuss linear, integer, dynamic, and goal programming; simulation; network modeling; and econometrics, as these relate to problems of determining economic harvest schedules in even-aged and uneven-aged forests, the evaluation of forest policies, multiple-objective decision making, and more.
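For a flavor of the linear-programming material the volume covers, here is a deliberately tiny harvest-scheduling LP in Python/SciPy; the areas, prices, and even-flow bounds are invented for illustration.

    from scipy.optimize import linprog

    # Maximize revenue 300*x1 + 320*x2 over hectares cut in two periods
    # (linprog minimizes, so the objective is negated).
    c = [-300, -320]
    # x1 + x2 <= 100 ha total; even-flow constraint: |x1 - x2| <= 10.
    A_ub = [[1, 1], [1, -1], [-1, 1]]
    b_ub = [100, 10, 10]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
    print(res.x, -res.fun)  # optimal cut per period [45. 55.], revenue 31100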
Experiences running NASTRAN on the Microvax 2 computer
NASA Technical Reports Server (NTRS)
Butler, Thomas G.; Mitchell, Reginald S.
1987-01-01
The MicroVAX operates NASTRAN so well that the only detectable difference in its operation compared to an 11/780 VAX is in the execution time. On the modest installation described here, the engineer has all of the tools he needs to do an excellent job of analysis. System configuration decisions, system sizing, preparation of the system disk, definition of user quotas, installation, monitoring of system errors, and operation policies are discussed.
Towards a Computational Analysis of Status and Leadership Styles on FDA Panels
NASA Astrophysics Data System (ADS)
Broniatowski, David A.; Magee, Christopher L.
Decisions by committees of technical experts are increasingly impacting society. These decision-makers are typically embedded within a web of social relations. Taken as a whole, these relations define an implicit social structure which can influence the decision outcome. Aspects of this structure are founded on interpersonal affinity between parties to the negotiation, on assigned roles, and on the recognition of status characteristics, such as relevant domain expertise. This paper builds upon a methodology aimed at extracting an explicit representation of such social structures using meeting transcripts as a data source. Whereas earlier results demonstrated that the method presented here can identify groups of decision-makers with a contextual affinity (i.e., membership in a given medical specialty or voting clique), we now can extract meaningful status hierarchies, and can identify differing facilitation styles among committee chairs. The method is demonstrated on transcripts of U.S. Food and Drug Administration (FDA) advisory panel meetings; nevertheless, the approach presented here is extensible to other domains and requires only a meeting transcript as input.
Mean-Field-Game Model for Botnet Defense in Cyber-Security
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kolokoltsov, V. N., E-mail: v.kolokoltsov@warwick.ac.uk; Bensoussan, A.
We initiate the analysis of the response of computer owners to various offers of defence systems against a cyber-hacker (for instance, a botnet attack), as a stochastic game of a large number of interacting agents. We introduce a simple mean-field game that models their behavior. It takes into account both the random process of the propagation of the infection (controlled by the botnet herder) and the decision-making process of customers. Its stationary version turns out to be exactly solvable (but not at all trivial) under an additional natural assumption that the execution time of the customers' decisions (say, switching the defence system on or off) is much faster than the infection rates.
2014-01-01
Background Patient decision aids (PtDA) are developed to facilitate informed, value-based decisions about health. Research suggests that even when informed with necessary evidence and information, cognitive errors can prevent patients from choosing the option that is most congruent with their own values. We sought to utilize principles of behavioural economics to develop a computer application that presents information from conventional decision aids in a way that reduces these errors, subsequently promoting higher quality decisions. Method The Dynamic Computer Interactive Decision Application (DCIDA) was developed to target four common errors that can impede quality decision making with PtDAs: unstable values, order effects, overweighting of rare events, and information overload. Healthy volunteers were recruited to an interview to use three PtDAs converted to the DCIDA on a computer equipped with an eye tracker. Participants first used a conventional PtDA and then the DCIDA version. User testing assessed whether respondents found the software both usable, evaluated using a) eye-tracking, b) the system usability scale, and c) user verbal responses from a 'think aloud' protocol; and useful, evaluated using a) eye-tracking, b) whether preferences for options were changed, and c) the decisional conflict scale. Results Of the 20 participants recruited to the study, 11 were male (55%), the mean age was 35, 18 had at least a high school education (90%), and 8 (40%) had a college or university degree. Eye-tracking results, alongside a mean system usability scale score of 73 (range 68–85), indicated a reasonable degree of usability for the DCIDA. The think aloud study suggested areas for further improvement. The DCIDA also appeared to be useful to participants, in that subjects focused more on the features of the decision that were most important to them (21% increase in time spent focusing on the most important feature). Seven subjects (25%) changed their preferred option when using DCIDA. Conclusion Preliminary results suggest that DCIDA has potential to improve the quality of patient decision-making. Next steps include larger studies to test individual components of DCIDA and feasibility testing with patients making real decisions. PMID:25084808
Development of a model-based flood emergency management system in Yujiang River Basin, South China
NASA Astrophysics Data System (ADS)
Zeng, Yong; Cai, Yanpeng; Jia, Peng; Mao, Jiansu
2014-06-01
Flooding is the most frequent disaster in China. It affects people's lives and properties, causing considerable economic loss. Flood forecasting and the operation of reservoirs are important in flood emergency management. Although great progress has been achieved in flood forecasting and reservoir operation through the use of computers, network technology, and geographic information system technology in China, the prediction accuracy of models is not satisfactory due to the unavailability of real-time monitoring data. Also, real-time flood control scenario analysis is not effective in many regions and can seldom provide online decision support functions. In this research, a decision support system for real-time flood forecasting in Yujiang River Basin, South China (DSS-YRB), is introduced. This system is based on hydrological and hydraulic mathematical models. The conceptual framework and detailed components of the proposed DSS-YRB are illustrated; the system employs real-time rainfall data conversion, model-driven hydrologic forecasting, model calibration, data assimilation methods, and reservoir operational scenario analysis. The multi-tiered architecture offers great flexibility, portability, reusability, and reliability. The case study results show that the development and application of a decision support system for real-time flood forecasting and operation is beneficial for flood control.
Dypas: A dynamic payload scheduler for shuttle missions
NASA Technical Reports Server (NTRS)
Davis, Stephen
1988-01-01
Decision and analysis systems have broad and very practical applications in the human decision-making process. These software systems range from the help sections in simple accounting packages to the more complex computer configuration programs. Dypas is a decision and analysis system that aids prelaunch shuttle scheduling and has added functionality to aid the rescheduling done in flight. Dypas is written in Common Lisp on a Symbolics Lisp machine. Dypas differs from other scheduling programs in that it can draw its knowledge from different rule bases and apply them to different rule interpretation schemes. The system has been coded with Flavors, an object-oriented extension to Common Lisp on the Symbolics hardware. This allows implementation of objects (experiments) to better match the problem definition, and allows a more coherent solution space to be developed. Dypas was originally developed to test a programmer's aptitude toward Common Lisp and the Symbolics software environment. Since then the system has grown into a large software effort involving several programmers and researchers. Dypas currently uses two expert systems and three inferencing procedures to generate a many-object schedule. The paper reviews the abilities of Dypas and comments on its functionality.
Westreich, Daniel; Lessler, Justin; Funk, Michele Jonsson
2010-08-01
Propensity scores for the analysis of observational data are typically estimated using logistic regression. Our objective in this review was to assess machine learning alternatives to logistic regression, which may accomplish the same goals but with fewer assumptions or greater accuracy. We identified alternative methods for propensity score estimation and/or classification from the public health, biostatistics, discrete mathematics, and computer science literature, and evaluated these algorithms for applicability to the problem of propensity score estimation, potential advantages over logistic regression, and ease of use. We identified four techniques as alternatives to logistic regression: neural networks, support vector machines, decision trees (classification and regression trees [CART]), and meta-classifiers (in particular, boosting). Although the assumptions of logistic regression are well understood, those assumptions are frequently ignored. All four alternatives have advantages and disadvantages compared with logistic regression. Boosting (meta-classifiers) and, to a lesser extent, decision trees (particularly CART), appear to be most promising for use in the context of propensity score analysis, but extensive simulation studies are needed to establish their utility in practice. Copyright (c) 2010 Elsevier Inc. All rights reserved.
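A compact Python/scikit-learn sketch of the comparison this review discusses follows, on synthetic data whose treatment-assignment mechanism contains a nonlinearity that plain logistic regression misses; all data and settings are illustrative, not from the review.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.ensemble import GradientBoostingClassifier

    rng = np.random.default_rng(0)
    X = rng.normal(size=(1000, 4))
    # True assignment mechanism includes a squared term the logistic
    # model (without engineered features) cannot capture.
    logit = 0.8 * X[:, 0] + 1.2 * X[:, 1] ** 2 - 1.0
    treated = rng.random(1000) < 1 / (1 + np.exp(-logit))

    # Propensity scores: parametric baseline vs. boosting meta-classifier.
    ps_logistic = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]
    ps_boosted = GradientBoostingClassifier().fit(X, treated).predict_proba(X)[:, 1]
    print(ps_logistic[:3].round(2), ps_boosted[:3].round(2))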
NASA Astrophysics Data System (ADS)
Rimland, Jeffrey; McNeese, Michael; Hall, David
2013-05-01
Although the capability of computer-based artificial intelligence techniques for decision-making and situational awareness has seen notable improvement over the last several decades, the current state-of-the-art still falls short of creating computer systems capable of autonomously making complex decisions and judgments in many domains where data is nuanced and accountability is high. However, there is a great deal of potential for hybrid systems in which software applications augment human capabilities by focusing the analyst's attention to relevant information elements based on both a priori knowledge of the analyst's goals and the processing/correlation of a series of data streams too numerous and heterogeneous for the analyst to digest without assistance. Researchers at Penn State University are exploring ways in which an information framework influenced by Klein's (Recognition Primed Decision) RPD model, Endsley's model of situational awareness, and the Joint Directors of Laboratories (JDL) data fusion process model can be implemented through a novel combination of Complex Event Processing (CEP) and Multi-Agent Software (MAS). Though originally designed for stock market and financial applications, the high performance data-driven nature of CEP techniques provide a natural compliment to the proven capabilities of MAS systems for modeling naturalistic decision-making, performing process adjudication, and optimizing networked processing and cognition via the use of "mobile agents." This paper addresses the challenges and opportunities of such a framework for augmenting human observational capability as well as enabling the ability to perform collaborative context-aware reasoning in both human teams and hybrid human / software agent teams.
A scientific workflow framework for ¹³C metabolic flux analysis.
Dalman, Tolga; Wiechert, Wolfgang; Nöh, Katharina
2016-08-20
Metabolic flux analysis (MFA) with ¹³C labeling data is a high-precision technique to quantify intracellular reaction rates (fluxes). One of the major challenges of ¹³C MFA is the interactivity of the computational workflow according to which the fluxes are determined from the input data (metabolic network model, labeling data, and physiological rates). Here, the workflow assembly is inevitably determined by the scientist, who has to consider interacting biological, experimental, and computational aspects. Decision-making is context dependent and requires expertise, rendering an automated evaluation process hardly possible. Here, we present a scientific workflow framework (SWF) for creating, executing, and controlling on demand ¹³C MFA workflows. ¹³C MFA-specific tools and libraries, such as the high-performance simulation toolbox 13CFLUX2, are wrapped as web services and thereby integrated into a service-oriented architecture. Besides workflow steering, the SWF features transparent provenance collection and enables full flexibility for ad hoc scripting solutions. To handle compute-intensive tasks, cloud computing is supported. We demonstrate how the challenges posed by ¹³C MFA workflows can be solved with our approach on the basis of two proof-of-concept use cases. Copyright © 2015 Elsevier B.V. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wetter, Michael; Fuchs, Marcus; Nouidui, Thierry
This paper discusses design decisions for exporting Modelica thermofluid flow components as Functional Mockup Units. The purpose is to provide guidelines that will allow building energy simulation programs and HVAC equipment manufacturers to effectively use FMUs for modeling of HVAC components and systems. We provide an analysis for direct input-output dependencies of such components and discuss how these dependencies can lead to algebraic loops that are formed when connecting thermofluid flow components. Based on this analysis, we provide recommendations that increase the computing efficiency of such components and systems that are formed by connecting multiple components. We explain what code optimizations are lost when providing thermofluid flow components as FMUs rather than Modelica code. We present an implementation of a package for FMU export of such components, explain the rationale for selecting the connector variables of the FMUs and finally provide computing benchmarks for different design choices. It turns out that selecting temperature rather than specific enthalpy as input and output signals does not lead to a measurable increase in computing time, but selecting nine small FMUs rather than a large FMU increases computing time by 70%.
Dynamic Decision Making under Uncertainty and Partial Information
2017-01-30
In order to address these problems, we investigated efficient computational methodologies for dynamic decision making under uncertainty and partial information. In the course of this research, we (i) developed and studied efficient simulation-based methodologies for dynamic decision making under uncertainty and partial information; (ii) studied the application of these decision-making models and methodologies to practical problems, such as those…
ERIC Educational Resources Information Center
Henard, Ralph E.
Possible future developments in artificial intelligence (AI) as well as its limitations are considered that have implications for institutional research in higher education, and especially decision making and decision support systems. It is noted that computer software programs have been developed that store knowledge and mimic the decision-making…
Networks and games for precision medicine.
Biane, Célia; Delaplace, Franck; Klaudel, Hanna
2016-12-01
Recent advances in omics technologies provide the leverage for the emergence of precision medicine, which aims at personalizing therapy to the patient. In this undertaking, computational methods play a central role in assisting physicians in their clinical decision-making by combining data analysis and systems biology modelling. Complex diseases such as cancer or diabetes arise from the intricate interplay of various biological molecules. Therefore, assessing drug efficiency requires studying the effects of elementary perturbations caused by diseases on relevant biological networks. In this paper, we propose a computational framework called Network-Action Game, applied to the best-drug-selection problem, combining Game Theory and discrete models of dynamics (Boolean networks). Decision-making is modelled using Game Theory, which defines the process of drug selection among alternative possibilities, while Boolean networks are used to model the effects of the interplay between disease and drug actions on the patient's molecular system. The actions/strategies of disease and drugs are focused on arc alterations of the interactome. The efficiency of this framework has been evaluated for drug prediction on a model of breast cancer signalling. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
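To ground the arc-alteration idea, here is a toy synchronous Boolean network in Python in which a disease "strategy" rewires an interaction; the nodes, rules, and arcs are entirely invented and far simpler than the breast cancer signalling model evaluated in the paper.

    def step(state, arcs):
        """Synchronous update: a node turns ON if any incoming arc source is ON."""
        incoming = {n: [src for (src, dst) in arcs if dst == n] for n in state}
        return {n: any(state[s] for s in incoming[n]) if incoming[n] else state[n]
                for n in state}

    disease_arcs = {("signal", "kinase"),   # retained interaction
                    ("kinase", "growth"),   # disease-rewired arc
                    ("growth", "growth")}   # self-sustaining loop

    state = {"signal": True, "kinase": False, "survival": False, "growth": False}
    for _ in range(3):
        state = step(state, disease_arcs)
    # 'growth' locks ON under the altered wiring; 'survival' is cut off.
    print(state)

A drug action in this framework would be modelled the same way: as the removal or restoration of arcs, with Game Theory choosing among the candidate alterations.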
McCarthy, Ann Marie; Kleiber, Charmaine; Ataman, Kaan; Street, W. Nick; Zimmerman, M. Bridget; Ersig, Anne L.
2012-01-01
This secondary data analysis used data mining methods to develop predictive models of child risk for distress during a healthcare procedure. Data used came from a study that predicted factors associated with children’s responses to an intravenous catheter insertion while parents provided distraction coaching. From the 255 items used in the primary study, 44 predictive items were identified through automatic feature selection and used to build support vector machine regression models. Models were validated using multiple cross-validation tests and by comparing variables identified as explanatory in the traditional versus support vector machine regression. Rule-based approaches were applied to the model outputs to identify overall risk for distress. A decision tree was then applied to evidence-based instructions for tailoring distraction to characteristics and preferences of the parent and child. The resulting decision support computer application, the Children, Parents and Distraction (CPaD), is being used in research. Future use will support practitioners in deciding the level and type of distraction intervention needed by a child undergoing a healthcare procedure. PMID:22805121
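A hedged sketch of the modeling step described above follows: a support vector regression trained on selected features to predict a distress score, validated by cross-validation. The data here are random stand-ins, not the study's items.

    import numpy as np
    from sklearn.svm import SVR
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(1)
    X = rng.normal(size=(255, 44))   # 44 selected predictive items
    # Synthetic distress score; the real outcome came from observed behavior.
    y = X[:, 0] * 0.5 + rng.normal(scale=0.3, size=255)

    model = SVR(kernel="rbf", C=1.0)
    scores = cross_val_score(model, X, y, cv=5, scoring="r2")
    print(scores.round(2))  # cross-validated fit, one value per fold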
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mather, Barry
The increasing deployment of distribution-connected photovoltaic (DPV) systems requires utilities to complete complex interconnection studies. Relatively simple interconnection study methods worked well for low penetrations of photovoltaic systems, but more complicated quasi-static time-series (QSTS) analysis is required to make better interconnection decisions as DPV penetration levels increase. Tools and methods must be developed to support this. This paper presents a variable-time-step solver for QSTS analysis that significantly shortens the computational time and effort to complete a detailed analysis of the operation of a distribution circuit with many DPV systems. Specifically, it demonstrates that the proposed variable-time-step solver can reduce the required computational time by as much as 84% without introducing any important errors to metrics such as the highest and lowest voltage occurring on the feeder, number of voltage regulator tap operations, and total amount of losses realized in the distribution circuit during a 1-yr period. Further improvement in computational speed is possible with the introduction of only modest errors in these metrics, such as a 91% reduction with less than 5% error when predicting voltage regulator operations.
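The following Python loop is a conceptual sketch of the variable-time-step idea, not the paper's solver: take coarse steps while the driving input (solar irradiance) is steady and refine when it ramps. The power-flow call is a placeholder and the ramp tolerance is invented.

    def run_qsts(irradiance, solve_powerflow, coarse=60, fine=1, ramp_tol=0.05):
        """irradiance: per-second profile (list); steps are in seconds."""
        t, results = 0, []
        while t < len(irradiance) - coarse:
            ramp = abs(irradiance[t + coarse] - irradiance[t])
            step = fine if ramp > ramp_tol else coarse  # refine only on ramps
            results.append((t, solve_powerflow(irradiance[t])))
            t += step
        return results

    # Steady hour, then a ten-minute ramp.
    profile = [0.5] * 3600 + [0.5 + i / 600 for i in range(600)]
    out = run_qsts(profile, solve_powerflow=lambda g: {"pv_output": g})
    print(len(out), "solutions instead of", len(profile))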
TethysCluster: A comprehensive approach for harnessing cloud resources for hydrologic modeling
NASA Astrophysics Data System (ADS)
Nelson, J.; Jones, N.; Ames, D. P.
2015-12-01
Advances in water resources modeling are improving the information that can be supplied to support decisions affecting the safety and sustainability of society. However, as water resources models become more sophisticated and data-intensive they require more computational power to run. Purchasing and maintaining the computing facilities needed to support certain modeling tasks has been cost-prohibitive for many organizations. With the advent of the cloud, the computing resources needed to address this challenge are now available and cost-effective, yet there still remains a significant technical barrier to leveraging these resources. This barrier inhibits many decision makers and even trained engineers from taking advantage of the best science and tools available. Here we present the Python tools TethysCluster and CondorPy, which have been developed to lower the barrier to model computation in the cloud by providing (1) programmatic access to dynamically scalable computing resources, (2) a batch scheduling system to queue and dispatch the jobs to the computing resources, (3) data management for job inputs and outputs, and (4) the ability to dynamically create, submit, and monitor computing jobs. These Python tools leverage HTCondor, the open-source computing-resource and job management software, to offer a flexible and scalable distributed-computing environment. While TethysCluster and CondorPy can be used independently to provision computing resources and perform large modeling tasks, they have also been integrated into Tethys Platform, a development platform for water resources web apps, to enable computing support for modeling workflows and decision-support systems deployed as web apps.
Computational neuroscience across the lifespan: Promises and pitfalls.
van den Bos, Wouter; Bruckner, Rasmus; Nassar, Matthew R; Mata, Rui; Eppinger, Ben
2017-10-13
In recent years, the application of computational modeling in studies on age-related changes in decision making and learning has gained in popularity. One advantage of computational models is that they provide access to latent variables that cannot be directly observed from behavior. In combination with experimental manipulations, these latent variables can help to test hypotheses about age-related changes in behavioral and neurobiological measures at a level of specificity that is not achievable with descriptive analysis approaches alone. This level of specificity can in turn be beneficial to establish the identity of the corresponding behavioral and neurobiological mechanisms. In this paper, we will illustrate applications of computational methods using examples of lifespan research on risk taking, strategy selection and reinforcement learning. We will elaborate on problems that can occur when computational neuroscience methods are applied to data of different age groups. Finally, we will discuss potential targets for future applications and outline general shortcomings of computational neuroscience methods for research on human lifespan development. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
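For readers unfamiliar with the kind of model meant here, this minimal Python delta-rule learner with a softmax choice rule exposes two latent variables (learning rate alpha, inverse temperature beta) of the sort such lifespan studies fit per participant or age group; the task structure and reward probabilities are invented.

    import math, random

    def simulate(trials=100, alpha=0.3, beta=3.0, p_reward=(0.8, 0.2)):
        q = [0.0, 0.0]                      # learned option values (latent)
        choices = []
        for _ in range(trials):
            # Softmax choice rule for two options.
            p0 = 1 / (1 + math.exp(-beta * (q[0] - q[1])))
            a = 0 if random.random() < p0 else 1
            r = 1.0 if random.random() < p_reward[a] else 0.0
            q[a] += alpha * (r - q[a])      # delta-rule value update
            choices.append(a)
        return sum(c == 0 for c in choices) / trials

    print(simulate())  # fraction of choices of the better option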
Student Perceived Importance and Correlations of Selected Computer Literacy Course Topics
ERIC Educational Resources Information Center
Ciampa, Mark
2013-01-01
Traditional college-level courses designed to teach computer literacy are in a state of flux. Today's students have high rates of access to computing technology and computer ownership, leading many policy decision makers to conclude that students already are computer literate and thus computer literacy courses are dinosaurs in a modern digital…
Autonomous perception and decision making in cyber-physical systems
NASA Astrophysics Data System (ADS)
Sarkar, Soumik
2011-07-01
The cyber-physical system (CPS) is a relatively new interdisciplinary technology area that includes the general class of embedded and hybrid systems. CPSs require integration of computation and physical processes that involves the aspects of physical quantities such as time, energy and space during information processing and control. The physical space is the source of information and the cyber space makes use of the generated information to make decisions. This dissertation proposes an overall architecture of autonomous perception-based decision & control of complex cyber-physical systems. Perception involves the recently developed framework of Symbolic Dynamic Filtering for abstraction of the physical world in the cyber space. For example, under this framework, sensor observations from a physical entity are discretized temporally and spatially to generate blocks of symbols, also called words, that form a language. A grammar of a language is the set of rules that determine the relationships among words to build sentences. Subsequently, a physical system is conjectured to be a linguistic source that is capable of generating a specific language. The proposed technology is validated on various (experimental and simulated) case studies that include health monitoring of aircraft gas turbine engines, detection and estimation of fatigue damage in polycrystalline alloys, and parameter identification. Control of complex cyber-physical systems involves distributed sensing, computation, and control, as well as complexity analysis. A novel statistical mechanics-inspired complexity analysis approach is proposed in this dissertation. In such a scenario of networked physical systems, the distribution of physical entities determines the underlying network topology and the interaction among the entities forms the abstract cyber space. It is envisioned that the general contributions made in this dissertation will be useful for potential application areas such as smart power grids and buildings, distributed energy systems, advanced health care procedures, and future ground and air transportation systems.
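A hedged sketch of the symbolization step behind Symbolic Dynamic Filtering follows: partition a sensor time series into a small alphabet, then estimate symbol-transition statistics. The equal-frequency partitioning and alphabet size are our illustrative choices, not the dissertation's.

    import numpy as np

    def symbolize(signal, n_symbols=4):
        """Map each sample to a symbol by equal-frequency partitioning."""
        edges = np.quantile(signal, np.linspace(0, 1, n_symbols + 1)[1:-1])
        return np.digitize(signal, edges)

    def transition_matrix(symbols, n_symbols=4):
        """Row-normalized symbol-to-symbol transition counts."""
        m = np.zeros((n_symbols, n_symbols))
        for a, b in zip(symbols[:-1], symbols[1:]):
            m[a, b] += 1
        return m / np.maximum(m.sum(axis=1, keepdims=True), 1)

    sensor = np.sin(np.linspace(0, 20, 500)) + 0.1 * np.random.randn(500)
    syms = symbolize(sensor)
    print(transition_matrix(syms).round(2))  # the "language" statistics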
Taylor, Richard Andrew; Singh Gill, Harman; Marcolini, Evie G; Meyers, H Pendell; Faust, Jeremy Samuel; Newman, David H
2016-10-01
The objective was to determine the testing threshold for lumbar puncture (LP) in the evaluation of aneurysmal subarachnoid hemorrhage (SAH) after a negative head computed tomography (CT). As a secondary aim we sought to identify clinical variables that have the greatest impact on this threshold. A decision analytic model was developed to estimate the testing threshold for patients with normal neurologic findings, being evaluated for SAH, after a negative CT of the head. The testing threshold was calculated as the pretest probability of disease where the two strategies (LP or no LP) are balanced in terms of quality-adjusted life-years. Two-way and probabilistic sensitivity analyses (PSAs) were performed. For the base-case scenario the testing threshold for performing an LP after negative head CT was 4.3%. Results for the two-way sensitivity analyses demonstrated that the test threshold ranged from 1.9% to 15.6%, dominated by the uncertainty in the probability of death from initial missed SAH. In the PSA the mean testing threshold was 4.3% (95% confidence interval = 1.4% to 9.3%). Other significant variables in the model included probability of aneurysmal versus nonaneurysmal SAH after negative head CT, probability of long-term morbidity from initial missed SAH, and probability of renal failure from contrast-induced nephropathy. Our decision analysis results suggest a testing threshold for LP after negative CT to be approximately 4.3%, with a range of 1.4% to 9.3% on robust PSA. In light of these data, and considering the low probability of aneurysmal SAH after a negative CT, classical teaching and current guidelines addressing testing for SAH should be revisited. © 2016 by the Society for Academic Emergency Medicine.
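A back-of-envelope Python version of the threshold logic (the real model's probabilities and utilities are far richer; every number below is invented) finds the pretest probability at which the two strategies' expected quality-adjusted outcomes cross:

    def expected_qaly(p_sah, do_lp, u_treated=0.90, u_missed=0.55,
                      u_well=0.95, lp_burden=0.005):
        """Expected quality-adjusted outcome given pretest probability p_sah."""
        if do_lp:  # LP assumed (in this toy) to catch any SAH present
            return p_sah * u_treated + (1 - p_sah) * u_well - lp_burden
        return p_sah * u_missed + (1 - p_sah) * u_well

    # Scan upward until performing the LP becomes the better strategy.
    p = 0.0
    while expected_qaly(p, do_lp=True) < expected_qaly(p, do_lp=False):
        p += 0.0001
    print(f"toy testing threshold ~ {p:.3f}")  # ~0.014 with these numbers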
NASA Astrophysics Data System (ADS)
Hill, M. C.; Jakeman, J.; Razavi, S.; Tolson, B.
2015-12-01
For many environmental systems, model runtimes have remained very long as more capable computers have been used to add more processes and finer time and space discretization. Scientists have also added more parameters and kinds of observations, and many model runs are needed to explore the models. Computational demand equals run time multiplied by the number of model runs, divided by parallelization opportunities. Model exploration is conducted using sensitivity analysis, optimization, and uncertainty quantification. Sensitivity analysis is used to reveal the consequences of what may be very complex simulated relations, optimization is used to identify parameter values that fit the data best, or at least better, and uncertainty quantification is used to evaluate the precision of simulated results. The long execution times make such analyses a challenge. Methods for addressing this challenge include computationally frugal analysis of the demanding original model and a number of ingenious surrogate modeling methods. Both commonly use about 50-100 runs of the demanding original model. In this talk we consider the tradeoffs between (1) original model development decisions, (2) computationally frugal analysis of the original model, and (3) using many model runs of the fast surrogate model. Some questions of interest are as follows. If the added processes and discretization invested in (1) are compared with the restrictions and approximations in model analysis produced by long model execution times, is there a net benefit relative to the goals of the model? Are there changes to the numerical methods that could reduce the computational demands while giving up less fidelity than is compromised by using computationally frugal methods or surrogate models for model analysis? Both the computationally frugal methods and surrogate models require that the solution of interest be a smooth function of the parameters of interest. How does the information obtained from the local methods typical of (2) compare with that from the globally averaged methods typical of (3) for typical systems? The discussion will use examples of the response of the Greenland glacier to global warming and of surface and groundwater modeling.
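Written out with our own symbols (the abstract states the relation only in prose), the computational-demand rule is:

    \[
      D \;=\; \frac{t_{\mathrm{run}} \, N_{\mathrm{runs}}}{P}
    \]

where t_run is the single-run execution time, N_runs the number of model runs the analysis needs, and P the number of runs that can execute in parallel. For example, a 2-hour model explored with 100 runs on 10 parallel workers demands D = 2 × 100 / 10 = 20 hours of wall-clock time.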
ERIC Educational Resources Information Center
Humphreys, Patrick; Wisudha, Ayleen
As a demonstration of the application of heuristic devices to decision-theoretical techniques, an interactive computer program known as MAUD (Multiattribute Utility Decomposition) has been designed to support decision or choice problems that can be decomposed into component factors, or to act as a tool for investigating the microstructure of a…
Jethwa, Pinakin R; Punia, Vineet; Patel, Tapan D; Duffis, E Jesus; Gandhi, Chirag D; Prestigiacomo, Charles J
2013-04-01
Recent studies have documented the high sensitivity of computed tomography angiography (CTA) in detecting a ruptured aneurysm in the presence of acute subarachnoid hemorrhage (SAH). The practice of digital subtraction angiography (DSA) when CTA does not reveal an aneurysm has thus been called into question. We examined this dilemma from a cost-effectiveness perspective by using current decision analysis techniques. A decision tree was created with the use of TreeAge Pro Suite 2012; in 1 arm, a CTA-negative SAH was followed up with DSA; in the other arm, patients were observed without further imaging. Based on literature review, costs and utilities were assigned to each potential outcome. Base-case and sensitivity analyses were performed to determine the cost-effectiveness of each strategy. A Monte Carlo simulation was then conducted by sampling each variable over a plausible distribution to evaluate the robustness of the model. With the use of a negative predictive value of 95.7% for CTA, observation was found to be the most cost-effective strategy ($6737/Quality Adjusted Life Year [QALY] vs $8460/QALY) in the base-case analysis. One-way sensitivity analysis demonstrated that DSA became the more cost-effective option if the negative predictive value of CTA fell below 93.72%. The Monte Carlo simulation produced an incremental cost-effectiveness ratio of $83 083/QALY. At the conventional willingness-to-pay threshold of $50 000/QALY, observation was the more cost-effective strategy in 83.6% of simulations. The decision to perform a DSA in CTA-negative SAH depends strongly on the sensitivity of CTA, and therefore must be evaluated at each center treating these types of patients. Given the high sensitivity of CTA reported in the current literature, performing DSA on all patients with CTA negative SAH may not be cost-effective at every institution.
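The following Python fragment is a stripped-down analogue of the probabilistic sensitivity analysis described here; the costs, utilities, and distribution choices are invented placeholders, tuned only to land in the same qualitative regime as the reported results.

    import random

    def icer(cost_a, qaly_a, cost_b, qaly_b):
        """Incremental cost per QALY of strategy A relative to strategy B."""
        return (cost_a - cost_b) / (qaly_a - qaly_b)

    def simulate_once():
        npv = random.betavariate(96, 4)    # CTA negative predictive value
        miss_rate = 1 - npv                # aneurysms missed without DSA
        qaly_obs = 20.0 - miss_rate * 0.7  # expected QALY loss from misses
        qaly_dsa = 20.0 - 0.002            # small procedural risk of DSA
        return icer(2500.0, qaly_dsa, 400.0, qaly_obs)

    wtp = 50_000.0  # willingness-to-pay threshold, $/QALY
    runs = [simulate_once() for _ in range(10_000)]
    dsa_ce = sum(0 < r < wtp for r in runs) / len(runs)
    print(f"observation preferred in {1 - dsa_ce:.1%} of simulations")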
Evaluation of a 'virtual' approach to commissioning health research.
McCourt, Christine A; Morgan, Philip A; Youll, Penny
2006-10-18
The objective of this study was to evaluate the implementation of a 'virtual' (computer-mediated) approach to health research commissioning. This had been introduced experimentally in a DOH programme--the 'Health of Londoners Programme'--in order to assess whether it could enhance the accessibility, transparency and effectiveness of commissioning health research. The study described here was commissioned to evaluate this novel approach, addressing these key questions. A naturalistic-experimental approach was combined with principles of action research. The different commissioning groups within the programme were randomly allocated to either the traditional face-to-face mode or the novel 'virtual' mode. Mainly qualitative data were gathered, including observation of all (virtual and face-to-face) commissioning meetings; semi-structured interviews with a purposive sample of participants (n = 32/66); and structured questionnaires and interviews with lead researchers of early commissioned projects. All members of the commissioning groups were invited to participate in collaborative enquiry groups, which took an active part in the analysis process. The virtual process functioned as intended, reaching timely and relatively transparent decisions that participants had confidence in. Despite the potential for greater access using a virtual approach, few differences were found in practice. Key advantages included physical access, a more flexible and extended time period for discussion, reflection and information gathering, and a more transparent decision-making process. Key challenges were the reduction of social cues available in a computer-mediated medium, which requires novel ways of ensuring appropriate dialogue, feedback and interaction. However, in both modes, the process was influenced by a range of factors and was not technology driven. There is potential for using computer-mediated communication within the research commissioning process. This may enhance the access, effectiveness and transparency of decision-making, but further development is needed for this to be fully realised, including attention to process as well as to the computer-mediated medium.