Automated Detection of Heuristics and Biases among Pathologists in a Computer-Based System
ERIC Educational Resources Information Center
Crowley, Rebecca S.; Legowski, Elizabeth; Medvedeva, Olga; Reitmeyer, Kayse; Tseytlin, Eugene; Castine, Melissa; Jukic, Drazen; Mello-Thoms, Claudia
2013-01-01
The purpose of this study is threefold: (1) to develop an automated, computer-based method to detect heuristics and biases as pathologists examine virtual slide cases, (2) to measure the frequency and distribution of heuristics and errors across three levels of training, and (3) to examine relationships of heuristics to biases, and biases to…
Clinical decision-making by midwives: managing case complexity.
Cioffi, J; Markham, R
1997-02-01
In making clinical judgements, it is argued that midwives use 'shortcuts' or heuristics based on estimated probabilities to simplify the decision-making task. Midwives (n = 30) were given simulated patient assessment situations of high and low complexity and were required to think aloud. Analysis of verbal protocols showed that subjective probability judgements (heuristics) were used more frequently in the high than low complexity case and predominated in the last quarter of the assessment period for the high complexity case. 'Representativeness' was identified more frequently in the high than in the low case, but was the dominant heuristic in both. Reports completed after each simulation suggest that heuristics based on memory for particular conditions affect decisions. It is concluded that midwives use heuristics, derived mainly from their clinical experiences, in an attempt to save cognitive effort and to facilitate reasonably accurate decisions in the decision-making process.
Automated detection of heuristics and biases among pathologists in a computer-based system.
Crowley, Rebecca S; Legowski, Elizabeth; Medvedeva, Olga; Reitmeyer, Kayse; Tseytlin, Eugene; Castine, Melissa; Jukic, Drazen; Mello-Thoms, Claudia
2013-08-01
The purpose of this study is threefold: (1) to develop an automated, computer-based method to detect heuristics and biases as pathologists examine virtual slide cases, (2) to measure the frequency and distribution of heuristics and errors across three levels of training, and (3) to examine relationships of heuristics to biases, and biases to diagnostic errors. The authors conducted the study using a computer-based system to view and diagnose virtual slide cases. The software recorded participant responses throughout the diagnostic process, and automatically classified participant actions based on definitions of eight common heuristics and/or biases. The authors measured frequency of heuristic use and bias across three levels of training. Biases studied were detected at varying frequencies, with availability and search satisficing observed most frequently. There were few significant differences by level of training. For representativeness and anchoring, the heuristic was used appropriately as often or more often than it was used in biased judgment. Approximately half of the diagnostic errors were associated with one or more biases. We conclude that heuristic use and biases were observed among physicians at all levels of training using the virtual slide system, although their frequencies varied. The system can be employed to detect heuristic use and to test methods for decreasing diagnostic errors resulting from cognitive biases.
NASA Astrophysics Data System (ADS)
Erkol, Şirag; Yücel, Gönenç
In this study, the problem of seed selection is investigated. This problem is mainly treated as an optimization problem, which is proved to be NP-hard. There are several heuristic approaches in the literature, which mostly use algorithmic heuristics. These approaches mainly focus on the trade-off between computational complexity and accuracy. Although the accuracy of algorithmic heuristics is high, they also have high computational complexity. Furthermore, in the literature it is generally assumed that complete information on the structure and features of a network is available, which is often not the case. For the study, a simulation model is constructed, which is capable of creating networks, performing seed selection heuristics, and simulating diffusion models. Novel metric-based seed selection heuristics that rely only on partial information are proposed and tested using the simulation model. These heuristics use local information available from nodes in the synthetically created networks. The performances of the heuristics are comparatively analyzed on three different network types. The results clearly show that the performance of a heuristic depends on the structure of a network. A heuristic to be used should be selected after investigating the properties of the network at hand. More importantly, the partial-information approach provided promising results. In certain cases, selection heuristics that rely only on partial network information perform very close to similar heuristics that require complete network data.
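The abstract does not specify which local metrics the proposed heuristics use, so the sketch below is only a minimal illustration of the partial-information idea: probe a sample of nodes, rank them by locally observable degree, and take the top k as seeds. Function and variable names are hypothetical.

```python
import random

def degree_seed_selection(neighbors, k, sample_size=100):
    """Pick k seed nodes using only locally observable information.

    `neighbors` maps each node id to the set of its direct neighbors;
    no global structural measures (betweenness, PageRank, ...) are used,
    mirroring the partial-information setting described above.
    """
    # Probe a random sample of nodes instead of the whole network.
    probed = random.sample(list(neighbors), min(sample_size, len(neighbors)))
    # Rank probed nodes by local degree and return the top k as seeds.
    return sorted(probed, key=lambda n: len(neighbors[n]), reverse=True)[:k]

# Toy usage on a tiny hand-made network.
net = {1: {2, 3, 4}, 2: {1}, 3: {1, 4}, 4: {1, 3, 5}, 5: {4}}
print(degree_seed_selection(net, k=2, sample_size=5))
```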
Automated unit-level testing with heuristic rules
NASA Technical Reports Server (NTRS)
Carlisle, W. Homer; Chang, Kai-Hsiung; Cross, James H.; Keleher, William; Shackelford, Keith
1990-01-01
Software testing plays a significant role in the development of complex software systems. Current testing methods generally require significant effort to generate meaningful test cases. The QUEST/Ada system is a prototype system designed using CLIPS to experiment with expert system based test case generation. The prototype is designed to test for condition coverage, and attempts to generate test cases to cover all feasible branches contained in an Ada program. This paper reports on heuristics used by the system. These heuristics vary according to the amount of knowledge obtained by preprocessing and execution of the Boolean conditions in the program.
Heuristic for Critical Machine Based a Lot Streaming for Two-Stage Hybrid Production Environment
NASA Astrophysics Data System (ADS)
Vivek, P.; Saravanan, R.; Chandrasekaran, M.; Pugazhenthi, R.
2017-03-01
Lot streaming in a hybrid flowshop (HFS) is encountered in many real-world problems. This paper deals with a heuristic approach to lot streaming based on critical machine consideration for a two-stage hybrid flowshop. The first stage has two identical parallel machines and the second stage has only one machine. The second-stage machine is considered critical for valid reasons, and this kind of problem is known to be NP-hard. A mathematical model was developed for the selected problem. The simulation modelling and analysis were carried out in Extend V6 software. A heuristic was developed for obtaining an optimal lot streaming schedule. Eleven cases of lot streaming were considered. The proposed heuristic was verified and validated by real-time simulation experiments. All possible lot streaming strategies, and all possible sequences under each strategy, were simulated and examined. The heuristic yielded the optimal schedule in all eleven cases. An identification procedure for selecting the best lot streaming strategy was suggested.
Managing Heuristics as a Method of Inquiry in Autobiographical Graphic Design Theses
ERIC Educational Resources Information Center
Ings, Welby
2011-01-01
This article draws on case studies undertaken in postgraduate research at AUT University, Auckland. It seeks to address a number of issues related to heuristic inquiries employed by graphic design students who use autobiographical approaches when developing research-based theses. For this type of thesis, heuristics as a system of inquiry may…
Automatic Generation of Heuristics for Scheduling
NASA Technical Reports Server (NTRS)
Morris, Robert A.; Bresina, John L.; Rodgers, Stuart M.
1997-01-01
This paper presents a technique, called GenH, that automatically generates search heuristics for scheduling problems. The impetus for developing this technique is the growing consensus that heuristics encode advice that is, at best, useful in solving most, or typical, problem instances, and, at worst, useful in solving only a narrowly defined set of instances. In either case, heuristic problem solvers, to be broadly applicable, should have a means of automatically adjusting to the idiosyncrasies of each problem instance. GenH generates a search heuristic for a given problem instance by hill-climbing in the space of possible multi-attribute heuristics, where the evaluation of a candidate heuristic is based on the quality of the solution found under its guidance. We present empirical results obtained by applying GenH to the real-world problem of telescope observation scheduling. These results demonstrate that GenH is a simple and effective way of improving the performance of a heuristic scheduler.
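GenH's search over multi-attribute heuristics can be pictured as hill-climbing on a vector of attribute weights, where each candidate weighting is scored by the quality of the schedule built under it. The sketch below is a generic version of that loop, not GenH itself; the neighbourhood move, scoring convention and all names are assumptions.

```python
import random

def hill_climb_heuristic(evaluate, n_attrs, iters=200, step=0.1):
    """Hill-climb in the space of attribute weights.

    `evaluate(weights)` must build a schedule for the problem instance
    using the weighted-sum dispatch rule and return its quality
    (higher is better).
    """
    weights = [random.random() for _ in range(n_attrs)]
    best = evaluate(weights)
    for _ in range(iters):
        i = random.randrange(n_attrs)
        candidate = weights[:]
        candidate[i] += random.uniform(-step, step)
        score = evaluate(candidate)
        if score > best:          # keep the move only if the schedule improves
            weights, best = candidate, score
    return weights, best

# Toy evaluation: pretend the "ideal" weights are [0.7, 0.3].
toy = lambda w: -abs(w[0] - 0.7) - abs(w[1] - 0.3)
print(hill_climb_heuristic(toy, n_attrs=2))
```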
Torrens, George Edward
2018-01-01
Summative content analysis was used to define methods and heuristics from each case study. The review process was in two parts: (1) a literature review to identify conventional research methods and (2) a summative content analysis of published case studies, based on the identified methods and heuristics, to suggest an order and priority of where and when they were used. Over 200 research and design methods and design heuristics were identified. From the review of the 20 case studies, 42 were identified as being applied. The majority of methods and heuristics were applied in phase two, market choice. There appeared to be a disparity between the limited number of methods frequently used, under 10 within the 20 case studies, and the hundreds available. Implications for Rehabilitation: The communication highlights a number of issues that have implications for those involved in assistive technology new product development: • The study defined over 200 well-established research and design methods and design heuristics that are available for use by those who specify and design assistive technology products, providing a comprehensive reference list for practitioners in the field; • The review within the study suggests only a limited number of research and design methods are regularly used by industrial design focused assistive technology new product developers; and • Debate is required among practitioners working in this field to reflect on how a wider range of potentially more effective methods and heuristics may be incorporated into daily working practice.
Plan-graph Based Heuristics for Conformant Probabilistic Planning
NASA Technical Reports Server (NTRS)
Ramakrishnan, Salesh; Pollack, Martha E.; Smith, David E.
2004-01-01
In this paper, we introduce plan-graph based heuristics to solve a variation of the conformant probabilistic planning (CPP) problem. In many real-world problems, it is the case that the sensors are unreliable or take too many resources to provide knowledge about the environment. These domains are better modeled as conformant planning problems. POMDP-based techniques are currently the most successful approach for solving CPP but have the limitation of state-space explosion. Recent advances in deterministic and conformant planning have shown that plan-graphs can be used to enhance the performance significantly. We show that this enhancement can also be translated to CPP. We describe our process for developing the plan-graph heuristics and estimating the probability of a partial plan. We compare the performance of our planner PVHPOP when used with different heuristics. We also perform a comparison with a POMDP solver to show over an order of magnitude improvement in performance.
ERIC Educational Resources Information Center
Graulich, Nicole; Tiemann, Rudiger; Schreiner, Peter R.
2012-01-01
We investigate the efficiency of domain-specific heuristic strategies in mastering and predicting pericyclic six-electron rearrangements. Based on recent research findings on these types of reactions, a new concept has been developed that should help students identify and describe six-electron rearrangements more readily in complex molecules. The…
NASA Astrophysics Data System (ADS)
Salcedo-Sanz, S.
2016-10-01
Meta-heuristic algorithms are problem-solving methods which try to find good-enough solutions to very hard optimization problems, at a reasonable computation time, where classical approaches fail or cannot even be applied. Many existing meta-heuristic approaches are nature-inspired techniques, which work by simulating or modeling different natural processes in a computer. Historically, many of the most successful meta-heuristic approaches have had a biological inspiration, such as evolutionary computation or swarm intelligence paradigms, but in the last few years new approaches based on nonlinear physics processes modeling have been proposed and applied with success. Nonlinear physics processes, modeled as optimization algorithms, are able to produce completely new search procedures, with extremely effective exploration capabilities in many cases, which are able to outperform existing optimization approaches. In this paper we review the most important optimization algorithms based on nonlinear physics, how they have been constructed from specific modeling of real phenomena, and also their novelty in terms of comparison with alternative existing algorithms for optimization. We first review important concepts on optimization problems, search spaces and problems' difficulty. Then, the usefulness of heuristic and meta-heuristic approaches to face hard optimization problems is introduced, and some of the main existing classical versions of these algorithms are reviewed. The mathematical framework of different nonlinear physics processes is then introduced as a preparatory step to review in detail the most important meta-heuristics based on them. A discussion on the novelty of these approaches, their main computational implementation and design issues, and the evaluation of a novel meta-heuristic based on Strange Attractors mutation will be carried out to complete the review of these techniques. We also describe some of the most important application areas, in a broad sense, of meta-heuristics, and describe freely accessible software frameworks which can be used to ease the implementation of these algorithms.
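To make the notion of a physics-inspired meta-heuristic concrete, the sketch below shows simulated annealing, the classic thermodynamics-inspired member of this family. It is offered only as a minimal illustration of how a physical process becomes a search procedure; it is not one of the nonlinear-physics algorithms the review surveys, and all names are illustrative.

```python
import math
import random

def simulated_annealing(cost, neighbour, x0, t0=1.0, cooling=0.995,
                        iterations=5000):
    """Minimal physics-inspired meta-heuristic (simulated annealing).

    `cost` maps a solution to a real value to minimise and `neighbour`
    proposes a random nearby solution.
    """
    x, fx, t = x0, cost(x0), t0
    best, fbest = x, fx
    for _ in range(iterations):
        y = neighbour(x)
        fy = cost(y)
        # Accept worse moves with a temperature-dependent probability.
        if fy <= fx or random.random() < math.exp((fx - fy) / t):
            x, fx = y, fy
            if fx < fbest:
                best, fbest = x, fx
        t *= cooling
    return best, fbest

# Toy usage: minimise a 1-D quartic with two basins.
f = lambda x: (x * x - 1.0) ** 2 + 0.1 * x
move = lambda x: x + random.uniform(-0.2, 0.2)
print(simulated_annealing(f, move, x0=3.0))
```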
Kheiri, Ahmed; Keedwell, Ed
2017-01-01
Operations research is a well-established field that uses computational systems to support decisions in business and public life. Good solutions to operations research problems can make a large difference to the efficient running of businesses and organisations and so the field often searches for new methods to improve these solutions. The high school timetabling problem is an example of an operations research problem and is a challenging task which requires assigning events and resources to time slots subject to a set of constraints. In this article, a new sequence-based selection hyper-heuristic is presented that produces excellent results on a suite of high school timetabling problems. In this study, we present an easy-to-implement, easy-to-maintain, and effective sequence-based selection hyper-heuristic to solve high school timetabling problems using a benchmark of unified real-world instances collected from different countries. We show that with sequence-based methods, it is possible to discover new best known solutions for a number of the problems in the timetabling domain. Through this investigation, the usefulness of sequence-based selection hyper-heuristics has been demonstrated and the capability of these methods has been shown to exceed the state of the art.
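A selection hyper-heuristic operates one level above the problem: it chooses which low-level move operators to apply and in what order. The authors' sequence-based method is not specified in the abstract, so the following is only a generic skeleton of the idea, with a simplified acceptance rule and hypothetical names.

```python
import random

def sequence_hyper_heuristic(solution, cost, low_level_heuristics,
                             iterations=1000, max_seq_len=3):
    """Apply short random sequences of low-level heuristics.

    Each low-level heuristic maps a solution to a perturbed solution.
    A whole sequence is accepted only if it does not worsen the cost.
    """
    best_cost = cost(solution)
    for _ in range(iterations):
        seq = random.choices(low_level_heuristics,
                             k=random.randint(1, max_seq_len))
        candidate = solution
        for h in seq:
            candidate = h(candidate)
        c = cost(candidate)
        if c <= best_cost:
            solution, best_cost = candidate, c
    return solution, best_cost
```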
NASA Astrophysics Data System (ADS)
Wang, Wenrui; Wu, Yaohua; Wu, Yingying
2016-05-01
E-commerce, as an emerging marketing mode, has attracted more and more attention and gradually changed the way of our life. However, the existing layout of distribution centers can't fulfill the storage and picking demands of e-commerce sufficiently. In this paper, a modified miniload automated storage/retrieval system is designed to fit these new characteristics of e-commerce in logistics. Meanwhile, a matching problem concerning the improvement of picking efficiency in the new system is studied in this paper. The problem is how to reduce the travelling distance of totes between aisles and picking stations. A multi-stage heuristic algorithm is proposed based on the statement and model of this problem. The main idea of this algorithm is, with some heuristic strategies based on similarity coefficients, to minimize the transport of items which cannot reach their destination picking stations through direct conveyors alone. The experimental results based on computer-generated cases show that the average reduced rate of indirect transport times can reach 14.36% with the application of the multi-stage heuristic algorithm. For the cases from a real e-commerce distribution center, the order processing time can be reduced from 11.20 h to 10.06 h with the help of the modified system and the proposed algorithm. In summary, this research proposed a modified system and a multi-stage heuristic algorithm that can reduce the travelling distance of totes effectively and improve the whole performance of the e-commerce distribution center.
Al-Khatib, Ra'ed M; Rashid, Nur'Aini Abdul; Abdullah, Rosni
2011-08-01
The secondary structure of RNA pseudoknots has been extensively inferred and scrutinized by computational approaches. Experimental methods for determining RNA structure are time consuming and tedious; therefore, predictive computational approaches are required. Predicting the most accurate and energy-stable pseudoknot RNA secondary structure has been proven to be an NP-hard problem. In this paper, a new RNA folding approach, termed MSeeker, is presented; it includes KnotSeeker (a heuristic method) and Mfold (a thermodynamic algorithm). The global optimization of this thermodynamic heuristic approach was further enhanced by using a case-based reasoning technique as a local optimization method. MSeeker is a proposed algorithm for predicting RNA pseudoknot structure from individual sequences, especially long ones. This research demonstrates that MSeeker improves the sensitivity and specificity of existing RNA pseudoknot structure predictions. The performance and structural results from this proposed method were evaluated against seven other state-of-the-art pseudoknot prediction methods. The MSeeker method had better sensitivity than the DotKnot, FlexStem, HotKnots, pknotsRG, ILM, NUPACK and pknotsRE methods, with 79% of the predicted pseudoknot base-pairs being correct.
Prediction-based dynamic load-sharing heuristics
NASA Technical Reports Server (NTRS)
Goswami, Kumar K.; Devarakonda, Murthy; Iyer, Ravishankar K.
1993-01-01
The authors present dynamic load-sharing heuristics that use predicted resource requirements of processes to manage workloads in a distributed system. A previously developed statistical pattern-recognition method is employed for resource prediction. While nonprediction-based heuristics depend on a rapidly changing system status, the new heuristics depend on slowly changing program resource usage patterns. Furthermore, prediction-based heuristics can be more effective since they use future requirements rather than just the current system state. Four prediction-based heuristics, two centralized and two distributed, are presented. Using trace-driven simulations, they are compared against random scheduling and two effective nonprediction-based heuristics. Results show that the prediction-based centralized heuristics achieve up to 30 percent better response times than the nonprediction centralized heuristic, and that the prediction-based distributed heuristics achieve up to 50 percent improvements relative to their nonprediction counterparts.
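The core idea, using a forecast of each process's demand rather than only the instantaneous system state, can be illustrated with a tiny placement rule: balance the *predicted* outstanding work on each host instead of raw queue lengths. This is a hedged sketch of the concept, not the four heuristics from the paper; the single-resource model and names are assumptions.

```python
def place_job(job_prediction, host_queues):
    """Choose a host using predicted resource requirements.

    `host_queues` maps host -> list of predicted remaining CPU seconds
    for jobs already placed there.  Instead of balancing raw job counts
    (a non-predictive policy), the heuristic balances predicted work.
    """
    loads = {h: sum(q) for h, q in host_queues.items()}
    target = min(loads, key=loads.get)
    host_queues[target].append(job_prediction)
    return target

queues = {"hostA": [3.0, 4.0], "hostB": [10.0]}
print(place_job(2.5, queues))   # hostA carries 7 s of predicted work vs 10 s -> hostA
```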
Local search heuristic for the discrete leader-follower problem with multiple follower objectives
NASA Astrophysics Data System (ADS)
Kochetov, Yury; Alekseeva, Ekaterina; Mezmaz, Mohand
2016-10-01
We study a discrete bilevel problem, also called the leader-follower problem, with multiple objectives at the lower level. It is assumed that constraints at the upper level can include variables of both levels. For such an ill-posed problem we define feasible and optimal solutions for the pessimistic case. A central point of this work is a two-stage method to get a feasible solution under the pessimistic case, given a leader decision. The target of the first stage is a follower solution that violates the leader constraints. The target of the second stage is a pessimistic feasible solution. Each stage calls a heuristic and a solver for a series of particular mixed integer programs. The method is integrated inside a local search based heuristic that is designed to find near-optimal leader solutions.
ERIC Educational Resources Information Center
Kingir, Sevgi; Geban, Omer; Gunel, Murat
2012-01-01
This study investigates the effects of the Science Writing Heuristic (SWH), known as an argumentation-based science inquiry approach, on Grade 9 students' performance on a post-test in relation to their academic achievement levels. Four intact classes taught by 2 chemistry teachers from a Turkish public high school were selected for the study; one…
Floros, Nikolaos; Papadakis, Marios; Schelzig, Hubert; Oberhuber, Alexander
2018-03-10
Over the last three decades, the development of systematic and protocol-based algorithms, and advances in available diagnostic tests, have become indispensable parts of practising medicine. Naturally, despite the implementation of meticulous protocols involving diagnostic tests or even trials of empirical therapies, the cause of one's symptoms may still not be obvious. We herein report a case of chronic back pain which took about 5 years to be accurately diagnosed. The case challenges the diagnostic assumptions and sets the ground for discussion of the diagnostic reasoning pitfalls and heuristic biases that misled the caring physicians and cost our patient years of low quality of life. This case serves as an example of how anchoring heuristics can interfere in the diagnostic process of a complex and rare entity when combined with a concurrent potentially life-threatening condition. © BMJ Publishing Group Ltd (unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
NASA Astrophysics Data System (ADS)
Pei, Jun; Liu, Xinbao; Pardalos, Panos M.; Fan, Wenjuan; Wang, Ling; Yang, Shanlin
2016-03-01
Motivated by applications in the manufacturing industry, we consider a supply chain scheduling problem, where each job is characterised by non-identical sizes, different release times and unequal processing times. The objective is to minimise the makespan by making batching and sequencing decisions. The problem is formalised as a mixed integer programming model and proved to be strongly NP-hard. Some structural properties are presented for both the general case and a special case. Based on these properties, a lower bound is derived, and a novel two-phase heuristic (TP-H) is developed to solve the problem, which guarantees to obtain a worst-case performance ratio of ?. Computational experiments with a set of random instances of different sizes are conducted to evaluate the proposed approach TP-H, which is superior to two other heuristics proposed in the literature. Furthermore, the experimental results indicate that TP-H can effectively and efficiently solve large-size problems in a reasonable time.
The probability heuristics model of syllogistic reasoning.
Chater, N; Oaksford, M
1999-03-01
A probability heuristic model (PHM) for syllogistic reasoning is proposed. An informational ordering over quantified statements suggests simple probability based heuristics for syllogistic reasoning. The most important is the "min-heuristic": choose the type of the least informative premise as the type of the conclusion. The rationality of this heuristic is confirmed by an analysis of the probabilistic validity of syllogistic reasoning which treats logical inference as a limiting case of probabilistic inference. A meta-analysis of past experiments reveals close fits with PHM. PHM also compares favorably with alternative accounts, including mental logics, mental models, and deduction as verbal reasoning. Crucially, PHM extends naturally to generalized quantifiers, such as Most and Few, which have not been characterized logically and are, consequently, beyond the scope of current mental logic and mental model theories. Two experiments confirm the novel predictions of PHM when generalized quantifiers are used in syllogistic arguments. PHM suggests that syllogistic reasoning performance may be determined by simple but rational informational strategies justified by probability theory rather than by logic. Copyright 1999 Academic Press.
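The min-heuristic itself is simple enough to state as code: given an informativeness ordering over quantifiers, the predicted conclusion takes the quantifier of the *least* informative premise. The ordering written below is reproduced from memory of the PHM literature and should be treated as an illustrative assumption rather than a quotation.

```python
# Assumed informativeness ordering, most to least informative.
INFORMATIVENESS = ["All", "Most", "Few", "Some", "None", "Some...not"]

def min_heuristic(premise1_type, premise2_type):
    """Return the predicted conclusion type: the quantifier of the
    least informative premise (the 'min-heuristic')."""
    return max(premise1_type, premise2_type, key=INFORMATIVENESS.index)

print(min_heuristic("All", "Some"))   # -> 'Some'
print(min_heuristic("Some", "None"))  # -> 'None'
```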
Honda, Hidehito; Matsuka, Toshihiko; Ueda, Kazuhiro
2017-05-01
Some researchers on binary choice inference have argued that people make inferences based on simple heuristics, such as recognition, fluency, or familiarity. Others have argued that people make inferences based on available knowledge. To examine the boundary between heuristic and knowledge usage, we examine binary choice inference processes in terms of attribute substitution in heuristic use (Kahneman & Frederick, 2005). In this framework, it is predicted that people will rely on heuristic or knowledge-based inference depending on the subjective difficulty of the inference task. We conducted competitive tests of binary choice inference models representing simple heuristics (fluency and familiarity heuristics) and knowledge-based inference models. We found that a simple heuristic model (especially a familiarity heuristic model) explained inference patterns for subjectively difficult inference tasks, and that a knowledge-based inference model explained subjectively easy inference tasks. These results were consistent with the predictions of the attribute substitution framework. Issues on usage of simple heuristics and psychological processes are discussed. Copyright © 2016 Cognitive Science Society, Inc.
"The Gaze Heuristic:" Biography of an Adaptively Rational Decision Process.
Hamlin, Robert P
2017-04-01
This article is a case study that describes the natural and human history of the gaze heuristic. The gaze heuristic is an interception heuristic that utilizes a single input (deviation from a constant angle of approach) repeatedly as a task is performed. Its architecture, advantages, and limitations are described in detail. A history of the gaze heuristic is then presented. In natural history, the gaze heuristic is the only known technique used by predators to intercept prey. In human history the gaze heuristic was discovered accidentally by Royal Air Force (RAF) Fighter Command just prior to World War II. As it was never discovered by the Luftwaffe, the technique conferred a decisive advantage upon the RAF throughout the war. After the end of the war in America, German technology was combined with the British heuristic to create the Sidewinder AIM9 missile, the most successful autonomous weapon ever built. There are no plans to withdraw it or replace its guiding gaze heuristic. The case study demonstrates that the gaze heuristic is a specific heuristic type that takes a single best input at the best time (take the best). Its use is an adaptively rational response to specific, rapidly evolving decision environments that has allowed those animals/humans/machines who use it to survive, prosper, and multiply relative to those who do not. Copyright © 2017 Cognitive Science Society, Inc.
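The single repeated input the article describes, deviation from a constant angle of approach, can be written as a one-line control law: steer just enough to cancel any drift in the bearing angle to the target. The discrete-time controller below is a minimal 2-D sketch of that idea; the gain, state layout and names are assumptions, not the article's formulation.

```python
import math

def gaze_heuristic_step(pursuer_pos, heading, target_pos, prev_bearing,
                        speed, gain=1.0):
    """One step of a constant-bearing pursuit.

    The only input used is the change in the bearing angle to the
    target since the last step; the pursuer steers to cancel that
    change, which (with adequate speed) yields an interception course.
    """
    dx = target_pos[0] - pursuer_pos[0]
    dy = target_pos[1] - pursuer_pos[1]
    bearing = math.atan2(dy, dx)
    heading += gain * (bearing - prev_bearing)   # steer against bearing drift
    new_pos = (pursuer_pos[0] + speed * math.cos(heading),
               pursuer_pos[1] + speed * math.sin(heading))
    return new_pos, heading, bearing   # bearing becomes prev_bearing next step

# Minimal usage: state carried between steps is (position, heading, bearing).
pos, head, prev = (0.0, 0.0), 0.0, math.atan2(5.0, 10.0)
pos, head, prev = gaze_heuristic_step(pos, head, (10.0, 5.0), prev, speed=1.0)
print(pos)
```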
An Integrated Planning Representation Using Macros, Abstractions, and Cases
NASA Technical Reports Server (NTRS)
Baltes, Jacky; MacDonald, Bruce
1992-01-01
Planning will be an essential part of future autonomous robots and integrated intelligent systems. This paper focuses on learning problem solving knowledge in planning systems. The system is based on a common representation for macros, abstractions, and cases. Therefore, it is able to exploit both classical and case-based techniques. The general operators in a successful plan derivation would be assessed for their potential usefulness, and some stored. The feasibility of this approach was studied through the implementation of a learning system for abstraction. New macros are motivated by trying to improve the operator set. One heuristic used to improve the operator set is generating operators with more general preconditions than existing ones. This heuristic leads naturally to abstraction hierarchies. This investigation showed promising results on the Towers of Hanoi problem. The paper concludes by describing methods for learning other problem solving knowledge. This knowledge can be represented by allowing operators at different levels of abstraction in a refinement.
ERIC Educational Resources Information Center
Moeller, Jeremy D.; Dattilo, John; Rusch, Frank
2015-01-01
This study examined how specific guidelines and heuristics have been used to identify methodological rigor associated with single-case research designs based on quality indicators developed by Horner et al. Specifically, this article describes how literature reviews have applied Horner et al.'s quality indicators and evidence-based criteria.…
Familiarity and Recollection in Heuristic Decision Making
Schwikert, Shane R.; Curran, Tim
2014-01-01
Heuristics involve the ability to utilize memory to make quick judgments by exploiting fundamental cognitive abilities. In the current study we investigated the memory processes that contribute to the recognition heuristic and the fluency heuristic, which are both presumed to capitalize on the by-products of memory to make quick decisions. In Experiment 1, we used a city-size comparison task while recording event-related potentials (ERPs) to investigate the potential contributions of familiarity and recollection to the two heuristics. ERPs were markedly different for recognition heuristic-based decisions and fluency heuristic-based decisions, suggesting a role for familiarity in the recognition heuristic and recollection in the fluency heuristic. In Experiment 2, we coupled the same city-size comparison task with measures of subjective pre-experimental memory for each stimulus in the task. Although previous literature suggests the fluency heuristic relies on recognition speed alone, our results suggest differential contributions of recognition speed and recollected knowledge to these decisions, whereas the recognition heuristic relies on familiarity. Based on these results, we created a new theoretical framework that explains decisions attributed to both heuristics based on the underlying memory associated with the choice options. PMID:25347534
Familiarity and recollection in heuristic decision making.
Schwikert, Shane R; Curran, Tim
2014-12-01
Heuristics involve the ability to utilize memory to make quick judgments by exploiting fundamental cognitive abilities. In the current study we investigated the memory processes that contribute to the recognition heuristic and the fluency heuristic, which are both presumed to capitalize on the byproducts of memory to make quick decisions. In Experiment 1, we used a city-size comparison task while recording event-related potentials (ERPs) to investigate the potential contributions of familiarity and recollection to the 2 heuristics. ERPs were markedly different for recognition heuristic-based decisions and fluency heuristic-based decisions, suggesting a role for familiarity in the recognition heuristic and recollection in the fluency heuristic. In Experiment 2, we coupled the same city-size comparison task with measures of subjective preexperimental memory for each stimulus in the task. Although previous literature suggests the fluency heuristic relies on recognition speed alone, our results suggest differential contributions of recognition speed and recollected knowledge to these decisions, whereas the recognition heuristic relies on familiarity. Based on these results, we created a new theoretical framework that explains decisions attributed to both heuristics based on the underlying memory associated with the choice options. PsycINFO Database Record (c) 2014 APA, all rights reserved.
NASA Astrophysics Data System (ADS)
Noor-E-Alam, Md.; Doucette, John
2015-08-01
Grid-based location problems (GBLPs) can be used to solve location problems in business, engineering, resource exploitation, and even in the field of medical sciences. To solve these decision problems, an integer linear programming (ILP) model is designed and developed to provide the optimal solution for GBLPs considering fixed cost criteria. Preliminary results show that the ILP model is efficient in solving small to moderate-sized problems. However, this ILP model becomes intractable in solving large-scale instances. Therefore, a decomposition heuristic is proposed to solve these large-scale GBLPs, which demonstrates significant reduction of solution runtimes. To benchmark the proposed heuristic, results are compared with the exact solution via ILP. The experimental results show that the proposed method significantly outperforms the exact method in runtime with minimal (and in most cases, no) loss of optimality.
Usability of a patient education and motivation tool using heuristic evaluation.
Joshi, Ashish; Arora, Mohit; Dai, Liwei; Price, Kathleen; Vizer, Lisa; Sears, Andrew
2009-11-06
Computer-mediated educational applications can provide a self-paced, interactive environment to deliver educational content to individuals about their health condition. These programs have been used to deliver health-related information about a variety of topics, including breast cancer screening, asthma management, and injury prevention. We have designed the Patient Education and Motivation Tool (PEMT), an interactive computer-based educational program based on behavioral, cognitive, and humanistic learning theories. The tool is designed to educate users and has three key components: screening, learning, and evaluation. The objective of this tutorial is to illustrate a heuristic evaluation using a computer-based patient education program (PEMT) as a case study. The aims were to improve the usability of PEMT through heuristic evaluation of the interface; to report the results of these usability evaluations; to make changes based on the findings of the usability experts; and to describe the benefits and limitations of applying usability evaluations to PEMT. PEMT was evaluated by three usability experts using Nielsen's usability heuristics while reviewing the interface to produce a list of heuristic violations with severity ratings. The violations were sorted by heuristic and ordered from most to least severe within each heuristic. A total of 127 violations were identified with a median severity of 3 (range 0 to 4 with 0 = no problem to 4 = catastrophic problem). Results showed 13 violations for visibility (median severity = 2), 38 violations for match between system and real world (median severity = 2), 6 violations for user control and freedom (median severity = 3), 34 violations for consistency and standards (median severity = 2), 11 violations for error severity (median severity = 3), 1 violation for recognition and control (median severity = 3), 7 violations for flexibility and efficiency (median severity = 2), 9 violations for aesthetic and minimalist design (median severity = 2), 4 violations for help users recognize, diagnose, and recover from errors (median severity = 3), and 4 violations for help and documentation (median severity = 4). We describe the heuristic evaluation method employed to assess the usability of PEMT, a method which uncovers heuristic violations in the interface design in a quick and efficient manner. Bringing together usability experts and health professionals to evaluate a computer-mediated patient education program can help to identify problems in a timely manner. This makes this method particularly well suited to the iterative design process when developing other computer-mediated health education programs. Heuristic evaluations provided a means to assess the user interface of PEMT.
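The per-heuristic counts and median severities reported above are straightforward to reproduce once findings are recorded as (heuristic, severity) pairs. The sketch below shows that aggregation step with made-up example records, not the study's data; field names are illustrative.

```python
from statistics import median

def summarize_violations(violations):
    """Group heuristic-evaluation findings and report, per heuristic,
    the violation count and median severity (0 = no problem .. 4 = catastrophic).

    `violations` is a list of (heuristic_name, severity) pairs.
    """
    by_heuristic = {}
    for name, severity in violations:
        by_heuristic.setdefault(name, []).append(severity)
    return {name: (len(sev), median(sev))
            for name, sev in sorted(by_heuristic.items())}

findings = [("visibility of system status", 2),
            ("visibility of system status", 3),
            ("help and documentation", 4)]
print(summarize_violations(findings))
```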
Usability of a Patient Education and Motivation Tool Using Heuristic Evaluation
Arora, Mohit; Dai, Liwei; Price, Kathleen; Vizer, Lisa; Sears, Andrew
2009-01-01
Background Computer-mediated educational applications can provide a self-paced, interactive environment to deliver educational content to individuals about their health condition. These programs have been used to deliver health-related information about a variety of topics, including breast cancer screening, asthma management, and injury prevention. We have designed the Patient Education and Motivation Tool (PEMT), an interactive computer-based educational program based on behavioral, cognitive, and humanistic learning theories. The tool is designed to educate users and has three key components: screening, learning, and evaluation. Objective The objective of this tutorial is to illustrate a heuristic evaluation using a computer-based patient education program (PEMT) as a case study. The aims were to improve the usability of PEMT through heuristic evaluation of the interface; to report the results of these usability evaluations; to make changes based on the findings of the usability experts; and to describe the benefits and limitations of applying usability evaluations to PEMT. Methods PEMT was evaluated by three usability experts using Nielsen’s usability heuristics while reviewing the interface to produce a list of heuristic violations with severity ratings. The violations were sorted by heuristic and ordered from most to least severe within each heuristic. Results A total of 127 violations were identified with a median severity of 3 (range 0 to 4 with 0 = no problem to 4 = catastrophic problem). Results showed 13 violations for visibility (median severity = 2), 38 violations for match between system and real world (median severity = 2), 6 violations for user control and freedom (median severity = 3), 34 violations for consistency and standards (median severity = 2), 11 violations for error severity (median severity = 3), 1 violation for recognition and control (median severity = 3), 7 violations for flexibility and efficiency (median severity = 2), 9 violations for aesthetic and minimalist design (median severity = 2), 4 violations for help users recognize, diagnose, and recover from errors (median severity = 3), and 4 violations for help and documentation (median severity = 4). Conclusion We describe the heuristic evaluation method employed to assess the usability of PEMT, a method which uncovers heuristic violations in the interface design in a quick and efficient manner. Bringing together usability experts and health professionals to evaluate a computer-mediated patient education program can help to identify problems in a timely manner. This makes this method particularly well suited to the iterative design process when developing other computer-mediated health education programs. Heuristic evaluations provided a means to assess the user interface of PEMT. PMID:19897458
Direct heuristic dynamic programming for damping oscillations in a large power system.
Lu, Chao; Si, Jennie; Xie, Xiaorong
2008-08-01
This paper applies a neural-network-based approximate dynamic programming method, namely, the direct heuristic dynamic programming (direct HDP), to a large power system stability control problem. The direct HDP is a learning- and approximation-based approach to addressing nonlinear coordinated control under uncertainty. One of the major design parameters, the controller learning objective function, is formulated to directly account for network-wide low-frequency oscillation with the presence of nonlinearity, uncertainty, and coupling effect among system components. Results include a novel learning control structure based on the direct HDP with applications to two power system problems. The first case involves static var compensator supplementary damping control, which is used to provide a comprehensive evaluation of the learning control performance. The second case aims at addressing a difficult complex system challenge by providing a new solution to a large interconnected power network oscillation damping control problem that frequently occurs in the China Southern Power Grid.
Conflict monitoring in dual process theories of thinking.
De Neys, Wim; Glumicic, Tamara
2008-03-01
Popular dual process theories have characterized human thinking as an interplay between an intuitive-heuristic and a demanding-analytic reasoning process. Although monitoring the output of the two systems for conflict is crucial to avoid decision making errors, there are widely different views on the efficiency of the process. Kahneman [Kahneman, D. (2002). Maps of bounded rationality: A perspective on intuitive judgement and choice. Nobel Prize Lecture. Retrieved January 11, 2006, from: http://nobelprize.org/nobel_prizes/economics/laureates/2002/kahnemann-lecture.pdf] and Evans [Evans, J. St. B. T. (1984). Heuristic and analytic processing in reasoning. British Journal of Psychology, 75, 451-468], for example, claim that the monitoring of the heuristic system is typically quite lax, whereas others such as Sloman [Sloman, S. A. (1996). The empirical case for two systems of reasoning. Psychological Bulletin, 119, 3-22] and Epstein [Epstein, S. (1994). Integration of the cognitive and psychodynamic unconscious. American Psychologist, 49, 709-724] claim it is flawless and that people typically experience a struggle between what they "know" and "feel" in case of a conflict. The present study contrasted these views. Participants solved classic base rate neglect problems while thinking aloud. In these problems a stereotypical description cues a response that conflicts with the response based on the analytic base rate information. Verbal protocols showed no direct evidence for an explicitly experienced conflict. As Kahneman and Evans predicted, participants hardly ever mentioned the base rates and seemed to base their judgment exclusively on heuristic reasoning. However, more implicit measures of conflict detection, such as participants' retrieval of the base rate information in an unannounced recall test, decision making latencies, and the tendency to review the base rates, indicated that the base rates had been thoroughly processed. On control problems, where base rates and description did not conflict, this was not the case. Results suggest that whereas the popular characterization of conflict detection as an actively experienced struggle can be questioned, there is nevertheless evidence for Sloman's and Epstein's basic claim about the flawless operation of the monitoring. Whenever the base rates and description disagree, people will detect this conflict and consequently redirect attention towards a deeper processing of the base rates. Implications for the dual process framework and the rationality debate are discussed.
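The conflict these problems create is just Bayes' rule working against a vivid description. The snippet below shows the arithmetic with made-up numbers (not the study's materials): even when a description seems ten times more typical of the rare group, an extreme base rate still dominates the posterior.

```python
def posterior(base_rate, p_desc_given_a, p_desc_given_b):
    """Bayes' rule for a classic base-rate problem.

    `base_rate` is P(group A); the description favours group B when
    p_desc_given_b > p_desc_given_a.  All numbers are illustrative.
    """
    num = base_rate * p_desc_given_a
    return num / (num + (1 - base_rate) * p_desc_given_b)

# 995 members of group A, 5 of group B; the description sounds ten
# times more typical of B, yet the posterior still strongly favours A.
print(posterior(base_rate=0.995, p_desc_given_a=0.05, p_desc_given_b=0.5))  # ~0.95
```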
Doubravsky, Karel; Dohnal, Mirko
2015-01-01
Complex decision making tasks of different natures, e.g. economics, safety engineering, ecology and biology, are based on vague, sparse, partially inconsistent and subjective knowledge. Moreover, decision making economists/engineers are usually not willing to invest too much time into the study of complex formal theories. They require decisions which can be (re)checked by human-like common sense reasoning. One important problem related to realistic decision making tasks is incomplete data sets required by the chosen decision making algorithm. This paper presents a relatively simple algorithm for how some missing III (input information items) can be generated using mainly decision tree topologies and integrated into incomplete data sets. The algorithm is based on easy-to-understand heuristics, e.g. that a longer decision tree sub-path is less probable. These heuristics can solve decision problems under total ignorance, i.e. when the decision tree topology is the only information available. But in practice, isolated information items, e.g. some vaguely known probabilities (fuzzy probabilities), are usually available. It means that a realistic problem is analysed under partial ignorance. The proposed algorithm reconciles topology-related heuristics and additional fuzzy sets using fuzzy linear programming. The case study, represented by a tree with six lotteries and one fuzzy probability, is presented in detail. PMID:26158662
The Probability Heuristics Model of Syllogistic Reasoning.
ERIC Educational Resources Information Center
Chater, Nick; Oaksford, Mike
1999-01-01
Proposes a probability heuristic model for syllogistic reasoning and confirms the rationality of this heuristic by an analysis of the probabilistic validity of syllogistic reasoning that treats logical inference as a limiting case of probabilistic inference. Meta-analysis and two experiments involving 40 adult participants and using generalized…
Martelli, Fabrizio; Sassaroli, Angelo; Pifferi, Antonio; Torricelli, Alessandro; Spinelli, Lorenzo; Zaccanti, Giovanni
2007-12-24
The Green's function of the time dependent radiative transfer equation for the semi-infinite medium is derived for the first time by a heuristic approach based on the extrapolated boundary condition and on an almost exact solution for the infinite medium. Monte Carlo simulations, performed both in the simple case of isotropic scattering with an isotropic point-like source and in the more realistic case of anisotropic scattering with a pencil beam source, are used to validate the heuristic Green's function. Except for the very early times, the proposed solution has excellent accuracy (> 98% for the isotropic case and > 97% for the anisotropic case), significantly better than the diffusion equation. The use of this solution could be extremely useful in the biomedical optics field, where it can be directly employed in conditions where the use of the diffusion equation is limited, e.g. small volume samples, high absorption and/or low scattering media, short source-receiver distances and early times. It also represents a first step to derive tools for other geometries (e.g. slab and slab with inhomogeneities inside) of practical interest for noninvasive spectroscopy and diffuse optical imaging. Moreover, the proposed solution can be useful to several research fields where the study of a transport process is fundamental.
A Priori Knowledge and Heuristic Reasoning in Architectural Design.
ERIC Educational Resources Information Center
Rowe, Peter G.
1982-01-01
It is proposed that the various classes of a priori knowledge incorporated in heuristic reasoning processes exert a strong influence over architectural design activity. Some design problems require the exercise of a provisional set of rules, inferences, or plausible strategies, which entails heuristic reasoning. A case study illustrates this concept.…
Better ILP models for haplotype assembly.
Etemadi, Maryam; Bagherian, Mehri; Chen, Zhi-Zhong; Wang, Lusheng
2018-02-19
The haplotype assembly problem for diploid is to find a pair of haplotypes from a given set of aligned Single Nucleotide Polymorphism (SNP) fragments (reads). It has many applications in association studies, drug design, and genetic research. Since this problem is computationally hard, both heuristic and exact algorithms have been designed for it. Although exact algorithms are much slower, they are still of great interest because they usually output significantly better solutions than heuristic algorithms in terms of popular measures such as the Minimum Error Correction (MEC) score, the number of switch errors, and the QAN50 score. Exact algorithms are also valuable because they can be used to witness how good a heuristic algorithm is. The best known exact algorithm is based on integer linear programming (ILP) and it is known that ILP can also be used to improve the output quality of every heuristic algorithm with a little decline in speed. Therefore, faster ILP models for the problem are highly demanded. As in previous studies, we consider not only the general case of the problem but also its all-heterozygous case where we assume that if a column of the input read matrix contains at least one 0 and one 1, then it corresponds to a heterozygous SNP site. For both cases, we design new ILP models for the haplotype assembly problem which aim at minimizing the MEC score. The new models are theoretically better because they contain significantly fewer constraints. More importantly, our experimental results show that for both simulated and real datasets, the new model for the all-heterozygous (respectively, general) case can usually be solved via CPLEX (an ILP solver) at least 5 times (respectively, twice) faster than the previous bests. Indeed, the running time can sometimes be 41 times better. This paper proposes a new ILP model for the haplotype assembly problem and its all-heterozygous case, respectively. Experiments with both real and simulated datasets show that the new models can be solved within much shorter time by CPLEX than the previous bests. We believe that the models can be used to improve heuristic algorithms as well.
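The MEC objective the models above minimise has a compact definition: each read is charged the smaller number of corrections needed to make it agree with one of the two haplotypes. The sketch below scores a given haplotype pair against a set of reads; the data layout (reads as dicts over the SNP sites they cover) is an illustrative simplification, not the paper's ILP formulation.

```python
def mec_score(haplotype1, haplotype2, reads):
    """Minimum Error Correction score of a haplotype pair.

    Each read is a dict {snp_index: allele (0 or 1)} covering only the
    sites it spans; its cost is the smaller Hamming distance to either
    haplotype, and the MEC score is the sum over reads.
    """
    total = 0
    for read in reads:
        d1 = sum(read[i] != haplotype1[i] for i in read)
        d2 = sum(read[i] != haplotype2[i] for i in read)
        total += min(d1, d2)
    return total

h1, h2 = [0, 1, 0, 1], [1, 0, 1, 0]
reads = [{0: 0, 1: 1, 2: 0},        # matches h1 exactly
         {1: 0, 2: 1, 3: 1},        # closest to h2, one correction needed
         {0: 1, 3: 0}]              # matches h2 exactly
print(mec_score(h1, h2, reads))     # -> 1
```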
Assessing the use of cognitive heuristic representativeness in clinical reasoning.
Payne, Velma L; Crowley, Rebecca S; Crowley, Rebecca
2008-11-06
We performed a pilot study to investigate use of the cognitive heuristic Representativeness in clinical reasoning. We tested a set of tasks and assessments to determine whether subjects used the heuristics in reasoning, to obtain initial frequencies of heuristic use and related cognitive errors, and to collect cognitive process data using think-aloud techniques. The study investigates two aspects of the Representativeness heuristic - judging by perceived frequency and representativeness as causal beliefs. Results show that subjects apply both aspects of the heuristic during reasoning, and make errors related to misapplication of these heuristics. Subjects in this study rarely used base rates, showed significant variability in their recall of base rates, demonstrated limited ability to use provided base rates, and favored causal data in diagnosis. We conclude that the tasks and assessments we have developed provide a suitable test-bed to study the cognitive processes underlying heuristic errors.
Assessing Use of Cognitive Heuristic Representativeness in Clinical Reasoning
Payne, Velma L.; Crowley, Rebecca S.
2008-01-01
We performed a pilot study to investigate use of the cognitive heuristic Representativeness in clinical reasoning. We tested a set of tasks and assessments to determine whether subjects used the heuristics in reasoning, to obtain initial frequencies of heuristic use and related cognitive errors, and to collect cognitive process data using think-aloud techniques. The study investigates two aspects of the Representativeness heuristic - judging by perceived frequency and representativeness as causal beliefs. Results show that subjects apply both aspects of the heuristic during reasoning, and make errors related to misapplication of these heuristics. Subjects in this study rarely used base rates, showed significant variability in their recall of base rates, demonstrated limited ability to use provided base rates, and favored causal data in diagnosis. We conclude that the tasks and assessments we have developed provide a suitable test-bed to study the cognitive processes underlying heuristic errors. PMID:18999140
Case-based clinical reasoning in feline medicine: 3: Use of heuristics and illness scripts.
Whitehead, Martin L; Canfield, Paul J; Johnson, Robert; O'Brien, Carolyn R; Malik, Richard
2016-05-01
This is Article 3 of a three-part series on clinical reasoning that encourages practitioners to explore and understand how they think and make case-based decisions. It is hoped that, in the process, they will learn to trust their intuition but, at the same time, put in place safeguards to diminish the impact of bias and misguided logic on their diagnostic decision-making. Article 1, published in the January 2016 issue of JFMS, discussed the relative merits and shortcomings of System 1 thinking (immediate and unconscious) and System 2 thinking (effortful and analytical). In Article 2, published in the March 2016 issue, ways of managing cognitive error, particularly the negative impact of bias, in making a diagnosis were examined. This final article explores the use of heuristics (mental short cuts) and illness scripts in diagnostic reasoning. © The Author(s) 2016.
Focus of attention in an activity-based scheduler
NASA Technical Reports Server (NTRS)
Sadeh, Norman; Fox, Mark S.
1989-01-01
Earlier research in job shop scheduling has demonstrated the advantages of opportunistically combining order-based and resource-based scheduling techniques. An even more flexible approach is investigated where each activity is considered a decision point by itself. Heuristics to opportunistically select the next decision point on which to focus attention (i.e., variable ordering heuristics) and the next decision to be tried at this point (i.e., value ordering heuristics) are described that probabilistically account for both activity precedence and resource requirement interactions. Preliminary experimental results indicate that the variable ordering heuristic greatly increases search efficiency. While least constraining value ordering heuristics have been advocated in the literature, the experimental results suggest that other value ordering heuristics combined with our variable-ordering heuristic can produce much better schedules without significantly increasing search.
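Variable and value ordering have well-known textbook counterparts in constraint search: focus first on the most constrained decision point, and try the least constraining value first. The sketch below shows those two rules in their generic CSP form as orientation only; the scheduler above uses probabilistic, interaction-aware versions of these ideas, and all names here are hypothetical.

```python
def select_variable(unassigned, domains):
    """Variable ordering: pick the most constrained decision point
    (fewest remaining consistent values), analogous to choosing the
    activity whose scheduling options are most at risk."""
    return min(unassigned, key=lambda v: len(domains[v]))

def order_values(var, domains, count_conflicts):
    """Value ordering: try values in increasing order of the conflicts
    they would cause elsewhere ('least constraining' first)."""
    return sorted(domains[var], key=lambda val: count_conflicts(var, val))

doms = {"A": [1, 2, 3], "B": [1], "C": [1, 2]}
print(select_variable(["A", "B", "C"], doms))       # -> 'B' (only one value left)
print(order_values("A", doms, lambda v, x: x % 2))  # even values first
```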
NASA Astrophysics Data System (ADS)
Wibisono, E.; Santoso, A.; Sunaryo, M. A.
2017-11-01
XYZ is a distributor of various consumer goods products. The company plans its delivery routes daily, and in order to obtain route construction in a short amount of time, it simplifies the process by assigning drivers based on geographic regions. This approach results in inefficient use of vehicles, leading to imbalanced workloads. In this paper, we propose a combined method involving heuristics and optimization to obtain better solutions in acceptable computation time. The heuristic is based on time-oriented nearest neighbor (TONN) search and forms clusters if the number of locations is higher than a certain value. The optimization part uses a mathematical modeling formulation based on the vehicle routing problem with heterogeneous vehicles, time windows, and fixed costs (HVRPTWF), which is used to solve the routing problem within clusters. A case study using data from one month of the company's operations is analyzed, and data from one day of operations are detailed in this paper. The analysis shows that the proposed method results in 24% cost savings in that month, and savings can be as high as 54% in a day.
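The time-oriented nearest-neighbour idea can be illustrated with a greedy route builder: from the current stop, append the closest customer that can still be served within its time window, waiting if the vehicle arrives early. This is a simplified sketch of TONN-style construction, not the company's clustering procedure; the data layout and names are assumptions.

```python
import math

def tonn_route(depot, customers, speed=1.0):
    """Greedy time-oriented nearest-neighbour route construction.

    `customers` maps id -> (x, y, ready_time, due_time, service_time).
    From the current stop, the closest customer that can still be
    reached before its due time is appended next.
    """
    route, time, pos = [], 0.0, depot
    pending = dict(customers)
    while pending:
        feasible = {c: math.dist(pos, v[:2]) for c, v in pending.items()
                    if time + math.dist(pos, v[:2]) / speed <= v[3]}
        if not feasible:
            break                    # remaining customers need another vehicle
        nxt = min(feasible, key=feasible.get)
        x, y, ready, due, service = pending.pop(nxt)
        time = max(time + feasible[nxt] / speed, ready) + service
        route.append(nxt)
        pos = (x, y)
    return route

cust = {"c1": (1, 0, 0, 10, 1), "c2": (2, 0, 0, 5, 1), "c3": (0, 5, 8, 20, 1)}
print(tonn_route((0, 0), cust))      # -> ['c1', 'c2', 'c3']
```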
Heuristic Reasoning and Beliefs on Immigration: An Approach to an Intercultural Education Programme
ERIC Educational Resources Information Center
Navarro, Santiago Palacios; Lopez de Arechavaleta, Blanca Olalde
2010-01-01
People use mental shortcuts to simplify the amount of information they receive from the environment. Heuristic reasoning can be included among these mental shortcuts. In general, heuristics are useful for making fast decisions and judgements, but in certain cases they may lead to systematic errors because some relevant aspects presented in the given…
Query Optimization in Distributed Databases.
1982-10-01
In general, one semijoin-based query-processing strategy is more time-consuming than the other and is usually not used. A further focus of the report is the study of the analytic behavior of those heuristic algorithms, although some analytic results of worst-case and average-case analysis are difficult to obtain.
Mohan, Deepika; Rosengart, Matthew R; Fischhoff, Baruch; Angus, Derek C; Farris, Coreen; Yealy, Donald M; Wallace, David J; Barnato, Amber E
2016-11-11
Between 30% and 40% of patients with severe injuries receive treatment at non-trauma centers (under-triage), largely because of physician decision making. Existing interventions to improve triage by physicians ignore the role that intuition (heuristics) plays in these decisions. One such heuristic is to form an initial impression based on representativeness (how typical a patient appears of one with severe injuries). We created a video game (Night Shift) to recalibrate physicians' representativeness heuristic in trauma triage. We developed Night Shift in collaboration with emergency medicine physicians, trauma surgeons, behavioral scientists, and game designers. Players take on the persona of Andy Jordan, an emergency medicine physician, who accepts a new job in a small town. Through a series of cases that go awry, they gain experience with the contextual cues that distinguish patients with minor and severe injuries (based on the theory of analogical encoding) and receive emotionally-laden feedback on their performance (based on the theory of narrative engagement). The planned study will compare the effect of Night Shift with that of an educational program on physician triage decisions and on physician heuristics. Psychological theory predicts that cognitive load increases reliance on heuristics, thereby increasing the under-triage rate when heuristics are poorly calibrated. We will randomize physicians (n = 366) either to play the game or to review an educational program, and will assess performance using a validated virtual simulation. The validated simulation includes both control and cognitive load conditions. We will compare rates of under-triage after exposure to the two interventions (primary outcome) and will compare the effect of cognitive load on physicians' under-triage rates (secondary outcome). We hypothesize that: a) physicians exposed to Night Shift will have lower rates of under-triage compared to those exposed to the educational program, and b) cognitive load will not degrade triage performance among physicians exposed to Night Shift as much as it will among those exposed to the educational program. Serious games offer a new approach to the problem of poorly-calibrated heuristics in trauma triage. The results of this trial will contribute to the understanding of physician quality improvement and the efficacy of video games as behavioral interventions. clinicaltrials.gov; NCT02857348; August 2, 2016.
Maximum likelihood of phylogenetic networks.
Jin, Guohua; Nakhleh, Luay; Snir, Sagi; Tuller, Tamir
2006-11-01
Horizontal gene transfer (HGT) is believed to be ubiquitous among bacteria, and plays a major role in their genome diversification as well as their ability to develop resistance to antibiotics. In light of its evolutionary significance and implications for human health, developing accurate and efficient methods for detecting and reconstructing HGT is imperative. In this article we provide a new HGT-oriented likelihood framework for many problems that involve phylogeny-based HGT detection and reconstruction. Besides the formulation of various likelihood criteria, we show that most of these problems are NP-hard, and offer heuristics for efficient and accurate reconstruction of HGT under these criteria. We implemented our heuristics and used them to analyze biological as well as synthetic data. In both cases, our criteria and heuristics exhibited very good performance with respect to identifying the correct number of HGT events as well as inferring their correct location on the species tree. Implementation of the criteria as well as heuristics and hardness proofs are available from the authors upon request. Hardness proofs can also be downloaded at http://www.cs.tau.ac.il/~tamirtul/MLNET/Supp-ML.pdf
Heuristic thinking and human intelligence: a commentary on Marewski, Gaissmaier and Gigerenzer.
Evans, Jonathan St B T; Over, David E
2010-05-01
Marewski, Gaissmaier and Gigerenzer (2009) present a review of research on fast and frugal heuristics, arguing that complex problems are best solved by simple heuristics, rather than the application of knowledge and logical reasoning. We argue that the case for such heuristics is overrated. First, we point out that heuristics can often lead to biases as well as effective responding. Second, we show that the application of logical reasoning can be both necessary and relatively simple. Finally, we argue that the evidence for a logical reasoning system that co-exists with simpler heuristic forms of thinking is overwhelming. Not only is it implausible a priori that we would have evolved such a system that is of no use to us, but extensive evidence from the literature on dual processing in reasoning and judgement shows that many problems can only be solved when this form of reasoning is used to inhibit and override heuristic thinking.
NASA Astrophysics Data System (ADS)
Huang, J. D.; Liu, J. J.; Chen, Q. X.; Mao, N.
2017-06-01
Against a background of heat-treatment operations in mould manufacturing, a two-stage flow-shop scheduling problem is described for minimizing makespan with parallel batch-processing machines and re-entrant jobs. The weights and release dates of jobs are non-identical, but job processing times are equal. A mixed-integer linear programming model is developed and tested with small-scale scenarios. Given that the problem is NP-hard, three heuristic construction methods with polynomial complexity are proposed. The worst case of the new constructive heuristic is analysed in detail. A method for computing lower bounds is proposed to test heuristic performance. Heuristic efficiency is tested with sets of scenarios. Compared with the two improved heuristics, the performance of the new constructive heuristic is superior.
ERIC Educational Resources Information Center
Munoz, Marco A.; Rodosky, Robert J.
2011-01-01
This case study provides an illustration of the heuristic practices of a high-performing research department, which in turn, will help build much needed models applicable in the context of large urban districts. This case study examines the accountability, planning, evaluation, testing, and research functions of a research department in a large…
Deriving a Set of Privacy Specific Heuristics for the Assessment of PHRs (Personal Health Records).
Furano, Riccardo F; Kushniruk, Andre; Barnett, Jeff
2017-01-01
With the emergence of personal health record (PHR) platforms becoming more widely available, this research focused on the development of privacy heuristics to assess PHRs regarding privacy. Existing sets of heuristics are typically not application specific and do not address patient-centric privacy as a main concern prior to undergoing PHR procurement. A set of privacy specific heuristics were developed based on a scoping review of the literature. An internet-based commercially available, vendor specific PHR application was evaluated using the derived set of privacy specific heuristics. The proposed set of privacy specific derived heuristics is explored in detail in relation to ISO 29100. The assessment of the internet-based commercially available, vendor specific PHR application indicated numerous violations. These violations were noted within the study. It is argued that the new derived privacy heuristics should be used in addition to Nielsen's well-established set of heuristics. Privacy specific heuristics could be used to assess PHR portal system-level privacy mechanisms in the procurement process of a PHR application and may prove to be a beneficial form of assessment to prevent the selection of a PHR platform with a poor privacy specific interface design.
Hybrid glowworm swarm optimization for task scheduling in the cloud environment
NASA Astrophysics Data System (ADS)
Zhou, Jing; Dong, Shoubin
2018-06-01
In recent years many heuristic algorithms have been proposed to solve task scheduling problems in the cloud environment owing to their optimization capability. This article proposes a hybrid glowworm swarm optimization (HGSO) based on glowworm swarm optimization (GSO), which uses a technique of evolutionary computation, a strategy of quantum behaviour based on the principle of neighbourhood, offspring production and random walk, to achieve more efficient scheduling with reasonable scheduling costs. The proposed HGSO reduces the redundant computation and the dependence on the initialization of GSO, accelerates the convergence and more easily escapes from local optima. The conducted experiments and statistical analysis showed that in most cases the proposed HGSO algorithm outperformed previous heuristic algorithms to deal with independent tasks.
Self-organization in a distributed coordination game through heuristic rules
NASA Astrophysics Data System (ADS)
Agarwal, Shubham; Ghosh, Diptesh; Chakrabarti, Anindya S.
2016-12-01
In this paper, we consider a distributed coordination game played by a large number of agents with finite information sets, which characterizes emergence of a single dominant attribute out of a large number of competitors. Formally, N agents play a coordination game repeatedly, which has exactly N pure strategy Nash equilibria, and all of the equilibria are equally preferred by the agents. The problem is to select one equilibrium out of N possible equilibria in the least number of attempts. We propose a number of heuristic rules based on reinforcement learning to solve the coordination problem. We see that the agents self-organize into clusters with varying intensities depending on the heuristic rule applied, although all clusters but one are transitory in most cases. Finally, we characterize a trade-off in terms of the time requirement to achieve a degree of stability in strategies versus the efficiency of such a solution.
Task Assignment Heuristics for Parallel and Distributed CFD Applications
NASA Technical Reports Server (NTRS)
Lopez-Benitez, Noe; Djomehri, M. Jahed; Biswas, Rupak
2003-01-01
This paper proposes a task graph (TG) model to represent a single discrete step of multi-block overset grid computational fluid dynamics (CFD) applications. The TG model is then used to not only balance the computational workload across the overset grids but also to reduce inter-grid communication costs. We have developed a set of task assignment heuristics based on the constraints inherent in this class of CFD problems. Two basic assignments, the smallest task first (STF) and the largest task first (LTF), are first presented. They are then systematically refined to reduce inter-grid communication costs. To predict the performance of the proposed task assignment heuristics, extensive performance evaluations are conducted on a synthetic TG with tasks defined in terms of the number of grid points in predetermined overlapping grids. A TG derived from a realistic problem with eight million grid points is also used as a test case.
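A small sketch of the two basic assignments mentioned above, smallest task first (STF) and largest task first (LTF), under the usual greedy reading: order the tasks and place each on the currently least-loaded processor. The task names and sizes are hypothetical, and the communication-cost refinements are not shown.

def assign_tasks(task_sizes, n_procs, rule="LTF"):
    """Greedy load balancing: order tasks by size (STF ascending, LTF descending)
    and place each on the currently least-loaded processor (illustrative only)."""
    order = sorted(task_sizes.items(), key=lambda kv: kv[1], reverse=(rule == "LTF"))
    loads = [0.0] * n_procs
    assignment = {}
    for task, size in order:
        p = min(range(n_procs), key=loads.__getitem__)   # least-loaded processor
        assignment[task] = p
        loads[p] += size
    return assignment, loads

# Example: grid blocks sized by number of grid points, mapped to 3 processors.
tasks = {"blk0": 1.2e6, "blk1": 0.4e6, "blk2": 2.0e6, "blk3": 0.9e6}
print(assign_tasks(tasks, 3, rule="LTF"))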
Sequence-based heuristics for faster annotation of non-coding RNA families.
Weinberg, Zasha; Ruzzo, Walter L
2006-01-01
Non-coding RNAs (ncRNAs) are functional RNA molecules that do not code for proteins. Covariance Models (CMs) are a useful statistical tool to find new members of an ncRNA gene family in a large genome database, using both sequence and, importantly, RNA secondary structure information. Unfortunately, CM searches are extremely slow. Previously, we created rigorous filters, which provably sacrifice none of a CM's accuracy, while making searches significantly faster for virtually all ncRNA families. However, these rigorous filters make searches slower than heuristics could be. In this paper we introduce profile HMM-based heuristic filters. We show that their accuracy is usually superior to heuristics based on BLAST. Moreover, we compared our heuristics with those used in tRNAscan-SE, whose heuristics incorporate a significant amount of work specific to tRNAs, where our heuristics are generic to any ncRNA. Performance was roughly comparable, so we expect that our heuristics provide a high-quality solution that--unlike family-specific solutions--can scale to hundreds of ncRNA families. The source code is available under GNU Public License at the supplementary web site.
NASA Astrophysics Data System (ADS)
Wahid, Juliana; Hussin, Naimah Mohd
2016-08-01
The construction of a population of initial solutions is a crucial task in population-based metaheuristic approaches for solving the curriculum-based university course timetabling problem, because it can affect the convergence speed and also the quality of the final solution. This paper presents an exploration of combinations of graph heuristics in the construction approach for the curriculum-based course timetabling problem to produce a population of initial solutions. The graph heuristics were applied both singly and in combinations of two heuristics. In addition, several ways of assigning courses to rooms and timeslots are implemented. All heuristic settings are then tested on the same curriculum-based course timetabling problem instances and are compared with each other in terms of the number of initial solutions produced. The results show that the combination of saturation degree followed by largest degree produces the highest number of initial solutions. The results from this study can be used in the improvement phase of algorithms that use a population of initial solutions.
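An illustrative sketch of one such combination, saturation degree followed by largest degree, used to build a single initial timetable; the conflict data, the timeslot count and the omission of room assignment are simplifications, not the paper's full construction procedure.

def greedy_timetable(conflicts, n_timeslots):
    """Pick the course with the highest saturation degree (number of distinct
    timeslots already used by conflicting courses), breaking ties by largest
    degree (number of conflicting courses), and give it the first free timeslot."""
    assigned = {}
    courses = set(conflicts)
    while courses:
        def key(c):
            sat = len({assigned[n] for n in conflicts[c] if n in assigned})
            return (sat, len(conflicts[c]))
        c = max(courses, key=key)
        used = {assigned[n] for n in conflicts[c] if n in assigned}
        free = [t for t in range(n_timeslots) if t not in used]
        if not free:
            return None                    # construction failed; caller retries or repairs
        assigned[c] = free[0]
        courses.remove(c)
    return assigned

# conflicts: course -> set of courses sharing students/lecturers (hypothetical data)
print(greedy_timetable({"A": {"B", "C"}, "B": {"A"}, "C": {"A"}}, n_timeslots=2))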
Learning to improve iterative repair scheduling
NASA Technical Reports Server (NTRS)
Zweben, Monte; Davis, Eugene
1992-01-01
This paper presents a general learning method for dynamically selecting between repair heuristics in an iterative repair scheduling system. The system employs a version of explanation-based learning called Plausible Explanation-Based Learning (PEBL) that uses multiple examples to confirm conjectured explanations. The basic approach is to conjecture contradictions between a heuristic and statistics that measure the quality of the heuristic. When these contradictions are confirmed, a different heuristic is selected. To motivate the utility of this approach we present an empirical evaluation of the performance of a scheduling system with respect to two different repair strategies. We show that the scheduler that learns to choose between the heuristics outperforms the same scheduler with either of the two heuristics alone.
Analysis of Levene's Test under Design Imbalance.
ERIC Educational Resources Information Center
Keyes, Tim K.; Levy, Martin S.
1997-01-01
H. Levene (1960) proposed a heuristic test for heteroscedasticity in the case of a balanced two-way layout, based on analysis of variance of absolute residuals. Conditions under which design imbalance affects the test's characteristics are identified, and a simple correction involving leverage is proposed. (SLD)
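A compact illustration of the underlying heuristic, assuming the usual formulation: replace each observation by its absolute deviation from its group mean and run a one-way ANOVA on those deviations (scipy's ready-made scipy.stats.levene is shown alongside for comparison). The leverage-based correction for imbalance proposed in the article is not reproduced.

import numpy as np
from scipy.stats import f_oneway, levene

groups = [np.array([3.1, 2.9, 3.4, 3.0]),       # hypothetical cell data
          np.array([2.2, 2.8, 3.9, 1.7, 3.5])]  # deliberately unequal group sizes

# Levene's heuristic: one-way ANOVA on absolute residuals from each group mean.
abs_resid = [np.abs(g - g.mean()) for g in groups]
F, p = f_oneway(*abs_resid)
print(F, p)

# Library version for comparison (center='mean' matches the mean-based variant).
print(levene(*groups, center='mean'))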
ERIC Educational Resources Information Center
Thies, Philip Andrew
2017-01-01
The purpose of this narratological heuristic multiple case study is to describe the specific components that teachers need in both their knowledge and skills to meet the individual needs of their students in a blended learning classroom. The study was conducted in six schools--elementary, middle, and high schools--located in a suburban district.…
Cognition of an expert tackling an unfamiliar conceptual physics problem
NASA Astrophysics Data System (ADS)
Schuster, David; Undreiu, Adriana
2009-11-01
We have investigated and analyzed the cognition of an expert tackling a qualitative conceptual physics problem of an unfamiliar type. Our goal was to elucidate the detailed cognitive processes and knowledge elements involved, irrespective of final solution form, and consider implications for instruction. The basic but non-trivial problem was to find qualitatively the direction of acceleration of a pendulum bob at various stages of its motion, a problem originally studied by Reif and Allen. Methodology included interviews, introspection, retrospection and self-reported metacognition. Multiple facets of cognition were revealed, with different reasoning strategies used at different stages and for different points on the path. An account is given of the zigzag thinking paths and interplay of reasoning modes and schema elements involved. We interpret the cognitive processes in terms of theoretical concepts that emerged, namely: case-based, principle-based, experiential-intuitive and practical-heuristic reasoning; knowledge elements and schemata; activation; metacognition and epistemic framing. The complexity of cognition revealed in this case study contrasts with the tidy principle-based solutions we present to students. The pervasive role of schemata, case-based reasoning, practical heuristic strategies, and their interplay with physics principles is noteworthy, since these aspects of cognition are generally neither recognized nor taught. The schema/reasoning-mode perspective has direct application in science teaching, learning and problem-solving.
ERIC Educational Resources Information Center
Honda, Hidehito; Matsuka, Toshihiko; Ueda, Kazuhiro
2017-01-01
Some researchers on binary choice inference have argued that people make inferences based on simple heuristics, such as recognition, fluency, or familiarity. Others have argued that people make inferences based on available knowledge. To examine the boundary between heuristic and knowledge usage, we examine binary choice inference processes in…
Hyper-heuristics with low level parameter adaptation.
Ren, Zhilei; Jiang, He; Xuan, Jifeng; Luo, Zhongxuan
2012-01-01
Recent years have witnessed the great success of hyper-heuristics applied to numerous real-world applications. Hyper-heuristics raise the generality of search methodologies by manipulating a set of low level heuristics (LLHs) to solve problems, and aim to automate the algorithm design process. However, those LLHs are usually parameterized, which may contradict the domain independent motivation of hyper-heuristics. In this paper, we show how to automatically maintain low level parameters (LLPs) using a hyper-heuristic with LLP adaptation (AD-HH), and exemplify the feasibility of AD-HH by adaptively maintaining the LLPs for two hyper-heuristic models. Furthermore, aiming at tackling the search space expansion due to the LLP adaptation, we apply a heuristic space reduction (SAR) mechanism to improve the AD-HH framework. The integration of the LLP adaptation and the SAR mechanism is able to explore the heuristic space more effectively and efficiently. To evaluate the performance of the proposed algorithms, we choose the p-median problem as a case study. The empirical results show that with the adaptation of the LLPs and the SAR mechanism, the proposed algorithms are able to achieve competitive results over the three heterogeneous classes of benchmark instances.
Generation of structural topologies using efficient technique based on sorted compliances
NASA Astrophysics Data System (ADS)
Mazur, Monika; Tajs-Zielińska, Katarzyna; Bochenek, Bogdan
2018-01-01
Topology optimization, although well recognized, is still being widely developed. It has recently gained more attention as large computational capability has become available to designers. This process is stimulated simultaneously by a variety of emerging, innovative optimization methods. It is observed that traditional gradient-based mathematical programming algorithms are, in many cases, replaced by novel and efficient heuristic methods inspired by biological, chemical or physical phenomena. These methods have become useful tools for structural optimization because of their versatility and easy numerical implementation. In this paper an engineering implementation of a novel heuristic algorithm for minimum compliance topology optimization is discussed. The performance of the topology generator is based on the implementation of a special function utilizing information on the compliance distribution within the design space. With a view to coping with engineering problems, the algorithm has been combined with the structural analysis system Ansys.
Drake, John H; Özcan, Ender; Burke, Edmund K
2016-01-01
Hyper-heuristics are high-level methodologies for solving complex problems that operate on a search space of heuristics. In a selection hyper-heuristic framework, a heuristic is chosen from an existing set of low-level heuristics and applied to the current solution to produce a new solution at each point in the search. The use of crossover low-level heuristics is possible in an increasing number of general-purpose hyper-heuristic tools such as HyFlex and Hyperion. However, little work has been undertaken to assess how best to utilise it. Since a single-point search hyper-heuristic operates on a single candidate solution, and two candidate solutions are required for crossover, a mechanism is required to control the choice of the other solution. The frameworks we propose maintain a list of potential solutions for use in crossover. We investigate the use of such lists at two conceptual levels. First, crossover is controlled at the hyper-heuristic level where no problem-specific information is required. Second, it is controlled at the problem domain level where problem-specific information is used to produce good-quality solutions to use in crossover. A number of selection hyper-heuristics are compared using these frameworks over three benchmark libraries with varying properties for an NP-hard optimisation problem: the multidimensional 0-1 knapsack problem. It is shown that allowing crossover to be managed at the domain level outperforms managing crossover at the hyper-heuristic level in this problem domain.
NASA Astrophysics Data System (ADS)
Yamamoto, Takanori; Bannai, Hideo; Nagasaki, Masao; Miyano, Satoru
We present new decomposition heuristics for finding the optimal solution for the maximum-weight connected graph problem, which is known to be NP-hard. Previous optimal algorithms for solving the problem decompose the input graph into subgraphs using heuristics based on node degree. We propose new heuristics based on betweenness centrality measures, and show through computational experiments that our new heuristics tend to reduce the number of subgraphs in the decomposition, and therefore could lead to the reduction in computational time for finding the optimal solution. The method is further applied to analysis of biological pathway data.
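One plausible reading of the betweenness-based decomposition, sketched with networkx: repeatedly delete the node with the highest betweenness centrality until every connected component is no larger than a chosen size. The size threshold and test graph are hypothetical, and the interaction with the exact maximum-weight connected subgraph solver is omitted.

import networkx as nx

def decompose_by_betweenness(G, max_size):
    """Split G into subgraphs of at most max_size nodes by deleting high-betweenness nodes."""
    H = G.copy()
    removed = []
    while max(len(c) for c in nx.connected_components(H)) > max_size:
        bc = nx.betweenness_centrality(H)
        v = max(bc, key=bc.get)            # the most "between" node acts as a separator
        removed.append(v)
        H.remove_node(v)
    return [G.subgraph(c) for c in nx.connected_components(H)], removed

G = nx.barbell_graph(5, 1)                 # two cliques joined by one bridge node
parts, separators = decompose_by_betweenness(G, max_size=5)
print([sorted(p.nodes) for p in parts], separators)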
Perceived breast cancer risk: heuristic reasoning and search for a dominance structure.
Katapodi, Maria C; Facione, Noreen C; Humphreys, Janice C; Dodd, Marylin J
2005-01-01
Studies suggest that people construct their risk perceptions by using inferential rules called heuristics. The purpose of this study was to identify heuristics that influence perceived breast cancer risk. We examined 11 interviews from women of diverse ethnic/cultural backgrounds who were recruited from community settings. Narratives in which women elaborated about their own breast cancer risk were analyzed with Argument and Heuristic Reasoning Analysis methodology, which is based on applied logic. The availability, simulation, representativeness, affect, and perceived control heuristics, and search for a dominance structure were commonly used for making risk assessments. Risk assessments were based on experiences with an abnormal breast symptom, experiences with affected family members and friends, beliefs about living a healthy lifestyle, and trust in health providers. Assessment of the potential threat of a breast symptom was facilitated by the search for a dominance structure. Experiences with family members and friends were incorporated into risk assessments through the availability, simulation, representativeness, and affect heuristics. Mistrust in health providers led to an inappropriate dependence on the perceived control heuristic. Identified heuristics appear to create predictable biases and suggest that perceived breast cancer risk is based on common cognitive patterns.
2017-03-01
ARL-TN-0814, March 2017, US Army Research Laboratory: Usability Study and Heuristic Evaluation of the Applied Robotics for Installations and Base Operations (ARIBO) Driverless Vehicle Reservation Application (ARIBO Mobile).
Heuristics for Relevancy Ranking of Earth Dataset Search Results
NASA Astrophysics Data System (ADS)
Lynnes, C.; Quinn, P.; Norton, J.
2016-12-01
As the variety of Earth science datasets increases, science researchers find it more challenging to discover and select the datasets that best fit their needs. The most common way for search providers to address this problem is to rank the datasets returned for a query by their likely relevance to the user. Large web page search engines typically use text matching supplemented with reverse link counts, semantic annotations and user intent modeling. However, this produces uneven results when applied to dataset metadata records simply externalized as a web page. Fortunately, data and search providers have decades of experience in serving data user communities, allowing them to form heuristics that leverage the structure in the metadata together with knowledge about the user community. Some of these heuristics include specific ways of matching the user input to the essential measurements in the dataset and determining overlaps of time range and spatial areas. Heuristics based on the novelty of the datasets can prioritize later, better versions of data over similar predecessors. And knowledge of how different user types and communities use data can be brought to bear in cases where characteristics of the user (discipline, expertise) or their intent (applications, research) can be divined. The Earth Observing System Data and Information System has begun implementing some of these heuristics in the relevancy algorithm of its Common Metadata Repository search engine.
Heuristics for Relevancy Ranking of Earth Dataset Search Results
NASA Technical Reports Server (NTRS)
Lynnes, Christopher; Quinn, Patrick; Norton, James
2016-01-01
As the variety of Earth science datasets increases, science researchers find it more challenging to discover and select the datasets that best fit their needs. The most common way for search providers to address this problem is to rank the datasets returned for a query by their likely relevance to the user. Large web page search engines typically use text matching supplemented with reverse link counts, semantic annotations and user intent modeling. However, this produces uneven results when applied to dataset metadata records simply externalized as a web page. Fortunately, data and search providers have decades of experience in serving data user communities, allowing them to form heuristics that leverage the structure in the metadata together with knowledge about the user community. Some of these heuristics include specific ways of matching the user input to the essential measurements in the dataset and determining overlaps of time range and spatial areas. Heuristics based on the novelty of the datasets can prioritize later, better versions of data over similar predecessors. And knowledge of how different user types and communities use data can be brought to bear in cases where characteristics of the user (discipline, expertise) or their intent (applications, research) can be divined. The Earth Observing System Data and Information System has begun implementing some of these heuristics in the relevancy algorithm of its Common Metadata Repository search engine.
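A toy scoring function in the spirit of the heuristics described above; the field names, weights and linear combination are assumptions for illustration, not the Common Metadata Repository's actual relevancy algorithm.

def relevance_score(query, dataset):
    """Combine a few metadata-aware heuristics into one relevancy score (illustrative)."""
    # 1. Does the query text match the dataset's essential measurement names?
    terms = set(query["text"].lower().split())
    measurement_hit = any(t in m.lower() for t in terms for m in dataset["measurements"])

    # 2. Fractional overlap of the requested time range with the dataset's coverage.
    q0, q1 = query["time_range"]; d0, d1 = dataset["time_range"]
    overlap = max(0.0, min(q1, d1) - max(q0, d0))
    time_frac = overlap / (q1 - q0) if q1 > q0 else 0.0

    # 3. Novelty: prefer later versions of otherwise similar products.
    version_boost = 0.1 * dataset.get("version", 1)

    return 2.0 * measurement_hit + 1.0 * time_frac + version_boost

query = {"text": "sea surface temperature", "time_range": (2000, 2010)}
ds = {"measurements": ["Sea Surface Temperature"], "time_range": (1998, 2005), "version": 4}
print(relevance_score(query, ds))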
An Examination of xMOOCs: An Embedded Single Case Study Based on Conole's 12 Dimensions
ERIC Educational Resources Information Center
Kocdar, Serpil; Okur, M. Recep; Bozkurt, Aras
2017-01-01
This study intends to examine the xMOOCs offered by one of the mainstream MOOC platforms in Conole's 12 dimensions. For this purpose, the research employed an embedded single case study using heuristic inquiry to collect data. The researchers participated in three xMOOCs and took into consideration the characteristics of these MOOCs by rating them…
A Tabu-Search Heuristic for Deterministic Two-Mode Blockmodeling of Binary Network Matrices.
Brusco, Michael; Steinley, Douglas
2011-10-01
Two-mode binary data matrices arise in a variety of social network contexts, such as the attendance or non-attendance of individuals at events, the participation or lack of participation of groups in projects, and the votes of judges on cases. A popular method for analyzing such data is two-mode blockmodeling based on structural equivalence, where the goal is to identify partitions for the row and column objects such that the clusters of the row and column objects form blocks that are either complete (all 1s) or null (all 0s) to the greatest extent possible. Multiple restarts of an object relocation heuristic that seeks to minimize the number of inconsistencies (i.e., 1s in null blocks and 0s in complete blocks) with ideal block structure is the predominant approach for tackling this problem. As an alternative, we propose a fast and effective implementation of tabu search. Computational comparisons across a set of 48 large network matrices revealed that the new tabu-search heuristic always provided objective function values that were better than those of the relocation heuristic when the two methods were constrained to the same amount of computation time.
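A short sketch of the objective such a search minimises: each block is scored as the cheaper of treating it as complete (count its 0s) or null (count its 1s), and the costs are summed over blocks. The matrix and cluster labels are hypothetical, and the tabu moves themselves are not shown.

import numpy as np

def blockmodel_cost(X, row_labels, col_labels):
    """Total inconsistencies with an ideal complete/null block structure."""
    cost = 0
    for r in set(row_labels):
        for c in set(col_labels):
            block = X[np.ix_([i for i, l in enumerate(row_labels) if l == r],
                             [j for j, l in enumerate(col_labels) if l == c])]
            ones = int(block.sum())
            zeros = block.size - ones
            cost += min(ones, zeros)       # best of "null block" vs "complete block"
    return cost

X = np.array([[1, 1, 0],
              [1, 1, 0],
              [0, 0, 1]])
print(blockmodel_cost(X, row_labels=[0, 0, 1], col_labels=[0, 0, 1]))   # perfect structure: 0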
A Variable-Selection Heuristic for K-Means Clustering.
ERIC Educational Resources Information Center
Brusco, Michael J.; Cradit, J. Dennis
2001-01-01
Presents a variable selection heuristic for nonhierarchical (K-means) cluster analysis based on the adjusted Rand index for measuring cluster recovery. Subjected the heuristic to Monte Carlo testing across more than 2,200 datasets. Results indicate that the heuristic is extremely effective at eliminating masking variables. (SLD)
Blumenthal-Barby, J S; Krieger, Heather
2015-05-01
The role of cognitive biases and heuristics in medical decision making is of growing interest. The purpose of this study was to determine whether studies on cognitive biases and heuristics in medical decision making are based on actual or hypothetical decisions and are conducted with populations that are representative of those who typically make the medical decision; to categorize the types of cognitive biases and heuristics found and whether they are found in patients or in medical personnel; and to critically review the studies based on standard methodological quality criteria. Data sources were original, peer-reviewed, empirical studies on cognitive biases and heuristics in medical decision making found in Ovid Medline, PsycINFO, and the CINAHL databases published in 1980-2013. Predefined exclusion criteria were used to identify 213 studies. During data extraction, information was collected on type of bias or heuristic studied, respondent population, decision type, study type (actual or hypothetical), study method, and study conclusion. Of the 213 studies analyzed, 164 (77%) were based on hypothetical vignettes, and 175 (82%) were conducted with representative populations. Nineteen types of cognitive biases and heuristics were found. Only 34% of studies (n = 73) investigated medical personnel, and 68% (n = 145) confirmed the presence of a bias or heuristic. Each methodological quality criterion was satisfied by more than 50% of the studies, except for sample size and validated instruments/questions. Limitations are that existing terms were used to inform search terms, and study inclusion criteria focused strictly on decision making. Most of the studies on biases and heuristics in medical decision making are based on hypothetical vignettes, raising concerns about applicability of these findings to actual decision making. Biases and heuristics have been underinvestigated in medical personnel compared with patients. © The Author(s) 2014.
ERIC Educational Resources Information Center
Kustos, Paul Nicholas
2010-01-01
Student difficulty in the study of probability arises in intuitively-based misconceptions derived from heuristics. One such heuristic, the one of note for this research study, is that of representativeness, in which an individual informally assesses the probability of an event based on the degree to which the event is similar to the sample from…
Using Curriculum-Based Measurements for Program Evaluation: Expanding Roles for School Psychologists
ERIC Educational Resources Information Center
Tusing, Mary E.; Breikjern, Nicholle A.
2017-01-01
Educators increasingly need to evaluate schoolwide reform efforts; however, complex program evaluations often are not feasible in schools. Through a case example, we provide a heuristic for program evaluation that is easily replicated in schools. Criterion-referenced interpretations of schoolwide screening data were used to evaluate outcomes…
The Ethics of Archival Research
ERIC Educational Resources Information Center
McKee, Heidi A.; Porter, James E.
2012-01-01
What are the key ethical issues involved in conducting archival research? Based on examination of cases and interviews with leading archival researchers in composition, this article discusses several ethical questions and offers a heuristic to guide ethical decision making. Key to this process is recognizing the person-ness of archival materials.…
Research Supervision: The Research Management Matrix
ERIC Educational Resources Information Center
Maxwell, T. W.; Smyth, Robyn
2010-01-01
We briefly make a case for re-conceptualising research project supervision/advising as the consideration of three inter-related areas: the learning and teaching process; developing the student; and producing the research project/outcome as a social practice. We use this as our theoretical base for an heuristic tool, "the research management…
ERIC Educational Resources Information Center
Haley, Keri; Allsopp, David; Hoppey, David
2018-01-01
Advocating for your child with a disability can be a daunting task for any parent. When the parent is also a school district employee, determining whether advocacy could impact one's position as an employee becomes inherently problematic. Using a heuristic case study approach, this inquiry's intent is to understand the experiences, barriers, and…
Using heuristic evaluations to assess the safety of health information systems.
Carvalho, Christopher J; Borycki, Elizabeth M; Kushniruk, Andre W
2009-01-01
Health information systems (HISs) are typically seen as a mechanism for reducing medical errors. There is, however, evidence that technology may actually be the cause of errors. As a result, it is crucial to fully test any system prior to its implementation. At present, evidence-based evaluation heuristics do not exist for assessing aspects of interface design that lead to medical errors. A three-phase study was conducted to develop evidence-based heuristics for evaluating interfaces. Phase 1 consisted of a systematic review of the literature. In Phase 2 a comprehensive list of 33 evaluation heuristics was developed, based on the review, that could be used to test for potential technology-induced errors. Phase 3 involved applying these healthcare-specific heuristics to evaluate a HIS.
Using Heuristic Task Analysis to Create Web-Based Instructional Design Theory
ERIC Educational Resources Information Center
Fiester, Herbert R.
2010-01-01
The first purpose of this study was to identify procedural and heuristic knowledge used when creating web-based instruction. The second purpose of this study was to develop suggestions for improving the Heuristic Task Analysis process, a technique for eliciting, analyzing, and representing expertise in cognitively complex tasks. Three expert…
Heuristic thinking makes a chemist smart.
Graulich, Nicole; Hopf, Henning; Schreiner, Peter R
2010-05-01
We focus on the virtually neglected use of heuristic principles in understanding and teaching of organic chemistry. As human thinking is not comparable to computer systems employing factual knowledge and algorithms--people rarely make decisions through careful considerations of every possible event and its probability, risks or usefulness--research in science and teaching must include psychological aspects of the human decision making processes. Intuitive analogical and associative reasoning and the ability to categorize unexpected findings typically demonstrated by experienced chemists should be made accessible to young learners through heuristic concepts. The psychology of cognition defines heuristics as strategies that guide human problem-solving and deciding procedures, for example with patterns, analogies, or prototypes. Since research in the field of artificial intelligence and current studies in the psychology of cognition have provided evidence for the usefulness of heuristics in discovery, the status of heuristics has grown into something useful and teachable. In this tutorial review, we present a heuristic analysis of a familiar fundamental process in organic chemistry--the cyclic six-electron case, and we show that this approach leads to a more conceptual insight in understanding, as well as in teaching and learning.
A computing method for spatial accessibility based on grid partition
NASA Astrophysics Data System (ADS)
Ma, Linbing; Zhang, Xinchang
2007-06-01
An accessibility computing method and process based on grid partition is put forward in this paper. Two important factors affecting traffic, the density of the road network and the relative spatial resistance of different land uses, are integrated into the computation of the traffic cost of each grid cell. The A* algorithm is introduced to search for the optimum traffic-cost path over the grid; the detailed search process and the definition of the heuristic evaluation function are described in the paper. The method can therefore be implemented simply, and its data sources are easy to obtain. Moreover, by changing the heuristic search information, more reasonable computing results can be obtained. To confirm the research, a software package was developed in the C# language under the ArcEngine 9 environment. Applying the computing method, a case study on the accessibility of business districts in Guangzhou city was carried out.
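A compact A* sketch over a cost grid of the kind described, assuming each cell's traversal cost already encodes road-network density and land-use resistance, and using a Manhattan-distance heuristic scaled by the cheapest cell cost so that it remains admissible; the grid values are hypothetical.

import heapq

def astar(grid, start, goal):
    """A* over a 2-D list of per-cell traversal costs; returns the optimum total cost."""
    rows, cols = len(grid), len(grid[0])
    cmin = min(min(row) for row in grid)                     # keeps the heuristic admissible
    h = lambda p: cmin * (abs(p[0] - goal[0]) + abs(p[1] - goal[1]))
    open_set = [(h(start), 0.0, start)]
    best = {start: 0.0}
    while open_set:
        f, g, (r, c) = heapq.heappop(open_set)
        if (r, c) == goal:
            return g
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols:
                ng = g + grid[nr][nc]                        # accumulated traffic cost
                if ng < best.get((nr, nc), float("inf")):
                    best[(nr, nc)] = ng
                    heapq.heappush(open_set, (ng + h((nr, nc)), ng, (nr, nc)))
    return float("inf")

grid = [[1, 1, 5],
        [5, 1, 5],
        [5, 1, 1]]
print(astar(grid, (0, 0), (2, 2)))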
ERIC Educational Resources Information Center
Khader, Patrick H.; Pachur, Thorsten; Meier, Stefanie; Bien, Siegfried; Jost, Kerstin; Rosler, Frank
2011-01-01
Many of our daily decisions are memory based, that is, the attribute information about the decision alternatives has to be recalled. Behavioral studies suggest that for such decisions we often use simple strategies (heuristics) that rely on controlled and limited information search. It is assumed that these heuristics simplify decision-making by…
ERIC Educational Resources Information Center
Veermans, Koen; van Joolingen, Wouter; de Jong, Ton
2006-01-01
This article describes a study into the role of heuristic support in facilitating discovery learning through simulation-based learning. The study compares the use of two such learning environments in the physics domain of collisions. In one learning environment (implicit heuristics) heuristics are only used to provide the learner with guidance…
[Case finding in early prevention networks - a heuristic for ambulatory care settings].
Barth, Michael; Belzer, Florian
2016-06-01
One goal of early prevention is the support of families with small children up to three years of age who are exposed to psychosocial risks. The identification of these cases is often complex and not well directed, especially in the ambulatory care setting. The aim is the development of a feasible, empirically based strategy for case finding in ambulatory care. Based on the risk factors postpartum depression, lack of maternal responsiveness, parental stress with regulation disorders, and poverty, a lexicographic, non-compensatory heuristic model with simple decision rules is constructed and empirically tested. For this purpose, the original data set from an evaluation of the pediatric documentation form on psychosocial issues of families with small children in well-child visits is reanalyzed. The first diagnostic step in the non-compensatory, hierarchical classification process is the assessment of postpartum depression, followed by maternal responsiveness, parental stress and poverty. The classification model identifies 89.0% of the cases from the original study. Compared to the original study, the decision process becomes clearer and more concise. The evidence-based and data-driven model exemplifies a strategy for the assessment of psychosocial risk factors in ambulatory care settings. It is based on four evidence-based risk factors and offers a quick and reliable classification. A further advantage of this model is that once a risk factor has been identified the diagnostic procedure is stopped and the counselling process can commence. For further validation of the model, studies in well-suited early prevention networks are needed.
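A minimal sketch of the lexicographic, non-compensatory rule described above: cues are checked in a fixed order, and the first positive cue classifies the family as at risk and stops the assessment. The cue names and binary coding are assumptions for illustration, not the documentation form used in the study.

CUE_ORDER = ["postpartum_depression", "low_maternal_responsiveness",
             "parental_stress_with_regulation_disorder", "poverty"]

def classify(case):
    """Return (label, triggering_cue). Non-compensatory: later cues are never consulted
    once an earlier cue fires, so counselling can begin immediately after the hit."""
    for cue in CUE_ORDER:
        if case.get(cue, False):
            return "at risk", cue
    return "not at risk", None

print(classify({"postpartum_depression": False, "poverty": True}))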
Itô and Stratonovich integrals on compound renewal processes: the normal/Poisson case
NASA Astrophysics Data System (ADS)
Germano, Guido; Politi, Mauro; Scalas, Enrico; Schilling, René L.
2010-06-01
Continuous-time random walks, or compound renewal processes, are pure-jump stochastic processes with several applications in insurance, finance, economics and physics. Based on heuristic considerations, a definition is given for stochastic integrals driven by continuous-time random walks, which includes the Itô and Stratonovich cases. It is then shown how the definition can be used to compute these two stochastic integrals by means of Monte Carlo simulations. Our example is based on the normal compound Poisson process, which in the diffusive limit converges to the Wiener process.
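A small Monte Carlo sketch in the spirit of the paper, under one common convention for pure-jump paths: the Itô-type integral of f(X) evaluates f at the left limit before each jump, while the Stratonovich-type integral evaluates it at the midpoint of the pre- and post-jump values. The intensity, jump variance and integrand are hypothetical choices.

import numpy as np

rng = np.random.default_rng(0)

def compound_poisson_path(T, lam, sigma):
    """Jump times of a Poisson process with rate lam, jump sizes ~ N(0, sigma^2)."""
    n = rng.poisson(lam * T)
    times = np.sort(rng.uniform(0.0, T, size=n))
    jumps = rng.normal(0.0, sigma, size=n)
    return times, jumps

def jump_integrals(jumps, f):
    """Itô-style (left point) and Stratonovich-style (midpoint) sums over the jumps."""
    X_left = np.concatenate(([0.0], np.cumsum(jumps)[:-1])) if len(jumps) else np.array([])
    X_right = X_left + jumps
    ito = np.sum(f(X_left) * jumps)
    strat = np.sum(f(0.5 * (X_left + X_right)) * jumps)
    return ito, strat

times, jumps = compound_poisson_path(T=1.0, lam=50.0, sigma=0.2)
ito, strat = jump_integrals(jumps, f=lambda x: x)
X_T = jumps.sum()
# For f(x) = x the two conventions differ by half the sum of squared jumps:
print(ito, (X_T**2 - np.sum(jumps**2)) / 2)     # these two agree
print(strat, X_T**2 / 2)                        # and so do these two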
Assessing the validity of using serious game technology to analyze physician decision making.
Mohan, Deepika; Angus, Derek C; Ricketts, Daniel; Farris, Coreen; Fischhoff, Baruch; Rosengart, Matthew R; Yealy, Donald M; Barnato, Amber E
2014-01-01
Physician non-compliance with clinical practice guidelines remains a critical barrier to high quality care. Serious games (using gaming technology for serious purposes) have emerged as a method of studying physician decision making. However, little is known about their validity. We created a serious game and evaluated its construct validity. We used the decision context of trauma triage in the Emergency Department of non-trauma centers, given widely accepted guidelines that recommend the transfer of severely injured patients to trauma centers. We designed cases with the premise that the representativeness heuristic influences triage (i.e. physicians make transfer decisions based on archetypes of severely injured patients rather than guidelines). We randomized a convenience sample of emergency medicine physicians to a control or cognitive load arm, and compared performance (disposition decisions, number of orders entered, time spent per case). We hypothesized that cognitive load would increase the use of heuristics, increasing the transfer of representative cases and decreasing the transfer of non-representative cases. We recruited 209 physicians, of whom 168 (79%) began and 142 (68%) completed the task. Physicians transferred 31% of severely injured patients during the game, consistent with rates of transfer for severely injured patients in practice. They entered the same average number of orders in both arms (control (C): 10.9 [SD 4.8] vs. cognitive load (CL):10.7 [SD 5.6], p = 0.74), despite spending less time per case in the control arm (C: 9.7 [SD 7.1] vs. CL: 11.7 [SD 6.7] minutes, p<0.01). Physicians were equally likely to transfer representative cases in the two arms (C: 45% vs. CL: 34%, p = 0.20), but were more likely to transfer non-representative cases in the control arm (C: 38% vs. CL: 26%, p = 0.03). We found that physicians made decisions consistent with actual practice, that we could manipulate cognitive load, and that load increased the use of heuristics, as predicted by cognitive theory.
ERIC Educational Resources Information Center
Tandiseru, Selvi Rajuaty
2015-01-01
The problem in this research is the lack of creative thinking skills of students. One of the learning models that is expected to enhance student's creative thinking skill is the local culture-based mathematical heuristic-KR learning model (LC-BMHLM). Heuristic-KR is a learning model which was introduced by Krulik and Rudnick (1995) that is the…
Heuristic decision making in medicine
Marewski, Julian N.; Gigerenzer, Gerd
2012-01-01
Can less information be more helpful when it comes to making medical decisions? Contrary to the common intuition that more information is always better, the use of heuristics can help both physicians and patients to make sound decisions. Heuristics are simple decision strategies that ignore part of the available information, basing decisions on only a few relevant predictors. We discuss: (i) how doctors and patients use heuristics; and (ii) when heuristics outperform information-greedy methods, such as regressions in medical diagnosis. Furthermore, we outline those features of heuristics that make them useful in health care settings. These features include their surprising accuracy, transparency, and wide accessibility, as well as the low costs and little time required to employ them. We close by explaining one of the statistical reasons why heuristics are accurate, and by pointing to psychiatry as one area for future research on heuristics in health care. PMID:22577307
Heuristic decision making in medicine.
Marewski, Julian N; Gigerenzer, Gerd
2012-03-01
Can less information be more helpful when it comes to making medical decisions? Contrary to the common intuition that more information is always better, the use of heuristics can help both physicians and patients to make sound decisions. Heuristics are simple decision strategies that ignore part of the available information, basing decisions on only a few relevant predictors. We discuss: (i) how doctors and patients use heuristics; and (ii) when heuristics outperform information-greedy methods, such as regressions in medical diagnosis. Furthermore, we outline those features of heuristics that make them useful in health care settings. These features include their surprising accuracy, transparency, and wide accessibility, as well as the low costs and little time required to employ them. We close by explaining one of the statistical reasons why heuristics are accurate, and by pointing to psychiatry as one area for future research on heuristics in health care.
Proportional reasoning as a heuristic-based process: time constraint and dual task considerations.
Gillard, Ellen; Van Dooren, Wim; Schaeken, Walter; Verschaffel, Lieven
2009-01-01
The present study interprets the overuse of proportional solution methods from a dual process framework. Dual process theories claim that analytic operations involve time-consuming executive processing, whereas heuristic operations are fast and automatic. In two experiments to test whether proportional reasoning is heuristic-based, the participants solved "proportional" problems, for which proportional solution methods provide correct answers, and "nonproportional" problems known to elicit incorrect answers based on the assumption of proportionality. In Experiment 1, the available solution time was restricted. In Experiment 2, the executive resources were burdened with a secondary task. Both manipulations induced an increase in proportional answers and a decrease in correct answers to nonproportional problems. These results support the hypothesis that the choice for proportional methods is heuristic-based.
Coordinated distribution network control of tap changer transformers, capacitors and PV inverters
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ceylan, Oğuzhan; Liu, Guodong; Tomsovic, Kevin
A power distribution system operates most efficiently with voltage deviations along a feeder kept to a minimum and must ensure all voltages remain within specified limits. Recently with the increased integration of photovoltaics, the variable power output has led to increased voltage fluctuations and violation of operating limits. This study proposes an optimization model based on a recently developed heuristic search method, grey wolf optimization, to coordinate the various distribution controllers. Several different case studies on IEEE 33 and 69 bus test systems modified by including tap changing transformers, capacitors and photovoltaic solar panels are performed. Simulation results are compared to two other heuristic-based optimization methods: harmony search and differential evolution. Finally, the simulation results show the effectiveness of the method and indicate that using the reactive power outputs of PVs facilitates a better voltage magnitude profile.
Coordinated distribution network control of tap changer transformers, capacitors and PV inverters
Ceylan, Oğuzhan; Liu, Guodong; Tomsovic, Kevin
2017-06-08
A power distribution system operates most efficiently with voltage deviations along a feeder kept to a minimum and must ensure all voltages remain within specified limits. Recently with the increased integration of photovoltaics, the variable power output has led to increased voltage fluctuations and violation of operating limits. This study proposes an optimization model based on a recently developed heuristic search method, grey wolf optimization, to coordinate the various distribution controllers. Several different case studies on IEEE 33 and 69 bus test systems modified by including tap changing transformers, capacitors and photovoltaic solar panels are performed. Simulation results are compared to two other heuristic-based optimization methods: harmony search and differential evolution. Finally, the simulation results show the effectiveness of the method and indicate that using the reactive power outputs of PVs facilitates a better voltage magnitude profile.
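An illustrative grey wolf optimisation loop in its standard form (alpha, beta and delta leaders with a linearly decreasing coefficient), applied to a placeholder objective; the study's actual decision variables (tap positions, capacitor steps, PV reactive-power set points) and the underlying power-flow model are not reproduced.

import numpy as np

rng = np.random.default_rng(1)

def gwo(objective, bounds, n_wolves=20, n_iter=100):
    """Minimise `objective` over a box; returns (best_position, best_value)."""
    lo, hi = np.array(bounds[0], float), np.array(bounds[1], float)
    dim = lo.size
    X = rng.uniform(lo, hi, size=(n_wolves, dim))
    for t in range(n_iter):
        fitness = np.apply_along_axis(objective, 1, X)
        alpha, beta, delta = X[np.argsort(fitness)[:3]]
        a = 2.0 * (1.0 - t / n_iter)                       # decreases linearly from 2 to 0
        for i in range(n_wolves):
            candidates = []
            for leader in (alpha, beta, delta):
                r1, r2 = rng.random(dim), rng.random(dim)
                A, C = 2 * a * r1 - a, 2 * r2
                D = np.abs(C * leader - X[i])
                candidates.append(leader - A * D)          # pull towards each leader
            X[i] = np.clip(np.mean(candidates, axis=0), lo, hi)
    fitness = np.apply_along_axis(objective, 1, X)
    best = np.argmin(fitness)
    return X[best], fitness[best]

# Placeholder objective: squared deviation of "bus voltages" from 1.0 p.u. (hypothetical model).
deviation = lambda x: np.sum((x - 1.0) ** 2)
print(gwo(deviation, bounds=([0.9] * 5, [1.1] * 5)))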
Tag SNP selection via a genetic algorithm.
Mahdevar, Ghasem; Zahiri, Javad; Sadeghi, Mehdi; Nowzari-Dalini, Abbas; Ahrabian, Hayedeh
2010-10-01
Single Nucleotide Polymorphisms (SNPs) provide valuable information on human evolutionary history and may lead us to identify genetic variants responsible for human complex diseases. Unfortunately, molecular haplotyping methods are costly, laborious, and time consuming; therefore, algorithms for constructing full haplotype patterns from small available data through computational methods, the tag SNP selection problem, are convenient and attractive. This problem is proved to be NP-hard, so heuristic methods may be useful. In this paper we present a heuristic method based on a genetic algorithm to find a reasonable solution within acceptable time. The algorithm was tested on a variety of simulated and experimental data. In comparison with an exact algorithm based on a brute-force approach, results show that our method can obtain optimal solutions in almost all cases and runs much faster than the exact algorithm when the number of SNP sites is large. Our software is available upon request to the corresponding author.
A Hierarchy of Heuristic-Based Models of Crowd Dynamics
NASA Astrophysics Data System (ADS)
Degond, P.; Appert-Rolland, C.; Moussaïd, M.; Pettré, J.; Theraulaz, G.
2013-09-01
We derive a hierarchy of kinetic and macroscopic models from a noisy variant of the heuristic behavioral Individual-Based Model of Ngai et al. (Disaster Med. Public Health Prep. 3:191-195,
Discovery and problem solving: Triangulation as a weak heuristic
NASA Technical Reports Server (NTRS)
Rochowiak, Daniel
1987-01-01
Recently the artificial intelligence community has turned its attention to the process of discovery and found that the history of science is a fertile source for what Darden has called compiled hindsight. Such hindsight generates weak heuristics for discovery that do not guarantee that discoveries will be made but do have proven worth in leading to discoveries. Triangulation is one such heuristic that is grounded in historical hindsight. This heuristic is explored within the general framework of the BACON, GLAUBER, STAHL, DALTON, and SUTTON programs. In triangulation different bases of information are compared in an effort to identify gaps between the bases. Thus, assuming that the bases of information are relevantly related, the gaps that are identified should be good locations for discovery and robust analysis.
Of mental models, assumptions and heuristics: The case of acids and acid strength
NASA Astrophysics Data System (ADS)
McClary, Lakeisha Michelle
This study explored what cognitive resources (i.e., units of knowledge necessary to learn) first-semester organic chemistry students used to make decisions about acid strength and how those resources guided the prediction, explanation and justification of trends in acid strength. We were specifically interested in identifying and characterizing the mental models, assumptions and heuristics that students relied upon to make their decisions, in most cases under time constraints. The views about acids and acid strength were investigated for twenty undergraduate students. Data sources for this study included written responses and individual interviews. The data was analyzed using a qualitative methodology to answer five research questions. Data analysis regarding these research questions was based on existing theoretical frameworks: problem representation (Chi, Feltovich & Glaser, 1981), mental models (Johnson-Laird, 1983); intuitive assumptions (Talanquer, 2006), and heuristics (Evans, 2008). These frameworks were combined to develop the framework from which our data were analyzed. Results indicated that first-semester organic chemistry students' use of cognitive resources was complex and dependent on their understanding of the behavior of acids. Expressed mental models were generated using prior knowledge and assumptions about acids and acid strength; these models were then employed to make decisions. Explicit and implicit features of the compounds in each task mediated participants' attention, which triggered the use of a very limited number of heuristics, or shortcut reasoning strategies. Many students, however, were able to apply more effortful analytic reasoning, though correct trends were predicted infrequently. Most students continued to use their mental models, assumptions and heuristics to explain a given trend in acid strength and to justify their predicted trends, but the tasks influenced a few students to shift from one model to another model. An emergent finding from this project was that the problem representation greatly influenced students' ability to make correct predictions in acid strength.
Drumm, Daniel W; Greentree, Andrew D
2017-11-07
Finding a fluorescent target in a biological environment is a common and pressing microscopy problem. This task is formally analogous to the canonical search problem. In ideal (noise-free, truthful) search problems, the well-known binary search is optimal. The case of half-lies, where one of two responses to a search query may be deceptive, introduces a richer, Rényi-Ulam problem and is particularly relevant to practical microscopy. We analyse microscopy in the contexts of Rényi-Ulam games and half-lies, developing a new family of heuristics. We show the cost of insisting on verification by positive result in search algorithms; for the zero-half-lie case bisectioning with verification incurs a 50% penalty in the average number of queries required. The optimal partitioning of search spaces directly following verification in the presence of random half-lies is determined. Trisectioning with verification is shown to be the most efficient heuristic of the family in a majority of cases.
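A small simulation of the noise-free baseline discussed above, under one simple reading of verification by positive result: a "no" on the queried half is not accepted as evidence, so the complementary half must also be queried until a "yes" is obtained before the search descends. With that reading, the average query count comes out roughly 50% above plain bisection, in line with the penalty quoted; the reading itself is an assumption, not the paper's exact protocol.

import random

random.seed(0)

def bisection(target, n):
    """Plain noise-free bisection: count queries of the form 'is the target in this set?'."""
    lo, hi, q = 0, n, 0
    while hi - lo > 1:
        mid = (lo + hi) // 2
        q += 1                               # query: "is target in [lo, mid)?"
        lo, hi = (lo, mid) if target < mid else (mid, hi)
    return q

def bisection_with_verification(target, n):
    """Accept only positive answers as evidence: after a 'no' on one half,
    the other half must still be queried and answer 'yes' before descending."""
    lo, hi, q = 0, n, 0
    while hi - lo > 1:
        mid = (lo + hi) // 2
        q += 1
        if target < mid:                     # positive answer on the left half
            hi = mid
        else:
            q += 1                           # extra query to obtain a positive on the right half
            lo = mid
    return q

n, trials = 1024, 10000
avg = lambda f: sum(f(random.randrange(n), n) for _ in range(trials)) / trials
print(avg(bisection), avg(bisection_with_verification))   # roughly log2(n) vs 1.5*log2(n)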
Fazlollahtabar, Hamed
2010-12-01
Consumer expectations for automobile seat comfort continue to rise. Given this, it is evident that the current automobile seat comfort development process, which is only sporadically successful, needs to change. In this context, there has been growing recognition of the need to establish a theoretical and methodological basis for automobile seat comfort. On the other hand, seat producers need to know the comfort that customers require in order to produce seats that match their interests. Current research methodologies apply qualitative approaches because of the anthropometric specifications involved. The most significant weakness of these approaches is the inexactness of the extracted inferences. Despite the qualitative nature of consumers' preferences, there are methods for transforming the qualitative parameters into numerical values, which could help seat producers improve or enhance their products. This approach would also help the automobile manufacturer to source its seats from the best producer according to consumers' preferences. In this paper, a heuristic multi-criteria decision-making technique is applied to express consumers' preferences as numeric values. The technique is a combination of the Analytic Hierarchy Process (AHP), the entropy method, and the Technique for Order Preference by Similarity to an Ideal Solution (TOPSIS). A case study is conducted to illustrate the applicability and the effectiveness of the proposed heuristic approach. Copyright © 2010 Elsevier Ltd. All rights reserved.
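As a rough illustration of how such a combined ranking could work, the sketch below applies entropy-derived criterion weights followed by TOPSIS to a small decision matrix. The matrix, criteria, and weights are invented for illustration and are not taken from the paper's case study; the AHP stage is omitted for brevity.

```python
import numpy as np

def entropy_weights(X):
    """Derive criterion weights from the dispersion of each column (entropy method)."""
    P = X / X.sum(axis=0)                                   # column-normalized proportions
    m = X.shape[0]
    E = -(P * np.log(P + 1e-12)).sum(axis=0) / np.log(m)    # entropy per criterion
    d = 1.0 - E                                             # degree of diversification
    return d / d.sum()

def topsis(X, weights, benefit):
    """Rank alternatives by closeness to the ideal solution.
    benefit[j] is True for criteria to maximize, False for criteria to minimize."""
    R = X / np.sqrt((X ** 2).sum(axis=0))                   # vector normalization
    V = R * weights                                         # weighted normalized matrix
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti  = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.sqrt(((V - ideal) ** 2).sum(axis=1))
    d_neg = np.sqrt(((V - anti) ** 2).sum(axis=1))
    return d_neg / (d_pos + d_neg)                          # relative closeness, higher is better

# Hypothetical seat-comfort ratings: rows = seat designs, columns = criteria
# (lumbar support, cushioning, adjustability, cost). Cost is a "smaller is better" criterion.
X = np.array([[7.0, 6.5, 8.0, 320.0],
              [8.5, 7.0, 6.0, 410.0],
              [6.0, 8.0, 7.5, 290.0]])
w = entropy_weights(X)
scores = topsis(X, w, benefit=np.array([True, True, True, False]))
print("entropy weights:", np.round(w, 3))
print("closeness scores:", np.round(scores, 3))
```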
QoE collaborative evaluation method based on fuzzy clustering heuristic algorithm.
Bao, Ying; Lei, Weimin; Zhang, Wei; Zhan, Yuzhuo
2016-01-01
At present, realizing or improving the quality of experience (QoE) is a major goal for network media transmission services, and QoE evaluation is the basis for adjusting the transmission control mechanism. Therefore, a QoE collaborative evaluation method based on a fuzzy clustering heuristic algorithm is proposed in this paper, concentrating on service score calculation at the server side. The server side collects network transmission quality of service (QoS) parameters, node location data, and user expectation values from client feedback information. It then manages the historical data in a database through a "big data" processing mode and predicts user scores according to heuristic rules. On this basis, it completes fuzzy clustering analysis and generates a service QoE score and a management message, which are finally fed back to clients. The paper mainly discusses service evaluation generative rules, heuristic evaluation rules, and fuzzy clustering analysis methods, and presents service-based QoE evaluation processes. Simulation experiments have verified the effectiveness of the QoE collaborative evaluation method based on fuzzy clustering heuristic rules.
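The abstract does not give the clustering details, but a standard fuzzy c-means loop is one plausible core for such a fuzzy clustering step. The sketch below clusters hypothetical per-session QoS feature vectors (delay, loss, jitter) into soft QoE groups; the feature names, cluster count, and fuzzifier value are illustrative assumptions, not the paper's method.

```python
import numpy as np

def fuzzy_c_means(X, c=3, m=2.0, iters=100, tol=1e-5, seed=0):
    """Soft-cluster rows of X into c clusters; returns (memberships U, centers C).
    U[i, k] is the degree to which sample i belongs to cluster k."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    U = rng.random((n, c))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(iters):
        Um = U ** m
        C = (Um.T @ X) / Um.sum(axis=0)[:, None]             # weighted cluster centers
        D = np.linalg.norm(X[:, None, :] - C[None, :, :], axis=2) + 1e-12
        U_new = 1.0 / (D ** (2.0 / (m - 1.0)))
        U_new /= U_new.sum(axis=1, keepdims=True)             # re-normalize memberships
        if np.abs(U_new - U).max() < tol:
            U = U_new
            break
        U = U_new
    return U, C

# Hypothetical per-session QoS measurements: [delay ms, loss %, jitter ms]
sessions = np.array([[ 30, 0.1,  5], [ 40, 0.2,  7], [120, 1.5, 25],
                     [140, 2.0, 30], [300, 5.0, 80], [280, 4.0, 70]], dtype=float)
U, centers = fuzzy_c_means(sessions, c=3)
print("cluster centers:\n", np.round(centers, 1))
print("memberships:\n", np.round(U, 2))
```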
Analytic and heuristic processes in the detection and resolution of conflict.
Ferreira, Mário B; Mata, André; Donkin, Christopher; Sherman, Steven J; Ihmels, Max
2016-10-01
Previous research with the ratio-bias task found larger response latencies for conflict trials where the heuristic- and analytic-based responses are assumed to be in opposition (e.g., choosing between 1/10 and 9/100 ratios of success) when compared to no-conflict trials where both processes converge on the same response (e.g., choosing between 1/10 and 11/100). This pattern is consistent with parallel dual-process models, which assume that there is effective, rather than lax, monitoring of the output of heuristic processing. It is, however, unclear why conflict resolution sometimes fails. Ratio-biased choices may increase because of a decline in analytical reasoning (leaving heuristic-based responses unopposed) or because of a rise in heuristic processing (making it more difficult for analytic processes to override the heuristic preferences). Using the process-dissociation procedure, we found that instructions to respond logically and response speed affected analytic (controlled) processing (C), leaving heuristic processing (H) unchanged, whereas the intuitive preference for large numerators (as assessed by responses to equal-ratio trials) affected H but not C. These findings create new challenges to the debate between dual-process and single-process accounts, which are discussed.
Relevancy Ranking of Satellite Dataset Search Results
NASA Technical Reports Server (NTRS)
Lynnes, Christopher; Quinn, Patrick; Norton, James
2017-01-01
As the variety of Earth science datasets increases, science researchers find it more challenging to discover and select the datasets that best fit their needs. The most common way for search providers to address this problem is to rank the datasets returned for a query by their likely relevance to the user. Large web page search engines typically use text matching supplemented with reverse link counts, semantic annotations, and user intent modeling. However, this produces uneven results when applied to dataset metadata records simply externalized as web pages. Fortunately, data and search providers have decades of experience in serving data user communities, allowing them to form heuristics that leverage the structure in the metadata together with knowledge about the user community. Some of these heuristics include specific ways of matching the user input to the essential measurements in the dataset and determining overlaps of time range and spatial area. Heuristics based on the novelty of the datasets can prioritize later, better versions of data over similar predecessors. And knowledge of how different user types and communities use data can be brought to bear in cases where characteristics of the user (discipline, expertise) or their intent (applications, research) can be divined. The Earth Observing System Data and Information System has begun implementing some of these heuristics in the relevancy algorithm of its Common Metadata Repository search engine.
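A minimal sketch of how such metadata-aware heuristics might be combined into a single relevance score is shown below. The field names, weights, and scoring rules are invented for illustration; they are not the Common Metadata Repository's actual algorithm.

```python
from dataclasses import dataclass

@dataclass
class Dataset:
    title: str
    measurements: set        # e.g. {"sea surface temperature"}
    start_year: int
    end_year: int
    version: int
    superseded: bool

def temporal_overlap(ds, q_start, q_end):
    """Fraction of the queried time range covered by the dataset."""
    overlap = max(0, min(ds.end_year, q_end) - max(ds.start_year, q_start))
    span = max(1, q_end - q_start)
    return overlap / span

def relevance(ds, query_terms, q_start, q_end):
    """Combine keyword match, temporal overlap, and version/novelty heuristics.
    The weights (0.5 / 0.3 / 0.2) and the superseded penalty are illustrative choices."""
    term_hits = sum(t in m for t in query_terms for m in ds.measurements)
    keyword_score = min(1.0, term_hits / max(1, len(query_terms)))
    novelty_score = 0.0 if ds.superseded else min(1.0, ds.version / 5.0)
    return (0.5 * keyword_score
            + 0.3 * temporal_overlap(ds, q_start, q_end)
            + 0.2 * novelty_score)

datasets = [
    Dataset("SST v5", {"sea surface temperature"}, 2002, 2020, 5, False),
    Dataset("SST v4", {"sea surface temperature"}, 2002, 2018, 4, True),
    Dataset("Aerosol index", {"aerosol optical depth"}, 1995, 2021, 3, False),
]
query = {"surface", "temperature"}
ranked = sorted(datasets, key=lambda d: relevance(d, query, 2010, 2020), reverse=True)
print([d.title for d in ranked])
```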
ERIC Educational Resources Information Center
Thompson, Bruce
The relationship between analysis of variance (ANOVA) methods and their analogs (analysis of covariance and multiple analyses of variance and covariance--collectively referred to as OVA methods) and the more general analytic case is explored. A small heuristic data set is used, with a hypothetical sample of 20 subjects, randomly assigned to five…
ERIC Educational Resources Information Center
Burger-Veltmeijer, Agnes E. J.; Minnaert, Alexander E. M. G.; Van den Bosch, Els J.
2016-01-01
The Strengths and Weaknesses Heuristic (S&W Heuristic) was constructed in order to reduce biased assessments of students with (suspicion of) Intellectual Giftedness in co-occurrence with Autism Spectrum Disorder (IG + ASD) and to establish a well-founded interconnection between assessment data and intervention indications. The current study is…
Recursive heuristic classification
NASA Technical Reports Server (NTRS)
Wilkins, David C.
1994-01-01
The author will describe a new problem-solving approach called recursive heuristic classification, whereby a subproblem of heuristic classification is itself formulated and solved by heuristic classification. This allows the construction of more knowledge-intensive classification programs in a way that yields a clean organization. Further, standard knowledge acquisition and learning techniques for heuristic classification can be used to create, refine, and maintain the knowledge base associated with the recursively called classification expert system. The method of recursive heuristic classification was used in the Minerva blackboard shell for heuristic classification. Minerva recursively calls itself every problem-solving cycle to solve the important blackboard scheduler task, which involves assigning a desirability rating to alternative problem-solving actions. Knowing these ratings is critical to the use of an expert system as a component of a critiquing or apprenticeship tutoring system. One innovation of this research is a method called dynamic heuristic classification, which allows selection among dynamically generated classification categories instead of requiring them to be pre-enumerated.
Approximation algorithms for a genetic diagnostics problem.
Kosaraju, S R; Schäffer, A A; Biesecker, L G
1998-01-01
We define and study a combinatorial problem called WEIGHTED DIAGNOSTIC COVER (WDC) that models the use of a laboratory technique called genotyping in the diagnosis of an important class of chromosomal aberrations. An optimal solution to WDC would enable us to define a genetic assay that maximizes the diagnostic power for a specified cost of laboratory work. We develop approximation algorithms for WDC by making use of the well-known problem SET COVER for which the greedy heuristic has been extensively studied. We prove worst-case performance bounds on the greedy heuristic for WDC and for another heuristic we call directional greedy. We implemented both heuristics. We also implemented a local search heuristic that takes the solutions obtained by greedy and dir-greedy and applies swaps until they are locally optimal. We report their performance on a real data set that is representative of the options that a clinical geneticist faces for the real diagnostic problem. Many open problems related to WDC remain, both of theoretical interest and practical importance.
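For readers unfamiliar with the greedy heuristic mentioned here, the sketch below shows the standard cost-aware greedy rule for weighted set cover, of which WEIGHTED DIAGNOSTIC COVER is a variant. The probe names, regions, and costs are invented, and the paper's directional-greedy and local-search refinements are not shown.

```python
def greedy_weighted_cover(universe, sets, cost):
    """Repeatedly pick the set with the best (newly covered elements / cost) ratio
    until the universe is covered or no set adds coverage."""
    uncovered = set(universe)
    chosen = []
    while uncovered:
        best, best_ratio = None, 0.0
        for name, elems in sets.items():
            gain = len(uncovered & elems)
            if gain == 0:
                continue
            ratio = gain / cost[name]
            if ratio > best_ratio:
                best, best_ratio = name, ratio
        if best is None:              # remaining elements cannot be covered
            break
        chosen.append(best)
        uncovered -= sets[best]
    return chosen, uncovered

# Hypothetical genotyping probes covering chromosomal regions 1..6
regions = range(1, 7)
probes = {"A": {1, 2, 3}, "B": {3, 4}, "C": {4, 5, 6}, "D": {1, 5}}
lab_cost = {"A": 2.0, "B": 1.0, "C": 2.5, "D": 1.5}
picked, missed = greedy_weighted_cover(regions, probes, lab_cost)
print("probes chosen:", picked, "uncovered:", missed)
```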
Simon, Eric P
2015-01-01
The legal concepts of emotional and volitional impairment in SVP evaluations are vague and ill-defined. This article reviews the legal terms of emotional and volitional impairment as they have been contemplated in extant SVP statutes, SVP case law, logical constructions, and limited empirical studies. To bridge the gap between psychiatry and the law, a broad, theory-based heuristic framework is furnished for understanding emotional and volitional impairment at a deep psychological (and intra-psychic) level. Specifically discussed are the concepts of transference, repetition compulsion, fixation, cathexis, regression, identification with the aggressor, and the object-relations and self-psychology concepts related to a loss of possession of the self. Copyright © 2015 Elsevier Ltd. All rights reserved.
A Tabu-Search Heuristic for Deterministic Two-Mode Blockmodeling of Binary Network Matrices
ERIC Educational Resources Information Center
Brusco, Michael; Steinley, Douglas
2011-01-01
Two-mode binary data matrices arise in a variety of social network contexts, such as the attendance or non-attendance of individuals at events, the participation or lack of participation of groups in projects, and the votes of judges on cases. A popular method for analyzing such data is two-mode blockmodeling based on structural equivalence, where…
Ad-Hoc Queries over Document Collections - A Case Study
NASA Astrophysics Data System (ADS)
Löser, Alexander; Lutter, Steffen; Düssel, Patrick; Markl, Volker
We discuss the novel problem of supporting analytical business intelligence queries over web-based textual content, e.g., BI-style reports based on hundreds of thousands of documents from an ad-hoc web search result. Neither conventional search engines nor conventional Business Intelligence and ETL tools address this problem, which lies at the intersection of their capabilities. "Google Squared" and our system GOOLAP.info are examples of these kinds of systems. They execute information extraction methods over one or several document collections at query time and integrate extracted records into a common view or tabular structure. Frequent extraction and object-resolution failures cause incomplete records that cannot be joined into a record answering the query. Our focus is the identification of join-reordering heuristics that maximize the number of complete records answering a structured query. With respect to given costs for document extraction, we propose two novel join operations: the multi-way CJ operator joins records from multiple relationships extracted from a single document, and the two-way join operator DJ ensures data density by removing incomplete records from results. In a preliminary case study we observe that our join-reordering heuristics positively impact result size and record density and lower execution costs.
Usability-driven pruning of large ontologies: the case of SNOMED CT.
López-García, Pablo; Boeker, Martin; Illarramendi, Arantza; Schulz, Stefan
2012-06-01
To study ontology modularization techniques when applied to SNOMED CT in a scenario in which no previous corpus of information exists and to examine if frequency-based filtering using MEDLINE can reduce subset size without discarding relevant concepts. Subsets were first extracted using four graph-traversal heuristics and one logic-based technique, and were subsequently filtered with frequency information from MEDLINE. Twenty manually coded discharge summaries from cardiology patients were used as signatures and test sets. The coverage, size, and precision of extracted subsets were measured. Graph-traversal heuristics provided high coverage (71-96% of terms in the test sets of discharge summaries) at the expense of subset size (17-51% of the size of SNOMED CT). Pre-computed subsets and logic-based techniques extracted small subsets (1%), but coverage was limited (24-55%). Filtering reduced the size of large subsets to 10% while still providing 80% coverage. Extracting subsets to annotate discharge summaries is challenging when no previous corpus exists. Ontology modularization provides valuable techniques, but the resulting modules grow as signatures spread across subhierarchies, yielding a very low precision. Graph-traversal strategies and frequency data from an authoritative source can prune large biomedical ontologies and produce useful subsets that still exhibit acceptable coverage. However, a clinical corpus closer to the specific use case is preferred when available.
Heuristic Processes in Ratings of Leader Behavior: Assessing Item-Induced Availability Biases.
ERIC Educational Resources Information Center
Binning, John F.; Fernandez, Guadalupe
Since observers' memory-based ratings of organizational phenomena provide data in research and decision-making contexts, bias in observers' judgments must be examined. A study was conducted to explore the extent to which leader behavior ratings are more generally biased by the availability heuristic. The availability heuristic is operative when a…
The Generation and Resemblance Heuristics in Face Recognition: Cooperation and Competition
ERIC Educational Resources Information Center
Kleider, Heather M.; Goldinger, Stephen D.
2006-01-01
Like all probabilistic decisions, recognition memory judgments are based on inferences about the strength and quality of stimulus familiarity. In recent articles, B. W. A. Whittlesea and J. Leboe (2000; J. Leboe & B. W. A. Whittlesea, 2002) proposed that such memory decisions entail various heuristics, similar to well-known heuristics in overt…
Sunstein, Cass R
2005-08-01
With respect to questions of fact, people use heuristics--mental short-cuts, or rules of thumb, that generally work well, but that also lead to systematic errors. People use moral heuristics too--moral short-cuts, or rules of thumb, that lead to mistaken and even absurd moral judgments. These judgments are highly relevant not only to morality, but to law and politics as well. Examples are given from a number of domains, including risk regulation, punishment, reproduction and sexuality, and the act/omission distinction. In all of these contexts, rapid, intuitive judgments make a great deal of sense, but sometimes produce moral mistakes that are replicated in law and policy. One implication is that moral assessments ought not to be made by appealing to intuitions about exotic cases and problems; those intuitions are particularly unlikely to be reliable. Another implication is that some deeply held moral judgments are unsound if they are products of moral heuristics. The idea of error-prone heuristics is especially controversial in the moral domain, where agreement on the correct answer may be hard to elicit; but in many contexts, heuristics are at work and they do real damage. Moral framing effects, including those in the context of obligations to future generations, are also discussed.
Using decision tree models to depict primary care physicians CRC screening decision heuristics.
Wackerbarth, Sarah B; Tarasenko, Yelena N; Curtis, Laurel A; Joyce, Jennifer M; Haist, Steven A
2007-10-01
The purpose of this study was to identify decision heuristics utilized by primary care physicians in formulating colorectal cancer screening recommendations. Qualitative research using in-depth semi-structured interviews. We interviewed 66 primary care internists and family physicians evenly drawn from academic and community practices. A majority of physicians were male, and almost all were white, non-Hispanic. Three researchers independently reviewed each transcript to determine the physician's decision criteria and developed decision trees. Final trees were developed by consensus. The constant comparative methodology was used to define the categories. Physicians were found to use 1 of 4 heuristics ("age 50," "age 50, if family history, then earlier," "age 50, if family history, then screen at age 40," or "age 50, if family history, then adjust relative to reference case") for the timing recommendation and 5 heuristics ["fecal occult blood test" (FOBT), "colonoscopy," "if not colonoscopy, then...," "FOBT and another test," and "a choice between options"] for the type decision. No connection was found between timing and screening type heuristics. We found evidence of heuristic use. Further research is needed to determine the potential impact on quality of care.
Ryan, Jason C; Banerjee, Ashis Gopal; Cummings, Mary L; Roy, Nicholas
2014-06-01
Planning operations across a number of domains can be considered as resource allocation problems with timing constraints. An unexplored instance of such a problem domain is the aircraft carrier flight deck, where, in current operations, replanning is done without the aid of any computerized decision support. Rather, veteran operators employ a set of experience-based heuristics to quickly generate new operating schedules. These expert user heuristics are neither codified nor evaluated by the United States Navy; they have grown solely from the convergent experiences of supervisory staff. As unmanned aerial vehicles (UAVs) are introduced in the aircraft carrier domain, these heuristics may require alterations due to differing capabilities. The inclusion of UAVs also allows for new opportunities for on-line planning and control, providing an alternative to the current heuristic-based replanning methodology. To investigate these issues formally, we have developed a decision support system for flight deck operations that utilizes a conventional integer linear program-based planning algorithm. In this system, a human operator sets both the goals and constraints for the algorithm, which then returns a proposed schedule for operator approval. As a part of validating this system, the performance of this collaborative human-automation planner was compared with that of the expert user heuristics over a set of test scenarios. The resulting analysis shows that human heuristics often outperform the plans produced by an optimization algorithm, but are also often more conservative.
The perceived diversity heuristic: the case of pseudodiversity.
Ayal, Shahar; Zakay, Dan
2009-03-01
One of the normative ways to decrease the risk of a pool of uncertain prospects is to diversify its resources. Thus, decision makers are advised not to put all their eggs in one basket. The authors suggest that decision makers use a perceived diversity heuristic (PDH) to evaluate the risk of a pool by intuitively assessing the diversity of its sources. This heuristic yields biased judgments in cases of pseudodiversity, in which the perceived diversity of a pool is enhanced although this does not change the pool's normative values. The first 3 studies introduce 2 independent sources of pseudodiversity, distinctiveness and multiplicity, showing that these two sources can lead to overdiversification under conditions of gain. In another set of 3 studies, the authors examine the effect of framing on diversification level. The results support the PDH predictions, according to which diversity seeking is obtained under conditions of gain, whereas diversity aversion is obtained under conditions of loss.
New insights into diversification of hyper-heuristics.
Ren, Zhilei; Jiang, He; Xuan, Jifeng; Hu, Yan; Luo, Zhongxuan
2014-10-01
There has been a growing research trend of applying hyper-heuristics to problem solving, due to their ability to balance intensification and diversification with low level heuristics. Traditionally, the diversification mechanism is mostly realized by perturbing the incumbent solutions to escape from local optima. In this paper, we report our attempt toward providing a new diversification mechanism, which is based on the concept of instance perturbation. In contrast to existing approaches, the proposed mechanism achieves diversification by perturbing the instance under solving, rather than the solutions. To tackle the challenge of incorporating instance perturbation into hyper-heuristics, we also design a new hyper-heuristic framework HIP-HOP (a recursive acronym: HIP-HOP is an instance perturbation-based hyper-heuristic optimization procedure), which employs a grammar guided high level strategy to manipulate the low level heuristics. With the expressive power of the grammar, constraints such as the feasibility of the output solution can be easily satisfied. Numerical results and statistical tests over both the Ising spin glass problem and the p-median problem instances show that HIP-HOP is able to achieve promising performance. Furthermore, runtime distribution analysis reveals that, although relatively slow at the beginning, HIP-HOP is able to achieve competitive solutions once given sufficient time.
Qin, Junping; Sun, Shiwen; Deng, Qingxu; Liu, Limin; Tian, Yonghong
2017-06-02
Object tracking and detection is one of the most significant research areas for wireless sensor networks. Existing indoor trajectory tracking schemes in wireless sensor networks are based on continuous localization and moving object data mining. Indoor trajectory tracking based on the received signal strength indicator (RSSI) has received increased attention because it has low cost and requires no special infrastructure. However, RSSI tracking introduces uncertainty because of the inaccuracies of measurement instruments and the irregularities (instability, multipath, diffraction) of wireless signal transmission in indoor environments. Heuristic information includes some key factors for trajectory tracking procedures. This paper proposes a novel trajectory tracking scheme based on Delaunay triangulation and heuristic information (TTDH). In this scheme, the entire field is divided into a series of triangular regions. The common side of adjacent triangular regions is regarded as a regional boundary. Our scheme detects heuristic information related to a moving object's trajectory, including boundaries and triangular regions. Then, the trajectory is formed by means of a dynamic time-warping position-fingerprint-matching algorithm with heuristic information constraints. Field experiments show that the average error distance of our scheme is less than 1.5 m and that error does not accumulate among the regions.
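The dynamic time-warping step mentioned above can be sketched with the textbook DTW recurrence. The code below aligns a measured RSSI sequence against stored fingerprint sequences and returns the warping cost; the fingerprint values are invented, and the heuristic boundary/region constraints of TTDH are not modeled.

```python
import math

def dtw_cost(seq_a, seq_b):
    """Classic dynamic time warping between two 1-D sequences.
    Returns the minimal cumulative absolute-difference cost."""
    n, m = len(seq_a), len(seq_b)
    D = [[math.inf] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            step = abs(seq_a[i - 1] - seq_b[j - 1])
            D[i][j] = step + min(D[i - 1][j],      # insertion
                                 D[i][j - 1],      # deletion
                                 D[i - 1][j - 1])  # match
    return D[n][m]

# Hypothetical RSSI (dBm) traces: one live measurement, two stored fingerprints
measured = [-48, -50, -55, -61, -60, -58]
fingerprints = {"corridor": [-47, -52, -56, -60, -59],
                "room_12":  [-70, -68, -66, -65, -64]}
best = min(fingerprints, key=lambda k: dtw_cost(measured, fingerprints[k]))
print("best matching fingerprint:", best)
```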
Utility functions and resource management in an oversubscribed heterogeneous computing environment
Khemka, Bhavesh; Friese, Ryan; Briceno, Luis Diego; ...
2014-09-26
We model an oversubscribed heterogeneous computing system where tasks arrive dynamically and a scheduler maps the tasks to machines for execution. The environment and workloads are based on those being investigated by the Extreme Scale Systems Center at Oak Ridge National Laboratory. Utility functions that are designed based on specifications from the system owner and users are used to create a metric for the performance of resource allocation heuristics. Each task has a time-varying utility (importance) that the enterprise will earn based on when the task successfully completes execution. We design multiple heuristics, which include a technique to drop low utility-earning tasks, to maximize the total utility that can be earned by completing tasks. The heuristics are evaluated using simulation experiments with two levels of oversubscription. The results show the benefit of having fast heuristics that account for the importance of a task and the heterogeneity of the environment when making allocation decisions in an oversubscribed environment. Furthermore, the ability to drop low utility-earning tasks allows the heuristics to tolerate the high oversubscription as well as earn significant utility.
Optimizing Controlling-Value-Based Power Gating with Gate Count and Switching Activity
NASA Astrophysics Data System (ADS)
Chen, Lei; Kimura, Shinji
In this paper, a new heuristic algorithm is proposed to optimize power domain clustering in controlling-value-based (CV-based) power gating technology. In this algorithm, both the switching activity of sleep signals (p) and the overall number of sleep gates (gate count, N) are considered, and the sum of the products of p and N is optimized. The algorithm effectively realizes the total power reduction obtainable from CV-based power gating. Even when the maximum depth is kept the same, the proposed algorithm can still achieve power reduction approximately 10% greater than that of prior algorithms. Furthermore, a detailed comparison between the proposed heuristic algorithm and other possible heuristic algorithms is also presented. HSPICE simulation results show that over 26% of total power reduction can be obtained by using the new heuristic algorithm. In addition, the effect of dynamic power reduction through the CV-based power gating method and the delay overhead caused by the switching of sleep transistors are also shown in this paper.
Mitigation of epidemics in contact networks through optimal contact adaptation *
Youssef, Mina; Scoglio, Caterina
2013-01-01
This paper presents an optimal control problem formulation to minimize the total number of infection cases during the spread of susceptible-infected-recovered (SIR) epidemics in contact networks. In the new approach, contact weights are reduced among nodes and a global minimum contact level is preserved in the network. In addition, the infection cost and the cost associated with the contact reduction are linearly combined in a single objective function. Hence, the optimal control formulation addresses the tradeoff between minimization of total infection cases and minimization of contact weight reduction. Using Pontryagin's theorem, the obtained solution is a unique candidate representing the dynamical weighted contact network. To find the near-optimal solution in a decentralized way, we propose two heuristics based on a Bang-Bang control function and on a piecewise nonlinear control function, respectively. We perform extensive simulations to evaluate the two heuristics on different networks. Our results show that the piecewise nonlinear control function outperforms the well-known Bang-Bang control function in minimizing both the total number of infection cases and the reduction of contact weights. Finally, our results show the infection level at which the mitigation strategies are effectively applied to the contact weights. PMID:23906209
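The abstract describes the objective only verbally; a hedged sketch of what such a linearly combined objective typically looks like is given below. The symbols and the exact form are assumptions rather than the paper's notation: w_{ij}(t) is the controlled contact weight between nodes i and j, bounded below by a global minimum contact level.

```latex
% Illustrative objective for contact-adaptation control in an SIR network
% (notation assumed, not taken from the paper):
\min_{w_{ij}(t)} \; J =
  \int_{0}^{T} \Big[
    \underbrace{\alpha \sum_{i} I_i(t)}_{\text{infection cost}}
    \; + \;
    \underbrace{\beta \sum_{(i,j)} \big( w_{ij}^{0} - w_{ij}(t) \big)}_{\text{contact-reduction cost}}
  \Big] \, dt
\quad \text{s.t.} \quad w_{\min} \le w_{ij}(t) \le w_{ij}^{0},
\;\; \text{SIR dynamics on the weighted contact network.}
```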
Mitigation of epidemics in contact networks through optimal contact adaptation.
Youssef, Mina; Scoglio, Caterina
2013-08-01
This paper presents an optimal control problem formulation to minimize the total number of infection cases during the spread of susceptible-infected-recovered (SIR) epidemics in contact networks. In the new approach, contact weights are reduced among nodes and a global minimum contact level is preserved in the network. In addition, the infection cost and the cost associated with the contact reduction are linearly combined in a single objective function. Hence, the optimal control formulation addresses the tradeoff between minimization of total infection cases and minimization of contact weight reduction. Using Pontryagin's theorem, the obtained solution is a unique candidate representing the dynamical weighted contact network. To find the near-optimal solution in a decentralized way, we propose two heuristics based on a Bang-Bang control function and on a piecewise nonlinear control function, respectively. We perform extensive simulations to evaluate the two heuristics on different networks. Our results show that the piecewise nonlinear control function outperforms the well-known Bang-Bang control function in minimizing both the total number of infection cases and the reduction of contact weights. Finally, our results show the infection level at which the mitigation strategies are effectively applied to the contact weights.
Requirements analysis, domain knowledge, and design
NASA Technical Reports Server (NTRS)
Potts, Colin
1988-01-01
Two improvements to current requirements analysis practices are suggested: domain modeling, and the systematic application of analysis heuristics. Domain modeling is the representation of relevant application knowledge prior to requirements specification. Artificial intelligence techniques may eventually be applicable for domain modeling. In the short term, however, restricted domain modeling techniques, such as that in JSD, will still be of practical benefit. Analysis heuristics are standard patterns of reasoning about the requirements. They usually generate questions of clarification or issues relating to completeness. Analysis heuristics can be represented and therefore systematically applied in an issue-based framework. This is illustrated by an issue-based analysis of JSD's domain modeling and functional specification heuristics. They are discussed in the context of the preliminary design of simple embedded systems.
Evaluating a Web-Based Interface for Internet Telemedicine
NASA Technical Reports Server (NTRS)
Lathan, Corinna E.; Newman, Dava J.; Sebrechts, Marc M.; Doarn, Charles R.
1997-01-01
The objective is to introduce the usability engineering methodology of heuristic evaluation to the design and development of a web-based telemedicine system. Using a set of usability criteria, or heuristics, one evaluator examined the Spacebridge to Russia web site for usability problems. Thirty-four usability problems were found in this preliminary study, and all were assigned a severity rating. The value of heuristic analysis in the iterative design of a system is shown: the problems can be fixed before deployment of a system, and they are of a different nature than those found by actual users of the system. It was therefore determined that there is potential value in pairing heuristic evaluation with user testing as a strategy for designing for optimal system performance.
[Methodology of psychiatric case histories].
Scherbaum, N; Mirzaian, E
1999-05-01
This paper deals with the methodology of psychiatric case histories. Three types of case histories are differentiated. The didactic case history teaches about the typical aspects of a psychiatric disorder or treatment by using an individual patient as an example. In the heuristic case history the individual case gives rise to challenging established concepts or to generate new hypotheses. Such hypotheses drawn from inductive reasoning have then to be tested using representative samples. The focus of hermeneutic case histories is the significance of pathological behaviour and experience in the context of the biography of an individual patient. So-called psychopathographies of important historical figures can also be differentiated according to these types. Based on these methodological considerations, quality standards for the named types of case histories are stated.
SPARSE: quadratic time simultaneous alignment and folding of RNAs without sequence-based heuristics.
Will, Sebastian; Otto, Christina; Miladi, Milad; Möhl, Mathias; Backofen, Rolf
2015-08-01
RNA-Seq experiments have revealed a multitude of novel ncRNAs. The gold standard for their analysis based on simultaneous alignment and folding suffers from extreme time complexity of O(n^6). Subsequently, numerous faster 'Sankoff-style' approaches have been suggested. Commonly, the performance of such methods relies on sequence-based heuristics that restrict the search space to optimal or near-optimal sequence alignments; however, the accuracy of sequence-based methods breaks down for RNAs with sequence identities below 60%. Alignment approaches like LocARNA that do not require sequence-based heuristics have been limited to high complexity (quartic time, O(n^4)). Breaking this barrier, we introduce the novel Sankoff-style algorithm 'sparsified prediction and alignment of RNAs based on their structure ensembles (SPARSE)', which runs in quadratic time without sequence-based heuristics. To achieve this low complexity, on par with sequence alignment algorithms, SPARSE features strong sparsification based on structural properties of the RNA ensembles. Following PMcomp, SPARSE gains further speed-up from lightweight energy computation. Although all existing lightweight Sankoff-style methods restrict Sankoff's original model by disallowing loop deletions and insertions, SPARSE transfers the Sankoff algorithm to the lightweight energy model completely for the first time. Compared with LocARNA, SPARSE achieves similar alignment and better folding quality in significantly less time (speedup: 3.7). At similar run-time, it aligns low sequence identity instances substantially more accurately than RAF, which uses sequence-based heuristics. © The Author 2015. Published by Oxford University Press.
ERIC Educational Resources Information Center
Hattori, Masasi; Oaksford, Mike
2007-01-01
In this article, 41 models of covariation detection from 2 x 2 contingency tables were evaluated against past data in the literature and against data from new experiments. A new model was also included based on a limiting case of the normative phi-coefficient under an extreme rarity assumption, which has been shown to be an important factor in…
Automated discovery of local search heuristics for satisfiability testing.
Fukunaga, Alex S
2008-01-01
The development of successful metaheuristic algorithms such as local search for a difficult problem such as satisfiability testing (SAT) is a challenging task. We investigate an evolutionary approach to automating the discovery of new local search heuristics for SAT. We show that several well-known SAT local search algorithms such as Walksat and Novelty are composite heuristics that are derived from novel combinations of a set of building blocks. Based on this observation, we developed CLASS, a genetic programming system that uses a simple composition operator to automatically discover SAT local search heuristics. New heuristics discovered by CLASS are shown to be competitive with the best Walksat variants, including Novelty+. Evolutionary algorithms have previously been applied to directly evolve a solution for a particular SAT instance. We show that the heuristics discovered by CLASS are also competitive with these previous, direct evolutionary approaches for SAT. We also analyze the local search behavior of the learned heuristics using the depth, mobility, and coverage metrics proposed by Schuurmans and Southey.
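To make the "building blocks" idea concrete, the sketch below implements a bare-bones Walksat-style local search for CNF formulas (a random-walk move with a noise parameter plus a greedy move). It is a generic illustration of the kind of heuristic CLASS composes, not the CLASS system or its evolved variants.

```python
import random

def walksat(clauses, n_vars, p_noise=0.5, max_flips=10_000, seed=0):
    """Minimal Walksat-style solver. Clauses are lists of nonzero ints:
    literal v means variable v is True, -v means it is False."""
    rng = random.Random(seed)
    assign = {v: rng.random() < 0.5 for v in range(1, n_vars + 1)}
    sat = lambda lit: assign[abs(lit)] == (lit > 0)

    for _ in range(max_flips):
        unsat = [c for c in clauses if not any(sat(l) for l in c)]
        if not unsat:
            return assign                          # satisfying assignment found
        clause = rng.choice(unsat)
        if rng.random() < p_noise:
            var = abs(rng.choice(clause))          # random-walk move
        else:
            # greedy move: flip the variable that leaves the fewest clauses unsatisfied
            def unsat_after_flip(v):
                assign[v] = not assign[v]
                count = sum(not any(sat(l) for l in c) for c in clauses)
                assign[v] = not assign[v]
                return count
            var = min((abs(l) for l in clause), key=unsat_after_flip)
        assign[var] = not assign[var]
    return None                                    # gave up

# Tiny satisfiable example: (x1 or x2) and (not x1 or x3) and (not x2 or not x3)
formula = [[1, 2], [-1, 3], [-2, -3]]
print(walksat(formula, n_vars=3))
```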
Assessing the Validity of Using Serious Game Technology to Analyze Physician Decision Making
Mohan, Deepika; Angus, Derek C.; Ricketts, Daniel; Farris, Coreen; Fischhoff, Baruch; Rosengart, Matthew R.; Yealy, Donald M.; Barnato, Amber E.
2014-01-01
Background Physician non-compliance with clinical practice guidelines remains a critical barrier to high quality care. Serious games (using gaming technology for serious purposes) have emerged as a method of studying physician decision making. However, little is known about their validity. Methods We created a serious game and evaluated its construct validity. We used the decision context of trauma triage in the Emergency Department of non-trauma centers, given widely accepted guidelines that recommend the transfer of severely injured patients to trauma centers. We designed cases with the premise that the representativeness heuristic influences triage (i.e. physicians make transfer decisions based on archetypes of severely injured patients rather than guidelines). We randomized a convenience sample of emergency medicine physicians to a control or cognitive load arm, and compared performance (disposition decisions, number of orders entered, time spent per case). We hypothesized that cognitive load would increase the use of heuristics, increasing the transfer of representative cases and decreasing the transfer of non-representative cases. Findings We recruited 209 physicians, of whom 168 (79%) began and 142 (68%) completed the task. Physicians transferred 31% of severely injured patients during the game, consistent with rates of transfer for severely injured patients in practice. They entered the same average number of orders in both arms (control (C): 10.9 [SD 4.8] vs. cognitive load (CL):10.7 [SD 5.6], p = 0.74), despite spending less time per case in the control arm (C: 9.7 [SD 7.1] vs. CL: 11.7 [SD 6.7] minutes, p<0.01). Physicians were equally likely to transfer representative cases in the two arms (C: 45% vs. CL: 34%, p = 0.20), but were more likely to transfer non-representative cases in the control arm (C: 38% vs. CL: 26%, p = 0.03). Conclusions We found that physicians made decisions consistent with actual practice, that we could manipulate cognitive load, and that load increased the use of heuristics, as predicted by cognitive theory. PMID:25153149
The Historical Ideal-Type as a Heuristic Device for Academic Storytelling by Sport Scholars
ERIC Educational Resources Information Center
Tutka, Patrick; Seifried, Chad
2015-01-01
The goal of this research endeavor is to take up previous calls by sport scholars to expand into alternative research approaches (e.g., history, case study, law reviews, philosophy, etc.) and to show how storytelling can be an effective tool through the use of a heuristic device. The present analysis attempts to focus on the usage of the…
Inhibitory mechanism of the matching heuristic in syllogistic reasoning.
Tse, Ping Ping; Moreno Ríos, Sergio; García-Madruga, Juan Antonio; Bajo Molina, María Teresa
2014-11-01
A number of heuristic-based hypotheses have been proposed to explain how people solve syllogisms with automatic processes. In particular, the matching heuristic employs the congruency of the quantifiers in a syllogism—by matching the quantifier of the conclusion with those of the two premises. When the heuristic leads to an invalid conclusion, successful solving of these conflict problems requires the inhibition of automatic heuristic processing. Accordingly, if the automatic processing were based on processing the set of quantifiers, no semantic contents would be inhibited. The mental model theory, however, suggests that people reason using mental models, which always involves semantic processing. Therefore, whatever inhibition occurs in the processing implies the inhibition of the semantic contents. We manipulated the validity of the syllogism and the congruency of the quantifier of its conclusion with those of the two premises according to the matching heuristic. A subsequent lexical decision task (LDT) with related words in the conclusion was used to test any inhibition of the semantic contents after each syllogistic evaluation trial. In the LDT, the facilitation effect of semantic priming diminished after correctly solved conflict syllogisms (match-invalid or mismatch-valid), but was intact after no-conflict syllogisms. The results suggest the involvement of an inhibitory mechanism of semantic contents in syllogistic reasoning when there is a conflict between the output of the syntactic heuristic and actual validity. Our results do not support a uniquely syntactic process of syllogistic reasoning but fit with the predictions based on mental model theory. Copyright © 2014 Elsevier B.V. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Imam, Neena; Koenig, Gregory A; Machovec, Dylan
2016-01-01
Abstract: The worth of completing parallel tasks is modeled using utility functions, which monotonically decrease with time and represent the importance and urgency of a task. These functions define the utility earned by a task at the time of its completion. The performance of such a system is measured as the total utility earned by all completed tasks over some interval of time (e.g., 24 hours). To maximize system performance when scheduling dynamically arriving parallel tasks onto a high performance computing (HPC) system that is oversubscribed and energy-constrained, we have designed, analyzed, and compared different heuristic techniques. Four utility-aware heuristics (i.e., Max Utility, Max Utility-per-Time, Max Utility-per-Resource, and Max Utility-per-Energy), three FCFS-based heuristics (Conservative Backfilling, EASY Backfilling, and FCFS with Multiple Queues), and a Random heuristic were examined in this study. A technique that is often used with the FCFS-based heuristics is the concept of a permanent reservation. We compare the performance of permanent reservations with temporary place-holders to demonstrate the advantages that place-holders can provide. We also present a novel energy filtering technique that constrains the maximum energy-per-resource used by each task. We conducted a simulation study to evaluate the performance of these heuristics and techniques in an energy-constrained oversubscribed HPC environment. With place-holders, energy filtering, and dropping tasks with low potential utility, our utility-aware heuristics are able to significantly outperform the existing FCFS-based techniques.
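A rough sketch of the utility-aware selection rules named above (Max Utility, Max Utility-per-Time, and so on) is given below. The task fields, the exponential-decay utility shape, and the drop threshold are illustrative assumptions rather than the study's actual models.

```python
import math
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    max_utility: float    # utility if completed immediately
    decay: float          # per-hour exponential decay of utility
    runtime_h: float      # estimated runtime in hours
    nodes: int            # resources requested
    energy_kwh: float     # estimated energy use

def utility_at_completion(task, wait_h):
    """Monotonically decreasing utility earned if the task finishes after wait_h + runtime_h hours."""
    return task.max_utility * math.exp(-task.decay * (wait_h + task.runtime_h))

def pick_next(queue, wait_h, rule="utility_per_resource", drop_below=1.0):
    """Drop low-potential-utility tasks, then select by the chosen utility-aware rule."""
    keyed = {
        "utility":              lambda t: utility_at_completion(t, wait_h),
        "utility_per_time":     lambda t: utility_at_completion(t, wait_h) / t.runtime_h,
        "utility_per_resource": lambda t: utility_at_completion(t, wait_h) / t.nodes,
        "utility_per_energy":   lambda t: utility_at_completion(t, wait_h) / t.energy_kwh,
    }[rule]
    viable = [t for t in queue if utility_at_completion(t, wait_h) >= drop_below]
    return max(viable, key=keyed) if viable else None

queue = [Task("A", 100, 0.10, 2.0, 64, 50.0),
         Task("B",  40, 0.02, 1.0,  8,  5.0),
         Task("C",  10, 0.50, 0.5,  4,  1.0)]
print(pick_next(queue, wait_h=1.0, rule="utility_per_energy").name)
```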
Runway Scheduling Using Generalized Dynamic Programming
NASA Technical Reports Server (NTRS)
Montoya, Justin; Wood, Zachary; Rathinam, Sivakumar
2011-01-01
A generalized dynamic programming method for finding a set of Pareto optimal solutions for a runway scheduling problem is introduced. The algorithm generates a set of runway flight sequences that are optimal for both runway throughput and delay. Realistic time-based operational constraints are considered, including miles-in-trail separation, runway crossings, and wake vortex separation. The authors also model divergent runway takeoff operations to allow for reduced wake vortex separation. A modeled Dallas/Fort Worth International Airport and three baseline heuristics are used to illustrate preliminary benefits of using the generalized dynamic programming method. Simulated traffic levels ranged from 10 aircraft to 30 aircraft, with each test case spanning 15 minutes. The optimal solution shows a 40-70 percent decrease in the expected delay per aircraft over the baseline schedulers. Computational results suggest that the algorithm is promising for real-time application, with an average computation time of 4.5 seconds. For even faster computation times, two heuristics are developed. As compared to the optimal, the heuristics are within 5% of the expected delay per aircraft and 1% of the expected number of runway operations per hour, and can be 100x faster.
Wijeyekoon, Skanda; Kharicha, Kalpa; Iliffe, Steve
2015-09-01
To evaluate heuristics (rules of thumb) for the recognition of undetected vision loss in older patients in primary care. Vision loss is associated with ageing, and its prevalence is increasing. Visual impairment has a broad impact on health, functioning, and well-being. Unrecognised vision loss remains common, and screening interventions have yet to reduce its prevalence. An alternative approach is to enhance practitioners' skills in recognising undetected vision loss by having a more detailed picture of those who are likely not to act on vision changes, report symptoms, or have eye tests. This paper describes a qualitative technology development study to evaluate heuristics for the recognition of undetected vision loss in older patients in primary care. Drawing on a previous modelling study, two heuristics in the form of mnemonics were developed to aid pattern recognition and allow general practitioners to identify potential cases of unreported vision loss. These heuristics were then analysed with experts. Findings It was concluded that their implementation in modern general practice was unsuitable and that an alternative solution should be sought.
Does the inherence heuristic take us to psychological essentialism?
Marmodoro, Anna; Murphy, Robin A; Baker, A G
2014-10-01
We argue that the claim that essence-based causal explanations emerge, hydra-like, from an inherence heuristic is incomplete. No plausible mechanism for the transition from concrete properties, or cues, to essences is provided. Moreover, the fundamental shotgun and storytelling mechanisms of the inherence heuristic are not clearly enough specified to distinguish them, developmentally, from associative or causal networks.
The Memory State Heuristic: A Formal Model Based on Repeated Recognition Judgments
ERIC Educational Resources Information Center
Castela, Marta; Erdfelder, Edgar
2017-01-01
The recognition heuristic (RH) theory predicts that, in comparative judgment tasks, if one object is recognized and the other is not, the recognized one is chosen. The memory-state heuristic (MSH) extends the RH by assuming that choices are not affected by recognition judgments per se, but by the memory states underlying these judgments (i.e.,…
Dynamic minimum set problem for reserve design: Heuristic solutions for large problems
Sabbadin, Régis; Johnson, Fred A.; Stith, Bradley
2018-01-01
Conversion of wild habitats to human-dominated landscapes is a major cause of biodiversity loss. One approach to mitigating the impact of habitat loss consists of designating reserves where habitat is preserved and managed. Determining the most valuable areas of a landscape to preserve is called the reserve design problem. There exist several possible formulations of the reserve design problem, depending on the objectives and the constraints. In this article, we considered the dynamic problem of designing a reserve that contains a desired area of several key habitats. The dynamic case implies that the reserve cannot be designed in one time step, due to budget constraints, and that habitats can be lost before they are reserved, due for example to climate change or human development. We proposed two heuristic strategies that can be used to select sites to reserve each year for large reserve design problems. The first heuristic is a combination of the Marxan and site-ordering algorithms, and the second heuristic is an augmented version of the common naive myopic heuristic. We evaluated the strategies on several simulated examples and showed that the augmented greedy heuristic is particularly interesting when some of the habitats to protect are particularly threatened and/or the compactness of the network is accounted for. PMID:29543830
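A naive myopic (greedy) baseline of the kind the augmented heuristic builds on can be sketched as follows: each year, spend the budget on the unreserved sites that contribute most toward the remaining habitat targets, weighting threatened habitats more heavily. The site data, costs, threat weights, and scoring rule are all invented for illustration and are not the paper's algorithms.

```python
def greedy_reserve_year(sites, targets, reserved, budget, threat_weight):
    """Pick sites for one budget year. Each site maps habitat -> area;
    targets maps habitat -> area still needed; threat_weight up-weights
    habitats judged likely to be lost before they can be reserved."""
    remaining = dict(targets)
    chosen = []
    candidates = {k: v for k, v in sites.items() if k not in reserved}
    while budget > 0 and candidates:
        def score(name):
            areas = candidates[name]["habitats"]
            gain = sum(min(areas.get(h, 0.0), need) * threat_weight.get(h, 1.0)
                       for h, need in remaining.items())
            return gain / candidates[name]["cost"]      # target progress per unit cost
        affordable = [k for k, v in candidates.items() if v["cost"] <= budget]
        if not affordable:
            break
        best = max(affordable, key=score)
        if score(best) <= 0:                             # nothing useful left to buy
            break
        budget -= candidates[best]["cost"]
        for h, a in candidates[best]["habitats"].items():
            remaining[h] = max(0.0, remaining.get(h, 0.0) - a)
        chosen.append(best)
        del candidates[best]
    return chosen, remaining

sites = {"s1": {"cost": 3.0, "habitats": {"wetland": 5.0}},
         "s2": {"cost": 2.0, "habitats": {"forest": 4.0, "wetland": 1.0}},
         "s3": {"cost": 4.0, "habitats": {"forest": 6.0}}}
picked, left = greedy_reserve_year(sites, {"wetland": 5.0, "forest": 6.0},
                                   reserved=set(), budget=5.0,
                                   threat_weight={"wetland": 2.0})
print("reserved this year:", picked, "still needed:", left)
```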
A general heuristic for genome rearrangement problems.
Dias, Ulisses; Galvão, Gustavo Rodrigues; Lintzmayer, Carla Négri; Dias, Zanoni
2014-06-01
In this paper, we present a general heuristic for several problems in the genome rearrangement field. Our heuristic does not solve any problem directly, it is rather used to improve the solutions provided by any non-optimal algorithm that solve them. Therefore, we have implemented several algorithms described in the literature and several algorithms developed by ourselves. As a whole, we implemented 23 algorithms for 9 well known problems in the genome rearrangement field. A total of 13 algorithms were implemented for problems that use the notions of prefix and suffix operations. In addition, we worked on 5 algorithms for the classic problem of sorting by transposition and we conclude the experiments by presenting results for 3 approximation algorithms for the sorting by reversals and transpositions problem and 2 approximation algorithms for the sorting by reversals problem. Another algorithm with better approximation ratio can be found for the last genome rearrangement problem, but it is purely theoretical with no practical implementation. The algorithms we implemented in addition to our heuristic lead to the best practical results in each case. In particular, we were able to improve results on the sorting by transpositions problem, which is a very special case because many efforts have been made to generate algorithms with good results in practice and some of these algorithms provide results that equal the optimum solutions in many cases. Our source codes and benchmarks are freely available upon request from the authors so that it will be easier to compare new approaches against our results.
The limited use of the fluency heuristic: Converging evidence across different procedures.
Pohl, Rüdiger F; Erdfelder, Edgar; Michalkiewicz, Martha; Castela, Marta; Hilbig, Benjamin E
2016-10-01
In paired comparisons based on which of two objects has the larger criterion value, decision makers could use the subjectively experienced difference in retrieval fluency of the objects as a cue. According to the fluency heuristic (FH) theory, decision makers use fluency, as indexed by recognition speed, as the only cue for pairs of recognized objects, and infer that the object retrieved more speedily has the larger criterion value (ignoring all other cues and information). Model-based analyses, however, have previously revealed that only a small portion of such inferences are indeed based on fluency alone. In the majority of cases, other information enters the decision process. However, due to the specific experimental procedures, the estimates of FH use are potentially biased: some procedures may have led to overestimated and others to underestimated, or even to actually reduced, FH use. In the present article, we discuss and test the impacts of such procedural variations by reanalyzing 21 data sets. The results show noteworthy consistency across the procedural variations, revealing low FH use. We discuss potential explanations and implications of this finding.
Model for the design of distributed data bases
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ram, S.
This research focuses on developing a model to solve the File Allocation Problem (FAP). The model integrates two major design issues, namely Concurrency Control and Data Distribution. The central node locking mechanism is incorporated in developing a nonlinear integer programming model. Two solution algorithms are proposed, one of which was implemented in FORTRAN V. The allocation of data bases and programs is examined using this heuristic. Several decision rules were also formulated based on the results of the heuristic. A second, more comprehensive heuristic was proposed, based on the knapsack problem. The development and implementation of this algorithm has been left as a topic for future research.
1977-09-01
Interpolation algorithm allows this to be done when the transition boundaries are defined close together and parallel to one another. In this case the ... in the variable kernel estimates. In [2] a goodness-of-fit criterion for a set of samples ... an estimate f(x) is based on the variables. One question of great interest to us in this study ... For the unimodal case the absolute minimum occurs at k = 100 ... At this point we have ...
Using tree diversity to compare phylogenetic heuristics.
Sul, Seung-Jin; Matthews, Suzanne; Williams, Tiffani L
2009-04-29
Evolutionary trees are family trees that represent the relationships between a group of organisms. Phylogenetic heuristics are used to search stochastically for the best-scoring trees in tree space. Given that better tree scores are believed to be better approximations of the true phylogeny, traditional evaluation techniques have used tree scores to determine the heuristics that find the best scores in the fastest time. We develop new techniques to evaluate phylogenetic heuristics based on both tree scores and topologies to compare Pauprat and Rec-I-DCM3, two popular Maximum Parsimony search algorithms. Our results show that although Pauprat and Rec-I-DCM3 find trees with the same best scores, topologically these trees are quite different. Furthermore, the Rec-I-DCM3 trees cluster distinctly from the Pauprat trees. In addition to our heatmap visualizations, which use parsimony scores and the Robinson-Foulds distance to compare the best-scoring trees found by the two heuristics, we also develop entropy-based methods to show the diversity of the trees found. Overall, Pauprat identifies more diverse trees than Rec-I-DCM3. Our work shows that there is value in comparing heuristics beyond the parsimony scores that they find. Pauprat is a slower heuristic than Rec-I-DCM3. However, our work shows that there is tremendous value in using Pauprat to reconstruct trees, especially since it finds identically scoring but topologically distinct trees. Hence, instead of discounting Pauprat, effort should go into improving its implementation. Ultimately, improved performance measures lead to better phylogenetic heuristics and will result in better approximations of the true evolutionary history of the organisms of interest.
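The Robinson-Foulds comparison mentioned above can be sketched for rooted trees by collecting each tree's clades (sets of leaf labels below each internal node) and counting how many clades appear in one tree but not the other. The nested-tuple tree encoding below is an illustrative simplification; proper RF distance on unrooted trees uses bipartitions instead of clades.

```python
def clades(tree):
    """Return the set of non-trivial clades of a rooted tree given as nested tuples,
    e.g. (("A", "B"), ("C", ("D", "E"))). Leaves are strings."""
    found = set()
    def leaves(node):
        if isinstance(node, str):
            return frozenset([node])
        s = frozenset().union(*(leaves(child) for child in node))
        if len(s) > 1:
            found.add(s)
        return s
    leaves(tree)
    return found

def rf_distance(t1, t2):
    """Symmetric-difference (Robinson-Foulds-style) distance between two rooted trees."""
    c1, c2 = clades(t1), clades(t2)
    return len(c1 ^ c2)

tree_a = (("A", "B"), ("C", ("D", "E")))
tree_b = ((("A", "B"), "C"), ("D", "E"))
print("clade differences:", rf_distance(tree_a, tree_b))
```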
NASA Astrophysics Data System (ADS)
Pasam, Gopi Krishna; Manohar, T. Gowri
2016-09-01
Determination of available transfer capability (ATC) requires the use of experience, intuition, and exact judgment in order to meet several significant aspects of the deregulated environment. Based on these points, this paper proposes two heuristic approaches to compute ATC. The first proposed heuristic algorithm integrates five methods, namely continuation repeated power flow, repeated optimal power flow, radial basis function neural network, back propagation neural network, and adaptive neuro-fuzzy inference system, to obtain ATC. The second proposed heuristic model is used to obtain multiple ATC values. Out of these, a specific ATC value is selected based on a number of social, economic, and deregulated environmental constraints and on specific applications such as optimization, on-line monitoring, and ATC forecasting; this is known as multi-objective decision based optimal ATC. The validity of the results obtained through these proposed methods is scrupulously verified on various buses of the IEEE 24-bus reliable test system. The results presented and the conclusions derived in this paper are very useful for planning, operating, and maintaining reliable power in any power system and for its monitoring in an on-line environment of a deregulated power system. In this way, the proposed heuristic methods contribute the best possible approach to assessing multi-objective ATC using integrated methods.
It looks easy! Heuristics for combinatorial optimization problems.
Chronicle, Edward P; MacGregor, James N; Ormerod, Thomas C; Burr, Alistair
2006-04-01
Human performance on instances of computationally intractable optimization problems, such as the travelling salesperson problem (TSP), can be excellent. We have proposed a boundary-following heuristic to account for this finding. We report three experiments with TSPs where the capacity to employ this heuristic was varied. In Experiment 1, participants free to use the heuristic produced solutions significantly closer to optimal than did those prevented from doing so. Experiments 2 and 3 together replicated this finding in larger problems and demonstrated that a potential confound had no effect. In all three experiments, performance was closely matched by a boundary-following model. The results implicate global rather than purely local processes. Humans may have access to simple, perceptually based, heuristics that are suited to some combinatorial optimization tasks.
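One common way to operationalize the boundary idea in a constructive algorithm is a convex-hull insertion heuristic: start the tour from the boundary (convex hull) of the point set and insert the interior points where they increase tour length the least. The sketch below is that generic heuristic, offered only as an illustration of boundary-first tour construction, not as the authors' boundary-following model.

```python
import math, random

def convex_hull(points):
    """Andrew's monotone chain; returns hull vertices in counter-clockwise order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    cross = lambda o, a, b: (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def hull_insertion_tour(points):
    """Start from the convex hull, then insert each interior point at the
    position that increases the tour length the least (cheapest insertion)."""
    d = lambda a, b: math.dist(a, b)
    tour = convex_hull(points)
    interior = [p for p in points if p not in tour]
    for p in interior:
        best_i, best_inc = 0, float("inf")
        for i in range(len(tour)):
            a, b = tour[i], tour[(i + 1) % len(tour)]
            inc = d(a, p) + d(p, b) - d(a, b)
            if inc < best_inc:
                best_i, best_inc = i + 1, inc
        tour.insert(best_i, p)
    return tour

random.seed(1)
cities = [(random.random(), random.random()) for _ in range(15)]
tour = hull_insertion_tour(cities)
length = sum(math.dist(tour[i], tour[(i + 1) % len(tour)]) for i in range(len(tour)))
print(f"tour of {len(tour)} cities, length {length:.3f}")
```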
NASA Astrophysics Data System (ADS)
Caukin, Nancy S.
The purpose of this mixed-methods study was to determine if employing the writing-to-learn strategy known as a "Science Writing Heuristic" would positively affect students' science achievement, science self-efficacy, and scientific epistemological view. The publications Science for All Americans, Blueprints for Reform: Project 2061 (AAAS, 1990; 1998), and National Science Education Standards (NRC, 1996) strongly encourage science education that is student-centered, inquiry-based, active rather than passive, increases students' science literacy, and moves students towards a constructivist view of science. The capacity to learn, reason, problem solve, think critically, and construct new knowledge can potentially be experienced through writing (Irmscher, 1979; Klein, 1999; Applebee, 1984). The Science Writing Heuristic (SWH) is a tool for designing science experiences that move away from "cookbook" experiences and allow students to design experiences based on their own ideas and questions. This non-traditional classroom strategy focuses on claims that students make based on evidence, compares those claims with their peers' claims, and compares those claims with those of the established science community. Students engage in reflection and meaning making based on their experiences, and demonstrate those understandings in multiple ways (Hand, 2004; Keys et al., 1999; Poock, n.d.). This study involved secondary honors chemistry students in a rural preK-12 school in Middle Tennessee. There were n = 23 students in the treatment group and n = 8 in the control group. Both groups participated in a five-week study of gases. The treatment group received the instructional strategy known as the Science Writing Heuristic, and the control group received traditional teacher-centered science instruction. The quantitative results showed that females in the treatment group outscored their male counterparts by 11% on the science achievement portion of the study, and the males in the control group had a more constructivist scientific epistemological view after the study than the males in the treatment group. Two representative students, one male and one female, were chosen to participate in a case study for the qualitative portion of the study. Results of the case study showed that these students constructed meaning and enhanced their understanding of how gases behave, had a neutral (male) or positive (female) perception of how employing the Science Writing Heuristic helped them to learn, had a favorable experience that positively influenced their self-confidence in science, and increased their scientific literacy as they engaged in science as scientists do.
Sabar, Nasser R; Ayob, Masri; Kendall, Graham; Qu, Rong
2015-02-01
Hyper-heuristics are search methodologies that aim to provide high-quality solutions across a wide variety of problem domains, rather than developing tailor-made methodologies for each problem instance/domain. A traditional hyper-heuristic framework has two levels, namely, the high-level strategy (heuristic selection mechanism and acceptance criterion) and the low-level heuristics (a set of problem-specific heuristics). Due to the different landscape structures of different problem instances, the high-level strategy plays an important role in the design of a hyper-heuristic framework. In this paper, we propose a new high-level strategy for a hyper-heuristic framework. The proposed high-level strategy utilizes a dynamic multiarmed bandit-extreme value-based reward as an online heuristic selection mechanism to select the appropriate heuristic to be applied at each iteration. In addition, we propose a gene expression programming framework to automatically generate the acceptance criterion for each problem instance, instead of using human-designed criteria. Two well-known, and very different, combinatorial optimization problems, one static (exam timetabling) and one dynamic (dynamic vehicle routing), are used to demonstrate the generality of the proposed framework. Compared with state-of-the-art hyper-heuristics and other bespoke methods, empirical results demonstrate that the proposed framework is able to generalize well across both domains. We obtain competitive, if not better, results when compared to the best known results obtained from other methods that have been presented in the scientific literature. We also compare our approach against the recently released hyper-heuristic competition test suite. We again demonstrate the generality of our approach when we compare against other methods that have utilized the same six benchmark datasets from this test suite.
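As a rough illustration of the selection layer only, the sketch below uses a plain UCB1 bandit to pick among three toy low-level heuristics with an "accept if not worse" criterion. The paper's dynamic multi-armed bandit with extreme-value reward and its gene-expression-programming acceptance criteria are more elaborate; the toy problem and heuristics here are assumptions for illustration.

```python
import math
import random

def ucb_select(counts, rewards, t):
    """UCB1: pick the heuristic with the best average reward plus an
    exploration bonus; untried heuristics are chosen first."""
    for i, n in enumerate(counts):
        if n == 0:
            return i
    return max(range(len(counts)),
               key=lambda i: rewards[i] / counts[i] + math.sqrt(2 * math.log(t) / counts[i]))

# toy problem: minimize f(x) = sum of squares; low-level heuristics are simple moves
def f(x):
    return sum(v * v for v in x)

heuristics = [
    lambda x: [v + random.uniform(-1, 1) for v in x],   # random perturbation
    lambda x: [v * 0.9 for v in x],                     # shrink towards origin
    lambda x: x[1:] + x[:1],                            # rotate coordinates
]

random.seed(1)
x = [random.uniform(-10, 10) for _ in range(5)]
counts = [0] * len(heuristics)
rewards = [0.0] * len(heuristics)

for t in range(1, 501):
    h = ucb_select(counts, rewards, t)
    candidate = heuristics[h](x)
    improvement = max(0.0, f(x) - f(candidate))   # reward = amount of improvement
    counts[h] += 1
    rewards[h] += improvement
    if f(candidate) <= f(x):                      # simple "accept if not worse" criterion
        x = candidate

print("best value:", f(x), "pulls per heuristic:", counts)
```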
Usability-driven pruning of large ontologies: the case of SNOMED CT
Boeker, Martin; Illarramendi, Arantza; Schulz, Stefan
2012-01-01
Objectives: To study ontology modularization techniques when applied to SNOMED CT in a scenario in which no previous corpus of information exists and to examine if frequency-based filtering using MEDLINE can reduce subset size without discarding relevant concepts. Materials and Methods: Subsets were first extracted using four graph-traversal heuristics and one logic-based technique, and were subsequently filtered with frequency information from MEDLINE. Twenty manually coded discharge summaries from cardiology patients were used as signatures and test sets. The coverage, size, and precision of extracted subsets were measured. Results: Graph-traversal heuristics provided high coverage (71–96% of terms in the test sets of discharge summaries) at the expense of subset size (17–51% of the size of SNOMED CT). Pre-computed subsets and logic-based techniques extracted small subsets (1%), but coverage was limited (24–55%). Filtering reduced the size of large subsets to 10% while still providing 80% coverage. Discussion: Extracting subsets to annotate discharge summaries is challenging when no previous corpus exists. Ontology modularization provides valuable techniques, but the resulting modules grow as signatures spread across subhierarchies, yielding a very low precision. Conclusion: Graph-traversal strategies and frequency data from an authoritative source can prune large biomedical ontologies and produce useful subsets that still exhibit acceptable coverage. However, a clinical corpus closer to the specific use case is preferred when available. PMID:22268217
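The sketch below illustrates the general shape of one graph-traversal heuristic followed by frequency filtering, on a tiny invented is-a graph. It is not SNOMED CT, MEDLINE data, or the specific heuristics evaluated in the paper; every concept and frequency is an assumption for illustration.

```python
# Upward (ancestor) traversal from a signature, then frequency-based pruning.
from collections import deque

is_a = {  # child -> parents (hypothetical toy ontology, not SNOMED CT)
    "myocardial infarction": ["heart disease"],
    "heart disease": ["cardiovascular disease"],
    "cardiovascular disease": ["disease"],
    "aspirin": ["drug"],
    "drug": ["substance"],
}

corpus_frequency = {  # hypothetical document frequencies, not MEDLINE counts
    "myocardial infarction": 120, "heart disease": 300,
    "cardiovascular disease": 80, "disease": 5000,
    "aspirin": 150, "drug": 2000, "substance": 10,
}

def ancestor_module(signature):
    """Graph traversal: signature concepts plus all their ancestors."""
    module, queue = set(signature), deque(signature)
    while queue:
        concept = queue.popleft()
        for parent in is_a.get(concept, []):
            if parent not in module:
                module.add(parent)
                queue.append(parent)
    return module

def frequency_filter(module, signature, min_freq):
    """Prune rarely used concepts, but never drop the signature itself."""
    return {c for c in module if c in signature or corpus_frequency.get(c, 0) >= min_freq}

signature = {"myocardial infarction", "aspirin"}
module = ancestor_module(signature)
print("module:", sorted(module))
print("filtered:", sorted(frequency_filter(module, signature, min_freq=100)))
```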
MacGillivray, Brian H
2017-08-01
In many environmental and public health domains, heuristic methods of risk and decision analysis must be relied upon, either because problem structures are ambiguous, reliable data is lacking, or decisions are urgent. This introduces an additional source of uncertainty beyond model and measurement error - uncertainty stemming from relying on inexact inference rules. Here we identify and analyse heuristics used to prioritise risk objects, to discriminate between signal and noise, to weight evidence, to construct models, to extrapolate beyond datasets, and to make policy. Some of these heuristics are based on causal generalisations, yet can misfire when these relationships are presumed rather than tested (e.g. surrogates in clinical trials). Others are conventions designed to confer stability to decision analysis, yet which may introduce serious error when applied ritualistically (e.g. significance testing). Some heuristics can be traced back to formal justifications, but only subject to strong assumptions that are often violated in practical applications. Heuristic decision rules (e.g. feasibility rules) in principle act as surrogates for utility maximisation or distributional concerns, yet in practice may neglect costs and benefits, be based on arbitrary thresholds, and be prone to gaming. We highlight the problem of rule-entrenchment, where analytical choices that are in principle contestable are arbitrarily fixed in practice, masking uncertainty and potentially introducing bias. Strategies for making risk and decision analysis more rigorous include: formalising the assumptions and scope conditions under which heuristics should be applied; testing rather than presuming their underlying empirical or theoretical justifications; using sensitivity analysis, simulations, multiple bias analysis, and deductive systems of inference (e.g. directed acyclic graphs) to characterise rule uncertainty and refine heuristics; adopting "recovery schemes" to correct for known biases; and basing decision rules on clearly articulated values and evidence, rather than convention. Copyright © 2017. Published by Elsevier Ltd.
BiCluE - Exact and heuristic algorithms for weighted bi-cluster editing of biomedical data
2013-01-01
Background: The explosion of biological data has dramatically reformed today's biology research. The biggest challenge to biologists and bioinformaticians is the integration and analysis of large quantities of data to provide meaningful insights. One major problem is the combined analysis of data of different types. Bi-cluster editing, a special case of clustering that partitions two different types of data simultaneously, can be used in several biomedical scenarios. However, the underlying algorithmic problem is NP-hard. Results: Here we contribute BiCluE, a software package designed to solve the weighted bi-cluster editing problem. It implements (1) an exact algorithm based on fixed-parameter tractability and (2) a polynomial-time greedy heuristic based on solving the hardest part, edge deletions, first. We evaluated its performance on artificial graphs. Afterwards we exemplarily applied our implementation to real-world biomedical data, GWAS data in this case. BiCluE generally works on any kind of data that can be modeled as (weighted or unweighted) bipartite graphs. Conclusions: To our knowledge, this is the first software package solving the weighted bi-cluster editing problem. BiCluE as well as the supplementary results are available online at http://biclue.mpi-inf.mpg.de. PMID:24565035
Clark, David Glenn
2012-01-01
Background: Despite general agreement that aphasic individuals exhibit difficulty understanding complex sentences, the nature of sentence complexity itself is unresolved. In addition, aphasic individuals appear to make use of heuristic strategies for understanding sentences. This research is a comparison of predictions derived from two approaches to the quantification of sentence complexity, one based on the hierarchical structure of sentences, and the other based on dependency locality theory (DLT). Complexity metrics derived from these theories are evaluated under various assumptions of heuristic use. Method: A set of complexity metrics was derived from each general theory of sentence complexity and paired with assumptions of heuristic use. Probability spaces were generated that summarized the possible patterns of performance across 16 different sentence structures. The maximum likelihood of comprehension scores of 42 aphasic individuals was then computed for each probability space and the expected scores from the best-fitting points in the space were recorded for comparison to the actual scores. Predictions were then compared using measures of fit quality derived from linear mixed effects models. Results: All three of the metrics that provide the most consistently accurate predictions of patient scores rely on storage costs based on the DLT. Patients appear to employ an Agent–Theme heuristic, but vary in their tendency to accept heuristically generated interpretations. Furthermore, the ability to apply the heuristic may be degraded in proportion to aphasia severity. Conclusion: DLT-derived storage costs provide the best prediction of sentence comprehension patterns in aphasia. Because these costs are estimated by counting incomplete syntactic dependencies at each point in a sentence, this finding suggests that aphasia is associated with reduced availability of cognitive resources for maintaining these dependencies. PMID:22590462
Clark, David Glenn
2012-01-01
Despite general agreement that aphasic individuals exhibit difficulty understanding complex sentences, the nature of sentence complexity itself is unresolved. In addition, aphasic individuals appear to make use of heuristic strategies for understanding sentences. This research is a comparison of predictions derived from two approaches to the quantification of sentence complexity, one based on the hierarchical structure of sentences, and the other based on dependency locality theory (DLT). Complexity metrics derived from these theories are evaluated under various assumptions of heuristic use. A set of complexity metrics was derived from each general theory of sentence complexity and paired with assumptions of heuristic use. Probability spaces were generated that summarized the possible patterns of performance across 16 different sentence structures. The maximum likelihood of comprehension scores of 42 aphasic individuals was then computed for each probability space and the expected scores from the best-fitting points in the space were recorded for comparison to the actual scores. Predictions were then compared using measures of fit quality derived from linear mixed effects models. All three of the metrics that provide the most consistently accurate predictions of patient scores rely on storage costs based on the DLT. Patients appear to employ an Agent-Theme heuristic, but vary in their tendency to accept heuristically generated interpretations. Furthermore, the ability to apply the heuristic may be degraded in proportion to aphasia severity. DLT-derived storage costs provide the best prediction of sentence comprehension patterns in aphasia. Because these costs are estimated by counting incomplete syntactic dependencies at each point in a sentence, this finding suggests that aphasia is associated with reduced availability of cognitive resources for maintaining these dependencies.
Single-Case Experimental Designs to Evaluate Novel Technology-Based Health Interventions
Cassidy, Rachel N; Raiff, Bethany R
2013-01-01
Technology-based interventions to promote health are expanding rapidly. Assessing the preliminary efficacy of these interventions can be achieved by employing single-case experiments (sometimes referred to as n-of-1 studies). Although single-case experiments are often misunderstood, they offer excellent solutions to address the challenges associated with testing new technology-based interventions. This paper provides an introduction to single-case techniques and highlights advances in developing and evaluating single-case experiments, which help ensure that treatment outcomes are reliable, replicable, and generalizable. These advances include quality control standards, heuristics to guide visual analysis of time-series data, effect size calculations, and statistical analyses. They also include experimental designs to isolate the active elements in a treatment package and to assess the mechanisms of behavior change. The paper concludes with a discussion of issues related to the generality of findings derived from single-case research and how generality can be established through replication and through analysis of behavioral mechanisms. PMID:23399668
A hop count based heuristic routing protocol for mobile delay tolerant networks.
You, Lei; Li, Jianbo; Wei, Changjiang; Dai, Chenqu; Xu, Jixing; Hu, Lejuan
2014-01-01
Routing in delay tolerant networks (DTNs) is a challenge since it must handle network partitioning, long delays, and dynamic topology. Meanwhile, routing protocols of traditional mobile ad hoc networks (MANETs) cannot work well because their assumption that most network connections are available does not hold. In this paper, we propose a hop count based heuristic routing protocol that utilizes the information carried by the peripatetic packets in the network. A heuristic function is defined to help in making the routing decision. We formally define a custom operation for square matrices so as to transform the heuristic value calculation into matrix manipulation. Finally, the performance of our proposed algorithm is evaluated by simulation; the results show the advantage of such a self-adaptive routing protocol in the diverse circumstances of DTNs.
A Hop Count Based Heuristic Routing Protocol for Mobile Delay Tolerant Networks
Wei, Changjiang; Dai, Chenqu; Xu, Jixing; Hu, Lejuan
2014-01-01
Routing in delay tolerant networks (DTNs) is a challenge since it must handle network partitioning, long delays, and dynamic topology. Meanwhile, routing protocols of traditional mobile ad hoc networks (MANETs) cannot work well because their assumption that most network connections are available does not hold. In this paper, we propose a hop count based heuristic routing protocol that utilizes the information carried by the peripatetic packets in the network. A heuristic function is defined to help in making the routing decision. We formally define a custom operation for square matrices so as to transform the heuristic value calculation into matrix manipulation. Finally, the performance of our proposed algorithm is evaluated by simulation; the results show the advantage of such a self-adaptive routing protocol in the diverse circumstances of DTNs. PMID:25110736
Impact of heuristics in clustering large biological networks.
Shafin, Md Kishwar; Kabir, Kazi Lutful; Ridwan, Iffatur; Anannya, Tasmiah Tamzid; Karim, Rashid Saadman; Hoque, Mohammad Mozammel; Rahman, M Sohel
2015-12-01
Traditional clustering algorithms often exhibit poor performance for large networks. In contrast, greedy algorithms are found to be relatively efficient while uncovering functional modules from large biological networks. The quality of the clusters produced by these greedy techniques largely depends on the underlying heuristics employed. Different heuristics based on different attributes and properties perform differently in terms of the quality of the clusters produced. This motivates us to design new heuristics for clustering large networks. In this paper, we propose two new heuristics and analyze their performance after incorporating them, in three different combinations, into a recently celebrated greedy clustering algorithm named SPICi. We have extensively analyzed the effectiveness of these new variants. The results are found to be promising. Copyright © 2015 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Liu, Weibo; Jin, Yan; Price, Mark
2016-10-01
A new heuristic based on the Nawaz-Enscore-Ham algorithm is proposed in this article for solving a permutation flow-shop scheduling problem. A new priority rule is proposed by accounting for the average, mean absolute deviation, skewness and kurtosis, in order to fully describe the distribution style of processing times. A new tie-breaking rule is also introduced for achieving effective job insertion with the objective of minimizing both makespan and machine idle time. Statistical tests illustrate better solution quality of the proposed algorithm compared to existing benchmark heuristics.
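A hedged sketch of the overall NEH scheme described above: jobs are ordered by a priority score built from the average, mean absolute deviation, skewness and kurtosis of their processing times, then inserted one by one at the makespan-minimizing position. The weights in the priority score and the small instance are illustrative assumptions; the paper's exact rule and tie-breaking are not reproduced.

```python
import statistics

def makespan(sequence, p):
    """Permutation flow-shop makespan for processing-time matrix p[job][machine]."""
    m = len(p[0])
    completion = [0.0] * m
    for job in sequence:
        for k in range(m):
            start = max(completion[k], completion[k - 1] if k > 0 else 0.0)
            completion[k] = start + p[job][k]
    return completion[-1]

def priority(times, w=(1.0, 1.0, 0.5, 0.25)):
    """Score a job by average, mean absolute deviation, skewness and kurtosis of
    its processing times; the weights are illustrative, not the paper's."""
    mean = statistics.mean(times)
    mad = statistics.mean(abs(t - mean) for t in times)
    sd = statistics.pstdev(times) or 1.0
    skew = statistics.mean(((t - mean) / sd) ** 3 for t in times)
    kurt = statistics.mean(((t - mean) / sd) ** 4 for t in times)
    return w[0] * mean + w[1] * mad + w[2] * skew + w[3] * kurt

def neh(p):
    """NEH: order jobs by the priority rule, then insert each job in the position
    that currently yields the smallest makespan."""
    order = sorted(range(len(p)), key=lambda j: priority(p[j]), reverse=True)
    sequence = []
    for job in order:
        best = min((makespan(sequence[:i] + [job] + sequence[i:], p), i)
                   for i in range(len(sequence) + 1))
        sequence.insert(best[1], job)
    return sequence

p = [[5, 9, 8], [9, 3, 10], [9, 4, 5], [4, 8, 8]]  # 4 jobs x 3 machines (invented)
seq = neh(p)
print("sequence:", seq, "makespan:", makespan(seq, p))
```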
NASA Astrophysics Data System (ADS)
Mahnam, Mehdi; Gendreau, Michel; Lahrichi, Nadia; Rousseau, Louis-Martin
2017-07-01
In this paper, we propose a novel heuristic algorithm for the volumetric-modulated arc therapy treatment planning problem, optimizing the trade-off between delivery time and treatment quality. We present a new mixed integer programming model in which the multi-leaf collimator leaf positions, gantry speed, and dose rate are determined simultaneously. Our heuristic is based on column generation; the aperture configuration is modeled in the columns and the dose distribution and time restriction in the rows. To reduce the number of voxels and increase the efficiency of the master model, we aggregate similar voxels using a clustering technique. The efficiency of the algorithm and the treatment quality are evaluated on a benchmark clinical prostate cancer case. The computational results show that a high-quality treatment is achievable using a four-thread CPU. Finally, we analyze the effects of the various parameters and two leaf-motion strategies.
Massively parallel support for a case-based planning system
NASA Technical Reports Server (NTRS)
Kettler, Brian P.; Hendler, James A.; Anderson, William A.
1993-01-01
Case-based planning (CBP), a kind of case-based reasoning, is a technique in which previously generated plans (cases) are stored in memory and can be reused to solve similar planning problems in the future. CBP can save considerable time over generative planning, in which a new plan is produced from scratch. CBP thus offers a potential (heuristic) mechanism for handling intractable problems. One drawback of CBP systems has been the need for a highly structured memory to reduce retrieval times. This approach requires significant domain engineering and complex memory indexing schemes to make these planners efficient. In contrast, our CBP system, CaPER, uses a massively parallel frame-based AI language (PARKA) and can do extremely fast retrieval of complex cases from a large, unindexed memory. The ability to do fast, frequent retrievals has many advantages: indexing is unnecessary; very large case bases can be used; memory can be probed in numerous alternate ways; and queries can be made at several levels, allowing more specific retrieval of stored plans that better fit the target problem with less adaptation. In this paper we describe CaPER's case retrieval techniques and some experimental results showing its good performance, even on large case bases.
Efficient Network Coding-Based Loss Recovery for Reliable Multicast in Wireless Networks
NASA Astrophysics Data System (ADS)
Chi, Kaikai; Jiang, Xiaohong; Ye, Baoliu; Horiguchi, Susumu
Recently, network coding has been applied to the loss recovery of reliable multicast in wireless networks [19], where multiple lost packets are XOR-ed together as one packet and forwarded via single retransmission, resulting in a significant reduction of bandwidth consumption. In this paper, we first prove that maximizing the number of lost packets for XOR-ing, which is the key part of the available network coding-based reliable multicast schemes, is actually a complex NP-complete problem. To address this limitation, we then propose an efficient heuristic algorithm for finding an approximately optimal solution of this optimization problem. Furthermore, we show that the packet coding principle of maximizing the number of lost packets for XOR-ing sometimes cannot fully exploit the potential coding opportunities, and we then further propose new heuristic-based schemes with a new coding principle. Simulation results demonstrate that the heuristic-based schemes have very low computational complexity and can achieve almost the same transmission efficiency as the current coding-based high-complexity schemes. Furthermore, the heuristic-based schemes with the new coding principle not only have very low complexity, but also slightly outperform the current high-complexity ones.
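The core constraint behind XOR-based recovery is that a receiver can decode a coded retransmission only if it is missing at most one of the packets combined in it. The sketch below shows a simple greedy grouping under that constraint with an invented loss pattern; it is not the authors' algorithm nor their refined coding principle.

```python
# Greedy grouping of lost packets into XOR-coded retransmissions.
lost = {            # receiver -> set of packet ids it failed to receive (invented)
    "r1": {1, 4},
    "r2": {2},
    "r3": {3, 4},
    "r4": {1, 3},
}

def compatible(group, packet):
    """A packet may join the XOR group if no receiver misses two group members."""
    return all(len((group | {packet}) & missing) <= 1 for missing in lost.values())

def greedy_xor_groups(packets):
    """Greedily grow XOR groups, trying to put as many lost packets as possible
    into each retransmission."""
    remaining, groups = set(packets), []
    while remaining:
        group = set()
        for p in sorted(remaining):
            if compatible(group, p):
                group.add(p)
        groups.append(group)
        remaining -= group
    return groups

all_lost = set().union(*lost.values())
groups = greedy_xor_groups(all_lost)
print("retransmissions:", groups)   # each group is sent as one XOR-ed packet
```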
Development of heuristic bias detection in elementary school.
De Neys, Wim; Feremans, Vicky
2013-02-01
Although human reasoning is often biased by intuitive heuristics, recent studies have shown that adults and adolescents detect the biased nature of their judgments. The present study focused on the development of this critical bias sensitivity by examining the detection skills of young children in elementary school. Third and 6th graders were presented with child-friendly versions of classic base-rate problems in which a cued heuristic response could be inconsistent or consistent with the base rates. After each problem children were asked to indicate their subjective response confidence to assess their bias detection skills. Results indicated that 6th graders showed a clear confidence decrease when they gave a heuristic response that conflicted with the base rates. However, this confidence decrease was not observed for 3rd graders, suggesting that they did not yet acknowledge that their judgment was not fully warranted. Implications for the development of efficient training programs and the debate on human rationality are discussed. (c) 2013 APA, all rights reserved.
Impact of Blended Learning Environments Based on Algo-Heuristic Theory on Some Variables
ERIC Educational Resources Information Center
Aygün, Mustafa; Korkmaz, Özgen
2012-01-01
In this study, the effects of Algo-Heuristic Theory based blended learning environments on students' computer skills in their preparation of presentations, levels of attitudes towards computers, and levels of motivation regarding the information technology course were investigated. The research sample was composed of 71 students. A semi-empirical…
Exact and Heuristic Algorithms for Runway Scheduling
NASA Technical Reports Server (NTRS)
Malik, Waqar A.; Jung, Yoon C.
2016-01-01
This paper explores the Single Runway Scheduling (SRS) problem with arrivals, departures, and crossing aircraft on the airport surface. Constraints for wake vortex separations, departure area navigation separations and departure time window restrictions are explicitly considered. The main objective of this research is to develop exact and heuristic based algorithms that can be used in real-time decision support tools for Air Traffic Control Tower (ATCT) controllers. The paper provides a multi-objective dynamic programming (DP) based algorithm that finds the exact solution to the SRS problem, but may prove unusable for application in real-time environment due to large computation times for moderate sized problems. We next propose a second algorithm that uses heuristics to restrict the search space for the DP based algorithm. A third algorithm based on a combination of insertion and local search (ILS) heuristics is then presented. Simulation conducted for the east side of Dallas/Fort Worth International Airport allows comparison of the three proposed algorithms and indicates that the ILS algorithm performs favorably in its ability to find efficient solutions and its computation times.
Estimation of post-test probabilities by residents: Bayesian reasoning versus heuristics?
Hall, Stacey; Phang, Sen Han; Schaefer, Jeffrey P; Ghali, William; Wright, Bruce; McLaughlin, Kevin
2014-08-01
Although the process of diagnosing invariably begins with a heuristic, we encourage our learners to support their diagnoses by analytical cognitive processes, such as Bayesian reasoning, in an attempt to mitigate the effects of heuristics on diagnosing. There are, however, limited data on the use and impact of Bayesian reasoning on the accuracy of disease probability estimates. In this study our objective was to explore whether Internal Medicine residents use a Bayesian process to estimate disease probabilities by comparing their disease probability estimates to literature-derived Bayesian post-test probabilities. We gave 35 Internal Medicine residents four clinical vignettes in the form of a referral letter and asked them to estimate the post-test probability of the target condition in each case. We then compared these to literature-derived probabilities. For each vignette the estimated probability was significantly different from the literature-derived probability. For the two cases with low literature-derived probability our participants significantly overestimated the probability of these target conditions being the correct diagnosis, whereas for the two cases with high literature-derived probability the estimated probability was significantly lower than the calculated value. Our results suggest that residents generate inaccurate post-test probability estimates. Possible explanations for this include ineffective application of Bayesian reasoning, attribute substitution whereby a complex cognitive task is replaced by an easier one (e.g., a heuristic), or systematic rater bias, such as central tendency bias. Further studies are needed to identify the reasons for inaccuracy of disease probability estimates and to explore ways of improving accuracy.
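For reference, the Bayesian calculation the vignettes call for can be done in a few lines: convert the pre-test probability to odds, multiply by the likelihood ratio of the finding, and convert back. The numbers below are illustrative, not those used in the study.

```python
def post_test_probability(pre_test_prob, likelihood_ratio):
    """Odds form of Bayes' theorem: post-test odds = pre-test odds x LR."""
    pre_odds = pre_test_prob / (1.0 - pre_test_prob)
    post_odds = pre_odds * likelihood_ratio
    return post_odds / (1.0 + post_odds)

# e.g. 10% pre-test probability and a positive test with LR+ = 8
print(round(post_test_probability(0.10, 8), 2))   # ~0.47
```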
SPARSE: quadratic time simultaneous alignment and folding of RNAs without sequence-based heuristics
Will, Sebastian; Otto, Christina; Miladi, Milad; Möhl, Mathias; Backofen, Rolf
2015-01-01
Motivation: RNA-Seq experiments have revealed a multitude of novel ncRNAs. The gold standard for their analysis based on simultaneous alignment and folding suffers from extreme time complexity of O(n^6). Subsequently, numerous faster ‘Sankoff-style’ approaches have been suggested. Commonly, the performance of such methods relies on sequence-based heuristics that restrict the search space to optimal or near-optimal sequence alignments; however, the accuracy of sequence-based methods breaks down for RNAs with sequence identities below 60%. Alignment approaches like LocARNA, which do not require sequence-based heuristics, have been limited to high complexity (≥ quartic time). Results: Breaking this barrier, we introduce the novel Sankoff-style algorithm ‘sparsified prediction and alignment of RNAs based on their structure ensembles (SPARSE)’, which runs in quadratic time without sequence-based heuristics. To achieve this low complexity, on par with sequence alignment algorithms, SPARSE features strong sparsification based on structural properties of the RNA ensembles. Following PMcomp, SPARSE gains further speed-up from lightweight energy computation. Although all existing lightweight Sankoff-style methods restrict Sankoff’s original model by disallowing loop deletions and insertions, SPARSE transfers the Sankoff algorithm to the lightweight energy model completely for the first time. Compared with LocARNA, SPARSE achieves similar alignment and better folding quality in significantly less time (speedup: 3.7). At similar run-time, it aligns low sequence identity instances substantially more accurately than RAF, which uses sequence-based heuristics. Availability and implementation: SPARSE is freely available at http://www.bioinf.uni-freiburg.de/Software/SPARSE. Contact: backofen@informatik.uni-freiburg.de Supplementary information: Supplementary data are available at Bioinformatics online. PMID:25838465
Paranoid thinking as a heuristic.
Preti, Antonio; Cella, Matteo
2010-08-01
Paranoid thinking can be viewed as a human heuristic used by individuals to deal with uncertainty during stressful situations. Under stress, individuals are likely to emphasize the threatening value of neutral stimuli and to increase their reliance on a paranoia-based heuristic to interpret events and guide their decisions. Paranoid thinking can also be activated by stress arising from the possibility of losing a good opportunity; this may result in an abnormal allocation of attentional resources to social agents. A better understanding of the interplay between cognitive heuristics and emotional processes may help to detect situations in which paranoid thinking is likely to be exacerbated, and to improve interventions for individuals with delusional disorders.
Instructional Design: Case Studies in Communities of Practice
ERIC Educational Resources Information Center
Keppell, Michael, Ed.
2007-01-01
"Instructional Design: Case Studies in Communities of Practice" documents real-world experiences of instructional designers and staff developers who work in communities of practice. "Instructional Design: Case Studies in Communities of Practice" explains the strategies and heuristics used by instructional designers when working…
ERIC Educational Resources Information Center
Morsanyi, Kinga; Handley, Simon J.
2008-01-01
We examined the relationship between cognitive capacity and heuristic responding on four types of reasoning and decision-making tasks. A total of 84 children, between 5 years 2 months and 11 years 7 months of age, participated in the study. There was a marked increase in heuristic responding with age that was related to increases in cognitive…
2005-05-01
A Heuristic Decision Making Model to Mitigate Adverse Consequences in a Network Centric Warfare / Sense and Respond System. Based on this research, few military decision makers currently identify and analyze adverse consequences, and most do not do so effectively.
Activity Recognition for Personal Time Management
NASA Astrophysics Data System (ADS)
Prekopcsák, Zoltán; Soha, Sugárka; Henk, Tamás; Gáspár-Papanek, Csaba
We describe an accelerometer-based activity recognition system for mobile phones with a special focus on personal time management. We compare several data mining algorithms for the automatic recognition task in both single-user and multi-user scenarios, and improve accuracy with heuristics and advanced data mining methods. The results show that daily activities can be recognized with high accuracy and that integration with the RescueTime software can give good insights for personal time management.
A simple strategy for varying the restart parameter in GMRES(m)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baker, A H; Jessup, E R; Kolev, T V
2007-10-02
When solving a system of linear equations with the restarted GMRES method, a fixed restart parameter is typically chosen. We present numerical experiments that demonstrate the beneficial effects of changing the value of the restart parameter in each restart cycle on the total time to solution. We propose a simple strategy for varying the restart parameter and provide some heuristic explanations for its effectiveness based on analysis of the symmetric case.
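A minimal sketch of the mechanism being varied: restarted GMRES(m) cycles in which the Arnoldi basis size m changes from cycle to cycle (here it simply alternates between two values). The strategy the paper proposes for choosing the parameter is its own; this only shows where such a choice plugs in, and the test matrix is an invented, well-conditioned example.

```python
import numpy as np

def gmres_cycle(A, b, x0, m):
    """One restart cycle: build an m-step Arnoldi basis and minimize the residual."""
    n = len(b)
    r0 = b - A @ x0
    beta = np.linalg.norm(r0)
    if beta == 0:
        return x0
    Q = np.zeros((n, m + 1))
    H = np.zeros((m + 1, m))
    Q[:, 0] = r0 / beta
    for j in range(m):
        w = A @ Q[:, j]
        for i in range(j + 1):
            H[i, j] = Q[:, i] @ w
            w -= H[i, j] * Q[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        if H[j + 1, j] < 1e-14:       # happy breakdown: Krylov space exhausted
            m = j + 1
            break
        Q[:, j + 1] = w / H[j + 1, j]
    e1 = np.zeros(m + 1)
    e1[0] = beta
    y, *_ = np.linalg.lstsq(H[:m + 1, :m], e1, rcond=None)
    return x0 + Q[:, :m] @ y

rng = np.random.default_rng(0)
n = 100
A = np.eye(n) + 0.05 * rng.standard_normal((n, n))   # invented well-conditioned system
b = rng.standard_normal(n)

x = np.zeros(n)
restarts = [10, 30]                       # vary m between cycles instead of fixing it
for cycle in range(20):
    m = restarts[cycle % len(restarts)]
    x = gmres_cycle(A, b, x, m)
print("residual norm:", np.linalg.norm(b - A @ x))
```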
Parental investment: how an equity motive can produce inequality.
Hertwig, Ralph; Davis, Jennifer Nerissa; Sulloway, Frank J
2002-09-01
The equity heuristic is a decision rule specifying that parents should attempt to subdivide resources more or less equally among their children. This investment rule coincides with the prescription from optimality models in economics and biology in cases in which expected future return for each offspring is equal. In this article, the authors present a counterintuitive implication of the equity heuristic: Whereas an equity motive produces a fair distribution at any given point in time, it yields a cumulative distribution of investments that is unequal. The authors test this analytical observation against evidence reported in studies exploring parental investment and show how the equity heuristic can provide an explanation of why the literature reports a diversity of birth order effects with respect to parental resource allocation.
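The cumulative-inequality effect is easy to reproduce in a few lines: splitting a fixed per-period budget equally among the children present in each period still yields unequal totals across birth ranks. The family structure and numbers below are invented for illustration.

```python
def cumulative_investment(birth_years, horizon, yearly_budget=1.0, dependency=18):
    """Equity heuristic per period: equal split among currently dependent children."""
    totals = [0.0] * len(birth_years)
    for year in range(horizon):
        at_home = [i for i, b in enumerate(birth_years)
                   if b <= year < b + dependency]     # children currently dependent
        for i in at_home:                              # equal split of this year's budget
            totals[i] += yearly_budget / len(at_home)
    return totals

# three children born 0, 3 and 6 years into the observation window
print([round(t, 1) for t in cumulative_investment([0, 3, 6], horizon=30)])
# first- and last-born accumulate more than the middle child
```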
Unified heuristics to solve routing problem of reverse logistics in sustainable supply chain
NASA Astrophysics Data System (ADS)
Anbuudayasankar, S. P.; Ganesh, K.; Lenny Koh, S. C.; Mohandas, K.
2010-03-01
A reverse logistics problem, motivated by many real-life applications, is examined where bottles/cans in which products are delivered from a processing depot to customers in one period are available for return to the depot in the following period. The picked-up bottles/cans need to be accommodated alongside the delivery load. This problem is termed the simultaneous delivery and pick-up problem with constrained capacity (SDPC). We develop three unified heuristics based on an extended branch-and-bound heuristic, a genetic algorithm and simulated annealing to solve SDPC. These heuristics are also designed to solve the standard travelling salesman problem (TSP) and the TSP with simultaneous delivery and pick-up (TSDP). We tested the heuristics on standard, derived and randomly generated datasets of TSP, TSDP and SDPC and obtained satisfactory results with fast convergence in reasonable time.
Monkman, Helen; Griffith, Janessa; Kushniruk, Andre W
2015-01-01
Heuristic evaluations have proven to be valuable for identifying usability issues in systems. Commonly used sets of heuristics exist; however, they may not always be the most suitable, given the specific goal of the analysis. One such example is seeking to evaluate the demands on eHealth literacy and usability of consumer health information systems. In this study, eight essential heuristics and three optional heuristics derived from the evidence on eHealth/health literacy and usability were tested for their utility in assessing a mobile blood pressure tracking application (app). This evaluation revealed a variety of ways the design of the app could both benefit and impede users with limited eHealth literacy. This study demonstrated the utility of a low-cost, single evaluation approach for identifying both eHealth literacy and usability issues based on existing evidence in the literature.
Performance of Optimization Heuristics for the Operational Planning of Multi-energy Storage Systems
NASA Astrophysics Data System (ADS)
Haas, J.; Schradi, J.; Nowak, W.
2016-12-01
In the transition to low-carbon energy sources, energy storage systems (ESS) will play an increasingly important role. Particularly in the context of solar power challenges (variability, uncertainty), ESS can provide valuable services: energy shifting, ramping, robustness against forecast errors, frequency support, etc. However, these qualities are rarely modelled in the operational planning of power systems because of the involved computational burden, especially when multiple ESS technologies are involved. This work assesses two optimization heuristics for speeding up the optimal operation problem. It compares their accuracy (in terms of costs) and speed against a reference solution. The first heuristic (H1) is based on a merit order. Here, the ESS are sorted from lower to higher operational costs (including cycling costs). For each time step, the cheapest available ESS is used first, followed by the second one and so on, until matching the net load (demand minus available renewable generation). The second heuristic (H2) uses the Fourier transform to detect the main frequencies that compose the net load. A specific ESS is assigned to each frequency range, aiming to smoothen the net load. Finally, the reference solution is obtained with a mixed integer linear program (MILP). H1, H2 and MILP are subject to technical constraints (energy/power balance, ramping rates, on/off states...). Costs due to operation, replacement (cycling) and unserved energy are considered. Four typical days of a system with a high share of solar energy were used in several test cases, varying the resolution from one second to fifteen minutes. H1 and H2 achieve accuracies of about 90% and 95% in average, and speed-up times of two to three and one to two orders of magnitude, respectively. The use of the heuristics looks promising in the context of planning the expansion of power systems, especially when their loss of accuracy is outweighed by solar or wind forecast errors.
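A minimal sketch of the merit-order idea behind H1, under assumed costs, power/energy limits and net-load profile (all invented): the cheapest storage is used first to cover the residual net load of each time step, discharging when net load is positive and charging when it is negative.

```python
# Merit-order dispatch sketch: storages sorted by operational cost, cheapest first.
storages = sorted([
    {"name": "battery",  "cost": 5.0, "power": 3.0, "energy": 6.0,  "soc": 3.0},
    {"name": "pumped",   "cost": 2.0, "power": 2.0, "energy": 20.0, "soc": 10.0},
    {"name": "flywheel", "cost": 8.0, "power": 1.0, "energy": 0.5,  "soc": 0.25},
], key=lambda s: s["cost"])

net_load = [1.5, 4.0, -2.0, 3.0, -1.0, 2.5]   # demand minus renewable generation

for t, load in enumerate(net_load):
    residual = load
    for s in storages:                          # cheapest technology first
        if residual > 0:                        # discharge to serve remaining load
            use = min(residual, s["power"], s["soc"])
            s["soc"] -= use
        else:                                   # charge with surplus generation
            use = -min(-residual, s["power"], s["energy"] - s["soc"])
            s["soc"] += -use
        residual -= use
    print(f"t={t} unserved={residual:+.2f} soc=" +
          ", ".join(f"{s['name']}:{s['soc']:.2f}" for s in storages))
```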
DyKOSMap: A framework for mapping adaptation between biomedical knowledge organization systems.
Dos Reis, Julio Cesar; Pruski, Cédric; Da Silveira, Marcos; Reynaud-Delaître, Chantal
2015-06-01
Knowledge Organization Systems (KOS) and their associated mappings play a central role in several decision support systems. However, by virtue of knowledge evolution, KOS entities are modified over time, impacting mappings and potentially rendering them invalid. This calls for semi-automatic methods to keep such semantic correspondences up to date as the KOS evolve. We define a complete and original framework based on formal heuristics that drives the adaptation of KOS mappings. Our approach takes into account the definition of established mappings, the evolution of KOS and the possible changes that can be applied to mappings. This study experimentally evaluates the proposed heuristics and the entire framework on realistic case studies borrowed from the biomedical domain, using official mappings between several biomedical KOSs. We demonstrate the overall performance of the approach over biomedical datasets of different characteristics and sizes. Our findings reveal the effectiveness, in terms of precision, recall and F-measure, of the suggested heuristics and methods defining the framework to adapt mappings affected by KOS evolution. The obtained results help preserve and improve the quality of mappings over time. The proposed framework can adapt mappings largely automatically, thus facilitating the maintenance task. The implemented algorithms and tools support and minimize the work of users in charge of KOS mapping maintenance. Copyright © 2015 Elsevier Inc. All rights reserved.
Can the inherence heuristic explain vitalistic reasoning?
Bastian, Brock
2014-10-01
Inherence is an important component of psychological essentialism. By drawing on vitalism as a way in which to explain this link, however, the authors appear to conflate causal explanations based on fixed features with those based on general causal forces. The disjuncture between these two types of explanatory principles highlights potential new avenues for the inherence heuristic.
Reasoning by analogy as an aid to heuristic theorem proving.
NASA Technical Reports Server (NTRS)
Kling, R. E.
1972-01-01
When heuristic problem-solving programs are faced with large data bases that contain numbers of facts far in excess of those needed to solve any particular problem, their performance rapidly deteriorates. In this paper, the correspondence between a new unsolved problem and a previously solved analogous problem is computed and invoked to tailor large data bases to manageable sizes. This paper outlines the design of an algorithm for generating and exploiting analogies between theorems posed to a resolution-logic system. These algorithms are believed to be the first computationally feasible development of reasoning by analogy to be applied to heuristic theorem proving.
Scheduling and rescheduling with iterative repair
NASA Technical Reports Server (NTRS)
Zweben, Monte; Davis, Eugene; Daun, Brian; Deale, Michael
1992-01-01
This paper describes the GERRY scheduling and rescheduling system being applied to coordinate Space Shuttle Ground Processing. The system uses constraint-based iterative repair, a technique that starts with a complete but possibly flawed schedule and iteratively improves it by using constraint knowledge within repair heuristics. In this paper we explore the tradeoff between the informedness and the computational cost of several repair heuristics. We show empirically that some knowledge can greatly improve the convergence speed of a repair-based system, but that too much knowledge, such as the knowledge embodied within the MIN-CONFLICTS lookahead heuristic, can overwhelm a system and result in degraded performance.
Nash, Mark S; Cowan, Rachel E; Kressler, Jochen
2012-09-01
Component and coalesced health risks of the cardiometabolic syndrome (CMS) are commonly reported in persons with spinal cord injuries (SCIs). These CMS hazards are also co-morbid with physical deconditioning and elevated pro-atherogenic inflammatory cytokines, both of which are common after SCI and worsen the prognosis for all-cause cardiovascular disease. This article describes a systematic procedure for individualized CMS risk assessment after SCI, and emphasizes evidence-based and intuition-centered countermeasures to disease. A unified approach will propose therapeutic lifestyle intervention as a routine plan for aggressive primary prevention in this risk-susceptible population. Customization of dietary and exercise plans then follow, identifying shortfalls in diet and activity patterns, and ways in which these healthy lifestyles can be more substantially embraced by both stakeholders with SCI and their health care providers. In cases where lifestyle intervention utilizing diet and exercise is unsuccessful in countering risks, available pharmacotherapies and a preferred therapeutic agent are proposed according to authoritative standards. The over-arching purpose of the monograph is to create an operational framework in which existing evidence-based approaches or heuristic modeling becomes best practice. In this way persons with SCI can lead more active and healthy lives.
Précis of Simple heuristics that make us smart.
Todd, P M; Gigerenzer, G
2000-10-01
How can anyone be rational in a world where knowledge is limited, time is pressing, and deep thought is often an unattainable luxury? Traditional models of unbounded rationality and optimization in cognitive science, economics, and animal behavior have tended to view decision-makers as possessing supernatural powers of reason, limitless knowledge, and endless time. But understanding decisions in the real world requires a more psychologically plausible notion of bounded rationality. In Simple heuristics that make us smart (Gigerenzer et al. 1999), we explore fast and frugal heuristics--simple rules in the mind's adaptive toolbox for making decisions with realistic mental resources. These heuristics can enable both living organisms and artificial systems to make smart choices quickly and with a minimum of information by exploiting the way that information is structured in particular environments. In this précis, we show how simple building blocks that control information search, stop search, and make decisions can be put together to form classes of heuristics, including: ignorance-based and one-reason decision making for choice, elimination models for categorization, and satisficing heuristics for sequential search. These simple heuristics perform comparably to more complex algorithms, particularly when generalizing to new data--that is, simplicity leads to robustness. We present evidence regarding when people use simple heuristics and describe the challenges to be addressed by this research program.
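One of the one-reason heuristics discussed, take-the-best, is easy to state in code: compare two options cue by cue in order of cue validity and decide on the first discriminating cue. The cues, values and ordering below are invented for illustration and are not data from the book.

```python
# Take-the-best sketch: one-reason decision making with lexicographic cue search.
cue_order = ["capital", "soccer_team", "airport"]      # ordered by (assumed) validity

cities = {
    "A": {"capital": 1, "soccer_team": 1, "airport": 1},
    "B": {"capital": 0, "soccer_team": 1, "airport": 0},
}

def take_the_best(a, b, cues):
    """Return the option favored by the first discriminating cue, else guess."""
    for cue in cues:
        if cities[a][cue] != cities[b][cue]:
            return a if cities[a][cue] > cities[b][cue] else b
    return "guess"

print(take_the_best("A", "B", cue_order))   # stops at the first cue: "A"
```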
Put a limit on it: The protective effects of scarcity heuristics when self-control is low
Cheung, Tracy TL; Kroese, Floor M; Fennis, Bob M; De Ridder, Denise TD
2015-01-01
Low self-control is a state in which consumers are assumed to be vulnerable to making impulsive choices that hurt long-term goals. Rather than increasing self-control, the current research exploits the tendency for heuristic-based thinking in low self-control by employing scarcity heuristics to promote better consumption choices. Results indicate that consumers low in self-control especially benefited and selected healthier options when these were marketed as “scarce” (Study 1), and that a demand (vs supply) scarcity heuristic was most effective in promoting utilitarian products (Study 2), suggesting that low self-control involves both an enhanced reward orientation and an increased tendency to conform to descriptive norms. PMID:28070377
A two-stage stochastic rule-based model to determine pre-assembly buffer content
NASA Astrophysics Data System (ADS)
Gunay, Elif Elcin; Kula, Ufuk
2018-01-01
This study considers the instant decision-making needs of automobile manufacturers for resequencing vehicles before final assembly (FA). We propose a rule-based two-stage stochastic model to determine the number of spare vehicles that should be kept in the pre-assembly buffer to restore the sequence altered by paint defects and upstream department constraints. The first stage of the model decides the spare vehicle quantities, while the second stage recovers the scrambled sequence with respect to pre-defined rules. The problem is solved by a sample average approximation (SAA) algorithm. We conduct a numerical study to compare the solutions of the heuristic model with optimal ones and provide the following insights: (i) as the mismatch between the paint entrance and scheduled sequences decreases, the rule-based heuristic model recovers the scrambled sequence as well as the optimal resequencing model, (ii) the rule-based model is more sensitive to the mismatch between the paint entrance and scheduled sequences when recovering the scrambled sequence, (iii) as the defect rate increases, the difference in recovery effectiveness between the rule-based heuristic and optimal solutions increases, (iv) as buffer capacity increases, the recovery effectiveness of the optimization model outperforms the heuristic model, and (v) as expected, the rule-based model holds more inventory than the optimization model.
Response demands and the recruitment of heuristic strategies in syllogistic reasoning.
Reverberi, Carlo; Rusconi, Patrice; Paulesu, Eraldo; Cherubini, Paolo
2009-03-01
Two experiments investigated whether dealing with a homogeneous subset of syllogisms with time-constrained responses encouraged participants to develop and use heuristics for abstract (Experiment 1) and thematic (Experiment 2) syllogisms. An atmosphere-based heuristic accounted for most responses with both abstract and thematic syllogisms. With thematic syllogisms, a weaker effect of a belief heuristic was also observed, mainly where the correct response was inconsistent with the atmosphere of the premises. Analytic processes appear to have played little role in the time-constrained condition, whereas their involvement increased in a self-paced, unconstrained condition. From a dual-process perspective, the results further specify how task demands affect the recruitment of heuristic and analytic systems of reasoning. Because the syllogisms and experimental procedure were the same as those used in a previous neuroimaging study by Goel, Buchel, Frith, and Dolan (2000), the results also deepen our understanding of the cognitive processes investigated by that study.
Madni, Syed Hamid Hussain; Abd Latiff, Muhammad Shafie; Abdullahi, Mohammed; Abdulhamid, Shafi'i Muhammad; Usman, Mohammed Joda
2017-01-01
Cloud computing infrastructure is suitable for meeting computational needs of large task sizes. Optimal scheduling of tasks in cloud computing environment has been proved to be an NP-complete problem, hence the need for the application of heuristic methods. Several heuristic algorithms have been developed and used in addressing this problem, but choosing the appropriate algorithm for solving task assignment problem of a particular nature is difficult since the methods are developed under different assumptions. Therefore, six rule based heuristic algorithms are implemented and used to schedule autonomous tasks in homogeneous and heterogeneous environments with the aim of comparing their performance in terms of cost, degree of imbalance, makespan and throughput. First Come First Serve (FCFS), Minimum Completion Time (MCT), Minimum Execution Time (MET), Max-min, Min-min and Sufferage are the heuristic algorithms considered for the performance comparison and analysis of task scheduling in cloud computing.
Madni, Syed Hamid Hussain; Abd Latiff, Muhammad Shafie; Abdullahi, Mohammed; Usman, Mohammed Joda
2017-01-01
Cloud computing infrastructure is suitable for meeting computational needs of large task sizes. Optimal scheduling of tasks in cloud computing environment has been proved to be an NP-complete problem, hence the need for the application of heuristic methods. Several heuristic algorithms have been developed and used in addressing this problem, but choosing the appropriate algorithm for solving task assignment problem of a particular nature is difficult since the methods are developed under different assumptions. Therefore, six rule based heuristic algorithms are implemented and used to schedule autonomous tasks in homogeneous and heterogeneous environments with the aim of comparing their performance in terms of cost, degree of imbalance, makespan and throughput. First Come First Serve (FCFS), Minimum Completion Time (MCT), Minimum Execution Time (MET), Max-min, Min-min and Sufferage are the heuristic algorithms considered for the performance comparison and analysis of task scheduling in cloud computing. PMID:28467505
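As a concrete illustration of one of the compared rules, the sketch below implements Min-min on invented task lengths and VM speeds: the unscheduled task with the smallest earliest completion time is scheduled first, on the VM that achieves it. The other five rules differ only in how the next task/VM pair is picked.

```python
# Min-min scheduling sketch for independent tasks on heterogeneous VMs.
task_lengths = [400, 120, 950, 300, 60, 720]      # e.g. million instructions (invented)
vm_speeds = [100, 250]                            # e.g. MIPS per VM (invented)

def min_min(tasks, speeds):
    ready = [0.0] * len(speeds)                   # time at which each VM becomes free
    assignment = {}
    unscheduled = set(range(len(tasks)))
    while unscheduled:
        best = None                               # (completion_time, task, vm)
        for t in unscheduled:
            for v, speed in enumerate(speeds):
                finish = ready[v] + tasks[t] / speed
                if best is None or finish < best[0]:
                    best = (finish, t, v)
        finish, t, v = best
        assignment[t] = v
        ready[v] = finish
        unscheduled.remove(t)
    return assignment, max(ready)                 # schedule and makespan

assignment, makespan = min_min(task_lengths, vm_speeds)
print("assignment:", assignment, "makespan:", round(makespan, 2))
```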
A health literacy and usability heuristic evaluation of a mobile consumer health application.
Monkman, Helen; Kushniruk, Andre
2013-01-01
Usability and health literacy are two critical factors in the design and evaluation of consumer health information systems. However, methods for evaluating these two factors in conjunction remain limited. This study adapted a set of existing guidelines for the design of consumer health Web sites into evidence-based evaluation heuristics tailored specifically for mobile consumer health applications. In order to test the approach, a mobile consumer health application (app) was then evaluated using these heuristics. In addition to revealing ways to improve the usability of the system, this analysis identified opportunities to augment the content to make it more understandable by users with limited health literacy. This study successfully demonstrated the utility of converting existing design guidelines into heuristics for the evaluation of usability and health literacy. The heuristics generated could be applied for assessing and revising other existing consumer health information systems.
Minimizing conflicts: A heuristic repair method for constraint-satisfaction and scheduling problems
NASA Technical Reports Server (NTRS)
Minton, Steve; Johnston, Mark; Philips, Andrew; Laird, Phil
1992-01-01
This paper describes a simple heuristic approach to solving large-scale constraint satisfaction and scheduling problems. In this approach one starts with an inconsistent assignment for a set of variables and searches through the space of possible repairs. The search can be guided by a value-ordering heuristic, the min-conflicts heuristic, that attempts to minimize the number of constraint violations after each step. The heuristic can be used with a variety of different search strategies. We demonstrate empirically that on the n-queens problem, a technique based on this approach performs orders of magnitude better than traditional backtracking techniques. We also describe a scheduling application where the approach has been used successfully. A theoretical analysis is presented both to explain why this method works well on certain types of problems and to predict when it is likely to be most effective.
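The repair loop itself is compact; the sketch below applies the min-conflicts value-ordering heuristic to n-queens as a plain hill-climbing repair (it does not reproduce the paper's scheduling application or its theoretical analysis): pick a conflicted queen and move it to a least-conflicted row.

```python
import random

def conflicts(board, col, row):
    """Number of other queens attacking square (col, row)."""
    return sum(1 for c, r in enumerate(board)
               if c != col and (r == row or abs(r - row) == abs(c - col)))

def min_conflicts(n, max_steps=100_000):
    board = [random.randrange(n) for _ in range(n)]   # board[col] = row of queen
    for _ in range(max_steps):
        conflicted = [c for c in range(n) if conflicts(board, c, board[c]) > 0]
        if not conflicted:
            return board                               # complete, consistent solution
        col = random.choice(conflicted)
        # repair: move this queen to a least-conflicted row (ties broken randomly)
        counts = [conflicts(board, col, r) for r in range(n)]
        best = min(counts)
        board[col] = random.choice([r for r, c in enumerate(counts) if c == best])
    return None

random.seed(4)
solution = min_conflicts(50)
print("solved:", solution is not None)
```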
Balzan, Ryan; Delfabbro, Paul; Galletly, Cherrie; Woodward, Todd
2012-01-01
Hypersalience of evidence-hypothesis matches has recently been proposed as the cognitive mechanism responsible for the cognitive biases which, in turn, may contribute to the formation and maintenance of delusions. However, the construct lacks empirical support. The current paper investigates the possibility that individuals with delusions are hypersalient to evidence-hypothesis matches using a series of cognitive tasks designed to elicit the representativeness and availability reasoning heuristics. It was hypothesised that hypersalience of evidence-hypothesis matches may increase a person's propensity to rely on judgements of representativeness (i.e., when the probability of an outcome is based on its similarity with its parent population) and availability (i.e., estimates of frequency based on the ease with which relevant events come to mind). A total of 75 participants (25 diagnosed with schizophrenia with a history of delusions; 25 nonclinical delusion-prone; 25 nondelusion-prone controls) completed four heuristics tasks based on the original Tversky and Kahnemann experiments. These included two representativeness tasks ("coin-toss" random sequence task; "lawyer-engineer" base-rates task) and two availability tasks ("famous-names" and "letter-frequency" tasks). The results across these four heuristics tasks showed that participants with schizophrenia were more susceptible than nonclinical groups to both the representativeness and availability reasoning heuristics. These results suggest that delusional ideation is linked to a hypersalience of evidence-hypothesis matches. The theoretical implications of this cognitive mechanism on the formation and maintenance of delusions are discussed.
Heuristic Evaluation on Mobile Interfaces: A New Checklist
Yáñez Gómez, Rosa; Cascado Caballero, Daniel; Sevillano, José-Luis
2014-01-01
The rapid evolution and adoption of mobile devices raise new usability challenges, given their limitations (in screen size, battery life, etc.) as well as the specific requirements of this new interaction. Traditional evaluation techniques need to be adapted in order for these requirements to be met. Heuristic evaluation (HE), an inspection method based on evaluation of a real system or prototype by experts, relies on checklists which are desktop-centred and do not adequately detect mobile-specific usability issues. In this paper, we propose a compilation of heuristic evaluation checklists taken from the existing bibliography but readapted to new mobile interfaces. Selecting and rearranging these heuristic guidelines offers a tool which works well not just for evaluation but also as a best-practices checklist. The result is a comprehensive checklist which is experimentally evaluated as a design tool. This experimental evaluation involved two software engineers without any specific knowledge about usability and a group of ten users, who compared the usability of a first prototype designed without our heuristics and of a second one designed after applying the proposed checklist. The results of this experiment show the usefulness of the proposed checklist for avoiding usability gaps even with nontrained developers. PMID:25295300
Negations in syllogistic reasoning: evidence for a heuristic-analytic conflict.
Stupple, Edward J N; Waterhouse, Eleanor F
2009-08-01
An experiment utilizing response time measures was conducted to test dominant processing strategies in syllogistic reasoning with the expanded quantifier set proposed by Roberts (2005). Through adding negations to existing quantifiers it is possible to change problem surface features without altering logical validity. Biases based on surface features such as atmosphere, matching, and the probability heuristics model (PHM; Chater & Oaksford, 1999; Wetherick & Gilhooly, 1995) would not be expected to show variance in response latencies, but participant responses should be highly sensitive to changes in the surface features of the quantifiers. In contrast, according to analytic accounts such as mental models theory and mental logic (e.g., Johnson-Laird & Byrne, 1991; Rips, 1994) participants should exhibit increased response times for negated premises, but not be overly impacted upon by the surface features of the conclusion. Data indicated that the dominant response strategy was based on a matching heuristic, but also provided evidence of a resource-demanding analytic procedure for dealing with double negatives. The authors propose that dual-process theories offer a stronger account of these data whereby participants employ competing heuristic and analytic strategies and fall back on a heuristic response when analytic processing fails.
The Case Study as Research Heuristic: Lessons from the R&D Value Mapping Project.
ERIC Educational Resources Information Center
Bozeman, Barry; Klein, Hans K.
1999-01-01
Examines the role of prototype case studies as the foundation for later evaluation through two studies from the "R&D Value Mapping Project," a study that will involve more than 30 cases. Explores the usefulness of case studies in defining and assessing subsequent research efforts. (SLD)
Heuristic evaluation of eNote: an electronic notes system.
Bright, Tiffani J; Bakken, Suzanne; Johnson, Stephen B
2006-01-01
eNote is an electronic health record (EHR) system based on semi-structured narrative documents. A heuristic evaluation was conducted with a sample of five usability experts. eNote performed highly in: 1) consistency with standards and 2) recognition rather than recall. eNote needs improvement in: 1) help and documentation, 2) aesthetic and minimalist design, 3) error prevention, 4) helping users recognize, diagnose, and recover from errors, and 5) flexibility and efficiency of use. The heuristic evaluation was an efficient method of evaluating our interface.
ERIC Educational Resources Information Center
Findler, Nicholas V.; And Others
1992-01-01
Describes SHRIF, a System for Heuristic Retrieval of Information and Facts, and the medical knowledge base that was used in its development. Highlights include design decisions; the user-machine interface, including the language processor; and the organization of the knowledge base in an artificial intelligence (AI) project like this one. (57…
A new graph-based method for pairwise global network alignment
Klau, Gunnar W
2009-01-01
Background In addition to component-based comparative approaches, network alignments provide the means to study conserved network topology such as common pathways and more complex network motifs. Yet, unlike in classical sequence alignment, the comparison of networks becomes computationally more challenging, as most meaningful assumptions instantly lead to NP-hard problems. Most previous algorithmic work on network alignments is heuristic in nature. Results We introduce the graph-based maximum structural matching formulation for pairwise global network alignment. We relate the formulation to previous work and prove NP-hardness of the problem. Based on the new formulation we build upon recent results in computational structural biology and present a novel Lagrangian relaxation approach that, in combination with a branch-and-bound method, computes provably optimal network alignments. The Lagrangian algorithm alone is a powerful heuristic method, which produces solutions that are often near-optimal and – unlike those computed by pure heuristics – come with a quality guarantee. Conclusion Computational experiments on the alignment of protein-protein interaction networks and on the classification of metabolic subnetworks demonstrate that the new method is reasonably fast and has advantages over pure heuristics. Our software tool is freely available as part of the LISA library. PMID:19208162
Heuristic evaluation of infusion pumps: implications for patient safety in Intensive Care Units.
Graham, Mark J; Kubose, Tate K; Jordan, Desmond; Zhang, Jiajie; Johnson, Todd R; Patel, Vimla L
2004-11-01
The goal of this research was to use a heuristic evaluation methodology to uncover design and interface deficiencies of infusion pumps that are currently in use in Intensive Care Units (ICUs). Because these infusion systems cannot be readily replaced due to lease agreements and large-scale institutional purchasing procedures, we argue that it is essential to systematically identify the existing usability problems so that the possible causes of errors can be better understood, passed on to the end-users (e.g., critical care nurses), and used to make policy recommendations. Four raters conducted the heuristic evaluation of the three-channel infusion pump interface. Three raters had a cognitive science background as well as experience with the heuristic evaluation methodology. The fourth rater was a veteran critical care nurse who had extensive experience operating the pumps. The usability experts and the domain expert independently evaluated the user interface and physical design of the infusion pump and generated a list of heuristic violations based upon a set of 14 heuristics developed in previous research. The lists were compiled and then rated on the severity of the violation. Across the 14 usability heuristics considered in this evaluation of the infusion pump, 231 violations were found. Two heuristics, "Consistency" and "Language", were found to have the most violations. The heuristic with the fewest violations was "Document". While some heuristic evaluation categories had more violations than others, the most severe ones were not confined to one type. The primary interface location (e.g., where loading the pump, changing doses, and confirming drug settings takes place) had the most occurrences of heuristic violations. We believe that the heuristic evaluation methodology provides a simple and cost-effective approach to discovering medical device deficiencies that affect a patient's general well-being. While this methodology provides information for the infusion pump designs of the future, it also yields important insights concerning equipment that is currently in use in critical care environments.
How Monte Carlo heuristics aid to identify the physical processes of drug release kinetics.
Lecca, Paola
2018-01-01
We implement a Monte Carlo heuristic algorithm to model drug release from a solid dosage form. We show that with Monte Carlo simulations it is possible to identify and explain the causes of the unsatisfactory predictive power of current drug release models. It is well known that the power-law and exponential models, as well as those derived from or inspired by them, accurately reproduce only the first 60% of the release curve of a drug from a dosage form. In this study, by using Monte Carlo simulation approaches, we show that these models fit almost the entire release profile quite accurately when the release kinetics is not governed by the coexistence of different physico-chemical mechanisms. We show that the accuracy of the traditional models is comparable with that of Monte Carlo heuristics when these heuristics approximate and oversimplify the phenomenology of drug release. This observation suggests developing and using novel Monte Carlo simulation heuristics able to describe the complexity of the release kinetics and, consequently, to generate data more similar to those observed in real experiments. Implementing Monte Carlo simulation heuristics of the drug release phenomenology may be much more straightforward and efficient than hypothesizing and implementing complex mathematical models of the physical processes involved in drug release from scratch. Identifying and understanding, through simulation heuristics, which processes of this phenomenology reproduce the observed data, and then formalizing them mathematically, may avoid time-consuming, trial-and-error-based regression procedures. Three bullet points highlight the customization of the procedure: •An efficient heuristic based on Monte Carlo methods for simulating drug release from a solid dosage form is presented. It specifies the model of the physical process in a simple but accurate way in the formula of the Monte Carlo Micro Step (MCS) time interval. •Given the experimentally observed curve of drug release, we point out how Monte Carlo heuristics can be integrated in an evolutionary algorithmic approach to infer the mode of MCS best fitting the observed data, and thus the observed release kinetics. •The software implementing the method is written in the R language, the free language most widely used in the bioinformatics community.
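As a rough illustration of the kind of Monte Carlo heuristic discussed above, the following Python sketch simulates drug release from a one-dimensional matrix by letting particles random-walk until they reach the surface, with one sweep over all particles standing in for one Monte Carlo step; the geometry, particle count, and step rule are illustrative assumptions and not the model of the paper, whose software is written in R.

```python
import random

def simulate_release(n_particles=2000, depth=50, n_steps=4000, seed=1):
    """Toy Monte Carlo model of drug release from a 1-D matrix.

    Each particle random-walks on positions 0..depth and is counted as
    released once it reaches position 0 (the matrix surface). One sweep
    over all remaining particles corresponds to one Monte Carlo step (MCS).
    """
    rng = random.Random(seed)
    positions = [rng.randint(1, depth) for _ in range(n_particles)]
    released = 0
    curve = []
    for _ in range(n_steps):
        still_inside = []
        for x in positions:
            x += rng.choice((-1, 1))
            x = min(x, depth)          # clamp at the inner (sealed) boundary
            if x <= 0:
                released += 1          # absorbed at the releasing surface
            else:
                still_inside.append(x)
        positions = still_inside
        curve.append(released / n_particles)
    return curve

if __name__ == "__main__":
    profile = simulate_release()
    for t in (100, 500, 1000, 2000, 3999):
        print(f"MCS {t:5d}: released fraction = {profile[t]:.2f}")
```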
NASA Astrophysics Data System (ADS)
Aungkulanon, P.; Luangpaiboon, P.
2010-10-01
Nowadays, engineering problems are large and complicated. Effective finite sequences of instructions for solving these problems can be categorised into optimisation and meta-heuristic algorithms. When the best decision variable levels cannot be determined from the sets of available alternatives, meta-heuristics provide an alternative: experience-based techniques that rapidly help in problem solving, learning and discovery in the hope of obtaining a more efficient or more robust procedure. All meta-heuristics provide auxiliary procedures in terms of their own toolbox of functions. It has been shown that the effectiveness of all meta-heuristics depends almost exclusively on these auxiliary functions. In fact, the auxiliary procedure from one meta-heuristic can be implemented in another. The well-known meta-heuristics of harmony search (HSA) and the shuffled frog-leaping algorithm (SFLA) are compared with their hybridisations. HSA produces a near-optimal solution by mimicking the search for a perfect state of harmony in the improvisation process of musicians. The SFLA, a population-based meta-heuristic, is a cooperative search metaphor inspired by natural memetics; it includes elements of local search and global information exchange. This study presents solution procedures for constrained and unconstrained problems with different natures of single- and multi-peak surfaces, including a curved ridge surface. Both meta-heuristics are modified following the variable neighbourhood search method (VNSM) philosophy, including a modified simplex method (MSM). The basic idea is the change of neighbourhoods while searching for a better solution. The hybridisations proceed by a descent method to a local minimum, then explore, systematically or at random, increasingly distant neighbourhoods of this local solution. The results show that the variant of HSA with VNSM and MSM performs better in terms of the mean and variance of the design points and yields.
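For readers unfamiliar with the harmony search algorithm (HSA) mentioned above, the sketch below shows its core improvisation loop (harmony memory consideration, pitch adjustment, and random selection) on a toy box-constrained minimisation problem; the parameter values and test function are illustrative assumptions, and the VNSM/MSM hybridisations discussed in the abstract are not included.

```python
import random

def harmony_search(obj, bounds, hms=20, hmcr=0.9, par=0.3, bw=0.05, iters=5000, seed=0):
    """Minimal harmony search for a box-constrained minimisation problem."""
    rng = random.Random(seed)
    # Initialise the harmony memory with random solutions.
    hm = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(hms)]
    fit = [obj(h) for h in hm]
    for _ in range(iters):
        new = []
        for d, (lo, hi) in enumerate(bounds):
            if rng.random() < hmcr:                     # memory consideration
                x = hm[rng.randrange(hms)][d]
                if rng.random() < par:                  # pitch adjustment
                    x += bw * (hi - lo) * rng.uniform(-1, 1)
            else:                                       # random selection
                x = rng.uniform(lo, hi)
            new.append(min(max(x, lo), hi))
        f = obj(new)
        worst = max(range(hms), key=lambda i: fit[i])
        if f < fit[worst]:                              # replace the worst harmony
            hm[worst], fit[worst] = new, f
    best = min(range(hms), key=lambda i: fit[i])
    return hm[best], fit[best]

if __name__ == "__main__":
    sphere = lambda x: sum(v * v for v in x)            # toy objective
    sol, val = harmony_search(sphere, [(-5.0, 5.0)] * 4)
    print("best value:", round(val, 6))
```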
Quantifying Heuristic Bias: Anchoring, Availability, and Representativeness.
Richie, Megan; Josephson, S Andrew
2018-01-01
Construct: Authors examined whether a new vignette-based instrument could isolate and quantify heuristic bias. Heuristics are cognitive shortcuts that may introduce bias and contribute to error. There is no standardized instrument available to quantify heuristic bias in clinical decision making, limiting future study of educational interventions designed to improve calibration of medical decisions. This study presents validity data to support a vignette-based instrument quantifying bias due to the anchoring, availability, and representativeness heuristics. Participants completed questionnaires requiring assignment of probabilities to potential outcomes of medical and nonmedical scenarios. The instrument randomly presented scenarios in one of two versions: Version A, encouraging heuristic bias, and Version B, worded neutrally. The primary outcome was the difference in probability judgments for Version A versus Version B scenario options. Of 167 participants recruited, 139 enrolled. Participants assigned significantly higher mean probability values to Version A scenario options (M = 9.56, SD = 3.75) than Version B (M = 8.98, SD = 3.76), t(1801) = 3.27, p = .001. This result remained significant analyzing medical scenarios alone (Version A, M = 9.41, SD = 3.92; Version B, M = 8.86, SD = 4.09), t(1204) = 2.36, p = .02. Analyzing medical scenarios by heuristic revealed a significant difference between Version A and B for availability (Version A, M = 6.52, SD = 3.32; Version B, M = 5.52, SD = 3.05), t(404) = 3.04, p = .003, and representativeness (Version A, M = 11.45, SD = 3.12; Version B, M = 10.67, SD = 3.71), t(396) = 2.28, p = .02, but not anchoring. Stratifying by training level, students maintained a significant difference between Version A and B medical scenarios (Version A, M = 9.83, SD = 3.75; Version B, M = 9.00, SD = 3.98), t(465) = 2.29, p = .02, but not residents or attendings. Stratifying by heuristic and training level, availability maintained significance for students (Version A, M = 7.28, SD = 3.46; Version B, M = 5.82, SD = 3.22), t(153) = 2.67, p = .008, and residents (Version A, M = 7.19, SD = 3.24; Version B, M = 5.56, SD = 2.72), t(77) = 2.32, p = .02, but not attendings. Authors developed an instrument to isolate and quantify bias produced by the availability and representativeness heuristics, and illustrated the utility of their instrument by demonstrating decreased heuristic bias within medical contexts at higher training levels.
Triplet supertree heuristics for the tree of life
Lin, Harris T; Burleigh, J Gordon; Eulenstein, Oliver
2009-01-01
Background There is much interest in developing fast and accurate supertree methods to infer the tree of life. Supertree methods combine smaller input trees with overlapping sets of taxa to make a comprehensive phylogenetic tree that contains all of the taxa in the input trees. The intrinsically hard triplet supertree problem takes a collection of input species trees and seeks a species tree (supertree) that maximizes the number of triplet subtrees that it shares with the input trees. However, the utility of this supertree problem has been limited by a lack of efficient and effective heuristics. Results We introduce fast hill-climbing heuristics for the triplet supertree problem that perform a step-wise search of the tree space, where each step is guided by an exact solution to an instance of a local search problem. To realize time-efficient heuristics we designed the first nontrivial algorithms for two standard search problems, which greatly improve on the time complexity of the best known (naïve) solutions by factors of n and n² (where n is the number of taxa in the supertree). These algorithms enable large-scale supertree analyses based on the triplet supertree problem that were previously not possible. We implemented hill-climbing heuristics that are based on our new algorithms, and in analyses of two published supertree data sets, we demonstrate that our new heuristics outperform other standard supertree methods in maximizing the number of triplets shared with the input trees. Conclusion With our new heuristics, the triplet supertree problem is now computationally more tractable for large-scale supertree analyses, and it provides a potentially more accurate alternative to existing supertree methods. PMID:19208181
NASA Astrophysics Data System (ADS)
Shiri, Jalal
2018-06-01
Among different reference evapotranspiration (ETo) modeling approaches, mass transfer-based methods have been less studied. These approaches utilize temperature and wind speed records. On the other hand, the empirical equations proposed in this context generally produce weak simulations, except when a local calibration is used for improving their performance. This might be a crucial drawback for those equations in case of local data scarcity for calibration procedure. So, application of heuristic methods can be considered as a substitute for improving the performance accuracy of the mass transfer-based approaches. However, given that the wind speed records have usually higher variation magnitudes than the other meteorological parameters, application of a wavelet transform for coupling with heuristic models would be necessary. In the present paper, a coupled wavelet-random forest (WRF) methodology was proposed for the first time to improve the performance accuracy of the mass transfer-based ETo estimation approaches using cross-validation data management scenarios in both local and cross-station scales. The obtained results revealed that the new coupled WRF model (with the minimum scatter index values of 0.150 and 0.192 for local and external applications, respectively) improved the performance accuracy of the single RF models as well as the empirical equations to great extent.
The normalization heuristic: an untested hypothesis that may misguide medical decisions.
Aberegg, Scott K; O'Brien, James M
2009-06-01
Medical practice is increasingly informed by the evidence from randomized controlled trials. When such evidence is not available, clinical hypotheses based on pathophysiological reasoning and common sense guide clinical decision making. One commonly utilized general clinical hypothesis is the assumption that normalizing abnormal laboratory values and physiological parameters will lead to improved patient outcomes. We refer to the general use of this clinical hypothesis to guide medical therapeutics as the "normalization heuristic". In this paper, we operationally define this heuristic and discuss its limitations as a rule of thumb for clinical decision making. We review historical and contemporaneous examples of normalization practices as empirical evidence for the normalization heuristic and to highlight its frailty as a guide for clinical decision making.
Rieger, Marc Oliver; Wang, Mei
2008-01-01
Comments on the article by E. Brandstätter, G. Gigerenzer, and R. Hertwig. The authors discuss the priority heuristic, a recent model for decisions under risk. They reanalyze the experimental validity of this approach and discuss how these results compare with cumulative prospect theory, the currently most established model in behavioral economics. They also discuss how general models for decisions under risk based on a heuristic approach can be understood mathematically to gain some insight in their limitations. They finally consider whether the priority heuristic model can lead to some understanding of the decision process of individuals or whether it is better seen as an as-if model. (c) 2008 APA, all rights reserved
Fourth Graders' Heuristic Problem-Solving Behavior.
ERIC Educational Resources Information Center
Lee, Kil S.
1982-01-01
Eight boys and eight girls from a rural elementary school participated in the investigation. Specific heuristics were adopted from Polya; and the students selected represented two substages of Piaget's concrete operational stage. Five hypotheses were generated, based on observed results and the study's theoretical rationale. (MP)
Operational Planning of Channel Airlift Missions Using Forecasted Demand
2013-03-01
tailored to the specific problem (Metaheuristics, 2005). As seen in the section Cargo Loading Algorithm, heuristic methods are often iterative... that are equivalent to the forecasted cargo amount. The simulated pallets are then used in a heuristic cargo loading algorithm. The loading... algorithm places cargo onto available aircraft (based on real schedules) given the date and the destination and outputs statistics based on the aircraft ton
A Simulation of Readiness-Based Sparing Policies
2017-06-01
variant of a greedy heuristic algorithm to set stock levels and estimate overall WS availability. Our discrete event simulation is then used to test the... available in the optimization tools.
A heuristic re-mapping algorithm reducing inter-level communication in SAMR applications.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Steensland, Johan; Ray, Jaideep
2003-07-01
This paper aims at decreasing execution time for large-scale structured adaptive mesh refinement (SAMR) applications by proposing a new heuristic re-mapping algorithm and experimentally showing its effectiveness in reducing inter-level communication. Tests were done for five different SAMR applications. The overall goal is to engineer a dynamically adaptive meta-partitioner capable of selecting and configuring the most appropriate partitioning strategy at run-time based on current system and application state. Such a meta-partitioner can significantly reduce execution times for general SAMR applications. Computer simulations of physical phenomena are becoming increasingly popular as they constitute an important complement to real-life testing. In many cases, such simulations are based on solving partial differential equations by numerical methods. Adaptive methods are crucial to efficiently utilize computer resources such as memory and CPU. But even with adaption, the simulations are computationally demanding and yield huge data sets. Thus parallelization and the efficient partitioning of data become issues of utmost importance. Adaption causes the workload to change dynamically, calling for dynamic (re-) partitioning to maintain efficient resource utilization. The proposed heuristic algorithm reduced inter-level communication substantially. Since the complexity of the proposed algorithm is low, this decrease comes at a relatively low cost. As a consequence, we draw the conclusion that the proposed re-mapping algorithm would be useful to lower overall execution times for many large SAMR applications. Due to its usefulness and its parameterization, the proposed algorithm would constitute a natural and important component of the meta-partitioner.
Neural basis of scientific innovation induced by heuristic prototype.
Luo, Junlong; Li, Wenfu; Qiu, Jiang; Wei, Dongtao; Liu, Yijun; Zhang, Qinlin
2013-01-01
A number of major inventions in history have been based on bionic imitation. Heuristics, by applying biological systems to the creation of artificial devices and machines, might be one of the most critical processes in scientific innovation. In particular, prototype heuristics proposes that innovation may engage automatic activation of a prototype, such as a biological system, to form novel associations between a prototype's function and problem-solving. We speculated that the cortical dissociation between the automatic activation and the forming of novel associations in innovation is a critical point for heuristic creativity. In the present study, novel and old scientific innovations (NSI and OSI) were selected as experimental materials in a learning-testing paradigm to explore the neural basis of scientific innovation induced by heuristic prototype. College students were required to resolve NSI problems (to which they did not know the answers) and OSI problems (to which they knew the answers). From two fMRI experiments, our results showed that the subjects could resolve NSI when provided with heuristic prototypes. In Experiment 1, it was found that the lingual gyrus (LG; BA18) might be related to prototype heuristics in college students resolving NSI after learning a related prototype. In Experiment 2, the LG (BA18) and precuneus (BA31) were significantly activated for NSI compared to OSI when college students learned all prototypes one day before the test. In addition, the mean beta-values of these brain regions for NSI were all correlated with the behavioral accuracy of NSI. As our hypothesis indicated, the findings suggested that the LG might be involved in forming novel associations using heuristic information, while the precuneus might be involved in the automatic activation of the heuristic prototype during scientific innovation.
Neural Basis of Scientific Innovation Induced by Heuristic Prototype
Qiu, Jiang; Wei, Dongtao; Liu, Yijun; Zhang, Qinlin
2013-01-01
A number of major inventions in history have been based on bionic imitation. Heuristics, by applying biological systems to the creation of artificial devices and machines, might be one of the most critical processes in scientific innovation. In particular, prototype heuristics proposes that innovation may engage automatic activation of a prototype, such as a biological system, to form novel associations between a prototype's function and problem-solving. We speculated that the cortical dissociation between the automatic activation and the forming of novel associations in innovation is a critical point for heuristic creativity. In the present study, novel and old scientific innovations (NSI and OSI) were selected as experimental materials in a learning-testing paradigm to explore the neural basis of scientific innovation induced by heuristic prototype. College students were required to resolve NSI problems (to which they did not know the answers) and OSI problems (to which they knew the answers). From two fMRI experiments, our results showed that the subjects could resolve NSI when provided with heuristic prototypes. In Experiment 1, it was found that the lingual gyrus (LG; BA18) might be related to prototype heuristics in college students resolving NSI after learning a related prototype. In Experiment 2, the LG (BA18) and precuneus (BA31) were significantly activated for NSI compared to OSI when college students learned all prototypes one day before the test. In addition, the mean beta-values of these brain regions for NSI were all correlated with the behavioral accuracy of NSI. As our hypothesis indicated, the findings suggested that the LG might be involved in forming novel associations using heuristic information, while the precuneus might be involved in the automatic activation of the heuristic prototype during scientific innovation. PMID:23372641
Khader, Patrick H; Pachur, Thorsten; Meier, Stefanie; Bien, Siegfried; Jost, Kerstin; Rösler, Frank
2011-11-01
Many of our daily decisions are memory based, that is, the attribute information about the decision alternatives has to be recalled. Behavioral studies suggest that for such decisions we often use simple strategies (heuristics) that rely on controlled and limited information search. It is assumed that these heuristics simplify decision-making by activating long-term memory representations of only those attributes that are necessary for the decision. However, from behavioral studies alone, it is unclear whether using heuristics is indeed associated with limited memory search. The present study tested this assumption by monitoring the activation of specific long-term-memory representations with fMRI while participants made memory-based decisions using the "take-the-best" heuristic. For different decision trials, different numbers and types of information had to be retrieved and processed. The attributes consisted of visual information known to be represented in different parts of the posterior cortex. We found that the amount of information required for a decision was mirrored by a parametric activation of the dorsolateral PFC. Such a parametric pattern was also observed in all posterior areas, suggesting that activation was not limited to those attributes required for a decision. However, the posterior increases were systematically modulated by the relative importance of the information for making a decision. These findings suggest that memory-based decision-making is mediated by the dorsolateral PFC, which selectively controls posterior storage areas. In addition, the systematic modulations of the posterior activations indicate a selective boosting of activation of decision-relevant attributes.
Welch, Brandon; Brinda, FNU
2017-01-01
Background Telemedicine is the use of technology to provide and support health care when distance separates the clinical service and the patient. Home-based telemedicine systems involve the use of such technology for medical support and care, connecting the patient, from the comfort of their home, with the clinician. In order for such a system to be used extensively, it is necessary to understand not only the issues faced by patients in using it but also those faced by the clinician. Objectives The aim of this study was to conduct a heuristic evaluation of 4 telemedicine software platforms—Doxy.me, Polycom, Vidyo, and VSee—to assess possible problems and limitations that could affect the usability of the system from the clinician’s perspective. Methods Five experts individually evaluated all four systems using Nielsen’s list of heuristics, classifying the issues based on a severity rating scale. Results A total of 46 unique problems were identified by the experts. The heuristics most frequently violated were visibility of system status and error prevention, each accounting for 24% (11/46) of the issues. Esthetic and minimalist design was second, contributing 13% (6/46) of the issues. Conclusions Heuristic evaluation coupled with a severity rating scale was found to be an effective method for identifying problems with the systems. Prioritization of these problems based on the rating provides a good starting point for resolving the issues affecting these platforms. There is a need for better transparency and a more streamlined approach to how physicians use telemedicine systems. Visibility of the system status and speaking the users’ language are key to achieving this. PMID:28438724
ERIC Educational Resources Information Center
McDonough, Ian M.; Gallo, David A.
2008-01-01
Retrieval monitoring enhances episodic memory accuracy. For instance, false recognition is reduced when participants base their decisions on more distinctive recollections, a retrieval monitoring process called the distinctiveness heuristic. The experiments reported here tested the hypothesis that autobiographical elaboration during study (i.e.,…
NASA Astrophysics Data System (ADS)
Garcia-Santiago, C. A.; Del Ser, J.; Upton, C.; Quilligan, F.; Gil-Lopez, S.; Salcedo-Sanz, S.
2015-11-01
When seeking near-optimal solutions for complex scheduling problems, meta-heuristics demonstrate good performance with affordable computational effort. This has resulted in a gravitation towards these approaches when researching industrial use-cases such as energy-efficient production planning. However, much of the previous research makes assumptions about softer constraints that affect planning strategies and about how human planners interact with the algorithm in a live production environment. This article describes a job-shop problem that focuses on minimizing energy consumption across a production facility of shared resources. The application scenario is based on real facilities made available by the Irish Center for Manufacturing Research. The formulated problem is tackled via harmony search heuristics with random keys encoding. Simulation results are compared to a genetic algorithm, a simulated annealing approach and a first-come-first-served scheduling. The superior performance obtained by the proposed scheduler paves the way towards its practical implementation over industrial production chains.
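The random-keys encoding mentioned above maps a real-valued vector, the representation a meta-heuristic such as harmony search actually manipulates, onto a discrete schedule. A minimal sketch under simplified assumptions (a permutation flow shop rather than the shared-resource, energy-aware job shop of the article): job indices sorted by their key values give a sequence, whose makespan is then evaluated.

```python
import random

def decode_random_keys(keys):
    """Decode a real-valued key vector into a job permutation:
    jobs are sequenced in increasing order of their key values."""
    return sorted(range(len(keys)), key=lambda j: keys[j])

def flowshop_makespan(sequence, proc_times):
    """Makespan of a permutation flow shop.
    proc_times[j][m] = processing time of job j on machine m."""
    n_machines = len(proc_times[0])
    completion = [0.0] * n_machines   # completion time of the last job per machine
    for j in sequence:
        for m in range(n_machines):
            ready = completion[m - 1] if m > 0 else 0.0   # job finished on previous machine
            completion[m] = max(completion[m], ready) + proc_times[j][m]
    return completion[-1]

if __name__ == "__main__":
    rng = random.Random(42)
    proc = [[rng.randint(1, 9) for _ in range(3)] for _ in range(5)]  # 5 jobs, 3 machines (toy data)
    keys = [rng.random() for _ in range(5)]   # a candidate solution from a meta-heuristic
    seq = decode_random_keys(keys)
    print("sequence:", seq, "makespan:", flowshop_makespan(seq, proc))
```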
Katz, David; Detsky, Allan S
2016-02-01
This Perspective proposes the introduction of metacognition (thinking about thinking) into the existing format of hospital-based morbidity and mortality rounds. It is placed in the context of historical movements to advance quality improvement by expanding the spectrum of the causes of medical error from systems-based issues to flawed human decision-making capabilities. We suggest that the current approach that focuses on systems-based issues can be improved by exploiting the opportunities to educate physicians about predictable errors committed by reliance on cognitive heuristics. In addition, because the field of educating clinicians about cognitive heuristics has shown mixed results, this proposal represents fertile ground for further research. Educating clinicians about cognitive heuristics may improve metacognition and perhaps be the next frontier in quality improvement. © 2015 Society of Hospital Medicine.
NASA Astrophysics Data System (ADS)
de O. Rocha, Helder R.; Castellani, Carlos E. S.; Silva, Jair A. L.; Pontes, Maria J.; Segatto, Marcelo E. V.
2015-01-01
We report a simple budget heuristic for a fast optimization of multipump Raman amplifiers based on the reallocation of the pump wavelengths and the optical powers. A set of different optical fibers are analyzed as the Raman gain medium, and a four-pump amplifier setup is optimized for each of them in order to achieve ripples close to 1 dB and gains up to 20 dB in the C band. Later, a comparison between our proposed heuristic and a multiobjective optimization based on a nondominated sorting genetic algorithm is made, highlighting the fact that our new approach can give similar solutions after at least an order of magnitude fewer iterations. The results shown in this paper can potentially pave the way for real-time optimization of multipump Raman amplifier systems.
Solving Inverse Kinematics of Robot Manipulators by Means of Meta-Heuristic Optimisation
NASA Astrophysics Data System (ADS)
Wichapong, Kritsada; Bureerat, Sujin; Pholdee, Nantiwat
2018-05-01
This paper presents the use of meta-heuristic algorithms (MHs) for solving the inverse kinematics of robot manipulators based on forward kinematics. The design variables are the joint angular displacements used to move the robot end-effector to a target in Cartesian space, and the design problem is posed as minimizing the error between the target points and the positions of the robot end-effector. The problem is a dynamic one, as the target points are continually changed by the robot user. Several well-established MHs are used to solve the problem, and the results obtained with the different meta-heuristics are compared based on the end-effector error and the searching speed of the algorithms. From the study, the best performer is identified and set as the baseline for future development of MH-based inverse kinematics solving.
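As a concrete illustration of the approach described above, the sketch below uses one meta-heuristic (SciPy's differential evolution, standing in for the several MHs compared in the paper) to recover the joint angles of a hypothetical two-link planar arm by minimising the squared distance between the forward-kinematics position of the end-effector and a target point; the link lengths, joint bounds, and target are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import differential_evolution

L1, L2 = 1.0, 0.8              # assumed link lengths of a 2-link planar arm
TARGET = np.array([1.2, 0.9])  # assumed Cartesian target for the end-effector

def forward_kinematics(theta):
    """End-effector position of the planar arm for joint angles theta (radians)."""
    t1, t2 = theta
    x = L1 * np.cos(t1) + L2 * np.cos(t1 + t2)
    y = L1 * np.sin(t1) + L2 * np.sin(t1 + t2)
    return np.array([x, y])

def end_effector_error(theta):
    """Objective: squared distance between the end-effector and the target."""
    return float(np.sum((forward_kinematics(theta) - TARGET) ** 2))

if __name__ == "__main__":
    bounds = [(-np.pi, np.pi), (-np.pi, np.pi)]   # assumed joint limits
    result = differential_evolution(end_effector_error, bounds, seed=1, tol=1e-10)
    print("joint angles (rad):", np.round(result.x, 4))
    print("residual error:", result.fun)
```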
Neural substrates of similarity and rule-based strategies in judgment
von Helversen, Bettina; Karlsson, Linnea; Rasch, Björn; Rieskamp, Jörg
2014-01-01
Making accurate judgments is a core human competence and a prerequisite for success in many areas of life. Plenty of evidence exists that people can employ different judgment strategies to solve identical judgment problems. In categorization, it has been demonstrated that similarity-based and rule-based strategies are associated with activity in different brain regions. Building on this research, the present work tests whether solving two identical judgment problems recruits different neural substrates depending on people's judgment strategies. Combining cognitive modeling of judgment strategies at the behavioral level with functional magnetic resonance imaging (fMRI), we compare brain activity when using two archetypal judgment strategies: a similarity-based exemplar strategy and a rule-based heuristic strategy. Using an exemplar-based strategy should recruit areas involved in long-term memory processes to a larger extent than a heuristic strategy. In contrast, using a heuristic strategy should recruit areas involved in the application of rules to a larger extent than an exemplar-based strategy. Largely consistent with our hypotheses, we found that using an exemplar-based strategy led to relatively higher BOLD activity in the anterior prefrontal and inferior parietal cortex, presumably related to retrieval and selective attention processes. In contrast, using a heuristic strategy led to relatively higher activity in areas in the dorsolateral prefrontal and the temporal-parietal cortex associated with cognitive control and information integration. Thus, even when people solve identical judgment problems, different neural substrates can be recruited depending on the judgment strategy involved. PMID:25360099
2013-01-01
Background Elective patient admission and assignment planning is an important task of the strategic and operational management of a hospital and early on became a central topic of clinical operations research. The management of hospital beds is an important subtask. Various approaches have been proposed, involving the computation of efficient assignments with regard to the patients’ condition, the necessity of the treatment, and the patients’ preferences. However, these approaches are mostly based on static, unadaptable estimates of the length of stay and, thus, do not take into account the uncertainty of the patient’s recovery. Furthermore, the effect of aggregated bed capacities have not been investigated in this context. Computer supported bed management, combining an adaptable length of stay estimation with the treatment of shared resources (aggregated bed capacities) has not yet been sufficiently investigated. The aim of our work is: 1) to define a cost function for patient admission taking into account adaptable length of stay estimations and aggregated resources, 2) to define a mathematical program formally modeling the assignment problem and an architecture for decision support, 3) to investigate four algorithmic methodologies addressing the assignment problem and one base-line approach, and 4) to evaluate these methodologies w.r.t. cost outcome, performance, and dismissal ratio. Methods The expected free ward capacity is calculated based on individual length of stay estimates, introducing Bernoulli distributed random variables for the ward occupation states and approximating the probability densities. The assignment problem is represented as a binary integer program. Four strategies for solving the problem are applied and compared: an exact approach, using the mixed integer programming solver SCIP; and three heuristic strategies, namely the longest expected processing time, the shortest expected processing time, and random choice. A baseline approach serves to compare these optimization strategies with a simple model of the status quo. All the approaches are evaluated by a realistic discrete event simulation: the outcomes are the ratio of successful assignments and dismissals, the computation time, and the model’s cost factors. Results A discrete event simulation of 226,000 cases shows a reduction of the dismissal rate compared to the baseline by more than 30 percentage points (from a mean dismissal ratio of 74.7% to 40.06% comparing the status quo with the optimization strategies). Each of the optimization strategies leads to an improved assignment. The exact approach has only a marginal advantage over the heuristic strategies in the model’s cost factors (≤3%). Moreover,this marginal advantage was only achieved at the price of a computational time fifty times that of the heuristic models (an average computing time of 141 s using the exact method, vs. 2.6 s for the heuristic strategy). Conclusions In terms of its performance and the quality of its solution, the heuristic strategy RAND is the preferred method for bed assignment in the case of shared resources. Future research is needed to investigate whether an equally marked improvement can be achieved in a large scale clinical application study, ideally one comprising all the departments involved in admission and assignment planning. PMID:23289448
Schmidt, Robert; Geisler, Sandra; Spreckelsen, Cord
2013-01-07
Elective patient admission and assignment planning is an important task of the strategic and operational management of a hospital and early on became a central topic of clinical operations research. The management of hospital beds is an important subtask. Various approaches have been proposed, involving the computation of efficient assignments with regard to the patients' condition, the necessity of the treatment, and the patients' preferences. However, these approaches are mostly based on static, unadaptable estimates of the length of stay and, thus, do not take into account the uncertainty of the patient's recovery. Furthermore, the effect of aggregated bed capacities have not been investigated in this context. Computer supported bed management, combining an adaptable length of stay estimation with the treatment of shared resources (aggregated bed capacities) has not yet been sufficiently investigated. The aim of our work is: 1) to define a cost function for patient admission taking into account adaptable length of stay estimations and aggregated resources, 2) to define a mathematical program formally modeling the assignment problem and an architecture for decision support, 3) to investigate four algorithmic methodologies addressing the assignment problem and one base-line approach, and 4) to evaluate these methodologies w.r.t. cost outcome, performance, and dismissal ratio. The expected free ward capacity is calculated based on individual length of stay estimates, introducing Bernoulli distributed random variables for the ward occupation states and approximating the probability densities. The assignment problem is represented as a binary integer program. Four strategies for solving the problem are applied and compared: an exact approach, using the mixed integer programming solver SCIP; and three heuristic strategies, namely the longest expected processing time, the shortest expected processing time, and random choice. A baseline approach serves to compare these optimization strategies with a simple model of the status quo. All the approaches are evaluated by a realistic discrete event simulation: the outcomes are the ratio of successful assignments and dismissals, the computation time, and the model's cost factors. A discrete event simulation of 226,000 cases shows a reduction of the dismissal rate compared to the baseline by more than 30 percentage points (from a mean dismissal ratio of 74.7% to 40.06% comparing the status quo with the optimization strategies). Each of the optimization strategies leads to an improved assignment. The exact approach has only a marginal advantage over the heuristic strategies in the model's cost factors (≤3%). Moreover,this marginal advantage was only achieved at the price of a computational time fifty times that of the heuristic models (an average computing time of 141 s using the exact method, vs. 2.6 s for the heuristic strategy). In terms of its performance and the quality of its solution, the heuristic strategy RAND is the preferred method for bed assignment in the case of shared resources. Future research is needed to investigate whether an equally marked improvement can be achieved in a large scale clinical application study, ideally one comprising all the departments involved in admission and assignment planning.
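A minimal sketch of two ideas at the core of the study above: expected free ward capacity computed from Bernoulli 'still present' probabilities derived from individual length-of-stay estimates, and a simple longest-expected-processing-time-first assignment heuristic. The geometric discharge model, ward data, and admission data are illustrative assumptions; the actual study formulates a binary integer program, compares several strategies, and evaluates them with a discrete event simulation.

```python
def prob_still_present(expected_los_days, days_elapsed):
    """Toy discharge model: assume a geometric daily discharge probability of
    1/expected_los, so P(still present after d days) = (1 - 1/los)**d."""
    p_discharge = 1.0 / max(expected_los_days, 1.0)
    return (1.0 - p_discharge) ** days_elapsed

def expected_free_capacity(ward_capacity, current_patients, horizon_days):
    """Expected free beds at the planning horizon: capacity minus the sum of
    each current patient's probability of still occupying a bed."""
    expected_occupied = sum(
        prob_still_present(los, elapsed + horizon_days)
        for los, elapsed in current_patients
    )
    return ward_capacity - expected_occupied

def lept_assignment(wards, admissions, horizon_days=1):
    """Longest-expected-processing-time-first heuristic: admit patients with the
    longest expected stay first, to the ward with the most expected free beds."""
    assignments, dismissals = [], []
    free = {
        name: expected_free_capacity(cap, patients, horizon_days)
        for name, (cap, patients) in wards.items()
    }
    for patient, los in sorted(admissions, key=lambda a: -a[1]):
        best = max(free, key=free.get)
        if free[best] >= 1.0:
            assignments.append((patient, best))
            free[best] -= 1.0
        else:
            dismissals.append(patient)
    return assignments, dismissals

if __name__ == "__main__":
    wards = {  # ward -> (bed capacity, [(expected LOS, days already elapsed), ...])
        "A": (4, [(5, 1), (3, 2), (7, 6)]),
        "B": (3, [(4, 3), (2, 1)]),
    }
    admissions = [("p1", 6), ("p2", 2), ("p3", 4)]  # (patient, expected LOS)
    print(lept_assignment(wards, admissions))
```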
A derived heuristics based multi-objective optimization procedure for micro-grid scheduling
NASA Astrophysics Data System (ADS)
Li, Xin; Deb, Kalyanmoy; Fang, Yanjun
2017-06-01
With the availability of different types of power generators to be used in an electric micro-grid system, their operation scheduling as the load demand changes with time becomes an important task. Besides satisfying load balance constraints and the generators' rated power, several other practicalities, such as limited availability of grid power and restricted ramping of power output from generators, must all be considered during the operation scheduling process, which makes it difficult to decide whether the optimization results are accurate and satisfactory. In solving such complex practical problems, heuristics-based customized optimization algorithms are suggested. However, due to nonlinear and complex interactions of variables, it is difficult to come up with heuristics in such problems off-hand. In this article, a two-step strategy is proposed in which the first task deciphers important heuristics about the problem and the second task utilizes the derived heuristics to solve the original problem in a computationally fast manner. Specifically, the operation scheduling is considered from a two-objective (cost and emission) point of view. The first task develops basic and advanced level knowledge bases offline from a series of prior demand-wise optimization runs and then the second task utilizes them to modify optimized solutions in an application scenario. Results for island and grid-connected modes and several pragmatic formulations of the micro-grid operation scheduling problem clearly indicate the merit of the proposed two-step procedure.
Community-aware task allocation for social networked multiagent systems.
Wang, Wanyuan; Jiang, Yichuan
2014-09-01
In this paper, we propose a novel community-aware task allocation model for social networked multiagent systems (SN-MASs), where each agent's cooperation domain is constrained to its community and each agent can negotiate only with its intracommunity member agents. Under such community-aware scenarios, we prove that it remains NP-hard to maximize overall system profit. To solve this problem effectively, we present a heuristic algorithm that is composed of three phases: 1) task selection: select the desirable task to be allocated preferentially; 2) allocation to community: allocate the selected task to communities based on a significant-task-first heuristic; and 3) allocation to agent: negotiate resources for the selected task based on a nonoverlap agent-first and breadth-first resource negotiation mechanism. Through theoretical analyses and experiments, the advantages of our heuristic algorithm and community-aware task allocation model are validated. 1) Our heuristic algorithm performs very closely to the benchmark exponential brute-force optimal algorithm and the network flow-based greedy algorithm in terms of overall system profit in small-scale applications. Moreover, in large-scale applications, the presented heuristic algorithm achieves approximately the same overall system profit but significantly reduces the computational load compared with the greedy algorithm. 2) Our community-aware task allocation model reduces the system communication cost compared with the previous global-aware task allocation model and greatly improves the overall system profit compared with the previous local neighbor-aware task allocation model.
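A much-simplified sketch of the three-phase heuristic summarised above: tasks are picked in order of profit (a stand-in for the significant-task-first rule), each task is allocated to a community whose members jointly hold enough free resources, and resources are then drawn from that community's agents. The community structure, resource model, and negotiation order are illustrative assumptions; in particular, the nonoverlap agent-first and breadth-first negotiation of the original algorithm is reduced here to a plain scan over agents.

```python
def allocate_tasks(communities, tasks):
    """communities: {community: {agent: free resource units}}
    tasks: [(task name, resource requirement, profit)]
    Returns the allocation plan and the total profit obtained."""
    plan, total_profit = [], 0
    # Phase 1: task selection -- most profitable task first.
    for task, need, profit in sorted(tasks, key=lambda t: -t[2]):
        # Phase 2: allocation to community -- first community that can cover the need.
        chosen = next(
            (c for c, agents in communities.items() if sum(agents.values()) >= need),
            None,
        )
        if chosen is None:
            continue  # no single community can serve this task
        # Phase 3: allocation to agents -- draw resources from that community's members.
        remaining = need
        for agent, free in communities[chosen].items():
            take = min(free, remaining)
            if take > 0:
                communities[chosen][agent] -= take
                plan.append((task, chosen, agent, take))
                remaining -= take
            if remaining == 0:
                break
        total_profit += profit
    return plan, total_profit

if __name__ == "__main__":
    communities = {"C1": {"a1": 3, "a2": 2}, "C2": {"a3": 4}}
    tasks = [("t1", 4, 10), ("t2", 3, 6), ("t3", 5, 20)]
    print(allocate_tasks(communities, tasks))
```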
Three hybridization models based on local search scheme for job shop scheduling problem
NASA Astrophysics Data System (ADS)
Balbi Fraga, Tatiana
2015-05-01
This work presents three different hybridization models based on the general schema of local search heuristics, named Hybrid Successive Application, Hybrid Neighborhood, and Hybrid Improved Neighborhood. Although similar approaches may already have been presented in the literature in other contexts, in this work these models are applied to analyze the solution of the job shop scheduling problem with the heuristics Taboo Search and Particle Swarm Optimization. In addition, we investigate some aspects that must be considered in order to achieve better solutions than those obtained by the original heuristics. The results demonstrate that the algorithms derived from these three hybrid models are more robust than the original algorithms and able to obtain better results than those found by Taboo Search alone.
NASA Astrophysics Data System (ADS)
Kumar, Ravi; Singh, Surya Prakash
2017-11-01
The dynamic cellular facility layout problem (DCFLP) is a well-known NP-hard problem. It has been estimated that the efficient design of the DCFLP reduces the manufacturing cost of products by maintaining the minimum material flow among all machines in all cells, as the material flow contributes around 10-30% of the total product cost. However, being NP-hard, the DCFLP is very difficult to solve optimally in reasonable time. Therefore, this article proposes a novel similarity-score-based two-phase heuristic approach to solve the DCFLP optimally, considering multiple products manufactured over multiple time periods in the layout. In the first phase of the proposed heuristic, machine-cell clusters are created based on similarity scores between machines. These are provided as input to the second phase, which minimizes inter/intracell material handling costs and rearrangement costs over the entire planning period. The solution methodology of the proposed approach is demonstrated. To show the efficiency of the two-phase heuristic approach, 21 instances are generated and solved using the optimization software package LINGO. The results show that the proposed approach can solve the DCFLP optimally in reasonable time.
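A small sketch of the first phase described above, on assumed data: a similarity score between machines is computed from the sets of products each machine processes (Jaccard similarity is used here as a plausible choice; the article's exact score may differ), and machines are then greedily merged into cells up to a maximum cell size.

```python
from itertools import combinations

def jaccard(a, b):
    """Similarity between two machine groups = overlap of the product sets they process."""
    return len(a & b) / len(a | b) if a | b else 0.0

def cluster_machines(machine_products, max_cell_size=2):
    """Greedy phase-1 clustering: repeatedly merge the most similar pair of cells."""
    cells = [({m}, set(p)) for m, p in machine_products.items()]
    while True:
        best = None
        for (i, (mi, pi)), (j, (mj, pj)) in combinations(enumerate(cells), 2):
            if len(mi) + len(mj) <= max_cell_size:
                score = jaccard(pi, pj)
                if best is None or score > best[0]:
                    best = (score, i, j)
        if best is None or best[0] == 0.0:
            break
        _, i, j = best
        merged = (cells[i][0] | cells[j][0], cells[i][1] | cells[j][1])
        cells = [c for k, c in enumerate(cells) if k not in (i, j)] + [merged]
    return [sorted(c[0]) for c in cells]

if __name__ == "__main__":
    machine_products = {   # machine -> products routed through it (assumed data)
        "M1": {"P1", "P2"}, "M2": {"P1", "P2", "P3"},
        "M3": {"P4"},       "M4": {"P3", "P4"},
    }
    print(cluster_machines(machine_products, max_cell_size=2))
```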
Katapodi, Maria C; Dodd, Marylin J; Facione, Noreen C; Humphreys, Janice C; Lee, Kathryn A
2010-01-01
Perceived risk of a health problem is formed by inferential rules called heuristics and by comparative judgments that assess how one's risk compares to the risk of others. The purpose of this cross-sectional, community-based survey was to examine how experiences with breast cancer, knowledge of risk factors, and specific heuristics inform breast cancer risk judgments for oneself, for friends/peers, and comparative judgments (risk friends/peers - risk self). We recruited an English-speaking, multicultural (57% nonwhite) sample of 184 middle-aged (47 ± 12 years old), well-educated women. Fifty percent of participants perceived that their breast cancer risk was the same as the risk of their friends/peers; 10% were pessimistic (risk friends/peers - risk self < 0), whereas 40% were optimistic (risk friends/peers - risk self > 0). Family history of breast cancer and worry informed risk judgments for oneself. The availability and cultural heuristics specific to black women informed risk judgments for friends/peers. Knowledge of risk factors and interactions of knowledge with the availability, representativeness, and simulation heuristics informed comparative judgments (risk friends/peers - risk self). We discuss the cognitive mechanisms by which experiences, knowledge, and heuristics influence comparative breast cancer risk judgments. Risk communication interventions should assess knowledge deficits, contextual variables, and specific heuristics that activate differential information processing mechanisms.
Heuristic Reasoning in Chemistry: Making decisions about acid strength
NASA Astrophysics Data System (ADS)
McClary, LaKeisha; Talanquer, Vicente
2011-07-01
The characterization of students' reasoning strategies is of central importance in the development of instructional strategies that foster meaningful learning. In particular, the identification of shortcut reasoning procedures (heuristics) used by students to reduce cognitive load can help us devise strategies to facilitate the development of more analytical ways of thinking. The central goal of this qualitative study was thus to investigate heuristic reasoning as used by organic chemistry college students, focusing our attention on their ability to predict the relative acid strength of chemical compounds represented using explicit composition and structural features (i.e., structural formulas). Our results indicated that many study participants relied heavily on one or more of the following heuristics to make most of their decisions: reduction, representativeness, and lexicographic. Despite having visual access to rich structural information about the substances included in each ranking task, many students relied on isolated composition features to make their decisions. However, the specific characteristics of the tasks seemed to trigger heuristic reasoning in different ways. Although the use of heuristics allowed students to simplify some components of the ranking tasks and generate correct responses, it often led them astray. Very few study participants predicted the correct trends based on scientifically acceptable arguments. Our results suggest the need for instructional interventions that explicitly develop college chemistry students' abilities to monitor their thinking and evaluate the effectiveness of analytical versus heuristic reasoning strategies in different contexts.
Putting cognitive psychology to work: Improving decision-making in the medical encounter.
Schwab, Abraham P
2008-12-01
Empirical research in social psychology has provided robust support for the accuracy of the heuristics and biases approach to human judgment. This research, however, has not been systematically investigated regarding its potential applications for specific health care decision-makers. This paper makes the case for investigating the heuristics and biases approach in the patient-physician relationship and recommends strategic empirical research. It is argued that research will be valuable for particular decisions in the clinic and for examining and altering the background conditions of patient and physician decision-making.
Pieterse, Arwen H; de Vries, Marieke
2013-09-01
Increasingly, patient decision aids and values clarification methods (VCMs) are being developed to support patients in making preference-sensitive health-care decisions. Many VCMs encourage extensive deliberation about options, without solid theoretical or empirical evidence showing that deliberation is advantageous. Research suggests that simple, fast and frugal heuristic decision strategies sometimes result in better judgments and decisions. Durand et al. have developed two fast and frugal heuristic-based VCMs. To critically analyse the suitability of the 'take the best' (TTB) and 'tallying' fast and frugal heuristics in the context of patient decision making. Analysis of the structural similarities between the environments in which the TTB and tallying heuristics have been proven successful and the context of patient decision making and of the potential of these heuristic decision processes to support patient decision making. The specific nature of patient preference-sensitive decision making does not seem to resemble environments in which the TTB and tallying heuristics have proven successful. Encouraging patients to consider less rather than more relevant information potentially even deteriorates their values clarification process. Values clarification methods promoting the use of more intuitive decision strategies may sometimes be more effective. Nevertheless, we strongly recommend further theoretical thinking about the expected value of such heuristics and of other more intuitive decision strategies in this context, as well as empirical assessments of the mechanisms by which inducing such decision strategies may impact the quality and outcome of values clarification. © 2011 John Wiley & Sons Ltd.
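For readers unfamiliar with the two heuristics discussed above, the sketch below shows, for a hypothetical pair of treatment options described by binary attributes, how 'take the best' (TTB) stops at the first discriminating cue, inspected in order of cue validity, while 'tallying' simply counts positive cues; the cues, their ordering, and the options are illustrative assumptions, not the content of Durand et al.'s values clarification methods.

```python
def take_the_best(option_a, option_b, cues_by_validity):
    """Inspect cues from most to least valid; decide on the first cue that
    discriminates between the options; otherwise return None (guess)."""
    for cue in cues_by_validity:
        a, b = option_a[cue], option_b[cue]
        if a != b:
            return "A" if a > b else "B"
    return None

def tallying(option_a, option_b):
    """Count positive cues for each option and pick the option with more."""
    score_a, score_b = sum(option_a.values()), sum(option_b.values())
    if score_a == score_b:
        return None
    return "A" if score_a > score_b else "B"

if __name__ == "__main__":
    # 1 = attribute favourable, 0 = not favourable (hypothetical treatment options)
    option_a = {"effective": 1, "few side effects": 0, "short recovery": 1, "low cost": 0}
    option_b = {"effective": 1, "few side effects": 1, "short recovery": 0, "low cost": 1}
    cues_by_validity = ["effective", "few side effects", "short recovery", "low cost"]
    print("TTB choice:     ", take_the_best(option_a, option_b, cues_by_validity))
    print("Tallying choice:", tallying(option_a, option_b))
```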
Savoy, April; Patel, Himalaya; Flanagan, Mindy E; Weiner, Michael; Russ, Alissa L
2017-08-01
We assessed the usability of consultation order templates and identified problems to prioritize in design efforts for improving referral communication. With a sample of 26 consultation order templates, three evaluators performed a usability heuristic evaluation. The evaluation used 14 domain-independent heuristics and the following three supplemental references: 1 new domain-specific heuristic, 6 usability goals, and coded clinicians' statements regarding ease of use for 10 sampled templates. Evaluators found 201 violations, a mean of 7.7 violations per template. Minor violations outnumbered major violations almost twofold, 115 (57%) to 62 (31%). Approximately 68% of violations were linked to 5 heuristics: aesthetic and minimalist design (17%), error prevention (16%), consistency and standards (14%), recognition rather than recall (11%), and meet referrers' information needs (10%). Severe violations were attributed mostly to meet referrers' information needs and recognition rather than recall. Recorded violations yielded potential negative consequences for efficiency, effectiveness, safety, learnability, and utility. Evaluators and clinicians demonstrated 80% agreement in usability assessment. Based on frequency and severity of usability heuristic violations, the consultation order templates reviewed may impede clinical efficiency and risk patient safety. Results support the following design considerations: communicate consultants' requirements, facilitate information seeking, and support communication. While the most frequent heuristic violations involved interaction design and presentation, the most severe violations lacked information desired by referring clinicians. Violations related to templates' inability to support referring clinicians' information needs had the greatest potential negative impact on efficiency and safety usability goals. Heuristics should be prioritized in future design efforts.
Take the first heuristic, self-efficacy, and decision-making in sport.
Hepler, Teri J; Feltz, Deborah L
2012-06-01
Can taking the first (TTF) option in decision-making lead to the best decisions in sports contexts? And, is one's decision-making self-efficacy in that context linked to TTF decisions? The purpose of this study was to examine the role of the TTF heuristic and self-efficacy in decision-making on a simulated sports task. Undergraduate and graduate students (N = 72) participated in the study and performed 13 trials in each of two video-based basketball decision tasks. One task required participants to verbally generate options before making a final decision on what to do next, while the other task simply asked participants to make a decision regarding the next move as quickly as possible. Decision-making self-efficacy was assessed using a 10-item questionnaire comprising various aspects of decision-making in basketball. Participants also rated their confidence in the final decision. Results supported many of the tenets of the TTF heuristic, such that people used the heuristic on a majority of the trials (70%), earlier generated options were better than later ones, first options were meaningfully generated, and final options were meaningfully selected. Results did not support differences in dynamic inconsistency or decision confidence based on the number of options. Findings also supported the link between self-efficacy and the TTF heuristic. Participants with higher self-efficacy beliefs used TTF more frequently and generated fewer options than those with low self-efficacy. Thus, not only is TTF an important heuristic when making decisions in dynamic, time-pressure situations, but self-efficacy plays an influential role in TTF.
Tashkova, Katerina; Korošec, Peter; Silc, Jurij; Todorovski, Ljupčo; Džeroski, Sašo
2011-10-11
We address the task of parameter estimation in models of the dynamics of biological systems based on ordinary differential equations (ODEs) from measured data, where the models are typically non-linear and have many parameters, the measurements are imperfect due to noise, and the studied system can often be only partially observed. A representative task is to estimate the parameters in a model of the dynamics of endocytosis, i.e., endosome maturation, reflected in a cut-out switch transition between the Rab5 and Rab7 domain protein concentrations, from experimental measurements of these concentrations. The general parameter estimation task and the specific instance considered here are challenging optimization problems, calling for the use of advanced meta-heuristic optimization methods, such as evolutionary or swarm-based methods. We apply three global-search meta-heuristic algorithms for numerical optimization, i.e., differential ant-stigmergy algorithm (DASA), particle-swarm optimization (PSO), and differential evolution (DE), as well as a local-search derivative-based algorithm 717 (A717) to the task of estimating parameters in ODEs. We evaluate their performance on the considered representative task along a number of metrics, including the quality of reconstructing the system output and the complete dynamics, as well as the speed of convergence, both on real-experimental data and on artificial pseudo-experimental data with varying amounts of noise. We compare the four optimization methods under a range of observation scenarios, where data of different completeness and accuracy of interpretation are given as input. Overall, the global meta-heuristic methods (DASA, PSO, and DE) clearly and significantly outperform the local derivative-based method (A717). Among the three meta-heuristics, differential evolution (DE) performs best in terms of the objective function, i.e., reconstructing the output, and in terms of convergence. These results hold for both real and artificial data, for all observability scenarios considered, and for all amounts of noise added to the artificial data. In sum, the meta-heuristic methods considered are suitable for estimating the parameters in the ODE model of the dynamics of endocytosis under a range of conditions: With the model and conditions being representative of parameter estimation tasks in ODE models of biochemical systems, our results clearly highlight the promise of bio-inspired meta-heuristic methods for parameter estimation in dynamic system models within system biology.
2011-01-01
Background We address the task of parameter estimation in models of the dynamics of biological systems based on ordinary differential equations (ODEs) from measured data, where the models are typically non-linear and have many parameters, the measurements are imperfect due to noise, and the studied system can often be only partially observed. A representative task is to estimate the parameters in a model of the dynamics of endocytosis, i.e., endosome maturation, reflected in a cut-out switch transition between the Rab5 and Rab7 domain protein concentrations, from experimental measurements of these concentrations. The general parameter estimation task and the specific instance considered here are challenging optimization problems, calling for the use of advanced meta-heuristic optimization methods, such as evolutionary or swarm-based methods. Results We apply three global-search meta-heuristic algorithms for numerical optimization, i.e., differential ant-stigmergy algorithm (DASA), particle-swarm optimization (PSO), and differential evolution (DE), as well as a local-search derivative-based algorithm 717 (A717) to the task of estimating parameters in ODEs. We evaluate their performance on the considered representative task along a number of metrics, including the quality of reconstructing the system output and the complete dynamics, as well as the speed of convergence, both on real-experimental data and on artificial pseudo-experimental data with varying amounts of noise. We compare the four optimization methods under a range of observation scenarios, where data of different completeness and accuracy of interpretation are given as input. Conclusions Overall, the global meta-heuristic methods (DASA, PSO, and DE) clearly and significantly outperform the local derivative-based method (A717). Among the three meta-heuristics, differential evolution (DE) performs best in terms of the objective function, i.e., reconstructing the output, and in terms of convergence. These results hold for both real and artificial data, for all observability scenarios considered, and for all amounts of noise added to the artificial data. In sum, the meta-heuristic methods considered are suitable for estimating the parameters in the ODE model of the dynamics of endocytosis under a range of conditions: With the model and conditions being representative of parameter estimation tasks in ODE models of biochemical systems, our results clearly highlight the promise of bio-inspired meta-heuristic methods for parameter estimation in dynamic system models within systems biology. PMID:21989196
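The three meta-heuristics above are general-purpose optimizers; as a rough illustration of how differential evolution can be wired to an ODE parameter-estimation objective, the sketch below fits the two rate constants of a hypothetical two-state model (not the Rab5/Rab7 endocytosis model) to noisy pseudo-experimental data using SciPy.

```python
# A minimal sketch, assuming a toy two-state ODE with rate constants k1, k2.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import differential_evolution

def toy_model(t, y, k1, k2):
    # y[0] decays into y[1]; k1 and k2 are the unknown rate constants
    return [-k1 * y[0], k1 * y[0] - k2 * y[1]]

# Pseudo-experimental data: simulate with "true" parameters and add noise
rng = np.random.default_rng(0)
t_obs = np.linspace(0, 10, 25)
true_k = (0.8, 0.3)
sol = solve_ivp(toy_model, (0, 10), [1.0, 0.0], t_eval=t_obs, args=true_k)
y_obs = sol.y + rng.normal(scale=0.02, size=sol.y.shape)

def objective(params):
    # Sum of squared residuals between simulated and observed trajectories
    sim = solve_ivp(toy_model, (0, 10), [1.0, 0.0], t_eval=t_obs, args=tuple(params))
    return np.sum((sim.y - y_obs) ** 2)

result = differential_evolution(objective, bounds=[(0.01, 5.0), (0.01, 5.0)], seed=1)
print("estimated rate constants:", result.x)
```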
Nash, Mark S.; Cowan, Rachel E.; Kressler, Jochen
2012-01-01
Component and coalesced health risks of the cardiometabolic syndrome (CMS) are commonly reported in persons with spinal cord injuries (SCIs). These CMS hazards are also co-morbid with physical deconditioning and elevated pro-atherogenic inflammatory cytokines, both of which are common after SCI and worsen the prognosis for all-cause cardiovascular disease. This article describes a systematic procedure for individualized CMS risk assessment after SCI, and emphasizes evidence-based and intuition-centered countermeasures to disease. A unified approach will propose therapeutic lifestyle intervention as a routine plan for aggressive primary prevention in this risk-susceptible population. Customization of dietary and exercise plans then follow, identifying shortfalls in diet and activity patterns, and ways in which these healthy lifestyles can be more substantially embraced by both stakeholders with SCI and their health care providers. In cases where lifestyle intervention utilizing diet and exercise is unsuccessful in countering risks, available pharmacotherapies and a preferred therapeutic agent are proposed according to authoritative standards. The over-arching purpose of the monograph is to create an operational framework in which existing evidence-based approaches or heuristic modeling becomes best practice. In this way persons with SCI can lead more active and healthy lives. PMID:23031165
Case-based clinical reasoning in feline medicine: 1: Intuitive and analytical systems.
Canfield, Paul J; Whitehead, Martin L; Johnson, Robert; O'Brien, Carolyn R; Malik, Richard
2016-01-01
This is Article 1 of a three-part series on clinical reasoning that encourages practitioners to explore and understand how they think and make case-based decisions. It is hoped that, in the process, they will learn to trust their intuition but, at the same time, put in place safeguards to diminish the impact of bias and misguided logic on their diagnostic decision-making. This first article discusses the relative merits and shortcomings of System 1 thinking (immediate and unconscious) and System 2 thinking (effortful and analytical). Articles 2 and 3, to appear in the March and May 2016 issues of JFMS, respectively, will examine managing cognitive error, and use of heuristics (mental short cuts) and illness scripts in diagnostic reasoning. © The Author(s) 2016.
Common-sense chemistry: The use of assumptions and heuristics in problem solving
NASA Astrophysics Data System (ADS)
Maeyer, Jenine Rachel
Students experience difficulty learning and understanding chemistry at higher levels, often because of cognitive biases stemming from common sense reasoning constraints. These constraints can be divided into two categories: assumptions (beliefs held about the world around us) and heuristics (the reasoning strategies or rules used to build predictions and make decisions). A better understanding and characterization of these constraints are of central importance in the development of curriculum and teaching strategies that better support student learning in science. It was the overall goal of this thesis to investigate student reasoning in chemistry, specifically to better understand and characterize the assumptions and heuristics used by undergraduate chemistry students. To achieve this, two mixed-methods studies were conducted, each with quantitative data collected using a questionnaire and qualitative data gathered through semi-structured interviews. The first project investigated the reasoning heuristics used when ranking chemical substances based on the relative value of a physical or chemical property, while the second study characterized the assumptions and heuristics used when making predictions about the relative likelihood of different types of chemical processes. Our results revealed that heuristics for cue selection and decision-making played a significant role in the construction of answers during the interviews. Many study participants relied frequently on one or more of the following heuristics to make their decisions: recognition, representativeness, one-reason decision-making, and arbitrary trend. These heuristics allowed students to generate answers in the absence of requisite knowledge, but often led students astray. When characterizing assumptions, our results indicate that students relied on intuitive, spurious, and valid assumptions about the nature of chemical substances and processes in building their responses. In particular, many interviewees seemed to view chemical reactions as macroscopic reassembling processes where favorability was related to the perceived ease with which reactants broke apart or products formed. Students also expressed spurious chemical assumptions based on the misinterpretation and overgeneralization of periodicity and electronegativity. Our findings suggest the need to create more opportunities for college chemistry students to monitor their thinking, develop and apply analytical ways of reasoning, and evaluate the effectiveness of shortcut reasoning procedures in different contexts.
Clinical Case Studies in Psychoanalytic and Psychodynamic Treatment
Willemsen, Jochem; Della Rosa, Elena; Kegerreis, Sue
2017-01-01
This manuscript provides a review of the clinical case study within the field of psychoanalytic and psychodynamic treatment. The method has been contested for methodological reasons and because it would contribute to theoretical pluralism in the field. We summarize how the case study method is being applied in different schools of psychoanalysis, and we clarify the unique strengths of this method and areas for improvement. Finally, based on the literature and on our own experience with case study research, we come to formulate nine guidelines for future case study authors: (1) basic information to include, (2) clarification of the motivation to select a particular patient, (3) information about informed consent and disguise, (4) patient background and context of referral or self-referral, (5) patient's narrative, therapist's observations and interpretations, (6) interpretative heuristics, (7) reflexivity and counter-transference, (8) leaving room for interpretation, and (9) answering the research question, and comparison with other cases. PMID:28210235
Example-Based Learning in Heuristic Domains: A Cognitive Load Theory Account
ERIC Educational Resources Information Center
Renkl, Alexander; Hilbert, Tatjana; Schworm, Silke
2009-01-01
One classical instructional effect of cognitive load theory (CLT) is the worked-example effect. Although the vast majority of studies have focused on well-structured and algorithmic sub-domains of mathematics or physics, more recent studies have also analyzed learning with examples from complex domains in which only heuristic solution strategies…
The Heuristic Sandbox: Developing Teacher Know-How through Play in simSchool
ERIC Educational Resources Information Center
Hopper, Susan B.
2018-01-01
simSchool is a game-based, virtual, and interactive tool that allows pre-service teachers to acquire new skills while constructing knowledge through experimentation with learning situations. Pre-service teachers develop know-how--or heuristic knowledge--through repeated practice in the "Personality Plus Higher-Order Thinking" module to…
Balancing Self-Directed Learning with Expert Mentoring: The Science Writing Heuristic Approach
ERIC Educational Resources Information Center
Shelley, Mack; Fostvedt, Luke; Gonwa-Reeves, Christopher; Baenziger, Joan; McGill, Michael; Seefeld, Ashley; Hand, Brian; Therrien, William; Taylor, Jonte; Villanueva, Mary Grace
2012-01-01
This study focuses on the implementation of the Science Writing Heuristic (SWH) curriculum (Hand, 2007), which combines current understandings of learning as a cognitive and negotiated process with the techniques of argument-based inquiry, critical thinking skills, and writing to strengthen student outcomes. Success of SWH is dependent on the…
One-Reason Decision Making Unveiled: A Measurement Model of the Recognition Heuristic
ERIC Educational Resources Information Center
Hilbig, Benjamin E.; Erdfelder, Edgar; Pohl, Rudiger F.
2010-01-01
The fast-and-frugal recognition heuristic (RH) theory provides a precise process description of comparative judgments. It claims that, in suitable domains, judgments between pairs of objects are based on recognition alone, whereas further knowledge is ignored. However, due to the confound between recognition and further knowledge, previous…
Web-Based Family Life Education: Spotlight on User Experience
ERIC Educational Resources Information Center
Doty, Jennifer; Doty, Matthew; Dworkin, Jodi
2011-01-01
Family Life Education (FLE) websites can benefit from the field of user experience, which makes technology easy to use. A heuristic evaluation of five FLE sites was performed using Nielsen's heuristics, guidelines for making sites user friendly. Greater site complexity resulted in more potential user problems. Sites most frequently had problems…
Augmented neural networks and problem structure-based heuristics for the bin-packing problem
NASA Astrophysics Data System (ADS)
Kasap, Nihat; Agarwal, Anurag
2012-08-01
In this article, we report on a research project where we applied the augmented-neural-networks (AugNN) approach to solving the classical bin-packing problem (BPP). AugNN is a metaheuristic that combines a priority rule heuristic with the iterative search approach of neural networks to generate good solutions quickly. This is the first time this approach has been applied to the BPP. We also propose a decomposition approach for solving harder BPP instances, in which subproblems are solved using a combination of the AugNN approach and heuristics that exploit the problem structure. We discuss the characteristics of problems to which such problem structure-based heuristics can be applied. We empirically show the effectiveness of the AugNN and the decomposition approach on many benchmark problems in the literature. Of the 1210 benchmark problems tested, 917 were solved to optimality; the average gap between the obtained solution and the upper bound across all problems was under 0.66%, and computation time averaged below 33 s per problem. We also discuss the computational complexity of our approach.
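The AugNN metaheuristic itself is not reproduced here; as a point of reference, the sketch below shows the kind of simple priority-rule heuristic (first-fit decreasing) that such approaches build on, with hypothetical item sizes.

```python
def first_fit_decreasing(items, capacity):
    """Classical priority-rule heuristic for bin packing: sort items by
    decreasing size, then place each into the first bin that has room."""
    bins = []  # each bin is a list of item sizes
    for size in sorted(items, reverse=True):
        for b in bins:
            if sum(b) + size <= capacity:
                b.append(size)
                break
        else:
            bins.append([size])  # no bin fits: open a new one
    return bins

# Example: pack hypothetical items into bins of capacity 10
print(first_fit_decreasing([7, 5, 4, 4, 3, 2, 2], capacity=10))
```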
Norris, Gareth
2015-01-01
The increasing use of multi-media applications, trial presentation software and computer generated exhibits (CGE) has raised questions as to the potential impact of the use of presentation technology on juror decision making. A significant amount of the commentary on the manner in which CGE exerts legal influence is largely anecdotal; empirical examinations too are often devoid of established theoretical rationalisations. This paper will examine a range of established judgement heuristics (for example, the attribution error, representativeness, simulation), in order to establish their appropriate application for comprehending legal decisions. Analysis of both past cases and empirical studies will highlight the potential for heuristics and biases to be restricted or confounded by the use of CGE. The paper will conclude with some wider discussion on admissibility, access to justice, and emerging issues in the use of multi-media in court. Copyright © 2015 Elsevier Ltd. All rights reserved.
[The relation of circadian variations of heuristic behavior and CNS radioresistance in animals].
Ushakov, I B; Davydova, O E
1996-01-01
The influence of gamma-radiation (60Co, 62.5 Gy, craniocaudal) on the circadian dynamics of heuristic behaviour (elements of rational-discriminative activity) was studied in male white rats. The effect of radiation proved to be equivocal: in certain periods of the day, manifestations of neurologic disturbance made it difficult to carry out the behavioural act, whereas in other periods no such effect was observed (the acrophases of the two processes coincide). After the observed neurologic manifestations of central nervous system damage (symptoms of early transitory neurologic disturbance) disappeared shortly after irradiation, no inversion of the circadian rhythm of heuristic behaviour was found; instead, the changes were expressed as a significant increase in the extremum values and the mesor in comparison with non-irradiated control groups. By the 30th minute after exposure, the process lost its rhythmic character, became smooth, and the mesor of the response decreased significantly.
The quasi-optimality criterion in the linear functional strategy
NASA Astrophysics Data System (ADS)
Kindermann, Stefan; Pereverzyev, Sergiy, Jr.; Pilipenko, Andrey
2018-07-01
The linear functional strategy for the regularization of inverse problems is considered. For selecting the regularization parameter therein, we propose the heuristic quasi-optimality principle and some modifications including the smoothness of the linear functionals. We prove convergence rates for the linear functional strategy with these heuristic rules taking into account the smoothness of the solution and the functionals and imposing a structural condition on the noise. Furthermore, we study these noise conditions in both a deterministic and stochastic setup and verify that for mildly-ill-posed problems and Gaussian noise, these conditions are satisfied almost surely, where on the contrary, in the severely-ill-posed case and in a similar setup, the corresponding noise condition fails to hold. Moreover, we propose an aggregation method for adaptively optimizing the parameter choice rule by making use of improved rates for linear functionals. Numerical results indicate that this method yields better results than the standard heuristic rule.
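As a rough illustration of the heuristic family discussed here, the sketch below implements the plain quasi-optimality rule for Tikhonov regularization on a geometric grid of candidate parameters; the linear-functional and aggregation refinements of the paper are not reproduced, and the test problem is synthetic.

```python
# A minimal sketch, assuming plain Tikhonov regularization of a random least-squares problem.
import numpy as np

def tikhonov(A, y, alpha):
    # x_alpha = (A^T A + alpha I)^{-1} A^T y
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + alpha * np.eye(n), A.T @ y)

def quasi_optimality(A, y, alphas):
    """Heuristic rule: pick the alpha_k minimizing ||x_{alpha_{k+1}} - x_{alpha_k}||."""
    xs = [tikhonov(A, y, a) for a in alphas]
    diffs = [np.linalg.norm(xs[k + 1] - xs[k]) for k in range(len(xs) - 1)]
    return alphas[int(np.argmin(diffs))]

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
y = A @ rng.standard_normal(20) + 0.05 * rng.standard_normal(50)
alphas = np.geomspace(1e-6, 1.0, 30)   # geometric grid of candidate parameters
print("heuristically chosen alpha:", quasi_optimality(A, y, alphas))
```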
Yang, S; Wang, D
2000-01-01
This paper presents a constraint satisfaction adaptive neural network, together with several heuristics, to solve the generalized job-shop scheduling problem, one of the NP-complete constraint satisfaction problems. The proposed neural network can be easily constructed and can adaptively adjust its connection weights and unit biases during processing, based on the sequence and resource constraints of the job-shop scheduling problem. Several heuristics that can be combined with the neural network are also presented. In the combined approaches, the neural network is used to obtain feasible solutions, while the heuristic algorithms are used to improve the performance of the neural network and the quality of the obtained solutions. Simulations have shown that the proposed neural network and its combined approaches are efficient with respect to both solution quality and solving speed.
Testing process predictions of models of risky choice: a quantitative model comparison approach
Pachur, Thorsten; Hertwig, Ralph; Gigerenzer, Gerd; Brandstätter, Eduard
2013-01-01
This article presents a quantitative model comparison contrasting the process predictions of two prominent views on risky choice. One view assumes a trade-off between probabilities and outcomes (or non-linear functions thereof) and the separate evaluation of risky options (expectation models). Another view assumes that risky choice is based on comparative evaluation, limited search, aspiration levels, and the forgoing of trade-offs (heuristic models). We derived quantitative process predictions for a generic expectation model and for a specific heuristic model, namely the priority heuristic (Brandstätter et al., 2006), and tested them in two experiments. The focus was on two key features of the cognitive process: acquisition frequencies (i.e., how frequently individual reasons are looked up) and direction of search (i.e., gamble-wise vs. reason-wise). In Experiment 1, the priority heuristic predicted direction of search better than the expectation model (although neither model predicted the acquisition process perfectly); acquisition frequencies, however, were inconsistent with both models. Additional analyses revealed that these frequencies were primarily a function of what Rubinstein (1988) called “similarity.” In Experiment 2, the quantitative model comparison approach showed that people seemed to rely more on the priority heuristic in difficult problems, but to make more trade-offs in easy problems. This finding suggests that risky choice may be based on a mental toolbox of strategies. PMID:24151472
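For readers unfamiliar with the priority heuristic, the sketch below follows the published description for simple two-outcome gain gambles (minimum gain, then probability of the minimum gain, then maximum gain, with one-tenth aspiration levels); rounding to prominent numbers and the treatment of losses are omitted, and the example gambles are made up.

```python
# A hedged sketch of the priority heuristic for two-outcome gain gambles.
def priority_heuristic(gamble_a, gamble_b):
    """Each gamble is a list of (outcome, probability) pairs, gains only.
    Reasons are examined in a fixed order and the first decisive one stops search."""
    def min_gain(g):  return min(o for o, _ in g)
    def max_gain(g):  return max(o for o, _ in g)
    def p_min(g):     return min(g, key=lambda op: op[0])[1]

    max_outcome = max(max_gain(gamble_a), max_gain(gamble_b))
    # Reason 1: minimum gains, aspiration level = 1/10 of the maximal gain
    if abs(min_gain(gamble_a) - min_gain(gamble_b)) >= 0.1 * max_outcome:
        return "A" if min_gain(gamble_a) > min_gain(gamble_b) else "B"
    # Reason 2: probabilities of the minimum gains, aspiration level = 0.1
    if abs(p_min(gamble_a) - p_min(gamble_b)) >= 0.1:
        return "A" if p_min(gamble_a) < p_min(gamble_b) else "B"
    # Reason 3: the maximum gains decide
    return "A" if max_gain(gamble_a) > max_gain(gamble_b) else "B"

# Example with made-up (outcome, probability) pairs
print(priority_heuristic([(2000, 0.6), (500, 0.4)], [(2500, 0.05), (550, 0.95)]))
```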
Chen, Pei-Hua
2017-05-01
This rejoinder responds to the commentary by van der Linden and Li entitled "Comment on Three-Element Item Selection Procedures for Multiple Forms Assembly: An Item Matching Approach" on the article "Three-Element Item Selection Procedures for Multiple Forms Assembly: An Item Matching Approach" by Chen. Van der Linden and Li made a strong statement calling for the cessation of test assembly heuristics development, and instead encouraged embracing mixed integer programming (MIP). This article points out the nondeterministic polynomial (NP)-hard nature of MIP problems and how solutions found using heuristics could be useful in an MIP context. Although van der Linden and Li provided several practical examples of test assembly supporting their view, the examples ignore the cases in which a slight change of constraints or item pool data might mean it would not be possible to obtain solutions as quickly as before. The article illustrates the use of heuristic solutions to improve both the performance of MIP solvers and the quality of solutions. Additional responses to the commentary by van der Linden and Li are included.
Column generation algorithms for virtual network embedding in flexi-grid optical networks.
Lin, Rongping; Luo, Shan; Zhou, Jingwei; Wang, Sheng; Chen, Bin; Zhang, Xiaoning; Cai, Anliang; Zhong, Wen-De; Zukerman, Moshe
2018-04-16
Network virtualization provides means for efficient management of network resources by embedding multiple virtual networks (VNs) so that they efficiently share the same substrate network. Such virtual network embedding (VNE) gives rise to the challenging problem of how to optimize resource allocation to VNs and to guarantee their performance requirements. In this paper, we provide VNE algorithms for efficient management of flexi-grid optical networks. We provide an exact algorithm aiming to minimize the total embedding cost in terms of spectrum cost and computation cost for a single VN request. Then, to achieve scalability, we also develop a heuristic algorithm for the same problem. We apply these two algorithms to a dynamic traffic scenario where many VN requests arrive one-by-one. We first demonstrate by simulations for the case of a six-node network that the heuristic algorithm obtains blocking probabilities very close to those of the exact algorithm (about 0.2% higher). Then, for a network of realistic size (namely, USnet) we demonstrate that the blocking probability of our new heuristic algorithm is about one order of magnitude lower than that of a simpler heuristic algorithm, which was a component of an earlier published algorithm.
Brandstätter, Eduard; Gigerenzer, Gerd; Hertwig, Ralph
2008-01-01
E. Brandstätter, G. Gigerenzer, and R. Hertwig (2006) showed that the priority heuristic matches or outperforms modifications of expected utility theory in predicting choice in 4 diverse problem sets. M. H. Birnbaum (2008) argued that sets exist in which the opposite is true. The authors agree--but stress that all choice strategies have regions of good and bad performance. The accuracy of various strategies systematically depends on choice difficulty, which the authors consider a triggering variable underlying strategy selection. Agreeing with E. J. Johnson, M. Schulte-Mecklenbeck, and M. C. Willemsen (2008) that process (not "as-if") models need to be formulated, the authors show how quantitative predictions can be derived and test them. Finally, they demonstrate that many of Birnbaum's and M. O. Rieger and M. Wang's (2008) case studies championing their preferred models involved biased tests in which the priority heuristic predicted data, whereas the parameterized models were fitted to the same data. The authors propose an adaptive toolbox approach of risky choice, according to which people first seek a no-conflict solution before resorting to conflict-resolving strategies such as the priority heuristic. (c) 2008 APA, all rights reserved
Design and usability of heuristic-based deliberation tools for women facing amniocentesis.
Durand, Marie-Anne; Wegwarth, Odette; Boivin, Jacky; Elwyn, Glyn
2012-03-01
Evidence suggests that in decision contexts characterized by uncertainty and time constraints (e.g. health-care decisions), fast and frugal decision-making strategies (heuristics) may perform better than complex rules of reasoning. To examine whether it is possible to design deliberation components in decision support interventions using simple models (fast and frugal heuristics). The 'Take The Best' heuristic (i.e. selection of a 'most important reason') and 'The Tallying' integration algorithm (i.e. unitary weighing of pros and cons) were used to develop two deliberation components embedded in a Web-based decision support intervention for women facing amniocentesis testing. Ten researchers (recruited from 15), nine health-care providers (recruited from 28) and ten pregnant women (recruited from 14) who had recently been offered amniocentesis testing appraised evolving versions of 'your most important reason' (Take The Best) and 'weighing it up' (Tallying). Most researchers found the tools useful in facilitating decision making although emphasized the need for simple instructions and clear layouts. Health-care providers however expressed concerns regarding the usability and clarity of the tools. By contrast, 7 out of 10 pregnant women found the tools useful in weighing up the pros and cons of each option, helpful in structuring and clarifying their thoughts and visualizing their decision efforts. Several pregnant women felt that 'weighing it up' and 'your most important reason' were not appropriate when facing such a difficult and emotional decision. Theoretical approaches based on fast and frugal heuristics can be used to develop deliberation tools that provide helpful support to patients facing real-world decisions about amniocentesis. © 2011 Blackwell Publishing Ltd.
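A rough illustration of the two deliberation models named here is sketched below; the cue names, orderings, and pro/con lists are hypothetical and not taken from the actual web tool.

```python
# A minimal sketch of 'Take The Best' and 'Tallying' with hypothetical cues.
def take_the_best(option_a, option_b, cues_by_importance):
    """Examine cues from most to least important; the first cue that
    discriminates between the options decides."""
    for cue in cues_by_importance:
        a, b = option_a.get(cue, 0), option_b.get(cue, 0)
        if a != b:
            return "A" if a > b else "B"
    return "no discriminating reason"

def tallying(pros, cons):
    """Unit-weight integration: each pro counts +1 and each con -1."""
    return len(pros) - len(cons)

# Hypothetical options and cues, for illustration only
amnio   = {"diagnostic_certainty": 1, "avoids_procedure_risk": 0}
no_test = {"diagnostic_certainty": 0, "avoids_procedure_risk": 1}
print(take_the_best(amnio, no_test, ["diagnostic_certainty", "avoids_procedure_risk"]))
print(tallying(pros=["certainty", "early information"], cons=["miscarriage risk"]))
```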
Strategy selection in cue-based decision making.
Bryant, David J
2014-06-01
People can make use of a range of heuristic and rational, compensatory strategies to perform a multiple-cue judgment task. It has been proposed that people are sensitive to the amount of cognitive effort required to employ decision strategies. Experiment 1 employed a dual-task methodology to investigate whether participants' preference for heuristic versus compensatory decision strategies can be altered by increasing the cognitive demands of the task. As indicated by participants' decision times, a secondary task interfered more with the performance of a heuristic than compensatory decision strategy but did not affect the proportions of participants using either type of strategy. A stimulus set effect suggested that the conjunction of cue salience and cue validity might play a determining role in strategy selection. The results of Experiment 2 indicated that when a perceptually salient cue was also the most valid, the majority of participants preferred a single-cue heuristic strategy. Overall, the results contradict the view that heuristics are more likely to be adopted when a task is made more cognitively demanding. It is argued that people employ 2 learning processes during training, one an associative learning process in which cue-outcome associations are developed by sampling multiple cues, and another that involves the sequential examination of single cues to serve as a basis for a single-cue heuristic.
The fallacy of financial heuristics.
Langabeer, James
2007-01-01
In turbulent times, the financial policies and decisions about cash and debt make or break hospitals' financial condition. Decisions about whether to continue saving cash or reduce debt burdens are probably the most vital policy decision for the hospital CFO. Unfortunately, my research shows that most administrators are relying on judgment, or best-guess heuristics to address these policy issues. This article explores one of the most common heuristics in health finance-ratios gauging debt and cash on hand. The subject is explored through the research and analysis of over 40 hospitals in a very competitive marketplace-the boroughs of New York City. Analyses of financial strength, through various statistical models, were conducted to explore the linkages between traditional heuristics and long-term economic results. Data were collected for 30 operational and financial indicators. Findings suggest that organizations require different cash-debt positions based on their overall financial health, and that a one-number heuristic does not fit all. Extremely financially constrained hospitals (those approaching bankruptcy conditions) should be building free cash flow and minimizing debt service, while financially secure hospitals need to minimize cash on hand while reducing debt. If all hospitals continue to try to meet an arbitrary days of cash heuristic, this simplification could cripple an organization. A much more effective metric requires each organization to model decisions more comprehensively.
Gartner, Daniel; Zhang, Yiye; Padman, Rema
2018-06-01
Order sets are a critical component in hospital information systems that are expected to substantially reduce physicians' physical and cognitive workload and improve patient safety. Order sets represent time interval-clustered order items, such as medications prescribed at hospital admission, that are administered to patients during their hospital stay. In this paper, we develop a mathematical programming model and an exact and a heuristic solution procedure with the objective of minimizing physicians' cognitive workload associated with prescribing order sets. Furthermore, we provide structural insights into the problem which lead us to a valid lower bound on the order set size. In a case study using order data on asthma patients with moderate complexity from a major pediatric hospital, we compare the hospital's current solution with the exact and heuristic solutions on a variety of performance metrics. Our computational results confirm our lower bound and reveal that using a time interval decomposition approach substantially reduces computation times for the mathematical program, as does a K-means clustering based decomposition approach which, however, does not guarantee optimality because it violates the lower bound. Comparing the mathematical program with the current order set configuration in the hospital indicates that cognitive workload can be reduced by about 20.2% when allowing 1 to 5 order sets. The comparison of the K-means based decomposition with the hospital's current configuration reveals a cognitive workload reduction of about 19.5%, also when allowing 1 to 5 order sets. We finally provide a decision support system to help practitioners analyze the current order set configuration, the results of the mathematical program and the heuristic approach.
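As a hedged illustration of the clustering-based decomposition idea (not the authors' exact procedure), the sketch below groups order items into candidate order sets by K-means clustering of their co-occurrence profiles across a few hypothetical visits.

```python
# A minimal sketch with made-up usage data; in practice the matrix would come
# from historical ordering records.
import numpy as np
from sklearn.cluster import KMeans

# Rows = visits, columns = order items (1 = the item was ordered in that visit)
usage = np.array([
    [1, 1, 0, 0, 1],
    [1, 1, 0, 0, 0],
    [0, 0, 1, 1, 0],
    [0, 0, 1, 1, 1],
])
item_profiles = usage.T        # one co-occurrence profile per order item
k = 2                          # number of order sets allowed
labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(item_profiles)
order_sets = {c: [f"item_{i}" for i in range(usage.shape[1]) if labels[i] == c]
              for c in range(k)}
print(order_sets)
```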
A set-covering based heuristic algorithm for the periodic vehicle routing problem.
Cacchiani, V; Hemmelmayr, V C; Tricoire, F
2014-01-30
We present a hybrid optimization algorithm for mixed-integer linear programming, embedding both heuristic and exact components. In order to validate it we use the periodic vehicle routing problem (PVRP) as a case study. This problem consists of determining a set of minimum cost routes for each day of a given planning horizon, with the constraints that each customer must be visited a required number of times (chosen among a set of valid day combinations), must receive every time the required quantity of product, and that the number of routes per day (each respecting the capacity of the vehicle) does not exceed the total number of available vehicles. This is a generalization of the well-known vehicle routing problem (VRP). Our algorithm is based on the linear programming (LP) relaxation of a set-covering-like integer linear programming formulation of the problem, with additional constraints. The LP-relaxation is solved by column generation, where columns are generated heuristically by an iterated local search algorithm. The whole solution method takes advantage of the LP-solution and applies techniques of fixing and releasing of the columns as a local search, making use of a tabu list to avoid cycling. We show the results of the proposed algorithm on benchmark instances from the literature and compare them to the state-of-the-art algorithms, showing the effectiveness of our approach in producing good quality solutions. In addition, we report the results on realistic instances of the PVRP introduced in Pacheco et al. (2011) [24] and on benchmark instances of the periodic traveling salesman problem (PTSP), showing the efficacy of the proposed algorithm on these as well. Finally, we report the new best known solutions found for all the tested problems.
A set-covering based heuristic algorithm for the periodic vehicle routing problem
Cacchiani, V.; Hemmelmayr, V.C.; Tricoire, F.
2014-01-01
We present a hybrid optimization algorithm for mixed-integer linear programming, embedding both heuristic and exact components. In order to validate it we use the periodic vehicle routing problem (PVRP) as a case study. This problem consists of determining a set of minimum cost routes for each day of a given planning horizon, with the constraints that each customer must be visited a required number of times (chosen among a set of valid day combinations), must receive every time the required quantity of product, and that the number of routes per day (each respecting the capacity of the vehicle) does not exceed the total number of available vehicles. This is a generalization of the well-known vehicle routing problem (VRP). Our algorithm is based on the linear programming (LP) relaxation of a set-covering-like integer linear programming formulation of the problem, with additional constraints. The LP-relaxation is solved by column generation, where columns are generated heuristically by an iterated local search algorithm. The whole solution method takes advantage of the LP-solution and applies techniques of fixing and releasing of the columns as a local search, making use of a tabu list to avoid cycling. We show the results of the proposed algorithm on benchmark instances from the literature and compare them to the state-of-the-art algorithms, showing the effectiveness of our approach in producing good quality solutions. In addition, we report the results on realistic instances of the PVRP introduced in Pacheco et al. (2011) [24] and on benchmark instances of the periodic traveling salesman problem (PTSP), showing the efficacy of the proposed algorithm on these as well. Finally, we report the new best known solutions found for all the tested problems. PMID:24748696
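A toy illustration of the set-covering-like formulation is sketched below: the LP relaxation is solved over a small, fixed pool of candidate routes, whereas the actual algorithm generates such columns on the fly by iterated local search and applies column fixing and releasing with a tabu list.

```python
# A toy sketch of the set-covering LP relaxation over hypothetical routes.
import numpy as np
from scipy.optimize import linprog

# Columns = candidate routes; rows = customers. a[i, j] = 1 if route j visits
# customer i. cost[j] is the routing cost of column j (all values made up).
a = np.array([
    [1, 0, 1, 0],
    [1, 1, 0, 0],
    [0, 1, 0, 1],
    [0, 0, 1, 1],
])
cost = np.array([10.0, 8.0, 7.0, 9.0])

# Cover constraints a @ x >= 1 are rewritten as -a @ x <= -1 for linprog
res = linprog(c=cost, A_ub=-a, b_ub=-np.ones(a.shape[0]), bounds=[(0, 1)] * 4)
print("LP relaxation value:", res.fun, "fractional column usage:", res.x)
```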
Student-Created Definitions of Sequence Convergence: A Case Study
ERIC Educational Resources Information Center
Fisher, Brian
2016-01-01
This paper describes the development of an instructional sequence designed to allow students to reinvent the definition of sequence convergence in an introductory proof course. The sequence follows a heuristic of guided reinvention that encourages students to independently create their own mathematical definitions. This case study reports on how…
Smart strategies for doctors and doctors-in-training: heuristics in medicine.
Wegwarth, Odette; Gaissmaier, Wolfgang; Gigerenzer, Gerd
2009-08-01
How do doctors make sound decisions when confronted with probabilistic data, time pressures and a heavy workload? One theory that has been embraced by many researchers is based on optimisation, which emphasises the need to integrate all information in order to arrive at sound decisions. This notion makes heuristics, which use less than complete information, appear as second-best strategies. In this article, we challenge this pessimistic view of heuristics. We introduce two medical problems that involve decision making to the reader: one concerns coronary care issues and the other macrolide prescriptions. In both settings, decision-making tools grounded in the principles of optimisation and heuristics, respectively, have been developed to assist doctors in making decisions. We explain the structure of each of these tools and compare their performance in terms of their facilitation of correct predictions. For decisions concerning both the coronary care unit and the prescribing of macrolides, we demonstrate that sacrificing information does not necessarily imply a forfeiting of predictive accuracy, but can sometimes even lead to better decisions. Subsequently, we discuss common misconceptions about heuristics and explain when and why ignoring parts of the available information can lead to the making of more robust predictions. Heuristics are neither good nor bad per se, but, if applied in situations to which they have been adapted, can be helpful companions for doctors and doctors-in-training. This, however, requires that heuristics in medicine be openly discussed, criticised, refined and then taught to doctors-in-training rather than being simply dismissed as harmful or irrelevant. A more uniform use of explicit and accepted heuristics has the potential to reduce variations in diagnoses and to improve medical care for patients.
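The coronary care decision aid referred to here is commonly described as a fast-and-frugal tree (after Green and Mehr, 1997); the sketch below gives its general shape, with cue handling simplified and not intended as clinical guidance.

```python
# A hedged sketch of a fast-and-frugal tree for the coronary care unit decision.
def ccu_fast_and_frugal_tree(st_segment_change, chest_pain_chief_complaint,
                             any_other_risk_factor):
    """Each question can terminate the decision; at most three cues are used."""
    if st_segment_change:
        return "coronary care unit"
    if not chest_pain_chief_complaint:
        return "regular nursing bed"
    if any_other_risk_factor:
        return "coronary care unit"
    return "regular nursing bed"

print(ccu_fast_and_frugal_tree(False, True, True))  # -> coronary care unit
```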
Approach to design neural cryptography: a generalized architecture and a heuristic rule.
Mu, Nankun; Liao, Xiaofeng; Huang, Tingwen
2013-06-01
Neural cryptography, a type of public key exchange protocol, is widely considered an effective method for sharing a common secret key between two neural networks over public channels. How to design neural cryptography remains a great challenge. In this paper, in order to provide an approach to this challenge, a generalized network architecture and a significant heuristic rule are designed. The proposed generic framework, named the tree state classification machine (TSCM), extends and unifies the existing structures, i.e., the tree parity machine (TPM) and the tree committee machine (TCM). Furthermore, we carefully study and find that the heuristic rule can improve the security of TSCM-based neural cryptography. Therefore, TSCM and the heuristic rule can guide the design of a great number of effective neural cryptography candidates, among which it is possible to achieve more secure instances. Significantly, in light of TSCM and the heuristic rule, we further show that our designed neural cryptography outperforms TPM (the most secure model at present) in security. Finally, a series of numerical simulation experiments is provided to verify the validity and applicability of our results.
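As background, the sketch below shows the output computation of a basic tree parity machine, the structure TSCM generalizes; the parameters are illustrative and the synchronization (weight-update) rules are omitted.

```python
# A minimal sketch of the TPM output rule, not the TSCM framework itself.
import numpy as np

class TreeParityMachine:
    def __init__(self, k=3, n=4, l=3, rng=None):
        self.k, self.n, self.l = k, n, l
        rng = rng or np.random.default_rng(0)
        # Integer weights in [-l, l], one row of n weights per hidden unit
        self.w = rng.integers(-l, l + 1, size=(k, n))

    def output(self, x):
        """x has shape (k, n) with entries in {-1, +1}; the network output is
        the product of the signs of the hidden-unit local fields."""
        local_fields = np.sum(self.w * x, axis=1)
        sigma = np.where(local_fields >= 0, 1, -1)   # sign, mapping 0 to +1
        return int(np.prod(sigma))

tpm = TreeParityMachine()
x = np.random.default_rng(1).choice([-1, 1], size=(3, 4))
print(tpm.output(x))
```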
A Heuristic Bioinspired for 8-Piece Puzzle
NASA Astrophysics Data System (ADS)
Machado, M. O.; Fabres, P. A.; Melo, J. C. L.
2017-10-01
This paper investigates a mathematical model inspired by nature, and presents a meta-heuristic that is efficient in improving the performance of an informed search using the A* strategy with a general search tree as the data structure. The working hypothesis suggests that the investigated meta-heuristic is optimal in nature and may be promising in minimizing the computational resources required by a goal-based agent in solving problems of high computational complexity (the n-piece puzzle) as well as in the optimization of objective functions for local search agents. The objective of this work is to describe qualitatively the characteristics and properties of the mathematical model investigated, correlating the main concepts of the A* evaluation function with the significant variables of the meta-heuristic used. The article shows that the amount of memory required to perform this search when using the meta-heuristic is less than when using the A* function alone to evaluate the nodes of a general search tree for the eight-piece puzzle. It is concluded that the meta-heuristic must be parameterized according to the chosen heuristic and the level of the tree that contains the possible solutions to the chosen problem.
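For reference, the sketch below is a plain A* search on the 8-puzzle with the Manhattan-distance heuristic; it does not reproduce the bio-inspired meta-heuristic investigated in the paper.

```python
# A minimal A* sketch for the 8-puzzle, with the blank encoded as 0.
import heapq

GOAL = (1, 2, 3, 4, 5, 6, 7, 8, 0)

def manhattan(state):
    # Sum of tile distances from their goal positions (blank excluded)
    return sum(abs(i // 3 - (v - 1) // 3) + abs(i % 3 - (v - 1) % 3)
               for i, v in enumerate(state) if v != 0)

def neighbors(state):
    i = state.index(0)
    r, c = divmod(i, 3)
    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        nr, nc = r + dr, c + dc
        if 0 <= nr < 3 and 0 <= nc < 3:
            j = nr * 3 + nc
            s = list(state)
            s[i], s[j] = s[j], s[i]
            yield tuple(s)

def a_star(start):
    frontier = [(manhattan(start), 0, start)]   # (f, g, state)
    best_g = {start: 0}
    while frontier:
        f, g, state = heapq.heappop(frontier)
        if state == GOAL:
            return g                            # number of moves in the solution
        for nxt in neighbors(state):
            if g + 1 < best_g.get(nxt, float("inf")):
                best_g[nxt] = g + 1
                heapq.heappush(frontier, (g + 1 + manhattan(nxt), g + 1, nxt))
    return None

print(a_star((1, 2, 3, 4, 0, 6, 7, 5, 8)))  # -> 2
```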
On the psychology of the recognition heuristic: retrieval primacy as a key determinant of its use.
Pachur, Thorsten; Hertwig, Ralph
2006-09-01
The recognition heuristic is a prime example of a boundedly rational mind tool that rests on an evolved capacity, recognition, and exploits environmental structures. When originally proposed, it was conjectured that no other probabilistic cue reverses the recognition-based inference (D. G. Goldstein & G. Gigerenzer, 2002). More recent studies challenged this view and gave rise to the argument that recognition enters inferences just like any other probabilistic cue. By linking research on the heuristic with research on recognition memory, the authors argue that the retrieval of recognition information is not tantamount to the retrieval of other probabilistic cues. Specifically, the retrieval of subjective recognition precedes that of an objective probabilistic cue and occurs at little to no cognitive cost. This retrieval primacy gives rise to 2 predictions, both of which have been empirically supported: Inferences in line with the recognition heuristic (a) are made faster than inferences inconsistent with it and (b) are more prevalent under time pressure. Suspension of the heuristic, in contrast, requires additional time, and direct knowledge of the criterion variable, if available, can trigger such suspension. Copyright 2006 APA
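The decision rule itself is simple to state; the sketch below expresses the paired-comparison case, with the fallback when recognition does not discriminate left open.

```python
# A minimal sketch of the recognition heuristic for a paired comparison.
def recognition_heuristic(recognized_a, recognized_b):
    if recognized_a and not recognized_b:
        return "A"
    if recognized_b and not recognized_a:
        return "B"
    return None  # both or neither recognized: fall back on knowledge or guessing

print(recognition_heuristic(True, False))  # -> A
```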
Boolean Reasoning and Informed Search in the Minimization of Logic Circuits
1992-03-01
motivation of this project as well as a definition of the problem. The scope of the effort was presented, as well as the assumptions found to be...in the resulting formula than the expansion-based product operation. The primary motive for using the expansion-based product versus a cross-product...eliminant is formed is the least-binate-variable heuristic described in Chapter 2. The motivation for this heuristic was illustrated in Example 3.3. The
Scheid, Anika; Nebel, Markus E
2012-07-09
Over the past years, statistical and Bayesian approaches have become increasingly appreciated to address the long-standing problem of computational RNA structure prediction. Recently, a novel probabilistic method for the prediction of RNA secondary structures from a single sequence has been studied which is based on generating statistically representative and reproducible samples of the entire ensemble of feasible structures for a particular input sequence. This method samples the possible foldings from a distribution implied by a sophisticated (traditional or length-dependent) stochastic context-free grammar (SCFG) that mirrors the standard thermodynamic model applied in modern physics-based prediction algorithms. Specifically, that grammar represents an exact probabilistic counterpart to the energy model underlying the Sfold software, which employs a sampling extension of the partition function (PF) approach to produce statistically representative subsets of the Boltzmann-weighted ensemble. Although both sampling approaches have the same worst-case time and space complexities, it has been indicated that they differ in performance (both with respect to prediction accuracy and quality of generated samples), where neither of these two competing approaches generally outperforms the other. In this work, we will consider the SCFG based approach in order to perform an analysis on how the quality of generated sample sets and the corresponding prediction accuracy changes when different degrees of disturbances are incorporated into the needed sampling probabilities. This is motivated by the fact that if the results prove to be resistant to large errors on the distinct sampling probabilities (compared to the exact ones), then it will be an indication that these probabilities do not need to be computed exactly, but it may be sufficient and more efficient to approximate them. Thus, it might then be possible to decrease the worst-case time requirements of such an SCFG based sampling method without significant accuracy losses. If, on the other hand, the quality of sampled structures can be observed to strongly react to slight disturbances, there is little hope for improving the complexity by heuristic procedures. We hence provide a reliable test for the hypothesis that a heuristic method could be implemented to improve the time scaling of RNA secondary structure prediction in the worst-case - without sacrificing much of the accuracy of the results. Our experiments indicate that absolute errors generally lead to the generation of useless sample sets, whereas relative errors seem to have only small negative impact on both the predictive accuracy and the overall quality of resulting structure samples. Based on these observations, we present some useful ideas for developing a time-reduced sampling method guaranteeing an acceptable predictive accuracy. We also discuss some inherent drawbacks that arise in the context of approximation. The key results of this paper are crucial for the design of an efficient and competitive heuristic prediction method based on the increasingly accepted and attractive statistical sampling approach. This has indeed been indicated by the construction of prototype algorithms.
2012-01-01
Background Over the past years, statistical and Bayesian approaches have become increasingly appreciated to address the long-standing problem of computational RNA structure prediction. Recently, a novel probabilistic method for the prediction of RNA secondary structures from a single sequence has been studied which is based on generating statistically representative and reproducible samples of the entire ensemble of feasible structures for a particular input sequence. This method samples the possible foldings from a distribution implied by a sophisticated (traditional or length-dependent) stochastic context-free grammar (SCFG) that mirrors the standard thermodynamic model applied in modern physics-based prediction algorithms. Specifically, that grammar represents an exact probabilistic counterpart to the energy model underlying the Sfold software, which employs a sampling extension of the partition function (PF) approach to produce statistically representative subsets of the Boltzmann-weighted ensemble. Although both sampling approaches have the same worst-case time and space complexities, it has been indicated that they differ in performance (both with respect to prediction accuracy and quality of generated samples), where neither of these two competing approaches generally outperforms the other. Results In this work, we will consider the SCFG based approach in order to perform an analysis on how the quality of generated sample sets and the corresponding prediction accuracy changes when different degrees of disturbances are incorporated into the needed sampling probabilities. This is motivated by the fact that if the results prove to be resistant to large errors on the distinct sampling probabilities (compared to the exact ones), then it will be an indication that these probabilities do not need to be computed exactly, but it may be sufficient and more efficient to approximate them. Thus, it might then be possible to decrease the worst-case time requirements of such an SCFG based sampling method without significant accuracy losses. If, on the other hand, the quality of sampled structures can be observed to strongly react to slight disturbances, there is little hope for improving the complexity by heuristic procedures. We hence provide a reliable test for the hypothesis that a heuristic method could be implemented to improve the time scaling of RNA secondary structure prediction in the worst-case – without sacrificing much of the accuracy of the results. Conclusions Our experiments indicate that absolute errors generally lead to the generation of useless sample sets, whereas relative errors seem to have only small negative impact on both the predictive accuracy and the overall quality of resulting structure samples. Based on these observations, we present some useful ideas for developing a time-reduced sampling method guaranteeing an acceptable predictive accuracy. We also discuss some inherent drawbacks that arise in the context of approximation. The key results of this paper are crucial for the design of an efficient and competitive heuristic prediction method based on the increasingly accepted and attractive statistical sampling approach. This has indeed been indicated by the construction of prototype algorithms. PMID:22776037
Douali, Nassim; Csaba, Huszka; De Roo, Jos; Papageorgiou, Elpiniki I; Jaulent, Marie-Christine
2014-01-01
Several studies have described the prevalence and severity of diagnostic errors. Diagnostic errors can arise from cognitive, training, educational and other issues. Examples of cognitive issues include flawed reasoning, incomplete knowledge, faulty information gathering or interpretation, and inappropriate use of decision-making heuristics. We describe a new approach, case-based fuzzy cognitive maps, for medical diagnosis and evaluate it by comparison with Bayesian belief networks. We created a semantic web framework that supports the two reasoning methods. We used a database of 174 anonymous patients from several European hospitals: 80 of the patients were female and 94 male, with an average age of 45±16 (average±stdev). Thirty of the 80 female patients were pregnant. For each patient, signs/symptoms/observables/age/sex were taken into account by the system. We used a statistical approach to compare the two methods. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
Basnyat, Iccha; Lim, Cheryl
2017-07-06
Human papillomavirus (HPV) vaccination uptake in Singapore is low among young women. Low uptake has been found to be linked to low awareness. Thus, this study aimed to understand active and passive vaccine information-seeking behavior. Furthermore, guided by the Elaboration Likelihood Model (ELM), this study examined young women's (aged 21-26 years) processing of information they acquired in their decision to get vaccinated. ELM postulates that information processing could be through the central (i.e., logic-based) or peripheral (i.e., heuristic-based) route. Twenty-six in-depth interviews were conducted from January to March 2016. Data were analyzed using thematic analysis. Two meta-themes, information acquisition and vaccination decision, revealed that heuristic-based information processing was employed. These young women acquired information passively within their social network and actively in healthcare settings. However, they used heuristic cues, such as closeness and trust, to process the information. Similarly, vaccination decisions revealed that women relied on heuristic cues, such as a sense of belonging and validation among peers, and source credibility and likability in medical settings, in their decision to get vaccinated. The findings of this study highlight that intervention efforts should focus on strengthening social support among personal networks to increase the uptake of the vaccine.
On the Psychology of the Recognition Heuristic: Retrieval Primacy as a Key Determinant of Its Use
ERIC Educational Resources Information Center
Pachur, Thorsten; Hertwig, Ralph
2006-01-01
The recognition heuristic is a prime example of a boundedly rational mind tool that rests on an evolved capacity, recognition, and exploits environmental structures. When originally proposed, it was conjectured that no other probabilistic cue reverses the recognition-based inference (D. G. Goldstein & G. Gigerenzer, 2002). More recent studies…
Heuristics for Planning University Study at a Distance.
ERIC Educational Resources Information Center
Dodds, Agnes E.; Lawrence, Jeanette A.
A model to describe how adults work on university courses at a distance from campus was developed at an Australian university. The model was designed to describe how students define the task/goal and plan their study, based on G. Polya's (1957) heuristic and A. Newell's and H. A. Simon's (1972) General Problem Solver. Verbal reports were obtained…
ERIC Educational Resources Information Center
Hilbig, Benjamin E.; Pohl, Rudiger F.
2009-01-01
According to part of the adaptive toolbox notion of decision making known as the recognition heuristic (RH), the decision process in comparative judgments--and its duration--is determined by whether recognition discriminates between objects. By contrast, some recently proposed alternative models predict that choices largely depend on the amount of…
NASA Technical Reports Server (NTRS)
Anderson, Leif F.; Harrington, Sean P.; Omeke, Ojei, II; Schwaab, Douglas G.
2009-01-01
This is a case study on revised estimates of induced failure for International Space Station (ISS) on-orbit replacement units (ORUs). We devise a heuristic to leverage operational experience data by aggregating ORU, associated function (vehicle sub-system), and vehicle 'effective' k-factors using actual failure experience. With this input, we determine a significant failure threshold and minimize the difference between the actual and predicted failure rates. We conclude with a discussion of both qualitative and quantitative improvements to the heuristic methods and the potential benefits to ISS supportability engineering analysis.
Agnisarman, Sruthy; Narasimha, Shraddhaa; Chalil Madathil, Kapil; Welch, Brandon; Brinda, Fnu; Ashok, Aparna; McElligott, James
2017-04-24
Telemedicine is the use of technology to provide and support health care when distance separates the clinical service and the patient. Home-based telemedicine systems involve the use of such technology for medical support and care, connecting the patient from the comfort of their home with the clinician. In order for such a system to be used extensively, it is necessary to understand not only the issues faced by patients in using them but also those faced by the clinician. The aim of this study was to conduct a heuristic evaluation of four telemedicine software platforms (Doxy.me, Polycom, Vidyo, and VSee) to assess possible problems and limitations that could affect the usability of the system from the clinician's perspective. Five experts individually evaluated all four systems using Nielsen's list of heuristics, classifying the issues based on a severity rating scale. A total of 46 unique problems were identified by the experts. The heuristics most frequently violated were visibility of system status and error prevention, each accounting for 24% (11/46) of the issues. Esthetic and minimalist design was second, contributing 13% (6/46) of the total errors. Heuristic evaluation coupled with a severity rating scale was found to be an effective method for identifying problems with the systems. Prioritization of these problems based on the rating provides a good starting point for resolving the issues affecting these platforms. There is a need for better transparency and a more streamlined approach to how physicians use telemedicine systems. Visibility of the system status and speaking the users' language are key to achieving this. ©Sruthy Agnisarman, Shraddhaa Narasimha, Kapil Chalil Madathil, Brandon Welch, FNU Brinda, Aparna Ashok, James McElligott. Originally published in JMIR Human Factors (http://humanfactors.jmir.org), 24.04.2017.
An investigation of the use of temporal decomposition in space mission scheduling
NASA Technical Reports Server (NTRS)
Bullington, Stanley E.; Narayanan, Venkat
1994-01-01
This research involves an examination of techniques for solving scheduling problems in long-duration space missions. The mission timeline is broken up into several time segments, which are then scheduled incrementally. Three methods are presented for identifying the activities that are to be attempted within these segments. The first method is a mathematical model, which is presented primarily to illustrate the structure of the temporal decomposition problem. Since the mathematical model is bound to be computationally prohibitive for realistic problems, two heuristic assignment procedures are also presented. The first heuristic method is based on dispatching rules for activity selection, and the second heuristic assigns performances of a model evenly over timeline segments. These heuristics are tested using a sample Space Station mission and a Spacelab mission. The results are compared with those obtained by scheduling the missions without any problem decomposition. The applicability of this approach to large-scale mission scheduling problems is also discussed.
Leveraging social system networks in ubiquitous high-data-rate health systems.
Massey, Tammara; Marfia, Gustavo; Stoelting, Adam; Tomasi, Riccardo; Spirito, Maurizio A; Sarrafzadeh, Majid; Pau, Giovanni
2011-05-01
Social system networks with high data rates and limited storage will discard data if the system cannot connect and upload the data to a central server. We address the challenge of limited storage capacity in mobile health systems during network partitions with a heuristic that achieves efficiency in storage capacity by modifying the granularity of the medical data during long intercontact periods. Patterns in the connectivity, reception rate, distance, and location are extracted from the social system network and leveraged in the global algorithm and online heuristic. In the global algorithm, the stochastic nature of the data is modeled with maximum likelihood estimation based on the distribution of the reception rates. In the online heuristic, the correlation between system position and the reception rate is combined with patterns in human mobility to estimate the intracontact and intercontact time. The online heuristic performs well with a low data loss of 2.1%-6.1%.
Petri nets SM-cover-based on heuristic coloring algorithm
NASA Astrophysics Data System (ADS)
Tkacz, Jacek; Doligalski, Michał
2015-09-01
In this paper, a heuristic coloring algorithm for interpreted Petri nets is presented. Coloring is used to determine the State Machine (SM) subnets. The algorithm first reduces the Petri net in order to lower the computational complexity and then finds one of its possible State Machine covers. The proposed algorithm uses elements of the interpretation of Petri nets. The obtained result may not be the best, but it is sufficient for use in rapid prototyping of logic controllers. The found SM-cover will also be used in the development of algorithms for decomposition, modular synthesis, and implementation of parallel logic controllers. The correctness of the developed heuristic algorithm was verified using the Gentzen formal reasoning system.
Newell, Ben R
2005-01-01
The appeal of simple algorithms that take account of both the constraints of human cognitive capacity and the structure of environments has been an enduring theme in cognitive science. A novel version of such a boundedly rational perspective views the mind as containing an 'adaptive toolbox' of specialized cognitive heuristics suited to different problems. Although intuitively appealing, when this version was proposed, empirical evidence for the use of such heuristics was scant. I argue that in the light of empirical studies carried out since then, it is time this 'vision of rationality' was revised. An alternative view based on integrative models rather than collections of heuristics is proposed.
NASA Astrophysics Data System (ADS)
Suthikarnnarunai, N.; Olinick, E.
2009-01-01
We present a case study on the application of techniques for solving the Vehicle Routing Problem (VRP) to improve the transportation service provided by the University of The Thai Chamber of Commerce to its staff. The problem is modeled as VRP with time windows, split deliveries, and a mixed fleet. An exact algorithm and a heuristic solution procedure are developed to solve the problem and implemented in the AMPL modeling language and CPLEX Integer Programming solver. Empirical results indicate that the heuristic can find relatively good solutions in a small fraction of the time required by the exact method. We also perform sensitivity analysis and find that a savings in outsourcing cost can be achieved with a small increase in vehicle capacity.
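The abstract does not detail the heuristic procedure, so the sketch below shows a generic capacity-respecting nearest-neighbour construction heuristic of the kind commonly used for VRP variants; the coordinates, demands, and capacity are hypothetical, and each demand is assumed to fit on one vehicle.

    # Generic nearest-neighbour construction heuristic for a capacitated routing
    # problem (illustrative stand-in, not the paper's algorithm; data are made up;
    # assumes every single demand fits within the vehicle capacity).

    import math

    def nearest_neighbour_routes(depot, stops, demands, capacity):
        unvisited = set(stops)
        routes = []
        while unvisited:
            route, load, pos = [], 0, depot
            while True:
                feasible = [s for s in unvisited if load + demands[s] <= capacity]
                if not feasible:
                    break
                nxt = min(feasible, key=lambda s: math.dist(pos, s))
                route.append(nxt)
                load += demands[nxt]
                unvisited.remove(nxt)
                pos = nxt
            routes.append(route)
        return routes

    stops = [(2, 1), (5, 3), (1, 4), (6, 6)]
    demands = {(2, 1): 3, (5, 3): 4, (1, 4): 2, (6, 6): 5}
    print(nearest_neighbour_routes(depot=(0, 0), stops=stops, demands=demands, capacity=7))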
Artificial Intelligence: Bayesian versus Heuristic Method for Diagnostic Decision Support.
Elkin, Peter L; Schlegel, Daniel R; Anderson, Michael; Komm, Jordan; Ficheur, Gregoire; Bisson, Leslie
2018-04-01
Evoking strength is one of the important contributions of the field of Biomedical Informatics to the discipline of Artificial Intelligence. The University at Buffalo's Orthopedics Department wanted to create an expert system to assist patients with self-diagnosis of knee problems and thereby facilitate referral to the right orthopedic subspecialist. They had two independent sports medicine physicians review 469 cases. A board-certified orthopedic sports medicine practitioner, L.B., reviewed any disagreements until a gold standard diagnosis was reached. For each case, the patients entered 126 potential answers to 26 questions into a Web interface. These were modeled by an expert sports medicine physician and the answers were reviewed by L.B. For each finding, the clinician specified the sensitivity (term frequency), the specificity (Sp), and the heuristic evoking strength (ES). Heuristics are methods of reasoning with only partial evidence. An expert system was constructed that produced a post-test-odds ranked list of diseases for each case. We compare the accuracy of using Sp with that of using ES (original model, p < 0.0008; term importance * disease importance [DItimesTI] model, p < 0.0001; Wilcoxon rank-sum test). For patient referral assignment, Sp in the DItimesTI model was superior to the use of ES. By the fifth diagnosis the advantage was lost, so there is no difference between the techniques when serving as a reminder system. Schattauer GmbH Stuttgart.
"First, know thyself": cognition and error in medicine.
Elia, Fabrizio; Aprà, Franco; Verhovez, Andrea; Crupi, Vincenzo
2016-04-01
Although error is an integral part of the world of medicine, physicians have always been little inclined to take their own mistakes into account, and the extraordinary technological progress of recent decades does not seem to have resulted in a significant reduction in the percentage of diagnostic errors. This failure to reduce diagnostic errors, notwithstanding the considerable investment of human and economic resources, has paved the way for new strategies made available by the development of cognitive psychology, the branch of psychology that aims at understanding the mechanisms of human reasoning. This new approach has led us to realize that we are not fully rational agents able to make decisions on the basis of logical and probabilistically appropriate evaluations. Two different and mostly independent modes of reasoning coexist in us: a fast or non-analytical mode, which tends to be largely automatic and fast-reacting, and a slow or analytical mode, which permits rationally founded answers. One of the features of the fast mode of reasoning is the employment of standardized rules, termed "heuristics." Heuristics lead physicians to correct choices in a large percentage of cases. Unfortunately, cases exist in which the heuristic triggered fails to fit the target problem, so that the fast mode of reasoning can lead us to unreflectively perform actions exposing ourselves and others to variable degrees of risk. Cognitive errors arise from these cases. Our review illustrates how cognitive errors can cause diagnostic problems in clinical practice.
NASA Technical Reports Server (NTRS)
Mengshoel, Ole J.; Roth, Dan; Wilkins, David C.
2001-01-01
Portfolio methods support the combination of different algorithms and heuristics, including stochastic local search (SLS) heuristics, and have been identified as a promising approach to solve computationally hard problems. While successful in experiments, theoretical foundations and analytical results for portfolio-based SLS heuristics are less developed. This article aims to improve the understanding of the role of portfolios of heuristics in SLS. We emphasize the problem of computing most probable explanations (MPEs) in Bayesian networks (BNs). Algorithmically, we discuss a portfolio-based SLS algorithm for MPE computation, Stochastic Greedy Search (SGS). SGS supports the integration of different initialization operators (or initialization heuristics) and different search operators (greedy and noisy heuristics), thereby enabling new analytical and experimental results. Analytically, we introduce a novel Markov chain model tailored to portfolio-based SLS algorithms including SGS, thereby enabling us to analytically derive expected hitting time results that explain empirical run time results. For a specific BN, we show the benefit of using a homogeneous initialization portfolio. To further illustrate the portfolio approach, we consider novel additive search heuristics for handling determinism in the form of zero entries in conditional probability tables in BNs. Our additive approach adds rather than multiplies probabilities when computing the utility of an explanation. We motivate the additive measure by studying the dramatic impact of zero entries in conditional probability tables on the number of zero-probability explanations, which again complicates the search process. We consider the relationship between MAXSAT and MPE, and show that additive utility (or gain) is a generalization, to the probabilistic setting, of the MAXSAT utility (or gain) used in the celebrated GSAT and WalkSAT algorithms and their descendants. Utilizing our Markov chain framework, we show that the expected hitting time is a rational function - i.e. a ratio of two polynomials - of the probability of applying an additive search operator. Experimentally, we report on synthetically generated BNs as well as BNs from applications, and compare SGS's performance to that of Hugin, which performs BN inference by compilation to and propagation in clique trees. On synthetic networks, SGS speeds up computation by approximately two orders of magnitude compared to Hugin. In application networks, our approach is highly competitive in Bayesian networks with a high degree of determinism. In addition to showing that stochastic local search can be competitive with clique tree clustering, our empirical results provide an improved understanding of the circumstances under which portfolio-based SLS outperforms clique tree clustering and vice versa.
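To illustrate the greedy and noisy search operators mentioned above, the following minimal sketch runs a stochastic local search for the MPE of a two-variable toy Bayesian network. It is not the SGS implementation; the network, noise level, and step count are invented.

    # Minimal sketch of stochastic local search for a most probable explanation
    # (MPE) in a tiny, made-up Bayesian network; illustrates greedy/noisy flips only.

    import math
    import random

    # P(A) and P(B|A), with A, B binary. Log-probabilities avoid underflow.
    logP_A = {0: math.log(0.3), 1: math.log(0.7)}
    logP_B_given_A = {(0, 0): math.log(0.9), (0, 1): math.log(0.1),
                      (1, 0): math.log(0.2), (1, 1): math.log(0.8)}

    def score(x):
        return logP_A[x["A"]] + logP_B_given_A[(x["A"], x["B"])]

    def sls_mpe(steps=200, noise=0.2):
        x = {"A": random.randint(0, 1), "B": random.randint(0, 1)}   # initialization operator
        best = dict(x)
        for _ in range(steps):
            var = random.choice(["A", "B"])
            if random.random() < noise:                              # noisy operator
                x[var] = random.randint(0, 1)
            else:                                                    # greedy operator
                x[var] = max((0, 1), key=lambda v: score({**x, var: v}))
            if score(x) > score(best):
                best = dict(x)
        return best, math.exp(score(best))

    print(sls_mpe())   # expected MPE for this toy network: A=1, B=1 with probability 0.56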
Analyzing Historical Primary Source Open Educational Resources: A Blended Pedagogical Approach
ERIC Educational Resources Information Center
Oliver, Kevin M.; Purichia, Heather R.
2018-01-01
This qualitative case study addresses the need for pedagogical approaches to working with open educational resources (OER). Drawing on a mix of historical thinking heuristics and case analysis approaches, a blended pedagogical strategy and primary source database were designed to build student understanding of historical records with transfer of…
Pieterse, Arwen H.; de Vries, Marieke
2011-01-01
Background: Increasingly, patient decision aids and values clarification methods (VCMs) are being developed to support patients in making preference-sensitive health-care decisions. Many VCMs encourage extensive deliberation about options, without solid theoretical or empirical evidence showing that deliberation is advantageous. Research suggests that simple, fast and frugal heuristic decision strategies sometimes result in better judgments and decisions. Durand et al. have developed two fast and frugal heuristic-based VCMs. Objective: To critically analyse the suitability of the 'take the best' (TTB) and 'tallying' fast and frugal heuristics in the context of patient decision making. Strategy: Analysis of the structural similarities between the environments in which the TTB and tallying heuristics have been proven successful and the context of patient decision making, and of the potential of these heuristic decision processes to support patient decision making. Conclusion: The specific nature of patient preference-sensitive decision making does not seem to resemble environments in which the TTB and tallying heuristics have proven successful. Encouraging patients to consider less rather than more relevant information potentially even deteriorates their values clarification process. Values clarification methods promoting the use of more intuitive decision strategies may sometimes be more effective. Nevertheless, we strongly recommend further theoretical thinking about the expected value of such heuristics and of other more intuitive decision strategies in this context, as well as empirical assessments of the mechanisms by which inducing such decision strategies may impact the quality and outcome of values clarification. PMID:21902770
How do people judge risks: availability heuristic, affect heuristic, or both?
Pachur, Thorsten; Hertwig, Ralph; Steinmann, Florian
2012-09-01
How does the public reckon which risks to be concerned about? The availability heuristic and the affect heuristic are key accounts of how laypeople judge risks. Yet, these two accounts have never been systematically tested against each other, nor have their predictive powers been examined across different measures of the public's risk perception. In two studies, we gauged risk perception in student samples by employing three measures (frequency, value of a statistical life, and perceived risk) and by using a homogeneous (cancer) and a classic set of heterogeneous causes of death. Based on these judgments of risk, we tested precise models of the availability heuristic and the affect heuristic and different definitions of availability and affect. Overall, availability-by-recall, a heuristic that exploits people's direct experience of occurrences of risks in their social network, conformed to people's responses best. We also found direct experience to carry a high degree of ecological validity (and one that clearly surpasses that of affective information). However, the relative impact of affective information (as compared to availability) proved more pronounced in value-of-a-statistical-life and perceived-risk judgments than in risk-frequency judgments. Encounters with risks in the media, in contrast, played a negligible role in people's judgments. Going beyond the assumption of exclusive reliance on either availability or affect, we also found evidence for mechanisms that combine both, either sequentially or in a composite fashion. We conclude with a discussion of policy implications of our results, including how to foster people's risk calibration and the success of education campaigns.
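The availability-by-recall mechanism described above can be sketched in a few lines: which of two risks is judged more frequent is decided by counting instances recalled from one's own social network. The recalled cases below are invented for illustration.

    # Sketch of availability-by-recall: compare two risks by counting occurrences
    # recalled from one's own social network (hypothetical recalled cases).

    def availability_by_recall(recalled_cases, risk_a, risk_b):
        count_a = sum(1 for cause in recalled_cases if cause == risk_a)
        count_b = sum(1 for cause in recalled_cases if cause == risk_b)
        if count_a == count_b:
            return "guess"                 # no discriminating instances recalled
        return risk_a if count_a > count_b else risk_b

    recalled = ["heart disease", "stroke", "heart disease", "accident"]
    print(availability_by_recall(recalled, "heart disease", "stroke"))  # -> heart disease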
Heuristic evaluation of online COPD respiratory therapy and education video resource center.
Stellefson, Michael; Chaney, Beth; Chaney, Don
2014-10-01
Purpose: Because of limited accessibility to pulmonary rehabilitation programs, patients with chronic obstructive pulmonary disease (COPD) are infrequently provided with patient education resources. To help educate patients with COPD on how to live a better life with diminished breathing capacity, we developed a novel social media resource center containing COPD respiratory therapy and education videos called "COPDFlix." Methodology: A heuristic evaluation of COPDFlix was conducted as part of a larger study to determine whether the prototype was successful in adhering to formal Web site usability guidelines for older adults. A purposive sample of three experts, with expertise in Web design and health communications technology, was recruited (a) to identify usability violations and (b) to propose solutions to improve the functionality of the COPDFlix prototype. Each expert evaluated 18 heuristics in four categories of task-based criteria (i.e., interaction and navigation, information architecture, presentation design, and information design). Seventy-six subcriteria across these four categories were assessed. Quantitative ratings and qualitative comments from each expert were compiled into a single master list, noting the violated heuristic and type/location of problem(s). Results: Sixty-one usability violations were identified across the 18 heuristics. Evaluators rated the majority of heuristic subcriteria as either a "minor hindrance" (n=32) or "no problem" (n=132). Moreover, only 2 of the 18 heuristic categories were noted as "major" violations, with mean severity scores of ≥3. Conclusions: Mixed-methods data analysis helped the multidisciplinary research team to categorize and prioritize usability problems and solutions, leading to 26 discrete design modifications within the COPDFlix prototype.
Keshavarz, Behrang; Campos, Jennifer L; DeLucia, Patricia R; Oberfeld, Daniel
2017-04-01
Estimating time to contact (TTC) involves multiple sensory systems, including vision and audition. Previous findings suggested that the ratio of an object's instantaneous optical size/sound intensity to its instantaneous rate of change in optical size/sound intensity (τ) drives TTC judgments. Other evidence has shown that heuristic-based cues are used, including final optical size or final sound pressure level. Most previous studies have used decontextualized and unfamiliar stimuli (e.g., geometric shapes on a blank background). Here we evaluated TTC estimates by using a traffic scene with an approaching vehicle to evaluate the weights of visual and auditory TTC cues under more realistic conditions. Younger (18-39 years) and older (65+ years) participants made TTC estimates in three sensory conditions: visual-only, auditory-only, and audio-visual. Stimuli were presented within an immersive virtual-reality environment, and cue weights were calculated for both visual cues (e.g., visual τ, final optical size) and auditory cues (e.g., auditory τ, final sound pressure level). The results demonstrated the use of visual τ as well as heuristic cues in the visual-only condition. TTC estimates in the auditory-only condition, however, were primarily based on an auditory heuristic cue (final sound pressure level), rather than on auditory τ. In the audio-visual condition, the visual cues dominated overall, with the highest weight being assigned to visual τ by younger adults, and a more equal weighting of visual τ and heuristic cues in older adults. Overall, better characterizing the effects of combined sensory inputs, stimulus characteristics, and age on the cues used to estimate TTC will provide important insights into how these factors may affect everyday behavior.
Exact and heuristic algorithms for Space Information Flow.
Uwitonze, Alfred; Huang, Jiaqing; Ye, Yuanqing; Cheng, Wenqing; Li, Zongpeng
2018-01-01
Space Information Flow (SIF) is a new promising research area that studies network coding in geometric space, such as Euclidean space. The design of algorithms that compute the optimal SIF solutions remains one of the key open problems in SIF. This work proposes the first exact SIF algorithm and a heuristic SIF algorithm that compute min-cost multicast network coding for N (N ≥ 3) given terminal nodes in 2-D Euclidean space. Furthermore, we find that the Butterfly network in Euclidean space is the second example, besides the Pentagram network, where SIF is strictly better than the Euclidean Steiner minimal tree. The exact algorithm design is based on two key techniques: Delaunay triangulation and linear programming. The Delaunay triangulation technique helps to find practically good candidate relay nodes, after which a min-cost multicast linear programming model is solved over the terminal nodes and the candidate relay nodes to compute the optimal multicast network topology, including the optimal relay nodes selected by linear programming from all the candidate relay nodes and the flow rates on the connection links. The heuristic algorithm design is also based on the Delaunay triangulation and linear programming techniques. The exact algorithm achieves the optimal SIF solution with exponential computational complexity, while the heuristic algorithm achieves a sub-optimal SIF solution with polynomial computational complexity. We prove the correctness of the exact SIF algorithm. The simulation results show the effectiveness of the heuristic SIF algorithm.
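As a hedged illustration of the candidate-relay step, the sketch below triangulates a set of terminal nodes with scipy and proposes one candidate per Delaunay triangle (its centroid). The abstract does not specify how candidates are derived from the triangulation, so the centroid choice is an assumption for illustration only, numpy/scipy availability is assumed, and the linear-programming selection step is omitted.

    # Sketch of generating candidate relay nodes from a Delaunay triangulation of
    # the terminals (centroid choice is an assumption; LP relay selection omitted).

    import numpy as np
    from scipy.spatial import Delaunay

    terminals = np.array([[0.0, 0.0], [4.0, 0.0], [2.0, 3.0], [2.0, -3.0]])
    tri = Delaunay(terminals)

    # One candidate relay per triangle: its centroid.
    candidates = terminals[tri.simplices].mean(axis=1)
    print(candidates)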
Dramatic Consequences: Integrating Rhetorical Performance across the Disciplines and Curriculum
ERIC Educational Resources Information Center
Marquez, Loren
2015-01-01
Just as WAC pedagogy and writing studies both stress the ways that writing and communication practices can act as both heuristics and products of genre-based, discipline- specific knowledge, in much the same way, performance, too, can be used as a heuristic and as a product and should be more fully explored in WAC theory and pedagogy. This article…
Heuristic Model Of The Composite Quality Index Of Environmental Assessment
NASA Astrophysics Data System (ADS)
Khabarov, A. N.; Knyaginin, A. A.; Bondarenko, D. V.; Shepet, I. P.; Korolkova, L. N.
2017-01-01
The goal of the paper is to present a heuristic model of the composite environmental quality index based on the integrated application of elements of utility theory, multidimensional scaling, expert evaluation and decision-making. The composite index is synthesized in linear-quadratic form; it provides higher adequacy of the assessment results to the preferences of experts and decision-makers.
Slime moulds use heuristics based on within-patch experience to decide when to leave.
Latty, Tanya; Beekman, Madeleine
2015-04-15
Animals foraging in patchy, non-renewing or slowly renewing environments must make decisions about how long to remain within a patch. Organisms can use heuristics ('rules of thumb') based on available information to decide when to leave the patch. Here, we investigated proximate patch-departure heuristics in two species of giant, brainless amoeba: the slime moulds Didymium bahiense and Physarum polycephalum. We explicitly tested the importance of information obtained through experience by eliminating chemosensory cues of patch quality. In P. polycephalum, patch departure was influenced by the consumption of high, and to a much lesser extent low, quality food items such that engulfing a food item increased patch-residency time. Physarum polycephalum also tended to forage for longer in darkened, 'safe' patches. In D. bahiense, engulfment of any food item increased patch residency irrespective of that food item's quality. Exposure to light had no effect on the patch-residency time of D. bahiense. Given that these organisms lack a brain, our results illustrate how the use of simple heuristics can give the impression that individuals make sophisticated foraging decisions. © 2015. Published by The Company of Biologists Ltd.
Heuristic lipophilicity potential for computer-aided rational drug design
NASA Astrophysics Data System (ADS)
Du, Qishi; Arteca, Gustavo A.; Mezey, Paul G.
1997-09-01
In this contribution we suggest a heuristic molecular lipophilicity potential (HMLP), which is a structure-based technique requiring no empirical indices of atomic lipophilicity. The input data used in this approach are molecular geometries and molecular surfaces. The HMLP is a modified electrostatic potential, combined with the averaged influences from the molecular environment. Quantum mechanics is used to calculate the electron density function ρ(r) and the electrostatic potential V(r), and from this information a lipophilicity potential L(r) is generated. The HMLP is a unified lipophilicity and hydrophilicity potential. The interactions of dipole and multipole moments, hydrogen bonds, and charged atoms in a molecule are included in the hydrophilic interactions in this model. The HMLP is used to study hydrogen bonds and water-octanol partition coefficients in several examples. The calculated results show that the HMLP gives qualitatively and quantitatively correct, as well as chemically reasonable, results in cases where comparisons are available. These comparisons indicate that the HMLP has advantages over the empirical lipophilicity potential in many aspects. The HMLP is a three-dimensional and easily visualizable representation of molecular lipophilicity, suggested as a potential tool in computer-aided three-dimensional drug design.
Meta-heuristic algorithms as tools for hydrological science
NASA Astrophysics Data System (ADS)
Yoo, Do Guen; Kim, Joong Hoon
2014-12-01
In this paper, meta-heuristic optimization techniques and their applications to water resources engineering, particularly hydrological science, are introduced. In recent years, meta-heuristic optimization techniques have been developed that can overcome the problems inherent in iterative simulations. These methods are able to find good solutions and require limited computation time and memory use without requiring complex derivatives. Simulation-based meta-heuristic methods such as Genetic Algorithms (GAs) and Harmony Search (HS) have powerful searching abilities, which can occasionally overcome the several drawbacks of traditional mathematical methods. For example, the HS algorithm is conceptualized from the musical performance process of seeking better harmony: the optimization algorithm seeks a near-global optimum determined by the value of an objective function, much as a performance seeks pleasing harmony as judged by aesthetic estimation. In this paper, meta-heuristic algorithms and their applications (with a focus on GAs and HS) in hydrological science are discussed by subject, including a review of the existing literature in the field. Then, recent trends in optimization are presented and a relatively new technique, the Smallest Small World Cellular Harmony Search (SSWCHS), is briefly introduced, with a summary of promising results obtained in previous studies. Overall, previous studies have demonstrated that meta-heuristic algorithms are effective tools for the development of hydrological models and the management of water resources.
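A minimal harmony search sketch may help make the analogy concrete: a new candidate is improvised by memory consideration, pitch adjustment, or random selection, and it replaces the worst harmony in memory when it is better. The parameter values below are common textbook defaults applied to a toy sphere function, not those of any study reviewed in the paper.

    # Minimal harmony search sketch for minimizing a toy benchmark function
    # (sphere). Parameters are generic textbook choices.

    import random

    def harmony_search(obj, dim=2, hms=10, hmcr=0.9, par=0.3, bw=0.1,
                       iters=2000, lo=-5.0, hi=5.0):
        memory = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(hms)]
        memory.sort(key=obj)
        for _ in range(iters):
            new = []
            for d in range(dim):
                if random.random() < hmcr:                 # memory consideration
                    val = random.choice(memory)[d]
                    if random.random() < par:              # pitch adjustment
                        val += random.uniform(-bw, bw)
                else:                                      # random selection
                    val = random.uniform(lo, hi)
                new.append(min(hi, max(lo, val)))
            if obj(new) < obj(memory[-1]):                 # replace worst harmony
                memory[-1] = new
                memory.sort(key=obj)
        return memory[0]

    sphere = lambda x: sum(v * v for v in x)
    print(harmony_search(sphere))    # should approach [0, 0]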
Masicampo, E J; Baumeister, Roy F
2008-03-01
This experiment used the attraction effect to test the hypothesis that ingestion of sugar can reduce reliance on intuitive, heuristic-based decision making. In the attraction effect, a difficult choice between two options is swayed by the presence of a seemingly irrelevant "decoy" option. We replicated this effect and the finding that the effect increases when people have depleted their mental resources performing a previous self-control task. Our hypothesis was based on the assumption that effortful processes require and consume relatively large amounts of glucose (brain fuel), and that this use of glucose is why people use heuristic strategies after exerting self-control. Before performing any tasks, some participants drank lemonade sweetened with sugar, which restores blood glucose, whereas others drank lemonade containing a sugar substitute. Only lemonade with sugar reduced the attraction effect. These results show one way in which the body (blood glucose) interacts with the mind (self-control and reliance on heuristics).
Heuristics to Evaluate Interactive Systems for Children with Autism Spectrum Disorder (ASD).
Khowaja, Kamran; Salim, Siti Salwah; Asemi, Adeleh
2015-01-01
In this paper, we adapted and expanded a set of guidelines, also known as heuristics, to evaluate the usability of software to now be appropriate for software aimed at children with autism spectrum disorder (ASD). We started from the heuristics developed by Nielsen in 1990 and developed a modified set of 15 heuristics. The first 5 heuristics of this set are the same as those of the original Nielsen set, the next 5 heuristics are improved versions of Nielsen's, whereas the last 5 heuristics are new. We present two evaluation studies of our new heuristics. In the first, two groups compared Nielsen's set with the modified set of heuristics, with each group evaluating two interactive systems. The Nielsen's heuristics were assigned to the control group while the experimental group was given the modified set of heuristics, and a statistical analysis was conducted to determine the effectiveness of the modified set, the contribution of 5 new heuristics and the impact of 5 improved heuristics. The results show that the modified set is significantly more effective than the original, and we found a significant difference between the five improved heuristics and their corresponding heuristics in the original set. The five new heuristics are effective in problem identification using the modified set. The second study was conducted using a system which was developed to ascertain if the modified set was effective at identifying usability problems that could be fixed before the release of software. The post-study analysis revealed that the majority of the usability problems identified by the experts were fixed in the updated version of the system.
Intuitive and deliberate judgments are based on common principles.
Kruglanski, Arie W; Gigerenzer, Gerd
2011-01-01
A popular distinction in cognitive and social psychology has been between intuitive and deliberate judgments. This juxtaposition has aligned in dual-process theories of reasoning associative, unconscious, effortless, heuristic, and suboptimal processes (assumed to foster intuitive judgments) versus rule-based, conscious, effortful, analytic, and rational processes (assumed to characterize deliberate judgments). In contrast, we provide convergent arguments and evidence for a unified theoretical approach to both intuitive and deliberative judgments. Both are rule-based, and in fact, the very same rules can underlie both intuitive and deliberate judgments. The important open question is that of rule selection, and we propose a 2-step process in which the task itself and the individual's memory constrain the set of applicable rules, whereas the individual's processing potential and the (perceived) ecological rationality of the rule for the task guide the final selection from that set. Deliberate judgments are not generally more accurate than intuitive judgments; in both cases, accuracy depends on the match between rule and environment: the rules' ecological rationality. Heuristics that are less effortful and in which parts of the information are ignored can be more accurate than cognitive strategies that have more information and computation. The proposed framework adumbrates a unified approach that specifies the critical dimensions on which judgmental situations may vary and the environmental conditions under which rules can be expected to be successful.
Whitaker, Roger M.; Colombo, Gualtiero B.; Allen, Stuart M.; Dunbar, Robin I. M.
2016-01-01
Cooperation is a fundamental human trait but our understanding of how it functions remains incomplete. Indirect reciprocity is a particular case in point, where one-shot donations are made to unrelated beneficiaries without any guarantee of payback. Existing insights are largely from two independent perspectives: i) individual-level cognitive behaviour in decision making, and ii) identification of conditions that favour evolution of cooperation. We identify a fundamental connection between these two areas by examining social comparison as a means through which indirect reciprocity can evolve. Social comparison is well established as an inherent human disposition through which humans navigate the social world by self-referential evaluation of others. Donating to those that are at least as reputable as oneself emerges as a dominant heuristic, which represents aspirational homophily. This heuristic is found to be implicitly present in the current knowledge of conditions that favour indirect reciprocity. The effective social norms for updating reputation are also observed to support this heuristic. We hypothesise that the cognitive challenge associated with social comparison has contributed to cerebral expansion and the disproportionate human brain size, consistent with the social complexity hypothesis. The findings have relevance for the evolution of autonomous systems that are characterised by one-shot interactions. PMID:27515119
Whitaker, Roger M; Colombo, Gualtiero B; Allen, Stuart M; Dunbar, Robin I M
2016-08-12
Cooperation is a fundamental human trait but our understanding of how it functions remains incomplete. Indirect reciprocity is a particular case in point, where one-shot donations are made to unrelated beneficiaries without any guarantee of payback. Existing insights are largely from two independent perspectives: i) individual-level cognitive behaviour in decision making, and ii) identification of conditions that favour evolution of cooperation. We identify a fundamental connection between these two areas by examining social comparison as a means through which indirect reciprocity can evolve. Social comparison is well established as an inherent human disposition through which humans navigate the social world by self-referential evaluation of others. Donating to those that are at least as reputable as oneself emerges as a dominant heuristic, which represents aspirational homophily. This heuristic is found to be implicitly present in the current knowledge of conditions that favour indirect reciprocity. The effective social norms for updating reputation are also observed to support this heuristic. We hypothesise that the cognitive challenge associated with social comparison has contributed to cerebral expansion and the disproportionate human brain size, consistent with the social complexity hypothesis. The findings have relevance for the evolution of autonomous systems that are characterised by one-shot interactions.
NASA Astrophysics Data System (ADS)
Whitaker, Roger M.; Colombo, Gualtiero B.; Allen, Stuart M.; Dunbar, Robin I. M.
2016-08-01
Cooperation is a fundamental human trait but our understanding of how it functions remains incomplete. Indirect reciprocity is a particular case in point, where one-shot donations are made to unrelated beneficiaries without any guarantee of payback. Existing insights are largely from two independent perspectives: i) individual-level cognitive behaviour in decision making, and ii) identification of conditions that favour evolution of cooperation. We identify a fundamental connection between these two areas by examining social comparison as a means through which indirect reciprocity can evolve. Social comparison is well established as an inherent human disposition through which humans navigate the social world by self-referential evaluation of others. Donating to those that are at least as reputable as oneself emerges as a dominant heuristic, which represents aspirational homophily. This heuristic is found to be implicitly present in the current knowledge of conditions that favour indirect reciprocity. The effective social norms for updating reputation are also observed to support this heuristic. We hypothesise that the cognitive challenge associated with social comparison has contributed to cerebral expansion and the disproportionate human brain size, consistent with the social complexity hypothesis. The findings have relevance for the evolution of autonomous systems that are characterised by one-shot interactions.
NASA Astrophysics Data System (ADS)
Kim, Byung Soo; Lee, Woon-Seek; Koh, Shiegheun
2012-07-01
This article considers an inbound ordering and outbound dispatching problem for a single product in a third-party warehouse, where the demands are dynamic over a discrete and finite time horizon and, moreover, each demand has a time window in which it must be satisfied. Replenishing orders are shipped in containers and the freight cost is proportional to the number of containers used. The problem is classified into two cases, i.e. the non-split demand case and the split demand case, and a mathematical model for each case is presented. An in-depth analysis of the models shows that they are very complicated and that it is difficult to find optimal solutions as the problem size becomes large. Therefore, genetic algorithm (GA) based heuristic approaches are designed to solve the problems in a reasonable time. Finally, to validate and evaluate the algorithms, some computational experiments are conducted.
A Hybrid Ant Colony Optimization Algorithm for the Extended Capacitated Arc Routing Problem.
Li-Ning Xing; Rohlfshagen, P; Ying-Wu Chen; Xin Yao
2011-08-01
The capacitated arc routing problem (CARP) is representative of numerous practical applications, and in order to widen its scope, we consider an extended version of this problem that entails both total service time and fixed investment costs. We subsequently propose a hybrid ant colony optimization (ACO) algorithm (HACOA) to solve instances of the extended CARP. This approach is characterized by the exploitation of heuristic information, adaptive parameters, and local optimization techniques: Two kinds of heuristic information, arc cluster information and arc priority information, are obtained continuously from the solutions sampled to guide the subsequent optimization process. The adaptive parameters ease the burden of choosing initial values and facilitate improved and more robust results. Finally, local optimization, based on the two-opt heuristic, is employed to improve the overall performance of the proposed algorithm. The resulting HACOA is tested on four sets of benchmark problems containing a total of 87 instances with up to 140 nodes and 380 arcs. In order to evaluate the effectiveness of the proposed method, some existing capacitated arc routing heuristics are extended to cope with the extended version of this problem; the experimental results indicate that the proposed ACO method outperforms these heuristics.
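The two-opt local optimization mentioned above can be sketched as follows: repeatedly reverse a segment of a tour whenever doing so shortens it. The coordinates below are invented and this is not the HACOA code.

    # Sketch of the 2-opt local improvement step applied to a single closed tour
    # (toy coordinates; illustration only).

    import math

    def tour_length(tour, pts):
        return sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
                   for i in range(len(tour)))

    def two_opt(tour, pts):
        improved = True
        while improved:
            improved = False
            for i in range(1, len(tour) - 1):
                for j in range(i + 1, len(tour)):
                    candidate = tour[:i] + tour[i:j][::-1] + tour[j:]   # reverse segment
                    if tour_length(candidate, pts) < tour_length(tour, pts):
                        tour, improved = candidate, True
        return tour

    pts = [(0, 0), (0, 2), (2, 2), (2, 0), (1, 1)]
    print(two_opt(list(range(len(pts))), pts))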
Evaluating Heuristics for Planning Effective and Efficient Inspections
NASA Technical Reports Server (NTRS)
Shull, Forrest J.; Seaman, Carolyn B.; Diep, Madeline M.; Feldmann, Raimund L.; Godfrey, Sara H.; Regardie, Myrna
2010-01-01
A significant body of knowledge concerning software inspection practice indicates that the value of inspections varies widely both within and across organizations. Inspection effectiveness and efficiency can be measured in numerous ways, and may be affected by a variety of factors such as inspection planning, the type of software, the developing organization, and many others. In the early 1990s, NASA formulated heuristics for inspection planning based on best practices and early NASA inspection data. Over the intervening years, the body of data from NASA inspections has grown. This paper describes a multi-faceted exploratory analysis performed on this data to elicit lessons learned in general about conducting inspections and to recommend improvements to the existing heuristics. The contributions of our results include support for modifying some of the original inspection heuristics (e.g. increasing the recommended page rate), evidence that inspection planners must choose between efficiency and effectiveness, as a good tradeoff between them may not exist, and identification of small subsets of inspections for which new inspection heuristics are needed. Most importantly, this work illustrates the value of collecting rich data on software inspections, and using it to gain insight into, and improve, inspection practice.
Zhang, Weizhe; Bai, Enci; He, Hui; Cheng, Albert M.K.
2015-01-01
Reducing energy consumption is becoming very important in order to extend battery life and lower overall operational costs for heterogeneous real-time multiprocessor systems. In this paper, we first formulate this as a combinatorial optimization problem. Then, a successful meta-heuristic, called the Shuffled Frog Leaping Algorithm (SFLA), is proposed to reduce the energy consumption. Precocity remission and local optimum avoidance techniques are proposed to avoid premature convergence and improve the solution quality. Convergence acceleration significantly reduces the search time. Experimental results show that the SFLA-based energy-aware meta-heuristic uses 30% less energy than the Ant Colony Optimization (ACO) algorithm, and 60% less energy than the Genetic Algorithm (GA). Remarkably, the running time of the SFLA-based meta-heuristic is 20 and 200 times less than that of ACO and GA, respectively, for finding the optimal solution. PMID:26110406
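The core frog-leaping update can be sketched on a toy continuous function: after ranking and shuffling the population into memeplexes, the worst frog in each memeplex jumps toward the memeplex best, then toward the global best, and is otherwise replaced at random. This is a generic illustration, not the energy-aware task-allocation algorithm of the paper; all parameter values are invented.

    # Minimal sketch of the shuffled frog leaping idea on a toy function (sphere).

    import random

    def sphere(x):
        return sum(v * v for v in x)

    def sfla(obj, dim=2, frogs=20, memeplexes=4, local_steps=5, shuffles=50, lo=-5, hi=5):
        pop = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(frogs)]
        for _ in range(shuffles):
            pop.sort(key=obj)                          # best frog first
            best_global = pop[0]
            plexes = [pop[i::memeplexes] for i in range(memeplexes)]   # round-robin split
            for plex in plexes:
                for _ in range(local_steps):
                    plex.sort(key=obj)
                    best, worst = plex[0], plex[-1]
                    for leader in (best, best_global):
                        step = [random.random() * (l - w) for l, w in zip(leader, worst)]
                        trial = [w + s for w, s in zip(worst, step)]
                        if obj(trial) < obj(worst):    # worst frog leaps toward a leader
                            plex[-1] = trial
                            break
                    else:                              # no improvement: random restart
                        plex[-1] = [random.uniform(lo, hi) for _ in range(dim)]
            pop = [frog for plex in plexes for frog in plex]   # shuffle memeplexes back
        return min(pop, key=obj)

    print(sfla(sphere))    # should approach [0, 0]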
A novel hybrid meta-heuristic technique applied to the well-known benchmark optimization problems
NASA Astrophysics Data System (ADS)
Abtahi, Amir-Reza; Bijari, Afsane
2017-03-01
In this paper, a hybrid meta-heuristic algorithm, based on imperialistic competition algorithm (ICA), harmony search (HS), and simulated annealing (SA) is presented. The body of the proposed hybrid algorithm is based on ICA. The proposed hybrid algorithm inherits the advantages of the process of harmony creation in HS algorithm to improve the exploitation phase of the ICA algorithm. In addition, the proposed hybrid algorithm uses SA to make a balance between exploration and exploitation phases. The proposed hybrid algorithm is compared with several meta-heuristic methods, including genetic algorithm (GA), HS, and ICA on several well-known benchmark instances. The comprehensive experiments and statistical analysis on standard benchmark functions certify the superiority of the proposed method over the other algorithms. The efficacy of the proposed hybrid algorithm is promising and can be used in several real-life engineering and management problems.
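The simulated-annealing ingredient used to balance exploration and exploitation can be illustrated by its acceptance rule: improving moves are always accepted, while worsening moves are accepted with a probability that decays as the temperature is lowered. The rule and cooling schedule below are generic, not the exact ones of the proposed hybrid.

    # Sketch of a Metropolis-style acceptance rule with geometric cooling
    # (generic illustration of the exploration/exploitation balance).

    import math
    import random

    def accept(current_cost, candidate_cost, temperature):
        if candidate_cost <= current_cost:
            return True                    # exploitation: always take improvements
        # exploration: occasionally accept a worse move, more often when "hot"
        return random.random() < math.exp((current_cost - candidate_cost) / temperature)

    temperature = 1.0
    for step in range(5):
        print(accept(current_cost=10.0, candidate_cost=10.5, temperature=temperature))
        temperature *= 0.8                 # geometric cooling schedule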
Heuristic and analytic processes in reasoning: an event-related potential study of belief bias.
Banks, Adrian P; Hope, Christopher
2014-03-01
Human reasoning involves both heuristic and analytic processes. This study of belief bias in relational reasoning investigated whether the two processes occur serially or in parallel. Participants evaluated the validity of problems in which the conclusions were either logically valid or invalid and either believable or unbelievable. Problems in which the conclusions presented a conflict between the logically valid response and the believable response elicited a more positive P3 than problems in which there was no conflict. This shows that P3 is influenced by the interaction of belief and logic rather than either of these factors on its own. These findings indicate that belief and logic influence reasoning at the same time, supporting models in which belief-based and logical evaluations occur in parallel but not theories in which belief-based heuristic evaluations precede logical analysis.
A New Powered Lower Limb Prosthesis Control Framework Based on Adaptive Dynamic Programming.
Wen, Yue; Si, Jennie; Gao, Xiang; Huang, Stephanie; Huang, He Helen
2017-09-01
This brief presents a novel application of adaptive dynamic programming (ADP) for optimal adaptive control of powered lower limb prostheses, a type of wearable robot that assists the motor function of limb amputees. Current control of these robotic devices typically relies on finite state impedance control (FS-IC), which lacks adaptability to the user's physical condition. As a result, joint impedance settings are often customized manually and heuristically in clinics, which greatly hinders the wide use of these advanced medical devices. This simulation study aimed at demonstrating the feasibility of ADP for automatic tuning of the twelve knee joint impedance parameters during a complete gait cycle to achieve balanced walking. Given that accurate models of human walking dynamics are difficult to obtain, model-free ADP control algorithms were considered. First, direct heuristic dynamic programming (dHDP) was applied to the control problem, and its performance was evaluated on OpenSim, an often-used dynamic walking simulator. For comparison purposes, we selected another established ADP algorithm, the neural fitted Q with continuous action (NFQCA). In both cases, the ADP controllers learned to control the right knee joint and achieved balanced walking, but dHDP outperformed NFQCA in this application over a test of 200 gait cycles.
OPTIMIZING USABILITY OF AN ECONOMIC DECISION SUPPORT TOOL: PROTOTYPE OF THE EQUIPT TOOL.
Cheung, Kei Long; Hiligsmann, Mickaël; Präger, Maximilian; Jones, Teresa; Józwiak-Hagymásy, Judit; Muñoz, Celia; Lester-George, Adam; Pokhrel, Subhash; López-Nicolás, Ángel; Trapero-Bertran, Marta; Evers, Silvia M A A; de Vries, Hein
2018-01-01
Economic decision-support tools can provide valuable information for tobacco control stakeholders, but their usability may impact the adoption of such tools. This study aims to illustrate a mixed-method usability evaluation of an economic decision-support tool for tobacco control, using the EQUIPT ROI tool prototype as a case study. A cross-sectional mixed-methods design was used, including a heuristic evaluation, a thinking-aloud approach, and a questionnaire testing and exploring the usability of the Return on Investment tool. A total of sixty-six users evaluated the tool (thinking aloud) and completed the questionnaire. For the heuristic evaluation, four experts evaluated the interface. In total, twenty-one percent of the respondents perceived good usability. A total of 118 usability problems were identified, of which twenty-six were categorized as most severe, indicating a high priority to fix them before implementation. Combining user-based and expert-based evaluation methods is recommended, as these were shown to identify unique usability problems. The evaluation provides input to optimize the usability of a decision-support tool, and may serve as a vantage point for other developers to conduct usability evaluations to refine similar tools before wide-scale implementation. Such studies could reduce implementation gaps by optimizing usability, enhancing in turn the research impact of such interventions.
Fast marching methods for the continuous traveling salesman problem.
Andrews, June; Sethian, J A
2007-01-23
We consider a problem in which we are given a domain, a cost function which depends on position at each point in the domain, and a subset of points ("cities") in the domain. The goal is to determine the cheapest closed path that visits each city in the domain once. This can be thought of as a version of the traveling salesman problem, in which an underlying known metric determines the cost of moving through each point of the domain, but in which the actual shortest path between cities is unknown at the outset. We describe algorithms for both a heuristic and an optimal solution to this problem. The worst-case complexity of the heuristic algorithm is O(M N log N), where M is the number of cities and N is the size of the computational mesh used to approximate the solutions to the shortest path problems. The average runtime of the heuristic algorithm is linear in the number of cities and O(N log N) in the size N of the mesh.
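As a simplified stand-in for the pipeline described above, the sketch below computes city-to-city travel costs through a position-dependent cost field with Dijkstra on a coarse 4-connected grid (the paper instead solves the continuous shortest-path problems with fast marching) and then orders the cities with a nearest-neighbour tour rather than an optimal one. The cost field and city positions are invented.

    # Stand-in sketch: grid shortest paths through a cost field, then a
    # nearest-neighbour city ordering (not the authors' fast marching method).

    import heapq

    def grid_dijkstra(cost, src):
        n, m = len(cost), len(cost[0])
        dist = {src: 0.0}
        heap = [(0.0, src)]
        while heap:
            d, (r, c) = heapq.heappop(heap)
            if d > dist.get((r, c), float("inf")):
                continue
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < n and 0 <= nc < m:
                    nd = d + 0.5 * (cost[r][c] + cost[nr][nc])   # edge cost: mean of cells
                    if nd < dist.get((nr, nc), float("inf")):
                        dist[(nr, nc)] = nd
                        heapq.heappush(heap, (nd, (nr, nc)))
        return dist

    cost = [[1, 1, 5, 1],
            [1, 9, 5, 1],
            [1, 1, 1, 1]]                       # position-dependent traversal cost
    cities = [(0, 0), (0, 3), (2, 2)]
    pair_cost = {a: grid_dijkstra(cost, a) for a in cities}

    tour, left = [cities[0]], set(cities[1:])   # nearest-neighbour city ordering
    while left:
        nxt = min(left, key=lambda c: pair_cost[tour[-1]][c])
        tour.append(nxt)
        left.remove(nxt)
    print(tour)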
Keresztes, Janos C; John Koshel, R; D'huys, Karlien; De Ketelaere, Bart; Audenaert, Jan; Goos, Peter; Saeys, Wouter
2016-12-26
A novel meta-heuristic approach for minimizing nonlinear constrained problems is proposed, which offers tolerance information during the search for the global optimum. The method is based on the concept of design and analysis of computer experiments combined with a novel two-phase design augmentation (DACEDA), which models the entire merit space using a Gaussian process, with iteratively increased resolution around the optimum. The algorithm is introduced through a series of case studies with increasing complexity for optimizing the uniformity of a short-wave infrared (SWIR) hyperspectral imaging (HSI) illumination system (IS). The method is first demonstrated for a two-dimensional problem consisting of the positioning of analytical isotropic point sources. The method is further applied to two-dimensional (2D) and five-dimensional (5D) SWIR HSI IS versions using close- and far-field measured source models applied within the non-sequential ray-tracing software FRED, including inherent stochastic noise. The proposed method is compared to other heuristic approaches such as simplex and simulated annealing (SA). It is shown that DACEDA converges towards a minimum with a 1% improvement compared to simplex and SA and, more importantly, requires only half the number of simulations. Finally, a concurrent tolerance analysis is done within DACEDA for the five-dimensional case such that further simulations are not required.
Heuristics to Evaluate Interactive Systems for Children with Autism Spectrum Disorder (ASD)
Khowaja, Kamran; Salim, Siti Salwah
2015-01-01
In this paper, we adapted and expanded a set of guidelines, also known as heuristics, to evaluate the usability of software to now be appropriate for software aimed at children with autism spectrum disorder (ASD). We started from the heuristics developed by Nielsen in 1990 and developed a modified set of 15 heuristics. The first 5 heuristics of this set are the same as those of the original Nielsen set, the next 5 heuristics are improved versions of Nielsen's, whereas the last 5 heuristics are new. We present two evaluation studies of our new heuristics. In the first, two groups compared Nielsen’s set with the modified set of heuristics, with each group evaluating two interactive systems. The Nielsen’s heuristics were assigned to the control group while the experimental group was given the modified set of heuristics, and a statistical analysis was conducted to determine the effectiveness of the modified set, the contribution of 5 new heuristics and the impact of 5 improved heuristics. The results show that the modified set is significantly more effective than the original, and we found a significant difference between the five improved heuristics and their corresponding heuristics in the original set. The five new heuristics are effective in problem identification using the modified set. The second study was conducted using a system which was developed to ascertain if the modified set was effective at identifying usability problems that could be fixed before the release of software. The post-study analysis revealed that the majority of the usability problems identified by the experts were fixed in the updated version of the system. PMID:26196385
Heuristics of reasoning and analogy in children's visual perspective taking.
Yaniv, I; Shatz, M
1990-10-01
We propose that children's reasoning about others' visual perspectives is guided by simple heuristics based on a perceiver's line of sight and salient features of the object met by that line. In 3 experiments employing a 2-perceiver analogy task, children aged 3-6 were generally better able to reproduce a perceiver's perspective if a visual cue in the perceiver's line of sight sufficed to distinguish it from alternatives. Children had greater difficulty when the task hinged on attending to configural cues. Availability of distinctive cues affixed on the objects' sides facilitated solution of the symmetrical orientations. These and several other related findings reported in the literature are traced to children's reliance on heuristics of reasoning.
Morsanyi, Kinga; Handley, Simon J
2008-01-01
We examined the relationship between cognitive capacity and heuristic responding on four types of reasoning and decision-making tasks. A total of 84 children, between 5 years 2 months and 11 years 7 months of age, participated in the study. There was a marked increase in heuristic responding with age that was related to increases in cognitive capacity. These findings are inconsistent with the predominant dual-process accounts of reasoning and decision making as applied to development. We offer an alternative explanation of the findings, considering them in the context of recent claims concerning the role of working memory in contextualized reasoning.
How Single-site Mutation Affects HP Lattice Proteins
NASA Astrophysics Data System (ADS)
Shi, Guangjie; Landau, David P.; Vogel, Thomas; Wüst, Thomas; Li, Ying Wai
2014-03-01
We developed a heuristic method based on Wang-Landau and multicanonical sampling for determining the ground-state degeneracy of HP lattice proteins. Our algorithm allowed the most precise estimations of the (sometimes substantial) ground-state degeneracies of some widely studied HP sequences. We investigated the effects of single-site mutation on specific long HP lattice proteins comprehensively, including structural changes in ground states, changes of ground-state degeneracy and thermodynamic properties of the systems. Both extremely sensitive and insensitive cases have been observed; consequently, properties such as specific heat, tortuosities, etc. may be either largely unaffected or may change significantly due to mutation. More interestingly, mutation can even induce a lower ground-state energy in a few cases. Supported by NSF.
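The Wang-Landau idea behind the heuristic method can be sketched on a much simpler toy system: L independent binary spins whose energy is the number of "up" spins, so the exact density of states is a binomial coefficient. The sketch shows only the flat-histogram random walk; it is far simpler than the HP-protein algorithm of the abstract, and all parameters are invented.

    # Minimal Wang-Landau sketch estimating the density of states g(E) of a toy
    # system (L independent binary spins; E = number of "up" spins).

    import math
    import random

    def wang_landau(L=8, flatness=0.8, f_final=1e-4):
        spins = [0] * L
        E = 0
        log_g = [0.0] * (L + 1)        # running estimate of ln g(E)
        hist = [0] * (L + 1)
        log_f = 1.0                    # modification factor, reduced when histogram is flat
        while log_f > f_final:
            i = random.randrange(L)
            dE = 1 if spins[i] == 0 else -1
            # Accept the flip with probability min(1, g(E)/g(E + dE)).
            if random.random() < math.exp(log_g[E] - log_g[E + dE]):
                spins[i] ^= 1
                E += dE
            log_g[E] += log_f
            hist[E] += 1
            if min(hist) > flatness * (sum(hist) / len(hist)):   # flat histogram?
                hist = [0] * (L + 1)
                log_f /= 2.0
        return log_g

    lg = wang_landau()
    # Normalize so that g(0) = 1 (single all-down state); values approximate C(L, E).
    print([round(math.exp(v - lg[0]), 1) for v in lg])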
Robust registration in case of different scaling
NASA Astrophysics Data System (ADS)
Gluhchev, Georgi J.; Shalev, Shlomo
1993-09-01
The problem of robust registration in the case of anisotropic scaling has been investigated. Registration of two images using corresponding sets of fiducial points is sensitive to inaccuracies in point placement due to poor image quality or non-rigid distortions, including possible out-of-plane rotations. An approach aimed at the detection of the most unreliable points has been developed. It is based on the a priori knowledge of the sequential ordering of rotation and scaling. A measure of guilt derived from the anomalous geometric relationships is introduced. A heuristic decision rule allowing for deletion of the most guilty points is proposed. The approach allows for more precise evaluation of the translation vector. It has been tested on phantom images with known parameters and has shown satisfactory results.
Falcão, Jorge Tarcísio Da Rocha; Hazin, Izabel
2012-03-01
Køppe's proposition of four layers in theory building is used here in the exploration of a specific case of eclectic combination: the use of Piagetian and Vygotskian general approaches in the analysis of proportional reasoning as a cognitive mathematical ability. It is proposed here that the eclectic consideration of these contributions depends on the consideration of their specificity, in the sense that they highlight different aspects of the phenomenon under scrutiny, and also on the coherence between this eclectic convergence and the premises of the schools of thought under which each contribution is framed. We conclude by proposing, in accordance with S. Køppe's proposal, that eclecticism can be valuable and heuristic in theory development, but this contribution will depend largely on the effort to establish careful relations between the four layers of theory-building.
ERIC Educational Resources Information Center
Smith, Douglas A.; Coleman, Dawn
2018-01-01
This intrinsic case study explored organizational readiness to implement a campus-wide technology initiative. Specifically, this research examined a rural community college's implementation of an "iPad campus" initiative in which all students, faculty, and staff were required to adopt iPad technology. We apply a heuristic for…
Heuristic Evaluation of Online COPD Respiratory Therapy and Education Video Resource Center
Chaney, Beth; Chaney, Don
2014-01-01
Purpose: Because of limited accessibility to pulmonary rehabilitation programs, patients with chronic obstructive pulmonary disease (COPD) are infrequently provided with patient education resources. To help educate patients with COPD on how to live a better life with diminished breathing capacity, we developed a novel social media resource center containing COPD respiratory therapy and education videos called “COPDFlix.” Methodology: A heuristic evaluation of COPDFlix was conducted as part of a larger study to determine whether the prototype was successful in adhering to formal Web site usability guidelines for older adults. A purposive sample of three experts, with expertise in Web design and health communications technology, was recruited (a) to identify usability violations and (b) to propose solutions to improve the functionality of the COPDFlix prototype. Each expert evaluated 18 heuristics in four categories of task-based criteria (i.e., interaction and navigation, information architecture, presentation design, and information design). Seventy-six subcriteria across these four categories were assessed. Quantitative ratings and qualitative comments from each expert were compiled into a single master list, noting the violated heuristic and type/location of problem(s). Results: Sixty-one usability violations were identified across the 18 heuristics. Evaluators rated the majority of heuristic subcriteria as either a “minor hindrance” (n=32) or “no problem” (n=132). Moreover, only 2 of the 18 heuristic categories were noted as “major” violations, with mean severity scores of ≥3. Conclusions: Mixed-methods data analysis helped the multidisciplinary research team to categorize and prioritize usability problems and solutions, leading to 26 discrete design modifications within the COPDFlix prototype. PMID:24650318
NASA Astrophysics Data System (ADS)
Huber, A.; Chankin, A. V.
2017-06-01
A simple two-point representation of the tokamak scrape-off layer (SOL) in the conduction-limited regime, based on the parallel and perpendicular energy balance equations in combination with the heat flux width predicted by a heuristic drift-based model, was used to derive a scaling for the cross-field thermal diffusivity χ⊥. For fixed plasma shape, and neglecting weak power dependences with exponent 1/8, the scaling χ⊥ ∝ P_SOL/(n B_θ R²) is derived.
Hermawati, Setia; Lawson, Glyn
2016-09-01
Heuristic evaluation is frequently employed to evaluate usability. While general heuristics are suitable for evaluating most user interfaces, there is still a need to establish heuristics for specific domains to ensure that their specific usability issues are identified. This paper presents a comprehensive review of 70 studies related to usability heuristics for specific domains. The aim of this paper is to review the processes that were applied to establish heuristics in specific domains and identify gaps in order to provide recommendations for future research and areas of improvement. The most urgent issue found is the deficiency of validation effort following the proposition of heuristics and the lack of robustness and rigour of the validation methods adopted. Whether domain-specific heuristics perform better or worse than general ones remains inconclusive, owing to the lack of validation quality and of clarity on how to assess the effectiveness of heuristics for specific domains. The lack of validation quality also affects efforts to improve existing heuristics for specific domains, as their weaknesses are not addressed. Copyright © 2016 Elsevier Ltd. All rights reserved.
The affect heuristic in occupational safety.
Savadori, Lucia; Caovilla, Jessica; Zaniboni, Sara; Fraccaroli, Franco
2015-07-08
The affect heuristic is a rule of thumb according to which, in the process of making a judgment or decision, people use affect as a cue. If a stimulus elicits positive affect, then risks associated with that stimulus are viewed as low and benefits as high; conversely, if the stimulus elicits negative affect, then risks are perceived as high and benefits as low. The basic tenet of this study is that the affect heuristic guides workers' judgment and decision making in risk situations: the more workers like their organization, the lower they will perceive the risks to be. A sample of 115 employers and 65 employees working in small family agricultural businesses completed a questionnaire measuring perceived safety costs, psychological safety climate, affective commitment and safety compliance. A multi-sample structural analysis supported the thesis that safety compliance can be explained through affect-based heuristic reasoning, but only for employers. Positive affective commitment towards their family business reduced employers' compliance with safety procedures by increasing the perceived cost of implementing them.
Approaches to eliminate waste and reduce cost for recycling glass.
Chao, Chien-Wen; Liao, Ching-Jong
2011-12-01
In recent years, the issue of environmental protection has received considerable attention. This paper adds to the literature by investigating a scheduling problem in the manufacturing of a glass recycling factory in Taiwan. The objective is to minimize the sum of the total holding cost and loss cost. We first represent the problem as an integer programming (IP) model, and then develop two heuristics based on the IP model to find near-optimal solutions for the problem. To validate the proposed heuristics, comparisons between optimal solutions from the IP model and solutions from the current method are conducted. The comparisons involve two problem sizes, small and large, where the small problems range from 15 to 45 jobs, and the large problems from 50 to 100 jobs. Finally, a genetic algorithm is applied to evaluate the proposed heuristics. Computational experiments show that the proposed heuristics can find good solutions in a reasonable time for the considered problem. Copyright © 2011 Elsevier Ltd. All rights reserved.
Neural model of gene regulatory network: a survey on supportive meta-heuristics.
Biswas, Surama; Acharyya, Sriyankar
2016-06-01
Gene regulatory network (GRN) is produced as a result of regulatory interactions between different genes through their coded proteins in cellular context. Having immense importance in disease detection and drug finding, GRN has been modelled through various mathematical and computational schemes and reported in survey articles. Neural and neuro-fuzzy models have been the focus of attraction in bioinformatics. Predominant use of meta-heuristic algorithms in training neural models has proved its excellence. Considering these facts, this paper is organized to survey neural modelling schemes of GRN and the efficacy of meta-heuristic algorithms towards parameter learning (i.e. weighting connections) within the model. This survey paper presents two different structure-related approaches to infer GRN, namely the global structure approach and the substructure approach. It also describes two neural modelling schemes, namely artificial neural network/recurrent neural network based modelling and neuro-fuzzy modelling. The meta-heuristic algorithms applied so far to learn the structure and parameters of neurally modelled GRN have been reviewed here.
Determining the optimal number of Kanban in multi-products supply chain system
NASA Astrophysics Data System (ADS)
Widyadana, G. A.; Wee, H. M.; Chang, Jer-Yuan
2010-02-01
Kanban, a key element of the just-in-time system, is a re-order card or signboard giving instruction or triggering the pull system to manufacture or supply a component based on actual usage of material. There are two types of Kanban: production Kanban and withdrawal Kanban. This study uses optimal and meta-heuristic methods to determine the Kanban quantity and withdrawal lot sizes in a supply chain system. Although the mixed integer programming (MIP) method gives an optimal solution, it is not time efficient. For this reason, meta-heuristic methods are suggested. In this study, a genetic algorithm (GA) and a hybrid of genetic algorithm and simulated annealing (GASA) are used. The study compares the performance of GA and GASA with that of the optimal method using MIP. The given problems show that both GA and GASA result in near-optimal solutions, and they outdo the optimal method in terms of run time. In addition, the GASA heuristic method gives a better performance than the GA heuristic method.
Heuristic algorithms for the minmax regret flow-shop problem with interval processing times.
Ćwik, Michał; Józefczyk, Jerzy
2018-01-01
An uncertain version of the permutation flow-shop with unlimited buffers and the makespan as a criterion is considered. The investigated parametric uncertainty is represented by given interval-valued processing times. The maximum regret is used for the evaluation of uncertainty. Consequently, the minmax regret discrete optimization problem is solved. Due to its high complexity, two relaxations are applied to simplify the optimization procedure. First of all, a greedy procedure is used for calculating the criterion's value, as such a calculation is an NP-hard problem in itself. Moreover, the lower bound is used instead of solving the internal deterministic flow-shop. A constructive heuristic algorithm is applied to the relaxed optimization problem. The algorithm is compared with other, previously elaborated heuristic algorithms based on the evolutionary and the middle-interval approaches. The conducted computational experiments showed the advantage of the constructive heuristic algorithm with regard to both the criterion and the time of computations. The Wilcoxon paired-rank statistical test confirmed this conclusion.
Kneale, Laura; Mikles, Sean; Choi, Yong K; Thompson, Hilaire; Demiris, George
2017-09-01
Using heuristics to evaluate user experience is a common methodology for human-computer interaction studies. One challenge of this method is the inability to tailor results towards specific end-user needs. This manuscript reports on a method that uses validated scenarios and personas of older adults and care team members to enhance heuristics evaluations of the usability of commercially available personal health records for homebound older adults. Our work extends the Chisnell and Redish heuristic evaluation methodology by using a protocol that relies on multiple expert reviews of each system. It further standardizes the heuristic evaluation process through the incorporation of task-based scenarios. We were able to use the modified version of the Chisnell and Redish heuristic evaluation methodology to identify potential usability challenges of two commercially available personal health record systems. This allowed us to: (1) identify potential usability challenges for specific types of users, (2) describe improvements that would be valuable to all end-users of the system, and (3) better understand how the interactions of different users may vary within a single personal health record. The methodology described in this paper may help designers of consumer health information technology tools, such as personal health records, understand the needs of diverse end-user populations. Such methods may be particularly helpful when designing systems for populations that are difficult to recruit for end-user evaluations through traditional methods. Copyright © 2017 Elsevier Inc. All rights reserved.
More than one way to see it: Individual heuristics in avian visual computation
Ravignani, Andrea; Westphal-Fitch, Gesche; Aust, Ulrike; Schlumpp, Martin M.; Fitch, W. Tecumseh
2015-01-01
Comparative pattern learning experiments investigate how different species find regularities in sensory input, providing insights into cognitive processing in humans and other animals. Past research has focused either on one species’ ability to process pattern classes or different species’ performance in recognizing the same pattern, with little attention to individual and species-specific heuristics and decision strategies. We trained and tested two bird species, pigeons (Columba livia) and kea (Nestor notabilis, a parrot species), on visual patterns using touch-screen technology. Patterns were composed of several abstract elements and had varying degrees of structural complexity. We developed a model selection paradigm, based on regular expressions, that allowed us to reconstruct the specific decision strategies and cognitive heuristics adopted by a given individual in our task. Individual birds showed considerable differences in the number, type and heterogeneity of heuristic strategies adopted. Birds’ choices also exhibited consistent species-level differences. Kea adopted effective heuristic strategies, based on matching learned bigrams to stimulus edges. Individual pigeons, in contrast, adopted an idiosyncratic mix of strategies that included local transition probabilities and global string similarity. Although performance was above chance and quite high for kea, no individual of either species provided clear evidence of learning exactly the rule used to generate the training stimuli. Our results show that similar behavioral outcomes can be achieved using dramatically different strategies and highlight the dangers of combining multiple individuals in a group analysis. These findings, and our general approach, have implications for the design of future pattern learning experiments, and the interpretation of comparative cognition research more generally. PMID:26113444
Tschandl, P; Kittler, H; Schmid, K; Zalaudek, I; Argenziano, G
2015-06-01
There are two strategies to approach the dermatoscopic diagnosis of pigmented skin tumours, namely the verbal-based analytic and the more visual-global heuristic method. It is not known whether one or the other is more efficient in teaching dermatoscopy. To compare two teaching methods in short-term training of dermatoscopy to medical students. Fifty-seven medical students in the last year of the curriculum were given a 1-h lecture of either heuristic-based or analytic-based teaching of dermatoscopy. Before and after this session, they were shown the same 50 lesions and asked to diagnose them and rate the chance of malignancy. Test lesions consisted of melanomas, basal cell carcinomas, nevi, seborrhoeic keratoses, benign vascular tumours and dermatofibromas. Performance measures were diagnostic accuracy regarding malignancy, as measured by the area under receiver operating characteristic curves (range: 0-1), as well as per cent correct diagnoses (range: 0-100%). Diagnostic accuracy and per cent correct diagnoses increased by +0.21 and +32.9% (heuristic teaching) and +0.19 and +35.7% (analytic teaching), respectively (P for all <0.001). Neither for diagnostic accuracy (P = 0.585) nor for per cent correct diagnoses (P = 0.298) was there a difference between the two groups. Short-term training of dermatoscopy to medical students allows significant improvement in diagnostic abilities. Choosing a heuristic or analytic method does not influence this effect in short training using common pigmented skin lesions. © 2014 European Academy of Dermatology and Venereology.
Shirazi, Mohammadali; Dhavala, Soma Sekhar; Lord, Dominique; Geedipally, Srinivas Reddy
2017-10-01
Safety analysts usually use post-modeling methods, such as the Goodness-of-Fit statistics or the Likelihood Ratio Test, to decide between two or more competitive distributions or models. Such metrics require all competitive distributions to be fitted to the data before any comparisons can be accomplished. Given the continuous growth in introducing new statistical distributions, choosing the best one using such post-modeling methods is not a trivial task, in addition to all the theoretical or numerical issues the analyst may face during the analysis. Furthermore, and most importantly, these measures or tests do not provide any intuition into why a specific distribution (or model) is preferred over another (Goodness-of-Logic). This paper addresses these issues by proposing a methodology to design heuristics for model selection based on the characteristics of data, in terms of descriptive summary statistics, before fitting the models. The proposed methodology employs two analytic tools: (1) Monte-Carlo Simulations and (2) Machine Learning Classifiers, to design easy heuristics to predict the label of the 'most-likely-true' distribution for analyzing data. The proposed methodology was applied to investigate when the recently introduced Negative Binomial Lindley (NB-L) distribution is preferred over the Negative Binomial (NB) distribution. Heuristics were designed to select the 'most-likely-true' distribution between these two distributions, given a set of prescribed summary statistics of the data. The proposed heuristics were successfully compared against classical tests for several real or observed datasets. Not only are they easy to use and free of any post-modeling inputs, but, using these heuristics, the analyst can also attain useful information about why the NB-L is preferred over the NB - or vice versa - when modeling data. Copyright © 2017 Elsevier Ltd. All rights reserved.
BCI Control of Heuristic Search Algorithms
Cavazza, Marc; Aranyi, Gabor; Charles, Fred
2017-01-01
The ability to develop Brain-Computer Interfaces (BCI) to Intelligent Systems would offer new perspectives in terms of human supervision of complex Artificial Intelligence (AI) systems, as well as supporting new types of applications. In this article, we introduce a basic mechanism for the control of heuristic search through fNIRS-based BCI. The rationale is that heuristic search is not only a basic AI mechanism but also one still at the heart of many different AI systems. We investigate how users’ mental disposition can be harnessed to influence the performance of heuristic search algorithm through a mechanism of precision-complexity exchange. From a system perspective, we use weighted variants of the A* algorithm which have an ability to provide faster, albeit suboptimal solutions. We use recent results in affective BCI to capture a BCI signal, which is indicative of a compatible mental disposition in the user. It has been established that Prefrontal Cortex (PFC) asymmetry is strongly correlated to motivational dispositions and results anticipation, such as approach or even risk-taking, and that this asymmetry is amenable to Neurofeedback (NF) control. Since PFC asymmetry is accessible through fNIRS, we designed a BCI paradigm in which users vary their PFC asymmetry through NF during heuristic search tasks, resulting in faster solutions. This is achieved through mapping the PFC asymmetry value onto the dynamic weighting parameter of the weighted A* (WA*) algorithm. We illustrate this approach through two different experiments, one based on solving 8-puzzle configurations, and the other on path planning. In both experiments, subjects were able to speed up the computation of a solution through a reduction of search space in WA*. Our results establish the ability of subjects to intervene in heuristic search progression, with effects which are commensurate to their control of PFC asymmetry: this opens the way to new mechanisms for the implementation of hybrid cognitive systems. PMID:28197092
Case-based clinical reasoning in feline medicine: 2: Managing cognitive error.
Canfield, Paul J; Whitehead, Martin L; Johnson, Robert; O'Brien, Carolyn R; Malik, Richard
2016-03-01
This is Article 2 of a three-part series on clinical reasoning that encourages practitioners to explore and understand how they think and make case-based decisions. It is hoped that, in the process, they will learn to trust their intuition but, at the same time, put in place safeguards to diminish the impact of bias and misguided logic on their diagnostic decision-making. Article 1, published in the January 2016 issue of JFMS, discussed the relative merits and shortcomings of System 1 thinking (immediate and unconscious) and System 2 thinking (effortful and analytical). This second article examines ways of managing cognitive error, particularly the negative impact of bias, when making a diagnosis. Article 3, to appear in the May 2016 issue, explores the use of heuristics (mental short cuts) and illness scripts in diagnostic reasoning. © The Author(s) 2016.
Design of Composite Structures Using Knowledge-Based and Case Based Reasoning
NASA Technical Reports Server (NTRS)
Lambright, Jonathan Paul
1996-01-01
A method of using knowledge based and case based reasoning to assist designers during conceptual design tasks of composite structures was proposed. The cooperative use of heuristics, procedural knowledge, and previous similar design cases suggests a potential reduction in design cycle time and ultimately product lead time. The hypothesis of this work is that the design process of composite structures can be improved by using Case-Based Reasoning (CBR) and Knowledge-Based (KB) reasoning in the early design stages. The technique of using knowledge-based and case-based reasoning facilitates the gathering of disparate information into one location that is easily and readily available. The method suggests that the inclusion of downstream life-cycle issues into the conceptual design phase reduces the potential for defective and sub-optimal composite structures. Three industry experts were interviewed extensively. The experts provided design rules, previous design cases, and test problems. A Knowledge Based Reasoning system was developed using the CLIPS (C Language Integrated Production System) environment and a Case Based Reasoning System was developed using the Design Memory Utility For Sharing Experiences (MUSE) environment. A Design Characteristic State (DCS) was used to document the design specifications, constraints, and problem areas using attribute-value pair relationships. The DCS provided consistent design information between the knowledge base and case base. Results indicated that the use of knowledge based and case based reasoning provided a robust design environment for composite structures. The knowledge base provided design guidance from well defined rules and procedural knowledge. The case base provided suggestions on design and manufacturing techniques based on previous similar designs, along with warnings of potential problems and pitfalls. The case base complemented the knowledge base and extended the problem solving capability beyond the existence of limited well defined rules. The findings indicated that the technique is most effective when used as a design aid and not as a tool to totally automate the composites design process. Other areas of application and implications for future research are discussed.
Pitfalls in Teaching Judgment Heuristics
ERIC Educational Resources Information Center
Shepperd, James A.; Koch, Erika J.
2005-01-01
Demonstrations of judgment heuristics typically focus on how heuristics can lead to poor judgments. However, exclusive focus on the negative consequences of heuristics can prove problematic. We illustrate the problem with the representativeness heuristic and present a study (N = 45) that examined how examples influence understanding of the…
A single cognitive heuristic process meets the complexity of domain-specific moral heuristics.
Dubljević, Veljko; Racine, Eric
2014-10-01
The inherence heuristic (a) offers modest insights into the complex nature of both the is-ought tension in moral reasoning and moral reasoning per se, and (b) does not reflect the complexity of domain-specific moral heuristics. Formal and general in nature, we contextualize the process described as "inherence heuristic" in a web of domain-specific heuristics (e.g., agent specific; action specific; consequences specific).
ERIC Educational Resources Information Center
Marks, Shaylyn Barrie
2013-01-01
This study demonstrates the need for the integration of multiculturalism in the K-12 curriculum as well as provides a heuristic, based on the work conducted by Sleeter and Grant (2009), to evaluate literature for level of multiculturalism. In the study, the researcher uses an evaluative heuristic to critically analyze and evaluate ten of the books…
Moving multiple sinks through wireless sensor networks for lifetime maximization.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Petrioli, Chiara; Carosi, Alessio; Basagni, Stefano
2008-01-01
Unattended sensor networks typically watch for some phenomena such as volcanic events, forest fires, pollution, or movements in animal populations. Sensors report to a collection point periodically or when they observe reportable events. When sensors are too far from the collection point to communicate directly, other sensors relay messages for them. If the collection point location is static, sensor nodes that are closer to the collection point relay far more messages than those on the periphery. Assuming all sensor nodes have roughly the same capabilities, those with a high relay burden experience battery failure much faster than the rest of the network. However, since their death disconnects the live nodes from the collection point, the whole network is then dead. We consider the problem of moving a set of collectors (sinks) through a wireless sensor network to balance the energy used for relaying messages, maximizing the lifetime of the network. We show how to compute an upper bound on the lifetime for any instance using linear and integer programming. We present a centralized heuristic that produces sink movement schedules yielding network lifetimes within 1.4% of the upper bound for realistic settings. We also present a distributed heuristic that produces lifetimes at most 25.3% below the upper bound. More specifically, we formulate a linear program (LP) that is a relaxation of the scheduling problem. The variables are naturally continuous, but the LP relaxes some constraints. The LP has an exponential number of constraints, but we can satisfy them all by enforcing only a polynomial number using a separation algorithm. This separation algorithm is a p-median facility location problem, which we can solve efficiently in practice for huge instances using integer programming technology. This LP selects a set of good sensor configurations. Given the solution to the LP, we can find a feasible schedule by selecting a subset of these configurations, ordering them via a traveling salesman heuristic, and computing feasible transitions using matching algorithms. This algorithm assumes sinks can get a schedule from a central server or a leader sink. If the network owner prefers that the sinks make independent decisions, they can use our distributed heuristic. In this heuristic, sinks maintain estimates of the energy distribution in the network and move greedily (with some coordination) based on local search. This application uses the new SUCASA (Solver Utility for Customization with Automatic Symbol Access) facility within the PICO (Parallel Integer and Combinatorial Optimizer) integer programming solver system. SUCASA allows rapid development of customized math programming (search-based) solvers using a problem's natural multidimensional representation. In this case, SUCASA also significantly improves runtime compared to implementations in the AMPL math programming language or in Perl.
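The distributed heuristic is described only at a high level above; the sketch below is one plausible reading of it, with each sink greedily stepping toward the most energy-stressed sensor it considers local. The data layout, neighbourhood rule, and step size are assumptions, not the authors' algorithm.

```python
import math

def greedy_sink_moves(sinks, sensors, energy_spent, step=1.0):
    """One round of local-search sink relocation (illustrative sketch).

    sinks        : list of (x, y) sink positions
    sensors      : list of (x, y) sensor positions
    energy_spent : cumulative relay energy per sensor (same order as sensors)
    Each sink takes a small step toward the most energy-stressed sensor that is
    currently closer to it than to any other sink, to spread the relay load.
    """
    new_positions = []
    for si, (sx, sy) in enumerate(sinks):
        # sensors whose nearest sink is this one (a crude stand-in for "local" knowledge)
        local = [i for i, (x, y) in enumerate(sensors)
                 if min(range(len(sinks)),
                        key=lambda k: math.hypot(x - sinks[k][0], y - sinks[k][1])) == si]
        if not local:
            new_positions.append((sx, sy))
            continue
        # move toward the locally most burdened sensor
        target = max(local, key=lambda i: energy_spent[i])
        tx, ty = sensors[target]
        d = math.hypot(tx - sx, ty - sy) or 1.0
        new_positions.append((sx + step * (tx - sx) / d, sy + step * (ty - sy) / d))
    return new_positions
```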
Ecologically rational choice and the structure of the environment.
Pleskac, Timothy J; Hertwig, Ralph
2014-10-01
In life, risk is reward and vice versa. Unfortunately, the big rewards people desire are relatively unlikely to occur. This relationship between risk and reward or probabilities and payoffs seems obvious to the financial community and to laypeople alike. Yet theories of decision making have largely ignored it. We conducted an ecological analysis of life's gambles, ranging from the domains of roulette and life insurance to scientific publications and artificial insemination. Across all domains, payoffs and probabilities proved intimately tied, with payoff magnitudes signaling their probabilities. In some cases, the constraints of the market result in these two core elements of choice being related via a power function; in other cases, other factors such as social norms appear to produce the inverse relationship between risks and rewards. We offer evidence that decision makers exploit this relationship in the form of a heuristic--the risk-reward heuristic--to infer the probability of a payoff during decisions under uncertainty. We demonstrate how the heuristic can help explain observed ambiguity aversion. We further show how this ecological relationship can inform other aspects of decision making, particularly the approach of using monetary lotteries to study choice under risk and uncertainty. Taken together, these findings suggest that theories of decision making need to model not only the decision process but also the environment to which the process is adapted.
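As an illustration of the proposed relationship, the sketch below infers a payoff's probability from its magnitude through an assumed power function; the constant and exponent are hypothetical, not estimated from the authors' data.

```python
def risk_reward_probability(payoff, k=1.0, alpha=1.0):
    """Risk-reward heuristic sketch: infer p from payoff x via p ~ k * x**(-alpha),
    capped at 1. Larger promised payoffs are read as less probable."""
    return min(1.0, k * payoff ** (-alpha))

# A $2 payoff is judged far more likely than a $1000 payoff (illustrative parameters).
print(risk_reward_probability(2.0), risk_reward_probability(1000.0))
```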
A multilevel probabilistic beam search algorithm for the shortest common supersequence problem.
Gallardo, José E
2012-01-01
The shortest common supersequence problem is a classical problem with many applications in different fields such as planning, Artificial Intelligence and especially Bioinformatics. Due to its NP-hardness, we cannot expect to solve this problem efficiently using conventional exact techniques. This paper presents a heuristic to tackle this problem, based on the use, at different levels, of a probabilistic variant of a classical heuristic known as Beam Search. The proposed algorithm is empirically analysed and compared to current approaches in the literature. Experiments show that it provides better quality solutions in a reasonable time for medium and large instances of the problem. For very large instances, our heuristic also provides better solutions, but the required execution times may increase considerably.
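A compact sketch of a probabilistic Beam Search applied to the shortest common supersequence, to make the general idea concrete; the state representation, scoring weights, and beam width are assumptions rather than the authors' algorithm.

```python
import random

def scs_beam(strings, beam_width=10, seed=0):
    """Probabilistic beam-search sketch for a short common supersequence.

    A state is (indices, sup): indices[i] characters of strings[i] are already
    covered by the partial supersequence sup. Each step appends one symbol and
    advances every string whose next uncovered character matches it.
    """
    rng = random.Random(seed)
    alphabet = sorted({c for s in strings for c in s})
    if not alphabet:
        return ""
    beam = [(tuple(0 for _ in strings), "")]
    for _ in range(sum(len(s) for s in strings)):      # generous bound on steps
        candidates = []
        for indices, sup in beam:
            for sym in alphabet:
                new = tuple(i + 1 if i < len(s) and s[i] == sym else i
                            for s, i in zip(strings, indices))
                if new == indices:                      # symbol advances nothing: skip
                    continue
                if all(i == len(s) for s, i in zip(strings, new)):
                    return sup + sym                    # first completed supersequence
                candidates.append((new, sup + sym))
        # probabilistic pruning: favour states with less remaining work
        weights = [1.0 / (1 + max(len(s) - i for s, i in zip(strings, ind)))
                   for ind, _ in candidates]
        beam = rng.choices(candidates, weights=weights,
                           k=min(beam_width, len(candidates)))
    return None

print(scs_beam(["abcbdab", "bdcaba"]))   # a valid (not necessarily shortest) supersequence
```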
An Empirical Comparison of Seven Iterative and Evolutionary Function Optimization Heuristics
NASA Technical Reports Server (NTRS)
Baluja, Shumeet
1995-01-01
This report is a repository of the results obtained from a large scale empirical comparison of seven iterative and evolution-based optimization heuristics. Twenty-seven static optimization problems, spanning six sets of problem classes which are commonly explored in the genetic algorithm literature, are examined. The problem sets include job-shop scheduling, traveling salesman, knapsack, bin-packing, neural network weight optimization, and standard numerical optimization. The search spaces in these problems range from 2^368 to 2^2040. The results indicate that using genetic algorithms for the optimization of static functions does not yield a benefit, in terms of the final answer obtained, over simpler optimization heuristics. The algorithms tested and the encodings of the problems are described in detail for reproducibility.
How cognitive heuristics can explain social interactions in spatial movement.
Seitz, Michael J; Bode, Nikolai W F; Köster, Gerta
2016-08-01
The movement of pedestrian crowds is a paradigmatic example of collective motion. The precise nature of individual-level behaviours underlying crowd movements has been subject to a lively debate. Here, we propose that pedestrians follow simple heuristics rooted in cognitive psychology, such as 'stop if another step would lead to a collision' or 'follow the person in front'. In other words, our paradigm explicitly models individual-level behaviour as a series of discrete decisions. We show that our cognitive heuristics produce realistic emergent crowd phenomena, such as lane formation and queuing behaviour. Based on our results, we suggest that pedestrians follow different cognitive heuristics that are selected depending on the context. This differs from the widely used approach of capturing changes in behaviour via model parameters and leads to testable hypotheses on changes in crowd behaviour for different motivation levels. For example, we expect that rushed individuals more often evade to the side and thus display distinct emergent queue formations in front of a bottleneck. Our heuristics can be ranked according to the cognitive effort that is required to follow them. Therefore, our model establishes a direct link between behavioural responses and cognitive effort and thus facilitates a novel perspective on collective behaviour. © 2016 The Author(s).
How cognitive heuristics can explain social interactions in spatial movement
Köster, Gerta
2016-01-01
The movement of pedestrian crowds is a paradigmatic example of collective motion. The precise nature of individual-level behaviours underlying crowd movements has been subject to a lively debate. Here, we propose that pedestrians follow simple heuristics rooted in cognitive psychology, such as ‘stop if another step would lead to a collision’ or ‘follow the person in front’. In other words, our paradigm explicitly models individual-level behaviour as a series of discrete decisions. We show that our cognitive heuristics produce realistic emergent crowd phenomena, such as lane formation and queuing behaviour. Based on our results, we suggest that pedestrians follow different cognitive heuristics that are selected depending on the context. This differs from the widely used approach of capturing changes in behaviour via model parameters and leads to testable hypotheses on changes in crowd behaviour for different motivation levels. For example, we expect that rushed individuals more often evade to the side and thus display distinct emergent queue formations in front of a bottleneck. Our heuristics can be ranked according to the cognitive effort that is required to follow them. Therefore, our model establishes a direct link between behavioural responses and cognitive effort and thus facilitates a novel perspective on collective behaviour. PMID:27581483
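The two quoted rules lend themselves to a very small discrete-decision sketch; the geometry, step size, and collision radius below are illustrative assumptions, not the authors' model.

```python
import math

def next_step(me, others, goal, step=0.4, radius=0.5):
    """One discrete decision per time step, following two simple heuristics:
    'follow the person in front' (head toward the nearest pedestrian ahead,
    otherwise toward the goal) and 'stop if another step would lead to a
    collision'. Positions are (x, y) tuples."""
    def towards(a, b):
        d = math.hypot(b[0] - a[0], b[1] - a[1]) or 1.0
        return ((b[0] - a[0]) / d, (b[1] - a[1]) / d)

    gx, gy = towards(me, goal)
    # pedestrians lying roughly in the goal direction count as "in front"
    ahead = [p for p in others if (p[0] - me[0]) * gx + (p[1] - me[1]) * gy > 0]
    target = min(ahead, key=lambda p: math.hypot(p[0] - me[0], p[1] - me[1])) if ahead else goal
    hx, hy = towards(me, target)
    proposed = (me[0] + step * hx, me[1] + step * hy)

    # stop if another step would lead to a collision
    if any(math.hypot(proposed[0] - p[0], proposed[1] - p[1]) < radius for p in others):
        return me
    return proposed
```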
Decision heuristic or preference? Attribute non-attendance in discrete choice problems.
Heidenreich, Sebastian; Watson, Verity; Ryan, Mandy; Phimister, Euan
2018-01-01
This paper investigates whether respondents' choice not to consider all characteristics of a multiattribute health service may represent preferences. Over the last decade, an increasing number of studies account for attribute non-attendance (ANA) when using discrete choice experiments to elicit individuals' preferences. Most studies assume such behaviour is a heuristic and therefore uninformative. This assumption may lead to misleading welfare estimates if ANA reflects preferences. This is the first paper to assess whether ANA is a heuristic or a genuine preference without relying on respondents' self-stated motivation, and the first study to explore this question within a health context. Based on findings from cognitive psychology, we expect that familiar respondents are less likely to use a decision heuristic to simplify choices than unfamiliar respondents. We employ a latent class model of discrete choice experiment data concerned with National Health Service managers' preferences for support services that assist with performance concerns. We present quantitative and qualitative evidence that in our study ANA mostly represents preferences. We also show that wrong assumptions about ANA result in inadequate welfare measures that can lead to suboptimal policy advice. Future research should proceed with caution when assuming that ANA is a heuristic. Copyright © 2017 John Wiley & Sons, Ltd.
Testing Bayesian and heuristic predictions of mass judgments of colliding objects
Sanborn, Adam N.
2014-01-01
Mass judgments of colliding objects have been used to explore people's understanding of the physical world because they are ecologically relevant, yet people display biases that are most easily explained by a small set of heuristics. Recent work has challenged the heuristic explanation, by producing the same biases from a model that copes with perceptual uncertainty by using Bayesian inference with a prior based on the correct combination rules from Newtonian mechanics (noisy Newton). Here I test the predictions of the leading heuristic model (Gilden and Proffitt, 1989) against the noisy Newton model using a novel manipulation of the standard mass judgment task: making one of the objects invisible post-collision. The noisy Newton model uses the remaining information to predict above-chance performance, while the leading heuristic model predicts chance performance when one or the other final velocity is occluded. An experiment using two different types of occlusion showed better-than-chance performance and response patterns that followed the predictions of the noisy Newton model. The results demonstrate that people can make sensible physical judgments even when information critical for the judgment is missing, and that a Bayesian model can serve as a guide in these situations. Possible algorithmic-level accounts of this task that more closely correspond to the noisy Newton model are explored. PMID:25206345
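To make the contrast concrete, here is a rough 'noisy Newton'-style sketch: a grid posterior over the mass ratio given noisy final-velocity observations, where an occluded velocity simply contributes no likelihood term. The elastic-collision assumption, noise level, and flat prior are illustrative choices, not the authors' fitted model.

```python
import math

def mass_ratio_posterior(v1i, v2i, v1f_obs=None, v2f_obs=None, sigma=0.2):
    """Posterior over r = m1/m2 for a 1-D elastic collision, given noisy observed
    final velocities. Pass None for an occluded (unobserved) final velocity."""
    ratios = [0.1 * k for k in range(1, 101)]          # candidate mass ratios 0.1 .. 10
    post = []
    for r in ratios:
        m1, m2 = r, 1.0
        # Newtonian (elastic) final velocities implied by these masses
        v1f = ((m1 - m2) * v1i + 2 * m2 * v2i) / (m1 + m2)
        v2f = ((m2 - m1) * v2i + 2 * m1 * v1i) / (m1 + m2)
        loglik = 0.0
        if v1f_obs is not None:
            loglik += -((v1f_obs - v1f) ** 2) / (2 * sigma ** 2)
        if v2f_obs is not None:
            loglik += -((v2f_obs - v2f) ** 2) / (2 * sigma ** 2)
        post.append(math.exp(loglik))                  # flat prior over the grid
    z = sum(post)
    return list(zip(ratios, [p / z for p in post]))

# Even with v2f occluded, the posterior still favours m1 > m2 here (illustrative values).
posterior = mass_ratio_posterior(v1i=1.0, v2i=0.0, v1f_obs=0.4, v2f_obs=None)
print(max(posterior, key=lambda rp: rp[1]))
```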
Combining heuristic and statistical techniques in landslide hazard assessments
NASA Astrophysics Data System (ADS)
Cepeda, Jose; Schwendtner, Barbara; Quan, Byron; Nadim, Farrokh; Diaz, Manuel; Molina, Giovanni
2014-05-01
As a contribution to the Global Assessment Report 2013 - GAR2013, coordinated by the United Nations International Strategy for Disaster Reduction - UNISDR, a drill-down exercise for landslide hazard assessment was carried out by entering the results of both heuristic and statistical techniques into a new but simple combination rule. The data available for this evaluation included landslide inventories, both historical and event-based. In addition to the application of a heuristic method used in the previous editions of GAR, the availability of inventories motivated the use of statistical methods. The heuristic technique is largely based on the Mora & Vahrson method, which estimates hazard as the product of susceptibility and triggering factors, where classes are weighted based on expert judgment and experience. Two statistical methods were also applied: the landslide index method, which estimates weights of the classes for the susceptibility and triggering factors based on the evidence provided by the density of landslides in each class of the factors; and the weights of evidence method, which extends the previous technique to include both positive and negative evidence of landslide occurrence in the estimation of weights for the classes. One key aspect during the hazard evaluation was the decision on the methodology to be chosen for the final assessment. Instead of opting for a single methodology, it was decided to combine the results of the three implemented techniques using a combination rule based on a normalization of the results of each method. The hazard evaluation was performed for both earthquake- and rainfall-induced landslides. The country chosen for the drill-down exercise was El Salvador. The results indicate that highest hazard levels are concentrated along the central volcanic chain and at the centre of the northern mountains.
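The combination rule is described only as a normalization of each method's results; a minimal sketch of one such rule (min-max normalisation followed by an unweighted mean over the three methods) is shown below, with the equal weighting as an assumption.

```python
def combine_hazard(heuristic, landslide_index, weights_of_evidence):
    """Combine per-cell hazard scores from three methods by min-max normalising
    each map to [0, 1] and averaging. Inputs are equal-length lists of cell scores."""
    def normalise(values):
        lo, hi = min(values), max(values)
        span = (hi - lo) or 1.0
        return [(v - lo) / span for v in values]

    maps = [normalise(m) for m in (heuristic, landslide_index, weights_of_evidence)]
    return [sum(cell) / len(maps) for cell in zip(*maps)]

# Three toy maps over four cells (illustrative values only).
print(combine_hazard([1, 3, 2, 5], [0.2, 0.9, 0.4, 0.8], [10, 40, 25, 60]))
```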
Test Scheduling for Core-Based SOCs Using Genetic Algorithm Based Heuristic Approach
NASA Astrophysics Data System (ADS)
Giri, Chandan; Sarkar, Soumojit; Chattopadhyay, Santanu
This paper presents a Genetic Algorithm (GA) based solution to co-optimize test scheduling and wrapper design for core-based SOCs. Core testing solutions are generated as a set of wrapper configurations, represented as rectangles with width equal to the number of TAM (Test Access Mechanism) channels and height equal to the corresponding testing time. A bin-packing algorithm based on a locally optimal best-fit heuristic has been used to determine the placement of rectangles, minimizing the overall test time, whereas the GA has been utilized to generate the sequence of rectangles to be considered for placement. Experimental results on the ITC'02 benchmark SOCs show that the proposed method provides better solutions compared to recent works reported in the literature.
Optimisation of flight dynamic control based on many-objectives meta-heuristic: a comparative study
NASA Astrophysics Data System (ADS)
Bureerat, Sujin; Pholdee, Nantiwat; Radpukdee, Thana
2018-05-01
Development of many-objective meta-heuristics (MnMHs) is currently an interesting topic, as they are suited to real applications of optimisation problems, which usually involve many objectives. However, most MnMHs have been developed and tested on standard test functions, while the use of MnMHs in real applications is rare. Therefore, in this work, MnMHs are applied to the optimisation design of flight dynamic control. The design problem is posed to find control gains minimising the control effort, the spiral root, the damping-in-roll root, and the sideslip angle deviation, and maximising the damping ratio of the dutch-roll complex pair, the dutch-roll frequency, and the bank angle at pre-specified times of 1 second and 2.8 seconds, subject to several constraints based on Military Specifications (1969) requirements. Several established many-objective meta-heuristics (MnMHs) are used to solve the problem and their performances are compared. With this research work, the performance of several MnMHs for flight control is investigated. The results obtained will serve as a baseline for future development of flight dynamics and control.
Heuristic evaluation of paper-based Web pages: a simplified inspection usability methodology.
Allen, Mureen; Currie, Leanne M; Bakken, Suzanne; Patel, Vimla L; Cimino, James J
2006-08-01
Online medical information, when presented to clinicians, must be well-organized and intuitive to use, so that the clinicians can conduct their daily work efficiently and without error. It is essential to actively seek to produce good user interfaces that are acceptable to the user. This paper describes the methodology used to develop a simplified heuristic evaluation (HE) suitable for the evaluation of screen shots of Web pages, the development of an HE instrument used to conduct the evaluation, and the results of the evaluation of the aforementioned screen shots. In addition, this paper presents examples of the process of categorizing problems identified by the HE and the technological solutions identified to resolve these problems. Four usability experts reviewed 18 paper-based screen shots and made a total of 108 comments. Each expert completed the task in about an hour. We were able to implement solutions to approximately 70% of the violations. Our study found that a heuristic evaluation using paper-based screen shots of a user interface was expeditious, inexpensive, and straightforward to implement.
Deciding the fate of others: the cognitive underpinnings of racially biased juror decision making.
Kleider, Heather M; Knuycky, Leslie R; Cavrak, Sarah E
2012-01-01
In criminal law, jurors are supposed to ignore defendant race when considering factual matters of culpability. However, when judging the merits of a criminal case, jurors' ability (or inability) to avoid bias may affect verdicts. Fact-based decision making expends cognitive resources, while heuristic-based decisions (e.g., using criminal stereotypes) conserve resources. Here, we investigated whether differences in cognitive resources and prejudice attitudes about Blacks influenced trial outcomes. We tested the impact of working memory capacity (WMC), cognitive load, prejudice, and target race (Black, White) on penalties ascribed to fictional criminal defendants in ambiguous-fact cases. Results showed that when "loaded," prejudiced, low-WMC persons supported guilty verdicts with higher confidence more often for Black than White defendants. Conversely, regardless of WMC or prejudice attitude, participants penalized White defendants more often when not loaded. We suggest that cognitive resources and prejudice attitude influence fact-based decisions. Links to juror judgments and potential trial outcomes are discussed.
Seismic Strengthening of Carpentry Joints in Traditional Timber Structures
NASA Astrophysics Data System (ADS)
Parisi, Maria A.; Cordié, Cinzia; Piazza, Maurizio
2008-07-01
The static and dynamic behavior of timber structures largely depends on their connections. In traditional timber construction, elements are usually connected with carpentry joints based on contact pressure and friction, often with only minor reinforcement generically intended to prevent disassembly. In current practice, interventions for the upgrading of carpentry joints are mainly based on empirical knowledge according to tradition. Often they produce a general strengthening of the connection, but are not specific to the case of seismic action. Strengthening on heuristic bases may be only partially effective or possibly disproportionate. The behavior of the carpentry joints most used in roof structures is examined. The birdsmouth joint, connecting rafters to the tie beam, has been studied first, characterizing its behavior numerically and experimentally under monotonic and cyclic conditions. Other forms of the rafter-to-tie connection, the double notch joint and the case of parallel rafters, are discussed. Some general criteria for the seismic strengthening of these joints are presented.
Health on impulse: when low self-control promotes healthy food choices.
Salmon, Stefanie J; Fennis, Bob M; de Ridder, Denise T D; Adriaanse, Marieke A; de Vet, Emely
2014-02-01
Food choices are often made mindlessly, when individuals are not able or willing to exert self-control. Under low self-control, individuals have difficulty resisting palatable but unhealthy food products. In contrast to previous research aiming to foster healthy choices by promoting high self-control, this study exploits situations of low self-control by strategically using the tendency under these conditions to rely on heuristics (simple decision rules) as quick guides to action. More specifically, the authors associated healthy food products with the social proof heuristic (i.e., normative cues that convey majority endorsement for those products). One hundred seventy-seven students (119 men), with an average age of 20.47 years (SD = 2.25), participated in the experiment. This study used a 2 (low vs. high self-control) × 2 (social proof vs. no heuristic) × 2 (trade-off vs. control choice) design, with the latter as the within-subjects factor. The dependent variable was the number of healthy food choices in a food-choice task. In line with previous studies, people made fewer healthy food choices under low self-control. However, this negative effect of low self-control on food choice was reversed when the healthy option was associated with the social proof heuristic. In that case, people made more healthy choices under conditions of low self-control. Low self-control may be even more beneficial for healthy food choices than high self-control in the presence of a heuristic. Exploiting situations of low self-control is a new and promising method to promote health on impulse. PsycINFO Database Record (c) 2014 APA, all rights reserved.
Rossi, Sandrine; Cassotti, Mathieu; Moutier, Sylvain; Delcroix, Nicolas; Houdé, Olivier
2015-01-01
Reasoners make systematic logical errors by giving heuristic responses that reflect deviations from the logical norm. Influential studies have suggested first that our reasoning is often biased because we minimize cognitive effort to surpass a cognitive conflict between heuristic response from system 1 and analytic response from system 2 thinking. Additionally, cognitive control processes might be necessary to inhibit system 1 responses to activate a system 2 response. Previous studies have shown a significant effect of executive learning (EL) on adults who have transferred knowledge acquired on the Wason selection task (WST) to another isomorphic task, the rule falsification task (RFT). The original paradigm consisted of teaching participants to inhibit a classical matching heuristic that sufficed the first problem and led to significant EL transfer on the second problem. Interestingly, the reasoning tasks differed in inhibiting-heuristic metacognitive cost. Success on the WST requires half-suppression of the matching elements. In contrast, the RFT necessitates a global rejection of the matching elements for a correct answer. Therefore, metacognitive learning difficulty most likely differs depending on whether one uses the first or second task during the learning phase. We aimed to investigate this difficulty and various matching-bias inhibition effects in a new (reversed) paradigm. In this case, the transfer effect from the RFT to the WST could be more difficult because the reasoner learns to reject all matching elements in the first task. We observed that the EL leads to a significant reduction in matching selections on the WST without increasing logical performances. Interestingly, the acquired metacognitive knowledge was too "strictly" transferred and discouraged matching rather than encouraging logic. This finding underlines the complexity of learning transfer and adds new evidence to the pedagogy of reasoning.
Rossi, Sandrine; Cassotti, Mathieu; Moutier, Sylvain; Delcroix, Nicolas; Houdé, Olivier
2015-01-01
Reasoners make systematic logical errors by giving heuristic responses that reflect deviations from the logical norm. Influential studies have suggested first that our reasoning is often biased because we minimize cognitive effort to surpass a cognitive conflict between heuristic response from system 1 and analytic response from system 2 thinking. Additionally, cognitive control processes might be necessary to inhibit system 1 responses to activate a system 2 response. Previous studies have shown a significant effect of executive learning (EL) on adults who have transferred knowledge acquired on the Wason selection task (WST) to another isomorphic task, the rule falsification task (RFT). The original paradigm consisted of teaching participants to inhibit a classical matching heuristic that sufficed the first problem and led to significant EL transfer on the second problem. Interestingly, the reasoning tasks differed in inhibiting-heuristic metacognitive cost. Success on the WST requires half-suppression of the matching elements. In contrast, the RFT necessitates a global rejection of the matching elements for a correct answer. Therefore, metacognitive learning difficulty most likely differs depending on whether one uses the first or second task during the learning phase. We aimed to investigate this difficulty and various matching-bias inhibition effects in a new (reversed) paradigm. In this case, the transfer effect from the RFT to the WST could be more difficult because the reasoner learns to reject all matching elements in the first task. We observed that the EL leads to a significant reduction in matching selections on the WST without increasing logical performances. Interestingly, the acquired metacognitive knowledge was too “strictly” transferred and discouraged matching rather than encouraging logic. This finding underlines the complexity of learning transfer and adds new evidence to the pedagogy of reasoning. PMID:25849555
Why Irregulars Win: Asymmetry of Motivations and the Outcomes of Irregular Warfare
2016-12-01
A mixed methodology, including heuristics, process tracing, and comparison of case studies, is used to evaluate irregular wars and the motivations of the combatants. The findings suggest that asymmetries of motivation only…
Simply criminal: predicting burglars' occupancy decisions with a simple heuristic.
Snook, Brent; Dhami, Mandeep K; Kavanagh, Jennifer M
2011-08-01
Rational choice theories of criminal decision making assume that offenders weight and integrate multiple cues when making decisions (i.e., are compensatory). We tested this assumption by comparing how well a compensatory strategy called Franklin's Rule captured burglars' decision policies regarding residence occupancy compared to a non-compensatory strategy (i.e., Matching Heuristic). Forty burglars each decided on the occupancy of 20 randomly selected photographs of residences (for which actual occupancy was known when the photo was taken). Participants also provided open-ended reports on the cues that influenced their decisions in each case, and then rated the importance of eight cues (e.g., deadbolt visible) over all decisions. Burglars predicted occupancy beyond chance levels. The Matching Heuristic was a significantly better predictor of burglars' decisions than Franklin's Rule, and cue use in the Matching Heuristic better corresponded to the cue ecological validities in the environment than cue use in Franklin's Rule. The most important cue in burglars' models was also the most ecologically valid or predictive of actual occupancy (i.e., vehicle present). The majority of burglars correctly identified the most important cue in their models, and the open-ended technique showed greater correspondence between self-reported and captured cue use than the rating over decision technique. Our findings support a limited rationality perspective to understanding criminal decision making, and have implications for crime prevention.
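To make the two decision models concrete, the sketch below contrasts a compensatory weighted-sum rule in the spirit of Franklin's Rule with a non-compensatory Matching Heuristic that inspects cues in order of assumed validity and stops at the first cue pointing to occupancy. The cue names, weights, and search order are hypothetical, not the fitted values from the study.

```python
def franklins_rule(cues, weights, threshold=0.5):
    """Compensatory: weight and sum all cue values (0/1), then compare to a threshold."""
    score = sum(weights[c] * cues[c] for c in cues)
    return score / sum(weights.values()) >= threshold

def matching_heuristic(cues, search_order, k=3):
    """Non-compensatory: inspect up to k cues in order of (assumed) validity and
    predict 'occupied' as soon as one cue points to occupancy; otherwise 'unoccupied'."""
    for cue in search_order[:k]:
        if cues[cue] == 1:
            return True
    return False

# Hypothetical cues for one residence (1 = cue present / points to occupancy).
cues = {"vehicle_present": 1, "lights_on": 0, "mail_collected": 0, "deadbolt_visible": 0}
weights = {"vehicle_present": 0.5, "lights_on": 0.2, "mail_collected": 0.2, "deadbolt_visible": 0.1}
order = ["vehicle_present", "lights_on", "mail_collected", "deadbolt_visible"]

print(franklins_rule(cues, weights), matching_heuristic(cues, order))
```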
Heuristic-based scheduling algorithm for high level synthesis
NASA Technical Reports Server (NTRS)
Mohamed, Gulam; Tan, Han-Ngee; Chng, Chew-Lye
1992-01-01
A new scheduling algorithm is proposed which uses a combination of a resource utilization chart, a heuristic algorithm to estimate the minimum number of hardware units based on operator mobilities, and a list-scheduling technique to achieve fast and near-optimal schedules. The scheduling time of this algorithm is almost independent of the length of the operators' mobilities, as can be seen from the benchmark example (a fifth-order elliptic wave filter) in which the cycle time was increased from 17 to 18 and then to 21 cycles. It is implemented in C on a SUN3/60 workstation.
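A minimal list-scheduling sketch using mobility as the priority, to illustrate the general technique mentioned; the graph encoding, single-cycle operations, and resource model are assumptions, not the paper's algorithm.

```python
def list_schedule(ops, deps, n_units, mobility):
    """List-scheduling sketch: ops is a list of operation names, deps maps an op to
    the set of ops it depends on (acyclic graph assumed), n_units is the number of
    identical functional units available per cycle, and mobility gives each op's
    priority (lower mobility = more constrained = scheduled first).
    Every op is assumed to take one cycle."""
    finished, schedule, cycle = set(), {}, 0
    while len(finished) < len(ops):
        ready = [o for o in ops if o not in finished and deps.get(o, set()) <= finished]
        ready.sort(key=lambda o: mobility[o])          # most constrained ops first
        for o in ready[:n_units]:                      # at most n_units ops this cycle
            schedule[o] = cycle
        finished.update(ready[:n_units])
        cycle += 1
    return schedule

# Tiny example: a diamond-shaped dependence graph scheduled on one unit.
ops = ["a", "b", "c", "d"]
deps = {"b": {"a"}, "c": {"a"}, "d": {"b", "c"}}
mobility = {"a": 0, "b": 0, "c": 1, "d": 0}
print(list_schedule(ops, deps, n_units=1, mobility=mobility))
```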
Understanding place and health: a heuristic for using administrative data.
Frohlich, Katherine L; Dunn, James R; McLaren, Lindsay; Shiell, Alan; Potvin, Louise; Hawe, Penelope; Dassa, Clément; Thurston, Wilfreda E
2007-06-01
The increasing availability, use and limitations of administrative data for place-based population health research, together with a lack of theory development, created the context for the current paper. We developed a heuristic to interrogate administrative data sets and to help us develop explanatory pathways linking place and health. Guided by a worked example, we argue that some items in administrative data sets lend themselves to multiple theories, creating problems of inference owing to the implications of using inductive versus deductive reasoning during the research process, and that certain types of theories are privileged when using administrative databases.
Wishful Thinking? Inside the Black Box of Exposure Assessment.
Money, Annemarie; Robinson, Christine; Agius, Raymond; de Vocht, Frank
2016-05-01
Decision-making processes used by experts when undertaking occupational exposure assessment are relatively unknown, but it is often assumed that there is a common underlying method that experts employ. However, differences in training and experience of assessors make it unlikely that one general method for expert assessment would exist. Therefore, there are concerns about formalizing, validating, and comparing expert estimates within and between studies that are difficult, if not impossible, to characterize. Heuristics on the other hand (the processes involved in decision making) have been extensively studied. Heuristics are deployed by everyone as short-cuts to make the often complex process of decision-making simpler, quicker, and less burdensome. Experts' assessments are often subject to various simplifying heuristics as a way to reach a decision in the absence of sufficient data. Therefore, investigating the underlying heuristics or decision-making processes involved may help to shed light on the 'black box' of exposure assessment. A mixed method study was conducted utilizing both a web-based exposure assessment exercise incorporating quantitative and semiqualitative elements of data collection, and qualitative semi-structured interviews with exposure assessors. Qualitative data were analyzed using thematic analysis. Twenty-five experts completed the web-based exposure assessment exercise and 8 of these 25 were randomly selected to participate in the follow-up interview. Familiar key themes relating to the exposure assessment exercise emerged; 'intensity'; 'probability'; 'agent'; 'process'; and 'duration' of exposure. However, an important aspect of the detailed follow-up interviews revealed a lack of structure and order with which participants described their decision making. Participants mostly described some form of an iterative process, heavily relying on the anchoring and adjustment heuristic, which differed between experts. In spite of having undertaken comparable training (in occupational hygiene or exposure assessment), experts use different methods to assess exposure. Decision making appears to be an iterative process with heavy reliance on the key heuristic of anchoring and adjustment. Using multiple experts to assess exposure while providing some form of anchoring scenario to build from, and additional training in understanding the impact of simple heuristics on the process of decision making, is likely to produce a more methodical approach to assessment; thereby improving consistency and transparency in expert exposure assessment. © The Author 2016. Published by Oxford University Press on behalf of the British Occupational Hygiene Society.
Wishful Thinking? Inside the Black Box of Exposure Assessment
Money, Annemarie; Robinson, Christine; Agius, Raymond; de Vocht, Frank
2016-01-01
Background: Decision-making processes used by experts when undertaking occupational exposure assessment are relatively unknown, but it is often assumed that there is a common underlying method that experts employ. However, differences in training and experience of assessors make it unlikely that one general method for expert assessment would exist. Therefore, there are concerns about formalizing, validating, and comparing expert estimates within and between studies that are difficult, if not impossible, to characterize. Heuristics on the other hand (the processes involved in decision making) have been extensively studied. Heuristics are deployed by everyone as short-cuts to make the often complex process of decision-making simpler, quicker, and less burdensome. Experts’ assessments are often subject to various simplifying heuristics as a way to reach a decision in the absence of sufficient data. Therefore, investigating the underlying heuristics or decision-making processes involved may help to shed light on the ‘black box’ of exposure assessment. Methods: A mixed method study was conducted utilizing both a web-based exposure assessment exercise incorporating quantitative and semiqualitative elements of data collection, and qualitative semi-structured interviews with exposure assessors. Qualitative data were analyzed using thematic analysis. Results: Twenty-five experts completed the web-based exposure assessment exercise and 8 of these 25 were randomly selected to participate in the follow-up interview. Familiar key themes relating to the exposure assessment exercise emerged; ‘intensity’; ‘probability’; ‘agent’; ‘process’; and ‘duration’ of exposure. However, an important aspect of the detailed follow-up interviews revealed a lack of structure and order with which participants described their decision making. Participants mostly described some form of an iterative process, heavily relying on the anchoring and adjustment heuristic, which differed between experts. Conclusion: In spite of having undertaken comparable training (in occupational hygiene or exposure assessment), experts use different methods to assess exposure. Decision making appears to be an iterative process with heavy reliance on the key heuristic of anchoring and adjustment. Using multiple experts to assess exposure while providing some form of anchoring scenario to build from, and additional training in understanding the impact of simple heuristics on the process of decision making, is likely to produce a more methodical approach to assessment; thereby improving consistency and transparency in expert exposure assessment. PMID:26764244
Heuristic Evaluation of E-Learning Courses: A Comparative Analysis of Two E-Learning Heuristic Sets
ERIC Educational Resources Information Center
Zaharias, Panagiotis; Koutsabasis, Panayiotis
2012-01-01
Purpose: The purpose of this paper is to discuss heuristic evaluation as a method for evaluating e-learning courses and applications and more specifically to investigate the applicability and empirical use of two customized e-learning heuristic protocols. Design/methodology/approach: Two representative e-learning heuristic protocols were chosen…
NASA Astrophysics Data System (ADS)
Cook, J.; Jacobs, P.; Nuccitelli, D.
2014-12-01
Laypeople use expert opinion as a mental shortcut to form views on complex scientific issues. This heuristic is particularly relevant in the case of climate change, where perception of consensus is one of the main predictors of public support for climate action. A low public perception of consensus (around 60% compared to the actual 97% consensus) is a significant stumbling block to meaningful climate action, underscoring the importance of closing the "consensus gap". However, some scientists question the efficacy or appropriateness of emphasizing consensus in climate communication. I'll summarize the social science research examining the importance and effectiveness of consensus messaging. I'll also present several case studies of consensus messaging employed by the team of communicators at the Skeptical Science website.
Gardner, Charlie J.; Raxworthy, Christopher J.; Metcalfe, Kristian; Raselimanana, Achille P.; Smith, Robert J.; Davies, Zoe G.
2015-01-01
There are insufficient resources available to manage the world’s existing protected area portfolio effectively, so the most important sites should be prioritised in investment decision-making. Sophisticated conservation planning and assessment tools developed to identify locations for new protected areas can provide an evidence base for such prioritisations, yet decision-makers in many countries lack the institutional support and necessary capacity to use the associated software. As such, simple heuristic approaches such as species richness or number of threatened species are generally adopted to inform prioritisation decisions. However, their performance has never been tested. Using the reptile fauna of Madagascar’s dry forests as a case study, we evaluate the performance of four site prioritisation protocols used to rank the conservation value of 22 established and candidate protected areas. We compare the results to a benchmark produced by the widely-used systematic conservation planning software Zonation. The four indices scored sites on the basis of: i) species richness; ii) an index based on species’ Red List status; iii) irreplaceability (a key metric in systematic conservation planning); and, iv) a novel Conservation Value Index (CVI), which incorporates species-level information on endemism, representation in the protected area system, tolerance of habitat degradation and hunting/collection pressure. Rankings produced by the four protocols were positively correlated to the results of Zonation, particularly amongst high-scoring sites, but CVI and Irreplaceability performed better than Species Richness and the Red List Index. Given the technological capacity constraints experienced by decision-makers in the developing world, our findings suggest that heuristic metrics can represent a useful alternative to more sophisticated analyses, especially when they integrate species-specific information related to extinction risk. However, this can require access to, and understanding of, more complex species data. PMID:26162073
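The data and the Zonation benchmark are not reproducible from the abstract alone, but the comparison of simple heuristic rankings can be sketched: below, invented sites and Red List categories are ranked by raw species richness and by a threat-weighted index, and the agreement between the two rankings is summarised with a Spearman correlation. The category weights are assumptions, not the paper's CVI.

# Hypothetical sketch: rank candidate protected areas with two simple heuristics
# (species richness vs. a Red List-weighted index) and compare the rankings.
# Site names, species lists, and category weights are illustrative only.
from scipy.stats import spearmanr

# Each site maps to the Red List categories of the species recorded there.
sites = {
    "Site_A": ["LC", "LC", "VU", "EN"],
    "Site_B": ["LC", "NT", "VU", "VU", "CR"],
    "Site_C": ["LC", "LC", "LC"],
}

# Assumed weights: higher threat categories count for more.
RED_LIST_WEIGHT = {"LC": 1, "NT": 2, "VU": 4, "EN": 6, "CR": 8}

richness = {s: len(spp) for s, spp in sites.items()}
red_list_index = {s: sum(RED_LIST_WEIGHT[c] for c in spp) for s, spp in sites.items()}

rank_by_richness = sorted(sites, key=richness.get, reverse=True)
rank_by_red_list = sorted(sites, key=red_list_index.get, reverse=True)

# Agreement between the two heuristic rankings (a benchmark such as Zonation
# would replace one of these vectors in the study itself).
rho, _ = spearmanr([richness[s] for s in sites], [red_list_index[s] for s in sites])
print(rank_by_richness, rank_by_red_list, round(rho, 2))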
Mirus, B.B.; Ebel, B.A.; Heppner, C.S.; Loague, K.
2011-01-01
Concept development simulation with distributed, physics-based models provides a quantitative approach for investigating runoff generation processes across environmental conditions. Disparities within data sets employed to design and parameterize boundary value problems used in heuristic simulation inevitably introduce various levels of bias. The objective was to evaluate the impact of boundary value problem complexity on process representation for different runoff generation mechanisms. The comprehensive physics-based hydrologic response model InHM has been employed to generate base case simulations for four well-characterized catchments. The C3 and CB catchments are located within steep, forested environments dominated by subsurface stormflow; the TW and R5 catchments are located in gently sloping rangeland environments dominated by Dunne and Horton overland flows. Observational details are well captured within all four of the base case simulations, but the characterization of soil depth, permeability, rainfall intensity, and evapotranspiration differs for each. These differences are investigated through the conversion of each base case into a reduced case scenario, all sharing the same level of complexity. Evaluation of how individual boundary value problem characteristics impact simulated runoff generation processes is facilitated by quantitative analysis of integrated and distributed responses at high spatial and temporal resolution. Generally, the base case reduction causes moderate changes in discharge and runoff patterns, with the dominant process remaining unchanged. Moderate differences between the base and reduced cases highlight the importance of detailed field observations for parameterizing and evaluating physics-based models. Overall, similarities between the base and reduced cases indicate that the simpler boundary value problems may be useful for concept development simulation to investigate fundamental controls on the spectrum of runoff generation mechanisms. Copyright 2011 by the American Geophysical Union.
More than one way to see it: Individual heuristics in avian visual computation.
Ravignani, Andrea; Westphal-Fitch, Gesche; Aust, Ulrike; Schlumpp, Martin M; Fitch, W Tecumseh
2015-10-01
Comparative pattern learning experiments investigate how different species find regularities in sensory input, providing insights into cognitive processing in humans and other animals. Past research has focused either on one species' ability to process pattern classes or different species' performance in recognizing the same pattern, with little attention to individual and species-specific heuristics and decision strategies. We trained and tested two bird species, pigeons (Columba livia) and kea (Nestor notabilis, a parrot species), on visual patterns using touch-screen technology. Patterns were composed of several abstract elements and had varying degrees of structural complexity. We developed a model selection paradigm, based on regular expressions, that allowed us to reconstruct the specific decision strategies and cognitive heuristics adopted by a given individual in our task. Individual birds showed considerable differences in the number, type and heterogeneity of heuristic strategies adopted. Birds' choices also exhibited consistent species-level differences. Kea adopted effective heuristic strategies, based on matching learned bigrams to stimulus edges. Individual pigeons, in contrast, adopted an idiosyncratic mix of strategies that included local transition probabilities and global string similarity. Although performance was above chance and quite high for kea, no individual of either species provided clear evidence of learning exactly the rule used to generate the training stimuli. Our results show that similar behavioral outcomes can be achieved using dramatically different strategies and highlight the dangers of combining multiple individuals in a group analysis. These findings, and our general approach, have implications for the design of future pattern learning experiments, and the interpretation of comparative cognition research more generally. Copyright © 2015 The Authors. Published by Elsevier B.V. All rights reserved.
The development of a culture of problem solving with secondary students through heuristic strategies
NASA Astrophysics Data System (ADS)
Eisenmann, Petr; Novotná, Jarmila; Přibyl, Jiří; Břehovský, Jiří
2015-12-01
The article reports the results of a longitudinal research study conducted in three mathematics classes in Czech schools with 62 pupils aged 12-18 years. The pupils were exposed to the use of selected heuristic strategies in mathematical problem solving for a period of 16 months. This was done through solving problems where the solution was most efficient if heuristic strategies were used. The authors conducted a two-dimensional classification of the use of heuristic strategies based on the work of Pólya (2004) and Schoenfeld (1985). We developed a tool that allows for the description of a pupil's ability to solve problems. Named the Culture of Problem Solving (CPS), this tool consists of four components: intelligence, text comprehension, creativity and the ability to use existing knowledge. The pupils' success rate in problem solving and the changes in some of the CPS factors pre- and post-experiment were monitored. The pupils appeared to improve considerably in the creativity component. In addition, the results indicate a positive change in the students' attitude to problem solving. As far as the teachers participating in the experiment are concerned, a significant change was observed in their teaching style, towards a more constructivist, inquiry-based approach, as well as in their willingness to accept a student's non-standard approach to solving a problem. Another important outcome of the research was the identification of the heuristic strategies that can be taught via long-term guided solutions of suitable problems and those that cannot. Those that can be taught include systematic experimentation, guess-check-revise and introduction of an auxiliary element. Those that cannot be taught (or can only be taught with difficulty) include the strategies of specification and generalization, and analogy.
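As a rough illustration of one of the teachable strategies named above, the following minimal sketch applies guess-check-revise to a simple numeric problem; the equation and the step-by-step revision rule are invented for illustration and are not taken from the study.

# Toy illustration of the guess-check-revise strategy: find a positive integer x
# such that x * (x + 1) = 132. The revision rule steps the guess up or down by 1.
def guess_check_revise(target, guess=1):
    while True:
        value = guess * (guess + 1)               # check the current guess
        if value == target:
            return guess
        guess += 1 if value < target else -1      # revise toward the target

print(guess_check_revise(132))  # -> 11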
Conflict and bias in heuristic judgment.
Bhatia, Sudeep
2017-02-01
Conflict has been hypothesized to play a key role in recruiting deliberative processing in reasoning and judgment tasks. This claim suggests that changing the task so as to add incorrect heuristic responses that conflict with existing heuristic responses can make individuals less likely to respond heuristically and can increase response accuracy. We tested this prediction in experiments involving judgments of argument strength and word frequency, and found that participants are more likely to avoid heuristic bias and respond correctly in settings with 2 incorrect heuristic response options compared with similar settings with only 1 heuristic response option. Our results provide strong evidence for conflict as a mechanism influencing the interaction between heuristic and deliberative thought, and illustrate how accuracy can be increased through simple changes to the response sets offered to participants. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Heuristic lipophilicity potential for computer-aided rational drug design.
Du, Q; Arteca, G A; Mezey, P G
1997-09-01
In this contribution we suggest a heuristic molecular lipophilicity potential (HMLP), which is a structure-based technique requiring no empirical indices of atomic lipophilicity. The input data used in this approach are molecular geometries and molecular surfaces. The HMLP is a modified electrostatic potential, combined with the averaged influences from the molecular environment. Quantum mechanics is used to calculate the electron density function rho(r) and the electrostatic potential V(r), and from this information a lipophilicity potential L(r) is generated. The HMLP is a unified lipophilicity and hydrophilicity potential. The interactions of dipole and multipole moments, hydrogen bonds, and charged atoms in a molecule are included in the hydrophilic interactions in this model. The HMLP is used to study hydrogen bonds and water-octanol partition coefficients in several examples. The calculated results show that the HMLP gives qualitatively and quantitatively correct, as well as chemically reasonable, results in cases where comparisons are available. These comparisons indicate that the HMLP has advantages over the empirical lipophilicity potential in many aspects. The HMLP is a three-dimensional and easily visualizable representation of molecular lipophilicity, suggested as a potential tool in computer-aided three-dimensional drug design.
NASA Astrophysics Data System (ADS)
Rochman, Auliya Noor; Prasetyo, Hari; Nugroho, Munajat Tri
2017-06-01
The Vehicle Routing Problem (VRP) often occurs when manufacturers need to distribute their product to customers/outlets. The distribution process is typically restricted by the capacity of the vehicle and the working hours at the distributor. This type of VRP is also known as the Capacitated Vehicle Routing Problem with Time Windows (CVRPTW). A Biased Random Key Genetic Algorithm (BRKGA) was designed and coded in MATLAB to solve the CVRPTW case of soft drink distribution. The standard BRKGA was then modified by applying chromosome insertion into the initial population and defining chromosome gender for parents undergoing the crossover operation. The performance of the established algorithms was then compared to a heuristic procedure for solving the soft drink distribution problem. Some findings are revealed: (1) the total distribution cost of BRKGA with insertion (BRKGA-I) results in a cost saving of 39% compared to the total cost of the heuristic method; (2) BRKGA with gender selection (BRKGA-GS) could further improve the performance of the heuristic method. However, the BRKGA-GS tends to yield worse results compared to those obtained from the standard BRKGA.
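The study's MATLAB implementation is not reproduced here, but two generic BRKGA building blocks it relies on can be sketched briefly. In the Python sketch below, the decoder, the elite-bias probability rho_e, and the problem size are illustrative assumptions; capacity and time-window feasibility checks and the paper's insertion/gender variants are omitted.

# Minimal sketch of two BRKGA building blocks used for routing problems:
# decoding a random-key chromosome into a customer visit order, and the biased
# crossover that favours the elite parent.
import random

def decode(keys):
    """Sort customer indices by their random keys to get a giant tour;
    a split procedure would then cut it into capacity-feasible routes."""
    return sorted(range(len(keys)), key=lambda i: keys[i])

def biased_crossover(elite, non_elite, rho_e=0.7):
    """Each gene is inherited from the elite parent with probability rho_e."""
    return [e if random.random() < rho_e else n for e, n in zip(elite, non_elite)]

random.seed(0)
n_customers = 8
elite = [random.random() for _ in range(n_customers)]
other = [random.random() for _ in range(n_customers)]
child = biased_crossover(elite, other)
print(decode(child))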
Fast marching methods for the continuous traveling salesman problem
Andrews, June; Sethian, J. A.
2007-01-01
We consider a problem in which we are given a domain, a cost function which depends on position at each point in the domain, and a subset of points (“cities”) in the domain. The goal is to determine the cheapest closed path that visits each city in the domain once. This can be thought of as a version of the traveling salesman problem, in which an underlying known metric determines the cost of moving through each point of the domain, but in which the actual shortest path between cities is unknown at the outset. We describe algorithms for both a heuristic and an optimal solution to this problem. The complexity of the heuristic algorithm is at worst case M·N log N, where M is the number of cities, and N the size of the computational mesh used to approximate the solutions to the shortest paths problems. The average runtime of the heuristic algorithm is linear in the number of cities and O(N log N) in the size N of the mesh. PMID:17220271
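A hedged sketch of the two-stage structure described in the abstract: compute city-to-city travel costs over a position-dependent cost field, then assemble a cheap closed tour. For brevity the sketch uses Dijkstra on a 4-connected grid in place of a true fast marching (Eikonal) solver, and a nearest-neighbour tour in place of the paper's heuristics; the cost field and city positions are invented.

# Stage 1: shortest-path costs from each city over a position-dependent cost grid.
# Stage 2: nearest-neighbour closed tour over those city-to-city costs.
import heapq

def grid_distances(cost, source):
    rows, cols = len(cost), len(cost[0])
    dist = [[float("inf")] * cols for _ in range(rows)]
    dist[source[0]][source[1]] = 0.0
    pq = [(0.0, source)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if d > dist[r][c]:
            continue
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + 0.5 * (cost[r][c] + cost[nr][nc])   # edge cost from the field
                if nd < dist[nr][nc]:
                    dist[nr][nc] = nd
                    heapq.heappush(pq, (nd, (nr, nc)))
    return dist

cost = [[1, 1, 5, 1], [1, 9, 5, 1], [1, 1, 1, 1], [1, 5, 1, 1]]
cities = [(0, 0), (0, 3), (3, 3), (3, 0)]
pairwise = {a: grid_distances(cost, a) for a in cities}

tour, remaining = [cities[0]], set(cities[1:])
while remaining:
    last = tour[-1]
    nxt = min(remaining, key=lambda b: pairwise[last][b[0]][b[1]])
    tour.append(nxt)
    remaining.remove(nxt)
print(tour + [cities[0]])   # closed tour visiting every city once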
Fast marching methods for the continuous traveling salesman problem
DOE Office of Scientific and Technical Information (OSTI.GOV)
Andrews, J.; Sethian, J.A.
We consider a problem in which we are given a domain, a cost function which depends on position at each point in the domain, and a subset of points ('cities') in the domain. The goal is to determine the cheapest closed path that visits each city in the domain once. This can be thought of as a version of the Traveling Salesman Problem, in which an underlying known metric determines the cost of moving through each point of the domain, but in which the actual shortest path between cities is unknown at the outset. We describe algorithms for both a heuristic and an optimal solution to this problem. The order of the heuristic algorithm is at worst case M * N log N, where M is the number of cities, and N the size of the computational mesh used to approximate the solutions to the shortest paths problems. The average runtime of the heuristic algorithm is linear in the number of cities and O(N log N) in the size N of the mesh.
Wei, C P; Hu, P J; Sheng, O R
2001-03-01
When performing primary reading on a newly taken radiological examination, a radiologist often needs to reference relevant prior images of the same patient for confirmation or comparison purposes. Support of such image references is of clinical importance and may have significant effects on radiologists' examination reading efficiency, service quality, and work satisfaction. To effectively support such image reference needs, we proposed and developed a knowledge-based patient image pre-fetching system, addressing several challenging requirements of the application that include representation and learning of image reference heuristics and management of data-intensive knowledge inferencing. Moreover, the system demands an extensible and maintainable architecture design capable of effectively adapting to a dynamic environment characterized by heterogeneous and autonomous data source systems. In this paper, we developed a synthesized object-oriented entity-relationship model, a conceptual model appropriate for representing radiologists' prior image reference heuristics that are heuristic-oriented and data-intensive. We detailed the system architecture and design of the knowledge-based patient image pre-fetching system. Our architecture design is based on a client-mediator-server framework, capable of coping with a dynamic environment characterized by distributed, heterogeneous, and highly autonomous data source systems. To adapt to changes in radiologists' patient prior image reference heuristics, ID3-based multidecision-tree induction and CN2-based multidecision induction learning techniques were developed and evaluated. Experimentally, we examined effects of the pre-fetching system we created on radiologists' examination readings. Preliminary results show that the knowledge-based patient image pre-fetching system more accurately supports radiologists' patient prior image reference needs than the current practice adopted at the study site and that radiologists may become more efficient, consultatively effective, and better satisfied when supported by the pre-fetching system than when relying on the study site's pre-fetching practice.
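The specific multidecision-tree and CN2 rule-induction machinery is not reproduced here, but the flavour of learning pre-fetch rules from historical reference behaviour can be sketched with an entropy-based decision tree (in the spirit of ID3). The features, encodings, and training examples below are invented for illustration.

# Hedged sketch: learn prior-image pre-fetch rules from (invented) historical data
# with an entropy-based decision tree.
from sklearn.tree import DecisionTreeClassifier, export_text

# Features: [same_modality, same_body_part, months_since_prior, prior_was_abnormal]
X = [
    [1, 1, 2, 1], [1, 1, 30, 0], [0, 1, 3, 1], [1, 0, 1, 0],
    [1, 1, 6, 1], [0, 0, 24, 0], [1, 1, 1, 0], [0, 1, 18, 0],
]
y = [1, 0, 1, 0, 1, 0, 1, 0]   # 1 = radiologist referenced the prior image

clf = DecisionTreeClassifier(criterion="entropy", max_depth=3, random_state=0).fit(X, y)
print(export_text(clf, feature_names=[
    "same_modality", "same_body_part", "months_since_prior", "prior_was_abnormal"]))
print(clf.predict([[1, 1, 4, 1]]))   # should this new prior be pre-fetched?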
Learning process mapping heuristics under stochastic sampling overheads
NASA Technical Reports Server (NTRS)
Ieumwananonthachai, Arthur; Wah, Benjamin W.
1991-01-01
A statistical method was developed previously for improving process mapping heuristics. The method systematically explores the space of possible heuristics under a specified time constraint. Its goal is to get the best possible heuristics while trading between the solution quality of the process mapping heuristics and their execution time. The statistical selection method is extended to take into consideration the variations in the amount of time used to evaluate heuristics on a problem instance. The improvement in performance is presented using the more realistic assumption along with some methods that alleviate the additional complexity.
NASA Astrophysics Data System (ADS)
Perreard, S.; Wildner, E.
1994-12-01
Many processes are controlled by experts using some kind of mental model to decide on actions and make conclusions. This model, based on heuristic knowledge, can often be represented by rules and does not have to be particularly accurate. Such is the case for the problem of conditioning high voltage RF cavities; the expert has to decide, by observing some criteria, whether to increase or to decrease the voltage and by how much. A program has been implemented which can be applied to a class of similar problems. The kernel of the program is a small rule base, which is independent of the kind of cavity. To model a specific cavity, we use fuzzy logic which is implemented as a separate routine called by the rule base, to translate from numeric to symbolic information.
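A minimal sketch of the rule-base-plus-fuzzy-translation idea described above: a few cavity-independent rules decide whether to raise or lower the voltage, while triangular membership functions turn a numeric vacuum reading into the symbolic terms the rules use. The terms, thresholds, and step sizes are invented and are not taken from the CERN system.

# Triangular memberships translate a numeric reading into symbolic terms;
# a tiny rule base then maps those terms to a voltage adjustment.
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def vacuum_terms(pressure):
    return {
        "good": tri(pressure, -1.0, 0.0, 3.0),
        "marginal": tri(pressure, 1.0, 4.0, 7.0),
        "poor": tri(pressure, 5.0, 10.0, 15.0),
    }

def voltage_step(pressure):
    mu = vacuum_terms(pressure)
    # Rule base (cavity-independent): good vacuum -> raise, poor vacuum -> lower.
    return 2.0 * mu["good"] + 0.0 * mu["marginal"] - 3.0 * mu["poor"]

for p in (0.5, 4.0, 9.0):
    print(p, round(voltage_step(p), 2))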
ERIC Educational Resources Information Center
Jacobs, Richard M.
2011-01-01
This article reports a case study of seventeen faculty leaders teaching at a Catholic university who responded to a questionnaire concerning academic freedom and its practice in classroom speech. Situating the responses within a heuristic model, this article offers a portrait that provides insight into how these faculty leaders define academic…
A Heuristics Approach for Classroom Scheduling Using Genetic Algorithm Technique
NASA Astrophysics Data System (ADS)
Ahmad, Izah R.; Sufahani, Suliadi; Ali, Maselan; Razali, Siti N. A. M.
2018-04-01
Reshuffling and arranging classrooms based on audience capacity, available facilities, lecturing time and other factors can make classroom scheduling complex. To enhance productivity in classroom planning, this paper proposes a heuristic approach for timetabling optimization. A new algorithm was produced to handle the timetabling problem in a university. The proposed heuristic approach leads to better utilization of the available classroom space for a given timetable of courses at the university. A Genetic Algorithm, implemented in the Java programming language, was used in this study; it aims at reducing conflicts and optimizing fitness. The algorithm considered the number of students in each class, class time, class size, time availability of each classroom and the lecturer in charge of each class.
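The abstract does not give the fitness function, so the sketch below only illustrates the general shape such a GA fitness might take: counting hard-constraint violations (double-booked rooms, over-capacity assignments) for a candidate assignment of classes to room/timeslot pairs. Rooms, class sizes, and penalty weights are invented; the study's Java implementation is not reproduced.

# Penalty to be minimised by a GA: room clashes plus over-capacity assignments.
from collections import Counter

rooms = {"R1": 40, "R2": 100}
classes = {"Calculus": 90, "Algebra": 35, "Statistics": 30}

def conflicts(assignment):
    """assignment: class -> (room, timeslot). Returns a penalty to minimise."""
    penalty = 0
    usage = Counter((room, slot) for room, slot in assignment.values())
    penalty += sum(count - 1 for count in usage.values() if count > 1)   # clashes
    for cls, (room, _slot) in assignment.items():
        if classes[cls] > rooms[room]:
            penalty += 1                                                 # over capacity
    return penalty

candidate = {"Calculus": ("R2", 9), "Algebra": ("R1", 9), "Statistics": ("R1", 9)}
print(conflicts(candidate))   # one clash in R1 at slot 9, no capacity violation -> 1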
Neural correlates of strategic reasoning during competitive games.
Seo, Hyojung; Cai, Xinying; Donahue, Christopher H; Lee, Daeyeol
2014-10-17
Although human and animal behaviors are largely shaped by reinforcement and punishment, choices in social settings are also influenced by information about the knowledge and experience of other decision-makers. During competitive games, monkeys increased their payoffs by systematically deviating from a simple heuristic learning algorithm and thereby countering the predictable exploitation by their computer opponent. Neurons in the dorsomedial prefrontal cortex (dmPFC) signaled the animal's recent choice and reward history that reflected the computer's exploitative strategy. The strength of switching signals in the dmPFC also correlated with the animal's tendency to deviate from the heuristic learning algorithm. Therefore, the dmPFC might provide control signals for overriding simple heuristic learning algorithms based on the inferred strategies of the opponent. Copyright © 2014, American Association for the Advancement of Science.
NASA Astrophysics Data System (ADS)
Edalati, Sh; Houshangi far, A.; Torabi, N.; Baneshi, Z.; Behjat, A.
2017-02-01
Poly(3,4-ethylenedioxythiophene):poly(styrene sulfonate) (PEDOT:PSS) was deposited on a fluorine-doped tin oxide glass substrate using a heuristic method to fabricate platinum-free counter electrodes for dye-sensitized solar cells (DSSCs). In this heuristic method a thin layer of PEDOT:PSS is obtained by spin coating the PEDOT:PSS on a Cu substrate and then removing the substrate with FeCl3. The characteristics of the deposited PEDOT:PSS were studied by energy dispersive x-ray analysis and scanning electron microscopy, which revealed the micro-electronic specifications of the cathode. The aforementioned DSSCs exhibited a solar conversion efficiency of 3.90%, which is far higher than that of DSSCs with pure PEDOT:PSS (1.89%). This enhancement is attributed not only to the micro-electronic specifications but also to the HNO3 treatment through our heuristic method. The results of cyclic voltammetry, electrochemical impedance spectroscopy (EIS) and Tafel polarization plots show that the modified cathode has a dual function, including excellent conductivity and electrocatalytic activity for iodine reduction.
When decision heuristics and science collide.
Yu, Erica C; Sprenger, Amber M; Thomas, Rick P; Dougherty, Michael R
2014-04-01
The ongoing discussion among scientists about null-hypothesis significance testing and Bayesian data analysis has led to speculation about the practices and consequences of "researcher degrees of freedom." This article advances this debate by asking the broader questions that we, as scientists, should be asking: How do scientists make decisions in the course of doing research, and what is the impact of these decisions on scientific conclusions? We asked practicing scientists to collect data in a simulated research environment, and our findings show that some scientists use data collection heuristics that deviate from prescribed methodology. Monte Carlo simulations show that data collection heuristics based on p values lead to biases in estimated effect sizes and Bayes factors and to increases in both false-positive and false-negative rates, depending on the specific heuristic. We also show that using Bayesian data collection methods does not eliminate these biases. Thus, our study highlights the little appreciated fact that the process of doing science is a behavioral endeavor that can bias statistical description and inference in a manner that transcends adherence to any particular statistical framework.
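The simulated research environment used in the study is not described in enough detail to reproduce, but the core finding about p-value-driven stopping can be illustrated with a small Monte Carlo sketch: peek at the p value every few participants and stop as soon as it crosses .05. The sample sizes, peeking schedule, and number of simulated studies are arbitrary.

# Under a true null effect, optional stopping on p < .05 inflates the
# false-positive rate well above the nominal 5%.
import numpy as np
from scipy.stats import ttest_1samp

rng = np.random.default_rng(0)

def optional_stopping_trial(n_max=100, batch=10, alpha=0.05):
    data = []
    while len(data) < n_max:
        data.extend(rng.standard_normal(batch))    # true effect size is zero
        if ttest_1samp(data, 0.0).pvalue < alpha:
            return True                            # "significant" -> stop and publish
    return False

false_positives = sum(optional_stopping_trial() for _ in range(2000)) / 2000
print(f"false-positive rate with optional stopping: {false_positives:.3f}")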
Path integration mediated systematic search: a Bayesian model.
Vickerstaff, Robert J; Merkle, Tobias
2012-08-21
The systematic search behaviour is a backup system that increases the chances of desert ants finding their nest entrance after foraging when the path integrator has failed to guide them home accurately enough. Here we present a mathematical model of the systematic search that is based on extensive behavioural studies in North African desert ants Cataglyphis fortis. First, a simple search heuristic utilising Bayesian inference and a probability density function is developed. This model, which optimises the short-term nest detection probability, is then compared to three simpler search heuristics and to recorded search patterns of Cataglyphis ants. To compare the different searches a method to quantify search efficiency is established as well as an estimate of the error rate in the ants' path integrator. We demonstrate that the Bayesian search heuristic is able to automatically adapt to increasing levels of positional uncertainty to produce broader search patterns, just as desert ants do, and that it outperforms the three other search heuristics tested. The searches produced by it are also arguably the most similar in appearance to the ant's searches. Copyright © 2012 Elsevier Ltd. All rights reserved.
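A much-simplified sketch of the Bayesian search idea, assuming an invented grid size, prior spread, and per-visit detection probability (the paper's path-integration error model and efficiency metric are omitted): keep a probability grid for the nest location, search the most probable cell, and downweight it after a miss.

# Greedy Bayesian search over a probability grid for the nest location.
import numpy as np

rng = np.random.default_rng(1)
size, p_detect = 21, 0.8

# Gaussian prior centred on the path integrator's (uncertain) home estimate.
xs = np.arange(size) - size // 2
prior = np.exp(-(xs[:, None] ** 2 + xs[None, :] ** 2) / (2 * 4.0 ** 2))
belief = prior / prior.sum()

true_nest = (size // 2 + 3, size // 2 - 2)
for step in range(200):
    cell = np.unravel_index(np.argmax(belief), belief.shape)   # search the best cell
    if cell == true_nest and rng.random() < p_detect:
        print(f"nest found after {step + 1} searches")
        break
    belief[cell] *= (1.0 - p_detect)    # Bayesian update after a miss
    belief /= belief.sum()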
NASA Astrophysics Data System (ADS)
Zhu, Wei; Timmermans, Harry
2011-06-01
Models of geographical choice behavior have been dominantly based on rational choice models, which assume that decision makers are utility-maximizers. Rational choice models may be less appropriate as behavioral models when modeling decisions in complex environments in which decision makers may simplify the decision problem using heuristics. Pedestrian behavior in shopping streets is an example. We therefore propose a modeling framework for pedestrian shopping behavior incorporating principles of bounded rationality. We extend three classical heuristic rules (conjunctive, disjunctive and lexicographic rule) by introducing threshold heterogeneity. The proposed models are implemented using data on pedestrian behavior in Wang Fujing Street, the city center of Beijing, China. The models are estimated and compared with multinomial logit models and mixed logit models. Results show that the heuristic models are the best for all the decisions that are modeled. Validation tests are carried out through multi-agent simulation by comparing simulated spatio-temporal agent behavior with the observed pedestrian behavior. The predictions of heuristic models are slightly better than those of the multinomial logit models.
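Bare-bones versions of the three classical rules named above, applied to a made-up choice between two stores; the attribute values, thresholds, and importance order are illustrative, and the paper's threshold heterogeneity across pedestrians is not modelled.

# Conjunctive, disjunctive, and lexicographic choice rules on invented attributes.
stores = {
    "store_A": {"price": 7, "variety": 4, "distance": 8},
    "store_B": {"price": 5, "variety": 9, "distance": 6},
}
thresholds = {"price": 5, "variety": 5, "distance": 5}

def conjunctive(opt):      # acceptable only if ALL attributes pass their thresholds
    return all(v >= thresholds[a] for a, v in stores[opt].items())

def disjunctive(opt):      # acceptable if ANY attribute passes its threshold
    return any(v >= thresholds[a] for a, v in stores[opt].items())

def lexicographic(options, order=("price", "variety", "distance")):
    for attribute in order:             # compare on the most important attribute first
        best = max(stores[o][attribute] for o in options)
        options = [o for o in options if stores[o][attribute] == best]
        if len(options) == 1:
            return options[0]
    return options[0]

print([o for o in stores if conjunctive(o)])   # -> ['store_B']
print([o for o in stores if disjunctive(o)])   # -> ['store_A', 'store_B']
print(lexicographic(list(stores)))             # -> 'store_A'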
Strategy selection as rational metareasoning.
Lieder, Falk; Griffiths, Thomas L
2017-11-01
Many contemporary accounts of human reasoning assume that the mind is equipped with multiple heuristics that could be deployed to perform a given task. This raises the question of how the mind determines when to use which heuristic. To answer this question, we developed a rational model of strategy selection, based on the theory of rational metareasoning developed in the artificial intelligence literature. According to our model people learn to efficiently choose the strategy with the best cost-benefit tradeoff by learning a predictive model of each strategy's performance. We found that our model can provide a unifying explanation for classic findings from domains ranging from decision-making to arithmetic by capturing the variability of people's strategy choices, their dependence on task and context, and their development over time. Systematic model comparisons supported our theory, and 4 new experiments confirmed its distinctive predictions. Our findings suggest that people gradually learn to make increasingly more rational use of fallible heuristics. This perspective reconciles the 2 poles of the debate about human rationality by integrating heuristics and biases with learning and rationality. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
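A toy rendering of the cost-benefit idea behind rational strategy selection, under invented payoff distributions and effort costs: keep a running estimate of each strategy's reward, subtract its cost, and pick the strategy with the best estimated net value (with occasional exploration).

# Learn which of two strategies has the better reward-minus-cost tradeoff.
import random

random.seed(2)
strategies = {
    "fast_heuristic": {"mean_reward": 0.6, "cost": 0.05},
    "careful_analysis": {"mean_reward": 0.8, "cost": 0.30},
}
estimates = {name: {"value": 0.0, "n": 0} for name in strategies}

def choose(epsilon=0.1):
    if random.random() < epsilon:
        return random.choice(list(strategies))
    return max(estimates, key=lambda s: estimates[s]["value"] - strategies[s]["cost"])

for trial in range(500):
    s = choose()
    reward = random.gauss(strategies[s]["mean_reward"], 0.2)   # simulated outcome
    est = estimates[s]
    est["n"] += 1
    est["value"] += (reward - est["value"]) / est["n"]          # running mean

print({s: round(e["value"] - strategies[s]["cost"], 2) for s, e in estimates.items()})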
Gigerenzer, Gerd
2009-01-01
In their comment on Marewski et al. (good judgments do not require complex cognition, 2009) Evans and Over (heuristic thinking and human intelligence: a commentary on Marewski, Gaissmaier and Gigerenzer, 2009) conjectured that heuristics can often lead to biases and are not error free. This is a most surprising critique. The computational models of heuristics we have tested allow for quantitative predictions of how many errors a given heuristic will make, and we and others have measured the amount of error by analysis, computer simulation, and experiment. This is clear progress over simply giving heuristics labels, such as availability, that do not allow for quantitative comparisons of errors. Evans and Over argue that the reason people rely on heuristics is the accuracy-effort trade-off. However, the comparison between heuristics and more effortful strategies, such as multiple regression, has shown that there are many situations in which a heuristic is more accurate with less effort. Finally, we do not see how the fast and frugal heuristics program could benefit from a dual-process framework unless the dual-process framework is made more precise. Instead, the dual-process framework could benefit if its two “black boxes” (Type 1 and Type 2 processes) were substituted by computational models of both heuristics and other processes. PMID:19784854
Gigerenzer, Gerd; Gaissmaier, Wolfgang
2011-01-01
As reflected in the amount of controversy, few areas in psychology have undergone such dramatic conceptual changes in the past decade as the emerging science of heuristics. Heuristics are efficient cognitive processes, conscious or unconscious, that ignore part of the information. Because using heuristics saves effort, the classical view has been that heuristic decisions imply greater errors than do "rational" decisions as defined by logic or statistical models. However, for many decisions, the assumptions of rational models are not met, and it is an empirical rather than an a priori issue how well cognitive heuristics function in an uncertain world. To answer both the descriptive question ("Which heuristics do people use in which situations?") and the prescriptive question ("When should people rely on a given heuristic rather than a complex strategy to make better judgments?"), formal models are indispensable. We review research that tests formal models of heuristic inference, including in business organizations, health care, and legal institutions. This research indicates that (a) individuals and organizations often rely on simple heuristics in an adaptive way, and (b) ignoring part of the information can lead to more accurate judgments than weighting and adding all information, for instance for low predictability and small samples. The big future challenge is to develop a systematic theory of the building blocks of heuristics as well as the core capacities and environmental structures these exploit.
Reconsidering "evidence" for fast-and-frugal heuristics.
Hilbig, Benjamin E
2010-12-01
In several recent reviews, authors have argued for the pervasive use of fast-and-frugal heuristics in human judgment. They have provided an overview of heuristics and have reiterated findings corroborating that such heuristics can be very valid strategies leading to high accuracy. They also have reviewed previous work that implies that simple heuristics are actually used by decision makers. Unfortunately, concerning the latter point, these reviews appear to be somewhat incomplete. More important, previous conclusions have been derived from investigations that bear some noteworthy methodological limitations. I demonstrate these by proposing a new heuristic and provide some novel critical findings. Also, I review some of the relevant literature often not-or only partially-considered. Overall, although some fast-and-frugal heuristics indeed seem to predict behavior at times, there is little to no evidence for others. More generally, the empirical evidence available does not warrant the conclusion that heuristics are pervasively used.
VHP - An environment for the remote visualization of heuristic processes
NASA Technical Reports Server (NTRS)
Crawford, Stuart L.; Leiner, Barry M.
1991-01-01
A software system called VHP is introduced which permits the visualization of heuristic algorithms on both resident and remote hardware platforms. The VHP is based on the DCF tool for interprocess communication and is applicable to remote algorithms which can be on different types of hardware and in languages other than VHP. The VHP system is of particular interest to systems in which the visualization of remote processes is required such as robotics for telescience applications.
Remarks on a New Possible Discretization Scheme for Gauge Theories
NASA Astrophysics Data System (ADS)
Magnot, Jean-Pierre
2018-03-01
We propose here a new discretization method for a class of continuum gauge theories whose action functionals are polynomials in the curvature. Based on the notion of holonomy, this discretization procedure appears gauge-invariant for discretized analogs of Yang-Mills theories, and hence gauge-fixing is fully rigorous for these discretized action functionals. Heuristic parts are forwarded to the quantization procedure via Feynman integrals, and the meaning of the heuristic infinite-dimensional Lebesgue integral is questioned.
POCO-MOEA: Using Evolutionary Algorithms to Solve the Controller Placement Problem
2016-03-24
to gather data on POCO-MOEA performance on a series of model networks. The algorithm's behavior is then evaluated and compared to exhaustive… evaluation of a third heuristic based on a Multi-Objective Evolutionary Algorithm (MOEA). This heuristic is modeled after one of the most well known MOEAs… researchers to extend into more realistic evaluations of the performance characteristics of SDN controllers, such as the use of simulators or live
An approach to combining heuristic and qualitative reasoning in an expert system
NASA Technical Reports Server (NTRS)
Jiang, Wei-Si; Han, Chia Yung; Tsai, Lian Cheng; Wee, William G.
1988-01-01
An approach to combining the heuristic reasoning from shallow knowledge and the qualitative reasoning from deep knowledge is described. The shallow knowledge is represented in production rules and under the direct control of the inference engine. The deep knowledge is represented in frames, which may be put in a relational DataBase Management System. This approach takes advantage of both reasoning schemes and results in improved efficiency as well as expanded problem solving ability.
Remarks on a New Possible Discretization Scheme for Gauge Theories
NASA Astrophysics Data System (ADS)
Magnot, Jean-Pierre
2018-07-01
We propose here a new discretization method for a class of continuum gauge theories whose action functionals are polynomials in the curvature. Based on the notion of holonomy, this discretization procedure appears gauge-invariant for discretized analogs of Yang-Mills theories, and hence gauge-fixing is fully rigorous for these discretized action functionals. Heuristic parts are forwarded to the quantization procedure via Feynman integrals, and the meaning of the heuristic infinite-dimensional Lebesgue integral is questioned.
Scope of Various Random Number Generators in Ant System Approach for TSP
NASA Technical Reports Server (NTRS)
Sen, S. K.; Shaykhian, Gholam Ali
2007-01-01
Several quasi- and pseudo-random number generators are tested within a heuristic based on an ant system approach for the traveling salesman problem. This experiment explores whether any particular generator is most desirable. Such an experiment on large samples has the potential to rank the performance of the generators for the foregoing heuristic. This is intended to seek an answer to the controversial performance ranking of the generators in a probabilistic/statistical sense.
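The abstract does not detail the heuristic itself, so the sketch below only shows the single step of an ant system where the random number generator actually enters: the roulette-wheel choice of the next city from pheromone and distance information. Swapping the rng object for a quasi-random or alternative pseudo-random source is the kind of comparison described above; the distances, pheromone values, and parameters are arbitrary.

# Ant system next-city choice with a pluggable random number generator.
import random

def choose_next_city(current, unvisited, pheromone, dist, rng, alpha=1.0, beta=2.0):
    weights = [(pheromone[current][j] ** alpha) * ((1.0 / dist[current][j]) ** beta)
               for j in unvisited]
    total = sum(weights)
    r, acc = rng.random() * total, 0.0            # roulette-wheel selection
    for j, w in zip(unvisited, weights):
        acc += w
        if acc >= r:
            return j
    return unvisited[-1]

n = 5
dist = [[1 if i == j else abs(i - j) + 1 for j in range(n)] for i in range(n)]
pheromone = [[1.0] * n for _ in range(n)]
rng = random.Random(42)                           # the pluggable generator
print(choose_next_city(0, [1, 2, 3, 4], pheromone, dist, rng))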
Cultural heuristics in risk assessment of HIV/AIDS.
Bailey, Ajay; Hutter, Inge
2006-01-01
Behaviour change models in HIV prevention tend to consider that risky sexual behaviours reflect risk assessments and that by changing risk assessments behaviour can be changed. Risk assessment is however culturally constructed. Individuals use heuristics or bounded cognitive devices derived from broader cultural meaning systems to rationalize uncertainty. In this study, we identify some of the cultural heuristics used by migrant men in Goa, India to assess their risk of HIV infection from different sexual partners. Data derives from a series of in-depth interviews and a locally informed survey. Cultural heuristics identified include visual heuristics, heuristics of gender roles, vigilance and trust. The paper argues that, for more culturally informed HIV/AIDS behaviour change interventions, knowledge of cultural heuristics is essential.
Fast or Frugal, but Not Both: Decision Heuristics Under Time Pressure
2017-01-01
Heuristics are simple, yet effective, strategies that people use to make decisions. Because heuristics do not require all available information, they are thought to be easy to implement and to not tax limited cognitive resources, which has led heuristics to be characterized as fast-and-frugal. We question this monolithic conception of heuristics by contrasting the cognitive demands of two popular heuristics, Tallying and Take-the-Best. We contend that heuristics that are frugal in terms of information usage may not always be fast because of the attentional control required to implement this focus in certain contexts. In support of this hypothesis, we find that Take-the-Best, while being more frugal in terms of information usage, is slower to implement and fares worse under time pressure manipulations than Tallying. This effect is then reversed when search costs for Take-the-Best are reduced by changing the format of the stimuli. These findings suggest that heuristics are heterogeneous and should be unpacked according to their cognitive demands to determine the circumstances a heuristic best applies. PMID:28557503
Fast or frugal, but not both: Decision heuristics under time pressure.
Bobadilla-Suarez, Sebastian; Love, Bradley C
2018-01-01
Heuristics are simple, yet effective, strategies that people use to make decisions. Because heuristics do not require all available information, they are thought to be easy to implement and to not tax limited cognitive resources, which has led heuristics to be characterized as fast-and-frugal. We question this monolithic conception of heuristics by contrasting the cognitive demands of two popular heuristics, Tallying and Take-the-Best. We contend that heuristics that are frugal in terms of information usage may not always be fast because of the attentional control required to implement this focus in certain contexts. In support of this hypothesis, we find that Take-the-Best, while being more frugal in terms of information usage, is slower to implement and fares worse under time pressure manipulations than Tallying. This effect is then reversed when search costs for Take-the-Best are reduced by changing the format of the stimuli. These findings suggest that heuristics are heterogeneous and should be unpacked according to their cognitive demands to determine the circumstances a heuristic best applies. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
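Minimal implementations of the two heuristics contrasted above, deciding which of two options scores higher on a criterion from binary cues; the cue values and the validity ordering are invented for illustration.

# Take-the-Best searches cues by validity and stops at the first discriminating cue;
# Tallying ignores validities and counts positive cues.
def take_the_best(cues_a, cues_b, validity_order):
    for cue in validity_order:
        if cues_a[cue] != cues_b[cue]:
            return "A" if cues_a[cue] > cues_b[cue] else "B"
    return "guess"

def tallying(cues_a, cues_b):
    a, b = sum(cues_a.values()), sum(cues_b.values())
    return "A" if a > b else "B" if b > a else "guess"

a = {"capital": 1, "airport": 0, "university": 1}
b = {"capital": 0, "airport": 1, "university": 1}
print(take_the_best(a, b, ["capital", "airport", "university"]))  # -> "A"
print(tallying(a, b))                                             # -> "guess"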
Varying execution discipline to increase performance
DOE Office of Scientific and Technical Information (OSTI.GOV)
Campbell, P.L.; Maccabe, A.B.
1993-12-22
This research investigates the relationship between execution discipline and performance. The hypothesis has two parts: 1. Different execution disciplines exhibit different performance for different computations, and 2. These differences can be effectively predicted by heuristics. A machine model is developed that can vary its execution discipline. That is, the model can execute a given program using either the control-driven, data-driven or demand-driven execution discipline. This model is referred to as a "variable-execution-discipline" machine. The instruction set for the model is the Program Dependence Web (PDW). The first part of the hypothesis will be tested by simulating the execution of the machine model on a suite of computations, based on the Livermore Fortran Kernel (LFK) Test (a.k.a. the Livermore Loops), using all three execution disciplines. Heuristics are developed to predict relative performance. These heuristics predict (a) the execution time under each discipline for one iteration of each loop and (b) the number of iterations taken by that loop; then the heuristics use those predictions to develop a prediction for the execution of the entire loop. Similar calculations are performed for branch statements. The second part of the hypothesis will be tested by comparing the results of the simulated execution with the predictions produced by the heuristics. If the hypothesis is supported, then the door is open for the development of machines that can vary execution discipline to increase performance.
Decision Making in Paediatric Cardiology. Are We Prone to Heuristics, Biases and Traps?
Ryan, Aedin; Duignan, Sophie; Kenny, Damien; McMahon, Colin J
2018-01-01
Hidden traps in decision making have been long recognised in the behavioural economics community. Yet we spend very limited, if any time, analysing our decision-making processes in medicine and paediatric cardiology. Systems 1 and 2 thought processes differentiate between rapid emotional thoughts and slow deliberate rational thoughts. For fairly clear cut medical decisions, in-depth analysis may not be needed, but in our field of paediatric cardiology it is not uncommon for challenging cases and occasionally 'simple' cases to generate significant debate and uncertainty as to the best decision. Although morbidity and mortality meetings frequently highlight poor outcomes for our patients, they often neglect to analyse the process of thought which underlined those decisions taken. This article attempts to review commonly acknowledged traps in decision making in the behavioural economics world to ascertain whether these heuristics translate to decision making in the paediatric cardiology environment. We also discuss potential individual and collective solutions to pitfalls in decision making.
Opposition-Based Memetic Algorithm and Hybrid Approach for Sorting Permutations by Reversals.
Soncco-Álvarez, José Luis; Muñoz, Daniel M; Ayala-Rincón, Mauricio
2018-02-21
Sorting unsigned permutations by reversals is a difficult problem; indeed, it was proved to be NP-hard by Caprara (1997). Because of its high complexity, many approximation algorithms to compute the minimal reversal distance were proposed until reaching the nowadays best-known theoretical ratio of 1.375. In this article, two memetic algorithms to compute the reversal distance are proposed. The first one uses the technique of opposition-based learning leading to an opposition-based memetic algorithm; the second one improves the previous algorithm by applying the heuristic of two breakpoint elimination leading to a hybrid approach. Several experiments were performed with one-hundred randomly generated permutations, single benchmark permutations, and biological permutations. Results of the experiments showed that the proposed OBMA and Hybrid-OBMA algorithms achieve the best results for practical cases, that is, for permutations of length up to 120. Also, Hybrid-OBMA showed to improve the results of OBMA for permutations greater than or equal to 60. The applicability of our proposed algorithms was checked processing permutations based on biological data, in which case OBMA gave the best average results for all instances.
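The memetic algorithms themselves are not reproduced here, but two of the underlying building blocks can be sketched on unsigned permutations: the "opposite" permutation used by opposition-based learning, and the breakpoint count that drives breakpoint-elimination heuristics (shown as a single greedy reversal, not the full search). The example permutation is arbitrary.

# Opposite permutation, breakpoint count, and one greedy breakpoint-reducing reversal.
def opposite(perm):
    n = len(perm)
    return [n + 1 - x for x in perm]            # opposite candidate over {1..n}

def breakpoints(perm):
    ext = [0] + list(perm) + [len(perm) + 1]    # sentinels 0 and n+1
    return sum(1 for i in range(len(ext) - 1) if abs(ext[i + 1] - ext[i]) != 1)

def best_reversal(perm):
    """Return the reversal (i, j) whose application removes the most breakpoints."""
    candidates = [(i, j) for i in range(len(perm)) for j in range(i + 2, len(perm) + 1)]
    return min(candidates,
               key=lambda ij: breakpoints(perm[:ij[0]] + perm[ij[0]:ij[1]][::-1] + perm[ij[1]:]))

p = [3, 1, 4, 2, 5]
i, j = best_reversal(p)
print(opposite(p), breakpoints(p), p[:i] + p[i:j][::-1] + p[j:])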
Yuan, Michael Juntao; Finley, George Mike; Long, Ju; Mills, Christy; Johnson, Ron Kim
2013-01-31
Clinical decision support systems (CDSS) are important tools to improve health care outcomes and reduce preventable medical adverse events. However, the effectiveness and success of CDSS depend on their implementation context and usability in complex health care settings. As a result, usability design and validation, especially in real world clinical settings, are crucial aspects of successful CDSS implementations. Our objective was to develop a novel CDSS to help frontline nurses better manage critical symptom changes in hospitalized patients, hence reducing preventable failure to rescue cases. A robust user interface and implementation strategy that fit into existing workflows was key for the success of the CDSS. Guided by a formal usability evaluation framework, UFuRT (user, function, representation, and task analysis), we developed a high-level specification of the product that captures key usability requirements and is flexible to implement. We interviewed users of the proposed CDSS to identify requirements, listed functions, and operations the system must perform. We then designed visual and workflow representations of the product to perform the operations. The user interface and workflow design were evaluated via heuristic and end user performance evaluation. The heuristic evaluation was done after the first prototype, and its results were incorporated into the product before the end user evaluation was conducted. First, we recruited 4 evaluators with strong domain expertise to study the initial prototype. Heuristic violations were coded and rated for severity. Second, after development of the system, we assembled a panel of nurses, consisting of 3 licensed vocational nurses and 7 registered nurses, to evaluate the user interface and workflow via simulated use cases. We recorded whether each session was successfully completed and its completion time. Each nurse was asked to use the National Aeronautics and Space Administration (NASA) Task Load Index to self-evaluate the amount of cognitive and physical burden associated with using the device. A total of 83 heuristic violations were identified in the studies. The distribution of the heuristic violations and their average severity are reported. The nurse evaluators successfully completed all 30 sessions of the performance evaluations. All nurses were able to use the device after a single training session. On average, the nurses took 111 seconds (SD 30 seconds) to complete the simulated task. The NASA Task Load Index results indicated that the work overhead on the nurses was low. In fact, most of the burden measures were consistent with zero. The only potentially significant burden was temporal demand, which was consistent with the primary use case of the tool. The evaluation has shown that our design was functional and met the requirements demanded by the nurses' tight schedules and heavy workloads. The user interface embedded in the tool provided compelling utility to the nurse with minimal distraction.
Choi, Jeungok; Bakken, Suzanne
2010-01-01
Purpose Low health literacy has been associated with poor health-related outcomes. The purposes are to report the development of a website for low-literate parents in the Neonatal Intensive Care Unit (NICU), and the findings of heuristic evaluation and a usability testing of this website. Methods To address low literacy of NICU parents, multimedia educational Website using visual aids (e.g., pictographs, photographs), voice-recorded text message in addition to a simplified text was developed. The text was created at the 5th grade readability level. The heuristic evaluation was conducted by three usability experts using 10 heuristics. End-users’ performance was measured by counting the time spent completing tasks and number of errors, as well as recording users’ perception of ease of use and usefulness (PEUU) in a sample of 10 NICU parents. Results Three evaluators identified 82 violations across the 10 heuristics. All violations, however, received scores <2, indicating minor usability problems. Participants’ time to complete task varies from 81.2 seconds (SD=30.9) to 2.2 seconds (SD=1.3). Participants rated the Website as easy to use and useful (PEUU Mean= 4.52, SD=0.53). Based on the participants’ comments, appropriate modifications were made. Discussion and Conclusions Different types of visuals on the Website were well accepted by low-literate users and agreement of visuals with text improved understanding of the educational materials over that with text alone. The findings suggest that using concrete and realistic pictures and pictographs with clear captions would maximize the benefit of visuals. One emerging theme was “simplicity” in design (e.g., limited use of colors, one font type and size), content (e.g., avoid lengthy text), and technical features (e.g., limited use of pop-ups). The heuristic evaluation by usability experts and the usability test with actual users provided complementary expertise, which can give a richer assessment of a design for low literacy Website. These results facilitated design modification and implementation of solutions by categorizing and prioritizing the usability problems. PMID:20617546
Choi, Jeungok; Bakken, Suzanne
2010-08-01
Low health literacy has been associated with poor health-related outcomes. The purposes are to report the development of a website for low-literate parents in the Neonatal Intensive Care Unit (NICU), and the findings of heuristic evaluation and a usability testing of this website. To address low literacy of NICU parents, multimedia educational Website using visual aids (e.g., pictographs, photographs), voice-recorded text message in addition to a simplified text was developed. The text was created at the 5th grade readability level. The heuristic evaluation was conducted by three usability experts using 10 heuristics. End-users' performance was measured by counting the time spent completing tasks and number of errors, as well as recording users' perception of ease of use and usefulness (PEUU) in a sample of 10 NICU parents. Three evaluators identified 82 violations across the 10 heuristics. All violations, however, received scores <2, indicating minor usability problems. Participants' time to complete task varies from 81.2 s (SD = 30.9) to 2.2 s (SD = 1.3). Participants rated the Website as easy to use and useful (PEUU mean = 4.52, SD = 0.53). Based on the participants' comments, appropriate modifications were made. Different types of visuals on the Website were well accepted by low-literate users and agreement of visuals with text improved understanding of the educational materials over that with text alone. The findings suggest that using concrete and realistic pictures and pictographs with clear captions would maximize the benefit of visuals. One emerging theme was "simplicity" in design (e.g., limited use of colors, one font type and size), content (e.g., avoid lengthy text), and technical features (e.g., limited use of pop-ups). The heuristic evaluation by usability experts and the usability test with actual users provided complementary expertise, which can give a richer assessment of a design for low literacy Website. These results facilitated design modification and implementation of solutions by categorizing and prioritizing the usability problems.
The Shannon entropy as a measure of diffusion in multidimensional dynamical systems
NASA Astrophysics Data System (ADS)
Giordano, C. M.; Cincotta, P. M.
2018-05-01
In the present work, we introduce two new estimators of chaotic diffusion based on the Shannon entropy. Using theoretical, heuristic and numerical arguments, we show that the entropy, S, provides a measure of the diffusion extent of a given small initial ensemble of orbits, while an indicator related with the time derivative of the entropy, S', estimates the diffusion rate. We show that in the limiting case of near ergodicity, after an appropriate normalization, S' coincides with the standard homogeneous diffusion coefficient. The very first application of this formulation to a 4D symplectic map and to the Arnold Hamiltonian reveals very successful and encouraging results.
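A numerical sketch of the entropy-based diffusion measure, with a plain random walk standing in for the map or Hamiltonian flow and all parameters chosen arbitrarily: evolve an ensemble of orbits, partition the coordinate into cells, and track the Shannon entropy S(t) of the cell-occupation distribution together with its growth rate S'.

# Shannon entropy of an evolving ensemble as a diffusion indicator.
import numpy as np

rng = np.random.default_rng(0)
n_orbits, n_steps, cell = 2000, 200, 0.05
x = np.zeros(n_orbits)                        # small initial ensemble at the origin

entropies = []
for t in range(n_steps):
    x += rng.normal(0.0, 0.01, n_orbits)      # one diffusion step per orbit
    counts = np.unique(np.floor(x / cell), return_counts=True)[1]
    p = counts / counts.sum()
    entropies.append(-(p * np.log(p)).sum())  # Shannon entropy of occupied cells

S = np.array(entropies)
S_prime = np.gradient(S)                      # entropy growth rate ~ diffusion rate
print(round(S[-1], 3), round(S_prime[-50:].mean(), 5))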
Periodic Orbits and Semiclassical Form Factor in Barrier Billiards
NASA Astrophysics Data System (ADS)
Giraud, O.
2005-11-01
Using heuristic arguments based on the trace formulas, we analytically calculate the semiclassical two-point correlation form factor for a family of rectangular billiards with a barrier of height irrational with respect to the side of the billiard and located at any rational position p/q from the side. To do this, we first obtain the asymptotic density of lengths for each family of periodic orbits by a Siegel-Veech formula. The result obtained for these pseudo-integrable, non-Veech billiards is different but not far from the value of 1/2 expected for semi-Poisson statistics and from values obtained previously in the case of Veech billiards.
Reexamining our bias against heuristics.
McLaughlin, Kevin; Eva, Kevin W; Norman, Geoff R
2014-08-01
Using heuristics offers several cognitive advantages, such as increased speed and reduced effort when making decisions, in addition to allowing us to make decisions in situations where missing data do not allow for formal reasoning. But the traditional view of heuristics is that they trade accuracy for efficiency. Here the authors discuss sources of bias in the literature implicating the use of heuristics in diagnostic error and highlight the fact that there are also data suggesting that under certain circumstances using heuristics may lead to better decisions than formal analysis. They suggest that diagnostic error is frequently misattributed to the use of heuristics and propose an alternative view whereby content knowledge is the root cause of diagnostic performance and heuristics lie on the causal pathway between knowledge and diagnostic error or success.
Not so fast! (and not so frugal!): rethinking the recognition heuristic.
Oppenheimer, Daniel M
2003-11-01
The 'fast and frugal' approach to reasoning (Gigerenzer, G., & Todd, P. M. (1999). Simple heuristics that make us smart. New York: Oxford University Press) claims that individuals use non-compensatory strategies in judgment--the idea that only one cue is taken into account in reasoning. The simplest and most important of these heuristics postulates that judgment sometimes relies solely on recognition. However, the studies that have investigated usage of the recognition heuristic have confounded recognition with other cues that could also lead to similar judgments. This paper tests whether mere recognition is actually driving the findings in support of the recognition heuristic. Two studies provide evidence that judgments do not conform to the recognition heuristic when these confounds are accounted for. Implications for the study of simple heuristics are discussed.
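For reference, the recognition heuristic itself reduces to a one-line rule; the sketch below (recognition values are hypothetical) also shows the case where the heuristic does not apply and other cues, including the confounds discussed above, must do the work.

# Recognition heuristic: infer that the recognised option scores higher.
def recognition_heuristic(recognised_a, recognised_b):
    if recognised_a and not recognised_b:
        return "A"
    if recognised_b and not recognised_a:
        return "B"
    return "use other knowledge or guess"   # heuristic does not apply

print(recognition_heuristic(True, False))   # -> "A"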
The E-health Literacy Demands of Australia's My Health Record: A Heuristic Evaluation of Usability.
Walsh, Louisa; Hemsley, Bronwyn; Allan, Meredith; Adams, Natalie; Balandin, Susan; Georgiou, Andrew; Higgins, Isabel; McCarthy, Shaun; Hill, Sophie
2017-01-01
My Health Record is Australia's electronic personal health record system, which was introduced in July 2012. As of August 2017, approximately 21 percent of Australia's total population was registered to use My Health Record. Internationally, usability issues have been shown to negatively influence the uptake and use of electronic health record systems, and this scenario may particularly affect people who have low e-health literacy. It is likely that usability issues are negatively affecting the uptake and use of My Health Record in Australia. To identify potential e-health literacy-related usability issues within My Health Record through a heuristic evaluation method. Between September 14 and October 12, 2016, three of the authors conducted a heuristic evaluation of the two consumer-facing components of My Health Record-the information website and the electronic health record itself. These two components were evaluated against two sets of heuristics-the Health Literacy Online checklist and the Monkman Heuristics. The Health Literacy Online checklist and Monkman Heuristics are evidence-based checklists of web design elements with a focus on design for audiences with low health literacy. During this heuristic evaluation, the investigators individually navigated through the consumer-facing components of My Health Record, recording instances where the My Health Record did not conform to the checklist criteria. After the individual evaluations were completed, the investigators conferred and aggregated their results. From this process, a list of usability violations was constructed. When evaluated against the Health Literacy Online Checklist, the information website demonstrated violations in 12 of 35 criteria, and the electronic health record demonstrated violations in 16 of 35 criteria. When evaluated against the Monkman Heuristics, the information website demonstrated violations in 7 of 11 criteria, and the electronic health record demonstrated violations in 9 of 11 criteria. The identified violations included usability issues with the reading levels used within My Health Record, the graphic design elements, the layout of web pages, and a lack of images and audiovisual tools to support learning. Other important usability issues included a lack of translated resources, difficulty using accessibility tools, and complexity of the registration processes. My Health Record is an important piece of technology that has the potential to facilitate better communication between consumers and their health providers. However, this heuristic evaluation demonstrated that many usability-related elements of My Health Record cater poorly to users at risk of having low e-health literacy. Usability issues have been identified as an important barrier to use of personal health records internationally, and the findings of this heuristic evaluation demonstrate that usability issues may be substantial barriers to the uptake and use of My Health Record.
Critical Factors Analysis for Offshore Software Development Success by Structural Equation Modeling
NASA Astrophysics Data System (ADS)
Wada, Yoshihisa; Tsuji, Hiroshi
In order to analyze the success/failure factors in offshore software development services using structural equation modeling, this paper proposes to follow two approaches together: domain-knowledge-based heuristic analysis and factor-analysis-based rational analysis. The former serves to generate and verify hypotheses for finding factors and causalities. The latter serves to verify factors introduced by theory in order to build the model without heuristics. Applying the proposed combined approach to questionnaire responses from skilled project managers, this paper found that the vendor property has higher causality for success than the software property and the project property.
Multicriteria meta-heuristics for AGV dispatching control based on computational intelligence.
Naso, David; Turchiano, Biagio
2005-04-01
In many manufacturing environments, automated guided vehicles are used to move the processed materials between various pickup and delivery points. The assignment of vehicles to unit loads is a complex problem that is often solved in real time with simple dispatching rules. This paper proposes an automated guided vehicle dispatching approach based on computational intelligence. We adopt a fuzzy multicriteria decision strategy to simultaneously take multiple aspects into account in every dispatching decision. Since the typical short-term view of dispatching rules is one of the main limitations of such real-time assignment heuristics, we also incorporate in the multicriteria algorithm a specific heuristic rule that takes the empty-vehicle travel into account over a longer time horizon. Moreover, we adopt a genetic algorithm to tune the weights associated with each decision criterion in the global decision algorithm. The proposed approach is validated by means of a comparison with other dispatching rules, and with other recently proposed multicriteria dispatching strategies also based on computational intelligence. The analysis of the results obtained by the proposed dispatching approach in both nominal and perturbed operating conditions (congestion, faults) confirms its effectiveness.
Aiding USAF/UPT (Undergraduate Pilot Training) Aircrew Scheduling Using Network Flow Models.
1986-06-01
Contents fragment: 3.4 Heuristic Modifications; Chapter 4, Student Scheduling Problem (Level 2): 4.0 Introduction; 4.01 Constraints; "Covering" Complete Enumeration; 4.14 Heuristics; 4.2 Heuristic Method for the Level 2 Problem; 4.21 Step 1; 4.22 Step 2; 4.23 Advantages of the Heuristic Method; 4.24 Problems with the Heuristic Method; Chapter 5.
Tuning Parameters in Heuristics by Using Design of Experiments Methods
NASA Technical Reports Server (NTRS)
Arin, Arif; Rabadi, Ghaith; Unal, Resit
2010-01-01
With the growing complexity of today's large-scale problems, it has become more difficult to find optimal solutions using exact mathematical methods. The need to find near-optimal solutions in an acceptable time frame requires heuristic approaches. In many cases, however, heuristics have several parameters that need to be "tuned" before they can reach good results. The problem then becomes finding the best parameter setting for the heuristics so that they solve the problems efficiently and in a timely manner. The One-Factor-At-a-Time (OFAT) approach to parameter tuning neglects the interactions between parameters. Design of Experiments (DOE) tools can instead be employed to tune the parameters more effectively. In this paper, we seek the best parameter setting for a Genetic Algorithm (GA) to solve the single machine total weighted tardiness problem, in which n jobs must be scheduled on a single machine without preemption and the objective is to minimize the total weighted tardiness. Benchmark instances for the problem are available in the literature. To fine-tune the GA parameters in the most efficient way, we compare multiple DOE models including two-level (2^k) full factorial design, orthogonal array design, central composite design, D-optimal design, and signal-to-noise (S/N) ratios. In each DOE method, a mathematical model is created using regression analysis and solved to obtain the best parameter setting. After verification runs using the tuned parameter settings, preliminary results show that optimal solutions for multiple instances were found efficiently.
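To make the factorial-design idea concrete, the sketch below enumerates a two-level full factorial design over three hypothetical GA parameters. The low/high levels and the surrogate evaluate() function are assumptions for demonstration; they are not the factors, levels, or response model used in the paper.

```python
# Sketch of 2-level full factorial tuning of GA parameters (2**k runs).
from itertools import product

factors = {
    "population_size": (50, 200),     # assumed low/high levels
    "crossover_rate": (0.6, 0.9),
    "mutation_rate": (0.01, 0.1),
}

def evaluate(setting):
    # Toy surrogate response; in practice this would run the GA on the
    # benchmark instances and return mean total weighted tardiness.
    return ((setting["population_size"] - 150) ** 2 / 1000
            + (0.85 - setting["crossover_rate"]) ** 2
            + (setting["mutation_rate"] - 0.05) ** 2)

def full_factorial(factors):
    names = list(factors)
    for levels in product(*(factors[n] for n in names)):
        yield dict(zip(names, levels))

best = min(full_factorial(factors), key=evaluate)
print(best)
```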
NASA Astrophysics Data System (ADS)
Yarmand, Hamed; Winey, Brian; Craft, David
2013-09-01
Stereotactic body radiation therapy (SBRT) is characterized by delivering a high amount of dose in a short period of time. In SBRT the dose is delivered using open fields (e.g., beam’s-eye-view) known as ‘apertures’. Mathematical methods can be used for optimizing treatment planning for delivery of sufficient dose to the cancerous cells while keeping the dose to surrounding organs at risk (OARs) minimal. Two important elements of a treatment plan are quality and delivery time. Quality of a plan is measured based on the target coverage and dose to OARs. Delivery time heavily depends on the number of beams used in the plan as the setup times for different beam directions constitute a large portion of the delivery time. Therefore the ideal plan, in which all potential beams can be used, will be associated with a long impractical delivery time. We use the dose to OARs in the ideal plan to find the plan with the minimum number of beams which is guaranteed to be epsilon-optimal (i.e., a predetermined maximum deviation from the ideal plan is guaranteed). Since the treatment plan optimization is inherently a multi-criteria-optimization problem, the planner can navigate the ideal dose distribution Pareto surface and select a plan of desired target coverage versus OARs sparing, and then use the proposed technique to reduce the number of beams while guaranteeing epsilon-optimality. We use mixed integer programming (MIP) for optimization. To reduce the computation time for the resultant MIP, we use two heuristics: a beam elimination scheme and a family of heuristic cuts, known as ‘neighbor cuts’, based on the concept of ‘adjacent beams’. We show the effectiveness of the proposed technique on two clinical cases, a liver and a lung case. Based on our technique we propose an algorithm for fast generation of epsilon-optimal plans.
Approximation algorithms for the min-power symmetric connectivity problem
NASA Astrophysics Data System (ADS)
Plotnikov, Roman; Erzin, Adil; Mladenovic, Nenad
2016-10-01
We consider the NP-hard problem of synthesizing an optimal spanning communication subgraph in a given arbitrary simple edge-weighted graph. This problem arises in wireless networks when minimizing the total transmission power consumption. We propose several new heuristics based on the variable neighborhood search metaheuristic for the approximate solution of the problem. We performed a numerical experiment in which all proposed algorithms were executed on randomly generated test samples. For these instances, on average, our algorithms outperform the previously known heuristics.
Kepler: Analogies in the search for the law of refraction.
Cardona, Carlos Alberto
2016-10-01
This paper examines the methodology used by Kepler to discover a quantitative law of refraction. The aim is to argue that this methodology follows a heuristic method based on the following two Pythagorean principles: (1) sameness is made known by sameness, and (2) harmony arises from establishing a limit to what is unlimited. We will analyse some of the author's proposed analogies to find the aforementioned law and argue that the investigation's heuristic pursues such principles. Copyright © 2016 Elsevier Ltd. All rights reserved.
Weighted graph based ordering techniques for preconditioned conjugate gradient methods
NASA Technical Reports Server (NTRS)
Clift, Simon S.; Tang, Wei-Pai
1994-01-01
We describe the basis of a matrix ordering heuristic for improving the incomplete factorization used in preconditioned conjugate gradient techniques applied to anisotropic PDEs. Several new matrix ordering techniques, derived from well-known algorithms in combinatorial graph theory, which attempt to implement this heuristic, are described. These ordering techniques are tested against a number of matrices arising from linear anisotropic PDEs and compared with other matrix ordering techniques. A variation of RCM is shown to generally improve the quality of incomplete factorization preconditioners.
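As a concrete illustration of ordering before incomplete factorization, the sketch below applies the standard reverse Cuthill-McKee ordering from SciPy (related to, but not identical with, the paper's RCM variant) to a small sparse matrix. The matrix is a toy example, not one of the paper's PDE test cases.

```python
# Reorder a sparse symmetric matrix with reverse Cuthill-McKee before
# building an incomplete-factorization preconditioner.
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import reverse_cuthill_mckee

A = csr_matrix(np.array([
    [4., 1., 0., 0., 1.],
    [1., 4., 0., 1., 0.],
    [0., 0., 4., 1., 0.],
    [0., 1., 1., 4., 1.],
    [1., 0., 0., 1., 4.],
]))

perm = reverse_cuthill_mckee(A, symmetric_mode=True)   # permutation vector
A_reordered = A[perm, :][:, perm]                      # apply to rows and columns
print(perm)
print(A_reordered.toarray())
```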
NASA Astrophysics Data System (ADS)
Zhu, Meng-Hua; Liu, Liang-Gang; You, Zhong; Xu, Ao-Ao
2009-03-01
In this paper, a heuristic approach based on Slavic's peak searching method is employed to estimate the width of peak regions for background removal. Synthetic and experimental data are used to test this method. With the peak regions estimated by the proposed method over the whole spectrum, we find it simple and effective enough to be used together with the Statistics-sensitive Nonlinear Iterative Peak-Clipping (SNIP) method.
A Hyper-Heuristic Ensemble Method for Static Job-Shop Scheduling.
Hart, Emma; Sim, Kevin
2016-01-01
We describe a new hyper-heuristic method NELLI-GP for solving job-shop scheduling problems (JSSP) that evolves an ensemble of heuristics. The ensemble adopts a divide-and-conquer approach in which each heuristic solves a unique subset of the instance set considered. NELLI-GP extends an existing ensemble method called NELLI by introducing a novel heuristic generator that evolves heuristics composed of linear sequences of dispatching rules: each rule is represented using a tree structure and is itself evolved. Following a training period, the ensemble is shown to outperform both existing dispatching rules and a standard genetic programming algorithm on a large set of new test instances. In addition, it obtains superior results on a set of 210 benchmark problems from the literature when compared to two state-of-the-art hyper-heuristic approaches. Further analysis of the relationship between heuristics in the evolved ensemble and the instances each solves provides new insights into features that might describe similar instances.
Heuristic-based information acquisition and decision making among pilots.
Wiggins, Mark W; Bollwerk, Sandra
2006-01-01
This research was designed to examine the impact of heuristic-based approaches to the acquisition of task-related information on the selection of an optimal alternative during simulated in-flight decision making. The work integrated features of naturalistic and normative decision making and strategies of information acquisition within a computer-based, decision support framework. The study comprised two phases, the first of which involved familiarizing pilots with three different heuristic-based strategies of information acquisition: frequency, elimination by aspects, and majority of confirming decisions. The second stage enabled participants to choose one of the three strategies of information acquisition to resolve a fourth (choice) scenario. The results indicated that task-oriented experience, rather than the information acquisition strategies, predicted the selection of the optimal alternative. It was also evident that of the three strategies available, the elimination by aspects information acquisition strategy was preferred by most participants. It was concluded that task-oriented experience, rather than the process of information acquisition, predicted task accuracy during the decision-making task. It was also concluded that pilots have a preference for one particular approach to information acquisition. Applications of outcomes of this research include the development of decision support systems that adapt to the information-processing capabilities and preferences of users.
A proven knowledge-based approach to prioritizing process information
NASA Technical Reports Server (NTRS)
Corsberg, Daniel R.
1991-01-01
Many space-related processes are highly complex systems subject to sudden, major transients. In any complex process control system, a critical aspect is rapid analysis of the changing process information. During a disturbance, this task can overwhelm humans as well as computers. Humans deal with this by applying heuristics in determining significant information. A simple, knowledge-based approach to prioritizing information is described. The approach models those heuristics that humans would use in similar circumstances. The approach described has received two patents and was implemented in the Alarm Filtering System (AFS) at the Idaho National Engineering Laboratory (INEL). AFS was first developed for application in a nuclear reactor control room. It has since been used in chemical processing applications, where it has had a significant impact on control room environments. The approach uses knowledge-based heuristics to analyze data from process instrumentation and respond to that data according to knowledge encapsulated in objects and rules. While AFS cannot perform the complete diagnosis and control task, it has proven to be extremely effective at filtering and prioritizing information. AFS was used for over two years as a first level of analysis for human diagnosticians. Given the approach's proven track record in a wide variety of practical applications, it should be useful in both ground- and space-based systems.
The Production of "Proper Cheating" in Online Examinations within Technological Universities
ERIC Educational Resources Information Center
Kitto, Simon; Saltmarsh, Sue
2007-01-01
This paper uses poststructuralist theories of governmentality, agency, consumption and Barry's (2001) concept of Technological Societies, as a heuristic framework to trace the role of online education technologies in the instantiation of subjectification processes within contemporary Australian universities. This case study of the unintended…
Miller, Chad S
2013-01-01
Nearly half of medical errors can be attributed to an error of clinical reasoning or decision making. It is estimated that the correct diagnosis is missed or delayed in between 5% and 14% of acute hospital admissions. Through understanding why and how physicians make these errors, it is hoped that strategies can be developed to decrease their number. In the present case, a patient presented with dyspnea, gastrointestinal symptoms and weight loss; the diagnosis was initially missed because the treating physicians took mental shortcuts and relied on heuristics. Heuristics have an inherent bias that can lead to faulty reasoning or conclusions, especially in complex or difficult cases. Affective bias, which is the overinvolvement of emotion in clinical decision making, limited the information available for diagnosis because of the hesitancy to acquire a full history and perform a complete physical examination in this patient. Zebra retreat, another type of bias, occurs when a rare diagnosis figures prominently on the differential diagnosis but the physician retreats from it for various reasons. Zebra retreat also factored in the delayed diagnosis. Through the description of these clinical reasoning errors in an actual case, it is hoped that future errors can be prevented or that inspiration for additional research in this area will develop.
Joshi, Ashish; Perin, Douglas M Puricelli; Amadi, Chioma; Trout, Kate
2015-03-05
The study purpose was to conduct a heuristic evaluation of an interactive, bilingual touchscreen-enabled breastfeeding educational programme for Hispanic women living in rural settings in Nebraska. Three raters conducted the evaluation during May 2013 using principles of Nielsen's heuristics. A total of 271 screens were evaluated and included: interface (n = 5), programme sections (n = 223) and educational content (n = 43). A total of 97 heuristic violations were identified, mostly related to the interface (8 violations/5 screens) and programme components (89 violations/266 screens). The most common heuristic violations reported were recognition rather than recall (62%, n = 60), consistency and standards (14%, n = 14) and match between the system and the real world (9%, n = 9). The majority of the heuristic violations involved minor usability issues (73%, n = 71). The only grade 4 heuristic violation reported was due to the visibility of system status in the assessment modules. The results demonstrated that the system was mostly consistent with Nielsen's usability heuristics. With Nielsen's usability heuristics, it is possible to identify problems in a timely manner and to facilitate the identification and prioritisation of problems needing urgent attention at an earlier stage, before the final deployment of the system.
Experimental Matching of Instances to Heuristics for Constraint Satisfaction Problems.
Moreno-Scott, Jorge Humberto; Ortiz-Bayliss, José Carlos; Terashima-Marín, Hugo; Conant-Pablos, Santiago Enrique
2016-01-01
Constraint satisfaction problems are of special interest for the artificial intelligence and operations research community due to their many applications. Although heuristics involved in solving these problems have largely been studied in the past, little is known about the relation between instances and the respective performance of the heuristics used to solve them. This paper focuses on both the exploration of the instance space to identify relations between instances and good performing heuristics and how to use such relations to improve the search. Firstly, the document describes a methodology to explore the instance space of constraint satisfaction problems and evaluate the corresponding performance of six variable ordering heuristics for such instances in order to find regions on the instance space where some heuristics outperform the others. Analyzing such regions favors the understanding of how these heuristics work and contribute to their improvement. Secondly, we use the information gathered from the first stage to predict the most suitable heuristic to use according to the features of the instance currently being solved. This approach proved to be competitive when compared against the heuristics applied in isolation on both randomly generated and structured instances of constraint satisfaction problems.
Conflict and Bias in Heuristic Judgment
ERIC Educational Resources Information Center
Bhatia, Sudeep
2017-01-01
Conflict has been hypothesized to play a key role in recruiting deliberative processing in reasoning and judgment tasks. This claim suggests that changing the task so as to add incorrect heuristic responses that conflict with existing heuristic responses can make individuals less likely to respond heuristically and can increase response accuracy.…
Ideology in Writing Instruction: Reconsidering Invention Heuristics.
ERIC Educational Resources Information Center
Byard, Vicki
Modern writing textbooks tend to offer no heuristics, treat heuristics as if they do not have different impacts on inquiry, or take the view that heuristics are ideologically neutral pedagogies. Yet theory about language demonstrates that ideological neutrality is impossible. Any use of language in attempting to represent reality will inevitably…
An Effective Exercise for Teaching Cognitive Heuristics
ERIC Educational Resources Information Center
Swinkels, Alan
2003-01-01
This article describes a brief heuristics demonstration and offers suggestions for personalizing examples of heuristics by making them relevant to students. Students complete a handout asking for 4 judgments illustrative of such heuristics. The decisions are cast in the context of students' daily lives at their particular university. After the…
Qin, Xin; Ren, Run; Zhang, Zhi-Xue; Johnson, Russell E
2015-05-01
Employees routinely make judgments of 3 kinds of justice (i.e., distributive, procedural, and interactional), yet they may lack clear information to do so. This research examines how justice judgments are formed when clear information about certain types of justice is unavailable or ambiguous. Drawing from fairness heuristic theory, as well as more general theories of cognitive heuristics, we predict that when information for 1 type of justice is unclear (i.e., low in justice clarity), people infer its fairness based on other types of justice with clear information (i.e., high in justice clarity). Results across 3 studies employing different designs (correlational vs. experimental), samples (employees vs. students), and measures (proxy vs. direct) provided support for the proposed substitutability effects, especially when inferences were based on clear interactional justice information. Moreover, we found that substitutability effects were more likely to occur when employees had high (vs. low) need for cognitive closure. We conclude by discussing the theoretical contributions and practical implications of our findings. (c) 2015 APA, all rights reserved.
Derived heuristics-based consistent optimization of material flow in a gold processing plant
NASA Astrophysics Data System (ADS)
Myburgh, Christie; Deb, Kalyanmoy
2018-01-01
Material flow in a chemical processing plant often follows complicated control laws and involves plant capacity constraints. Importantly, the process involves discrete scenarios which, when modelled in a programming format, require if-then-else statements. The formulation of an optimization problem for such processes therefore becomes complicated, with nonlinear and non-differentiable objective and constraint functions. In handling such problems using classical point-based approaches, users often have to resort to modifications and indirect ways of representing the problem to suit the restrictions associated with classical methods. In a particular gold processing plant optimization problem, these facts are demonstrated by showing results from MATLAB®'s well-known fmincon routine. Thereafter, a customized evolutionary optimization procedure capable of handling all the complexities offered by the problem is developed. Although the evolutionary approach already produced results with comparatively low variance over multiple runs, its performance was further enhanced by introducing derived heuristics associated with the problem. In this article, the development and usage of derived heuristics in a practical problem are presented, and their importance for quick convergence of the overall algorithm is demonstrated.
NASA Astrophysics Data System (ADS)
Vatutin, Eduard
2017-12-01
The article analyses the effectiveness of heuristic methods based on limited depth-first search for obtaining solutions to the test problem of finding the shortest path in a graph. It briefly describes the group of methods, used to solve the problem, that limit the number of branches in the combinatorial search tree and the depth of the analysed subtree. A methodology for comparing experimental data to estimate solution quality is considered, based on computational experiments with samples of graphs with pseudo-random structure and selected numbers of vertices and arcs, carried out on the BOINC platform. The article also describes the experimental results, which identify the regions where the selected subset of heuristic methods is preferable depending on the size of the problem and the strength of the constraints. It is shown that the considered pair of methods is ineffective for the selected problem and significantly inferior, in solution quality, to the ant colony optimization method and its modification with combinatorial returns.
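To make the two restrictions concrete, here is a hedged Python sketch of a depth-first search for a short path that caps both the number of branches expanded per node and the subtree depth. The graph, parameter names, and branch-selection rule (cheapest edges first) are illustrative assumptions, not the exact methods evaluated in the article.

```python
# Depth-first search for a shortest path with limits on branching and depth.
def limited_dfs_shortest_path(graph, source, target, max_branches=3, max_depth=10):
    """graph: dict mapping node -> list of (neighbor, weight) pairs."""
    best = {"cost": float("inf"), "path": None}

    def dfs(node, cost, path, depth):
        if cost >= best["cost"] or depth > max_depth:
            return                                   # prune this subtree
        if node == target:
            best["cost"], best["path"] = cost, path
            return
        # Explore only the `max_branches` cheapest outgoing edges.
        for nxt, w in sorted(graph.get(node, []), key=lambda e: e[1])[:max_branches]:
            if nxt not in path:                      # avoid cycles
                dfs(nxt, cost + w, path + [nxt], depth + 1)

    dfs(source, 0, [source], 0)
    return best["path"], best["cost"]


toy = {"a": [("b", 1), ("c", 4)], "b": [("c", 2), ("d", 6)], "c": [("d", 1)]}
print(limited_dfs_shortest_path(toy, "a", "d"))      # (['a', 'b', 'c', 'd'], 4)
```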
Double-Group Particle Swarm Optimization and Its Application in Remote Sensing Image Segmentation
Shen, Liang; Huang, Xiaotao; Fan, Chongyi
2018-01-01
Particle Swarm Optimization (PSO) is a well-known meta-heuristic. It has been widely used in both research and engineering fields. However, the original PSO generally suffers from premature convergence, especially in multimodal problems. In this paper, we propose a double-group PSO (DG-PSO) algorithm to improve the performance. DG-PSO uses a double-group based evolution framework. The individuals are divided into two groups: an advantaged group and a disadvantaged group. The advantaged group works according to the original PSO, while two new strategies are developed for the disadvantaged group. The proposed algorithm is firstly evaluated by comparing it with the other five popular PSO variants and two state-of-the-art meta-heuristics on various benchmark functions. The results demonstrate that DG-PSO shows a remarkable performance in terms of accuracy and stability. Then, we apply DG-PSO to multilevel thresholding for remote sensing image segmentation. The results show that the proposed algorithm outperforms five other popular algorithms in meta-heuristic-based multilevel thresholding, which verifies the effectiveness of the proposed algorithm. PMID:29724013
Elementary Principals as Developers vs. Deliverers of District Instructional Decisions
ERIC Educational Resources Information Center
Fields, Joshua Paul
2012-01-01
The purpose of this heuristic case study, also informed through the tradition of critical systems theory, was to explore elementary principals' "voices" in instructional decisions made by central office administrators at a large suburban school district in a Midwestern State. Six elementary principals were interviewed for this study.…
Concept Inventories: Predicting the Wrong Answer May Boost Performance
ERIC Educational Resources Information Center
Talanquer, Vicente
2017-01-01
Several concept inventories have been developed to elicit students' alternative conceptions in chemistry. It is suggested that heuristic reasoning may bias students' answers in these types of assessments toward intuitively appealing choices. If this is the case, one could expect students to improve their performance by engaging in more analytical…
An Investigation of the Representativeness Heuristic: The Case of a Multiple Choice Exam
ERIC Educational Resources Information Center
Chernoff, Egan J.; Mamolo, Ami; Zazkis, Rina
2016-01-01
By focusing on a particular alteration of the comparative likelihood task, this study contributes to research on teachers' understanding of probability. Our novel task presented prospective teachers with multinomial, contextualized sequences and asked them to identify which was least likely. Results demonstrate that determinants of…
ERIC Educational Resources Information Center
Partti, Heidi; Westerlund, Heidi
2013-01-01
This qualitative instrumental case study examines collaborative composing in the "operabyyou.com" online music community from the perspective of learning by utilising the concept of a "community of practice" as a heuristic frame. The article suggests that although informal music practices offer important opportunities for…
ERIC Educational Resources Information Center
du Plessis, Andre; Webb, Paul
2012-01-01
This qualitative interpretive exploratory case study investigated a sample of South African teachers' perceptions of the requirements for successful implementation of Information and Communication Technology (ICT) Professional Teacher Development (PTD) within disadvantaged South African township schools in the Port Elizabeth district in South…
How a "Top-Performing" Asian School System Formulates and Implements Policy: The Case of Singapore
ERIC Educational Resources Information Center
Tan, Cheng Yong; Dimmock, Clive
2014-01-01
This article analyses the paradox inherent in the "top-performing" yet tightly controlled Singapore education system. As government controls have increased in complexity, existing policymaking conceptual heuristics in accounting for centre-periphery relationships appear inadequate. It argues that more direct government control is being…
Heuristic and optimal policy computations in the human brain during sequential decision-making.
Korn, Christoph W; Bach, Dominik R
2018-01-23
Optimal decisions across extended time horizons require value calculations over multiple probabilistic future states. Humans may circumvent such complex computations by resorting to easy-to-compute heuristics that approximate optimal solutions. To probe the potential interplay between heuristic and optimal computations, we develop a novel sequential decision-making task, framed as virtual foraging in which participants have to avoid virtual starvation. Rewards depend only on final outcomes over five-trial blocks, necessitating planning over five sequential decisions and probabilistic outcomes. Here, we report model comparisons demonstrating that participants primarily rely on the best available heuristic but also use the normatively optimal policy. FMRI signals in medial prefrontal cortex (MPFC) relate to heuristic and optimal policies and associated choice uncertainties. Crucially, reaction times and dorsal MPFC activity scale with discrepancies between heuristic and optimal policies. Thus, sequential decision-making in humans may emerge from integration between heuristic and optimal policies, implemented by controllers in MPFC.
Investigating the Impacts of Design Heuristics on Idea Initiation and Development
ERIC Educational Resources Information Center
Kramer, Julia; Daly, Shanna R.; Yilmaz, Seda; Seifert, Colleen M.; Gonzalez, Richard
2015-01-01
This paper presents an analysis of engineering students' use of Design Heuristics as part of a team project in an undergraduate engineering design course. Design Heuristics are an empirically derived set of cognitive "rules of thumb" for use in concept generation. We investigated heuristic use in the initial concept generation phase,…
Heuristics Made Easy: An Effort-Reduction Framework
ERIC Educational Resources Information Center
Shah, Anuj K.; Oppenheimer, Daniel M.
2008-01-01
In this article, the authors propose a new framework for understanding and studying heuristics. The authors posit that heuristics primarily serve the purpose of reducing the effort associated with a task. As such, the authors propose that heuristics can be classified according to a small set of effort-reduction principles. The authors use this…
Heuristic Diagrams as a Tool to Teach History of Science
ERIC Educational Resources Information Center
Chamizo, Jose A.
2012-01-01
The graphic organizer called here the heuristic diagram, an improvement on Gowin's Vee heuristic, is proposed as a tool to teach the history of science. Heuristic diagrams are intended to help students (or teachers, or researchers) understand their own research, considering that asking questions and problem-solving are central to scientific activity. The…
Pohl, Rüdiger F; Michalkiewicz, Martha; Erdfelder, Edgar; Hilbig, Benjamin E
2017-07-01
According to the recognition-heuristic theory, decision makers solve paired comparisons in which one object is recognized and the other is not by recognition alone, inferring that recognized objects have higher criterion values than unrecognized ones. However, the success, and thus usefulness, of this heuristic depends on the validity of recognition as a cue, and adaptive decision making, in turn, requires that decision makers are sensitive to it. To this end, decision makers could base their evaluation of the recognition validity either on the selected set of objects (the set's recognition validity), or on the underlying domain from which the objects were drawn (the domain's recognition validity). In two experiments, we manipulated the recognition validity both in the selected set of objects and between domains from which the sets were drawn. The results clearly show that use of the recognition heuristic depends on the domain's recognition validity, not on the set's recognition validity. In other words, participants treat all sets as roughly representative of the underlying domain and adjust their decision strategy adaptively (only) with respect to the more general environment rather than the specific items they are faced with.
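As a small illustration of the quantity being manipulated, the sketch below computes a recognition validity as the proportion of mixed pairs (one object recognized, one not) in which the recognized object has the larger criterion value. The toy city data are invented.

```python
# Compute the recognition validity of a set of objects.
from itertools import combinations

def recognition_validity(objects):
    """objects: list of (criterion_value, is_recognized) tuples."""
    hits = applicable = 0
    for (c1, r1), (c2, r2) in combinations(objects, 2):
        if r1 == r2:
            continue                      # heuristic not applicable to this pair
        applicable += 1
        hits += (c1 > c2) if r1 else (c2 > c1)
    return hits / applicable if applicable else float("nan")


cities = [(3_600_000, True), (1_500_000, True), (330_000, False), (580_000, False)]
print(recognition_validity(cities))   # 1.0 for this toy sample
```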
Hicks, E Preston; Kluemper, G Thomas
2011-03-01
Studies show that our brains use 2 modes of reasoning: heuristic (intuitive, automatic, implicit processing) and analytic (deliberate, rule-based, explicit processing). The use of intuition often dominates problem solving when innovative, creative thinking is required. Under conditions of uncertainty, we default to an even greater reliance on the heuristic processing. In health care settings and other such environments of increased importance, this mode becomes problematic. Since choice heuristics are quickly constructed from fragments of memory, they are often biased by prior evaluations of and preferences for the alternatives being considered. Therefore, a rigorous and systematic decision process notwithstanding, clinical judgments under uncertainty are often flawed by a number of unwitting biases. Clinical orthodontics is as vulnerable to this fundamental failing in the decision-making process as any other health care discipline. Several of the more common cognitive biases relevant to clinical orthodontics are discussed in this article. By raising awareness of these sources of cognitive errors in our clinical decision making, our intent was to equip the clinician to take corrective action to avoid them. Our secondary goal was to expose this important area of empirical research and encourage those with expertise in the cognitive sciences to explore, through further research, the possible relevance and impact of cognitive heuristics and biases on the accuracy of orthodontic judgments and decision making. Copyright © 2011 American Association of Orthodontists. Published by Mosby, Inc. All rights reserved.
Detecting false positive sequence homology: a machine learning approach.
Fujimoto, M Stanley; Suvorov, Anton; Jensen, Nicholas O; Clement, Mark J; Bybee, Seth M
2016-02-24
Accurate detection of homologous relationships of biological sequences (DNA or amino acid) amongst organisms is an important and often difficult task that is essential to various evolutionary studies, ranging from building phylogenies to predicting functional gene annotations. There are many existing heuristic tools, most commonly based on bidirectional BLAST searches, that are used to identify homologous genes and combine them into two fundamentally distinct classes: orthologs and paralogs. Because they use only heuristic filtering based on significance-score cutoffs and have no cluster post-processing tools available, these methods can often produce clusters containing unrelated (non-homologous) sequences. Therefore sequencing data extracted from incomplete genome/transcriptome assemblies originating from low-coverage sequencing, or produced by de novo processes without a reference genome, are susceptible to high false positive rates of homology detection. In this paper we develop biologically informative features that can be extracted from multiple sequence alignments of putative homologous genes (orthologs and paralogs) and further utilized in the context of guided experimentation to verify false positive outcomes. We demonstrate that our machine learning method, trained on both known homology clusters obtained from OrthoDB and randomly generated sequence alignments (non-homologs), successfully identifies apparent false positives inferred by heuristic algorithms, especially among proteomes recovered from low-coverage RNA-seq data. Approximately 42% and 25% of the putative homologies predicted by InParanoid and HaMStR, respectively, were classified as false positives on the experimental data set. Our process increases the quality of output from other clustering algorithms by providing a novel post-processing method that is both fast and efficient at removing low-quality clusters of putative homologous genes recovered by heuristic-based approaches.
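The Python sketch below illustrates the general shape of such a pipeline with scikit-learn: alignment-level features feed a classifier that flags likely false-positive homology clusters. The feature names, labels, and random data here are placeholders invented for illustration; they are not the features or training sets used in the study.

```python
# Toy classifier for flagging probable false-positive homology clusters.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Each row: [mean pairwise identity, alignment gap fraction, relative length]
X = rng.random((200, 3))
y = (X[:, 0] < 0.3).astype(int)   # invented label: low identity ~ false positive

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print(cross_val_score(clf, X, y, cv=5).mean())   # cross-validated accuracy
```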
Byrne, Patrick A; Crawford, J Douglas
2010-06-01
It is not known how egocentric visual information (location of a target relative to the self) and allocentric visual information (location of a target relative to external landmarks) are integrated to form reach plans. Based on behavioral data from rodents and humans we hypothesized that the degree of stability in visual landmarks would influence the relative weighting. Furthermore, based on numerous cue-combination studies we hypothesized that the reach system would act like a maximum-likelihood estimator (MLE), where the reliability of both cues determines their relative weighting. To predict how these factors might interact we developed an MLE model that weighs egocentric and allocentric information based on their respective reliabilities, and also on an additional stability heuristic. We tested the predictions of this model in 10 human subjects by manipulating landmark stability and reliability (via variable amplitude vibration of the landmarks and variable amplitude gaze shifts) in three reach-to-touch tasks: an egocentric control (reaching without landmarks), an allocentric control (reaching relative to landmarks), and a cue-conflict task (involving a subtle landmark "shift" during the memory interval). Variability from all three experiments was used to derive parameters for the MLE model, which was then used to simulate egocentric-allocentric weighting in the cue-conflict experiment. As predicted by the model, landmark vibration--despite its lack of influence on pointing variability (and thus allocentric reliability) in the control experiment--had a strong influence on egocentric-allocentric weighting. A reduced model without the stability heuristic was unable to reproduce this effect. These results suggest heuristics for extrinsic cue stability are at least as important as reliability for determining cue weighting in memory-guided reaching.
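A worked miniature of the weighting scheme may help: under a maximum-likelihood combination, each cue is weighted by its inverse variance, and the stability heuristic can be sketched as an extra multiplicative discount on the allocentric weight. The numbers and the exact form of the stability term below are illustrative assumptions, not the fitted model parameters.

```python
# Inverse-variance (MLE-style) combination of egocentric and allocentric cues,
# with a crude stability discount on the allocentric (landmark) cue.
def mle_combine(ego_estimate, ego_var, allo_estimate, allo_var, stability=1.0):
    """Return the combined target estimate and the two normalized cue weights."""
    w_ego = 1.0 / ego_var
    w_allo = stability * (1.0 / allo_var)    # stability in [0, 1] scales trust in landmarks
    total = w_ego + w_allo
    combined = (w_ego * ego_estimate + w_allo * allo_estimate) / total
    return combined, w_ego / total, w_allo / total


# Stable landmarks: the allocentric cue dominates; vibrating landmarks shift the weight.
print(mle_combine(10.0, 4.0, 12.0, 1.0, stability=1.0))
print(mle_combine(10.0, 4.0, 12.0, 1.0, stability=0.3))
```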
Design and usability of heuristic‐based deliberation tools for women facing amniocentesis
Durand, Marie‐Anne; Wegwarth, Odette; Boivin, Jacky; Elwyn, Glyn
2011-01-01
Background: Evidence suggests that in decision contexts characterized by uncertainty and time constraints (e.g. health-care decisions), fast and frugal decision-making strategies (heuristics) may perform better than complex rules of reasoning. Objective: To examine whether it is possible to design deliberation components in decision support interventions using simple models (fast and frugal heuristics). Design: The 'Take The Best' heuristic (i.e. selection of a 'most important reason') and the 'Tallying' integration algorithm (i.e. unitary weighing of pros and cons) were used to develop two deliberation components embedded in a Web-based decision support intervention for women facing amniocentesis testing. Ten researchers (recruited from 15), nine health-care providers (recruited from 28) and ten pregnant women (recruited from 14) who had recently been offered amniocentesis testing appraised evolving versions of 'your most important reason' (Take The Best) and 'weighing it up' (Tallying). Results: Most researchers found the tools useful in facilitating decision making although emphasized the need for simple instructions and clear layouts. Health-care providers however expressed concerns regarding the usability and clarity of the tools. By contrast, 7 out of 10 pregnant women found the tools useful in weighing up the pros and cons of each option, helpful in structuring and clarifying their thoughts and visualizing their decision efforts. Several pregnant women felt that 'weighing it up' and 'your most important reason' were not appropriate when facing such a difficult and emotional decision. Conclusion: Theoretical approaches based on fast and frugal heuristics can be used to develop deliberation tools that provide helpful support to patients facing real-world decisions about amniocentesis. PMID:21241434
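For readers unfamiliar with the two simple models named in this abstract, the sketch below shows one common way Take The Best and Tallying are formalized over binary cue values. The cue coding and the example values are assumptions made for illustration and are not taken from the decision support intervention itself.

```python
# Two fast-and-frugal decision rules over binary cues.
# Cue coding (assumed): +1 favours option A, -1 favours option B, 0 is neutral.
def take_the_best(cues_by_importance):
    """Decide on the single most important discriminating cue."""
    for cue in cues_by_importance:        # cues ordered from most to least important
        if cue != 0:
            return "A" if cue > 0 else "B"
    return "undecided"

def tallying(cues):
    """Weigh all pros and cons equally and take the sign of the sum."""
    total = sum(cues)
    return "A" if total > 0 else "B" if total < 0 else "undecided"


cues = [0, +1, -1, -1]          # the most important cue does not discriminate
print(take_the_best(cues))      # 'A'  -- decided by the second cue alone
print(tallying(cues))           # 'B'  -- pros and cons counted equally
```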
Bi-Partition of Shared Binary Decision Diagrams
2002-12-01
Such BDDs are considered a special case of partitioned BDDs [6], [12], [13] and free BDDs (FBDDs) [7], [8]. Applications of partitioned SBDDs are similar to those of partitioned BDDs and FBDDs. A partitioned SBDD is more canonical than partitioned BDDs and FBDDs. We developed a heuristic bi-partition algorithm for SBDDs and showed cases…
Dynamic Inertia Weight Binary Bat Algorithm with Neighborhood Search.
Huang, Xingwang; Zeng, Xuewen; Han, Rui
2017-01-01
Binary bat algorithm (BBA) is a binary version of the bat algorithm (BA). It has been proven that BBA is competitive compared to other binary heuristic algorithms. Since the update processes of velocity in the algorithm are consistent with BA, in some cases, this algorithm also faces the premature convergence problem. This paper proposes an improved binary bat algorithm (IBBA) to solve this problem. To evaluate the performance of IBBA, standard benchmark functions and zero-one knapsack problems have been employed. The numeric results obtained by benchmark functions experiment prove that the proposed approach greatly outperforms the original BBA and binary particle swarm optimization (BPSO). Compared with several other heuristic algorithms on zero-one knapsack problems, it also verifies that the proposed algorithm is more able to avoid local minima.
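To give a flavour of the kind of binary heuristic being compared here, the sketch below implements a generic sigmoid-based binary swarm update on a toy zero-one knapsack instance. It is not the IBBA (or BBA) of the paper; the update rule, parameter values, and instance data are simplifying assumptions.

```python
# Generic sigmoid-transfer binary swarm heuristic on a toy 0-1 knapsack.
import math
import random

values, weights, capacity = [6, 10, 12, 7], [1, 2, 3, 2], 5

def fitness(x):
    total_weight = sum(w for w, bit in zip(weights, x) if bit)
    if total_weight > capacity:
        return 0                          # infeasible solutions score zero
    return sum(v for v, bit in zip(values, x) if bit)

def binary_swarm(n_particles=20, iters=100, seed=1):
    random.seed(seed)
    dim = len(values)
    pos = [[random.randint(0, 1) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    best = max(pos, key=fitness)[:]       # copy of the best initial particle
    for _ in range(iters):
        for p in range(n_particles):
            for d in range(dim):
                # Pull the velocity toward the best-known solution, then map
                # it to a flip probability with a sigmoid transfer function.
                vel[p][d] += random.random() * (best[d] - pos[p][d])
                prob = 1.0 / (1.0 + math.exp(-vel[p][d]))
                pos[p][d] = 1 if random.random() < prob else 0
            if fitness(pos[p]) > fitness(best):
                best = pos[p][:]
    return best, fitness(best)

print(binary_swarm())
```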
NASA Technical Reports Server (NTRS)
Haftka, R. T.; Adelman, H. M.
1984-01-01
Orbiting spacecraft such as large space antennas have to maintain a highly accurate shape to operate satisfactorily. Such structures require active and passive controls to maintain an accurate shape under a variety of disturbances. Methods for the optimum placement of control actuators for correcting static deformations are described. In particular, attention is focused on the case where control locations have to be selected from a large set of available sites, so that integer programming methods are called for. The effectiveness of three heuristic techniques for obtaining a near-optimal site selection is compared. In addition, efficient reanalysis techniques for the rapid assessment of control effectiveness are presented. Two examples are used to demonstrate the methods: a simple beam structure and a 55-m space-truss parabolic antenna.
NASA Technical Reports Server (NTRS)
Reilly, Charles H.; Walton, Eric K.; Mata, Fernando; Mount-Campbell, Clark A.; Olen, Carl A.
1990-01-01
Consideration is given to the problem of allotting GEO locations to communication satellites so as to maximize the smallest aggregate carrier-to-interference (C/I) ratio calculated at any test point (assumed earth station). The location allotted to each satellite must be within the satellite's service arc, and angular separation constraints are enforced for each pair of satellites to control single-entry EMI. Solutions to this satellite system synthesis problem (SSSP) are found by embedding two heuristic procedures for the satellite location problem (SLP) in a binary search routine to find an estimate of the largest increment to the angular separation values that permits a feasible solution to SLP and SSSP. Numerical results for a 183-satellite, 208-beam example problem are presented.
Multilayer Optimization of Heterogeneous Networks Using Grammatical Genetic Programming.
Fenton, Michael; Lynch, David; Kucera, Stepan; Claussen, Holger; O'Neill, Michael
2017-09-01
Heterogeneous cellular networks are composed of macro cells (MCs) and small cells (SCs) in which all cells occupy the same bandwidth. Provision has been made under the third generation partnership project-long term evolution framework for enhanced intercell interference coordination (eICIC) between cell tiers. Expanding on previous works, this paper instruments grammatical genetic programming to evolve control heuristics for heterogeneous networks. Three aspects of the eICIC framework are addressed including setting SC powers and selection biases, MC duty cycles, and scheduling of user equipments (UEs) at SCs. The evolved heuristics yield minimum downlink rates three times higher than a baseline method, and twice that of a state-of-the-art benchmark. Furthermore, a greater number of UEs receive transmissions under the proposed scheme than in either the baseline or benchmark cases.
Improving patient care. The cognitive psychology of missed diagnoses.
Redelmeier, Donald A
2005-01-18
Cognitive psychology is the science that examines how people reason, formulate judgments, and make decisions. This case involves a patient given a diagnosis of pharyngitis, whose ultimate diagnosis of osteomyelitis was missed through a series of cognitive shortcuts. These errors include the availability heuristic (in which people judge likelihood by how easily examples spring to mind), the anchoring heuristic (in which people stick with initial impressions), framing effects (in which people make different decisions depending on how information is presented), blind obedience (in which people stop thinking when confronted with authority), and premature closure (in which several alternatives are not pursued). Rather than trying to completely eliminate cognitive shortcuts (which often serve clinicians well), becoming aware of common errors might lead to sustained improvement in patient care.
Open shop scheduling problem to minimize total weighted completion time
NASA Astrophysics Data System (ADS)
Bai, Danyu; Zhang, Zhihai; Zhang, Qiang; Tang, Mengqian
2017-01-01
A given number of jobs in an open shop scheduling environment must each be processed for given amounts of time on each of a given set of machines in an arbitrary sequence. This study aims to achieve a schedule that minimizes total weighted completion time. Owing to the strong NP-hardness of the problem, the weighted shortest processing time block (WSPTB) heuristic is presented to obtain approximate solutions for large-scale problems. Performance analysis proves the asymptotic optimality of the WSPTB heuristic in the sense of probability limits. The largest weight block rule is provided to seek optimal schedules in polynomial time for a special case. A hybrid discrete differential evolution algorithm is designed to obtain high-quality solutions for moderate-scale problems. Simulation experiments demonstrate the effectiveness of the proposed algorithms.
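As a minimal illustration of the ordering principle behind the WSPTB heuristic, the sketch below applies the basic weighted-shortest-processing-time rule, sequencing jobs by processing time divided by weight. The job data are invented, and the full block construction for open shops is not reproduced here.

```python
# Weighted-shortest-processing-time (WSPT) ordering of jobs.
def wspt_order(jobs):
    """jobs: list of (job_id, processing_time, weight) tuples."""
    return sorted(jobs, key=lambda j: j[1] / j[2])   # non-decreasing p/w ratio


jobs = [("J1", 4, 1), ("J2", 2, 3), ("J3", 5, 5)]
for job_id, p, w in wspt_order(jobs):
    print(job_id, round(p / w, 2))    # J2 (0.67), J3 (1.0), J1 (4.0)
```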
Ant Colony Optimization for Markowitz Mean-Variance Portfolio Model
NASA Astrophysics Data System (ADS)
Deng, Guang-Feng; Lin, Woo-Tsong
This work presents Ant Colony Optimization (ACO), which was initially developed as a meta-heuristic for combinatorial optimization, for solving the cardinality-constrained Markowitz mean-variance portfolio model (a nonlinear mixed quadratic programming problem). To our knowledge, an efficient algorithmic solution for this problem has not been proposed until now. Using heuristic algorithms in this case is imperative. Numerical solutions are obtained for five analyses of weekly price data for the following indices for the period March 1992 to September 1997: Hang Seng 31 in Hong Kong, DAX 100 in Germany, FTSE 100 in UK, S&P 100 in USA and Nikkei 225 in Japan. The test results indicate that the ACO is much more robust and effective than particle swarm optimization (PSO), especially for low-risk investment portfolios.
Fast or Frugal, but Not Both: Decision Heuristics under Time Pressure
ERIC Educational Resources Information Center
Bobadilla-Suarez, Sebastian; Love, Bradley C.
2018-01-01
Heuristics are simple, yet effective, strategies that people use to make decisions. Because heuristics do not require all available information, they are thought to be easy to implement and to not tax limited cognitive resources, which has led heuristics to be characterized as fast-and-frugal. We question this monolithic conception of heuristics…
Fluency Heuristic: A Model of How the Mind Exploits a By-Product of Information Retrieval
ERIC Educational Resources Information Center
Hertwig, Ralph; Herzog, Stefan M.; Schooler, Lael J.; Reimer, Torsten
2008-01-01
Boundedly rational heuristics for inference can be surprisingly accurate and frugal for several reasons. They can exploit environmental structures, co-opt complex capacities, and elude effortful search by exploiting information that automatically arrives on the mental stage. The fluency heuristic is a prime example of a heuristic that makes the…
Gigerenzer, Gerd
2008-01-01
The adaptive toolbox is a Darwinian-inspired theory that conceives of the mind as a modular system that is composed of heuristics, their building blocks, and evolved capacities. The study of the adaptive toolbox is descriptive and analyzes the selection and structure of heuristics in social and physical environments. The study of ecological rationality is prescriptive and identifies the structure of environments in which specific heuristics either succeed or fail. Results have been used for designing heuristics and environments to improve professional decision making in the real world. © 2008 Association for Psychological Science.
Archer, Charles J [Rochester, MN; Blocksome, Michael A [Rochester, MN; Heidelberger, Philip [Cortlandt Manor, NY; Kumar, Sameer [White Plains, NY; Parker, Jeffrey J [Rochester, MN; Ratterman, Joseph D [Rochester, MN
2011-06-07
Methods, compute nodes, and computer program products are provided for heuristic status polling of a component in a computing system. Embodiments include receiving, by a polling module from a requesting application, a status request requesting status of a component; determining, by the polling module, whether an activity history for the component satisfies heuristic polling criteria; polling, by the polling module, the component for status if the activity history for the component satisfies the heuristic polling criteria; and not polling, by the polling module, the component for status if the activity history for the component does not satisfy the heuristic criteria.
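A rough sketch of the polling pattern this abstract describes is shown below: a component is actually polled only when its recorded activity history satisfies a heuristic criterion. The specific criterion (recent activity within a time window) and all names are assumptions made for illustration, not the patented implementation.

```python
# Heuristic status polling: poll a component only if its activity history
# satisfies a simple (assumed) recency criterion.
import time

class HeuristicPoller:
    def __init__(self, window_seconds=60.0):
        self.window = window_seconds
        self.activity = {}          # component -> timestamp of last recorded activity

    def record_activity(self, component):
        self.activity[component] = time.time()

    def status(self, component, poll_fn):
        last = self.activity.get(component)
        if last is not None and time.time() - last <= self.window:
            return poll_fn(component)        # history satisfies criterion: poll
        return None                          # otherwise skip the poll


poller = HeuristicPoller(window_seconds=5.0)
poller.record_activity("node-7")
print(poller.status("node-7", lambda c: f"{c}: OK"))   # polled
print(poller.status("node-9", lambda c: f"{c}: OK"))   # skipped, no recent activity
```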
Intelligent process mapping through systematic improvement of heuristics
NASA Technical Reports Server (NTRS)
Ieumwananonthachai, Arthur; Aizawa, Akiko N.; Schwartz, Steven R.; Wah, Benjamin W.; Yan, Jerry C.
1992-01-01
The present system for the automatic learning/evaluation of novel heuristic methods applicable to the mapping of communication-process sets on a computer network is based on testing a population of competing heuristic methods within a fixed time constraint. The TEACHER 4.1 prototype learning system, implemented for learning new post-game analysis heuristic methods, iteratively generates and refines the mappings of a set of communicating processes on a computer network. A systematic exploration of the space of possible heuristic methods is shown to promise significant improvement.
Heuristics: foundations for a novel approach to medical decision making.
Bodemer, Nicolai; Hanoch, Yaniv; Katsikopoulos, Konstantinos V
2015-03-01
Medical decision-making is a complex process that often takes place during uncertainty, that is, when knowledge, time, and resources are limited. How can we ensure good decisions? We present research on heuristics-simple rules of thumb-and discuss how medical decision-making can benefit from these tools. We challenge the common view that heuristics are only second-best solutions by showing that they can be more accurate, faster, and easier to apply in comparison to more complex strategies. Using the example of fast-and-frugal decision trees, we illustrate how heuristics can be studied and implemented in the medical context. Finally, we suggest how a heuristic-friendly culture supports the study and application of heuristics as complementary strategies to existing decision rules.
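As an illustration of the fast-and-frugal decision trees mentioned above, the sketch below encodes a tree in which each cue is checked once and offers an immediate exit. The cues, thresholds, and dispositions are invented for demonstration and have no clinical validity.

```python
# A toy fast-and-frugal decision tree: one cue per level, each with an exit.
def fast_frugal_tree(patient):
    if patient["st_elevation"]:
        return "admit to coronary care"      # first cue exits immediately
    if patient["chest_pain_primary"]:
        return "admit for observation"
    if patient["age"] >= 65:
        return "admit for observation"
    return "regular ward / discharge work-up"


print(fast_frugal_tree({"st_elevation": False, "chest_pain_primary": True, "age": 58}))
```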
A novel heuristic algorithm for capacitated vehicle routing problem
NASA Astrophysics Data System (ADS)
Kır, Sena; Yazgan, Harun Reşit; Tüncel, Emre
2017-09-01
The vehicle routing problem with capacity constraints was considered in this paper. It is quite difficult to achieve an optimal solution with traditional optimization methods because of the high computational complexity of large-scale problems. Consequently, new heuristic and metaheuristic approaches have been developed to solve this problem. In this paper, we constructed a new heuristic algorithm based on tabu search and adaptive large neighborhood search (ALNS), with several specifically designed operators and features, to solve the capacitated vehicle routing problem (CVRP). The effectiveness of the proposed algorithm was illustrated on benchmark problems. The algorithm provides better performance on large-scale instances and gains an advantage in terms of CPU time. In addition, we solved a real-life CVRP using the proposed algorithm and found encouraging results in comparison with the company's current practice.
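For orientation, the sketch below shows a simple nearest-neighbour construction heuristic of the kind metaheuristics such as tabu search and ALNS often use to build an initial CVRP solution. It is not the algorithm proposed in the paper, and the instance data are invented.

```python
# Nearest-neighbour construction of capacity-feasible CVRP routes.
# Assumes every individual demand fits within the vehicle capacity.
import math

def build_routes(depot, customers, demand, capacity):
    """customers: dict id -> (x, y); demand: dict id -> demand."""
    unvisited, routes = set(customers), []
    while unvisited:
        route, load, here = [], 0, depot
        while True:
            feasible = [c for c in unvisited if load + demand[c] <= capacity]
            if not feasible:
                break                                  # vehicle full: start a new route
            nxt = min(feasible, key=lambda c: math.dist(here, customers[c]))
            route.append(nxt)
            load += demand[nxt]
            here = customers[nxt]
            unvisited.discard(nxt)
        routes.append(route)
    return routes


customers = {1: (2, 3), 2: (5, 1), 3: (6, 4), 4: (1, 6)}
demand = {1: 4, 2: 3, 3: 5, 4: 2}
print(build_routes((0, 0), customers, demand, capacity=8))   # [[1, 4], [2, 3]]
```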
Heuristics and Cognitive Error in Medical Imaging.
Itri, Jason N; Patel, Sohil H
2018-05-01
The field of cognitive science has provided important insights into mental processes underlying the interpretation of imaging examinations. Despite these insights, diagnostic error remains a major obstacle in the goal to improve quality in radiology. In this article, we describe several types of cognitive bias that lead to diagnostic errors in imaging and discuss approaches to mitigate cognitive biases and diagnostic error. Radiologists rely on heuristic principles to reduce complex tasks of assessing probabilities and predicting values into simpler judgmental operations. These mental shortcuts allow rapid problem solving based on assumptions and past experiences. Heuristics used in the interpretation of imaging studies are generally helpful but can sometimes result in cognitive biases that lead to significant errors. An understanding of the causes of cognitive biases can lead to the development of educational content and systematic improvements that mitigate errors and improve the quality of care provided by radiologists.
Heuristics for the Hodgkin-Huxley system.
Hoppensteadt, Frank
2013-09-01
Hodgkin and Huxley (HH) discovered that voltages control ionic currents in nerve membranes. This led them to describe electrical activity in a neuronal membrane patch in terms of an electronic circuit whose characteristics were determined using empirical data. Due to the complexity of this model, a variety of heuristics, including relaxation oscillator circuits and integrate-and-fire models, have been used to investigate activity in neurons, and these simpler models have been successful in suggesting experiments and explaining observations. Connections between most of the simpler models had not been made clear until recently. Shown here are connections between these heuristics and the full HH model. In particular, we study a new model (Type III circuit): It includes the van der Pol-based models; it can be approximated by a simple integrate-and-fire model; and it creates voltages and currents that correspond, respectively, to the h and V components of the HH system. Copyright © 2012 Elsevier Inc. All rights reserved.
A human reliability based usability evaluation method for safety-critical software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boring, R. L.; Tran, T. Q.; Gertman, D. I.
2006-07-01
Boring and Gertman (2005) introduced a novel method that augments heuristic usability evaluation methods with that of the human reliability analysis method of SPAR-H. By assigning probabilistic modifiers to individual heuristics, it is possible to arrive at the usability error probability (UEP). Although this UEP is not a literal probability of error, it nonetheless provides a quantitative basis to heuristic evaluation. This method allows one to seamlessly prioritize and identify usability issues (i.e., a higher UEP requires more immediate fixes). However, the original version of this method required the usability evaluator to assign priority weights to the final UEP, thus allowing the priority of a usability issue to differ among usability evaluators. The purpose of this paper is to explore an alternative approach to standardize the priority weighting of the UEP in an effort to improve the method's reliability.
Meyer, Friederike; Meyer, Thomas D
2009-01-01
Chart records suggest that bipolar disorder is often misdiagnosed as a psychotic disorder, but no study has systematically looked into the reasons. One reason for misdiagnoses could be that clinicians use heuristics like the prototype approach in routine practice instead of strictly adhering to the diagnostic criteria. Using an experimental approach, we investigated whether the use of heuristics can explain when a diagnosis of psychotic disorder is given instead of bipolar disorder. We systematically varied information about the presence or absence of specific symptoms, i.e., hallucinations and decreased need for sleep, during a manic episode. Experimentally varied case vignettes were randomly sent to psychiatrists in Southern Germany. The four versions of the case vignette all described the same person in a manic state and differed only in two aspects: the presence or absence of auditory hallucinations and of decreased need for sleep. The psychiatrists were asked to make a diagnosis, to rate their confidence in their diagnosis, and to recommend treatments. Almost half of the 142 psychiatrists (45%) did not diagnose bipolar disorder. Mentioning hallucinations decreased the likelihood of diagnosing bipolar disorder. The information about decreased need for sleep affected the diagnosis significantly only if schizoaffective disorder was counted as a bipolar disorder. Our results suggest that clinicians indeed use heuristics when making diagnostic decisions instead of strictly adhering to diagnostic criteria. More research is needed to better understand diagnostic decision making, especially in real-life settings, and this might also be of interest when revising diagnostic manuals such as DSM.
Generalized Buneman Pruning for Inferring the Most Parsimonious Multi-state Phylogeny
NASA Astrophysics Data System (ADS)
Misra, Navodit; Blelloch, Guy; Ravi, R.; Schwartz, Russell
Accurate reconstruction of phylogenies remains a key challenge in evolutionary biology. Most biologically plausible formulations of the problem are formally NP-hard, with no known efficient solution. The standard in practice is fast heuristic methods that are empirically known to work very well in general, but can yield results arbitrarily far from optimal. Practical exact methods, which yield exponential worst-case running times but generally much better times in practice, provide an important alternative. We report progress in this direction by introducing a provably optimal method for the weighted multi-state maximum parsimony phylogeny problem. The method is based on generalizing the notion of the Buneman graph, a construction key to efficient exact methods for binary sequences, so that it applies to sequences with arbitrary finite numbers of states and arbitrary state transition weights. We implement an integer linear programming (ILP) method for the multi-state problem using this generalized Buneman graph and demonstrate that the resulting method is able to solve data sets that are intractable by prior exact methods in run times comparable with popular heuristics. Our work provides the first method for provably optimal maximum parsimony phylogeny inference that is practical for multi-state data sets of more than a few characters.
NASA Astrophysics Data System (ADS)
Sardinha-Lourenço, A.; Andrade-Campos, A.; Antunes, A.; Oliveira, M. S.
2018-03-01
Recent research on short-term water demand forecasting has shown that models using univariate time series based on historical data are useful and can be combined with other prediction methods to reduce errors. Water demand in drinking water distribution networks is largely repetitive in nature and, under similar meteorological conditions and consumer profiles, allows the development of a heuristic forecast model that, combined with other autoregressive models, can provide reliable forecasts. In this study, a parallel adaptive weighting strategy for forecasting water consumption over the next 24-48 h, using univariate time series of potable water consumption, is proposed. Two Portuguese potable water distribution networks are used as case studies, where the only input data are water consumption and the national calendar. For the development of the strategy, the Autoregressive Integrated Moving Average (ARIMA) method and a short-term forecast heuristic algorithm are used. Simulations with the model showed that, when the parallel adaptive weighting strategy is used, the prediction error can be reduced by 15.96% and the average error by 9.20%. This reduction is important in the control and management of water supply systems. The proposed methodology can be extended to other forecast methods, especially when multiple forecast models are available.
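The general idea of adaptively weighting an autoregressive forecast against a heuristic, pattern-based forecast can be sketched as follows; the inverse-recent-error weighting rule and the function name are illustrative assumptions, not the exact scheme used in the study.

```python
def combined_forecast(arima_preds, heuristic_preds, actuals, window=7, eps=1e-9):
    """Blend two forecast series with weights adapted to recent performance.

    arima_preds, heuristic_preds, actuals: equal-length lists of observations;
    at time t each model's weight is proportional to the inverse of its mean
    absolute error over the previous `window` points.
    """
    blended = []
    for t in range(len(actuals)):
        lo = max(0, t - window)
        if t == 0:
            w_a = w_h = 0.5                       # no history yet: equal weights
        else:
            err_a = sum(abs(arima_preds[i] - actuals[i]) for i in range(lo, t)) / (t - lo)
            err_h = sum(abs(heuristic_preds[i] - actuals[i]) for i in range(lo, t)) / (t - lo)
            w_a, w_h = 1.0 / (err_a + eps), 1.0 / (err_h + eps)
            s = w_a + w_h
            w_a, w_h = w_a / s, w_h / s
        blended.append(w_a * arima_preds[t] + w_h * heuristic_preds[t])
    return blended
```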
How the twain can meet: Prospect theory and models of heuristics in risky choice.
Pachur, Thorsten; Suter, Renata S; Hertwig, Ralph
2017-03-01
Two influential approaches to modeling choice between risky options are algebraic models (which focus on predicting the overt decisions) and models of heuristics (which are also concerned with capturing the underlying cognitive process). Because they rest on fundamentally different assumptions and algorithms, the two approaches are usually treated as antithetical, or even incommensurable. Drawing on cumulative prospect theory (CPT; Tversky & Kahneman, 1992) as the currently most influential instance of a descriptive algebraic model, we demonstrate how the two modeling traditions can be linked. CPT's algebraic functions characterize choices in terms of psychophysical (diminishing sensitivity to probabilities and outcomes) as well as psychological (risk aversion and loss aversion) constructs. Models of heuristics characterize choices as rooted in simple information-processing principles such as lexicographic and limited search. In computer simulations, we estimated CPT's parameters for choices produced by various heuristics. The resulting CPT parameter profiles portray each of the choice-generating heuristics in psychologically meaningful ways-capturing, for instance, differences in how the heuristics process probability information. Furthermore, CPT parameters can reflect a key property of many heuristics, lexicographic search, and track the environment-dependent behavior of heuristics. Finally, we show, both in an empirical and a model recovery study, how CPT parameter profiles can be used to detect the operation of heuristics. We also address the limits of CPT's ability to capture choices produced by heuristics. Our results highlight an untapped potential of CPT as a measurement tool to characterize the information processing underlying risky choice. Copyright © 2017 Elsevier Inc. All rights reserved.
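For reference, the CPT value function and probability-weighting function whose parameters are estimated in studies of this kind are usually written, following Tversky and Kahneman (1992) in their common one-parameter forms, as

```latex
v(x) =
\begin{cases}
x^{\alpha}, & x \ge 0,\\[2pt]
-\lambda\,(-x)^{\beta}, & x < 0,
\end{cases}
\qquad
w(p) = \frac{p^{\gamma}}{\bigl(p^{\gamma} + (1-p)^{\gamma}\bigr)^{1/\gamma}},
```

with alpha and beta capturing diminishing sensitivity to gains and losses, lambda loss aversion, and gamma the curvature of probability weighting; these are the parameter profiles referred to in the abstract.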
Pain as a fact and heuristic: how pain neuroimaging illuminates moral dimensions of law.
Pustilnik, Amanda C
2012-05-01
In legal domains ranging from tort to torture, pain and its degree do important definitional work by delimiting boundaries of lawfulness and of entitlements. Yet, for all the work done by pain as a term in legal texts and practice, it has a confounding lack of external verifiability. Now, neuroimaging is rendering pain and myriad other subjective states at least partly ascertainable. This emerging ability to ascertain and quantify subjective states is prompting a "hedonic" or a "subjectivist" turn in legal scholarship, which has sparked a vigorous debate as to whether the quantification of subjective states might affect legal theory and practice. Subjectivists contend that much values-talk in law has been a necessary but poor substitute for quantitative determinations of subjective states--determinations that will be possible in the law's "experiential future." This Article argues the converse: that pain discourse in law frequently is a heuristic for values. Drawing on interviews and laboratory visits with neuroimaging researchers, this Article shows current and in-principle limitations of pain quantification through neuroimaging. It then presents case studies on torture-murder, torture, the death penalty, and abortion to show the largely heuristic role of pain discourse in law. Introducing the theory of "embodied morality," the Article describes how moral conceptions of rights and duties are informed by human physicality and constrained by the limits of empathic identification. Pain neuroimaging helps reveal this dual factual and heuristic nature of pain in the law, and thus itself points to the translational work required for neuroimaging to influence, much less transform, legal practice and doctrine.
NASA Astrophysics Data System (ADS)
Gamshadzaei, Mohammad Hossein; Rahimzadegan, Majid
2017-10-01
Identification of water extents in Landsat images is challenging due to surfaces with reflectance similar to that of water. The objective of this study is to provide stable and accurate methods for identifying water extents in Landsat images based on meta-heuristic algorithms. Seven Landsat images were selected from various environmental regions of Iran. Training of the algorithms was performed using 40 water pixels and 40 nonwater pixels in Operational Land Imager images of Chitgar Lake (one of the study regions). Moreover, high-resolution images from Google Earth were digitized to evaluate the results. Two approaches were considered: index-based and artificial intelligence (AI) algorithms. In the first approach, nine common water spectral indices were investigated. AI algorithms were utilized to acquire coefficients of optimal band combinations to extract water extents. Among the AI algorithms, the artificial neural network algorithm and the ant colony optimization, genetic algorithm, and particle swarm optimization (PSO) meta-heuristic algorithms were implemented. Index-based methods showed different performances in different regions. Among AI methods, PSO had the best performance, with an average overall accuracy and kappa coefficient of 93% and 98%, respectively. The results indicated the applicability of the acquired band combinations to extract water extents accurately and stably in Landsat imagery.
The recognition heuristic: a review of theory and tests.
Pachur, Thorsten; Todd, Peter M; Gigerenzer, Gerd; Schooler, Lael J; Goldstein, Daniel G
2011-01-01
The recognition heuristic is a prime example of how, by exploiting a match between mind and environment, a simple mental strategy can lead to efficient decision making. The proposal of the heuristic initiated a debate about the processes underlying the use of recognition in decision making. We review research addressing four key aspects of the recognition heuristic: (a) that recognition is often an ecologically valid cue; (b) that people often follow recognition when making inferences; (c) that recognition supersedes further cue knowledge; (d) that its use can produce the less-is-more effect - the phenomenon that lesser states of recognition knowledge can lead to more accurate inferences than more complete states. After we contrast the recognition heuristic to other related concepts, including availability and fluency, we carve out, from the existing findings, some boundary conditions of the use of the recognition heuristic as well as key questions for future research. Moreover, we summarize developments concerning the connection of the recognition heuristic with memory models. We suggest that the recognition heuristic is used adaptively and that, compared to other cues, recognition seems to have a special status in decision making. Finally, we discuss how systematic ignorance is exploited in other cognitive mechanisms (e.g., estimation and preference).
Cognitive load during route selection increases reliance on spatial heuristics.
Brunyé, Tad T; Martis, Shaina B; Taylor, Holly A
2018-05-01
Planning routes from maps involves perceiving the symbolic environment, identifying alternate routes and applying explicit strategies and implicit heuristics to select an option. Two implicit heuristics have received considerable attention, the southern route preference and initial segment strategy. This study tested a prediction from decision-making theory that increasing cognitive load during route planning will increase reliance on these heuristics. In two experiments, participants planned routes while under conditions of minimal (0-back) or high (2-back) working memory load. In Experiment 1, we examined how memory load impacts the southern route heuristic. In Experiment 2, we examined how memory load impacts the initial segment heuristic. Results replicated earlier results demonstrating a southern route preference (Experiment 1) and initial segment strategy (Experiment 2) and further demonstrated that evidence for heuristic reliance is more likely under conditions of concurrent working memory load. Furthermore, the extent to which participants maintained efficient route selection latencies in the 2-back condition predicted the magnitude of this effect. Together, results demonstrate that working memory load increases the application of heuristics during spatial decision making, particularly when participants attempt to maintain quick decisions while managing concurrent task demands.
Improved finite difference schemes for transonic potential calculations
NASA Technical Reports Server (NTRS)
Hafez, M.; Osher, S.; Whitlow, W., Jr.
1984-01-01
Engquist and Osher (1980) have introduced a finite difference scheme for solving the transonic small disturbance equation, taking into account cases in which only compression shocks are admitted. Osher et al. (1983) studied a class of schemes for the full potential equation. It is proved that these schemes satisfy a new discrete 'entropy inequality' which rules out expansion shocks. However, the conducted analysis is restricted to steady two-dimensional flows. The present investigation is concerned with the adoption of a heuristic approach. The full potential equation in conservation form is solved with the aid of a modified artificial density method, based on flux biasing. It is shown that, with the current scheme, expansion shocks are not possible.
Dahmen, Tim; Kohr, Holger; de Jonge, Niels; Slusallek, Philipp
2015-06-01
Combined tilt- and focal series scanning transmission electron microscopy is a recently developed method to obtain nanoscale three-dimensional (3D) information of thin specimens. In this study, we formulate the forward projection in this acquisition scheme as a linear operator and prove that it is a generalization of the Ray transform for parallel illumination. We analytically derive the corresponding backprojection operator as the adjoint of the forward projection. We further demonstrate that the matched backprojection operator drastically improves the convergence rate of iterative 3D reconstruction compared to the case where a backprojection based on heuristic weighting is used. In addition, we show that the 3D reconstruction is of better quality.
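In operator terms, the distinction drawn here is between a backprojection chosen by heuristic weighting and the matched backprojection, i.e., the adjoint A* of the forward projection A, defined by

```latex
\langle A f,\, g \rangle \;=\; \langle f,\, A^{*} g \rangle
\quad \text{for all images } f \text{ and projection data } g,
\qquad
f \;\leftarrow\; f + \lambda\, A^{*}\!\bigl(g - A f\bigr),
```

where the update shown on the right is a generic Landweber/SIRT-type iteration given only for illustration; with the matched A* it follows a consistent descent direction, whereas a heuristically weighted backprojection generally does not satisfy the adjoint identity.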
Heuristic Diagrams as a Tool to Teach History of Science
NASA Astrophysics Data System (ADS)
Chamizo, José A.
2012-05-01
The graphic organizer called here a heuristic diagram, an improvement on Gowin's Vee heuristic, is proposed as a tool to teach history of science. Heuristic diagrams are intended to help students (or teachers, or researchers) understand their own research, considering that asking questions and problem solving are central to scientific activity. The left side, originally related in Gowin's Vee to philosophies, theories, models, laws, or regularities, now corresponds to Toulmin's concepts (language, models as representation techniques, and application procedures). Mexican science teachers without experience in science education research used the heuristic diagram to learn about the history of chemistry, also considering on the left side two different historical times: past and present. Teachers' attitudes toward the heuristic diagram were evaluated through a semantic differential scale, and its usefulness was demonstrated.
The E-health Literacy Demands of Australia's My Health Record: A Heuristic Evaluation of Usability
Walsh, Louisa; Hemsley, Bronwyn; Allan, Meredith; Adams, Natalie; Balandin, Susan; Georgiou, Andrew; Higgins, Isabel; McCarthy, Shaun; Hill, Sophie
2017-01-01
Background My Health Record is Australia's electronic personal health record system, which was introduced in July 2012. As of August 2017, approximately 21 percent of Australia's total population was registered to use My Health Record. Internationally, usability issues have been shown to negatively influence the uptake and use of electronic health record systems, and this scenario may particularly affect people who have low e-health literacy. It is likely that usability issues are negatively affecting the uptake and use of My Health Record in Australia. Objective To identify potential e-health literacy–related usability issues within My Health Record through a heuristic evaluation method. Methods Between September 14 and October 12, 2016, three of the authors conducted a heuristic evaluation of the two consumer-facing components of My Health Record—the information website and the electronic health record itself. These two components were evaluated against two sets of heuristics—the Health Literacy Online checklist and the Monkman Heuristics. The Health Literacy Online checklist and Monkman Heuristics are evidence-based checklists of web design elements with a focus on design for audiences with low health literacy. During this heuristic evaluation, the investigators individually navigated through the consumer-facing components of My Health Record, recording instances where the My Health Record did not conform to the checklist criteria. After the individual evaluations were completed, the investigators conferred and aggregated their results. From this process, a list of usability violations was constructed. Results When evaluated against the Health Literacy Online Checklist, the information website demonstrated violations in 12 of 35 criteria, and the electronic health record demonstrated violations in 16 of 35 criteria. When evaluated against the Monkman Heuristics, the information website demonstrated violations in 7 of 11 criteria, and the electronic health record demonstrated violations in 9 of 11 criteria. The identified violations included usability issues with the reading levels used within My Health Record, the graphic design elements, the layout of web pages, and a lack of images and audiovisual tools to support learning. Other important usability issues included a lack of translated resources, difficulty using accessibility tools, and complexity of the registration processes. Conclusion My Health Record is an important piece of technology that has the potential to facilitate better communication between consumers and their health providers. However, this heuristic evaluation demonstrated that many usability-related elements of My Health Record cater poorly to users at risk of having low e-health literacy. Usability issues have been identified as an important barrier to use of personal health records internationally, and the findings of this heuristic evaluation demonstrate that usability issues may be substantial barriers to the uptake and use of My Health Record. PMID:29118683
Yuan, Michael Juntao; Finley, George Mike; Mills, Christy; Johnson, Ron Kim
2013-01-01
Background Clinical decision support systems (CDSS) are important tools to improve health care outcomes and reduce preventable medical adverse events. However, the effectiveness and success of CDSS depend on their implementation context and usability in complex health care settings. As a result, usability design and validation, especially in real world clinical settings, are crucial aspects of successful CDSS implementations. Objective Our objective was to develop a novel CDSS to help frontline nurses better manage critical symptom changes in hospitalized patients, hence reducing preventable failure to rescue cases. A robust user interface and implementation strategy that fit into existing workflows was key for the success of the CDSS. Methods Guided by a formal usability evaluation framework, UFuRT (user, function, representation, and task analysis), we developed a high-level specification of the product that captures key usability requirements and is flexible to implement. We interviewed users of the proposed CDSS to identify requirements, listed functions, and operations the system must perform. We then designed visual and workflow representations of the product to perform the operations. The user interface and workflow design were evaluated via heuristic and end user performance evaluation. The heuristic evaluation was done after the first prototype, and its results were incorporated into the product before the end user evaluation was conducted. First, we recruited 4 evaluators with strong domain expertise to study the initial prototype. Heuristic violations were coded and rated for severity. Second, after development of the system, we assembled a panel of nurses, consisting of 3 licensed vocational nurses and 7 registered nurses, to evaluate the user interface and workflow via simulated use cases. We recorded whether each session was successfully completed and its completion time. Each nurse was asked to use the National Aeronautics and Space Administration (NASA) Task Load Index to self-evaluate the amount of cognitive and physical burden associated with using the device. Results A total of 83 heuristic violations were identified in the studies. The distribution of the heuristic violations and their average severity are reported. The nurse evaluators successfully completed all 30 sessions of the performance evaluations. All nurses were able to use the device after a single training session. On average, the nurses took 111 seconds (SD 30 seconds) to complete the simulated task. The NASA Task Load Index results indicated that the work overhead on the nurses was low. In fact, most of the burden measures were consistent with zero. The only potentially significant burden was temporal demand, which was consistent with the primary use case of the tool. Conclusions The evaluation has shown that our design was functional and met the requirements demanded by the nurses’ tight schedules and heavy workloads. The user interface embedded in the tool provided compelling utility to the nurse with minimal distraction. PMID:23612350
Effects of an Uncertain Literature on All Facets of Clinical Decision Making
ERIC Educational Resources Information Center
Sammons, Morgan T.; Newman, Russ
2010-01-01
Greenberg (2010) is correct in his assertion that the investigational heuristic used to measure the efficacy of antidepressants is flawed. Robust placebo effects are endemic in the psychiatric literature, particularly in studies of antidepressants, and estimates of placebo responding have increased over time (Rief et al., 2009). In the case of…
ERIC Educational Resources Information Center
Lee, Kathryn S.; Smith, Shaunna; Bos, Beth
2014-01-01
This article reports a heuristic case study that explored how components of Technological Pedagogical Knowledge (TPK) manifested in the artifacts of post-Baccalaureate pre-service teachers. Self-reported perceptions of their technology integration competencies were high. End-of-semester presentations reflected three distinct views of technology…
Of Mental Models, Assumptions and Heuristics: The Case of Acids and Acid Strength
ERIC Educational Resources Information Center
McClary, LaKeisha Michelle
2010-01-01
This study explored what cognitive resources (i.e., units of knowledge necessary to learn) first-semester organic chemistry students used to make decisions about acid strength and how those resources guided the prediction, explanation and justification of trends in acid strength. We were specifically interested in the identifying and…
Impact of Virtual Work Environment on Traditional Team Domains.
ERIC Educational Resources Information Center
Geroy, Gary D.; Olson, Joel; Hartman, Jackie
2002-01-01
Examines a virtual work team to determine the domains of the team and the effect the virtual work environment had on the domains. Discusses results of a literature review and a phenomenological heuristic case study, including the effects of post-modern philosophy and postindustrial society on changes in the marketplace. (Contains 79 references.)…
Ginzburg-Landau equation as a heuristic model for generating rogue waves
NASA Astrophysics Data System (ADS)
Lechuga, Antonio
2016-04-01
Envelope equations have many applications in the study of physical systems. Particularly interesting is the case of surface water waves. In steady conditions, laboratory experiments are carried out for multiple purposes, either for research or for practical problems. In both cases envelope equations are useful for understanding qualitative and quantitative results. The Ginzburg-Landau equation provides an excellent model for systems of that kind with remarkable patterns. Taking the above into account, the main aim of our work is to generate waves in a water tank with an almost symmetric spectrum, following Akhmediev (2011), and thus to produce a succession of rogue waves. The envelope of these waves yields patterns whose model is a type of Ginzburg-Landau equation (Danilov et al., 1988). From a heuristic point of view, the link between the experiment and the model is achieved. The next step consists of changing the generating parameters of the water tank and also the coefficients of the Ginzburg-Landau equation (Lechuga, 2013) in order to reach a sufficiently good approximation.
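For orientation, the complex Ginzburg-Landau equation referred to can be written, in one common normalization (the specific variant and coefficient values used in the cited works may differ), as

```latex
\frac{\partial A}{\partial t}
  \;=\; A \;+\; (1 + i\alpha)\,\frac{\partial^{2} A}{\partial x^{2}}
        \;-\; (1 + i\beta)\,\lvert A\rvert^{2} A ,
```

where A(x, t) is the complex wave envelope and alpha, beta are real coefficients controlling dispersion and nonlinearity.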
The function of credibility in information processing for risk perception.
Trumbo, Craig W; McComas, Katherine A
2003-04-01
This study examines how credibility affects the way people process information and how they subsequently perceive risk. Three conceptual areas are brought together in this analysis: the psychometric model of risk perception, Eagly and Chaiken's heuristic-systematic information processing model, and Meyer's credibility index. Data come from a study of risk communication in the context of state health department investigations of suspected cancer clusters (five cases, N = 696). Credibility is assessed for three information sources: state health departments, citizen groups, and industries involved in each case. Higher credibility for industry and the state directly predicts lower risk perception, whereas high credibility for citizen groups predicts greater risk perception. A path model shows that perceiving high credibility for industry and the state, and perceiving low credibility for citizen groups, promotes heuristic processing, which in turn is a strong predictor of lower risk perception. Alternately, perceiving industry and the state to have low credibility also promotes greater systematic processing, which consistently leads to perception of greater risk. Between one-fifth and one-third of the effect of credibility on risk perception is shown to be indirectly transmitted through information processing.
Risk-Based Sampling: I Don't Want to Weight in Vain.
Powell, Mark R
2015-12-01
Recently, there has been considerable interest in developing risk-based sampling for food safety and animal and plant health for efficient allocation of inspection and surveillance resources. The problem of risk-based sampling allocation presents a challenge similar to financial portfolio analysis. Markowitz (1952) laid the foundation for modern portfolio theory based on mean-variance optimization. However, a persistent challenge in implementing portfolio optimization is the problem of estimation error, leading to false "optimal" portfolios and unstable asset weights. In some cases, portfolio diversification based on simple heuristics (e.g., equal allocation) has better out-of-sample performance than complex portfolio optimization methods due to estimation uncertainty. Even for portfolios with a modest number of assets, the estimation window required for true optimization may imply an implausibly long stationary period. The implications for risk-based sampling are illustrated by a simple simulation model of lot inspection for a small, heterogeneous group of producers. © 2015 Society for Risk Analysis.
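For readers unfamiliar with the portfolio analogy, the two allocation rules being contrasted can be written in standard notation (this is the generic textbook formulation, not a model taken from the paper): the Markowitz mean-variance problem versus the equal-allocation (1/N) heuristic,

```latex
\min_{w}\; w^{\mathsf{T}} \Sigma\, w
\quad \text{s.t.} \quad w^{\mathsf{T}} \mu \ge r_{0},\;\;
\mathbf{1}^{\mathsf{T}} w = 1,\;\; w \ge 0
\qquad \text{versus} \qquad
w_{i} = \frac{1}{N},\quad i = 1,\dots,N ,
```

where mu and Sigma are the estimated mean vector and covariance matrix. The appeal of the 1/N heuristic is precisely that it does not depend on these error-prone estimates, which is the estimation-error point made in the abstract.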
Dynamic Staffing and Rescheduling in Software Project Management: A Hybrid Approach.
Ge, Yujia; Xu, Bin
2016-01-01
Resource allocation could be influenced by various dynamic elements, such as the skills of engineers and the growth of skills, which requires managers to find an effective and efficient tool to support their staffing decision-making processes. Rescheduling happens commonly and frequently during project execution. Control decisions have to be made when new resources are added or tasks are changed. In this paper we propose a software project staffing model considering dynamic elements of staff productivity with a Genetic Algorithm (GA) and Hill Climbing (HC) based optimizer. Since a newly generated reschedule dramatically different from the initial schedule could cause an obvious shifting cost increase, our rescheduling strategies consider both efficiency and stability. The results of real-world case studies and extensive simulation experiments show that our proposed method is effective and could achieve comparable performance to other heuristic algorithms in most cases.
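A minimal sketch of a GA-plus-hill-climbing staffing optimizer in the spirit described; the encoding (one engineer per task), the simple productivity model, and the stability penalty are simplifying assumptions for illustration, not the authors' formulation.

```python
import random

def fitness(assign, effort, speed, baseline, shift_cost=0.5):
    """Lower is better: crude completion-time proxy plus a reschedule-stability penalty.

    assign[i]   - engineer assigned to task i
    effort[i]   - effort of task i (assumes at least two tasks)
    speed[e]    - productivity of engineer e
    baseline[i] - assignment in the current (pre-reschedule) plan
    """
    load = {}
    for task, eng in enumerate(assign):
        load[eng] = load.get(eng, 0.0) + effort[task] / speed[eng]
    shifts = sum(1 for a, b in zip(assign, baseline) if a != b)
    return max(load.values()) + shift_cost * shifts

def hill_climb(assign, engineers, *args):
    """Local refinement: try single-task reassignments until no improvement."""
    improved = True
    while improved:
        improved = False
        for i in range(len(assign)):
            for e in engineers:
                cand = assign[:i] + [e] + assign[i + 1:]
                if fitness(cand, *args) < fitness(assign, *args):
                    assign, improved = cand, True
    return assign

def ga_hc(effort, speed, baseline, pop=30, gens=50):
    """GA over task-to-engineer assignments, followed by hill-climbing refinement."""
    engineers = list(speed)
    args = (effort, speed, baseline)
    popn = [[random.choice(engineers) for _ in effort] for _ in range(pop)]
    for _ in range(gens):
        popn.sort(key=lambda a: fitness(a, *args))
        parents = popn[: pop // 2]
        children = []
        while len(children) < pop - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(effort))        # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.2:                     # mutation
                child[random.randrange(len(child))] = random.choice(engineers)
            children.append(child)
        popn = parents + children
    best = min(popn, key=lambda a: fitness(a, *args))
    return hill_climb(best, engineers, *args)
```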
Autonomous Data Collection Using a Self-Organizing Map.
Faigl, Jan; Hollinger, Geoffrey A
2018-05-01
The self-organizing map (SOM) is an unsupervised learning technique providing a transformation of a high-dimensional input space into a lower dimensional output space. In this paper, we utilize the SOM for the traveling salesman problem (TSP) to develop a solution to autonomous data collection. Autonomous data collection requires gathering data from predeployed sensors by moving within a limited communication radius. We propose a new growing SOM that adapts the number of neurons during learning, which also allows our approach to apply in cases where some sensors can be ignored due to a lower priority. Based on a comparison with available combinatorial heuristic algorithms for relevant variants of the TSP, the proposed approach demonstrates improved results, while also being less computationally demanding. Moreover, the proposed learning procedure can be extended to cases where particular sensors have varying communication radii, and it can also be extended to multivehicle planning.
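A bare-bones version of the classical SOM adaptation for the TSP, on which the growing variant described here builds, might look like the following Python sketch; the growing step, the communication-radius constraints, and sensor priorities are omitted, and all parameter values are illustrative.

```python
import math
import random

def som_tsp(cities, n_neurons=None, iters=20000, lr=0.8):
    """Adapt a ring of neurons to city locations; reading the ring off in order gives a tour.

    cities: list of (x, y) tuples, assumed scaled to the unit square.
    """
    n = n_neurons or 3 * len(cities)
    neurons = [[random.random(), random.random()] for _ in range(n)]
    radius = n / 10.0 + 1.0
    for _ in range(iters):
        cx, cy = random.choice(cities)
        # winner: neuron closest to the sampled city
        w = min(range(n), key=lambda i: (neurons[i][0] - cx) ** 2 + (neurons[i][1] - cy) ** 2)
        for i in range(n):
            d = min(abs(i - w), n - abs(i - w))          # distance along the ring
            if d > radius:
                continue
            h = math.exp(-(d * d) / (2.0 * radius * radius))
            neurons[i][0] += lr * h * (cx - neurons[i][0])
            neurons[i][1] += lr * h * (cy - neurons[i][1])
        lr = max(0.01, lr * 0.9997)                      # decay learning rate
        radius = max(1.0, radius * 0.9997)               # shrink neighborhood
    # assign each city to its nearest neuron and read the tour off the ring order
    nearest = lambda c: min(range(n),
                            key=lambda i: (neurons[i][0] - cities[c][0]) ** 2 +
                                          (neurons[i][1] - cities[c][1]) ** 2)
    return sorted(range(len(cities)), key=nearest)
```

The growing SOM in the paper additionally inserts or removes neurons during learning, which is what allows low-priority sensors to be dropped from the tour.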
Dynamic Staffing and Rescheduling in Software Project Management: A Hybrid Approach
Ge, Yujia; Xu, Bin
2016-01-01
Resource allocation could be influenced by various dynamic elements, such as the skills of engineers and the growth of skills, which requires managers to find an effective and efficient tool to support their staffing decision-making processes. Rescheduling happens commonly and frequently during project execution. Control decisions have to be made when new resources are added or tasks are changed. In this paper we propose a software project staffing model considering dynamic elements of staff productivity with a Genetic Algorithm (GA) and Hill Climbing (HC) based optimizer. Since a newly generated reschedule dramatically different from the initial schedule could cause an obvious shifting cost increase, our rescheduling strategies consider both efficiency and stability. The results of real-world case studies and extensive simulation experiments show that our proposed method is effective and could achieve comparable performance to other heuristic algorithms in most cases. PMID:27285420
NASA Astrophysics Data System (ADS)
Castellanos Abella, Enrique A.; Van Westen, Cees J.
Geomorphological information can be combined with decision-support tools to assess landslide hazard and risk. A heuristic model was applied to a rural municipality in eastern Cuba. The study is based on a terrain mapping units (TMU) map, generated at 1:50,000 scale by interpretation of aerial photos, satellite images and field data. Information describing 603 terrain units was collected in a database. Landslide areas were mapped in detail to classify the different failure types and parts. Three major landslide regions are recognized in the study area: coastal hills with rockfalls, shallow debris flows, and old rotational rockslides; denudational slopes in limestone, with very large deep-seated rockslides related to tectonic activity; and the Sierra de Caujerí scarp, with large rockslides. The Caujerí scarp presents the highest hazard, with recent landslides and various signs of active processes. The different landforms and the causative factors for landslides were analyzed and used to develop the heuristic model. The model is based on weights assigned by expert judgment and organized in a number of components such as slope angle, internal relief, slope shape, geological formation, active faults, distance to drainage, distance to springs, geomorphological subunits and existing landslide zones. From these variables a hierarchical heuristic model was applied in which three levels of weights were designed for classes, variables, and criteria. The model combines all weights into a single hazard value for each pixel of the landslide hazard map. The hazard map was then presented at two levels of detail, one with three classes for disaster managers and one with 10 detailed hazard classes for technical staff. The range of weight values and the number of existing landslides are registered for each class. The resulting increase in landslide density with higher hazard classes indicates that the output map is reliable. The landslide hazard map was used in combination with existing information on buildings and infrastructure to prepare a qualitative risk map. The complete lack of historical landslide information and geotechnical data precludes the development of quantitative deterministic or probabilistic models.
Simple heuristics in over-the-counter drug choices: a new hint for medical education and practice.
Riva, Silvia; Monti, Marco; Antonietti, Alessandro
2011-01-01
Over-the-counter (OTC) drugs are widely available and often purchased by consumers without advice from a health care provider. Many people rely on self-management of medications to treat common medical conditions. Although OTC medications are regulated by the National and the International Health and Drug Administration, many people are unaware of proper dosing, side effects, adverse drug reactions, and possible medication interactions. This study examined how subjects make their decisions to select an OTC drug, evaluating the role of cognitive heuristics which are simple and adaptive rules that help the decision-making process of people in everyday contexts. By analyzing 70 subjects' information-search and decision-making behavior when selecting OTC drugs, we examined the heuristics they applied in order to assess whether simple decision-making processes were also accurate and relevant. Subjects were tested with a sequence of two experimental tests based on a computerized Java system devised to analyze participants' choices in a virtual environment. We found that subjects' information-search behavior reflected the use of fast and frugal heuristics. In addition, although the heuristics which correctly predicted subjects' decisions implied significantly fewer cues on average than the subjects did in the information-search task, they were accurate in describing order of information search. A simple combination of a fast and frugal tree and a tallying rule predicted more than 78% of subjects' decisions. The current emphasis in health care is to shift some responsibility onto the consumer through expansion of self medication. To know which cognitive mechanisms are behind the choice of OTC drugs is becoming a relevant purpose of current medical education. These findings have implications both for the validity of simple heuristics describing information searches in the field of OTC drug choices and for current medical education, which has to prepare competent health specialists to orientate and support the choices of their patients.
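To make the two decision models concrete, here is a toy Python illustration of a fast-and-frugal tree (cues checked in a fixed order, each offering an immediate exit) and a tallying rule (unweighted counting of positive cues); the cues, cut-offs, and the two-drug choice are entirely hypothetical and are not taken from the study.

```python
def fast_frugal_tree(symptom_severe, wants_fast_relief, price_ok):
    """Toy fast-and-frugal tree for choosing between drug A and drug B."""
    if symptom_severe:                  # first cue: exit immediately if positive
        return "A"
    if wants_fast_relief:               # second cue: exit if positive
        return "A"
    return "A" if price_ok else "B"     # final cue decides the remaining cases

def tallying(cues_for_a):
    """Tallying: count positive cues for A, ignoring weights; ties go to B."""
    return "A" if sum(cues_for_a) > len(cues_for_a) / 2 else "B"

# Example: a consumer with mild symptoms who wants fast relief at an acceptable price
print(fast_frugal_tree(False, True, True))   # -> "A"
print(tallying([0, 1, 1]))                   # -> "A"
```

The study's finding is that a combination of exactly these two ingredients, a fast-and-frugal tree plus a tallying rule, predicted more than 78% of the observed choices.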
Simple heuristics in over-the-counter drug choices: a new hint for medical education and practice
Riva, Silvia; Monti, Marco; Antonietti, Alessandro
2011-01-01
Introduction Over-the-counter (OTC) drugs are widely available and often purchased by consumers without advice from a health care provider. Many people rely on self-management of medications to treat common medical conditions. Although OTC medications are regulated by the National and the International Health and Drug Administration, many people are unaware of proper dosing, side effects, adverse drug reactions, and possible medication interactions. Purpose This study examined how subjects make their decisions to select an OTC drug, evaluating the role of cognitive heuristics which are simple and adaptive rules that help the decision-making process of people in everyday contexts. Subjects and methods By analyzing 70 subjects’ information-search and decision-making behavior when selecting OTC drugs, we examined the heuristics they applied in order to assess whether simple decision-making processes were also accurate and relevant. Subjects were tested with a sequence of two experimental tests based on a computerized Java system devised to analyze participants’ choices in a virtual environment. Results We found that subjects’ information-search behavior reflected the use of fast and frugal heuristics. In addition, although the heuristics which correctly predicted subjects’ decisions implied significantly fewer cues on average than the subjects did in the information-search task, they were accurate in describing order of information search. A simple combination of a fast and frugal tree and a tallying rule predicted more than 78% of subjects’ decisions. Conclusion The current emphasis in health care is to shift some responsibility onto the consumer through expansion of self medication. To know which cognitive mechanisms are behind the choice of OTC drugs is becoming a relevant purpose of current medical education. These findings have implications both for the validity of simple heuristics describing information searches in the field of OTC drug choices and for current medical education, which has to prepare competent health specialists to orientate and support the choices of their patients. PMID:23745077
Shin, Dmitriy; Kovalenko, Mikhail; Ersoy, Ilker; Li, Yu; Doll, Donald; Shyu, Chi-Ren; Hammer, Richard
2017-01-01
Background: Visual heuristics of pathology diagnosis is a largely unexplored area where reported studies only provided a qualitative insight into the subject. Uncovering and quantifying pathology visual and nonvisual diagnostic patterns have great potential to improve clinical outcomes and avoid diagnostic pitfalls. Methods: Here, we present PathEdEx, an informatics computational framework that incorporates whole-slide digital pathology imaging with multiscale gaze-tracking technology to create web-based interactive pathology educational atlases and to datamine visual and nonvisual diagnostic heuristics. Results: We demonstrate the capabilities of PathEdEx for mining visual and nonvisual diagnostic heuristics using the first PathEdEx volume of a hematopathology atlas. We conducted a quantitative study on the time dynamics of zooming and panning operations utilized by experts and novices to come to the correct diagnosis. We then performed association rule mining to determine sets of diagnostic factors that consistently result in a correct diagnosis, and studied differences in diagnostic strategies across different levels of pathology expertise using Markov chain (MC) modeling and MC Monte Carlo simulations. To perform these studies, we translated raw gaze points to high-explanatory semantic labels that represent pathology diagnostic clues. Therefore, the outcome of these studies is readily transformed into narrative descriptors for direct use in pathology education and practice. Conclusion: PathEdEx framework can be used to capture best practices of pathology visual and nonvisual diagnostic heuristics that can be passed over to the next generation of pathologists and have potential to streamline implementation of precision diagnostics in precision medicine settings. PMID:28828200
Multiobjective hyper heuristic scheme for system design and optimization
NASA Astrophysics Data System (ADS)
Rafique, Amer Farhan
2012-11-01
As system design becomes more and more multifaceted, integrated, and complex, the traditional single-objective approach to optimal design is becoming less and less efficient and effective. Single-objective optimization methods present a unique optimal solution, whereas multiobjective methods present a Pareto front. The foremost intent is to predict a reasonably distributed Pareto-optimal solution set, independent of the problem instance, through a multiobjective scheme. Another objective of the intended approach is to improve the quality of the outputs of the complex engineering system design process at the conceptual design phase. The process is automated in order to give the system designer the possibility of studying and analyzing a large number of possible solutions in a short time. This article presents a Multiobjective Hyper Heuristic Optimization Scheme based on low-level meta-heuristics developed for application in engineering system design. Herein, we present a stochastic function to manage the low-level meta-heuristics and increase the likelihood of reaching a global optimum solution. Genetic Algorithm, Simulated Annealing, and Swarm Intelligence are used as low-level meta-heuristics in this study. Performance of the proposed scheme is investigated through a comprehensive empirical analysis yielding acceptable results. One of the primary motives for performing multiobjective optimization is that current engineering systems require simultaneous optimization of multiple, conflicting objectives. Randomized decision making makes the implementation of this scheme attractive and easy. Injecting feasible solutions significantly alters the search direction and also adds population diversity, helping to accomplish the pre-defined goals of the proposed scheme.
Shin, Dmitriy; Kovalenko, Mikhail; Ersoy, Ilker; Li, Yu; Doll, Donald; Shyu, Chi-Ren; Hammer, Richard
2017-01-01
Visual heuristics of pathology diagnosis is a largely unexplored area where reported studies only provided a qualitative insight into the subject. Uncovering and quantifying pathology visual and nonvisual diagnostic patterns have great potential to improve clinical outcomes and avoid diagnostic pitfalls. Here, we present PathEdEx, an informatics computational framework that incorporates whole-slide digital pathology imaging with multiscale gaze-tracking technology to create web-based interactive pathology educational atlases and to datamine visual and nonvisual diagnostic heuristics. We demonstrate the capabilities of PathEdEx for mining visual and nonvisual diagnostic heuristics using the first PathEdEx volume of a hematopathology atlas. We conducted a quantitative study on the time dynamics of zooming and panning operations utilized by experts and novices to come to the correct diagnosis. We then performed association rule mining to determine sets of diagnostic factors that consistently result in a correct diagnosis, and studied differences in diagnostic strategies across different levels of pathology expertise using Markov chain (MC) modeling and MC Monte Carlo simulations. To perform these studies, we translated raw gaze points to high-explanatory semantic labels that represent pathology diagnostic clues. Therefore, the outcome of these studies is readily transformed into narrative descriptors for direct use in pathology education and practice. PathEdEx framework can be used to capture best practices of pathology visual and nonvisual diagnostic heuristics that can be passed over to the next generation of pathologists and have potential to streamline implementation of precision diagnostics in precision medicine settings.
Introducing TreeCollapse: a novel greedy algorithm to solve the cophylogeny reconstruction problem.
Drinkwater, Benjamin; Charleston, Michael A
2014-01-01
Cophylogeny mapping is used to uncover deep coevolutionary associations between two or more phylogenetic histories at a macro coevolutionary scale. As cophylogeny mapping is NP-hard, this technique relies heavily on heuristics to solve all but the most trivial cases. One notable approach utilises a metaheuristic to search only a subset of the exponential number of fixed node orderings possible for the phylogenetic histories in question. This is of particular interest as it is the only known heuristic that guarantees biologically feasible solutions. This has enabled research to focus on larger coevolutionary systems, such as coevolutionary associations between figs and their pollinator wasps, including over 200 taxa. Although this approach can converge on solutions for problem instances of this size, a reduction from its current cubic running time is required to handle larger systems, such as Wolbachia and their insect hosts. Rather than solving the underlying problem optimally, this work presents a greedy algorithm called TreeCollapse, which uses common topological patterns to recover an approximation of the coevolutionary history where the internal node ordering is fixed. This approach offers a significant speed-up compared to previous methods, running in linear time. The algorithm has been applied to over 100 well-known coevolutionary systems, converging on Pareto optimal solutions in over 68% of test cases, including some cases where the Pareto optimal solution had not previously been recoverable. Further, while TreeCollapse applies a local search technique, it can guarantee that solutions are biologically feasible, making it the fastest method that can provide such a guarantee. As a result, we argue that the newly proposed algorithm is a valuable addition to the field of coevolutionary research. Not only does it offer a significantly faster method to estimate the cost of cophylogeny mappings, but by using this approach in conjunction with existing heuristics, it can assist in recovering a larger subset of the Pareto front than has previously been possible.
A case study of a college physics professor's pedagogical content knowledge
NASA Astrophysics Data System (ADS)
Counts, Margaret Cross
Problem. Research into pedagogical content knowledge (PCK) has focused mainly on subject (content) matter, levels of expertise, or subject-specific areas. Throughout the literature (Fernandez-Balboa & Stiehl, 1992; Grossman, 1988; Lenze, 1994; Shulman, 1986b), few studies about college professors appear. The rationale for this heuristic case study of PCK was to contribute to that body of knowledge as it applies to college teaching. The purpose of this study was twofold: first, to contribute to a broader conceptualization and understanding of the development of "general" PCK in college-level teaching by generalizing Shulman's (1987) and Grossman's (1988) model of PCK to college professors; second, to describe how this professor's PCK was constructed. Method. The heuristic case study employed techniques of multiple semistructured participant interviews and supportive data sources. The data were analyzed by analytical induction. Results. In this heuristic study five major themes emerged that reflected this professor's PCK: (a) knowledge of the purposes for teaching, (b) knowledge of students as learners, (c) knowledge of human communication: teaching as an interaction, (d) knowledge of curriculum and course design, and (e) knowledge of a positive learning environment. Six categories emerged that described the development of his PCK: (a) the need for content knowledge, (b) the need for communication, (c) sensitivity to the students' in-class behavior and environment, (d) personal reflection regarding the classroom environment, both before and after class, (e) teaching experience, and (f) collegial discussions about teaching. The construction of his PCK was attributed to the integration of subject matter knowledge, apprenticeship of observation, and classroom experience. Conclusions. Analyses revealed that this college professor's PCK was in large part congruent with Shulman's (1986b) conceptualization and Grossman's (1988) four components of PCK. An additional affective component, however, was identified for this professor, which was considered to be an enhancing interactive component of PCK: the human communication element. Further research into the construction and enhancement of PCK for college faculty is needed.
Automating the packing heuristic design process with genetic programming.
Burke, Edmund K; Hyde, Matthew R; Kendall, Graham; Woodward, John
2012-01-01
The literature shows that one-, two-, and three-dimensional bin packing and knapsack packing are difficult problems in operational research. Many techniques, including exact, heuristic, and metaheuristic approaches, have been investigated to solve these problems and it is often not clear which method to use when presented with a new instance. This paper presents an approach which is motivated by the goal of building computer systems which can design heuristic methods. The overall aim is to explore the possibilities for automating the heuristic design process. We present a genetic programming system to automatically generate a good quality heuristic for each instance. It is not necessary to change the methodology depending on the problem type (one-, two-, or three-dimensional knapsack and bin packing problems), and it therefore has a level of generality unmatched by other systems in the literature. We carry out an extensive suite of experiments and compare with the best human designed heuristics in the literature. Note that our heuristic design methodology uses the same parameters for all the experiments. The contribution of this paper is to present a more general packing methodology than those currently available, and to show that, by using this methodology, it is possible for a computer system to design heuristics which are competitive with the human designed heuristics from the literature. This represents the first packing algorithm in the literature able to claim human competitive results in such a wide variety of packing domains.
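What such a system evolves is, in effect, a scoring expression that is then embedded in a fixed packing loop. The sketch below shows where a generated heuristic would plug in for one-dimensional bin packing; the scoring function here is a hand-written stand-in (it reduces to best-fit), not an expression produced by the authors' genetic programming system.

```python
def evolved_score(free, size):
    """Stand-in for a GP-evolved expression: score a bin with residual
    capacity `free` for an item of this `size` (higher is better)."""
    waste = free - size
    return -waste if waste >= 0 else float("-inf")   # this choice reduces to best-fit

def pack(items, capacity):
    """Generic loop: place each item into the bin the evolved heuristic prefers."""
    bins = []                                        # residual capacities of open bins
    for size in items:
        scores = [evolved_score(free, size) for free in bins]
        if scores and max(scores) > float("-inf"):
            i = scores.index(max(scores))
            bins[i] -= size
        else:
            bins.append(capacity - size)             # open a new bin
    return len(bins)

print(pack([4, 8, 1, 4, 2, 1], capacity=10))         # -> 2 bins
```

Genetic programming searches over the space of such scoring expressions, so the same loop can be specialized to each problem instance without changing the surrounding methodology.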
Kamiura, Moto; Sano, Kohei
2017-10-01
The principle of optimism in the face of uncertainty is known as a heuristic in sequential decision-making problems. The Overtaking method, based on this principle, is an effective algorithm for solving multi-armed bandit problems. In a previous study it was defined by a set of heuristic patterns of formulation. The objective of the present paper is to redefine the value functions of the Overtaking method and to unify their formulation. The unified Overtaking method is associated with upper bounds of confidence intervals of expected rewards on statistics. The unification of the formulation enhances the universality of the Overtaking method. Consequently, we obtain a new Overtaking method for exponentially distributed rewards, analyze it numerically, and show that it outperforms the UCB algorithm on average. The present study suggests that the principle of optimism in the face of uncertainty should be regarded as the statistics-based consequence of the law of large numbers for the sample mean of rewards and estimation of upper bounds of expected rewards, rather than as a heuristic, in the context of multi-armed bandit problems. Copyright © 2017 Elsevier B.V. All rights reserved.
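For comparison, the UCB algorithm that the Overtaking method is benchmarked against can be sketched as the standard UCB1 rule for bounded rewards (Auer et al., 2002): play the arm maximizing the empirical mean plus an upper-confidence exploration bonus. The paper's setting of exponentially distributed rewards calls for a different bound; this sketch only illustrates the optimism principle.

```python
import math
import random

def ucb1(pull, n_arms, horizon):
    """pull(a) returns a stochastic reward in [0, 1] for arm a; returns total reward."""
    counts = [0] * n_arms
    means = [0.0] * n_arms
    total = 0.0
    for t in range(1, horizon + 1):
        if t <= n_arms:
            a = t - 1                                   # play every arm once first
        else:
            a = max(range(n_arms),
                    key=lambda i: means[i] + math.sqrt(2 * math.log(t) / counts[i]))
        r = pull(a)
        counts[a] += 1
        means[a] += (r - means[a]) / counts[a]          # incremental mean update
        total += r
    return total

# Example with two Bernoulli arms (success probabilities 0.4 and 0.6)
print(ucb1(lambda a: float(random.random() < (0.4, 0.6)[a]), n_arms=2, horizon=10000))
```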
The Recognition Heuristic: A Review of Theory and Tests
Pachur, Thorsten; Todd, Peter M.; Gigerenzer, Gerd; Schooler, Lael J.; Goldstein, Daniel G.
2011-01-01
The recognition heuristic is a prime example of how, by exploiting a match between mind and environment, a simple mental strategy can lead to efficient decision making. The proposal of the heuristic initiated a debate about the processes underlying the use of recognition in decision making. We review research addressing four key aspects of the recognition heuristic: (a) that recognition is often an ecologically valid cue; (b) that people often follow recognition when making inferences; (c) that recognition supersedes further cue knowledge; (d) that its use can produce the less-is-more effect – the phenomenon that lesser states of recognition knowledge can lead to more accurate inferences than more complete states. After we contrast the recognition heuristic to other related concepts, including availability and fluency, we carve out, from the existing findings, some boundary conditions of the use of the recognition heuristic as well as key questions for future research. Moreover, we summarize developments concerning the connection of the recognition heuristic with memory models. We suggest that the recognition heuristic is used adaptively and that, compared to other cues, recognition seems to have a special status in decision making. Finally, we discuss how systematic ignorance is exploited in other cognitive mechanisms (e.g., estimation and preference). PMID:21779266
Heuristic Bayesian segmentation for discovery of coexpressed genes within genomic regions.
Pehkonen, Petri; Wong, Garry; Törönen, Petri
2010-01-01
Segmentation aims to separate homogeneous areas within sequential data, and plays a central role in data mining. It has applications ranging from finance to molecular biology, where bioinformatics tasks such as genome data analysis are active application fields. In this paper, we present a novel application of segmentation for locating genomic regions with coexpressed genes. We aim at automated discovery of such regions without requiring user-given parameters. In order to perform the segmentation within a reasonable time, we use heuristics. Most heuristic segmentation algorithms require some decision on the number of segments. This is usually accomplished by using asymptotic model selection methods such as the Bayesian information criterion. Such methods are based on simplifications that can limit their usage. In this paper, we propose a Bayesian model selection procedure to choose the most appropriate result from heuristic segmentation. Our Bayesian model presents a simple prior for segmentation solutions with various segment numbers and a modified Dirichlet prior for modeling multinomial data. We show with various artificial data sets in our benchmark system that our model selection criterion has the best overall performance. The application of our method to yeast cell-cycle gene expression data reveals potential active and passive regions of the genome.
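For concreteness, the asymptotic criterion mentioned, the Bayesian information criterion, takes its usual form for a candidate segmentation with maximized likelihood, k free parameters, and n observations:

```latex
\mathrm{BIC} \;=\; k \ln n \;-\; 2 \ln \hat{L},
```

with the candidate minimizing BIC selected. The simplification referred to above is the large-sample approximation behind this formula, which is what motivates the explicit prior-based Bayesian score proposed instead.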
Recipient design in human communication: simple heuristics or perspective taking?
Blokpoel, Mark; van Kesteren, Marlieke; Stolk, Arjen; Haselager, Pim; Toni, Ivan; van Rooij, Iris
2012-01-01
Humans have a remarkable capacity for tuning their communicative behaviors to different addressees, a phenomenon also known as recipient design. It remains unclear how this tuning of communicative behavior is implemented during live human interactions. Classical theories of communication postulate that recipient design involves perspective taking, i.e., the communicator selects her behavior based on her hypotheses about beliefs and knowledge of the recipient. More recently, researchers have argued that perspective taking is computationally too costly to be a plausible mechanism in everyday human communication. These researchers propose that computationally simple mechanisms, or heuristics, are exploited to perform recipient design. Such heuristics may be able to adapt communicative behavior to an addressee with no consideration for the addressee's beliefs and knowledge. To test whether the simpler of the two mechanisms is sufficient for explaining the “how” of recipient design we studied communicators' behaviors in the context of a non-verbal communicative task (the Tacit Communication Game, TCG). We found that the specificity of the observed trial-by-trial adjustments made by communicators is parsimoniously explained by perspective taking, but not by simple heuristics. This finding is important as it suggests that humans do have a computationally efficient way of taking beliefs and knowledge of a recipient into account. PMID:23055960
NASA Astrophysics Data System (ADS)
Indrayana, I. N. E.; P, N. M. Wirasyanti D.; Sudiartha, I. KG
2018-01-01
Mobile applications allow many users to access data without being limited by place and time. Over time, the data population of such an application grows, and data access time becomes a problem once tables reach tens of thousands to millions of records. The objective of this research is to maintain data execution performance for large numbers of records. One way to preserve access-time performance is to apply query optimization; the optimization used in this research is the heuristic query optimization method. The application built is a mobile-based financial application using a MySQL database with stored procedures. The application serves more than one business entity in a single database, which enables rapid data growth. Within the stored procedures, queries are optimized using the heuristic method; optimization is performed on SELECT queries that involve more than one table and multiple clauses. Evaluation is done by calculating the average access time of optimized and unoptimized queries, and the calculation is repeated as the data population in the database grows. The evaluation results show that data execution with heuristic query optimization is faster than data execution without query optimization.
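To make the comparison concrete, here is a minimal sketch of the kind of measurement described, assuming MySQL Connector/Python and a hypothetical financial schema (the table and column names are illustrative, not the authors'). The heuristic rewrite pushes the restrictive conditions and the column projection below the join so that fewer rows and columns reach the join step.

```python
import time
import mysql.connector  # assumes the MySQL Connector/Python package is installed

# Hypothetical schema: table and column names below are illustrative only.
UNOPTIMIZED = """
    SELECT t.*, a.*
    FROM transactions t
    JOIN accounts a ON a.id = t.account_id
    WHERE a.entity_id = %s AND t.trx_date BETWEEN %s AND %s
"""

# Heuristic rewrite: project only the needed columns and restrict each table
# as early as possible (push selections below the join).
OPTIMIZED = """
    SELECT t.id, t.trx_date, t.amount, a.name
    FROM (SELECT id, name FROM accounts WHERE entity_id = %s) AS a
    JOIN (SELECT id, account_id, trx_date, amount
          FROM transactions
          WHERE trx_date BETWEEN %s AND %s) AS t
      ON t.account_id = a.id
"""

def average_access_time(conn, query, params, runs=5):
    """Average wall-clock execution time of a query over several runs."""
    cur = conn.cursor()
    total = 0.0
    for _ in range(runs):
        start = time.perf_counter()
        cur.execute(query, params)
        cur.fetchall()
        total += time.perf_counter() - start
    cur.close()
    return total / runs

# Example usage (credentials are placeholders):
# conn = mysql.connector.connect(host="localhost", user="app",
#                                password="...", database="finance")
# params = (42, "2018-01-01", "2018-03-31")
# print(average_access_time(conn, UNOPTIMIZED, params),
#       average_access_time(conn, OPTIMIZED, params))
```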
Ericson, Keith M Marzilli; White, John Myles; Laibson, David; Cohen, Jonathan D
2015-06-01
Heuristic models have been proposed for many domains involving choice. We conducted an out-of-sample, cross-validated comparison of heuristic models of intertemporal choice (which can account for many of the known intertemporal choice anomalies) and discounting models. Heuristic models outperformed traditional utility-discounting models, including models of exponential and hyperbolic discounting. The best-performing models predicted choices by using a weighted average of absolute differences and relative percentage differences of the attributes of the goods in a choice set. We concluded that heuristic models explain time-money trade-off choices in experiments better than do utility-discounting models. © The Author(s) 2015.
Marzilli Ericson, Keith M.; White, John Myles; Laibson, David; Cohen, Jonathan D.
2015-01-01
Heuristic models have been proposed for many domains of choice. We compare heuristic models of intertemporal choice, which can account for many of the known intertemporal choice anomalies, to discounting models. We conduct an out-of-sample, cross-validated comparison of intertemporal choice models. Heuristic models outperform traditional utility discounting models, including models of exponential and hyperbolic discounting. The best performing models predict choices by using a weighted average of absolute differences and relative (percentage) differences of the attributes of the goods in a choice set. We conclude that heuristic models explain time-money tradeoff choices in experiments better than utility discounting models. PMID:25911124
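The best-performing class of models described here can be written down compactly: a weighted sum of absolute and relative differences in amount and delay, mapped to a choice probability. Below is a minimal sketch of such a difference-based heuristic, assuming a logistic choice rule; the coefficient names, values, and the normalization by the attribute means are illustrative, not the authors' fitted specification.

```python
import math

def prob_choose_later(x_soon, t_soon, x_late, t_late, beta):
    """Probability of choosing the larger-later reward over the smaller-sooner
    one under a difference-based heuristic: a weighted sum of absolute and
    relative (percentage) differences in amount and delay, passed through a
    logistic function."""
    x_ref = max((x_soon + x_late) / 2.0, 1e-9)  # reference levels for the
    t_ref = max((t_soon + t_late) / 2.0, 1e-9)  # relative (percentage) terms
    drive = (beta["intercept"]
             + beta["amount_abs"] * (x_late - x_soon)
             + beta["amount_rel"] * (x_late - x_soon) / x_ref
             + beta["delay_abs"] * (t_soon - t_late)
             + beta["delay_rel"] * (t_soon - t_late) / t_ref)
    return 1.0 / (1.0 + math.exp(-drive))

# Illustrative coefficients only; in practice they are estimated from choice data.
beta = {"intercept": 0.0, "amount_abs": 0.05, "amount_rel": 3.0,
        "delay_abs": 0.02, "delay_rel": 1.5}
print(prob_choose_later(x_soon=50, t_soon=0, x_late=60, t_late=30, beta=beta))
```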
Heuristics and Problem Solving.
ERIC Educational Resources Information Center
Abel, Charles F.
2003-01-01
Defines heuristics as cognitive "rules of thumb" that can help problem solvers work more efficiently and effectively. Professors can use a heuristic model of problem solving to guide students in all disciplines through the steps of problem-solving. (SWM)
On use of image quality metrics for perceptual blur modeling: image/video compression case
NASA Astrophysics Data System (ADS)
Cha, Jae H.; Olson, Jeffrey T.; Preece, Bradley L.; Espinola, Richard L.; Abbott, A. Lynn
2018-02-01
Linear system theory is employed to make target acquisition performance predictions for electro-optical/infrared imaging systems in which the modulation transfer function (MTF) may be imposed by a nonlinear degradation process. Previous research relying on image quality metric (IQM) methods, which heuristically estimate a perceived MTF, has supported the use of an average perceived MTF to model some types of degradation, such as image compression. Here, we discuss the validity of the IQM approach by mathematically analyzing the associated heuristics from the perspective of reliability, robustness, and tractability. Experiments with standard images compressed by x.264 encoding suggest that the compression degradation can be estimated by a perceived MTF within boundaries defined by well-behaved curves with marginal error. Our results confirm that the IQM linearizer methodology provides a credible tool for sensor performance modeling.
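As one concrete, deliberately generic illustration of what "heuristically estimating a perceived MTF" can mean, the sketch below estimates an effective degradation MTF as the ratio of the radially averaged magnitude spectra of a degraded image and its reference; this is an assumed stand-in for exposition, not the specific IQM analyzed in the paper.

```python
import numpy as np

def degradation_mtf(reference, degraded):
    """Heuristic 'effective MTF' of a degradation process, estimated as the
    ratio of the radially averaged magnitude spectra of the degraded and the
    reference image (both 2-D float arrays of the same shape)."""
    ref_mag = np.abs(np.fft.fftshift(np.fft.fft2(reference)))
    deg_mag = np.abs(np.fft.fftshift(np.fft.fft2(degraded)))
    h, w = reference.shape
    y, x = np.indices((h, w))
    radius = np.hypot(x - w // 2, y - h // 2).astype(int).ravel()
    n_bins = min(h, w) // 2  # radial spatial-frequency bins up to Nyquist
    counts = np.bincount(radius, minlength=n_bins)[:n_bins]
    ref_prof = np.bincount(radius, weights=ref_mag.ravel(),
                           minlength=n_bins)[:n_bins] / counts
    deg_prof = np.bincount(radius, weights=deg_mag.ravel(),
                           minlength=n_bins)[:n_bins] / counts
    return np.clip(deg_prof / np.maximum(ref_prof, 1e-12), 0.0, 1.0)
```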
An adaptive toolbox approach to the route to expertise in sport.
de Oliveira, Rita F; Lobinger, Babett H; Raab, Markus
2014-01-01
Expertise is characterized by fast decision-making that is highly adaptive to new situations. Here we propose that athletes use a toolbox of heuristics which they develop on their route to expertise. The development of heuristics occurs within the context of the athletes' natural abilities, past experiences, developed skills, and situational context, but does not pertain to any of these factors separately. The novelty of this approach lies in integrating these separate factors determining expertise into a comprehensive heuristic description. It is our contention that talent identification methods and talent development models should therefore be geared toward the assessment and development of specific heuristics: in addition to identifying and developing separate natural abilities and skills as usual, heuristics should be identified and developed. The application of heuristics to talent and expertise models can bring the field one step away from dichotomized models of nature and nurture, toward a comprehensive approach to the route to expertise.