Sample records for simple heuristic model

  1. Memory-Based Simple Heuristics as Attribute Substitution: Competitive Tests of Binary Choice Inference Models.

    PubMed

    Honda, Hidehito; Matsuka, Toshihiko; Ueda, Kazuhiro

    2017-05-01

    Some researchers on binary choice inference have argued that people make inferences based on simple heuristics, such as recognition, fluency, or familiarity. Others have argued that people make inferences based on available knowledge. To examine the boundary between heuristic and knowledge usage, we examine binary choice inference processes in terms of attribute substitution in heuristic use (Kahneman & Frederick, 2005). In this framework, it is predicted that people will rely on heuristic or knowledge-based inference depending on the subjective difficulty of the inference task. We conducted competitive tests of binary choice inference models representing simple heuristics (fluency and familiarity heuristics) and knowledge-based inference models. We found that a simple heuristic model (especially a familiarity heuristic model) explained inference patterns for subjectively difficult inference tasks, and that a knowledge-based inference model explained subjectively easy inference tasks. These results were consistent with the predictions of the attribute substitution framework. Issues on usage of simple heuristics and psychological processes are discussed. Copyright © 2016 Cognitive Science Society, Inc.

  2. Précis of Simple heuristics that make us smart.

    PubMed

    Todd, P M; Gigerenzer, G

    2000-10-01

    How can anyone be rational in a world where knowledge is limited, time is pressing, and deep thought is often an unattainable luxury? Traditional models of unbounded rationality and optimization in cognitive science, economics, and animal behavior have tended to view decision-makers as possessing supernatural powers of reason, limitless knowledge, and endless time. But understanding decisions in the real world requires a more psychologically plausible notion of bounded rationality. In Simple heuristics that make us smart (Gigerenzer et al. 1999), we explore fast and frugal heuristics--simple rules in the mind's adaptive toolbox for making decisions with realistic mental resources. These heuristics can enable both living organisms and artificial systems to make smart choices quickly and with a minimum of information by exploiting the way that information is structured in particular environments. In this précis, we show how simple building blocks that control information search, stop search, and make decisions can be put together to form classes of heuristics, including: ignorance-based and one-reason decision making for choice, elimination models for categorization, and satisficing heuristics for sequential search. These simple heuristics perform comparably to more complex algorithms, particularly when generalizing to new data--that is, simplicity leads to robustness. We present evidence regarding when people use simple heuristics and describe the challenges to be addressed by this research program.

  3. Memory-Based Simple Heuristics as Attribute Substitution: Competitive Tests of Binary Choice Inference Models

    ERIC Educational Resources Information Center

    Honda, Hidehito; Matsuka, Toshihiko; Ueda, Kazuhiro

    2017-01-01

    Some researchers on binary choice inference have argued that people make inferences based on simple heuristics, such as recognition, fluency, or familiarity. Others have argued that people make inferences based on available knowledge. To examine the boundary between heuristic and knowledge usage, we examine binary choice inference processes in…

  4. Simple heuristics and rules of thumb: where psychologists and behavioural biologists might meet.

    PubMed

    Hutchinson, John M C; Gigerenzer, Gerd

    2005-05-31

    The Centre for Adaptive Behaviour and Cognition (ABC) has hypothesised that much human decision-making can be described by simple algorithmic process models (heuristics). This paper explains this approach and relates it to research in biology on rules of thumb, which we also review. As an example of a simple heuristic, consider the lexicographic strategy of Take The Best for choosing between two alternatives: cues are searched in turn until one discriminates, then search stops and all other cues are ignored. Heuristics consist of building blocks, and building blocks exploit evolved or learned abilities such as recognition memory; it is the complexity of these abilities that allows the heuristics to be simple. Simple heuristics have an advantage in making decisions fast and with little information, and in avoiding overfitting. Furthermore, humans are observed to use simple heuristics. Simulations show that the statistical structures of different environments affect which heuristics perform better, a relationship referred to as ecological rationality. We contrast ecological rationality with the stronger claim of adaptation. Rules of thumb from biology provide clearer examples of adaptation because animals can be studied in the environments in which they evolved. The range of examples is also much more diverse. To investigate them, biologists have sometimes used similar simulation techniques to ABC, but many examples depend on empirically driven approaches. ABC's theoretical framework can be useful in connecting some of these examples, particularly the scattered literature on how information from different cues is integrated. Optimality modelling is usually used to explain less detailed aspects of behaviour but might more often be redirected to investigate rules of thumb.
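
    The lexicographic Take The Best strategy described above can be sketched in a few lines (an illustrative Python sketch; the cue names and the "guess" fallback are assumptions for illustration, not from the paper):

```python
def take_the_best(cues_a, cues_b, cue_order):
    """Take The Best: search cues in order of validity; the first cue that
    discriminates decides the choice, and all remaining cues are ignored.
    Cue values: 1 (positive), 0 (negative), None (unknown)."""
    for cue in cue_order:
        a, b = cues_a.get(cue), cues_b.get(cue)
        if a == 1 and b != 1:
            return "A"
        if b == 1 and a != 1:
            return "B"
    return "guess"  # no cue discriminates
```

    Note how simplicity follows from the building blocks: the search rule (cue order), the stopping rule (first discriminating cue), and the decision rule (choose the option favoured by that cue) are each one line.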

  5. Money Earlier or Later? Simple Heuristics Explain Intertemporal Choices Better than Delay Discounting

    PubMed Central

    Marzilli Ericson, Keith M.; White, John Myles; Laibson, David; Cohen, Jonathan D.

    2015-01-01

    Heuristic models have been proposed for many domains of choice. We compare heuristic models of intertemporal choice, which can account for many of the known intertemporal choice anomalies, to discounting models. We conduct an out-of-sample, cross-validated comparison of intertemporal choice models. Heuristic models outperform traditional utility discounting models, including models of exponential and hyperbolic discounting. The best performing models predict choices by using a weighted average of absolute differences and relative (percentage) differences of the attributes of the goods in a choice set. We conclude that heuristic models explain time-money tradeoff choices in experiments better than utility discounting models. PMID:25911124

  6. Money earlier or later? Simple heuristics explain intertemporal choices better than delay discounting does.

    PubMed

    Ericson, Keith M Marzilli; White, John Myles; Laibson, David; Cohen, Jonathan D

    2015-06-01

    Heuristic models have been proposed for many domains involving choice. We conducted an out-of-sample, cross-validated comparison of heuristic models of intertemporal choice (which can account for many of the known intertemporal choice anomalies) and discounting models. Heuristic models outperformed traditional utility-discounting models, including models of exponential and hyperbolic discounting. The best-performing models predicted choices by using a weighted average of absolute differences and relative percentage differences of the attributes of the goods in a choice set. We concluded that heuristic models explain time-money trade-off choices in experiments better than do utility-discounting models. © The Author(s) 2015.
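
    The best-performing model described here, a weighted average of absolute and relative (percentage) differences of amounts and delays, can be sketched as follows (the weights and reference values are illustrative assumptions, not the fitted parameters from the paper):

```python
def heuristic_score(x_soon, t_soon, x_late, t_late, w):
    """Score > 0 favours the later-larger reward. Combines absolute and
    relative differences of the amounts (x) and delays (t) of the two
    options, in the spirit of the best-performing heuristic models.
    w: dict of weights (illustrative values, not fitted parameters)."""
    x_ref = (x_soon + x_late) / 2.0        # reference amount
    t_ref = (t_soon + t_late) / 2.0 or 1.0  # reference delay (avoid /0)
    return (w["x_abs"] * (x_late - x_soon)
            + w["x_rel"] * (x_late - x_soon) / x_ref
            + w["t_abs"] * (t_soon - t_late)
            + w["t_rel"] * (t_soon - t_late) / t_ref)
```

    A doubling of the amount over a short delay yields a positive score (take the later reward), while a tiny gain over a long delay yields a negative one (take the sooner reward).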

  7. The probability heuristics model of syllogistic reasoning.

    PubMed

    Chater, N; Oaksford, M

    1999-03-01

    A probability heuristic model (PHM) for syllogistic reasoning is proposed. An informational ordering over quantified statements suggests simple probability based heuristics for syllogistic reasoning. The most important is the "min-heuristic": choose the type of the least informative premise as the type of the conclusion. The rationality of this heuristic is confirmed by an analysis of the probabilistic validity of syllogistic reasoning which treats logical inference as a limiting case of probabilistic inference. A meta-analysis of past experiments reveals close fits with PHM. PHM also compares favorably with alternative accounts, including mental logics, mental models, and deduction as verbal reasoning. Crucially, PHM extends naturally to generalized quantifiers, such as Most and Few, which have not been characterized logically and are, consequently, beyond the scope of current mental logic and mental model theories. Two experiments confirm the novel predictions of PHM when generalized quantifiers are used in syllogistic arguments. PHM suggests that syllogistic reasoning performance may be determined by simple but rational informational strategies justified by probability theory rather than by logic. Copyright 1999 Academic Press.
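
    The min-heuristic is a one-line rule once the informativeness ordering over quantifiers is fixed (the ordering below, including the generalized quantifiers Most and Few, follows the PHM account; the numeric ranks are an arbitrary encoding):

```python
# PHM's informativeness ordering over quantified statements:
# All (A) > Most (M) > Few (F) > Some (I) > None (E) > Some-not (O).
INFORMATIVENESS = {"A": 6, "M": 5, "F": 4, "I": 3, "E": 2, "O": 1}

def min_heuristic(premise_type_1, premise_type_2):
    """min-heuristic: the conclusion takes the type of the least
    informative premise."""
    return min(premise_type_1, premise_type_2, key=INFORMATIVENESS.get)
```

    For example, an All premise paired with a Some premise yields a Some conclusion.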

  8. Simple Heuristic Approach to Introduction of the Black-Scholes Model

    ERIC Educational Resources Information Center

    Yalamova, Rossitsa

    2010-01-01

    A heuristic approach to explaining the Black-Scholes option pricing model in undergraduate classes is described. The approach draws upon the method of protocol analysis to encourage students to "think aloud" so that their mental models can be surfaced. It also relies upon extensive visualizations to communicate relationships that are…

  9. Requirements analysis, domain knowledge, and design

    NASA Technical Reports Server (NTRS)

    Potts, Colin

    1988-01-01

    Two improvements to current requirements analysis practices are suggested: domain modeling, and the systematic application of analysis heuristics. Domain modeling is the representation of relevant application knowledge prior to requirements specification. Artificial intelligence techniques may eventually be applicable for domain modeling. In the short term, however, restricted domain modeling techniques, such as that in JSD, will still be of practical benefit. Analysis heuristics are standard patterns of reasoning about the requirements. They usually generate questions of clarification or issues relating to completeness. Analysis heuristics can be represented and therefore systematically applied in an issue-based framework. This is illustrated by an issue-based analysis of JSD's domain modeling and functional specification heuristics. They are discussed in the context of the preliminary design of simple embedded systems.

  10. Heuristic decision making.

    PubMed

    Gigerenzer, Gerd; Gaissmaier, Wolfgang

    2011-01-01

    As reflected in the amount of controversy, few areas in psychology have undergone such dramatic conceptual changes in the past decade as the emerging science of heuristics. Heuristics are efficient cognitive processes, conscious or unconscious, that ignore part of the information. Because using heuristics saves effort, the classical view has been that heuristic decisions imply greater errors than do "rational" decisions as defined by logic or statistical models. However, for many decisions, the assumptions of rational models are not met, and it is an empirical rather than an a priori issue how well cognitive heuristics function in an uncertain world. To answer both the descriptive question ("Which heuristics do people use in which situations?") and the prescriptive question ("When should people rely on a given heuristic rather than a complex strategy to make better judgments?"), formal models are indispensable. We review research that tests formal models of heuristic inference, including in business organizations, health care, and legal institutions. This research indicates that (a) individuals and organizations often rely on simple heuristics in an adaptive way, and (b) ignoring part of the information can lead to more accurate judgments than weighting and adding all information, for instance for low predictability and small samples. The big future challenge is to develop a systematic theory of the building blocks of heuristics as well as the core capacities and environmental structures these exploit.

  11. How the twain can meet: Prospect theory and models of heuristics in risky choice.

    PubMed

    Pachur, Thorsten; Suter, Renata S; Hertwig, Ralph

    2017-03-01

    Two influential approaches to modeling choice between risky options are algebraic models (which focus on predicting the overt decisions) and models of heuristics (which are also concerned with capturing the underlying cognitive process). Because they rest on fundamentally different assumptions and algorithms, the two approaches are usually treated as antithetical, or even incommensurable. Drawing on cumulative prospect theory (CPT; Tversky & Kahneman, 1992) as the currently most influential instance of a descriptive algebraic model, we demonstrate how the two modeling traditions can be linked. CPT's algebraic functions characterize choices in terms of psychophysical (diminishing sensitivity to probabilities and outcomes) as well as psychological (risk aversion and loss aversion) constructs. Models of heuristics characterize choices as rooted in simple information-processing principles such as lexicographic and limited search. In computer simulations, we estimated CPT's parameters for choices produced by various heuristics. The resulting CPT parameter profiles portray each of the choice-generating heuristics in psychologically meaningful ways, capturing, for instance, differences in how the heuristics process probability information. Furthermore, CPT parameters can reflect a key property of many heuristics, lexicographic search, and track the environment-dependent behavior of heuristics. Finally, we show, both in an empirical and a model recovery study, how CPT parameter profiles can be used to detect the operation of heuristics. We also address the limits of CPT's ability to capture choices produced by heuristics. Our results highlight an untapped potential of CPT as a measurement tool to characterize the information processing underlying risky choice. Copyright © 2017 Elsevier Inc. All rights reserved.

  12. It looks easy! Heuristics for combinatorial optimization problems.

    PubMed

    Chronicle, Edward P; MacGregor, James N; Ormerod, Thomas C; Burr, Alistair

    2006-04-01

    Human performance on instances of computationally intractable optimization problems, such as the travelling salesperson problem (TSP), can be excellent. We have proposed a boundary-following heuristic to account for this finding. We report three experiments with TSPs where the capacity to employ this heuristic was varied. In Experiment 1, participants free to use the heuristic produced solutions significantly closer to optimal than did those prevented from doing so. Experiments 2 and 3 together replicated this finding in larger problems and demonstrated that a potential confound had no effect. In all three experiments, performance was closely matched by a boundary-following model. The results implicate global rather than purely local processes. Humans may have access to simple, perceptually based, heuristics that are suited to some combinatorial optimization tasks.
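
    Boundary following can be loosely imitated by visiting points in angular order around their centroid, which traces the outer boundary for roughly convex point sets (a crude stand-in for the paper's more detailed boundary-following model, shown here only to make the idea concrete):

```python
import math

def boundary_tour(points):
    """Visit points in angular order around the centroid: a crude
    approximation of boundary following (illustrative only; the
    experiments used a more detailed model)."""
    cx = sum(x for x, _ in points) / len(points)
    cy = sum(y for _, y in points) / len(points)
    return sorted(points, key=lambda p: math.atan2(p[1] - cy, p[0] - cx))

def tour_length(tour):
    """Total length of the closed tour (math.dist requires Python 3.8+)."""
    return sum(math.dist(tour[i], tour[(i + 1) % len(tour)])
               for i in range(len(tour)))
```

    On the four corners of a unit square, whatever the input order, the tour follows the perimeter (length 4).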

  13. Heuristics as Bayesian inference under extreme priors.

    PubMed

    Parpart, Paula; Jones, Matt; Love, Bradley C

    2018-05-01

    Simple heuristics are often regarded as tractable decision strategies because they ignore a great deal of information in the input data. One puzzle is why heuristics can outperform full-information models, such as linear regression, which make full use of the available information. These "less-is-more" effects, in which a relatively simpler model outperforms a more complex model, are prevalent throughout cognitive science, and are frequently argued to demonstrate an inherent advantage of simplifying computation or ignoring information. In contrast, we show at the computational level (where algorithmic restrictions are set aside) that it is never optimal to discard information. Through a formal Bayesian analysis, we prove that popular heuristics, such as tallying and take-the-best, are formally equivalent to Bayesian inference under the limit of infinitely strong priors. Varying the strength of the prior yields a continuum of Bayesian models with the heuristics at one end and ordinary regression at the other. Critically, intermediate models perform better across all our simulations, suggesting that down-weighting information with the appropriate prior is preferable to entirely ignoring it. Rather than because of their simplicity, our analyses suggest heuristics perform well because they implement strong priors that approximate the actual structure of the environment. We end by considering how new heuristics could be derived by infinitely strengthening the priors of other Bayesian models. These formal results have implications for work in psychology, machine learning and economics. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
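
    One of the heuristics analysed, tallying, is trivial to state in code; the point of the analysis above is that this simple rule coincides with Bayesian (regularised) regression in the limit of an infinitely strong prior that forces all cue weights to be equal:

```python
def tally(cues_a, cues_b):
    """Tallying: give every cue unit weight and choose the option with
    more positive cues; cue validities and magnitudes are ignored."""
    score_a, score_b = sum(cues_a), sum(cues_b)
    if score_a > score_b:
        return "A"
    if score_b > score_a:
        return "B"
    return "tie"
```

    The paper's intermediate models sit between this extreme and ordinary regression, and those intermediate prior strengths performed best in its simulations.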

  14. Heuristics for the Hodgkin-Huxley system.

    PubMed

    Hoppensteadt, Frank

    2013-09-01

    Hodgkin and Huxley (HH) discovered that voltages control ionic currents in nerve membranes. This led them to describe electrical activity in a neuronal membrane patch in terms of an electronic circuit whose characteristics were determined using empirical data. Due to the complexity of this model, a variety of heuristics, including relaxation oscillator circuits and integrate-and-fire models, have been used to investigate activity in neurons, and these simpler models have been successful in suggesting experiments and explaining observations. Connections between most of the simpler models had not been made clear until recently. Shown here are connections between these heuristics and the full HH model. In particular, we study a new model (Type III circuit): It includes the van der Pol-based models; it can be approximated by a simple integrate-and-fire model; and it creates voltages and currents that correspond, respectively, to the h and V components of the HH system. Copyright © 2012 Elsevier Inc. All rights reserved.

  15. Not so fast! (and not so frugal!): rethinking the recognition heuristic.

    PubMed

    Oppenheimer, Daniel M

    2003-11-01

    The 'fast and frugal' approach to reasoning (Gigerenzer, G., & Todd, P. M. (1999). Simple heuristics that make us smart. New York: Oxford University Press) claims that individuals use non-compensatory strategies in judgment--the idea that only one cue is taken into account in reasoning. The simplest and most important of these heuristics postulates that judgment sometimes relies solely on recognition. However, the studies that have investigated usage of the recognition heuristic have confounded recognition with other cues that could also lead to similar judgments. This paper tests whether mere recognition is actually driving the findings in support of the recognition heuristic. Two studies provide evidence that judgments do not conform to the recognition heuristic when these confounds are accounted for. Implications for the study of simple heuristics are discussed.

  16. A computational approach to animal breeding.

    PubMed

    Berger-Wolf, Tanya Y; Moore, Cristopher; Saia, Jared

    2007-02-07

    We propose a computational model of mating strategies for controlled animal breeding programs. A mating strategy in a controlled breeding program is a heuristic with some optimization criteria as a goal. Thus, it is appropriate to use the computational tools available for analysis of optimization heuristics. In this paper, we propose the first discrete model of the controlled animal breeding problem and analyse heuristics for two possible objectives: (1) breeding for maximum diversity and (2) breeding a target individual. These two goals are representative of conservation biology and agricultural livestock management, respectively. We evaluate several mating strategies and provide upper and lower bounds for the expected number of matings. While the population parameters may vary and can change the actual number of matings for a particular strategy, the order of magnitude of the number of expected matings and the relative competitiveness of the mating heuristics remains the same. Thus, our simple discrete model of the animal breeding problem provides a novel viable and robust approach to designing and comparing breeding strategies in captive populations.

  17. Aggregate age-at-marriage patterns from individual mate-search heuristics.

    PubMed

    Todd, Peter M; Billari, Francesco C; Simão, Jorge

    2005-08-01

    The distribution of age at first marriage shows well-known strong regularities across many countries and recent historical periods. We accounted for these patterns by developing agent-based models that simulate the aggregate behavior of individuals who are searching for marriage partners. Past models assumed fully rational agents with complete knowledge of the marriage market; our simulated agents used psychologically plausible simple heuristic mate search rules that adjust aspiration levels on the basis of a sequence of encounters with potential partners. Substantial individual variation must be included in the models to account for the demographically observed age-at-marriage patterns.
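
    An aspiration-adjusting mate-search heuristic of the general kind described can be sketched as follows (the fixed "adolescence" calibration phase is a simplifying assumption for illustration, not the paper's exact rule):

```python
def marriage_age(qualities, adolescence=5):
    """During an initial 'adolescence' phase, raise the aspiration level
    to the best candidate quality seen; afterwards, 'marry' the first
    candidate who exceeds it. Returns the encounter index at which the
    agent marries, or None if no later candidate clears the aspiration."""
    aspiration = max(qualities[:adolescence])
    for i in range(adolescence, len(qualities)):
        if qualities[i] > aspiration:
            return i
    return None
```

    Simulating many agents with varied adolescence lengths and candidate streams is what produces the aggregate age-at-marriage distribution studied in the paper.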

  18. Heuristic thinking and human intelligence: a commentary on Marewski, Gaissmaier and Gigerenzer.

    PubMed

    Evans, Jonathan St B T; Over, David E

    2010-05-01

    Marewski, Gaissmaier and Gigerenzer (2009) present a review of research on fast and frugal heuristics, arguing that complex problems are best solved by simple heuristics, rather than the application of knowledge and logical reasoning. We argue that the case for such heuristics is overrated. First, we point out that heuristics can often lead to biases as well as effective responding. Second, we show that the application of logical reasoning can be both necessary and relatively simple. Finally, we argue that the evidence for a logical reasoning system that co-exists with simpler heuristic forms of thinking is overwhelming. Not only is it implausible a priori that we would have evolved such a system that is of no use to us, but extensive evidence from the literature on dual processing in reasoning and judgement shows that many problems can only be solved when this form of reasoning is used to inhibit and override heuristic thinking.

  19. Scaling for the SOL/separatrix χ ⊥ following from the heuristic drift model for the power scrape-off layer width

    NASA Astrophysics Data System (ADS)

    Huber, A.; Chankin, A. V.

    2017-06-01

    A simple two-point representation of the tokamak scrape-off layer (SOL) in the conduction-limited regime, based on the parallel and perpendicular energy balance equations in combination with the heat flux width predicted by a heuristic drift-based model, was used to derive a scaling for the cross-field thermal diffusivity χ_⊥. For fixed plasma shape, and neglecting weak power dependences with exponents of 1/8, the scaling χ_⊥ ∝ P_SOL / (n B_θ R²) is derived.

  20. Re-visions of rationality?

    PubMed

    Newell, Ben R

    2005-01-01

    The appeal of simple algorithms that take account of both the constraints of human cognitive capacity and the structure of environments has been an enduring theme in cognitive science. A novel version of such a boundedly rational perspective views the mind as containing an 'adaptive toolbox' of specialized cognitive heuristics suited to different problems. Although intuitively appealing, when this version was proposed, empirical evidence for the use of such heuristics was scant. I argue that in the light of empirical studies carried out since then, it is time this 'vision of rationality' was revised. An alternative view based on integrative models rather than collections of heuristics is proposed.

  21. The recognition heuristic: a review of theory and tests.

    PubMed

    Pachur, Thorsten; Todd, Peter M; Gigerenzer, Gerd; Schooler, Lael J; Goldstein, Daniel G

    2011-01-01

    The recognition heuristic is a prime example of how, by exploiting a match between mind and environment, a simple mental strategy can lead to efficient decision making. The proposal of the heuristic initiated a debate about the processes underlying the use of recognition in decision making. We review research addressing four key aspects of the recognition heuristic: (a) that recognition is often an ecologically valid cue; (b) that people often follow recognition when making inferences; (c) that recognition supersedes further cue knowledge; (d) that its use can produce the less-is-more effect - the phenomenon that lesser states of recognition knowledge can lead to more accurate inferences than more complete states. After we contrast the recognition heuristic to other related concepts, including availability and fluency, we carve out, from the existing findings, some boundary conditions of the use of the recognition heuristic as well as key questions for future research. Moreover, we summarize developments concerning the connection of the recognition heuristic with memory models. We suggest that the recognition heuristic is used adaptively and that, compared to other cues, recognition seems to have a special status in decision making. Finally, we discuss how systematic ignorance is exploited in other cognitive mechanisms (e.g., estimation and preference).
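
    The recognition heuristic itself is a one-line rule; the interest lies in the boundary conditions (a)-(d) reviewed above rather than in the rule. A minimal sketch (the None fallback for "heuristic does not apply" is an assumption of this sketch):

```python
def recognition_choice(recognized_a, recognized_b):
    """If exactly one alternative is recognized, infer that it has the
    higher criterion value; otherwise the heuristic does not apply and
    the decision maker must fall back on knowledge or guessing."""
    if recognized_a and not recognized_b:
        return "A"
    if recognized_b and not recognized_a:
        return "B"
    return None
```

    The less-is-more effect arises because an agent who recognizes everything (or nothing) can never be in the one-recognized case where the cue's ecological validity pays off.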

  22. Speededness and Adaptive Testing

    ERIC Educational Resources Information Center

    van der Linden, Wim J.; Xiong, Xinhui

    2013-01-01

    Two simple constraints on the item parameters in a response--time model are proposed to control the speededness of an adaptive test. As the constraints are additive, they can easily be included in the constraint set for a shadow-test approach (STA) to adaptive testing. Alternatively, a simple heuristic is presented to control speededness in plain…

  23. How Monte Carlo heuristics aid to identify the physical processes of drug release kinetics.

    PubMed

    Lecca, Paola

    2018-01-01

    We implement a Monte Carlo heuristic algorithm to model drug release from a solid dosage form. We show that with Monte Carlo simulations it is possible to identify and explain the causes of the unsatisfactory predictive power of current drug release models. It is well known that the power-law and exponential models, as well as those derived from or inspired by them, accurately reproduce only the first 60% of the release curve of a drug from a dosage form. In this study, using Monte Carlo simulation approaches, we show that these models fit almost the entire release profile quite accurately when the release kinetics is not governed by the coexistence of different physico-chemical mechanisms. We show that the accuracy of the traditional models is comparable with that of Monte Carlo heuristics when these heuristics approximate and oversimplify the phenomenology of drug release. This observation suggests developing and using novel Monte Carlo simulation heuristics able to describe the complexity of the release kinetics, and consequently to generate data more similar to those observed in real experiments. Implementing Monte Carlo simulation heuristics of the drug release phenomenology may be much more straightforward and efficient than hypothesizing and implementing complex mathematical models of the physical processes involved in drug release from scratch. Identifying and understanding through simulation heuristics which processes of this phenomenology reproduce the observed data, and then formalizing them mathematically, may allow avoiding time-consuming, trial-and-error-based regression procedures. Three bullet points highlight the customization of the procedure:
    •An efficient heuristic based on Monte Carlo methods for simulating drug release from a solid dosage form is presented. It specifies the model of the physical process in a simple but accurate way, in the formula of the Monte Carlo Micro Step (MCS) time interval.
    •Given the experimentally observed curve of drug release, we point out how Monte Carlo heuristics can be integrated in an evolutionary algorithmic approach to infer the model of MCS best fitting the observed data, and thus the observed release kinetics.
    •The software implementing the method is written in R, the free language most widely used in the bioinformatics community.
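
    A minimal Monte Carlo micro-step (MCS) release simulation of the kind described, with a single constant per-step escape probability (an assumption of this sketch, which yields first-order exponential-style release; richer MCS rules would encode other mechanisms):

```python
import random

def mc_release_curve(n_particles=10000, p_escape=0.05, steps=100, seed=1):
    """Each MCS, every particle still inside the dosage form escapes
    independently with probability p_escape. Returns the cumulative
    fraction released after each step."""
    rng = random.Random(seed)
    inside = n_particles
    curve = []
    for _ in range(steps):
        escaped = sum(1 for _ in range(inside) if rng.random() < p_escape)
        inside -= escaped
        curve.append(1 - inside / n_particles)
    return curve
```

    Making p_escape depend on time, position, or concentration is where different physico-chemical mechanisms would enter the MCS rule.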

  24. Path integration mediated systematic search: a Bayesian model.

    PubMed

    Vickerstaff, Robert J; Merkle, Tobias

    2012-08-21

    The systematic search behaviour is a backup system that increases the chances of desert ants finding their nest entrance after foraging when the path integrator has failed to guide them home accurately enough. Here we present a mathematical model of the systematic search that is based on extensive behavioural studies in North African desert ants Cataglyphis fortis. First, a simple search heuristic utilising Bayesian inference and a probability density function is developed. This model, which optimises the short-term nest detection probability, is then compared to three simpler search heuristics and to recorded search patterns of Cataglyphis ants. To compare the different searches a method to quantify search efficiency is established as well as an estimate of the error rate in the ants' path integrator. We demonstrate that the Bayesian search heuristic is able to automatically adapt to increasing levels of positional uncertainty to produce broader search patterns, just as desert ants do, and that it outperforms the three other search heuristics tested. The searches produced by it are also arguably the most similar in appearance to the ant's searches. Copyright © 2012 Elsevier Ltd. All rights reserved.
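
    A one-dimensional toy version of such a Bayesian search heuristic (the paper's model is two-dimensional and calibrated to ant data; the grid size, detection probability, and Gaussian prior here are illustrative assumptions):

```python
import math

def search_sequence(grid_size=11, sigma=2.0, p_detect=0.5, steps=6):
    """Maintain a probability distribution over nest positions on a 1-D
    grid (Gaussian prior centred on the path integrator's estimate).
    Repeatedly search the most probable cell; after each unsuccessful
    search, Bayes' rule down-weights that cell by (1 - p_detect) and the
    belief renormalises, so the search spreads outward over time.
    Returns the sequence of searched cells."""
    centre = grid_size // 2
    belief = [math.exp(-((i - centre) ** 2) / (2 * sigma ** 2))
              for i in range(grid_size)]
    total = sum(belief)
    belief = [b / total for b in belief]
    visited = []
    for _ in range(steps):
        cell = max(range(grid_size), key=belief.__getitem__)
        visited.append(cell)
        belief[cell] *= (1 - p_detect)  # nest not found there
        z = sum(belief)
        belief = [b / z for b in belief]
    return visited
```

    A broader prior (larger sigma, i.e. a less reliable path integrator) spreads the visited cells over a wider range, mirroring the broader searches the ants produce under higher positional uncertainty.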

  25. Motor heuristics and embodied choices: how to choose and act.

    PubMed

    Raab, Markus

    2017-08-01

    Human performance requires choosing what to do and how to do it. The goal of this theoretical contribution is to advance understanding of how the motor and cognitive components of choices are intertwined. From a holistic perspective I extend simple heuristics that have been tested in cognitive tasks to motor tasks, coining the term motor heuristics. Similarly I extend the concept of embodied cognition, that has been tested in simple sensorimotor processes changing decisions, to complex sport behavior coining the term embodied choices. Thus both motor heuristics and embodied choices explain complex behavior such as studied in sport and exercise psychology. Copyright © 2017 Elsevier Ltd. All rights reserved.

  26. How cognitive heuristics can explain social interactions in spatial movement.

    PubMed

    Seitz, Michael J; Bode, Nikolai W F; Köster, Gerta

    2016-08-01

    The movement of pedestrian crowds is a paradigmatic example of collective motion. The precise nature of individual-level behaviours underlying crowd movements has been subject to a lively debate. Here, we propose that pedestrians follow simple heuristics rooted in cognitive psychology, such as 'stop if another step would lead to a collision' or 'follow the person in front'. In other words, our paradigm explicitly models individual-level behaviour as a series of discrete decisions. We show that our cognitive heuristics produce realistic emergent crowd phenomena, such as lane formation and queuing behaviour. Based on our results, we suggest that pedestrians follow different cognitive heuristics that are selected depending on the context. This differs from the widely used approach of capturing changes in behaviour via model parameters and leads to testable hypotheses on changes in crowd behaviour for different motivation levels. For example, we expect that rushed individuals more often evade to the side and thus display distinct emergent queue formations in front of a bottleneck. Our heuristics can be ranked according to the cognitive effort that is required to follow them. Therefore, our model establishes a direct link between behavioural responses and cognitive effort and thus facilitates a novel perspective on collective behaviour. © 2016 The Author(s).
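
    The "stop if another step would lead to a collision" rule alone already produces queuing in a one-dimensional corridor sketch (illustrative; the paper's model operates in continuous 2-D space with a richer set of heuristics):

```python
def simulate_lane(positions, target, steps):
    """1-D corridor: each step, every pedestrian moves one cell toward
    `target` unless the next cell is occupied ('stop if another step
    would lead to a collision'). A queue emerges behind any blocked
    agent. Returns final positions."""
    positions = list(positions)
    for _ in range(steps):
        occupied = set(positions)
        for i, p in enumerate(positions):
            if p == target:
                continue  # arrived; keeps occupying its cell
            step = 1 if target > p else -1
            if p + step not in occupied:
                occupied.discard(p)
                positions[i] = p + step
                occupied.add(p + step)
    return positions
```

    Two agents heading to the same cell end up queued one behind the other, a discrete analogue of the emergent queue formations described above.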

  7. How cognitive heuristics can explain social interactions in spatial movement

    PubMed Central

    Köster, Gerta

    2016-01-01

    The movement of pedestrian crowds is a paradigmatic example of collective motion. The precise nature of individual-level behaviours underlying crowd movements has been subject to a lively debate. Here, we propose that pedestrians follow simple heuristics rooted in cognitive psychology, such as ‘stop if another step would lead to a collision’ or ‘follow the person in front’. In other words, our paradigm explicitly models individual-level behaviour as a series of discrete decisions. We show that our cognitive heuristics produce realistic emergent crowd phenomena, such as lane formation and queuing behaviour. Based on our results, we suggest that pedestrians follow different cognitive heuristics that are selected depending on the context. This differs from the widely used approach of capturing changes in behaviour via model parameters and leads to testable hypotheses on changes in crowd behaviour for different motivation levels. For example, we expect that rushed individuals more often evade to the side and thus display distinct emergent queue formations in front of a bottleneck. Our heuristics can be ranked according to the cognitive effort that is required to follow them. Therefore, our model establishes a direct link between behavioural responses and cognitive effort and thus facilitates a novel perspective on collective behaviour. PMID:27581483

  8. Combinatorial structures to modeling simple games and applications

    NASA Astrophysics Data System (ADS)

    Molinero, Xavier

    2017-09-01

    We connect three different topics: combinatorial structures, game theory and chemistry. In particular, we establish the bases for representing some simple games, defined as influence games, and molecules, defined from atoms, using combinatorial structures. First, we characterize simple games as influence games using influence graphs, which lets us model simple games as combinatorial structures (from the viewpoint of structures or graphs). Second, we formally define molecules as combinations of atoms, which lets us model molecules as combinatorial structures (from the viewpoint of combinations). It remains open to generate such combinatorial structures using specific techniques such as genetic algorithms, (meta-)heuristic algorithms and parallel programming, among others.

  9. Fast or Frugal, but Not Both: Decision Heuristics under Time Pressure

    ERIC Educational Resources Information Center

    Bobadilla-Suarez, Sebastian; Love, Bradley C.

    2018-01-01

    Heuristics are simple, yet effective, strategies that people use to make decisions. Because heuristics do not require all available information, they are thought to be easy to implement and to not tax limited cognitive resources, which has led heuristics to be characterized as fast-and-frugal. We question this monolithic conception of heuristics…

  10. The Recognition Heuristic: A Review of Theory and Tests

    PubMed Central

    Pachur, Thorsten; Todd, Peter M.; Gigerenzer, Gerd; Schooler, Lael J.; Goldstein, Daniel G.

    2011-01-01

    The recognition heuristic is a prime example of how, by exploiting a match between mind and environment, a simple mental strategy can lead to efficient decision making. The proposal of the heuristic initiated a debate about the processes underlying the use of recognition in decision making. We review research addressing four key aspects of the recognition heuristic: (a) that recognition is often an ecologically valid cue; (b) that people often follow recognition when making inferences; (c) that recognition supersedes further cue knowledge; (d) that its use can produce the less-is-more effect – the phenomenon that lesser states of recognition knowledge can lead to more accurate inferences than more complete states. After we contrast the recognition heuristic to other related concepts, including availability and fluency, we carve out, from the existing findings, some boundary conditions of the use of the recognition heuristic as well as key questions for future research. Moreover, we summarize developments concerning the connection of the recognition heuristic with memory models. We suggest that the recognition heuristic is used adaptively and that, compared to other cues, recognition seems to have a special status in decision making. Finally, we discuss how systematic ignorance is exploited in other cognitive mechanisms (e.g., estimation and preference). PMID:21779266
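
The decision rule reviewed above is simple enough to state in a few lines. This is an illustrative sketch (the city names are made up as example data), not an implementation from the paper:

```python
# Sketch of the recognition heuristic for binary choice: if exactly one of
# two objects is recognized, infer that it scores higher on the criterion;
# otherwise the heuristic does not discriminate and one must guess or use
# further knowledge.

import random

def recognition_heuristic(a, b, recognized):
    """Return the predicted choice between objects a and b."""
    ra, rb = a in recognized, b in recognized
    if ra and not rb:
        return a
    if rb and not ra:
        return b
    # Both or neither recognized: fall back to guessing here.
    return random.choice([a, b])

recognized = {"Munich"}                  # hypothetical recognition memory
recognition_heuristic("Munich", "Herne", recognized)   # → "Munich"
```

The less-is-more effect arises because a person who recognizes everything can never exploit this cue, while a partially ignorant one can.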

  11. Post-game analysis: An initial experiment for heuristic-based resource management in concurrent systems

    NASA Technical Reports Server (NTRS)

    Yan, Jerry C.

    1987-01-01

    In concurrent systems, a major responsibility of the resource management system is to decide how the application program is to be mapped onto the multi-processor. Instead of using abstract program and machine models, a generate-and-test framework known as 'post-game analysis' that is based on data gathered during program execution is proposed. Each iteration consists of (1) (a simulation of) an execution of the program; (2) analysis of the data gathered; and (3) the proposal of a new mapping that would have a smaller execution time. These heuristics are applied to predict execution time changes in response to small perturbations applied to the current mapping. An initial experiment was carried out using simple strategies on 'pipeline-like' applications. The results obtained from four simple strategies demonstrated that for this kind of application, even simple strategies can produce acceptable speed-up with a small number of iterations.
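
The generate-and-test iteration described above can be sketched as a loop. The cost model, the two-processor toy problem, and all names below are assumptions for illustration, not the paper's system:

```python
# Hypothetical sketch of the 'post-game analysis' loop: simulate the current
# mapping, then test small perturbations and keep the first one predicted to
# reduce execution time.

def post_game_analysis(mapping, simulate, perturbations, max_iters=10):
    best_time = simulate(mapping)
    for _ in range(max_iters):
        improved = False
        for new_mapping in perturbations(mapping):
            t = simulate(new_mapping)
            if t < best_time:          # heuristic predicts a speed-up
                mapping, best_time, improved = new_mapping, t, True
                break
        if not improved:               # no perturbation helps: stop
            break
    return mapping, best_time

# Toy instance: five tasks mapped onto two processors; 'simulation' is just
# the makespan of the heavier processor.
loads = [3, 1, 4, 1, 5]

def simulate(m):
    return max(sum(loads[i] for i in range(5) if m[i] == p) for p in (0, 1))

def perturbations(m):                  # move one task to the other processor
    for i in range(len(m)):
        yield m[:i] + (1 - m[i],) + m[i+1:]

mapping, makespan = post_game_analysis((0, 0, 0, 0, 0), simulate, perturbations)
# makespan → 7, the balanced two-processor optimum for these loads
```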

  12. A lack of appetite for information and computation. Simple heuristics in food choice.

    PubMed

    Schulte-Mecklenbeck, Michael; Sohn, Matthias; de Bellis, Emanuel; Martin, Nathalie; Hertwig, Ralph

    2013-12-01

    The predominant, but largely untested, assumption in research on food choice is that people obey the classic commandments of rational behavior: they carefully look up every piece of relevant information, weight each piece according to subjective importance, and then combine them into a judgment or choice. In real world situations, however, the available time, motivation, and computational resources may simply not suffice to keep these commandments. Indeed, there is a large body of research suggesting that human choice is often better accommodated by heuristics-simple rules that enable decision making on the basis of a few, but important, pieces of information. We investigated the prevalence of such heuristics in a computerized experiment that engaged participants in a series of choices between two lunch dishes. Employing MouselabWeb, a process-tracing technique, we found that simple heuristics described an overwhelmingly large proportion of choices, whereas strategies traditionally deemed rational were barely apparent in our data. Replicating previous findings, we also observed that visual stimulus segments received a much larger proportion of attention than any nutritional values did. Our results suggest that, consistent with human behavior in other domains, people make their food choices on the basis of simple and informationally frugal heuristics. Copyright © 2013 Elsevier Ltd. All rights reserved.

  13. Heuristic Bayesian segmentation for discovery of coexpressed genes within genomic regions.

    PubMed

    Pehkonen, Petri; Wong, Garry; Törönen, Petri

    2010-01-01

    Segmentation aims to separate homogeneous areas from the sequential data, and plays a central role in data mining. It has applications ranging from finance to molecular biology, where bioinformatics tasks such as genome data analysis are active application fields. In this paper, we present a novel application of segmentation in locating genomic regions with coexpressed genes. We aim at automated discovery of such regions without requirement for user-given parameters. In order to perform the segmentation within a reasonable time, we use heuristics. Most of the heuristic segmentation algorithms require some decision on the number of segments. This is usually accomplished by using asymptotic model selection methods like the Bayesian information criterion. Such methods are based on some simplification, which can limit their usage. In this paper, we propose a Bayesian model selection to choose the most proper result from heuristic segmentation. Our Bayesian model presents a simple prior for the segmentation solutions with various segment numbers and a modified Dirichlet prior for modeling multinomial data. We show with various artificial data sets in our benchmark system that our model selection criterion has the best overall performance. The application of our method in yeast cell-cycle gene expression data reveals potential active and passive regions of the genome.
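
The asymptotic criterion the abstract contrasts with its Bayesian prior can be made concrete. The Gaussian per-segment model and the toy data below are assumptions for illustration, not the paper's model:

```python
# Illustrative BIC-style selection of the number of segments: score each
# candidate segmentation by penalized log-likelihood and keep the minimum.

import math

def segment_bic(data, boundaries):
    """BIC for a segmentation given as a list of (start, end) index pairs."""
    n, k, log_lik = len(data), len(boundaries), 0.0
    for start, end in boundaries:
        seg = data[start:end]
        mean = sum(seg) / len(seg)
        var = max(sum((x - mean) ** 2 for x in seg) / len(seg), 1e-9)
        log_lik += sum(-0.5 * (math.log(2 * math.pi * var)
                               + (x - mean) ** 2 / var) for x in seg)
    # Two free parameters (mean, variance) per segment; lower BIC is better.
    return 2 * k * math.log(n) - 2 * log_lik

data = [0.1, 0.0, 0.2, 5.1, 5.0, 4.9]   # two homogeneous regions
one = segment_bic(data, [(0, 6)])
two = segment_bic(data, [(0, 3), (3, 6)])
# two < one: BIC prefers the two-segment solution
```

The paper's point is that such asymptotic criteria rest on simplifications, motivating its fully Bayesian alternative for choosing among heuristic segmentations.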

  14. Simulation of empty container logistic management at depot

    NASA Astrophysics Data System (ADS)

    Sze, San-Nah; Sek, Siaw-Ying Doreen; Chiew, Kang-Leng; Tiong, Wei-King

    2017-07-01

    This study focuses on the empty container management problem in a deficit regional area. A deficit area is one with more export activity than import activity, and therefore a persistent shortage of empty containers. This environment challenges trading companies' decision making when distributing empty containers. A simulation model that fits this environment is developed. In addition, a simple heuristic algorithm that considers both hard and soft constraints is proposed to plan the logistics of empty container supply. The feasible route with the minimum cost is then determined by applying the proposed heuristic algorithm. The heuristic algorithm can be divided into three main phases: data sorting, data assigning and time-window updating.
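
A greedy heuristic of the three-phase shape named in the abstract (sorting, assigning, time-window updating) might look as follows. The data model, tuple layouts, and assignment rule are entirely invented for illustration:

```python
# Hedged sketch of a sort-then-assign container heuristic. Demands and
# supplies are (location, quantity, time) tuples on a 1-D coordinate line;
# all of this is a hypothetical data model, not the paper's.

def assign_containers(demands, supplies):
    plan = []
    # Phase 1: sort demands by deadline (most urgent first).
    for loc, qty, deadline in sorted(demands, key=lambda d: d[2]):
        # Phase 2: assign from the nearest depot whose containers are ready
        # in time (a hard constraint stand-in).
        for depot in sorted(supplies, key=lambda s: abs(s[0] - loc)):
            d_loc, d_qty, ready = depot
            take = min(qty, d_qty)
            if take > 0 and ready <= deadline:
                plan.append((d_loc, loc, take))
                idx = supplies.index(depot)
                # Phase 3: update the depot's remaining stock; a fuller model
                # would also advance its time window here.
                supplies[idx] = (d_loc, d_qty - take, ready)
                qty -= take
            if qty == 0:
                break
    return plan

demands = [(10, 2, 5), (0, 1, 3)]
supplies = [(1, 2, 0), (9, 2, 0)]
assign_containers(demands, supplies)   # → [(1, 0, 1), (9, 10, 2)]
```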

  15. A Computer Model of Simple Forms of Learning.

    ERIC Educational Resources Information Center

    Jones, Thomas L.

    A basic unsolved problem in science is that of understanding learning, the process by which people and machines use their experience in a situation to guide future action in similar situations. The ideas of Piaget, Pavlov, Hull, and other learning theorists, as well as previous heuristic programing models of human intelligence, stimulated this…

  16. Cognitive niches: an ecological model of strategy selection.

    PubMed

    Marewski, Julian N; Schooler, Lael J

    2011-07-01

    How do people select among different strategies to accomplish a given task? Across disciplines, the strategy selection problem represents a major challenge. We propose a quantitative model that predicts how selection emerges through the interplay among strategies, cognitive capacities, and the environment. This interplay carves out for each strategy a cognitive niche, that is, a limited number of situations in which the strategy can be applied, simplifying strategy selection. To illustrate our proposal, we consider selection in the context of 2 theories: the simple heuristics framework and the ACT-R (adaptive control of thought-rational) architecture of cognition. From the heuristics framework, we adopt the thesis that people make decisions by selecting from a repertoire of simple decision strategies that exploit regularities in the environment and draw on cognitive capacities, such as memory and time perception. ACT-R provides a quantitative theory of how these capacities adapt to the environment. In 14 simulations and 10 experiments, we consider the choice between strategies that operate on the accessibility of memories and those that depend on elaborate knowledge about the world. Based on Internet statistics, our model quantitatively predicts people's familiarity with and knowledge of real-world objects, the distributional characteristics of the associated speed of memory retrieval, and the cognitive niches of classic decision strategies, including those of the fluency, recognition, integration, lexicographic, and sequential-sampling heuristics. In doing so, the model specifies when people will be able to apply different strategies and how accurate, fast, and effortless people's decisions will be.

  17. A Planar Quasi-Static Constraint Mode Tire Model

    DTIC Science & Technology

    2015-07-10

    This model strikes a balance between simple, heuristic tire models (such as a linear point-follower) that lack the fidelity to make accurate chassis load predictions and computationally intensive models. [Report front matter: A Planar Quasi-Static Constraint Mode Tire Model; Rui Ma, John B. Ferris; UNCLASSIFIED: Distribution Statement A, cleared for public release.]

  18. Family practitioners' diagnostic decision-making processes regarding patients with respiratory tract infections: an observational study.

    PubMed

    Fischer, Thomas; Fischer, Susanne; Himmel, Wolfgang; Kochen, Michael M; Hummers-Pradier, Eva

    2008-01-01

    The influence of patient characteristics on family practitioners' (FPs') diagnostic decision making has mainly been investigated using indirect methods such as vignettes or questionnaires. Direct observation-borrowed from social and cultural anthropology-may be an alternative method for describing FPs' real-life behavior and may help in gaining insight into how FPs diagnose respiratory tract infections, which are frequent in primary care. To clarify FPs' diagnostic processes when treating patients suffering from symptoms of respiratory tract infection. This direct observation study was performed in 30 family practices using a checklist for patient complaints, history taking, physical examination, and diagnoses. The influence of patients' symptoms and complaints on the FPs' physical examination and diagnosis was calculated by logistic regression analyses. Dummy variables based on combinations of symptoms and complaints were constructed and tested against saturated (full) and backward regression models. In total, 273 patients (median age 37 years, 51% women) were included. The median number of symptoms described was 4 per patient, and most information was provided at the patients' own initiative. Multiple logistic regression analysis showed a strong association between patients' complaints and the physical examination. Frequent diagnoses were upper respiratory tract infection (URTI)/common cold (43%), bronchitis (26%), sinusitis (12%), and tonsillitis (11%). There were no significant statistical differences between "simple heuristic" models and saturated regression models in the diagnoses of bronchitis, sinusitis, and tonsillitis, indicating that simple heuristics are probably used by the FPs, whereas "URTI/common cold" was better explained by the full model. FPs tended to make their diagnosis based on a few patient symptoms and a limited physical examination. Simple heuristic models were almost as powerful in explaining most diagnoses as saturated models. Direct observation allowed for the study of decision making under real conditions, yielding both quantitative data and "qualitative" information about the FPs' performance. It is important for investigators to be aware of the specific disadvantages of the method (e.g., a possible observer effect).

  19. Design and usability of heuristic-based deliberation tools for women facing amniocentesis.

    PubMed

    Durand, Marie-Anne; Wegwarth, Odette; Boivin, Jacky; Elwyn, Glyn

    2012-03-01

    Evidence suggests that in decision contexts characterized by uncertainty and time constraints (e.g. health-care decisions), fast and frugal decision-making strategies (heuristics) may perform better than complex rules of reasoning. To examine whether it is possible to design deliberation components in decision support interventions using simple models (fast and frugal heuristics). The 'Take The Best' heuristic (i.e. selection of a 'most important reason') and 'The Tallying' integration algorithm (i.e. unitary weighing of pros and cons) were used to develop two deliberation components embedded in a Web-based decision support intervention for women facing amniocentesis testing. Ten researchers (recruited from 15), nine health-care providers (recruited from 28) and ten pregnant women (recruited from 14) who had recently been offered amniocentesis testing appraised evolving versions of 'your most important reason' (Take The Best) and 'weighing it up' (Tallying). Most researchers found the tools useful in facilitating decision making although emphasized the need for simple instructions and clear layouts. Health-care providers however expressed concerns regarding the usability and clarity of the tools. By contrast, 7 out of 10 pregnant women found the tools useful in weighing up the pros and cons of each option, helpful in structuring and clarifying their thoughts and visualizing their decision efforts. Several pregnant women felt that 'weighing it up' and 'your most important reason' were not appropriate when facing such a difficult and emotional decision. Theoretical approaches based on fast and frugal heuristics can be used to develop deliberation tools that provide helpful support to patients facing real-world decisions about amniocentesis. © 2011 Blackwell Publishing Ltd.
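
The two strategies behind the deliberation tools can be sketched directly: 'Take The Best' decides on the single most valid discriminating cue, while 'Tallying' counts unit-weighted pros and cons. The cue data below are hypothetical, not the study's materials:

```python
# Illustrative versions of the 'Take The Best' and 'Tallying' strategies.

def take_the_best(cues_a, cues_b, validity_order):
    """Cues are dicts of cue_name -> bool; return 'a', 'b', or None."""
    for cue in validity_order:             # inspect the most valid cue first
        if cues_a[cue] != cues_b[cue]:     # first discriminating cue decides
            return "a" if cues_a[cue] else "b"
    return None                            # no cue discriminates

def tallying(pros, cons):
    """Unit weights: decide by the sign of #pros - #cons."""
    score = len(pros) - len(cons)
    return "accept" if score > 0 else "decline" if score < 0 else "tie"

a = {"risk_low": True, "cost_low": False}
b = {"risk_low": False, "cost_low": True}
take_the_best(a, b, ["risk_low", "cost_low"])          # → "a"
tallying(["peace of mind"], ["procedure risk", "cost"])  # → "decline"
```

Note the contrast: Take The Best is non-compensatory (one reason decides, the rest are ignored), whereas Tallying integrates all reasons but weighs them equally.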

  20. Simple heuristics in over-the-counter drug choices: a new hint for medical education and practice.

    PubMed

    Riva, Silvia; Monti, Marco; Antonietti, Alessandro

    2011-01-01

    Over-the-counter (OTC) drugs are widely available and often purchased by consumers without advice from a health care provider. Many people rely on self-management of medications to treat common medical conditions. Although OTC medications are regulated by the National and the International Health and Drug Administration, many people are unaware of proper dosing, side effects, adverse drug reactions, and possible medication interactions. This study examined how subjects make their decisions to select an OTC drug, evaluating the role of cognitive heuristics which are simple and adaptive rules that help the decision-making process of people in everyday contexts. By analyzing 70 subjects' information-search and decision-making behavior when selecting OTC drugs, we examined the heuristics they applied in order to assess whether simple decision-making processes were also accurate and relevant. Subjects were tested with a sequence of two experimental tests based on a computerized Java system devised to analyze participants' choices in a virtual environment. We found that subjects' information-search behavior reflected the use of fast and frugal heuristics. In addition, although the heuristics which correctly predicted subjects' decisions implied significantly fewer cues on average than the subjects did in the information-search task, they were accurate in describing order of information search. A simple combination of a fast and frugal tree and a tallying rule predicted more than 78% of subjects' decisions. The current emphasis in health care is to shift some responsibility onto the consumer through expansion of self medication. To know which cognitive mechanisms are behind the choice of OTC drugs is becoming a relevant purpose of current medical education. 
These findings have implications both for the validity of simple heuristics describing information searches in the field of OTC drug choices and for current medical education, which has to prepare competent health specialists to orientate and support the choices of their patients.

  1. Simple heuristics in over-the-counter drug choices: a new hint for medical education and practice

    PubMed Central

    Riva, Silvia; Monti, Marco; Antonietti, Alessandro

    2011-01-01

    Introduction: Over-the-counter (OTC) drugs are widely available and often purchased by consumers without advice from a health care provider. Many people rely on self-management of medications to treat common medical conditions. Although OTC medications are regulated by the National and the International Health and Drug Administration, many people are unaware of proper dosing, side effects, adverse drug reactions, and possible medication interactions. Purpose: This study examined how subjects make their decisions to select an OTC drug, evaluating the role of cognitive heuristics which are simple and adaptive rules that help the decision-making process of people in everyday contexts. Subjects and methods: By analyzing 70 subjects' information-search and decision-making behavior when selecting OTC drugs, we examined the heuristics they applied in order to assess whether simple decision-making processes were also accurate and relevant. Subjects were tested with a sequence of two experimental tests based on a computerized Java system devised to analyze participants' choices in a virtual environment. Results: We found that subjects' information-search behavior reflected the use of fast and frugal heuristics. In addition, although the heuristics which correctly predicted subjects' decisions implied significantly fewer cues on average than the subjects did in the information-search task, they were accurate in describing order of information search. A simple combination of a fast and frugal tree and a tallying rule predicted more than 78% of subjects' decisions. Conclusion: The current emphasis in health care is to shift some responsibility onto the consumer through expansion of self medication. To know which cognitive mechanisms are behind the choice of OTC drugs is becoming a relevant purpose of current medical education. These findings have implications both for the validity of simple heuristics describing information searches in the field of OTC drug choices and for current medical education, which has to prepare competent health specialists to orientate and support the choices of their patients. PMID:23745077

  2. Parallel constraint satisfaction in memory-based decisions.

    PubMed

    Glöckner, Andreas; Hodges, Sara D

    2011-01-01

    Three studies sought to investigate decision strategies in memory-based decisions and to test the predictions of the parallel constraint satisfaction (PCS) model for decision making (Glöckner & Betsch, 2008). Time pressure was manipulated and the model was compared against simple heuristics (take the best and equal weight) and a weighted additive strategy. From PCS we predicted that fast intuitive decision making is based on compensatory information integration and that decision time increases and confidence decreases with increasing inconsistency in the decision task. In line with these predictions we observed a predominant usage of compensatory strategies under all time-pressure conditions and even with decision times as short as 1.7 s. For a substantial number of participants, choices and decision times were best explained by PCS, but there was also evidence for use of simple heuristics. The time-pressure manipulation did not significantly affect decision strategies. Overall, the results highlight intuitive, automatic processes in decision making and support the idea that human information-processing capabilities are less severely bounded than often assumed.

  3. Hierarchical emotion calculation model for virtual human modelling - biomed 2010.

    PubMed

    Zhao, Yue; Wright, David

    2010-01-01

    This paper introduces a new emotion generation method for virtual human modelling. The method includes a novel hierarchical emotion structure, a group of emotion calculation equations and a simple heuristics decision-making mechanism, which enables virtual humans to perform emotionally in real time according to their internal and external factors. The emotion calculation equations used in this research were derived from psychological emotion measurements. Virtual humans can utilise the information in virtual memory and the emotion calculation equations to generate their own numerical emotion states within the hierarchical emotion structure. Those emotion states are important internal references for virtual humans to adopt appropriate behaviours and also key cues for their decision making. A simple heuristics theory is introduced and integrated into the decision-making process in order to make the virtual humans' decision making more like that of a real human. A data interface which connects the emotion calculation and the decision-making structure together has also been designed and simulated to test the method in the Virtools environment.

  4. Memory-Based Decision-Making with Heuristics: Evidence for a Controlled Activation of Memory Representations

    ERIC Educational Resources Information Center

    Khader, Patrick H.; Pachur, Thorsten; Meier, Stefanie; Bien, Siegfried; Jost, Kerstin; Rosler, Frank

    2011-01-01

    Many of our daily decisions are memory based, that is, the attribute information about the decision alternatives has to be recalled. Behavioral studies suggest that for such decisions we often use simple strategies (heuristics) that rely on controlled and limited information search. It is assumed that these heuristics simplify decision-making by…

  5. Conflict and bias in heuristic judgment.

    PubMed

    Bhatia, Sudeep

    2017-02-01

    Conflict has been hypothesized to play a key role in recruiting deliberative processing in reasoning and judgment tasks. This claim suggests that changing the task so as to add incorrect heuristic responses that conflict with existing heuristic responses can make individuals less likely to respond heuristically and can increase response accuracy. We tested this prediction in experiments involving judgments of argument strength and word frequency, and found that participants are more likely to avoid heuristic bias and respond correctly in settings with 2 incorrect heuristic response options compared with similar settings with only 1 heuristic response option. Our results provide strong evidence for conflict as a mechanism influencing the interaction between heuristic and deliberative thought, and illustrate how accuracy can be increased through simple changes to the response sets offered to participants. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  6. Heuristic decision making in medicine

    PubMed Central

    Marewski, Julian N.; Gigerenzer, Gerd

    2012-01-01

    Can less information be more helpful when it comes to making medical decisions? Contrary to the common intuition that more information is always better, the use of heuristics can help both physicians and patients to make sound decisions. Heuristics are simple decision strategies that ignore part of the available information, basing decisions on only a few relevant predictors. We discuss: (i) how doctors and patients use heuristics; and (ii) when heuristics outperform information-greedy methods, such as regressions in medical diagnosis. Furthermore, we outline those features of heuristics that make them useful in health care settings. These features include their surprising accuracy, transparency, and wide accessibility, as well as the low costs and little time required to employ them. We close by explaining one of the statistical reasons why heuristics are accurate, and by pointing to psychiatry as one area for future research on heuristics in health care. PMID:22577307

  7. Reconsidering "evidence" for fast-and-frugal heuristics.

    PubMed

    Hilbig, Benjamin E

    2010-12-01

    In several recent reviews, authors have argued for the pervasive use of fast-and-frugal heuristics in human judgment. They have provided an overview of heuristics and have reiterated findings corroborating that such heuristics can be very valid strategies leading to high accuracy. They also have reviewed previous work that implies that simple heuristics are actually used by decision makers. Unfortunately, concerning the latter point, these reviews appear to be somewhat incomplete. More important, previous conclusions have been derived from investigations that bear some noteworthy methodological limitations. I demonstrate these by proposing a new heuristic and provide some novel critical findings. Also, I review some of the relevant literature often not-or only partially-considered. Overall, although some fast-and-frugal heuristics indeed seem to predict behavior at times, there is little to no evidence for others. More generally, the empirical evidence available does not warrant the conclusion that heuristics are pervasively used.

  8. Automatic Generation of Heuristics for Scheduling

    NASA Technical Reports Server (NTRS)

    Morris, Robert A.; Bresina, John L.; Rodgers, Stuart M.

    1997-01-01

    This paper presents a technique, called GenH, that automatically generates search heuristics for scheduling problems. The impetus for developing this technique is the growing consensus that heuristics encode advice that is, at best, useful in solving most, or typical, problem instances, and, at worst, useful in solving only a narrowly defined set of instances. In either case, heuristic problem solvers, to be broadly applicable, should have a means of automatically adjusting to the idiosyncrasies of each problem instance. GenH generates a search heuristic for a given problem instance by hill-climbing in the space of possible multi-attribute heuristics, where the evaluation of a candidate heuristic is based on the quality of the solution found under its guidance. We present empirical results obtained by applying GenH to the real world problem of telescope observation scheduling. These results demonstrate that GenH is a simple and effective way of improving the performance of an heuristic scheduler.
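
The hill-climbing step at the heart of GenH can be sketched abstractly. The weight-vector representation, neighbourhood, and toy objective below are assumptions for illustration, not GenH's actual encoding:

```python
# Hypothetical sketch of hill-climbing over weighted multi-attribute
# heuristics: neighbours tweak one attribute weight, and a candidate is
# scored by the quality of the schedule found under its guidance.

def hill_climb(weights, evaluate, step=0.1, max_iters=100):
    best = evaluate(weights)
    for _ in range(max_iters):
        improved = False
        for i in range(len(weights)):
            for delta in (step, -step):
                cand = list(weights)
                cand[i] += delta
                score = evaluate(cand)
                if score > best:           # better solution quality: keep it
                    weights, best, improved = cand, score, True
        if not improved:                   # local optimum reached
            break
    return weights, best

# Stand-in for 'schedule quality under this heuristic': a smooth objective
# whose optimum is at weights (1, -2).
def evaluate(w):
    return -(w[0] - 1) ** 2 - (w[1] + 2) ** 2

w, best = hill_climb([0.0, 0.0], evaluate)   # w converges near (1.0, -2.0)
```

In the real system, `evaluate` would run the scheduler under the candidate heuristic, which is why the search adapts to each problem instance.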

  9. Heuristic decision making in medicine.

    PubMed

    Marewski, Julian N; Gigerenzer, Gerd

    2012-03-01

    Can less information be more helpful when it comes to making medical decisions? Contrary to the common intuition that more information is always better, the use of heuristics can help both physicians and patients to make sound decisions. Heuristics are simple decision strategies that ignore part of the available information, basing decisions on only a few relevant predictors. We discuss: (i) how doctors and patients use heuristics; and (ii) when heuristics outperform information-greedy methods, such as regressions in medical diagnosis. Furthermore, we outline those features of heuristics that make them useful in health care settings. These features include their surprising accuracy, transparency, and wide accessibility, as well as the low costs and little time required to employ them. We close by explaining one of the statistical reasons why heuristics are accurate, and by pointing to psychiatry as one area for future research on heuristics in health care.

  10. Heuristics: foundations for a novel approach to medical decision making.

    PubMed

    Bodemer, Nicolai; Hanoch, Yaniv; Katsikopoulos, Konstantinos V

    2015-03-01

    Medical decision-making is a complex process that often takes place during uncertainty, that is, when knowledge, time, and resources are limited. How can we ensure good decisions? We present research on heuristics-simple rules of thumb-and discuss how medical decision-making can benefit from these tools. We challenge the common view that heuristics are only second-best solutions by showing that they can be more accurate, faster, and easier to apply in comparison to more complex strategies. Using the example of fast-and-frugal decision trees, we illustrate how heuristics can be studied and implemented in the medical context. Finally, we suggest how a heuristic-friendly culture supports the study and application of heuristics as complementary strategies to existing decision rules.
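
A fast-and-frugal decision tree of the kind mentioned above asks a short sequence of yes/no cue questions, each with one possible exit. The cue names below loosely echo the well-known coronary-care example from this literature, but the rule here is illustrative only, not a clinical instrument:

```python
# Sketch of a fast-and-frugal tree (FFT): each level checks one cue and
# either exits with a decision or passes to the next cue.

def fft_classify(patient, tree):
    """tree: list of (cue, exit_if_true, exit_if_false); an exit of None
    means 'ask the next cue'."""
    for cue, if_true, if_false in tree:
        exit_ = if_true if patient[cue] else if_false
        if exit_ is not None:
            return exit_
    return "undecided"

tree = [
    ("st_changes", "coronary care", None),      # exit only on 'yes'
    ("chest_pain_main", None, "regular ward"),  # exit only on 'no'
    ("other_symptoms", "coronary care", "regular ward"),  # final exit
]
fft_classify({"st_changes": True}, tree)        # → "coronary care"
```

The tree is transparent and cheap to apply at the bedside, which is exactly the case the abstract makes for heuristic-friendly decision rules.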

  11. Uncertainty about social interactions leads to the evolution of social heuristics.

    PubMed

    van den Berg, Pieter; Wenseleers, Tom

    2018-05-31

    Individuals face many types of social interactions throughout their lives, but they often cannot perfectly assess what the consequences of their actions will be. Although it is known that unpredictable environments can profoundly affect the evolutionary process, it remains unclear how uncertainty about the nature of social interactions shapes the evolution of social behaviour. Here, we present an evolutionary simulation model, showing that even intermediate uncertainty leads to the evolution of simple cooperation strategies that disregard information about the social interaction ('social heuristics'). Moreover, our results show that the evolution of social heuristics can greatly affect cooperation levels, nearly doubling cooperation rates in our simulations. These results provide new insight into why social behaviour, including cooperation in humans, is often observed to be seemingly suboptimal. More generally, our results show that social behaviour that seems maladaptive when considered in isolation may actually be well-adapted to a heterogeneous and uncertain world.

  12. Vervet monkeys use paths consistent with context-specific spatial movement heuristics.

    PubMed

    Teichroeb, Julie A

    2015-10-01

    Animal foraging routes are analogous to the computationally demanding "traveling salesman problem" (TSP), where individuals must find the shortest path among several locations before returning to the start. Humans approximate solutions to TSPs using simple heuristics or "rules of thumb," but our knowledge of how other animals solve multidestination routing problems is incomplete. Most nonhuman primate species have shown limited ability to route plan. However, captive vervets were shown to solve a TSP for six sites. These results were consistent with either planning three steps ahead or a risk-avoidance strategy. I investigated how wild vervet monkeys (Chlorocebus pygerythrus) solved a path problem with six equally rewarding food sites, where the site arrangement allowed assessment of whether vervets found the shortest route and/or used paths consistent with one of three simple heuristics to navigate. Single vervets took the shortest possible path in fewer than half of the trials, usually in ways consistent with the most efficient heuristic (the convex hull). When in competition, vervets' paths were consistent with different, more efficient heuristics dependent on their dominance rank (a cluster strategy for dominants and the nearest neighbor rule for subordinates). These results suggest that, like humans, vervets may solve multidestination routing problems by applying simple, adaptive, context-specific "rules of thumb." The heuristics that were consistent with vervet paths in this study are the same as some of those asserted to be used by humans. These spatial movement strategies may have common evolutionary roots and be part of a universal mental navigational toolkit. Alternatively, they may have emerged through convergent evolution as the optimal way to solve multidestination routing problems.
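    The nearest neighbor rule mentioned above is easy to state concretely: always move next to the closest unvisited site. A minimal sketch (the site coordinates in the test are arbitrary, not the study's layout):

```python
import math

def nearest_neighbor_route(sites, start):
    """Nearest-neighbor routing heuristic: from the current location,
    repeatedly visit the closest unvisited site until all are visited."""
    unvisited = set(sites) - {start}
    route, current = [start], start
    while unvisited:
        # pick the unvisited site at minimum Euclidean distance
        nxt = min(unvisited, key=lambda s: math.dist(current, s))
        route.append(nxt)
        unvisited.remove(nxt)
        current = nxt
    return route
```

    Greedily choosing the nearest site can yield a longer total path than the optimal tour, which is why it counts as a heuristic rather than an exact solution method.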

  13. Fast or Frugal, but Not Both: Decision Heuristics Under Time Pressure

    PubMed Central

    2017-01-01

    Heuristics are simple, yet effective, strategies that people use to make decisions. Because heuristics do not require all available information, they are thought to be easy to implement and to not tax limited cognitive resources, which has led heuristics to be characterized as fast-and-frugal. We question this monolithic conception of heuristics by contrasting the cognitive demands of two popular heuristics, Tallying and Take-the-Best. We contend that heuristics that are frugal in terms of information usage may not always be fast because of the attentional control required to implement this focus in certain contexts. In support of this hypothesis, we find that Take-the-Best, while being more frugal in terms of information usage, is slower to implement and fares worse under time pressure manipulations than Tallying. This effect is then reversed when search costs for Take-the-Best are reduced by changing the format of the stimuli. These findings suggest that heuristics are heterogeneous and should be unpacked according to their cognitive demands to determine the circumstances in which a heuristic best applies. PMID:28557503

  14. Fast or frugal, but not both: Decision heuristics under time pressure.

    PubMed

    Bobadilla-Suarez, Sebastian; Love, Bradley C

    2018-01-01

    Heuristics are simple, yet effective, strategies that people use to make decisions. Because heuristics do not require all available information, they are thought to be easy to implement and to not tax limited cognitive resources, which has led heuristics to be characterized as fast-and-frugal. We question this monolithic conception of heuristics by contrasting the cognitive demands of two popular heuristics, Tallying and Take-the-Best. We contend that heuristics that are frugal in terms of information usage may not always be fast because of the attentional control required to implement this focus in certain contexts. In support of this hypothesis, we find that Take-the-Best, while being more frugal in terms of information usage, is slower to implement and fares worse under time pressure manipulations than Tallying. This effect is then reversed when search costs for Take-the-Best are reduced by changing the format of the stimuli. These findings suggest that heuristics are heterogeneous and should be unpacked according to their cognitive demands to determine the circumstances in which a heuristic best applies. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
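    The two heuristics contrasted in these studies can each be written down in a few lines. The binary cue profiles in the test are illustrative, not the experimental stimuli:

```python
def take_the_best(a, b, cue_order):
    """Take-the-Best: inspect cues in validity order and decide on the
    first cue that discriminates between the two options; guess if none do."""
    for cue in cue_order:
        if a[cue] != b[cue]:
            return "a" if a[cue] > b[cue] else "b"
    return "guess"

def tallying(a, b, cues):
    """Tallying: count positive cues for each option with unit weights
    and choose the option with the larger tally."""
    ta, tb = sum(a[c] for c in cues), sum(b[c] for c in cues)
    return "a" if ta > tb else "b" if tb > ta else "guess"
```

    Take-the-Best is more frugal (it stops at the first discriminating cue), but it requires maintaining and searching the cue validity order, which is the kind of attentional cost the paper highlights.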

  15. Assessing the empirical validity of the "take-the-best" heuristic as a model of human probabilistic inference.

    PubMed

    Bröder, A

    2000-09-01

    The boundedly rational "Take-The-Best" heuristic (TTB) was proposed by G. Gigerenzer, U. Hoffrage, and H. Kleinbölting (1991) as a model of fast and frugal probabilistic inferences. Although the simple lexicographic rule proved to be successful in computer simulations, direct empirical demonstrations of its adequacy as a psychological model are lacking because of several methodological problems. In 4 experiments with a total of 210 participants, this question was addressed. Whereas Experiment 1 showed that TTB is not valid as a universal hypothesis about probabilistic inferences, up to 28% of participants in Experiment 2 and 53% of participants in Experiment 3 were classified as TTB users. Experiment 4 revealed that investment costs for information seem to be a relevant factor leading participants to switch to a noncompensatory TTB strategy. The observed individual differences in strategy use imply the recommendation of an idiographic approach to decision-making research.

  16. Algorithm for parametric community detection in networks.

    PubMed

    Bettinelli, Andrea; Hansen, Pierre; Liberti, Leo

    2012-07-01

    Modularity maximization is extensively used to detect communities in complex networks. It has been shown, however, that this method suffers from a resolution limit: Small communities may be undetectable in the presence of larger ones even if they are very dense. To alleviate this defect, various modifications of the modularity function have been proposed as well as multiresolution methods. In this paper we systematically study a simple model (proposed by Pons and Latapy [Theor. Comput. Sci. 412, 892 (2011)] and similar to the parametric model of Reichardt and Bornholdt [Phys. Rev. E 74, 016110 (2006)]) with a single parameter α that balances the fraction of within community edges and the expected fraction of edges according to the configuration model. An exact algorithm is proposed to find optimal solutions for all values of α as well as the corresponding successive intervals of α values for which they are optimal. This algorithm relies upon a routine for exact modularity maximization and is limited to moderate size instances. An agglomerative hierarchical heuristic is therefore proposed to address parametric modularity detection in large networks. At each iteration the smallest value of α for which it is worthwhile to merge two communities of the current partition is found. Then merging is performed and the data are updated accordingly. An implementation is proposed with the same time and space complexity as the well-known Clauset-Newman-Moore (CNM) heuristic [Phys. Rev. E 70, 066111 (2004)]. Experimental results on artificial and real world problems show that (i) communities are detected by both exact and heuristic methods for all values of the parameter α; (ii) the dendrogram summarizing the results of the heuristic method provides a useful tool for substantive analysis, as illustrated particularly on a Les Misérables data set; (iii) the difference between the parametric modularity values given by the exact method and those given by the heuristic is moderate; (iv) the heuristic version of the proposed parametric method, viewed as a modularity maximization tool, gives better results than the CNM heuristic for large instances.
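    The single-parameter objective described above trades off the observed within-community edge fraction against its expectation under the configuration model; at α = 1 it reduces to standard modularity. A minimal sketch of evaluating such an objective for a given partition (the exact weighting in the Pons-Latapy formulation may differ in detail):

```python
def parametric_modularity(edges, part, alpha):
    """Evaluate a parametric modularity Q(alpha): for each community,
    (fraction of edges inside it) minus alpha times the expected fraction
    under the configuration model, (d_c / 2m)^2."""
    m = len(edges)
    deg = {}
    for u, v in edges:
        deg[u] = deg.get(u, 0) + 1
        deg[v] = deg.get(v, 0) + 1
    q = 0.0
    for c in set(part.values()):
        within = sum(1 for u, v in edges if part[u] == c and part[v] == c)
        dc = sum(d for node, d in deg.items() if part[node] == c)
        q += within / m - alpha * (dc / (2 * m)) ** 2
    return q
```

    Sweeping α then reveals partitions at different resolutions: larger α penalizes large communities more heavily, favoring finer partitions.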

  17. Toward a Definition of the Engineering Method.

    ERIC Educational Resources Information Center

    Koen, Billy Vaughn

    1984-01-01

    Defines the engineering method by: (1) giving a preliminary definition and examples of its essential term (heuristics); (2) comparing the definition to a popular alternative; and (3) presenting a simple form of the definition. This definition states that the engineering method is the use of engineering heuristics. (JN)

  18. Neural correlates of strategic reasoning during competitive games.

    PubMed

    Seo, Hyojung; Cai, Xinying; Donahue, Christopher H; Lee, Daeyeol

    2014-10-17

    Although human and animal behaviors are largely shaped by reinforcement and punishment, choices in social settings are also influenced by information about the knowledge and experience of other decision-makers. During competitive games, monkeys increased their payoffs by systematically deviating from a simple heuristic learning algorithm and thereby countering the predictable exploitation by their computer opponent. Neurons in the dorsomedial prefrontal cortex (dmPFC) signaled the animal's recent choice and reward history that reflected the computer's exploitative strategy. The strength of switching signals in the dmPFC also correlated with the animal's tendency to deviate from the heuristic learning algorithm. Therefore, the dmPFC might provide control signals for overriding simple heuristic learning algorithms based on the inferred strategies of the opponent. Copyright © 2014, American Association for the Advancement of Science.

  19. Recipient design in human communication: simple heuristics or perspective taking?

    PubMed

    Blokpoel, Mark; van Kesteren, Marlieke; Stolk, Arjen; Haselager, Pim; Toni, Ivan; van Rooij, Iris

    2012-01-01

    Humans have a remarkable capacity for tuning their communicative behaviors to different addressees, a phenomenon also known as recipient design. It remains unclear how this tuning of communicative behavior is implemented during live human interactions. Classical theories of communication postulate that recipient design involves perspective taking, i.e., the communicator selects her behavior based on her hypotheses about beliefs and knowledge of the recipient. More recently, researchers have argued that perspective taking is computationally too costly to be a plausible mechanism in everyday human communication. These researchers propose that computationally simple mechanisms, or heuristics, are exploited to perform recipient design. Such heuristics may be able to adapt communicative behavior to an addressee with no consideration for the addressee's beliefs and knowledge. To test whether the simpler of the two mechanisms is sufficient for explaining the "how" of recipient design we studied communicators' behaviors in the context of a non-verbal communicative task (the Tacit Communication Game, TCG). We found that the specificity of the observed trial-by-trial adjustments made by communicators is parsimoniously explained by perspective taking, but not by simple heuristics. This finding is important as it suggests that humans do have a computationally efficient way of taking beliefs and knowledge of a recipient into account.

  20. Recipient design in human communication: simple heuristics or perspective taking?

    PubMed Central

    Blokpoel, Mark; van Kesteren, Marlieke; Stolk, Arjen; Haselager, Pim; Toni, Ivan; van Rooij, Iris

    2012-01-01

    Humans have a remarkable capacity for tuning their communicative behaviors to different addressees, a phenomenon also known as recipient design. It remains unclear how this tuning of communicative behavior is implemented during live human interactions. Classical theories of communication postulate that recipient design involves perspective taking, i.e., the communicator selects her behavior based on her hypotheses about beliefs and knowledge of the recipient. More recently, researchers have argued that perspective taking is computationally too costly to be a plausible mechanism in everyday human communication. These researchers propose that computationally simple mechanisms, or heuristics, are exploited to perform recipient design. Such heuristics may be able to adapt communicative behavior to an addressee with no consideration for the addressee's beliefs and knowledge. To test whether the simpler of the two mechanisms is sufficient for explaining the “how” of recipient design we studied communicators' behaviors in the context of a non-verbal communicative task (the Tacit Communication Game, TCG). We found that the specificity of the observed trial-by-trial adjustments made by communicators is parsimoniously explained by perspective taking, but not by simple heuristics. This finding is important as it suggests that humans do have a computationally efficient way of taking beliefs and knowledge of a recipient into account. PMID:23055960

  1. Simply criminal: predicting burglars' occupancy decisions with a simple heuristic.

    PubMed

    Snook, Brent; Dhami, Mandeep K; Kavanagh, Jennifer M

    2011-08-01

    Rational choice theories of criminal decision making assume that offenders weight and integrate multiple cues when making decisions (i.e., are compensatory). We tested this assumption by comparing how well a compensatory strategy called Franklin's Rule captured burglars' decision policies regarding residence occupancy compared to a non-compensatory strategy (i.e., Matching Heuristic). Forty burglars each decided on the occupancy of 20 randomly selected photographs of residences (for which actual occupancy was known when the photo was taken). Participants also provided open-ended reports on the cues that influenced their decisions in each case, and then rated the importance of eight cues (e.g., deadbolt visible) over all decisions. Burglars predicted occupancy beyond chance levels. The Matching Heuristic was a significantly better predictor of burglars' decisions than Franklin's Rule, and cue use in the Matching Heuristic better corresponded to the cue ecological validities in the environment than cue use in Franklin's Rule. The most important cue in burglars' models was also the most ecologically valid or predictive of actual occupancy (i.e., vehicle present). The majority of burglars correctly identified the most important cue in their models, and the open-ended technique showed greater correspondence between self-reported and captured cue use than the rating-over-decisions technique. Our findings support a limited-rationality perspective on understanding criminal decision making, and have implications for crime prevention.
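    The two decision policies compared in this study can be sketched in a few lines each. The cue names, weights, and threshold below are hypothetical stand-ins (only "vehicle present" and "deadbolt visible" are named in the abstract):

```python
def franklins_rule(cues, weights, threshold=0.5):
    """Franklin's Rule (compensatory): weigh and sum all cue values,
    then compare the total against a decision threshold."""
    score = sum(weights[c] * cues[c] for c in weights)
    return "occupied" if score >= threshold else "unoccupied"

def matching_heuristic(cues, search_order, k=2):
    """Matching Heuristic (noncompensatory): inspect only the k most
    valid cues in order; the first cue that is present triggers an
    'occupied' decision, otherwise default to 'unoccupied'."""
    for cue in search_order[:k]:
        if cues[cue]:
            return "occupied"
    return "unoccupied"
```

    The contrast is that Franklin's Rule lets weaker cues compensate for strong ones, while the Matching Heuristic lets a single high-validity cue settle the decision.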

  2. Automated discovery of local search heuristics for satisfiability testing.

    PubMed

    Fukunaga, Alex S

    2008-01-01

    The development of successful metaheuristic algorithms such as local search for a difficult problem such as satisfiability testing (SAT) is a challenging task. We investigate an evolutionary approach to automating the discovery of new local search heuristics for SAT. We show that several well-known SAT local search algorithms such as Walksat and Novelty are composite heuristics that are derived from novel combinations of a set of building blocks. Based on this observation, we developed CLASS, a genetic programming system that uses a simple composition operator to automatically discover SAT local search heuristics. New heuristics discovered by CLASS are shown to be competitive with the best Walksat variants, including Novelty+. Evolutionary algorithms have previously been applied to directly evolve a solution for a particular SAT instance. We show that the heuristics discovered by CLASS are also competitive with these previous, direct evolutionary approaches for SAT. We also analyze the local search behavior of the learned heuristics using the depth, mobility, and coverage metrics proposed by Schuurmans and Southey.
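    Walksat-style local search, the family of heuristics CLASS composes from building blocks, fits in a short function. This is a generic sketch of the random-walk/greedy-flip scheme, not the CLASS system or any specific published variant:

```python
import random

def walksat(clauses, n_vars, p=0.5, max_flips=10000, seed=0):
    """Minimal Walksat-style local search for CNF SAT. Clauses are lists
    of nonzero ints: literal v means variable v is True, -v means False."""
    rng = random.Random(seed)
    assign = {v: rng.choice([True, False]) for v in range(1, n_vars + 1)}
    sat = lambda lit: assign[abs(lit)] == (lit > 0)
    for _ in range(max_flips):
        unsat = [c for c in clauses if not any(sat(l) for l in c)]
        if not unsat:
            return assign                    # all clauses satisfied
        clause = rng.choice(unsat)           # repair a random unsat clause
        if rng.random() < p:
            var = abs(rng.choice(clause))    # random-walk move
        else:
            # greedy move: flip the variable satisfying the most clauses
            def score(v):
                assign[v] = not assign[v]
                s = sum(any(sat(l) for l in c) for c in clauses)
                assign[v] = not assign[v]
                return s
            var = max((abs(l) for l in clause), key=score)
        assign[var] = not assign[var]
    return None                              # no model found in budget
```

    Variants such as Novelty differ mainly in how they pick the variable to flip inside the chosen unsatisfied clause, which is exactly the slot CLASS fills with evolved expressions.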

  3. Simple heuristic for the viscosity of polydisperse hard spheres

    NASA Astrophysics Data System (ADS)

    Farr, Robert S.

    2014-12-01

    We build on the work of Mooney [J. Colloid Sci. 6, 162 (1951)] to obtain a heuristic analytic approximation to the viscosity of a suspension of any size distribution of hard spheres in a Newtonian solvent. The result agrees reasonably well with rheological data on monodisperse and bidisperse hard spheres, and also provides an approximation to the random close packing fraction of polydisperse spheres. The implied packing fraction is less accurate than that obtained by Farr and Groot [J. Chem. Phys. 131(24), 244104 (2009)], but has the advantage of being quick and simple to evaluate.

  4. Heuristic Identification of Biological Architectures for Simulating Complex Hierarchical Genetic Interactions

    PubMed Central

    Moore, Jason H; Amos, Ryan; Kiralis, Jeff; Andrews, Peter C

    2015-01-01

    Simulation plays an essential role in the development of new computational and statistical methods for the genetic analysis of complex traits. Most simulations start with a statistical model using methods such as linear or logistic regression that specify the relationship between genotype and phenotype. This is appealing due to its simplicity and because these statistical methods are commonly used in genetic analysis. It is our working hypothesis that simulations need to move beyond simple statistical models to more realistically represent the biological complexity of genetic architecture. The goal of the present study was to develop a prototype genotype–phenotype simulation method and software that are capable of simulating complex genetic effects within the context of a hierarchical biology-based framework. Specifically, our goal is to simulate multilocus epistasis or gene–gene interaction where the genetic variants are organized within the framework of one or more genes, their regulatory regions and other regulatory loci. We introduce here the Heuristic Identification of Biological Architectures for simulating Complex Hierarchical Interactions (HIBACHI) method and prototype software for simulating data in this manner. This approach combines a biological hierarchy, a flexible mathematical framework, a liability threshold model for defining disease endpoints, and a heuristic search strategy for identifying high-order epistatic models of disease susceptibility. We provide several simulation examples using genetic models exhibiting independent main effects and three-way epistatic effects. PMID:25395175

  5. The use of simple inflow- and storage-based heuristic equations to represent reservoir behavior in California for investigating human impacts on the water cycle

    NASA Astrophysics Data System (ADS)

    Solander, K.; David, C. H.; Reager, J. T.; Famiglietti, J. S.

    2013-12-01

    The ability to reasonably replicate reservoir behavior in terms of storage and outflow is important for studying the potential human impacts on the terrestrial water cycle. Developing a simple method for this purpose could facilitate subsequent integration in a land surface or global climate model. This study attempts to simulate monthly reservoir outflow and storage using a simple, temporally-varying set of heuristic equations with input consisting of in situ records of reservoir inflow and storage. Equations of increasing complexity relative to the number of parameters involved were tested. Only two parameters were employed in the final equations used to predict outflow and storage in an attempt to best mimic seasonal reservoir behavior while still preserving model parsimony. California reservoirs were selected for model development due to the high level of data availability and intensity of water resource management in this region relative to other areas. Calibration was achieved using observations from eight major reservoirs representing approximately 41% of the 107 largest reservoirs in the state. Parameter optimization was accomplished using the minimum RMSE between observed and modeled storage and outflow as the main objective function. Initial results give a multi-reservoir average correlation coefficient between observed and modeled storage (resp. outflow) of 0.78 (resp. 0.75). These results combined with the simplicity of the equations being used show promise for integration into a land surface or a global climate model. This would be invaluable for evaluations of reservoir management impacts on the flow regime and associated ecosystems as well as on the climate at both regional and global scales.

  6. Design and usability of heuristic‐based deliberation tools for women facing amniocentesis

    PubMed Central

    Durand, Marie‐Anne; Wegwarth, Odette; Boivin, Jacky; Elwyn, Glyn

    2011-01-01

    Background  Evidence suggests that in decision contexts characterized by uncertainty and time constraints (e.g. health‐care decisions), fast and frugal decision‐making strategies (heuristics) may perform better than complex rules of reasoning. Objective  To examine whether it is possible to design deliberation components in decision support interventions using simple models (fast and frugal heuristics). Design  The ‘Take The Best’ heuristic (i.e. selection of a ‘most important reason’) and ‘The Tallying’ integration algorithm (i.e. unitary weighing of pros and cons) were used to develop two deliberation components embedded in a Web‐based decision support intervention for women facing amniocentesis testing. Ten researchers (recruited from 15), nine health‐care providers (recruited from 28) and ten pregnant women (recruited from 14) who had recently been offered amniocentesis testing appraised evolving versions of ‘your most important reason’ (Take The Best) and ‘weighing it up’ (Tallying). Results  Most researchers found the tools useful in facilitating decision making although emphasized the need for simple instructions and clear layouts. Health‐care providers however expressed concerns regarding the usability and clarity of the tools. By contrast, 7 out of 10 pregnant women found the tools useful in weighing up the pros and cons of each option, helpful in structuring and clarifying their thoughts and visualizing their decision efforts. Several pregnant women felt that ‘weighing it up’ and ‘your most important reason’ were not appropriate when facing such a difficult and emotional decision. Conclusion  Theoretical approaches based on fast and frugal heuristics can be used to develop deliberation tools that provide helpful support to patients facing real‐world decisions about amniocentesis. PMID:21241434

  7. Not So Fast! (and Not So Frugal!): Rethinking the Recognition Heuristic

    ERIC Educational Resources Information Center

    Oppenheimer, Daniel M.

    2003-01-01

    The "fast and frugal" approach to reasoning (Gigerenzer, G., & Todd, P. M. (1999). "Simple heuristics that make us smart." New York: Oxford University Press) claims that individuals use non-compensatory strategies in judgment--the idea that only one cue is taken into account in reasoning. The simplest and most important of these heuristics…

  8. Minimizing conflicts: A heuristic repair method for constraint-satisfaction and scheduling problems

    NASA Technical Reports Server (NTRS)

    Minton, Steve; Johnston, Mark; Philips, Andrew; Laird, Phil

    1992-01-01

    This paper describes a simple heuristic approach to solving large-scale constraint satisfaction and scheduling problems. In this approach one starts with an inconsistent assignment for a set of variables and searches through the space of possible repairs. The search can be guided by a value-ordering heuristic, the min-conflicts heuristic, that attempts to minimize the number of constraint violations after each step. The heuristic can be used with a variety of different search strategies. We demonstrate empirically that on the n-queens problem, a technique based on this approach performs orders of magnitude better than traditional backtracking techniques. We also describe a scheduling application where the approach has been used successfully. A theoretical analysis is presented both to explain why this method works well on certain types of problems and to predict when it is likely to be most effective.
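    The min-conflicts repair loop described above is compact enough to sketch directly for n-queens: start from a complete (possibly inconsistent) assignment, then repeatedly move one conflicted queen to a minimum-conflict square in its row.

```python
import random

def min_conflicts_queens(n, max_steps=100000, seed=0):
    """Min-conflicts repair for n-queens; cols[r] is the column of the
    queen in row r. Returns a conflict-free assignment or None."""
    rng = random.Random(seed)
    cols = [rng.randrange(n) for _ in range(n)]   # random initial assignment

    def conflicts(row, col):
        # queens attack on the same column or the same diagonal
        return sum(1 for r in range(n) if r != row and
                   (cols[r] == col or abs(cols[r] - col) == abs(r - row)))

    for _ in range(max_steps):
        conflicted = [r for r in range(n) if conflicts(r, cols[r]) > 0]
        if not conflicted:
            return cols                           # consistent: done
        row = rng.choice(conflicted)              # repair a random conflict
        best = min(conflicts(row, c) for c in range(n))
        cols[row] = rng.choice([c for c in range(n)
                                if conflicts(row, c) == best])
    return None
```

    The random tie-breaking among minimum-conflict columns helps the repair loop escape plateaus, which is part of why this simple scheme scales so far beyond what backtracking manages on n-queens.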

  9. Laser-induced electron dynamics including photoionization: A heuristic model within time-dependent configuration interaction theory.

    PubMed

    Klinkusch, Stefan; Saalfrank, Peter; Klamroth, Tillmann

    2009-09-21

    We report simulations of laser-pulse driven many-electron dynamics by means of a simple, heuristic extension of the time-dependent configuration interaction singles (TD-CIS) approach. The extension allows for the treatment of ionizing states as nonstationary states with a finite, energy-dependent lifetime to account for above-threshold ionization losses in laser-driven many-electron dynamics. The extended TD-CIS method is applied to the following specific examples: (i) state-to-state transitions in the LiCN molecule which correspond to intramolecular charge transfer, (ii) creation of electronic wave packets in LiCN including wave packet analysis by pump-probe spectroscopy, and, finally, (iii) the effect of ionization on the dynamic polarizability of H(2) when calculated nonperturbatively by TD-CIS.

  10. Heuristics and Biases in Military Decision Making

    DTIC Science & Technology

    2010-10-01

    rationality and is based on a linear, step-based model that generates a specific course of action and is useful for the examination of problems that...exhibit stability and are underpinned by assumptions of “technical-rationality.”5 The Army values MDMP as the sanctioned approach for solving...theory) which sought to describe human behavior as a rational maximization of cost-benefit decisions, Kahneman and Tversky provided a simple

  11. Clinical reasoning in the real world is mediated by bounded rationality: implications for diagnostic clinical practice guidelines.

    PubMed

    Bonilauri Ferreira, Ana Paula Ribeiro; Ferreira, Rodrigo Fernando; Rajgor, Dimple; Shah, Jatin; Menezes, Andrea; Pietrobon, Ricardo

    2010-04-20

    Little is known about the reasoning mechanisms used by physicians in decision-making and how this compares to diagnostic clinical practice guidelines. We explored the clinical reasoning process in a real life environment. This is a qualitative study evaluating transcriptions of sixteen physicians' reasoning during appointments with patients, clinical discussions between specialists, and personal interviews with physicians affiliated to a hospital in Brazil. Four main themes were identified: simple and robust heuristics, extensive use of social environment rationality, attempts to prove diagnostic and therapeutic hypotheses while refuting potential contradictions using positive test strategy, and reaching the saturation point. Physicians constantly attempted to prove their initial hypothesis while trying to refute any contradictions. While social environment rationality was the main factor in the determination of all steps of the clinical reasoning process, factors such as referral letters and number of contradictions associated with the initial hypothesis had influence on physicians' confidence and determination of the threshold to reach a final decision. Physicians rely on simple heuristics associated with environmental factors. This model allows for robustness, simplicity, and cognitive energy saving. Since this model does not fit into current diagnostic clinical practice guidelines, we make some propositions to help its integration.

  12. Clinical Reasoning in the Real World Is Mediated by Bounded Rationality: Implications for Diagnostic Clinical Practice Guidelines

    PubMed Central

    Bonilauri Ferreira, Ana Paula Ribeiro; Ferreira, Rodrigo Fernando; Rajgor, Dimple; Shah, Jatin; Menezes, Andrea; Pietrobon, Ricardo

    2010-01-01

    Background Little is known about the reasoning mechanisms used by physicians in decision-making and how this compares to diagnostic clinical practice guidelines. We explored the clinical reasoning process in a real life environment. Method This is a qualitative study evaluating transcriptions of sixteen physicians' reasoning during appointments with patients, clinical discussions between specialists, and personal interviews with physicians affiliated to a hospital in Brazil. Results Four main themes were identified: simple and robust heuristics, extensive use of social environment rationality, attempts to prove diagnostic and therapeutic hypotheses while refuting potential contradictions using positive test strategy, and reaching the saturation point. Physicians constantly attempted to prove their initial hypothesis while trying to refute any contradictions. While social environment rationality was the main factor in the determination of all steps of the clinical reasoning process, factors such as referral letters and number of contradictions associated with the initial hypothesis had influence on physicians' confidence and determination of the threshold to reach a final decision. Discussion Physicians rely on simple heuristics associated with environmental factors. This model allows for robustness, simplicity, and cognitive energy saving. Since this model does not fit into current diagnostic clinical practice guidelines, we make some propositions to help its integration. PMID:20421920

  13. Optimizing Aerobot Exploration of Venus

    NASA Astrophysics Data System (ADS)

    Ford, Kevin S.

    1997-03-01

    Venus Flyer Robot (VFR) is an aerobot: an autonomous balloon probe designed for remote exploration of Earth's sister planet in 2003. VFR's simple navigation and control system permits travel to virtually any location on Venus, but it can survive for only a limited duration in the harsh Venusian environment. To help address this limitation, we develop: (1) a global circulation model that captures the most important characteristics of the Venusian atmosphere; (2) a simple aerobot model that captures thermal restrictions faced by VFR at Venus; and (3) one exact and two heuristic algorithms that, using abstractions (1) and (2), construct routes making the best use of VFR's limited lifetime. We demonstrate this modeling by planning several small example missions and a prototypical mission that explores numerous interesting sites recently documented in the planetary geology literature.

  14. Optimizing Aerobot Exploration of Venus

    NASA Technical Reports Server (NTRS)

    Ford, Kevin S.

    1997-01-01

    Venus Flyer Robot (VFR) is an aerobot: an autonomous balloon probe designed for remote exploration of Earth's sister planet in 2003. VFR's simple navigation and control system permits travel to virtually any location on Venus, but it can survive for only a limited duration in the harsh Venusian environment. To help address this limitation, we develop: (1) a global circulation model that captures the most important characteristics of the Venusian atmosphere; (2) a simple aerobot model that captures thermal restrictions faced by VFR at Venus; and (3) one exact and two heuristic algorithms that, using abstractions (1) and (2), construct routes making the best use of VFR's limited lifetime. We demonstrate this modeling by planning several small example missions and a prototypical mission that explores numerous interesting sites recently documented in the planetary geology literature.

  15. Deterministic diffusion in flower-shaped billiards.

    PubMed

    Harayama, Takahisa; Klages, Rainer; Gaspard, Pierre

    2002-08-01

    We propose a flower-shaped billiard in order to study the irregular parameter dependence of chaotic normal diffusion. Our model is an open system consisting of periodically distributed obstacles in the shape of a flower, and it is strongly chaotic for almost all parameter values. We compute the parameter dependent diffusion coefficient of this model from computer simulations and analyze its functional form using different schemes, all generalizing the simple random walk approximation of Machta and Zwanzig. The improved methods we use are based either on heuristic higher-order corrections to the simple random walk model, on lattice gas simulation methods, or they start from a suitable Green-Kubo formula for diffusion. We show that dynamical correlations, or memory effects, are of crucial importance in reproducing the precise parameter dependence of the diffusion coefficient.
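
    The simple random walk baseline that the above schemes refine can be illustrated with a toy estimate: for an unbiased walk on a 2D lattice, the mean square displacement grows as ⟨r²⟩ = 4Dt, so D can be read off from simulated trajectories. The sketch below is not the authors' billiard model; the step and walker counts are arbitrary choices, and a unit-length hop gives a true value of D = 1/4.

```python
import random

def estimate_diffusion_2d(steps, walkers, seed=0):
    """Estimate the diffusion coefficient D of an unbiased 2D lattice walk
    from the mean square displacement, using <r^2> = 4 * D * t."""
    rng = random.Random(seed)
    moves = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    msd = 0.0
    for _ in range(walkers):
        x = y = 0
        for _ in range(steps):
            dx, dy = rng.choice(moves)
            x += dx
            y += dy
        msd += x * x + y * y
    msd /= walkers              # average squared displacement over walkers
    return msd / (4.0 * steps)  # invert <r^2> = 4 D t with unit time steps

d_est = estimate_diffusion_2d(steps=200, walkers=2000)
```

    Machta and Zwanzig's approximation replaces the free walk with hops between traps governed by a mean residence time; the higher-order corrections mentioned in the abstract add memory effects on top of such a walk.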

  16. Heuristic extraction of rules in pruned artificial neural networks models used for quantifying highly overlapping chromatographic peaks.

    PubMed

    Hervás, César; Silva, Manuel; Serrano, Juan Manuel; Orejuela, Eva

    2004-01-01

    The suitability of an approach for extracting heuristic rules from trained artificial neural networks (ANNs) pruned by a regularization method and with architectures designed by evolutionary computation for quantifying highly overlapping chromatographic peaks is demonstrated. The ANN input data are estimated by the Levenberg-Marquardt method in the form of a four-parameter Weibull curve associated with the profile of the chromatographic band. To test this approach, two N-methylcarbamate pesticides, carbofuran and propoxur, were quantified using a classic peroxyoxalate chemiluminescence reaction as a detection system for chromatographic analysis. Straightforward network topologies (one and two outputs models) allow the analytes to be quantified in concentration ratios ranging from 1:7 to 5:1 with an average standard error of prediction for the generalization test of 2.7 and 2.3% for carbofuran and propoxur, respectively. The reduced dimensions of the selected ANN architectures, especially those obtained after using heuristic rules, allowed simple quantification equations to be developed that transform the input variables into output variables. These equations can be easily interpreted from a chemical point of view to attain quantitative analytical information regarding the effect of both analytes on the characteristics of chromatographic bands, namely profile, dispersion, peak height, and residence time. Copyright 2004 American Chemical Society

  17. Exploring the Impact of Early Decisions in Variable Ordering for Constraint Satisfaction Problems.

    PubMed

    Ortiz-Bayliss, José Carlos; Amaya, Ivan; Conant-Pablos, Santiago Enrique; Terashima-Marín, Hugo

    2018-01-01

    When solving constraint satisfaction problems (CSPs), it is common practice to rely on heuristics to decide which variable should be instantiated at each stage of the search. However, this ordering influences the search cost. Even so, and to the best of our knowledge, no earlier work has dealt with how first variable orderings affect the overall cost. In this paper, we explore the cost of finding high-quality orderings of variables within constraint satisfaction problems. We also study differences among the orderings produced by some commonly used heuristics and the way bad first decisions affect the search cost. One of the most important findings of this work confirms the paramount importance of first decisions. Another is the evidence that many existing variable ordering heuristics fail to appropriately select the first variable to instantiate. We propose a simple method to improve the early decisions of heuristics; by using it, their performance increases.
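
    The sensitivity to the first instantiated variable can be seen even in a toy solver. The sketch below is illustrative only — the wheel-shaped constraint graph, the degree-based ordering of the remaining variables, and the node-count cost proxy are our assumptions, not the paper's method — but it shows that two different first choices yield different search costs on the same problem.

```python
def backtrack_cost(neighbors, colors, order):
    """Backtracking graph colorer for a fixed variable order.
    Returns (solved, nodes), where nodes counts candidate assignments
    tried -- a simple proxy for search cost."""
    assignment = {}
    nodes = 0

    def consistent(v, c):
        return all(assignment.get(u) != c for u in neighbors[v])

    def recurse(i):
        nonlocal nodes
        if i == len(order):
            return True
        v = order[i]
        for c in range(colors):
            nodes += 1
            if consistent(v, c):
                assignment[v] = c
                if recurse(i + 1):
                    return True
                del assignment[v]
        return False

    return recurse(0), nodes

# Wheel graph: hub 0 joined to every vertex of a 6-cycle (1..6).
neighbors = {0: {1, 2, 3, 4, 5, 6}}
for i in range(1, 7):
    neighbors[i] = {0, 1 + i % 6, 1 + (i - 2) % 6}

def order_from(first):
    # Fix the first variable, then order the rest by decreasing degree.
    rest = sorted((v for v in neighbors if v != first),
                  key=lambda v: -len(neighbors[v]))
    return [first] + rest

solved_hub, cost_hub = backtrack_cost(neighbors, 3, order_from(0))  # hub first
solved_rim, cost_rim = backtrack_cost(neighbors, 3, order_from(1))  # rim first
```

    Both orderings solve this easy instance, but they visit different numbers of search nodes, which is the effect the paper quantifies on much harder instances.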

  18. Exploring the Impact of Early Decisions in Variable Ordering for Constraint Satisfaction Problems

    PubMed Central

    Amaya, Ivan

    2018-01-01

    When solving constraint satisfaction problems (CSPs), it is common practice to rely on heuristics to decide which variable should be instantiated at each stage of the search. However, this ordering influences the search cost. Even so, and to the best of our knowledge, no earlier work has dealt with how first variable orderings affect the overall cost. In this paper, we explore the cost of finding high-quality orderings of variables within constraint satisfaction problems. We also study differences among the orderings produced by some commonly used heuristics and the way bad first decisions affect the search cost. One of the most important findings of this work confirms the paramount importance of first decisions. Another is the evidence that many existing variable ordering heuristics fail to appropriately select the first variable to instantiate. We propose a simple method to improve the early decisions of heuristics; by using it, their performance increases. PMID:29681923

  19. Enhancements to the Engine Data Interpretation System (EDIS)

    NASA Technical Reports Server (NTRS)

    Hofmann, Martin O.

    1993-01-01

    The Engine Data Interpretation System (EDIS) expert system project assists the data review personnel at NASA/MSFC in performing post-test data analysis and engine diagnosis of the Space Shuttle Main Engine (SSME). EDIS uses knowledge of the engine, its components, and simple thermodynamic principles instead of, and in addition to, heuristic rules gathered from the engine experts. EDIS reasons in cooperation with human experts, following roughly the pattern of logic exhibited by human experts. EDIS concentrates on steady-state static faults, such as small leaks, and component degradations, such as pump efficiencies. The objective of this contract was to complete the set of engine component models, integrate heuristic rules into EDIS, integrate the Power Balance Model into EDIS, and investigate modification of the qualitative reasoning mechanisms to allow 'fuzzy' value classification. The result of this contract is an operational version of EDIS. EDIS will become a module of the Post-Test Diagnostic System (PTDS) and will, in this context, provide system-level diagnostic capabilities which integrate component-specific findings provided by other modules.

  1. The min-conflicts heuristic: Experimental and theoretical results

    NASA Technical Reports Server (NTRS)

    Minton, Steven; Philips, Andrew B.; Johnston, Mark D.; Laird, Philip

    1991-01-01

    This paper describes a simple heuristic method for solving large-scale constraint satisfaction and scheduling problems. Given an initial assignment for the variables in a problem, the method operates by searching through the space of possible repairs. The search is guided by an ordering heuristic, the min-conflicts heuristic, that attempts to minimize the number of constraint violations after each step. We demonstrate empirically that the method performs orders of magnitude better than traditional backtracking techniques on certain standard problems. For example, the one million queens problem can be solved rapidly using our approach. We also describe practical scheduling applications where the method has been successfully applied. A theoretical analysis is presented to explain why the method works so well on certain types of problems and to predict when it is likely to be most effective.
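
    The repair loop described above can be sketched for the n-queens problem. This is a minimal illustration in the spirit of the paper, not the authors' implementation; the step budget and the random tie-breaking are arbitrary choices.

```python
import random

def min_conflicts_queens(n, max_steps=100_000, seed=0):
    """Solve n-queens by min-conflicts repair. queens[c] is the row of the
    queen in column c, so column conflicts cannot occur by construction."""
    rng = random.Random(seed)
    queens = [rng.randrange(n) for _ in range(n)]  # initial assignment

    def conflicts(col, row):
        # Queens in other columns sharing this row or a diagonal.
        return sum(1 for c in range(n) if c != col and
                   (queens[c] == row or
                    abs(queens[c] - row) == abs(c - col)))

    for _ in range(max_steps):
        conflicted = [c for c in range(n) if conflicts(c, queens[c]) > 0]
        if not conflicted:
            return queens  # every constraint satisfied
        col = rng.choice(conflicted)
        # Repair: move this queen to a row minimizing conflicts,
        # breaking ties at random to avoid cycling.
        counts = [conflicts(col, r) for r in range(n)]
        best = min(counts)
        queens[col] = rng.choice([r for r, v in enumerate(counts)
                                  if v == best])
    return None  # step budget exhausted

solution = min_conflicts_queens(20)
```

    The same repair-based search scales to far larger instances, which is the basis of the million-queens result cited in the abstract.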

  2. Connections between survey calibration estimators and semiparametric models for incomplete data

    PubMed Central

    Lumley, Thomas; Shaw, Pamela A.; Dai, James Y.

    2012-01-01

    Survey calibration (or generalized raking) estimators are a standard approach to the use of auxiliary information in survey sampling, improving on the simple Horvitz–Thompson estimator. In this paper we relate the survey calibration estimators to the semiparametric incomplete-data estimators of Robins and coworkers, and to adjustment for baseline variables in a randomized trial. The development based on calibration estimators explains the ‘estimated weights’ paradox and provides useful heuristics for constructing practical estimators. We present some examples of using calibration to gain precision without making additional modelling assumptions in a variety of regression models. PMID:23833390

  3. Prediction of power requirements for a longwall armored face conveyor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Broadfoot, A.R.; Betz, R.E.

    1995-12-31

    Longwall armored face conveyors (AFCs) have traditionally been designed using a combination of heuristics and simple models. However, as longwalls increase in length these design procedures are proving to be inadequate. The result has either been costly loss of production due to AFC stalling or component failure, or larger than necessary capital investment due to overdesign. In order to allow accurate estimation of the power requirements for an AFC this paper develops a comprehensive model of all the friction forces associated with the AFC. Power requirement predictions obtained from these models are then compared with measurements from two mine faces.

  4. Influence maximization based on partial network structure information: A comparative analysis on seed selection heuristics

    NASA Astrophysics Data System (ADS)

    Erkol, Şirag; Yücel, Gönenç

    In this study, the problem of seed selection is investigated. This problem is mainly treated as an optimization problem, which has been proved to be NP-hard. There are several heuristic approaches in the literature which mostly use algorithmic heuristics. These approaches mainly focus on the trade-off between computational complexity and accuracy. Although the accuracy of algorithmic heuristics is high, they also have high computational complexity. Furthermore, in the literature, it is generally assumed that complete information on the structure and features of a network is available, which is not the case most of the time. For the study, a simulation model is constructed, which is capable of creating networks, performing seed selection heuristics, and simulating diffusion models. Novel metric-based seed selection heuristics that rely only on partial information are proposed and tested using the simulation model. These heuristics use local information available from nodes in the synthetically created networks. The performances of heuristics are comparatively analyzed on three different network types. The results clearly show that the performance of a heuristic depends on the structure of a network. A heuristic to be used should be selected after investigating the properties of the network at hand. More importantly, the partial-information approach provided promising results. In certain cases, selection heuristics that rely only on partial network information perform very close to similar heuristics that require complete network data.

  5. A Heuristic Probabilistic Approach to Estimating Size-Dependent Mobility of Nonuniform Sediment

    NASA Astrophysics Data System (ADS)

    Woldegiorgis, B. T.; Wu, F. C.; van Griensven, A.; Bauwens, W.

    2017-12-01

    Simulating the mechanism of bed sediment mobility is essential for modelling sediment dynamics. Although many studies have been carried out on this subject, they use complex mathematical formulations that are computationally expensive and often not easy to implement. In order to present a simple and computationally efficient complement to detailed sediment mobility models, we developed a heuristic probabilistic approach to estimating the size-dependent mobilities of nonuniform sediment based on the pre- and post-entrainment particle size distributions (PSDs), assuming that the PSDs are lognormally distributed. The approach fits a lognormal probability density function (PDF) to the pre-entrainment PSD of bed sediment and uses the threshold particle size of incipient motion and the concept of sediment mixture to estimate the PSDs of the entrained sediment and post-entrainment bed sediment. The new approach is simple in a physical sense and significantly reduces the complexity, computation time, and resources required by detailed sediment mobility models. It is calibrated and validated with laboratory and field data by comparing to the size-dependent mobilities predicted with the existing empirical lognormal cumulative distribution function (CDF) approach. The novel features of the current approach are: (1) separating the entrained and non-entrained sediments by a threshold particle size, which is a modified critical particle size of incipient motion accounting for mixed-size effects, and (2) using the mixture-based pre- and post-entrainment PSDs to provide a continuous estimate of the size-dependent sediment mobility.
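
    A sharp-threshold caricature of this idea is easy to write down: with a lognormal pre-entrainment PSD, the mass in each size class finer than the threshold size is entrained and the rest stays in the bed. The sketch below is a toy illustration only — the bin edges, lognormal parameters, and the all-or-nothing threshold are our assumptions, while the paper's mixture-based estimate is a continuous refinement of this picture.

```python
import math

def lognorm_cdf(d, mu, sigma):
    """CDF of a lognormal grain-size distribution at diameter d."""
    return 0.5 * (1.0 + math.erf((math.log(d) - mu)
                                 / (sigma * math.sqrt(2.0))))

def class_mobility(edges, mu, sigma, d_threshold):
    """Mobilized mass fraction per size class under a sharp threshold:
    everything finer than d_threshold is entrained, coarser stays put."""
    mobility = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        total = lognorm_cdf(hi, mu, sigma) - lognorm_cdf(lo, mu, sigma)
        moved = (lognorm_cdf(min(hi, d_threshold), mu, sigma)
                 - lognorm_cdf(min(lo, d_threshold), mu, sigma))
        mobility.append(max(0.0, moved) / total if total > 0 else 0.0)
    return mobility

# Sand-range size classes in mm; threshold size of incipient motion 0.3 mm.
edges = [0.0625, 0.125, 0.25, 0.5, 1.0, 2.0]
mobility = class_mobility(edges, mu=math.log(0.35), sigma=0.6,
                          d_threshold=0.3)
```

    The resulting per-class mobility is 1 for classes wholly below the threshold, 0 above it, and partial for the class the threshold cuts through; the paper's approach smooths this step by accounting for mixed-size effects.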

  6. Game Theory in the Social Studies Classroom

    ERIC Educational Resources Information Center

    Vesperman, Dean Patrick; Clark, Chris H.

    2016-01-01

    This article explores using game theory in social studies classrooms as a heuristic to aid students in understanding strategic decision making. The authors provide examples of several simple games teachers can use. Next, they address how to help students design their own simple (2 × 2) games.
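
    As a concrete illustration of the kind of 2 × 2 game the article discusses, the sketch below finds the pure-strategy Nash equilibria of a payoff table. The prisoner's dilemma payoffs used here are a standard textbook choice, not taken from the article.

```python
def pure_nash_2x2(payoffs):
    """payoffs[(i, j)] = (row player's payoff, column player's payoff)
    for row strategy i and column strategy j (each 0 or 1). A cell is a
    pure Nash equilibrium if neither player gains by deviating alone."""
    equilibria = []
    for i in (0, 1):
        for j in (0, 1):
            row_ok = payoffs[(i, j)][0] >= payoffs[(1 - i, j)][0]
            col_ok = payoffs[(i, j)][1] >= payoffs[(i, 1 - j)][1]
            if row_ok and col_ok:
                equilibria.append((i, j))
    return equilibria

# Prisoner's dilemma: strategy 0 = cooperate, 1 = defect.
pd = {(0, 0): (3, 3), (0, 1): (0, 5),
      (1, 0): (5, 0), (1, 1): (1, 1)}
equilibria = pure_nash_2x2(pd)  # mutual defection is the unique equilibrium
```

    Students can change the four payoff pairs and rerun the check to see how equilibria shift, which mirrors the design exercise the article proposes.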

  7. A simple homogeneous model for regular and irregular metallic wire media samples

    NASA Astrophysics Data System (ADS)

    Kosulnikov, S. Y.; Mirmoosa, M. S.; Simovski, C. R.

    2018-02-01

    To simplify the solution of electromagnetic problems with wire media samples, it is reasonable to treat them as samples of a homogeneous material without spatial dispersion. Accounting for spatial dispersion implies additional boundary conditions and makes the solution of boundary problems difficult, especially if the sample is not an infinitely extended layer. Moreover, for a novel type of wire media - arrays of randomly tilted wires - a spatially dispersive model has not been developed. Here, we introduce a simple heuristic model of wire media samples shaped as bricks. Our model covers wire media of both regularly and irregularly stretched wires.

  8. Analysis of albedo versus cloud fraction relationships in liquid water clouds using heuristic models and large eddy simulation

    NASA Astrophysics Data System (ADS)

    Feingold, Graham; Balsells, Joseph; Glassmeier, Franziska; Yamaguchi, Takanobu; Kazil, Jan; McComiskey, Allison

    2017-07-01

    The relationship between the albedo of a cloudy scene A and cloud fraction fc is studied with the aid of heuristic models of stratocumulus and cumulus clouds. Existing work has shown that scene albedo increases monotonically with increasing cloud fraction but that the relationship varies from linear to superlinear. The reasons for these differences in functional dependence are traced to the relationship between cloud deepening and cloud widening. When clouds deepen with no significant increase in fc (e.g., in solid stratocumulus), the relationship between A and fc is linear. When clouds widen as they deepen, as in cumulus cloud fields, the relationship is superlinear. A simple heuristic model of a cumulus cloud field with a power law size distribution shows that the superlinear A-fc behavior is traced out either through random variation in cloud size distribution parameters or as the cloud field oscillates between a relative abundance of small clouds (steep slopes on a log-log plot) and a relative abundance of large clouds (flat slopes). Oscillations of this kind manifest in large eddy simulation of trade wind cumulus where the slope and intercept of the power law fit to the cloud size distribution are highly correlated. Further analysis of the large eddy model-generated cloud fields suggests that cumulus clouds grow larger and deeper as their underlying plumes aggregate; this is followed by breakup of large plumes and a tendency to smaller clouds. The cloud and thermal size distributions oscillate back and forth approximately in unison.
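
    The linear-versus-superlinear distinction can be reproduced with a very small model. In the sketch below, cloud optical depth is mapped to cloud albedo with a two-stream-like form τ/(τ + 7.7), and a coupling exponent controls whether clouds deepen as cloud fraction grows. All parameter values are illustrative assumptions, not the authors' heuristic models.

```python
def scene_albedo(fc, coupling, a_clear=0.05, tau0=10.0):
    """Scene albedo A at cloud fraction fc. coupling = 0: optical depth is
    independent of fc (deepening without widening), giving a linear A-fc
    relation; coupling > 0 ties depth to fc, giving a superlinear one."""
    tau = tau0 * fc ** coupling
    a_cloud = tau / (tau + 7.7)  # two-stream-like optical depth -> albedo
    return fc * a_cloud + (1.0 - fc) * a_clear

# Midpoint albedo under the two regimes, for comparison with the chord.
linear_mid = scene_albedo(0.5, coupling=0.0)
coupled_mid = scene_albedo(0.5, coupling=1.0)
```

    With no coupling the midpoint falls exactly on the chord between A(0) and A(1) (linear); with coupling it falls below the chord, the convex, superlinear behavior the abstract attributes to cumulus fields that widen as they deepen.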

  9. Heuristics of reasoning and analogy in children's visual perspective taking.

    PubMed

    Yaniv, I; Shatz, M

    1990-10-01

    We propose that children's reasoning about others' visual perspectives is guided by simple heuristics based on a perceiver's line of sight and salient features of the object met by that line. In 3 experiments employing a 2-perceiver analogy task, children aged 3-6 were generally better able to reproduce a perceiver's perspective if a visual cue in the perceiver's line of sight sufficed to distinguish it from alternatives. Children had greater difficulty when the task hinged on attending to configural cues. Availability of distinctive cues affixed on the objects' sides facilitated solution of the symmetrical orientations. These and several other related findings reported in the literature are traced to children's reliance on heuristics of reasoning.

  10. We favor formal models of heuristics rather than lists of loose dichotomies: a reply to Evans and Over

    PubMed Central

    Gigerenzer, Gerd

    2009-01-01

    In their comment on Marewski et al. (good judgments do not require complex cognition, 2009) Evans and Over (heuristic thinking and human intelligence: a commentary on Marewski, Gaissmaier and Gigerenzer, 2009) conjectured that heuristics can often lead to biases and are not error free. This is a most surprising critique. The computational models of heuristics we have tested allow for quantitative predictions of how many errors a given heuristic will make, and we and others have measured the amount of error by analysis, computer simulation, and experiment. This is clear progress over simply giving heuristics labels, such as availability, that do not allow for quantitative comparisons of errors. Evans and Over argue that the reason people rely on heuristics is the accuracy-effort trade-off. However, the comparison between heuristics and more effortful strategies, such as multiple regression, has shown that there are many situations in which a heuristic is more accurate with less effort. Finally, we do not see how the fast and frugal heuristics program could benefit from a dual-process framework unless the dual-process framework is made more precise. Instead, the dual-process framework could benefit if its two “black boxes” (Type 1 and Type 2 processes) were substituted by computational models of both heuristics and other processes. PMID:19784854

  11. Rarity-weighted richness: a simple and reliable alternative to integer programming and heuristic algorithms for minimum set and maximum coverage problems in conservation planning.

    PubMed

    Albuquerque, Fabio; Beier, Paul

    2015-01-01

    Here we report that prioritizing sites in order of rarity-weighted richness (RWR) is a simple, reliable way to identify sites that represent all species in the fewest number of sites (minimum set problem) or to identify sites that represent the largest number of species within a given number of sites (maximum coverage problem). We compared the number of species represented in sites prioritized by RWR to numbers of species represented in sites prioritized by the Zonation software package for 11 datasets in which the size of individual planning units (sites) ranged from <1 ha to 2,500 km². On average, RWR solutions were more efficient than Zonation solutions. Integer programming remains the only guaranteed way to find an optimal solution, and heuristic algorithms remain superior for conservation prioritizations that consider compactness and multiple near-optimal solutions in addition to species representation. But because RWR can be implemented easily and quickly in R or a spreadsheet, it is an attractive alternative to integer programming or heuristic algorithms in some conservation prioritization contexts.
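
    Because RWR needs only a presence-absence table, it fits in a few lines. In this sketch (site and species names are made up; the static scoring shown is the basic form of the approach), each species contributes 1/occupancy to every site where it occurs, and sites are ranked by the resulting score.

```python
def rarity_weighted_richness(presence):
    """presence maps site -> set of species found there. A species that
    occupies few sites contributes more weight to each of those sites."""
    occupancy = {}
    for species in presence.values():
        for s in species:
            occupancy[s] = occupancy.get(s, 0) + 1
    return {site: sum(1.0 / occupancy[s] for s in species)
            for site, species in presence.items()}

def prioritize_sites(presence):
    """Rank sites by decreasing rarity-weighted richness."""
    rwr = rarity_weighted_richness(presence)
    return sorted(presence, key=lambda site: -rwr[site])

# Hypothetical presence-absence data; site and species names are made up.
presence = {
    "A": {"sp1", "sp2", "sp3"},
    "B": {"sp1", "sp4"},  # holds the range-restricted sp4
    "C": {"sp2", "sp3"},
    "D": {"sp1", "sp2"},
}
ranking = prioritize_sites(presence)
```

    Here the site holding the single-occurrence species is ranked first, and the top two sites together cover all four species, which is the minimum-set behavior the abstract describes.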

  12. A detailed comparison of optimality and simplicity in perceptual decision-making

    PubMed Central

    Shen, Shan; Ma, Wei Ji

    2017-01-01

    Two prominent ideas in the study of decision-making have been that organisms behave near-optimally, and that they use simple heuristic rules. These principles might be operating in different types of tasks, but this possibility cannot be fully investigated without a direct, rigorous comparison within a single task. Such a comparison was lacking in most previous studies, because a) the optimal decision rule was simple; b) no simple suboptimal rules were considered; c) it was unclear what was optimal, or d) a simple rule could closely approximate the optimal rule. Here, we used a perceptual decision-making task in which the optimal decision rule is well-defined and complex, and makes qualitatively distinct predictions from many simple suboptimal rules. We find that all simple rules tested fail to describe human behavior, that the optimal rule accounts well for the data, and that several complex suboptimal rules are indistinguishable from the optimal one. Moreover, we found evidence that the optimal model is close to the true model: first, the better the trial-to-trial predictions of a suboptimal model agree with those of the optimal model, the better that suboptimal model fits; second, our estimate of the Kullback-Leibler divergence between the optimal model and the true model is not significantly different from zero. When observers receive no feedback, the optimal model still describes behavior best, suggesting that sensory uncertainty is implicitly represented and taken into account. Beyond the task and models studied here, our results have implications for best practices of model comparison. PMID:27177259

  13. Optimal deployment of thermal energy storage under diverse economic and climate conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DeForest, Nicholas; Mendes, Gonçalo; Stadler, Michael

    2014-04-01

    This paper presents an investigation of the economic benefit of thermal energy storage (TES) for cooling, across a range of economic and climate conditions. Chilled water TES systems are simulated for a large office building in four distinct locations: Miami in the U.S.; Lisbon, Portugal; Shanghai, China; and Mumbai, India. Optimal system size and operating schedules are determined using the optimization model DER-CAM, such that total cost, including electricity and amortized capital costs, is minimized. The economic impacts of each optimized TES system are then compared to systems sized using a simple heuristic method, which bases system size on a fraction (50% and 100%) of total on-peak summer cooling loads. Results indicate that TES systems of all sizes can be effective in reducing annual electricity costs (5%-15%) and peak electricity consumption (13%-33%). The investigation also identifies a number of criteria which drive TES investment, including low capital costs, electricity tariffs with high power demand charges, and prolonged cooling seasons. In locations where these drivers clearly exist, the heuristically sized systems capture much of the value of optimally sized systems; between 60% and 100% in terms of net present value. However, in instances where these drivers are less pronounced, the heuristic tends to oversize systems, and optimization becomes crucial to ensure economically beneficial deployment of TES, increasing the net present value of heuristically sized systems by as much as 10 times in some instances.

  14. A simple model for DSS-14 outage times

    NASA Technical Reports Server (NTRS)

    Rumsey, H. C.; Stevens, R.; Posner, E. C.

    1989-01-01

    A model is proposed to describe DSS-14 outage times. Discrepancy Reporting System outage data for the period from January 1986 through September 1988 are used to estimate the parameters of the model. The model provides a probability distribution for the duration of outages, which agrees well with observed data. The model depends only on a small number of parameters, and has some heuristic justification. This shows that the Discrepancy Reporting System in the Deep Space Network (DSN) can be used to estimate the probability of extended outages in spite of the discrepancy reports ending when the pass ends. The probability of an outage extending beyond the end of a pass is estimated as around 5 percent.

  15. Heuristic-Leadership Model: Adapting to Current Training and Changing Times.

    ERIC Educational Resources Information Center

    Danielson, Mary Ann

    A model was developed for training individuals to adapt better to the changing work environment by focusing on the subordinate to supervisor relationship and providing a heuristic approach to leadership. The model emphasizes a heuristic approach to decision-making through the active participation of both members of the dyad. The demand among…

  16. Prediction of power requirements for a longwall armored face conveyor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Broadfoot, A.R.; Betz, R.E.

    1997-01-01

    Longwall armored face conveyors (AFCs) have traditionally been designed using a combination of heuristics and simple models. However, as longwalls increase in length, these design procedures are proving to be inadequate. The result has either been a costly loss of production due to AFC stalling or component failure, or larger than necessary capital investment due to overdesign. In order to allow accurate estimation of the power requirements for an AFC, this paper develops a comprehensive model of all the friction forces associated with the AFC. Power requirement predictions obtained from these models are then compared with measurements from two mine faces.

  17. Heuristic evaluation of infusion pumps: implications for patient safety in Intensive Care Units.

    PubMed

    Graham, Mark J; Kubose, Tate K; Jordan, Desmond; Zhang, Jiajie; Johnson, Todd R; Patel, Vimla L

    2004-11-01

    The goal of this research was to use a heuristic evaluation methodology to uncover design and interface deficiencies of infusion pumps that are currently in use in Intensive Care Units (ICUs). Because these infusion systems cannot be readily replaced due to lease agreements and large-scale institutional purchasing procedures, we argue that it is essential to systematically identify the existing usability problems so that the possible causes of errors can be better understood, passed on to the end-users (e.g., critical care nurses), and used to make policy recommendations. Four raters conducted the heuristic evaluation of the three-channel infusion pump interface. Three raters had a cognitive science background as well as experience with the heuristic evaluation methodology. The fourth rater was a veteran critical care nurse who had extensive experience operating the pumps. The usability experts and the domain expert independently evaluated the user interface and physical design of the infusion pump and generated a list of heuristic violations based upon a set of 14 heuristics developed in previous research. The lists were compiled and then rated on the severity of the violation. Across the 14 usability heuristics considered in this evaluation of the infusion pump, there were 231 violations. Two heuristics, "Consistency" and "Language", were found to have the most violations. The one with fewest violations was "Document". While some heuristic evaluation categories had more violations than others, the most severe ones were not confined to one type. The primary interface location (e.g., where loading the pump, changing doses, and confirming drug settings takes place) had the most occurrences of heuristic violations. We believe that the heuristic evaluation methodology provides a simple and cost-effective approach to discovering medical device deficiencies that affect a patient's general well-being. While this methodology provides information for the infusion pump designs of the future, it also yields important insights concerning equipment that is currently in use in critical care environments.

  18. Improving performances of suboptimal greedy iterative biclustering heuristics via localization.

    PubMed

    Erten, Cesim; Sözdinler, Melih

    2010-10-15

    Biclustering gene expression data is the problem of extracting submatrices of genes and conditions exhibiting significant correlation across both the rows and the columns of a data matrix of expression values. Even the simplest versions of the problem are computationally hard. Most of the proposed solutions therefore employ greedy iterative heuristics that locally optimize a suitably assigned scoring function. We provide a fast and simple pre-processing algorithm called localization that reorders the rows and columns of the input data matrix in such a way as to group correlated entries in small local neighborhoods within the matrix. The proposed localization algorithm takes its roots from effective use of graph-theoretical methods applied to problems exhibiting a similar structure to that of biclustering. In order to evaluate the effectiveness of the localization pre-processing algorithm, we focus on three representative greedy iterative heuristic methods. We show how the localization pre-processing can be incorporated into each representative algorithm to improve biclustering performance. Furthermore, we propose a simple biclustering algorithm, Random Extraction After Localization (REAL), that randomly extracts submatrices from the localization pre-processed data matrix, eliminates those with low similarity scores, and provides the rest as correlated structures representing biclusters. We compare the proposed localization pre-processing with another pre-processing alternative, non-negative matrix factorization. We show that our fast and simple localization procedure provides similar or even better results than the computationally heavy matrix factorization pre-processing with regard to H-value tests. We next demonstrate that the performances of the three representative greedy iterative heuristic methods improve with localization pre-processing when biological correlations in the form of functional enrichment and PPI verification constitute the main performance criteria. The fact that the random extraction method based on localization (REAL) performs better than the representative greedy heuristic methods under the same criteria also confirms the effectiveness of the suggested pre-processing method. Supplementary material, including code implementations in the LEDA C++ library, experimental data, and results, is available at http://code.google.com/p/biclustering/. Contact: cesim@khas.edu.tr; melihsozdinler@boun.edu.tr. Supplementary data are available at Bioinformatics online.

  19. Approximation algorithms for the min-power symmetric connectivity problem

    NASA Astrophysics Data System (ADS)

    Plotnikov, Roman; Erzin, Adil; Mladenovic, Nenad

    2016-10-01

    We consider the NP-hard problem of synthesizing an optimal spanning communication subgraph in a given arbitrary simple edge-weighted graph. This problem arises in wireless networks when minimizing the total transmission power consumption. We propose several new heuristics based on the variable neighborhood search metaheuristic for the approximate solution of the problem. We performed a numerical experiment in which all proposed algorithms were executed on randomly generated test samples. For these instances, on average, our algorithms outperform the previously known heuristics.
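    For context, the min-power symmetric connectivity objective assigns each node a transmission power equal to the weight of its heaviest incident edge in the chosen spanning subgraph, and minimizes the sum of these powers. A minimal Python scorer for candidate solutions (the names and representation are my own, not the authors'):

```python
def total_power(nodes, tree_edges):
    """Total transmission power of a spanning subgraph: each node must
    reach its farthest tree neighbor, so its power is the maximum weight
    among its incident edges; the objective is the sum over all nodes."""
    power = {v: 0.0 for v in nodes}
    for u, v, w in tree_edges:
        power[u] = max(power[u], w)
        power[v] = max(power[v], w)
    return sum(power.values())
```

A variable neighborhood search would repeatedly perturb the current tree (e.g., by edge swaps in increasingly large neighborhoods) and keep changes that lower this total.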

  20. DETECTORS AND EXPERIMENTAL METHODS: Heuristic approach for peak regions estimation in gamma-ray spectra measured by a NaI detector

    NASA Astrophysics Data System (ADS)

    Zhu, Meng-Hua; Liu, Liang-Gang; You, Zhong; Xu, Ao-Ao

    2009-03-01

    In this paper, a heuristic approach based on Slavic's peak searching method has been employed to estimate the width of peak regions for background removal. Synthetic and experimental data are used to test this method. Using the peak regions estimated by the proposed method over the whole spectrum, we find it simple and effective enough to be used together with the Statistics-sensitive Nonlinear Iterative Peak-Clipping method.

  1. Atomic Dynamics in Simple Liquid: de Gennes Narrowing Revisited

    NASA Astrophysics Data System (ADS)

    Wu, Bin; Iwashita, Takuya; Egami, Takeshi

    2018-03-01

    The de Gennes narrowing phenomenon is frequently observed by neutron or x-ray scattering measurements of the dynamics of complex systems, such as liquids, proteins, colloids, and polymers. The characteristic slowing down of dynamics in the vicinity of the maximum of the total scattering intensity is commonly attributed to enhanced cooperativity. In this Letter, we present an alternative view on its origin through the examination of the time-dependent pair correlation function, the van Hove correlation function, for a model liquid in two, three, and four dimensions. We find that the relaxation time increases monotonically with distance and the dependence on distance varies with dimension. We propose a heuristic explanation of this dependence based on a simple geometrical model. This finding sheds new light on the interpretation of the de Gennes narrowing phenomenon and the α-relaxation time.

  2. New scheduling rules for a dynamic flexible flow line problem with sequence-dependent setup times

    NASA Astrophysics Data System (ADS)

    Kia, Hamidreza; Ghodsypour, Seyed Hassan; Davoudpour, Hamid

    2017-09-01

    In the literature, multi-objective dynamic scheduling problems and simple priority rules are widely studied. Although simple priority rules are not efficient enough, owing to their simplicity and lack of general insight, composite dispatching rules perform well because they are derived from experiments. In this paper, a dynamic flexible flow line problem with sequence-dependent setup times is studied. The objective of the problem is minimization of mean flow time and mean tardiness. A 0-1 mixed integer model of the problem is formulated. Since the problem is NP-hard, four new composite dispatching rules are proposed to solve it by applying a genetic programming framework and choosing proper operators. Furthermore, a discrete-event simulation model is built to examine the performance of the scheduling rules, considering the four new heuristic rules and six heuristic rules adapted from the literature. The experimental results make clear that the composite dispatching rules formed by genetic programming outperform the others in minimizing mean flow time and mean tardiness.
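    Composite dispatching rules of the kind evolved here combine several job attributes into a single priority expression. An invented Python example of the general form such rules take (the weights and attributes are illustrative, not one of the paper's four rules):

```python
def composite_priority(proc_time, setup_time, due_date, now, w=(1.0, 2.0, 0.5)):
    """Priority for dynamic dispatching (smaller is served first).
    Combines processing time, sequence-dependent setup time, and slack
    to the due date, the kind of expression genetic programming evolves."""
    slack = due_date - now - proc_time - setup_time
    return w[0] * proc_time + w[1] * setup_time + w[2] * max(slack, 0.0)

def next_job(jobs, now):
    """Pick the queued job (dicts with p=proc, s=setup, d=due) with the
    lowest composite priority."""
    return min(jobs, key=lambda j: composite_priority(j['p'], j['s'], j['d'], now))
```

In a simulation study, such a rule would be evaluated at every machine-release event against arrivals that occurred dynamically.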

  3. Social Outcomes in Childhood Brain Disorder: A Heuristic Integration of Social Neuroscience and Developmental Psychology

    ERIC Educational Resources Information Center

    Yeates, Keith Owen; Bigler, Erin D.; Dennis, Maureen; Gerhardt, Cynthia A.; Rubin, Kenneth H.; Stancin, Terry; Taylor, H. Gerry; Vannatta, Kathryn

    2007-01-01

    The authors propose a heuristic model of the social outcomes of childhood brain disorder that draws on models and methods from both the emerging field of social cognitive neuroscience and the study of social competence in developmental psychology/psychopathology. The heuristic model characterizes the relationships between social adjustment, peer…

  4. On the suitability of fast and frugal heuristics for designing values clarification methods in patient decision aids: a critical analysis.

    PubMed

    Pieterse, Arwen H; de Vries, Marieke

    2013-09-01

    Increasingly, patient decision aids and values clarification methods (VCMs) are being developed to support patients in making preference-sensitive health-care decisions. Many VCMs encourage extensive deliberation about options, without solid theoretical or empirical evidence showing that deliberation is advantageous. Research suggests that simple, fast and frugal heuristic decision strategies sometimes result in better judgments and decisions. Durand et al. have developed two fast and frugal heuristic-based VCMs. Objective: to critically analyse the suitability of the 'take the best' (TTB) and 'tallying' fast and frugal heuristics in the context of patient decision making. Method: analysis of the structural similarities between the environments in which the TTB and tallying heuristics have proven successful and the context of patient decision making, and of the potential of these heuristic decision processes to support patient decision making. The specific nature of patient preference-sensitive decision making does not seem to resemble environments in which the TTB and tallying heuristics have proven successful. Encouraging patients to consider less rather than more relevant information may even degrade their values clarification process. Values clarification methods promoting the use of more intuitive decision strategies may sometimes be more effective. Nevertheless, we strongly recommend further theoretical thinking about the expected value of such heuristics and of other more intuitive decision strategies in this context, as well as empirical assessment of the mechanisms by which inducing such decision strategies may affect the quality and outcome of values clarification. © 2011 John Wiley & Sons Ltd.
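    For reference, the two heuristics under analysis have simple formal cores. A schematic Python rendering (the cue structure and data are hypothetical) of take-the-best, which inspects cues in order of validity and decides on the first one that discriminates, and tallying, which simply counts positive cues:

```python
def take_the_best(cues_a, cues_b, validity_order):
    """Return 'A', 'B', or None. Cues are dicts of cue name -> bool;
    inspect cues from most to least valid and stop at the first cue
    with different values for the two options."""
    for cue in validity_order:
        if cues_a[cue] != cues_b[cue]:
            return 'A' if cues_a[cue] else 'B'
    return None  # no cue discriminates: guess

def tallying(cues_a, cues_b):
    """Count positive cue values per option, ignoring validities."""
    score_a, score_b = sum(cues_a.values()), sum(cues_b.values())
    if score_a == score_b:
        return None
    return 'A' if score_a > score_b else 'B'
```

The two rules can disagree on the same cue profile, which is one reason the structure of the decision environment matters for which heuristic succeeds.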

  5. The Probability Heuristics Model of Syllogistic Reasoning.

    ERIC Educational Resources Information Center

    Chater, Nick; Oaksford, Mike

    1999-01-01

    Proposes a probability heuristic model for syllogistic reasoning and confirms the rationality of this heuristic by an analysis of the probabilistic validity of syllogistic reasoning that treats logical inference as a limiting case of probabilistic inference. Meta-analysis and two experiments involving 40 adult participants and using generalized…

  6. Modeling the human as a controller in a multitask environment

    NASA Technical Reports Server (NTRS)

    Govindaraj, T.; Rouse, W. B.

    1978-01-01

    Modeling the human as a controller of slowly responding systems with preview is considered. Along with control tasks, discrete noncontrol tasks occur at irregular intervals. In multitask situations such as these, it has been observed that humans tend to apply piecewise constant controls. It is believed that the magnitude of controls and the durations for which they remain constant are dependent directly on the system bandwidth, preview distance, complexity of the trajectory to be followed, and nature of the noncontrol tasks. A simple heuristic model of human control behavior in this situation is presented. The results of a simulation study, whose purpose was determination of the sensitivity of the model to its parameters, are discussed.

  7. Linking indices for biodiversity monitoring to extinction risk theory.

    PubMed

    McCarthy, Michael A; Moore, Alana L; Krauss, Jochen; Morgan, John W; Clements, Christopher F

    2014-12-01

    Biodiversity indices often combine data from different species when used in monitoring programs. Heuristic properties can suggest preferred indices, but we lack objective ways to discriminate between indices with similar heuristics. Biodiversity indices can be evaluated by determining how well they reflect management objectives that a monitoring program aims to support. For example, the Convention on Biological Diversity requires reporting about extinction rates, so simple indices that reflect extinction risk would be valuable. We developed 3 biodiversity indices that are based on simple models of population viability that relate extinction risk to abundance. We based the first index on the geometric mean abundance of species and the second on a more general power mean. In a third index, we integrated the geometric mean abundance and trend. These indices require the same data as previous indices, but they also relate directly to extinction risk. Field data for butterflies and woodland plants and experimental studies of protozoan communities show that the indices correlate with local extinction rates. Applying the index based on the geometric mean to global data on changes in avian abundance suggested that the average extinction probability of birds has increased by approximately 1% from 1970 to 2009. © 2014 The Authors. Conservation Biology published by Wiley Periodicals, Inc., on behalf of the Society for Conservation Biology.
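    The first index described reduces to the geometric mean of species' abundances relative to a baseline. A hedged sketch of that computation (the paper's exact handling of trends, weighting, and zero counts will differ):

```python
import math

def geometric_mean_index(abundances, baseline):
    """Geometric mean of per-species abundance ratios relative to a
    baseline survey (index = 1 when, on balance, nothing has changed).
    Inputs are parallel lists of positive per-species counts."""
    ratios = [a / b for a, b in zip(abundances, baseline)]
    return math.exp(sum(math.log(r) for r in ratios) / len(ratios))
```

One species halving while another doubles leaves the index at 1, which is exactly the multiplicative symmetry that makes the geometric mean attractive for cross-species aggregation.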

  8. Navigating a Mobile Robot Across Terrain Using Fuzzy Logic

    NASA Technical Reports Server (NTRS)

    Seraji, Homayoun; Howard, Ayanna; Bon, Bruce

    2003-01-01

    A strategy for autonomous navigation of a robotic vehicle across hazardous terrain involves the use of a measure of traversability of terrain within a fuzzy-logic conceptual framework. This navigation strategy requires no a priori information about the environment. Fuzzy logic was selected as a basic element of this strategy because it provides a formal methodology for representing and implementing a human driver's heuristic knowledge and operational experience. Within a fuzzy-logic framework, the attributes of human reasoning and decision- making can be formulated by simple IF (antecedent), THEN (consequent) rules coupled with easily understandable and natural linguistic representations. The linguistic values in the rule antecedents convey the imprecision associated with measurements taken by sensors onboard a mobile robot, while the linguistic values in the rule consequents represent the vagueness inherent in the reasoning processes to generate the control actions. The operational strategies of the human expert driver can be transferred, via fuzzy logic, to a robot-navigation strategy in the form of a set of simple conditional statements composed of linguistic variables. These linguistic variables are defined by fuzzy sets in accordance with user-defined membership functions. The main advantages of a fuzzy navigation strategy lie in the ability to extract heuristic rules from human experience and to obviate the need for an analytical model of the robot navigation process.
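    The IF (antecedent), THEN (consequent) structure can be made concrete with a toy rule base; the membership functions and rules below are invented for illustration and are not the rover's actual strategy:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def recommend_speed(traversability):
    """Two toy rules: IF terrain is ROUGH THEN speed is SLOW;
    IF terrain is SMOOTH THEN speed is FAST. Defuzzify by the
    firing-strength-weighted average of each rule's output speed."""
    rough = tri(traversability, -0.5, 0.0, 0.6)   # low traversability
    smooth = tri(traversability, 0.4, 1.0, 1.5)   # high traversability
    slow, fast = 0.2, 1.0                          # crisp rule outputs
    total = rough + smooth
    return (rough * slow + smooth * fast) / total if total else 0.0
```

The linguistic values ROUGH and SMOOTH are exactly the kind of fuzzy sets the abstract describes: each sensor reading fires both rules to a degree, and the control action blends their consequents.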

  9. Evaluation of the Priority Heuristic as a Descriptive Model of Risky Decision Making: Comment on Brandstatter, Gigerenzer, and Hertwig (2006)

    ERIC Educational Resources Information Center

    Birnbaum, Michael H.

    2008-01-01

    E. Brandstatter, G. Gigerenzer, and R. Hertwig (2006) contended that their priority heuristic, a type of lexicographic semiorder model, is more accurate than cumulative prospect theory (CPT) or transfer of attention exchange (TAX) models in describing risky decisions. However, there are 4 problems with their argument. First, their heuristic is not…

  10. Generalizing a model beyond the inherence heuristic and applying it to beliefs about objective value.

    PubMed

    Wood, Graham

    2014-10-01

    The inherence heuristic is characterized as part of an instantiation of a more general model that describes the interaction between undeveloped intuitions, produced by System 1 heuristics, and developed beliefs, constructed by System 2 reasoning. The general model is described and illustrated by examining another instantiation of the process that constructs belief in objective moral value.

  11. An adaptive toolbox approach to the route to expertise in sport.

    PubMed

    de Oliveira, Rita F; Lobinger, Babett H; Raab, Markus

    2014-01-01

    Expertise is characterized by fast decision-making which is highly adaptive to new situations. Here we propose that athletes use a toolbox of heuristics which they develop on their route to expertise. The development of heuristics occurs within the context of the athletes' natural abilities, past experiences, developed skills, and situational context, but does not pertain to any of these factors separately. This is a novel approach because it integrates separate factors into a comprehensive heuristic description. The novelty of this approach lies within the integration of separate factors determining expertise into a comprehensive heuristic description. It is our contention that talent identification methods and talent development models should therefore be geared toward the assessment and development of specific heuristics. Specifically, in addition to identifying and developing separate natural abilities and skills as per usual, heuristics should be identified and developed. The application of heuristics to talent and expertise models can bring the field one step away from dichotomized models of nature and nurture toward a comprehensive approach to the route to expertise.

  12. An adaptive toolbox approach to the route to expertise in sport

    PubMed Central

    de Oliveira, Rita F.; Lobinger, Babett H.; Raab, Markus

    2014-01-01

    Expertise is characterized by fast decision-making which is highly adaptive to new situations. Here we propose that athletes use a toolbox of heuristics which they develop on their route to expertise. The development of heuristics occurs within the context of the athletes’ natural abilities, past experiences, developed skills, and situational context, but does not pertain to any of these factors separately. This is a novel approach because it integrates separate factors into a comprehensive heuristic description. The novelty of this approach lies within the integration of separate factors determining expertise into a comprehensive heuristic description. It is our contention that talent identification methods and talent development models should therefore be geared toward the assessment and development of specific heuristics. Specifically, in addition to identifying and developing separate natural abilities and skills as per usual, heuristics should be identified and developed. The application of heuristics to talent and expertise models can bring the field one step away from dichotomized models of nature and nurture toward a comprehensive approach to the route to expertise. PMID:25071673

  13. Extracting Drug-Drug Interactions with Word and Character-Level Recurrent Neural Networks

    PubMed Central

    Kavuluru, Ramakanth; Rios, Anthony; Tran, Tung

    2017-01-01

    Drug-drug interactions (DDIs) are known to be responsible for nearly a third of all adverse drug reactions. Hence several current efforts focus on extracting signal from EMRs to prioritize DDIs that need further exploration. To this end, being able to extract explicit mentions of DDIs in free text narratives is an important task. In this paper, we explore recurrent neural network (RNN) architectures to detect and classify DDIs from unstructured text using the DDIExtraction dataset from the SemEval 2013 (task 9) shared task. Our methods are in line with those used in other recent deep learning efforts for relation extraction including DDI extraction. However, to our knowledge, we are the first to investigate the potential of character-level RNNs (Char-RNNs) for DDI extraction (and relation extraction in general). Furthermore, we explore a simple but effective model bootstrapping method to (a) build model-averaging ensembles, (b) derive confidence intervals around mean micro-F scores (MMF), and (c) assess the average behavior of our methods. Without any rule based filtering of negative examples, a popular heuristic used by most earlier efforts, we achieve an MMF of 69.13. By adding simple replicable heuristics to filter negative instances we are able to achieve an MMF of 70.38. Furthermore, our best ensembles produce micro F-scores of 70.81 (without filtering) and 72.13 (with filtering), which are superior to metrics reported in published results. Although Char-RNNs turn out to be inferior to regular word-based RNN models in overall comparisons, we find that ensembling models from both architectures results in nontrivial gains over simply using either alone, indicating that they complement each other. PMID:29034375
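    The model-bootstrapping idea, deriving a confidence interval around the mean micro-F of replicate models, can be sketched with a percentile bootstrap (the scores below are placeholders, not the paper's results):

```python
import random

def bootstrap_ci(scores, n_resamples=10000, alpha=0.05, seed=0):
    """Percentile bootstrap CI for the mean of a list of model scores,
    e.g., micro-F values from independently trained model replicates."""
    rng = random.Random(seed)
    means = sorted(
        sum(rng.choices(scores, k=len(scores))) / len(scores)
        for _ in range(n_resamples)
    )
    lo = means[int(alpha / 2 * n_resamples)]
    hi = means[int((1 - alpha / 2) * n_resamples) - 1]
    return lo, hi
```

The same resampled replicates can double as members of a model-averaging ensemble, which is how bootstrapping serves goals (a) through (c) at once.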

  14. Evaluating and comparing methods of sinkhole susceptibility mapping in the Ebro Valley evaporite karst (NE Spain)

    NASA Astrophysics Data System (ADS)

    Galve, J. P.; Gutiérrez, F.; Remondo, J.; Bonachea, J.; Lucha, P.; Cendrero, A.

    2009-10-01

    Multiple sinkhole susceptibility models have been generated in three study areas of the Ebro Valley evaporite karst (NE Spain) applying different methods (nearest neighbour distance, sinkhole density, heuristic scoring system and probabilistic analysis) for each sinkhole type separately (cover collapse sinkholes, cover and bedrock collapse sinkholes and cover and bedrock sagging sinkholes). The quantitative and independent evaluation of the predictive capability of the models reveals that: (1) The most reliable susceptibility models are those derived from the nearest neighbour distance and sinkhole density. These models can be generated in a simple and rapid way from detailed geomorphological maps. (2) The reliability of the nearest neighbour distance and density models is conditioned by the degree of clustering of the sinkholes. Consequently, the karst areas in which sinkholes show a higher clustering are a priori more favourable for predicting new occurrences. (3) The predictive capability of the best models obtained in this research is significantly higher (12.5-82.5%) than that of the heuristic sinkhole susceptibility model incorporated into the General Urban Plan for the municipality of Zaragoza. Although the probabilistic approach provides lower quality results than the methods based on sinkhole proximity and density, it helps to identify the most significant factors and select the most effective mitigation strategies and may be applied to model susceptibility in different future scenarios.

  15. Optimal dynamic remapping of parallel computations

    NASA Technical Reports Server (NTRS)

    Nicol, David M.; Reynolds, Paul F., Jr.

    1987-01-01

    A large class of computations is characterized by a sequence of phases, with phase changes occurring unpredictably. The decision problem considered is the remapping of workload to processors in a parallel computation when the utility of remapping and the future behavior of the workload are uncertain: execution requirements are stable within a given phase but may change radically between phases. For these problems a workload assignment generated for one phase may hinder performance during the next phase. This problem is treated formally for a probabilistic model of computation with at most two phases. The fundamental problem of balancing the expected remapping performance gain against the delay cost is addressed. Stochastic dynamic programming is used to show that the remapping decision policy minimizing the expected running time of the computation has an extremely simple structure. Because the gain may not be predictable, the performance of a heuristic policy that does not require estimation of the gain is examined. The heuristic method's feasibility is demonstrated by its use on an adaptive fluid dynamics code on a multiprocessor. The results suggest that, except in extreme cases, the remapping decision problem is essentially that of dynamically determining whether gain can be achieved by remapping after a phase change. The results also suggest that this heuristic is applicable to computations with more than two phases.

  16. A proven knowledge-based approach to prioritizing process information

    NASA Technical Reports Server (NTRS)

    Corsberg, Daniel R.

    1991-01-01

    Many space-related processes are highly complex systems subject to sudden, major transients. In any complex process control system, a critical aspect is rapid analysis of the changing process information. During a disturbance, this task can overwhelm humans as well as computers. Humans deal with this by applying heuristics in determining significant information. A simple, knowledge-based approach to prioritizing information is described. The approach models those heuristics that humans would use in similar circumstances. The approach described has received two patents and was implemented in the Alarm Filtering System (AFS) at the Idaho National Engineering Laboratory (INEL). AFS was first developed for application in a nuclear reactor control room. It has since been used in chemical processing applications, where it has had a significant impact on control room environments. The approach uses knowledge-based heuristics to analyze data from process instrumentation and respond to that data according to knowledge encapsulated in objects and rules. While AFS cannot perform the complete diagnosis and control task, it has proven to be extremely effective at filtering and prioritizing information. AFS was used for over two years as a first level of analysis for human diagnosticians. Given the approach's proven track record in a wide variety of practical applications, it should be useful in both ground- and space-based systems.

  17. Heuristic Diagrams as a Tool to Teach History of Science

    NASA Astrophysics Data System (ADS)

    Chamizo, José A.

    2012-05-01

    The graphic organizer called here the heuristic diagram, an improvement on Gowin's Vee heuristic, is proposed as a tool to teach the history of science. Heuristic diagrams have the purpose of helping students (or teachers, or researchers) to understand their own research, considering that asking questions and problem-solving are central to scientific activity. The left side, originally related in Gowin's Vee to philosophies, theories, models, laws or regularities, now agrees with Toulmin's concepts (language, models as representation techniques and application procedures). Mexican science teachers without experience in science education research used the heuristic diagram to learn about the history of chemistry, considering also on the left side two different historical times: past and present. Teachers' attitudes toward the heuristic diagram were evaluated through a semantic differential scale, and its usefulness was demonstrated.

  18. The Effectiveness of Local Culture-Based Mathematical Heuristic-KR Learning towards Enhancing Student's Creative Thinking Skill

    ERIC Educational Resources Information Center

    Tandiseru, Selvi Rajuaty

    2015-01-01

    The problem in this research is the lack of creative thinking skills of students. One of the learning models that is expected to enhance student's creative thinking skill is the local culture-based mathematical heuristic-KR learning model (LC-BMHLM). Heuristic-KR is a learning model which was introduced by Krulik and Rudnick (1995) that is the…

  19. Cognitive models of risky choice: parameter stability and predictive accuracy of prospect theory.

    PubMed

    Glöckner, Andreas; Pachur, Thorsten

    2012-04-01

    In the behavioral sciences, a popular approach to describe and predict behavior is cognitive modeling with adjustable parameters (i.e., which can be fitted to data). Modeling with adjustable parameters allows, among other things, measuring differences between people. At the same time, parameter estimation also bears the risk of overfitting. Are individual differences as measured by model parameters stable enough to improve the ability to predict behavior as compared to modeling without adjustable parameters? We examined this issue in cumulative prospect theory (CPT), arguably the most widely used framework to model decisions under risk. Specifically, we examined (a) the temporal stability of CPT's parameters; and (b) how well different implementations of CPT, varying in the number of adjustable parameters, predict individual choice relative to models with no adjustable parameters (such as CPT with fixed parameters, expected value theory, and various heuristics). We presented participants with risky choice problems and fitted CPT to each individual's choices in two separate sessions (which were 1 week apart). All parameters were correlated across time, in particular when using a simple implementation of CPT. CPT allowing for individual variability in parameter values predicted individual choice better than CPT with fixed parameters, expected value theory, and the heuristics. CPT's parameters thus seem to pick up stable individual differences that need to be considered when predicting risky choice. Copyright © 2011 Elsevier B.V. All rights reserved.

  20. Heuristic Modeling for TRMM Lifetime Predictions

    NASA Technical Reports Server (NTRS)

    Jordan, P. S.; Sharer, P. J.; DeFazio, R. L.

    1996-01-01

    Analysis time for computing the expected mission lifetimes of proposed frequently maneuvering, tightly altitude-constrained, Earth-orbiting spacecraft has been significantly reduced by means of a heuristic modeling method implemented in a commercial off-the-shelf spreadsheet product (QuattroPro) running on a personal computer (PC). The method uses a look-up table to estimate the maneuver frequency per month as a function of the spacecraft ballistic coefficient and the solar flux index, then computes the associated fuel use with a simple engine model. Maneuver frequency data points are produced by means of a single 1-month run of traditional mission analysis software for each of the 12 to 25 data points required for the table. As the data point computations are required only at mission design start-up and on the occasion of significant mission redesigns, the dependence on time-consuming traditional modeling methods is dramatically reduced. Results to date have agreed with traditional methods to within 1 to 1.5 percent. The spreadsheet approach is applicable to a wide variety of Earth-orbiting spacecraft with tight altitude constraints. It will be particularly useful for missions such as the Tropical Rainfall Measurement Mission scheduled for launch in 1997, whose mission lifetime calculations are heavily dependent on frequently revised solar flux predictions.
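    The method reduces to a table lookup plus a simple engine model. A hypothetical Python sketch of that structure (every number, name, and signature here is invented for illustration, not taken from the TRMM analysis):

```python
import math

def maneuvers_per_month(ballistic_coeff, flux_index, table):
    """Nearest-entry lookup in a (ballistic coefficient, flux index) ->
    maneuver frequency table built from short traditional-analysis runs."""
    key = min(table, key=lambda k: (k[0] - ballistic_coeff) ** 2
                                   + (k[1] - flux_index) ** 2)
    return table[key]

def monthly_fuel_use(freq, delta_v_per_burn, isp, mass, g0=9.80665):
    """Simple engine model: propellant per burn from the rocket equation,
    times the maneuver frequency."""
    per_burn = mass * (1.0 - math.exp(-delta_v_per_burn / (isp * g0)))
    return freq * per_burn
```

A spreadsheet realizes the same two steps with a lookup formula and a fuel column; summing the monthly column until the tank empties gives the lifetime estimate.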

  1. Evaluation of the priority heuristic as a descriptive model of risky decision making: comment on Brandstätter, Gigerenzer, and Hertwig (2006).

    PubMed

    Birnbaum, Michael H

    2008-01-01

    E. Brandstätter, G. Gigerenzer, and R. Hertwig (2006) contended that their priority heuristic, a type of lexicographic semiorder model, is more accurate than cumulative prospect theory (CPT) or transfer of attention exchange (TAX) models in describing risky decisions. However, there are 4 problems with their argument. First, their heuristic is not descriptive of certain data that they did not review. Second, their analysis relied on a global index of fit, percentage of correct predictions of the modal choice. Such analyses can lead to wrong conclusions when parameters are not properly estimated from the data. When parameters are estimated from the data, CPT and TAX fit the D. Kahneman and A. Tversky (1979) data perfectly. Reanalysis shows that TAX and CPT do as well as the priority heuristic for 2 of the data sets reviewed and outperform the priority heuristic for the other 3. Third, when 2 of these sets of data are reexamined, the priority heuristic is seen to make systematic violations. Fourth, new critical implications have been devised for testing the family of lexicographic semiorders including the priority heuristic; new results with these critical tests show systematic evidence against lexicographic semiorder models. (c) 2008 APA, all rights reserved

  2. What is behind the priority heuristic? A mathematical analysis and comment on Brandstätter, Gigerenzer, and Hertwig (2006).

    PubMed

    Rieger, Marc Oliver; Wang, Mei

    2008-01-01

    Comments on the article by E. Brandstätter, G. Gigerenzer, and R. Hertwig. The authors discuss the priority heuristic, a recent model for decisions under risk. They reanalyze the experimental validity of this approach and discuss how these results compare with cumulative prospect theory, the currently most established model in behavioral economics. They also discuss how general models for decisions under risk based on a heuristic approach can be understood mathematically to gain some insight in their limitations. They finally consider whether the priority heuristic model can lead to some understanding of the decision process of individuals or whether it is better seen as an as-if model. (c) 2008 APA, all rights reserved

  3. A Heuristic Potential Theory of Electric and Magnetic Monopoles without Strings.

    ERIC Educational Resources Information Center

    Barker, William A.; Graziani, Frank

    1978-01-01

    Shows how Maxwell's equations can be obtained by starting with a relatively simple pseudoscalar and scalar potential employing only the Lorentz transformation for a four vector (or pseudovector). (GA)

  4. Atomic Dynamics in Simple Liquid: de Gennes Narrowing Revisited

    DOE PAGES

    Wu, Bin; Iwashita, Takuya; Egami, Takeshi

    2018-03-27

    The de Gennes narrowing phenomenon is frequently observed by neutron or x-ray scattering measurements of the dynamics of complex systems, such as liquids, proteins, colloids, and polymers. The characteristic slowing down of dynamics in the vicinity of the maximum of the total scattering intensity is commonly attributed to enhanced cooperativity. In this Letter, we present an alternative view on its origin through the examination of the time-dependent pair correlation function, the van Hove correlation function, for a model liquid in two, three, and four dimensions. We find that the relaxation time increases monotonically with distance and the dependence on distance varies with dimension. We propose a heuristic explanation of this dependence based on a simple geometrical model. Furthermore, this finding sheds new light on the interpretation of the de Gennes narrowing phenomenon and the α-relaxation time.

  5. Atomic Dynamics in Simple Liquid: de Gennes Narrowing Revisited

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Bin; Iwashita, Takuya; Egami, Takeshi

    The de Gennes narrowing phenomenon is frequently observed by neutron or x-ray scattering measurements of the dynamics of complex systems, such as liquids, proteins, colloids, and polymers. The characteristic slowing down of dynamics in the vicinity of the maximum of the total scattering intensity is commonly attributed to enhanced cooperativity. In this Letter, we present an alternative view on its origin through the examination of the time-dependent pair correlation function, the van Hove correlation function, for a model liquid in two, three, and four dimensions. We find that the relaxation time increases monotonically with distance and the dependence on distance varies with dimension. We propose a heuristic explanation of this dependence based on a simple geometrical model. Furthermore, this finding sheds new light on the interpretation of the de Gennes narrowing phenomenon and the α-relaxation time.

  6. Aiding USAF/UPT (Undergraduate Pilot Training) Aircrew Scheduling Using Network Flow Models.

    DTIC Science & Technology

    1986-06-01

    Excerpt from the table of contents: 3.4 Heuristic Modifications; Chapter 4, Student Scheduling Problem (Level 2): 4.0 Introduction, 4.01 Constraints, ..., "Covering" Complete Enumeration, 4.14 Heuristics; 4.2 Heuristic Method for the Level 2 Problem: 4.21 Step 1, 4.22 Step 2; 4.23 Advantages to the Heuristic Method; 4.24 Problems with the Heuristic Method; Chapter 5...

  7. An integrated logit model for contamination event detection in water distribution systems.

    PubMed

    Housh, Mashor; Ostfeld, Avi

    2015-05-15

    The problem of contamination event detection in water distribution systems has become one of the most challenging research topics in water distribution systems analysis. Current attempts at event detection utilize a variety of approaches, including statistical, heuristic, machine learning, and optimization methods. Several existing event detection systems share a common feature in which alarms are obtained separately for each of the water quality indicators. Unifying those single alarms from different indicators is usually performed by means of simple heuristics. A salient feature of the approach developed here is the use of a statistically oriented model for discrete choice prediction, estimated with the maximum likelihood method, to integrate the single alarms. The discrete choice model is jointly calibrated with the other components of the event detection framework on a training data set using genetic algorithms. The process of fusing the individual indicator probabilities, which many existing event detection models leave out of focus, is confirmed to be a crucial part of the system, and modelling it with a discrete choice model improves performance. The developed methodology is tested on real water quality data, showing improved performance in decreasing the number of false positive alarms and in detecting events with higher probabilities, compared to previous studies. Copyright © 2015 Elsevier Ltd. All rights reserved.

  8. Current Barriers to Successful Implementation of FIST Principles

    DTIC Science & Technology

    2013-07-01

    risks will surface during development that could not have been predicted. Managing a thin budget with no schedule slack for these unknown-unknowns is... Fleischer. Keywords: Fast, Inexpensive, Simple, Tiny (FIST); Program Management; Heuristics; Innovation; Oversight. Capt Brandon Keller, USAF, and Lt Col J. Robert Wirthlin, USAF. The Fast, Inexpensive, Simple, and Tiny (FIST...

  9. Heuristics and Problem Solving.

    ERIC Educational Resources Information Center

    Abel, Charles F.

    2003-01-01

    Defines heuristics as cognitive "rules of thumb" that can help problem solvers work more efficiently and effectively. Professors can use a heuristic model of problem solving to guide students in all disciplines through the steps of problem-solving. (SWM)

  10. Fluency Heuristic: A Model of How the Mind Exploits a By-Product of Information Retrieval

    ERIC Educational Resources Information Center

    Hertwig, Ralph; Herzog, Stefan M.; Schooler, Lael J.; Reimer, Torsten

    2008-01-01

    Boundedly rational heuristics for inference can be surprisingly accurate and frugal for several reasons. They can exploit environmental structures, co-opt complex capacities, and elude effortful search by exploiting information that automatically arrives on the mental stage. The fluency heuristic is a prime example of a heuristic that makes the…

  11. Kinetics versus thermodynamics in materials modeling: The case of the di-vacancy in iron

    NASA Astrophysics Data System (ADS)

    Djurabekova, F.; Malerba, L.; Pasianot, R. C.; Olsson, P.; Nordlund, K.

    2010-07-01

    Monte Carlo models are widely used for the study of microstructural and microchemical evolution of materials under irradiation. However, they often link explicitly the relevant activation energies to the energy difference between local equilibrium states. We provide a simple example (di-vacancy migration in iron) in which a rigorous activation energy calculation, by means of both empirical interatomic potentials and density functional theory methods, clearly shows that such a link is not granted, revealing a migration mechanism that a thermodynamics-linked activation energy model cannot predict. Such a mechanism is, however, fully consistent with thermodynamics. This example emphasizes the importance of basing Monte Carlo methods on models where the activation energies are rigorously calculated, rather than deduced from widespread heuristic equations.
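The quantitative stakes of this distinction can be illustrated with a simple Arrhenius rate calculation; the sketch below uses hypothetical energies, not the di-vacancy values computed in the paper, and `max(ΔE, 0)` as a stand-in for a generic thermodynamics-linked heuristic.

```python
import math

# Toy illustration of the paper's point: a heuristic that ties the
# activation energy to the energy difference between the initial and
# final states can badly underestimate the true saddle-point barrier.
# All energies here are hypothetical, not values from the paper.

KB = 8.617e-5  # Boltzmann constant in eV/K

def arrhenius_rate(e_act, temp_k, prefactor=1e13):
    """Rate (1/s) of a thermally activated jump."""
    return prefactor * math.exp(-e_act / (KB * temp_k))

e_initial, e_final = 0.00, 0.10  # energies of the two local minima (eV)
e_saddle = 0.65                  # rigorously computed saddle-point energy (eV)

e_act_heuristic = max(e_final - e_initial, 0.0)  # from the state energies alone
e_act_rigorous = e_saddle - e_initial            # from the actual barrier

ratio = arrhenius_rate(e_act_heuristic, 600.0) / arrhenius_rate(e_act_rigorous, 600.0)
print(f"jump rate overestimated by a factor of {ratio:.3g}")
```

At 600 K the half-eV gap between the two activation energies changes the predicted jump rate by several orders of magnitude, which is why the choice matters for kinetic Monte Carlo time scales.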

  12. How Recent History Affects Perception: The Normative Approach and Its Heuristic Approximation

    PubMed Central

    Raviv, Ofri; Ahissar, Merav; Loewenstein, Yonatan

    2012-01-01

    There is accumulating evidence that prior knowledge about expectations plays an important role in perception. The Bayesian framework is the standard computational approach to explain how prior knowledge about the distribution of expected stimuli is incorporated with noisy observations in order to improve performance. However, it is unclear what information about the prior distribution is acquired by the perceptual system over short periods of time and how this information is utilized in the process of perceptual decision making. Here we address this question using a simple two-tone discrimination task. We find that the “contraction bias”, in which small magnitudes are overestimated and large magnitudes are underestimated, dominates the pattern of responses of human participants. This contraction bias is consistent with the Bayesian hypothesis in which the true prior information is available to the decision-maker. However, a trial-by-trial analysis of the pattern of responses reveals that the contribution of most recent trials to performance is overweighted compared with the predictions of a standard Bayesian model. Moreover, we study participants' performance in atypical distributions of stimuli and demonstrate substantial deviations from the ideal Bayesian detector, suggesting that the brain utilizes a heuristic approximation of the Bayesian inference. We propose a biologically plausible model, in which decision in the two-tone discrimination task is based on a comparison between the second tone and an exponentially-decaying average of the first tone and past tones. We show that this model accounts for both the contraction bias and the deviations from the ideal Bayesian detector hypothesis. These findings demonstrate the power of Bayesian-like heuristics in the brain, as well as their limitations in their failure to fully adapt to novel environments. PMID:23133343
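The proposed comparison model is compact enough to sketch directly; the decay parameter below is an arbitrary illustrative value, not one estimated in the paper.

```python
# Sketch of the proposed model for the two-tone discrimination task:
# the second tone is compared to an exponentially decaying average of
# the current first tone and past first tones. ALPHA is an arbitrary
# illustrative decay parameter, not a value estimated in the paper.

ALPHA = 0.4  # weight of the current first tone vs. the accumulated history

def run_trials(trials, alpha=ALPHA):
    """trials: list of (f1, f2) pairs. Returns one boolean per trial:
    True means the model responds 'second tone higher'."""
    history = None
    responses = []
    for f1, f2 in trials:
        # blend the current first tone into the decaying history
        history = f1 if history is None else alpha * f1 + (1 - alpha) * history
        responses.append(f2 > history)
    return responses

# After a trial with high tones, the lingering history drags the
# reference upward on the next trial: 505 > 500, yet the model
# responds "lower" on the middle trial.
print(run_trials([(1000, 1010), (500, 505), (980, 990)]))  # [True, False, True]
```

The pull of the history toward previously heard tones is what produces the contraction bias described in the abstract.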

  13. On the suitability of fast and frugal heuristics for designing values clarification methods in patient decision aids: a critical analysis

    PubMed Central

    Pieterse, Arwen H.; de Vries, Marieke

    2011-01-01

    Abstract Background  Increasingly, patient decision aids and values clarification methods (VCMs) are being developed to support patients in making preference‐sensitive health‐care decisions. Many VCMs encourage extensive deliberation about options, without solid theoretical or empirical evidence showing that deliberation is advantageous. Research suggests that simple, fast and frugal heuristic decision strategies sometimes result in better judgments and decisions. Durand et al. have developed two fast and frugal heuristic‐based VCMs. Objective  To critically analyse the suitability of the ‘take the best’ (TTB) and ‘tallying’ fast and frugal heuristics in the context of patient decision making. Strategy  Analysis of the structural similarities between the environments in which the TTB and tallying heuristics have been proven successful and the context of patient decision making and of the potential of these heuristic decision processes to support patient decision making. Conclusion  The specific nature of patient preference‐sensitive decision making does not seem to resemble environments in which the TTB and tallying heuristics have proven successful. Encouraging patients to consider less rather than more relevant information potentially even deteriorates their values clarification process. Values clarification methods promoting the use of more intuitive decision strategies may sometimes be more effective. Nevertheless, we strongly recommend further theoretical thinking about the expected value of such heuristics and of other more intuitive decision strategies in this context, as well as empirical assessments of the mechanisms by which inducing such decision strategies may impact the quality and outcome of values clarification. PMID:21902770
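The TTB and tallying heuristics under analysis have simple operational definitions; a minimal sketch for a binary choice over binary cues (cue names, values, and the validity ordering are hypothetical illustrations, not the content of Durand et al.'s values clarification methods):

```python
# Sketch of the two fast-and-frugal heuristics analysed above, for a
# binary choice between options A and B described by binary cue values.
# Cue names and the validity ordering are hypothetical illustrations.

def take_the_best(cues_a, cues_b, cue_order):
    """TTB: inspect cues in descending order of validity and decide on
    the first cue that discriminates between the two options."""
    for cue in cue_order:
        if cues_a[cue] != cues_b[cue]:
            return "A" if cues_a[cue] > cues_b[cue] else "B"
    return None  # no cue discriminates: guess

def tally(cues_a, cues_b):
    """Tallying: count positive cues for each option, ignoring validities."""
    score_a, score_b = sum(cues_a.values()), sum(cues_b.values())
    if score_a == score_b:
        return None  # tie: guess
    return "A" if score_a > score_b else "B"

a = {"efficacy": 1, "side_effects": 0, "convenience": 1}
b = {"efficacy": 1, "side_effects": 1, "convenience": 0}
print(take_the_best(a, b, ["efficacy", "side_effects", "convenience"]))  # "B"
print(tally(a, b))  # None: 2 vs 2, tallying cannot decide
```

The example shows the structural point at issue: TTB stops at the single most valid discriminating cue, whereas tallying weighs all cues equally, so the two heuristics can disagree on the very same options.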

  14. How to Talk about Professional Communication: Metalanguage and Heuristic Power.

    ERIC Educational Resources Information Center

    Killingsworth, M. Jimmie

    1989-01-01

    Analyzes several examples of metalanguage from current literature on professional writing, applying three principles for evaluating metalanguage in industry and academe. Considers a potentially effective metalanguage based on simple grammatical expressions. (MM)

  15. Understanding Singular Vectors

    ERIC Educational Resources Information Center

    James, David; Botteron, Cynthia

    2013-01-01

    matrix yields a surprisingly simple, heuristical approximation to its singular vectors. There are correspondingly good approximations to the singular values. Such rules of thumb provide an intuitive interpretation of the singular vectors that helps explain why the SVD is so…

  16. A Heuristic Decision Making Model to Mitigate Adverse Consequences in a Network Centric Warfare/Sense and Respond System

    DTIC Science & Technology

    2005-05-01

    made. 4. Do military decision makers identify/analyze adverse consequences presently? Few do, based on this research, and most don't do it effectively... A HEURISTIC DECISION MAKING MODEL TO MITIGATE ADVERSE CONSEQUENCES IN A NETWORK CENTRIC WARFARE / SENSE AND RESPOND SYSTEM

  17. Modern meta-heuristics based on nonlinear physics processes: A review of models and design procedures

    NASA Astrophysics Data System (ADS)

    Salcedo-Sanz, S.

    2016-10-01

    Meta-heuristic algorithms are problem-solving methods that try to find good-enough solutions to very hard optimization problems at a reasonable computational cost, where classical approaches fail or cannot even be applied. Many existing meta-heuristic approaches are nature-inspired techniques, which work by simulating or modeling different natural processes in a computer. Historically, many of the most successful meta-heuristic approaches have had a biological inspiration, such as the evolutionary computation and swarm intelligence paradigms, but in the last few years new approaches based on the modeling of nonlinear physics processes have been proposed and applied with success. Nonlinear physics processes, modeled as optimization algorithms, can produce completely new search procedures, in many cases with extremely effective exploration capabilities that outperform existing optimization approaches. In this paper we review the most important optimization algorithms based on nonlinear physics, how they have been constructed from the modeling of specific real phenomena, and their novelty in comparison with existing alternative optimization algorithms. We first review important concepts concerning optimization problems, search spaces, and problem difficulty. Then the usefulness of heuristic and meta-heuristic approaches for hard optimization problems is introduced, and some of the main classical versions of these algorithms are reviewed. The mathematical framework of different nonlinear physics processes is then introduced as a preparatory step to reviewing in detail the most important meta-heuristics based on them. A discussion of the novelty of these approaches, their main computational implementation and design issues, and the evaluation of a novel meta-heuristic based on Strange Attractors mutation completes the review of these techniques. We also describe some of the most important application areas of meta-heuristics, in a broad sense, and describe freely accessible software frameworks that ease the implementation of these algorithms.

  18. A Behavior Analysis of Individuals' Use of the Fairness Heuristic when Interacting with Groups and Organizations

    ERIC Educational Resources Information Center

    Goltz, Sonia M.

    2013-01-01

    In the present analysis the author utilizes the groups as patches model (Goltz, 2009, 2010) to extend fairness heuristic theory (Lind, 2001) in which the concept of fairness is thought to be a heuristic that allows individuals to match responses to consequences they receive from groups. In this model, individuals who are reviewing possible groups…

  19. Fast optimization of multipump Raman amplifiers based on a simplified wavelength and power budget heuristic

    NASA Astrophysics Data System (ADS)

    de O. Rocha, Helder R.; Castellani, Carlos E. S.; Silva, Jair A. L.; Pontes, Maria J.; Segatto, Marcelo E. V.

    2015-01-01

    We report a simple budget heuristic for a fast optimization of multipump Raman amplifiers based on the reallocation of the pump wavelengths and the optical powers. A set of different optical fibers are analyzed as the Raman gain medium, and a four-pump amplifier setup is optimized for each of them in order to achieve ripples close to 1 dB and gains up to 20 dB in the C band. Later, a comparison between our proposed heuristic and a multiobjective optimization based on a nondominated sorting genetic algorithm is made, highlighting the fact that our new approach can give similar solutions after at least an order of magnitude fewer iterations. The results shown in this paper can potentially pave the way for real-time optimization of multipump Raman amplifier systems.

  20. Parameter estimation using meta-heuristics in systems biology: a comprehensive review.

    PubMed

    Sun, Jianyong; Garibaldi, Jonathan M; Hodgman, Charlie

    2012-01-01

    This paper gives a comprehensive review of the application of meta-heuristics to optimization problems in systems biology, mainly focussing on the parameter estimation problem (also called the inverse problem or model calibration). It is intended for either the system biologist who wishes to learn more about the various optimization techniques available and/or the meta-heuristic optimizer who is interested in applying such techniques to problems in systems biology. First, the parameter estimation problems emerging from different areas of systems biology are described from the point of view of machine learning. Brief descriptions of various meta-heuristics developed for these problems follow, along with outlines of their advantages and disadvantages. Several important issues in applying meta-heuristics to the systems biology modelling problem are addressed, including the reliability and identifiability of model parameters, optimal design of experiments, and so on. Finally, we highlight some possible future research directions in this field.
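As a toy illustration of the parameter-estimation setting the review covers, the sketch below fits a single rate constant of a hypothetical exponential-decay model with a (1+1) evolution strategy, one of the simplest meta-heuristics; the model, data, and settings are invented for illustration and are not drawn from the review.

```python
import math
import random

# Toy illustration of meta-heuristic parameter estimation: fit the
# decay rate k of a hypothetical model y(t) = exp(-k t) to synthetic
# data using a (1+1) evolution strategy. Model, data, and settings
# are invented for illustration.

random.seed(0)
TRUE_K = 0.7
data = [(t, math.exp(-TRUE_K * t)) for t in range(10)]

def sse(k):
    """Sum of squared errors between the model and the data."""
    return sum((math.exp(-k * t) - y) ** 2 for t, y in data)

k = 2.0          # poor initial guess
best = sse(k)
for _ in range(2000):
    candidate = k + random.gauss(0.0, 0.5)  # mutate the current estimate
    err = sse(candidate)
    if err < best:                          # keep the mutant only if it improves
        k, best = candidate, err

print(round(k, 3))  # should land close to the true rate 0.7
```

Real systems-biology calibration problems replace `sse` with the error of an ODE model against experimental time courses, which is where the identifiability and experiment-design issues raised in the review become critical.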

  1. Heuristics guide the implementation of social preferences in one-shot Prisoner's Dilemma experiments

    PubMed Central

    Capraro, Valerio; Jordan, Jillian J.; Rand, David G.

    2014-01-01

    Cooperation in one-shot anonymous interactions is a widely documented aspect of human behaviour. Here we shed light on the motivations behind this behaviour by experimentally exploring cooperation in a one-shot continuous-strategy Prisoner's Dilemma (i.e. one-shot two-player Public Goods Game). We examine the distribution of cooperation amounts, and how that distribution varies based on the benefit-to-cost ratio of cooperation (b/c). Interestingly, we find a trimodal distribution at all b/c values investigated. Increasing b/c decreases the fraction of participants engaging in zero cooperation and increases the fraction engaging in maximal cooperation, suggesting a role for efficiency concerns. However, a substantial fraction of participants consistently engage in 50% cooperation regardless of b/c. The presence of these persistent 50% cooperators is surprising, and not easily explained by standard models of social preferences. We present evidence that this behaviour is a result of social preferences guided by simple decision heuristics, rather than the rational examination of payoffs assumed by most social preference models. We also find a strong correlation between play in the Prisoner's Dilemma and in a subsequent Dictator Game, confirming previous findings suggesting a common prosocial motivation underlying altruism and cooperation. PMID:25348470

  2. Heuristics guide the implementation of social preferences in one-shot Prisoner's Dilemma experiments.

    PubMed

    Capraro, Valerio; Jordan, Jillian J; Rand, David G

    2014-10-28

    Cooperation in one-shot anonymous interactions is a widely documented aspect of human behaviour. Here we shed light on the motivations behind this behaviour by experimentally exploring cooperation in a one-shot continuous-strategy Prisoner's Dilemma (i.e. one-shot two-player Public Goods Game). We examine the distribution of cooperation amounts, and how that distribution varies based on the benefit-to-cost ratio of cooperation (b/c). Interestingly, we find a trimodal distribution at all b/c values investigated. Increasing b/c decreases the fraction of participants engaging in zero cooperation and increases the fraction engaging in maximal cooperation, suggesting a role for efficiency concerns. However, a substantial fraction of participants consistently engage in 50% cooperation regardless of b/c. The presence of these persistent 50% cooperators is surprising, and not easily explained by standard models of social preferences. We present evidence that this behaviour is a result of social preferences guided by simple decision heuristics, rather than the rational examination of payoffs assumed by most social preference models. We also find a strong correlation between play in the Prisoner's Dilemma and in a subsequent Dictator Game, confirming previous findings suggesting a common prosocial motivation underlying altruism and cooperation.

  3. FUNDAMENTALS OF THRESHOLD LOGIC.

    DTIC Science & Technology

    These notes on threshold logic are intended as intermediary material between a completely geometric, heuristic presentation and the more formal...source material available in the literature. Basic definitions and simple properties of threshold functions are developed, followed by a complete treatment

  4. The myopia of crowds: Cognitive load and collective evaluation of answers on Stack Exchange

    PubMed Central

    Burghardt, Keith; Alsina, Emanuel F.; Girvan, Michelle; Rand, William; Lerman, Kristina

    2017-01-01

    Crowds can often make better decisions than individuals or small groups of experts by leveraging their ability to aggregate diverse information. Question answering sites, such as Stack Exchange, rely on the “wisdom of crowds” effect to identify the best answers to questions asked by users. We analyze data from 250 communities on the Stack Exchange network to pinpoint factors affecting which answers are chosen as the best answers. Our results suggest that, rather than evaluate all available answers to a question, users rely on simple cognitive heuristics to choose an answer to vote for or accept. These cognitive heuristics are linked to an answer’s salience, such as the order in which it is listed and how much screen space it occupies. While askers appear to depend on heuristics to a greater extent than voters when choosing an answer to accept as the most helpful one, voters use acceptance itself as a heuristic, and they are more likely to choose the answer after it has been accepted than before that answer was accepted. These heuristics become more important in explaining and predicting behavior as the number of available answers to a question increases. Our findings suggest that crowd judgments may become less reliable as the number of answers grows. PMID:28301531

  5. Anticipation and Choice Heuristics in the Dynamic Consumption of Pain Relief

    PubMed Central

    Story, Giles W.; Vlaev, Ivo; Dayan, Peter; Seymour, Ben; Darzi, Ara; Dolan, Raymond J.

    2015-01-01

    Humans frequently need to allocate resources across multiple time-steps. Economic theory proposes that subjects do so according to a stable set of intertemporal preferences, but the computational demands of such decisions encourage the use of formally less competent heuristics. Few empirical studies have examined dynamic resource allocation decisions systematically. Here we conducted an experiment involving the dynamic consumption over approximately 15 minutes of a limited budget of relief from moderately painful stimuli. We had previously elicited the participants’ time preferences for the same painful stimuli in one-off choices, allowing us to assess self-consistency. Participants exhibited three characteristic behaviors: saving relief until the end, spreading relief across time, and early spending, of which the last was markedly less prominent. The likelihood that behavior was heuristic rather than normative is suggested by the weak correspondence between one-off and dynamic choices. We show that the consumption choices are consistent with a combination of simple heuristics involving early-spending, spreading or saving of relief until the end, with subjects predominantly exhibiting the last two. PMID:25793302

  6. Anticipation and choice heuristics in the dynamic consumption of pain relief.

    PubMed

    Story, Giles W; Vlaev, Ivo; Dayan, Peter; Seymour, Ben; Darzi, Ara; Dolan, Raymond J

    2015-03-01

    Humans frequently need to allocate resources across multiple time-steps. Economic theory proposes that subjects do so according to a stable set of intertemporal preferences, but the computational demands of such decisions encourage the use of formally less competent heuristics. Few empirical studies have examined dynamic resource allocation decisions systematically. Here we conducted an experiment involving the dynamic consumption over approximately 15 minutes of a limited budget of relief from moderately painful stimuli. We had previously elicited the participants' time preferences for the same painful stimuli in one-off choices, allowing us to assess self-consistency. Participants exhibited three characteristic behaviors: saving relief until the end, spreading relief across time, and early spending, of which the last was markedly less prominent. The likelihood that behavior was heuristic rather than normative is suggested by the weak correspondence between one-off and dynamic choices. We show that the consumption choices are consistent with a combination of simple heuristics involving early-spending, spreading or saving of relief until the end, with subjects predominantly exhibiting the last two.

  7. Social Outcomes in Childhood Brain Disorder: A Heuristic Integration of Social Neuroscience and Developmental Psychology

    PubMed Central

    Yeates, Keith Owen; Bigler, Erin D.; Dennis, Maureen; Gerhardt, Cynthia A.; Rubin, Kenneth H.; Stancin, Terry; Taylor, H. Gerry; Vannatta, Kathryn

    2010-01-01

    The authors propose a heuristic model of the social outcomes of childhood brain disorder that draws on models and methods from both the emerging field of social cognitive neuroscience and the study of social competence in developmental psychology/psychopathology. The heuristic model characterizes the relationships between social adjustment, peer interactions and relationships, social problem solving and communication, social-affective and cognitive-executive processes, and their neural substrates. The model is illustrated by research on a specific form of childhood brain disorder, traumatic brain injury. The heuristic model may promote research regarding the neural and cognitive-affective substrates of children’s social development. It also may engender more precise methods of measuring impairments and disabilities in children with brain disorder and suggest ways to promote their social adaptation. PMID:17469991

  8. Neural model of gene regulatory network: a survey on supportive meta-heuristics.

    PubMed

    Biswas, Surama; Acharyya, Sriyankar

    2016-06-01

    A gene regulatory network (GRN) is produced as a result of regulatory interactions between different genes through their coded proteins in a cellular context. Having immense importance in disease detection and drug finding, GRNs have been modelled through various mathematical and computational schemes reported in survey articles. Neural and neuro-fuzzy models have been a focus of attention in bioinformatics, and the predominant use of meta-heuristic algorithms in training neural models has proved highly effective. Considering these facts, this paper surveys neural modelling schemes for GRNs and the efficacy of meta-heuristic algorithms for parameter learning (i.e. weighting connections) within the model. The survey covers two structure-related approaches to inferring GRNs, the global structure approach and the substructure approach. It also describes two neural modelling schemes: artificial neural network/recurrent neural network based modelling and neuro-fuzzy modelling. The meta-heuristic algorithms applied so far to learn the structure and parameters of neurally modelled GRNs are reviewed here.

  9. Cultural heuristics in risk assessment of HIV/AIDS.

    PubMed

    Bailey, Ajay; Hutter, Inge

    2006-01-01

    Behaviour change models in HIV prevention tend to consider that risky sexual behaviours reflect risk assessments and that by changing risk assessments behaviour can be changed. Risk assessment is however culturally constructed. Individuals use heuristics or bounded cognitive devices derived from broader cultural meaning systems to rationalize uncertainty. In this study, we identify some of the cultural heuristics used by migrant men in Goa, India to assess their risk of HIV infection from different sexual partners. Data derives from a series of in-depth interviews and a locally informed survey. Cultural heuristics identified include visual heuristics, heuristics of gender roles, vigilance and trust. The paper argues that, for more culturally informed HIV/AIDS behaviour change interventions, knowledge of cultural heuristics is essential.

  10. Modeling pedestrian shopping behavior using principles of bounded rationality: model comparison and validation

    NASA Astrophysics Data System (ADS)

    Zhu, Wei; Timmermans, Harry

    2011-06-01

    Models of geographical choice behavior have been dominantly based on rational choice models, which assume that decision makers are utility-maximizers. Rational choice models may be less appropriate as behavioral models when modeling decisions in complex environments in which decision makers may simplify the decision problem using heuristics. Pedestrian behavior in shopping streets is an example. We therefore propose a modeling framework for pedestrian shopping behavior incorporating principles of bounded rationality. We extend three classical heuristic rules (conjunctive, disjunctive and lexicographic rule) by introducing threshold heterogeneity. The proposed models are implemented using data on pedestrian behavior in Wang Fujing Street, the city center of Beijing, China. The models are estimated and compared with multinomial logit models and mixed logit models. Results show that the heuristic models are the best for all the decisions that are modeled. Validation tests are carried out through multi-agent simulation by comparing simulated spatio-temporal agent behavior with the observed pedestrian behavior. The predictions of heuristic models are slightly better than those of the multinomial logit models.
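The three classical heuristic rules extended in the paper can be stated compactly; a minimal sketch with hypothetical attributes and thresholds (the paper's actual contribution, threshold heterogeneity across decision makers, is not modelled here):

```python
# Sketch of the three classical heuristic rules named above, applied to
# options described by attribute scores. Attribute names and thresholds
# are hypothetical illustrations.

def conjunctive(option, thresholds):
    """Accept only if every attribute meets its threshold."""
    return all(option[a] >= t for a, t in thresholds.items())

def disjunctive(option, thresholds):
    """Accept if any single attribute meets its threshold."""
    return any(option[a] >= t for a, t in thresholds.items())

def lexicographic(options, attribute_order):
    """Compare on the most important attribute first; use the next
    attribute only to break ties."""
    for attr in attribute_order:
        best = max(o[attr] for o in options)
        options = [o for o in options if o[attr] == best]
        if len(options) == 1:
            break
    return options[0]

store = {"proximity": 0.8, "variety": 0.3}
thresholds = {"proximity": 0.5, "variety": 0.5}
print(conjunctive(store, thresholds))   # False: variety misses its threshold
print(disjunctive(store, thresholds))   # True: proximity alone is enough

stores = [{"proximity": 0.8, "variety": 0.3}, {"proximity": 0.8, "variety": 0.6}]
print(lexicographic(stores, ["proximity", "variety"]))  # tie broken by variety
```

None of the three rules trades attributes off against one another, which is what distinguishes them from the utility-maximizing logit models they are compared against.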

  11. The Priority Heuristic: Making Choices Without Trade-Offs

    PubMed Central

    Brandstätter, Eduard; Gigerenzer, Gerd; Hertwig, Ralph

    2010-01-01

    Bernoulli's framework of expected utility serves as a model for various psychological processes, including motivation, moral sense, attitudes, and decision making. To account for evidence at variance with expected utility, we generalize the framework of fast and frugal heuristics from inferences to preferences. The priority heuristic predicts (i) Allais' paradox, (ii) risk aversion for gains if probabilities are high, (iii) risk seeking for gains if probabilities are low (lottery tickets), (iv) risk aversion for losses if probabilities are low (buying insurance), (v) risk seeking for losses if probabilities are high, (vi) certainty effect, (vii) possibility effect, and (viii) intransitivities. We test how accurately the heuristic predicts people's choices, compared to previously proposed heuristics and three modifications of expected utility theory: security-potential/aspiration theory, transfer-of-attention-exchange model, and cumulative prospect theory. PMID:16637767
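For two-outcome gambles in the gain domain, the priority heuristic is fully specified by its examination order and aspiration levels; a sketch following the published description (encoding each gamble as (minimum gain, probability of minimum, maximum gain) is an implementation choice for this illustration):

```python
# Sketch of the priority heuristic for two simple gambles in the gain
# domain. Reasons are examined in a fixed order and examination stops
# at the first reason whose difference reaches the aspiration level
# (1/10 of the largest gain, or 0.1 on the probability scale).

def priority_heuristic(g1, g2):
    min1, p1, max1 = g1
    min2, p2, max2 = g2
    aspiration = max(max1, max2) / 10.0
    if abs(min1 - min2) >= aspiration:  # 1st reason: minimum gains
        return g1 if min1 > min2 else g2
    if abs(p1 - p2) >= 0.1:             # 2nd reason: probability of the minimum
        return g1 if p1 < p2 else g2    # prefer the lower chance of the minimum
    return g1 if max1 > max2 else g2    # 3rd reason: maximum gains

sure = (100, 1.0, 100)        # certain gain of 100
fifty_fifty = (0, 0.5, 200)   # 50% chance of 200, otherwise nothing
small = (1, 1.0, 1)           # certain gain of 1
lottery = (0, 0.9999, 10000)  # tiny chance of a large prize

print(priority_heuristic(sure, fifty_fifty))  # risk aversion: picks the sure 100
print(priority_heuristic(small, lottery))     # lottery ticket: picks the long shot
```

The two calls reproduce predictions (ii) and (iii) from the abstract without computing a single expectation, which is the sense in which the heuristic makes choices without trade-offs.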

  12. Testing Bayesian and heuristic predictions of mass judgments of colliding objects

    PubMed Central

    Sanborn, Adam N.

    2014-01-01

    Mass judgments of colliding objects have been used to explore people's understanding of the physical world because they are ecologically relevant, yet people display biases that are most easily explained by a small set of heuristics. Recent work has challenged the heuristic explanation, by producing the same biases from a model that copes with perceptual uncertainty by using Bayesian inference with a prior based on the correct combination rules from Newtonian mechanics (noisy Newton). Here I test the predictions of the leading heuristic model (Gilden and Proffitt, 1989) against the noisy Newton model using a novel manipulation of the standard mass judgment task: making one of the objects invisible post-collision. The noisy Newton model uses the remaining information to predict above-chance performance, while the leading heuristic model predicts chance performance when one or the other final velocity is occluded. An experiment using two different types of occlusion showed better-than-chance performance and response patterns that followed the predictions of the noisy Newton model. The results demonstrate that people can make sensible physical judgments even when information critical for the judgment is missing, and that a Bayesian model can serve as a guide in these situations. Possible algorithmic-level accounts of this task that more closely correspond to the noisy Newton model are explored. PMID:25206345

  13. The application of the heuristic-systematic processing model to treatment decision making about prostate cancer.

    PubMed

    Steginga, Suzanne K; Occhipinti, Stefano

    2004-01-01

    The study investigated the utility of the Heuristic-Systematic Processing Model as a framework for the investigation of patient decision making. A total of 111 men recently diagnosed with localized prostate cancer were assessed using Verbal Protocol Analysis and self-report measures. Study variables included men's use of nonsystematic and systematic information processing, desire for involvement in decision making, and the individual differences of health locus of control, tolerance of ambiguity, and decision-related uncertainty. Most men (68%) preferred that decision making be shared equally between them and their doctor. Men's use of the expert opinion heuristic was related to men's verbal reports of decisional uncertainty and having a positive orientation to their doctor and medical care; a desire for greater involvement in decision making was predicted by a high internal locus of health control. Trends were observed for systematic information processing to increase when the heuristic strategy used was negatively affect laden and when men were uncertain about the probabilities for cure and side effects. There was a trend for decreased systematic processing when the expert opinion heuristic was used. Findings were consistent with the Heuristic-Systematic Processing Model and suggest that this model has utility for future research in applied decision making about health.

  14. Testing process predictions of models of risky choice: a quantitative model comparison approach

    PubMed Central

    Pachur, Thorsten; Hertwig, Ralph; Gigerenzer, Gerd; Brandstätter, Eduard

    2013-01-01

    This article presents a quantitative model comparison contrasting the process predictions of two prominent views on risky choice. One view assumes a trade-off between probabilities and outcomes (or non-linear functions thereof) and the separate evaluation of risky options (expectation models). Another view assumes that risky choice is based on comparative evaluation, limited search, aspiration levels, and the forgoing of trade-offs (heuristic models). We derived quantitative process predictions for a generic expectation model and for a specific heuristic model, namely the priority heuristic (Brandstätter et al., 2006), and tested them in two experiments. The focus was on two key features of the cognitive process: acquisition frequencies (i.e., how frequently individual reasons are looked up) and direction of search (i.e., gamble-wise vs. reason-wise). In Experiment 1, the priority heuristic predicted direction of search better than the expectation model (although neither model predicted the acquisition process perfectly); acquisition frequencies, however, were inconsistent with both models. Additional analyses revealed that these frequencies were primarily a function of what Rubinstein (1988) called “similarity.” In Experiment 2, the quantitative model comparison approach showed that people seemed to rely more on the priority heuristic in difficult problems, but to make more trade-offs in easy problems. This finding suggests that risky choice may be based on a mental toolbox of strategies. PMID:24151472

  15. Fear and Loving in Las Vegas: Evolution, Emotion, and Persuasion.

    PubMed

    Griskevicius, Vladas; Goldstein, Noah J; Mortensen, Chad R; Sundie, Jill M; Cialdini, Robert B; Kenrick, Douglas T

    2009-06-01

    How do arousal-inducing contexts, such as frightening or romantic television programs, influence the effectiveness of basic persuasion heuristics? Different predictions are made by three theoretical models: A general arousal model predicts that arousal should increase effectiveness of heuristics; an affective valence model predicts that effectiveness should depend on whether the context elicits positive or negative affect; an evolutionary model predicts that persuasiveness should depend on both the specific emotion that is elicited and the content of the particular heuristic. Three experiments examined how fear-inducing versus romantic contexts influenced the effectiveness of two widely used heuristics: social proof (e.g., "most popular") and scarcity (e.g., "limited edition"). Results supported predictions from an evolutionary model, showing that fear can lead scarcity appeals to be counter-persuasive, and that romantic desire can lead social proof appeals to be counter-persuasive. The findings highlight how an evolutionary theoretical approach can lead to novel theoretical and practical marketing insights.

  16. An extended abstract: A heuristic repair method for constraint-satisfaction and scheduling problems

    NASA Technical Reports Server (NTRS)

    Minton, Steven; Johnston, Mark D.; Philips, Andrew B.; Laird, Philip

    1992-01-01

    The work described in this paper was inspired by a surprisingly effective neural network developed for scheduling astronomical observations on the Hubble Space Telescope. Our heuristic constraint satisfaction problem (CSP) method was distilled from an analysis of the network. In the process of carrying out the analysis, we discovered that the effectiveness of the network has little to do with its connectionist implementation. Furthermore, the ideas employed in the network can be implemented very efficiently within a symbolic CSP framework. The symbolic implementation is extremely simple. It also has the advantage that several different search strategies can be employed, although we have found that hill-climbing methods are particularly well-suited for the applications that we have investigated. We begin the paper with a brief review of the neural network. Following this, we describe our symbolic method for heuristic repair.
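
    The repair approach described above is essentially what became known as the min-conflicts method. A minimal sketch on the classic n-queens CSP (an illustration of the technique, not the Hubble scheduler itself):

```python
import random

def min_conflicts_nqueens(n, max_steps=10000, seed=0):
    # Heuristic repair: start from a complete but inconsistent
    # assignment, then repeatedly move a conflicted queen to the row
    # in its column that minimizes the number of conflicts.
    rng = random.Random(seed)
    rows = [rng.randrange(n) for _ in range(n)]

    def conflicts(col, row):
        # count queens attacking (col, row) along rows and diagonals
        return sum(1 for c in range(n) if c != col and
                   (rows[c] == row or abs(rows[c] - row) == abs(c - col)))

    for _ in range(max_steps):
        conflicted = [c for c in range(n) if conflicts(c, rows[c]) > 0]
        if not conflicted:
            return rows  # consistent solution found
        col = rng.choice(conflicted)
        scores = [conflicts(col, r) for r in range(n)]
        best = min(scores)
        rows[col] = rng.choice([r for r, s in enumerate(scores) if s == best])
    return None  # hill climbing exhausted its step budget
```

    Random tie-breaking among minimum-conflict rows is what keeps this simple hill climber from cycling; as the abstract notes, the symbolic formulation leaves the search strategy open.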

  17. Basic Research on Adaptive Model Algorithmic Control

    DTIC Science & Technology

    1985-12-01

    Richalet, J., A. Rault, J. L. Testud and J. Papon (1978). Model predictive heuristic control: applications to industrial processes, pp. 977-982.

  18. The heuristic-analytic theory of reasoning: extension and evaluation.

    PubMed

    Evans, Jonathan St B T

    2006-06-01

    An extensively revised heuristic-analytic theory of reasoning is presented incorporating three principles of hypothetical thinking. The theory assumes that reasoning and judgment are facilitated by the formation of epistemic mental models that are generated one at a time (singularity principle) by preconscious heuristic processes that contextualize problems in such a way as to maximize relevance to current goals (relevance principle). Analytic processes evaluate these models but tend to accept them unless there is good reason to reject them (satisficing principle). At a minimum, analytic processing of models is required so as to generate inferences or judgments relevant to the task instructions, but more active intervention may result in modification or replacement of default models generated by the heuristic system. Evidence for this theory is provided by a review of a wide range of literature on thinking and reasoning.

  19. Protein Structure Prediction with Evolutionary Algorithms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, W.E.; Krasnogor, N.; Pelta, D.A.

    1999-02-08

    Evolutionary algorithms have been successfully applied to a variety of molecular structure prediction problems. In this paper we reconsider the design of genetic algorithms that have been applied to a simple protein structure prediction problem. Our analysis considers the impact of several algorithmic factors for this problem: the conformational representation, the energy formulation, and the way in which infeasible conformations are penalized. Further, we empirically evaluate the impact of these factors on a small set of polymer sequences. Our analysis leads to specific recommendations for both GAs and other heuristic methods for solving PSP on the HP model.

  20. Probability or Reasoning: Current Thinking and Realistic Strategies for Improved Medical Decisions

    PubMed Central

    2017-01-01

    A prescriptive model approach in decision making could help achieve better diagnostic accuracy in clinical practice through methods that are less reliant on probabilistic assessments. Various prescriptive measures aimed at regulating factors that influence heuristics and clinical reasoning could support clinical decision-making process. Clinicians could avoid time-consuming decision-making methods that require probabilistic calculations. Intuitively, they could rely on heuristics to obtain an accurate diagnosis in a given clinical setting. An extensive literature review of cognitive psychology and medical decision-making theory was performed to illustrate how heuristics could be effectively utilized in daily practice. Since physicians often rely on heuristics in realistic situations, probabilistic estimation might not be a useful tool in everyday clinical practice. Improvements in the descriptive model of decision making (heuristics) may allow for greater diagnostic accuracy. PMID:29209469

  1. Probability or Reasoning: Current Thinking and Realistic Strategies for Improved Medical Decisions.

    PubMed

    Nantha, Yogarabindranath Swarna

    2017-11-01

    A prescriptive model approach in decision making could help achieve better diagnostic accuracy in clinical practice through methods that are less reliant on probabilistic assessments. Various prescriptive measures aimed at regulating factors that influence heuristics and clinical reasoning could support clinical decision-making process. Clinicians could avoid time-consuming decision-making methods that require probabilistic calculations. Intuitively, they could rely on heuristics to obtain an accurate diagnosis in a given clinical setting. An extensive literature review of cognitive psychology and medical decision-making theory was performed to illustrate how heuristics could be effectively utilized in daily practice. Since physicians often rely on heuristics in realistic situations, probabilistic estimation might not be a useful tool in everyday clinical practice. Improvements in the descriptive model of decision making (heuristics) may allow for greater diagnostic accuracy.

  2. FITPOP, a heuristic simulation model of population dynamics and genetics with special reference to fisheries

    USGS Publications Warehouse

    McKenna, James E.

    2000-01-01

    Although perceiving genetic differences and their effects on fish population dynamics is difficult, simulation models offer a means to explore and illustrate these effects. I partitioned the intrinsic rate of increase parameter of a simple logistic-competition model into three components, allowing specification of effects of relative differences in fitness and mortality, as well as finite rate of increase. This model was placed into an interactive, stochastic environment to allow easy manipulation of model parameters (FITPOP). Simulation results illustrated the effects of subtle differences in genetic and population parameters on total population size, overall fitness, and sensitivity of the system to variability. Several consequences of mixing genetically distinct populations were illustrated. For example, behaviors such as depression of population size after initial introgression and extirpation of native stocks due to continuous stocking of genetically inferior fish were reproduced. It also was shown that carrying capacity relative to the amount of stocking had an important influence on population dynamics. Uncertainty associated with parameter estimates reduced confidence in model projections. The FITPOP model provided a simple tool to explore population dynamics, which may assist in formulating management strategies and identifying research needs.
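
    The partitioned growth rate can be sketched as follows. The particular partition (finite rate of increase b, relative fitness f, mortality d) and the shared carrying capacity K are assumptions made for this illustration, not FITPOP's actual equations:

```python
def logistic_competition_step(n1, n2, params, K=1000.0, dt=0.1):
    # Two stocks sharing one carrying capacity K; each stock's
    # intrinsic rate of increase is partitioned as r = b*f - d
    # (finite rate of increase, relative fitness, mortality).
    (b1, f1, d1), (b2, f2, d2) = params
    r1 = b1 * f1 - d1
    r2 = b2 * f2 - d2
    crowding = (n1 + n2) / K  # joint density dependence
    n1 += dt * r1 * n1 * (1.0 - crowding)
    n2 += dt * r2 * n2 * (1.0 - crowding)
    return max(n1, 0.0), max(n2, 0.0)
```

    Iterating this step with a small fitness gap reproduces the qualitative behavior described above: the fitter stock gradually dominates as the joint population approaches K.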

  3. On Dual Processing and Heuristic Approaches to Moral Cognition

    ERIC Educational Resources Information Center

    Lapsley, Daniel K.; Hill, Patrick L.

    2008-01-01

    We examine the implications of dual-processing theories of cognition for the moral domain, with particular emphasis upon "System 1" theories: the Social Intuitionist Model (Haidt), moral heuristics (Sunstein), fast-and-frugal moral heuristics (Gigerenzer), schema accessibility (Lapsley & Narvaez) and moral expertise (Narvaez). We argue that these…

  4. The priority heuristic: making choices without trade-offs.

    PubMed

    Brandstätter, Eduard; Gigerenzer, Gerd; Hertwig, Ralph

    2006-04-01

    Bernoulli's framework of expected utility serves as a model for various psychological processes, including motivation, moral sense, attitudes, and decision making. To account for evidence at variance with expected utility, the authors generalize the framework of fast and frugal heuristics from inferences to preferences. The priority heuristic predicts (a) the Allais paradox, (b) risk aversion for gains if probabilities are high, (c) risk seeking for gains if probabilities are low (e.g., lottery tickets), (d) risk aversion for losses if probabilities are low (e.g., buying insurance), (e) risk seeking for losses if probabilities are high, (f) the certainty effect, (g) the possibility effect, and (h) intransitivities. The authors test how accurately the heuristic predicts people's choices, compared with previously proposed heuristics and 3 modifications of expected utility theory: security-potential/aspiration theory, transfer-of-attention-exchange model, and cumulative prospect theory. ((c) 2006 APA, all rights reserved).
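
    For two-outcome gambles with non-negative outcomes, the published reason ordering (minimum gain, probability of minimum gain, maximum gain, with a 1/10 aspiration level) can be sketched as below; the tuple encoding of a gamble is an assumption of this sketch:

```python
def priority_heuristic(gamble_a, gamble_b):
    # Each gamble: (min_outcome, p_min, max_outcome, p_max).
    # Reasons are examined in a fixed order; search stops as soon as a
    # difference reaches the aspiration level (1/10 of the largest
    # maximum gain, or 0.1 on the probability scale).
    min_a, pmin_a, max_a, _ = gamble_a
    min_b, pmin_b, max_b, _ = gamble_b
    aspiration = 0.1 * max(max_a, max_b)
    if abs(min_a - min_b) >= aspiration:
        return gamble_a if min_a > min_b else gamble_b
    if abs(pmin_a - pmin_b) >= 0.1:
        # a lower probability of the minimum outcome is better for gains
        return gamble_a if pmin_a < pmin_b else gamble_b
    return gamble_a if max_a > max_b else gamble_b
```

    With a sure 100 against a 1% shot at 10000, the first two reasons do not discriminate, so the choice falls to the maximum gain: risk seeking for low-probability gains, as in prediction (c).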

  5. Information Processing and Risk Perception: An Adaptation of the Heuristic-Systematic Model.

    ERIC Educational Resources Information Center

    Trumbo, Craig W.

    2002-01-01

    Describes heuristic-systematic information-processing model and risk perception--the two major conceptual areas of the analysis. Discusses the proposed model, describing the context of the data collections (public health communication involving cancer epidemiology) and providing the results of a set of three replications using the proposed model.…

  6. Asymptotically Exact Heuristics for Prime Divisors of the Sequence {a^k+b^k}_{k=1}^infty

    NASA Astrophysics Data System (ADS)

    Moree, Pieter

    2006-07-01

    Let N_{a,b}(x) count the number of primes p <= x with p dividing a^k + b^k for some k >= 1. It is known that N_{a,b}(x) ~ c(a,b) x/log x for some rational number c(a,b) that depends in a rather intricate way on a and b. A simple heuristic formula for N_{a,b}(x) is proposed and it is proved that it is asymptotically exact, i.e., has the same asymptotic behavior as N_{a,b}(x). Connections with Ramanujan sums and character sums are discussed.

  7. A Plea for Process in Personality Prevarication

    PubMed Central

    Kuncel, Nathan R.; Goldberg, Lewis R.; Kiger, Tom

    2011-01-01

    We make a series of recommendations for focusing research on personality test faking. Overall, we suggest that a focus on the response process test takers go through will accelerate our understanding of faking behavior. We argue that the decision-making process for faking must be simple and dependent on a modest set of decision rules or heuristics. The set of heuristics used by any given test taker will, in turn, be the result of test-taker goals and situational press. By focusing on what the test taker is doing we will avoid adopting the wrong frame of reference and, hopefully, make ever more rapid progress. PMID:21966260

  8. Slime moulds use heuristics based on within-patch experience to decide when to leave.

    PubMed

    Latty, Tanya; Beekman, Madeleine

    2015-04-15

    Animals foraging in patchy, non-renewing or slowly renewing environments must make decisions about how long to remain within a patch. Organisms can use heuristics ('rules of thumb') based on available information to decide when to leave the patch. Here, we investigated proximate patch-departure heuristics in two species of giant, brainless amoeba: the slime moulds Didymium bahiense and Physarum polycephalum. We explicitly tested the importance of information obtained through experience by eliminating chemosensory cues of patch quality. In P. polycephalum, patch departure was influenced by the consumption of high, and to a much lesser extent low, quality food items such that engulfing a food item increased patch-residency time. Physarum polycephalum also tended to forage for longer in darkened, 'safe' patches. In D. bahiense, engulfment of any food item increased patch residency irrespective of that food item's quality. Exposure to light had no effect on the patch-residency time of D. bahiense. Given that these organisms lack a brain, our results illustrate how the use of simple heuristics can give the impression that individuals make sophisticated foraging decisions. © 2015. Published by The Company of Biologists Ltd.

  9. A simple strategy for varying the restart parameter in GMRES(m)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baker, A H; Jessup, E R; Kolev, T V

    2007-10-02

    When solving a system of linear equations with the restarted GMRES method, a fixed restart parameter is typically chosen. We present numerical experiments that demonstrate the beneficial effects of changing the value of the restart parameter in each restart cycle on the total time to solution. We propose a simple strategy for varying the restart parameter and provide some heuristic explanations for its effectiveness based on analysis of the symmetric case.
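
    The idea can be sketched in NumPy as a restarted GMRES whose outer loop takes a different restart parameter each cycle; the Arnoldi-based cycle and the particular restart schedule below are illustrative, not the authors' strategy:

```python
import numpy as np

def gmres_cycle(A, b, x0, m):
    # One GMRES(m) cycle: build an m-step Arnoldi basis of the Krylov
    # subspace K_m(A, r0) and minimize the residual norm over it.
    r0 = b - A @ x0
    beta = np.linalg.norm(r0)
    if beta == 0.0:
        return x0
    n = len(b)
    V = np.zeros((n, m + 1))
    H = np.zeros((m + 1, m))
    V[:, 0] = r0 / beta
    for j in range(m):
        w = A @ V[:, j]
        for i in range(j + 1):          # modified Gram-Schmidt
            H[i, j] = V[:, i] @ w
            w = w - H[i, j] * V[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        if H[j + 1, j] < 1e-14:         # happy breakdown
            m = j + 1
            break
        V[:, j + 1] = w / H[j + 1, j]
    e1 = np.zeros(m + 1)
    e1[0] = beta
    y, *_ = np.linalg.lstsq(H[:m + 1, :m], e1, rcond=None)
    return x0 + V[:, :m] @ y

def gmres_varying_restart(A, b, restarts):
    # Outer loop: a different restart parameter m in each cycle.
    x = np.zeros_like(b)
    for m in restarts:
        x = gmres_cycle(A, b, x, m)
    return x
```

    Varying the schedule (e.g., [5, 10, 15] instead of a fixed m) changes which Krylov subspaces the cycles explore, which is the effect the experiments in the report measure.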

  10. Towards an Understanding of Instructional Design Heuristics: An Exploratory Delphi Study

    ERIC Educational Resources Information Center

    York, Cindy S.; Ertmer, Peggy A.

    2011-01-01

    Evidence suggests that experienced instructional designers often use heuristics and adapted models when engaged in the instructional design problem-solving process. This study used the Delphi technique to identify a core set of heuristics designers reported as being important to the success of the design process. The overarching purpose of the…

  11. Sophisticated approval voting, ignorance priors, and plurality heuristics: a behavioral social choice analysis in a Thurstonian framework.

    PubMed

    Regenwetter, Michel; Ho, Moon-Ho R; Tsetlin, Ilia

    2007-10-01

    This project reconciles historically distinct paradigms at the interface between individual and social choice theory, as well as between rational and behavioral decision theory. The authors combine a utility-maximizing prescriptive rule for sophisticated approval voting with the ignorance prior heuristic from behavioral decision research and two types of plurality heuristics to model approval voting behavior. When using a sincere plurality heuristic, voters simplify their decision process by voting for their single favorite candidate. When using a strategic plurality heuristic, voters strategically focus their attention on the 2 front-runners and vote for their preferred candidate among these 2. Using a hierarchy of Thurstonian random utility models, the authors implemented these different decision rules and tested them statistically on 7 real world approval voting elections. They cross-validated their key findings via a psychological Internet experiment. Although a substantial number of voters used the plurality heuristic in the real elections, they did so sincerely, not strategically. Moreover, even though Thurstonian models do not force such agreement, the results show, in contrast to common wisdom about social choice rules, that the sincere social orders by Condorcet, Borda, plurality, and approval voting are identical in all 7 elections and in the Internet experiment. PsycINFO Database Record (c) 2007 APA, all rights reserved.
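
    The two plurality heuristics described above reduce to one-line decision rules; the list-based preference encoding is an assumption of this sketch:

```python
def sincere_plurality(ranking):
    # Sincere plurality heuristic: approve only the single favorite.
    # ranking: candidates ordered from most to least preferred.
    return {ranking[0]}

def strategic_plurality(ranking, front_runners):
    # Strategic plurality heuristic: attend only to the two
    # front-runners and approve the preferred one among them.
    return {min(front_runners, key=ranking.index)}
```

    Both rules output an approval ballot (a set of approved candidates), which is what lets them be embedded alongside the sophisticated rule in a common Thurstonian framework.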

  12. Mathematical programming formulations for satellite synthesis

    NASA Technical Reports Server (NTRS)

    Bhasin, Puneet; Reilly, Charles H.

    1987-01-01

    The problem of satellite synthesis can be described as optimally allotting locations and sometimes frequencies and polarizations, to communication satellites so that interference from unwanted satellite signals does not exceed a specified threshold. In this report, mathematical programming models and optimization methods are used to solve satellite synthesis problems. A nonlinear programming formulation which is solved using Zoutendijk's method and a gradient search method is described. Nine mixed integer programming models are considered. Results of computer runs with these nine models and five geographically compatible scenarios are presented and evaluated. A heuristic solution procedure is also used to solve two of the models studied. Heuristic solutions to three large synthesis problems are presented. The results of our analysis show that the heuristic performs very well, both in terms of solution quality and solution time, on the two models to which it was applied. It is concluded that the heuristic procedure is the best of the methods considered for solving satellite synthesis problems.

  13. Heuristic and optimal policy computations in the human brain during sequential decision-making.

    PubMed

    Korn, Christoph W; Bach, Dominik R

    2018-01-23

    Optimal decisions across extended time horizons require value calculations over multiple probabilistic future states. Humans may circumvent such complex computations by resorting to easy-to-compute heuristics that approximate optimal solutions. To probe the potential interplay between heuristic and optimal computations, we develop a novel sequential decision-making task, framed as virtual foraging in which participants have to avoid virtual starvation. Rewards depend only on final outcomes over five-trial blocks, necessitating planning over five sequential decisions and probabilistic outcomes. Here, we report model comparisons demonstrating that participants primarily rely on the best available heuristic but also use the normatively optimal policy. FMRI signals in medial prefrontal cortex (MPFC) relate to heuristic and optimal policies and associated choice uncertainties. Crucially, reaction times and dorsal MPFC activity scale with discrepancies between heuristic and optimal policies. Thus, sequential decision-making in humans may emerge from integration between heuristic and optimal policies, implemented by controllers in MPFC.
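
    The interplay between an easy heuristic and the optimal policy can be illustrated with a toy version of such a five-trial foraging block; the energy cap, option set, and payoffs below are invented for the sketch, not the task's actual parameters:

```python
from functools import lru_cache

SAFE = ((-1, 1.0),)               # lose 1 energy unit for sure
RISKY = ((2, 0.5), (-3, 0.5))     # higher-variance gamble
OPTIONS = (SAFE, RISKY)
CAP, HORIZON = 10, 5              # energy ceiling, trials per block

@lru_cache(maxsize=None)
def survival(energy, t):
    # Optimal policy via backward induction: maximize the probability
    # of ending the block with positive energy (avoiding starvation).
    if energy <= 0:
        return 0.0
    if t == HORIZON:
        return 1.0
    return max(
        sum(p * survival(min(energy + d, CAP), t + 1) for d, p in opt)
        for opt in OPTIONS
    )

def heuristic_choice():
    # Easy-to-compute heuristic: take the option with the higher
    # expected value, ignoring the starvation threshold entirely.
    ev = [sum(p * d for d, p in opt) for opt in OPTIONS]
    return OPTIONS[ev.index(max(ev))]
```

    At high energy the safe option already guarantees survival while the expected-value heuristic still favors the gamble, so the two policies diverge; discrepancies of exactly this kind are what reaction times and dorsal MPFC activity tracked.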

  14. The development of adaptive decision making: Recognition-based inference in children and adolescents.

    PubMed

    Horn, Sebastian S; Ruggeri, Azzurra; Pachur, Thorsten

    2016-09-01

    Judgments about objects in the world are often based on probabilistic information (or cues). A frugal judgment strategy that utilizes memory (i.e., the ability to discriminate between known and unknown objects) as a cue for inference is the recognition heuristic (RH). The usefulness of the RH depends on the structure of the environment, particularly the predictive power (validity) of recognition. Little is known about developmental differences in use of the RH. In this study, the authors examined (a) to what extent children and adolescents recruit the RH when making judgments, and (b) around what age adaptive use of the RH emerges. Primary schoolchildren (M = 9 years), younger adolescents (M = 12 years), and older adolescents (M = 17 years) made comparative judgments in task environments with either high or low recognition validity. Reliance on the RH was measured with a hierarchical multinomial model. Results indicated that primary schoolchildren already made systematic use of the RH. However, only older adolescents adaptively adjusted their strategy use between environments and were better able to discriminate between situations in which the RH led to correct versus incorrect inferences. These findings suggest that the use of simple heuristics does not progress unidirectionally across development but strongly depends on the task environment, in line with the perspective of ecological rationality. Moreover, adaptive heuristic inference seems to require experience and a developed base of domain knowledge. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
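
    The RH itself is a one-step rule; a minimal sketch (the city names in the usage check are arbitrary examples):

```python
def recognition_heuristic(a, b, recognized):
    # If exactly one of two objects is recognized, infer that the
    # recognized one has the larger criterion value; if both or
    # neither are recognized, the heuristic does not discriminate.
    if (a in recognized) != (b in recognized):
        return a if a in recognized else b
    return None
```

    Its ecological rationality hinges on the recognition validity, the probability that the recognized object really does score higher on the criterion, which is exactly what differed between the high- and low-validity environments in the study.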

  15. Analysis of Levene's Test under Design Imbalance.

    ERIC Educational Resources Information Center

    Keyes, Tim K.; Levy, Martin S.

    1997-01-01

    H. Levene (1960) proposed a heuristic test for heteroscedasticity in the case of a balanced two-way layout, based on analysis of variance of absolute residuals. Conditions under which design imbalance affects the test's characteristics are identified, and a simple correction involving leverage is proposed. (SLD)

  16. Task as a Heuristic for Understanding Student Learning and Motivation.

    ERIC Educational Resources Information Center

    Blumenfeld, Phyllis C.; And Others

    1987-01-01

    Describes the cognitive characteristics and procedural "forms" associated with common school learning tasks. Illustrates how variations in these can affect student motivation and learning. Concludes that simple task content and unvaried procedures tend to result in limited thinkers and alienated workers. (JDH)

  17. Good-enough linguistic representations and online cognitive equilibrium in language processing.

    PubMed

    Karimi, Hossein; Ferreira, Fernanda

    2016-01-01

    We review previous research showing that representations formed during language processing are sometimes just "good enough" for the task at hand and propose the "online cognitive equilibrium" hypothesis as the driving force behind the formation of good-enough representations in language processing. Based on this view, we assume that the language comprehension system by default prefers to achieve as early as possible and remain as long as possible in a state of cognitive equilibrium where linguistic representations are successfully incorporated with existing knowledge structures (i.e., schemata) so that a meaningful and coherent overall representation is formed, and uncertainty is resolved or at least minimized. We also argue that the online equilibrium hypothesis is consistent with current theories of language processing, which maintain that linguistic representations are formed through a complex interplay between simple heuristics and deep syntactic algorithms and also theories that hold that linguistic representations are often incomplete and lacking in detail. We also propose a model of language processing that makes use of both heuristic and algorithmic processing, is sensitive to online cognitive equilibrium, and, we argue, is capable of explaining the formation of underspecified representations. We review previous findings providing evidence for underspecification in relation to this hypothesis and the associated language processing model and argue that most of these findings are compatible with them.

  18. Process-driven inference of biological network structure: feasibility, minimality, and multiplicity

    NASA Astrophysics Data System (ADS)

    Zeng, Chen

    2012-02-01

    For a given dynamic process, identifying the putative interaction networks that achieve it is the inference problem. In this talk, we address the computational complexity of the inference problem in the context of Boolean networks under the dominant-inhibition condition. First, we prove that the feasibility problem (is there a network that explains the dynamics?) can be solved in polynomial time. Second, while the minimality problem (what is the smallest network that explains the dynamics?) is shown to be NP-hard, a simple polynomial-time heuristic is shown to produce near-minimal solutions, as demonstrated by simulation. Third, the theoretical framework also leads to a fast polynomial-time heuristic to estimate the number of network solutions with reasonable accuracy. We apply these approaches to two simplified Boolean network models for the cell cycle process of budding yeast (Li 2004) and fission yeast (Davidich 2008). Our results demonstrate that each of these networks contains a giant backbone motif spanning all the network nodes that provides the desired main functionality, while the remaining edges in the network form smaller motifs whose role is to confer stability properties rather than provide function. Moreover, we show that the bioprocesses of these two cell cycle models differ considerably from a typically generated process and are intrinsically cascade-like.

  19. Minimizing makespan in a two-stage flow shop with parallel batch-processing machines and re-entrant jobs

    NASA Astrophysics Data System (ADS)

    Huang, J. D.; Liu, J. J.; Chen, Q. X.; Mao, N.

    2017-06-01

    Against a background of heat-treatment operations in mould manufacturing, a two-stage flow-shop scheduling problem is described for minimizing makespan with parallel batch-processing machines and re-entrant jobs. The weights and release dates of jobs are non-identical, but job processing times are equal. A mixed-integer linear programming model is developed and tested with small-scale scenarios. Given that the problem is NP-hard, three heuristic construction methods with polynomial complexity are proposed. The worst case of the new constructive heuristic is analysed in detail. A method for computing lower bounds is proposed to test heuristic performance. Heuristic efficiency is tested with sets of scenarios. Compared with the two improved heuristics, the performance of the new constructive heuristic is superior.

  20. An investigation of the use of temporal decomposition in space mission scheduling

    NASA Technical Reports Server (NTRS)

    Bullington, Stanley E.; Narayanan, Venkat

    1994-01-01

    This research involves an examination of techniques for solving scheduling problems in long-duration space missions. The mission timeline is broken up into several time segments, which are then scheduled incrementally. Three methods are presented for identifying the activities that are to be attempted within these segments. The first method is a mathematical model, which is presented primarily to illustrate the structure of the temporal decomposition problem. Since the mathematical model is bound to be computationally prohibitive for realistic problems, two heuristic assignment procedures are also presented. The first heuristic method is based on dispatching rules for activity selection, and the second heuristic assigns performances of a model evenly over timeline segments. These heuristics are tested using a sample Space Station mission and a Spacelab mission. The results are compared with those obtained by scheduling the missions without any problem decomposition. The applicability of this approach to large-scale mission scheduling problems is also discussed.

  1. Hyper-heuristics with low level parameter adaptation.

    PubMed

    Ren, Zhilei; Jiang, He; Xuan, Jifeng; Luo, Zhongxuan

    2012-01-01

    Recent years have witnessed the great success of hyper-heuristics applied to numerous real-world applications. Hyper-heuristics raise the generality of search methodologies by manipulating a set of low level heuristics (LLHs) to solve problems, and aim to automate the algorithm design process. However, those LLHs are usually parameterized, which may contradict the domain-independent motivation of hyper-heuristics. In this paper, we show how to automatically maintain low level parameters (LLPs) using a hyper-heuristic with LLP adaptation (AD-HH), and exemplify the feasibility of AD-HH by adaptively maintaining the LLPs for two hyper-heuristic models. Furthermore, to tackle the search space expansion due to the LLP adaptation, we apply a heuristic space reduction (SAR) mechanism to improve the AD-HH framework. The integration of the LLP adaptation and the SAR mechanism is able to explore the heuristic space more effectively and efficiently. To evaluate the performance of the proposed algorithms, we choose the p-median problem as a case study. The empirical results show that with the adaptation of the LLPs and the SAR mechanism, the proposed algorithms are able to achieve competitive results over the three heterogeneous classes of benchmark instances.
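
    A selection hyper-heuristic can be sketched as a score-based loop over LLHs; the reinforcement and decay constants here are arbitrary choices for the illustration, not AD-HH itself:

```python
import random

def hyper_heuristic(llhs, solution, cost, iters=1000, seed=0):
    # Minimal selection hyper-heuristic: keep a score per low level
    # heuristic (LLH), pick one in proportion to its score, accept
    # only improving moves, and reinforce LLHs that improved.
    rng = random.Random(seed)
    scores = [1.0] * len(llhs)
    best, best_cost = solution, cost(solution)
    for _ in range(iters):
        i = rng.choices(range(len(llhs)), weights=scores)[0]
        candidate = llhs[i](best, rng)
        c = cost(candidate)
        if c < best_cost:
            best, best_cost = candidate, c
            scores[i] += 1.0                        # reward success
        else:
            scores[i] = max(0.1, scores[i] * 0.99)  # mild decay
    return best, best_cost
```

    The loop never inspects what the LLHs do internally, which is the domain-independence the abstract refers to; LLP adaptation would additionally tune the parameters inside each LLH.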

  2. Model for the design of distributed data bases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ram, S.

    This research focuses on developing a model to solve the File Allocation Problem (FAP). The model integrates two major design issues, namely Concurrency Control and Data Distribution. The central node locking mechanism is incorporated in developing a nonlinear integer programming model. Two solution algorithms are proposed, one of which was implemented in FORTRAN V. The allocation of data bases and programs is examined using this heuristic. Several decision rules were also formulated based on the results of the heuristic. A second, more comprehensive heuristic was proposed, based on the knapsack problem. The development and implementation of this algorithm has been left as a topic for future research.

  3. A methodology to design heuristics for model selection based on the characteristics of data: Application to investigate when the Negative Binomial Lindley (NB-L) is preferred over the Negative Binomial (NB).

    PubMed

    Shirazi, Mohammadali; Dhavala, Soma Sekhar; Lord, Dominique; Geedipally, Srinivas Reddy

    2017-10-01

    Safety analysts usually use post-modeling methods, such as the Goodness-of-Fit statistics or the Likelihood Ratio Test, to decide between two or more competitive distributions or models. Such metrics require all competitive distributions to be fitted to the data before any comparisons can be made. Given the continuous growth in the introduction of new statistical distributions, choosing the best one using such post-modeling methods is not a trivial task, in addition to all the theoretical or numerical issues the analyst may face during the analysis. Furthermore, and most importantly, these measures or tests do not provide any intuition about why a specific distribution (or model) is preferred over another (Goodness-of-Logic). This paper examines these issues by proposing a methodology to design heuristics for model selection based on the characteristics of data, in terms of descriptive summary statistics, before fitting the models. The proposed methodology employs two analytic tools: (1) Monte-Carlo simulations and (2) machine learning classifiers, to design simple heuristics to predict the label of the 'most-likely-true' distribution for analyzing data. The proposed methodology was applied to investigate when the recently introduced Negative Binomial Lindley (NB-L) distribution is preferred over the Negative Binomial (NB) distribution. Heuristics were designed to select the 'most-likely-true' distribution between these two distributions, given a set of prescribed summary statistics of data. The proposed heuristics were successfully compared against classical tests for several real or observed datasets. Not only are they easy to use and free of any post-modeling inputs, but with these heuristics the analyst can also attain useful information about why the NB-L is preferred over the NB (or vice versa) when modeling data. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. Making decisions at the end of life when caring for a person with dementia: a literature review to explore the potential use of heuristics in difficult decision-making.

    PubMed

    Mathew, R; Davies, N; Manthorpe, J; Iliffe, S

    2016-07-19

    Decision-making, when providing care and treatment for a person with dementia at the end of life, can be complex and challenging. There is a lack of guidance available to support practitioners and family carers, and even those experienced in end of life dementia care report a lack of confidence in decision-making. It is thought that the use of heuristics (rules of thumb) may aid decision-making. The aim of this study is to identify whether heuristics are used in end of life dementia care, and if so, to identify the context in which they are being used. A narrative literature review was conducted taking a systematic approach to the search strategy, using the Centre for Reviews and Dissemination guidelines. Rapid appraisal methodology was used in order to source specific and relevant literature regarding the use of heuristics in end of life dementia care. A search using terms related to dementia, palliative care and decision-making was conducted across 4 English language electronic databases (MEDLINE, EMBASE, PsycINFO and CINAHL) in 2015. The search identified 12 papers that contained an algorithm, guideline, decision tool or set of principles that we considered compatible with heuristic decision-making. The papers addressed swallowing and feeding difficulties, the treatment of pneumonia, management of pain and agitation, rationalising medication, ending life-sustaining treatment, and ensuring a good death. The use of heuristics in palliative or end of life dementia care is not described in the research literature. However, this review identified important decision-making principles, which are largely a reflection of expert opinion. These principles may have the potential to be developed into simple heuristics that could be used in practice. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  5. Making decisions at the end of life when caring for a person with dementia: a literature review to explore the potential use of heuristics in difficult decision-making

    PubMed Central

    Mathew, R; Davies, N; Manthorpe, J; Iliffe, S

    2016-01-01

    Objective Decision-making, when providing care and treatment for a person with dementia at the end of life, can be complex and challenging. There is a lack of guidance available to support practitioners and family carers, and even those experienced in end of life dementia care report a lack of confidence in decision-making. It is thought that the use of heuristics (rules of thumb) may aid decision-making. The aim of this study is to identify whether heuristics are used in end of life dementia care, and if so, to identify the context in which they are being used. Design A narrative literature review was conducted taking a systematic approach to the search strategy, using the Centre for Reviews and Dissemination guidelines. Rapid appraisal methodology was used in order to source specific and relevant literature regarding the use of heuristics in end of life dementia care. Data sources A search using terms related to dementia, palliative care and decision-making was conducted across 4 English language electronic databases (MEDLINE, EMBASE, PsycINFO and CINAHL) in 2015. Results The search identified 12 papers that contained an algorithm, guideline, decision tool or set of principles that we considered compatible with heuristic decision-making. The papers addressed swallowing and feeding difficulties, the treatment of pneumonia, management of pain and agitation, rationalising medication, ending life-sustaining treatment, and ensuring a good death. Conclusions The use of heuristics in palliative or end of life dementia care is not described in the research literature. However, this review identified important decision-making principles, which are largely a reflection of expert opinion. These principles may have the potential to be developed into simple heuristics that could be used in practice. PMID:27436665

  6. Inhibitory mechanism of the matching heuristic in syllogistic reasoning.

    PubMed

    Tse, Ping Ping; Moreno Ríos, Sergio; García-Madruga, Juan Antonio; Bajo Molina, María Teresa

    2014-11-01

    A number of heuristic-based hypotheses have been proposed to explain how people solve syllogisms with automatic processes. In particular, the matching heuristic employs the congruency of the quantifiers in a syllogism—by matching the quantifier of the conclusion with those of the two premises. When the heuristic leads to an invalid conclusion, successful solving of these conflict problems requires the inhibition of automatic heuristic processing. Accordingly, if the automatic processing were based on processing the set of quantifiers, no semantic contents would be inhibited. The mental model theory, however, suggests that people reason using mental models, which always involves semantic processing. Therefore, whatever inhibition occurs in the processing implies the inhibition of the semantic contents. We manipulated the validity of the syllogism and the congruency of the quantifier of its conclusion with those of the two premises according to the matching heuristic. A subsequent lexical decision task (LDT) with related words in the conclusion was used to test any inhibition of the semantic contents after each syllogistic evaluation trial. In the LDT, the facilitation effect of semantic priming diminished after correctly solved conflict syllogisms (match-invalid or mismatch-valid), but was intact after no-conflict syllogisms. The results suggest the involvement of an inhibitory mechanism of semantic contents in syllogistic reasoning when there is a conflict between the output of the syntactic heuristic and actual validity. Our results do not support a uniquely syntactic process of syllogistic reasoning but fit with the predictions based on mental model theory. Copyright © 2014 Elsevier B.V. All rights reserved.
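The "matching" idea can be made concrete in a few lines. The sketch below follows one common formulation of the matching heuristic (respond with the most conservative quantifier occurring in the premises); the ranking and the quantifier labels are illustrative assumptions, not taken from the paper.

```python
# Quantifier conservativeness ordering assumed by this sketch:
# lower rank = more conservative (weaker) claim.
RANK = {"some_not": 0, "some": 1, "no": 2, "all": 3}

def matching_heuristic(premise_q1, premise_q2):
    """Predict the conclusion quantifier by matching the more
    conservative of the two premise quantifiers."""
    return min(premise_q1, premise_q2, key=RANK.get)

print(matching_heuristic("all", "some"))      # -> "some"
print(matching_heuristic("no", "some_not"))   # -> "some_not"
```

Because the rule inspects only the quantifier tokens, it is purely syntactic, which is exactly why correctly rejecting a matched-but-invalid conclusion requires inhibiting its output.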

  7. [Case finding in early prevention networks - a heuristic for ambulatory care settings].

    PubMed

    Barth, Michael; Belzer, Florian

    2016-06-01

    One goal of early prevention is the support of families with small children up to three years of age who are exposed to psychosocial risks. The identification of these cases is often complex and not well-directed, especially in the ambulatory care setting. The aim is to develop a feasible, empirically based strategy for case finding in ambulatory care. Based on the risk factors of postpartal depression, lack of maternal responsiveness, parental stress with regulation disorders, and poverty, a lexicographic and non-compensatory heuristic model with simple decision rules is constructed and empirically tested. To this end, the original data set from an evaluation of the pediatric documentary form on psychosocial issues of families with small children in well-child visits is reanalyzed. The first diagnostic step in the non-compensatory, hierarchical classification process is the assessment of postpartal depression, followed by maternal responsiveness, parental stress, and poverty. The classification model identifies 89.0 % of the cases from the original study. Compared to the original study, the decision process becomes clearer and more concise. The evidence-based, data-driven model exemplifies a strategy for the assessment of psychosocial risk factors in ambulatory care settings. It is based on four evidence-based risk factors and offers a quick and reliable classification. A further advantage of this model is that the diagnostic procedure stops as soon as a risk factor is identified, so that the counselling process can commence. For further validation of the model, studies in well-suited early prevention networks are needed.
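The lexicographic, non-compensatory process described above behaves like a fast-and-frugal tree: cues are checked in a fixed order and the first positive cue stops the search. A minimal sketch, with function name and return strings invented for illustration (they are not the study's instrument):

```python
def classify_family(postpartal_depression, low_responsiveness,
                    parental_stress, poverty):
    """Lexicographic, non-compensatory screen: check the four risk
    factors in fixed order and stop at the first one present."""
    if postpartal_depression:
        return "at risk: postpartal depression"
    if low_responsiveness:
        return "at risk: lack of maternal responsiveness"
    if parental_stress:
        return "at risk: parental stress"
    if poverty:
        return "at risk: poverty"
    return "no elevated risk identified"

print(classify_family(False, False, True, False))
# -> "at risk: parental stress"
```

"Non-compensatory" is visible in the control flow: a later cue can never override an earlier positive one, so no weighting or summing of cues is ever needed.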

  8. Sequence-based heuristics for faster annotation of non-coding RNA families.

    PubMed

    Weinberg, Zasha; Ruzzo, Walter L

    2006-01-01

    Non-coding RNAs (ncRNAs) are functional RNA molecules that do not code for proteins. Covariance Models (CMs) are a useful statistical tool to find new members of an ncRNA gene family in a large genome database, using both sequence and, importantly, RNA secondary structure information. Unfortunately, CM searches are extremely slow. Previously, we created rigorous filters, which provably sacrifice none of a CM's accuracy, while making searches significantly faster for virtually all ncRNA families. However, these rigorous filters make searches slower than heuristics could be. In this paper we introduce profile HMM-based heuristic filters. We show that their accuracy is usually superior to heuristics based on BLAST. Moreover, we compared our heuristics with those used in tRNAscan-SE, whose heuristics incorporate a significant amount of work specific to tRNAs, whereas our heuristics are generic to any ncRNA. Performance was roughly comparable, so we expect that our heuristics provide a high-quality solution that--unlike family-specific solutions--can scale to hundreds of ncRNA families. The source code is available under GNU Public License at the supplementary web site.

  9. Complex Chemical Reaction Networks from Heuristics-Aided Quantum Chemistry.

    PubMed

    Rappoport, Dmitrij; Galvin, Cooper J; Zubarev, Dmitry Yu; Aspuru-Guzik, Alán

    2014-03-11

    While structures and reactivities of many small molecules can be computed efficiently and accurately using quantum chemical methods, heuristic approaches remain essential for modeling complex structures and large-scale chemical systems. Here, we present a heuristics-aided quantum chemical methodology applicable to complex chemical reaction networks such as those arising in cell metabolism and prebiotic chemistry. Chemical heuristics offer an expedient way of traversing high-dimensional reactive potential energy surfaces and are combined here with quantum chemical structure optimizations, which yield the structures and energies of the reaction intermediates and products. Application of heuristics-aided quantum chemical methodology to the formose reaction reproduces the experimentally observed reaction products, major reaction pathways, and autocatalytic cycles.

  10. Fitness landscapes, heuristics and technological paradigms: A critique on random search models in evolutionary economics

    NASA Astrophysics Data System (ADS)

    Frenken, Koen

    2001-06-01

    The biological evolution of complex organisms, in which the functioning of genes is interdependent, has been analyzed as "hill-climbing" on NK fitness landscapes through random mutation and natural selection. In evolutionary economics, NK fitness landscapes have been used to simulate the evolution of complex technological systems containing elements that are interdependent in their functioning. In these models, economic agents randomly search for new technological designs by trial-and-error and run the risk of ending up in sub-optimal solutions due to interdependencies between the elements in a complex system. These models of random search are legitimate for reasons of modeling simplicity, but remain limited because they ignore the fact that agents can apply heuristics. A specific heuristic is one that sequentially optimises functions according to their ranking by users of the system. To model this heuristic, a generalized NK-model is developed. In this model, core elements that influence many functions can be distinguished from peripheral elements that affect few functions. The concept of paradigmatic search can then be analytically defined as search that leaves core elements intact while concentrating on improving functions by mutation of peripheral elements.
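The core/peripheral distinction can be sketched with a toy interdependence structure. This is a deliberately simplified stand-in for the generalized NK model: every function depends on the single core element plus one peripheral element, and paradigmatic search mutates peripherals only.

```python
import random

random.seed(0)
N = 6                       # binary design elements; element 0 is the "core"
# Each function's fitness contribution depends on the core element and
# exactly one peripheral element (a random lookup table per function).
tables = [{(a, b): random.random() for a in (0, 1) for b in (0, 1)}
          for _ in range(N - 1)]

def fitness(design):
    core = design[0]
    return sum(tables[i][(core, design[i + 1])] for i in range(N - 1)) / (N - 1)

# Paradigmatic search: keep the core element fixed, hill-climb on peripherals.
design = [random.randint(0, 1) for _ in range(N)]
start = design[:]
for _ in range(200):
    cand = design[:]
    j = random.randrange(1, N)          # never touch index 0 (the core)
    cand[j] = 1 - cand[j]
    if fitness(cand) > fitness(design):
        design = cand

print(design[0] == start[0])            # core element never changed
```

Allowing `j = 0` as well would turn this back into unconstrained random search, which is precisely the modeling choice the paper critiques.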

  11. A model for solving the prescribed burn planning problem.

    PubMed

    Rachmawati, Ramya; Ozlen, Melih; Reinke, Karin J; Hearne, John W

    2015-01-01

    The increasing frequency of destructive wildfires, with a consequent loss of life and property, has led to fire and land management agencies initiating extensive fuel management programs. This involves long-term planning of fuel reduction activities such as prescribed burning or mechanical clearing. In this paper, we propose a mixed integer programming (MIP) model that determines when and where fuel reduction activities should take place. The model takes into account multiple vegetation types in the landscape, their tolerance to frequency of fire events, and keeps track of the age of each vegetation class in each treatment unit. The objective is to minimise fuel load over the planning horizon. The complexity of scheduling fuel reduction activities has led to the introduction of sophisticated mathematical optimisation methods. While these approaches can provide optimum solutions, they can be computationally expensive, particularly for fuel management planning which extends across the landscape and spans long term planning horizons. This raises the question of how much better exact modelling approaches compare to simpler heuristic approaches in their solutions. To answer this question, the proposed model is run using an exact MIP (using a commercial MIP solver) and two heuristic approaches that decompose the problem into multiple single-period sub-problems. The Knapsack Problem (KP) approach, the first heuristic, solves the single-period problems using an exact MIP approach. The second heuristic approach solves the single-period sub-problem using a greedy heuristic. The three methods are compared in terms of model tractability, computational time and objective values. The model was tested using randomised data from 711 treatment units in the Barwon-Otway district of Victoria, Australia. Solutions for the exact MIP could be obtained only for planning horizons of up to 15 years using a standard implementation of CPLEX. Both heuristic approaches can solve significantly larger problems, involving 100-year or even longer planning horizons. Furthermore, there are no substantial differences in the solutions produced by the three approaches. It is concluded that for practical purposes a heuristic method is to be preferred to the exact MIP approach.
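A single-period greedy heuristic of the kind compared above can be sketched as follows. The load-growth model, the annual area budget, and the ranking rule are simplified assumptions for illustration, not the paper's formulation.

```python
# Greedy single-period heuristic: each year, burn the treatment units
# with the highest accumulated fuel load, subject to an annual area budget.
def plan_burns(loads, areas, budget_per_year, years, growth=1.0):
    schedule = []
    loads = list(loads)
    for _ in range(years):
        # Rank units by current fuel load, highest first.
        order = sorted(range(len(loads)), key=lambda i: -loads[i])
        burned, used = [], 0.0
        for i in order:
            if used + areas[i] <= budget_per_year:
                burned.append(i)
                used += areas[i]
        # Treated units reset to zero load; untreated units accumulate fuel.
        for i in range(len(loads)):
            loads[i] = 0.0 if i in burned else loads[i] + growth
        schedule.append(sorted(burned))
    return schedule

print(plan_burns([5.0, 2.0, 8.0], [10.0, 10.0, 10.0], 10.0, 3))
# -> [[2], [0], [1]]
```

Because each year is planned myopically, the heuristic scales linearly in the number of periods, which is why such decompositions handle 100-year horizons that defeat the monolithic MIP.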

  12. A two-stage stochastic rule-based model to determine pre-assembly buffer content

    NASA Astrophysics Data System (ADS)

    Gunay, Elif Elcin; Kula, Ufuk

    2018-01-01

    This study considers the instant decision-making needs of automobile manufacturers for resequencing vehicles before final assembly (FA). We propose a rule-based two-stage stochastic model to determine the number of spare vehicles that should be kept in the pre-assembly buffer to restore the sequence altered by paint defects and upstream department constraints. The first stage of the model decides the spare vehicle quantities, while the second stage recovers the scrambled sequence with respect to pre-defined rules. The problem is solved by the sample average approximation (SAA) algorithm. We conduct a numerical study to compare the solutions of the heuristic model with optimal ones and provide the following insights: (i) as the mismatch between the paint entrance and scheduled sequences decreases, the rule-based heuristic model recovers the scrambled sequence as well as the optimal resequencing model; (ii) the rule-based model is more sensitive to the mismatch between the paint entrance and scheduled sequences when recovering the scrambled sequence; (iii) as the defect rate increases, the difference in recovery effectiveness between the rule-based heuristic and optimal solutions increases; (iv) as buffer capacity increases, the recovery effectiveness of the optimization model outperforms the heuristic model; and (v) as expected, the rule-based model holds more inventory than the optimization model.

  13. Heuristic for Critical Machine Based a Lot Streaming for Two-Stage Hybrid Production Environment

    NASA Astrophysics Data System (ADS)

    Vivek, P.; Saravanan, R.; Chandrasekaran, M.; Pugazhenthi, R.

    2017-03-01

    Lot streaming in a hybrid flowshop (HFS) is encountered in many real-world problems. This paper deals with a heuristic approach to lot streaming based on critical-machine considerations for a two-stage hybrid flowshop. The first stage has two identical parallel machines and the second stage has only one machine. The second-stage machine is considered critical for valid reasons; problems of this kind are known to be NP-hard. A mathematical model was developed for the selected problem. The simulation modelling and analysis were carried out in Extend V6 software. A heuristic was developed for obtaining an optimal lot streaming schedule. Eleven cases of lot streaming were considered. The proposed heuristic was verified and validated by real-time simulation experiments. All possible lot streaming strategies, and all possible sequences under each strategy, were simulated and examined. The heuristic yielded an optimal schedule in all eleven cases. A procedure for identifying the best lot streaming strategy was suggested.

  14. Fast and Frugal Heuristics Are Plausible Models of Cognition: Reply to Dougherty, Franco-Watkins, and Thomas (2008)

    ERIC Educational Resources Information Center

    Gigerenzer, Gerd; Hoffrage, Ulrich; Goldstein, Daniel G.

    2008-01-01

    M. R. Dougherty, A. M. Franco-Watkins, and R. Thomas (2008) conjectured that fast and frugal heuristics need an automatic frequency counter for ordering cues. In fact, only a few heuristics order cues, and these orderings can arise from evolutionary, social, or individual learning, none of which requires automatic frequency counting. The idea that…

  15. The Memory State Heuristic: A Formal Model Based on Repeated Recognition Judgments

    ERIC Educational Resources Information Center

    Castela, Marta; Erdfelder, Edgar

    2017-01-01

    The recognition heuristic (RH) theory predicts that, in comparative judgment tasks, if one object is recognized and the other is not, the recognized one is chosen. The memory-state heuristic (MSH) extends the RH by assuming that choices are not affected by recognition judgments per se, but by the memory states underlying these judgments (i.e.,…

  16. Varying execution discipline to increase performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Campbell, P.L.; Maccabe, A.B.

    1993-12-22

    This research investigates the relationship between execution discipline and performance. The hypothesis has two parts: 1. Different execution disciplines exhibit different performance for different computations, and 2. These differences can be effectively predicted by heuristics. A machine model is developed that can vary its execution discipline. That is, the model can execute a given program using either the control-driven, data-driven or demand-driven execution discipline. This model is referred to as a "variable-execution-discipline" machine. The instruction set for the model is the Program Dependence Web (PDW). The first part of the hypothesis will be tested by simulating the execution of the machine model on a suite of computations, based on the Livermore Fortran Kernel (LFK) Test (a.k.a. the Livermore Loops), using all three execution disciplines. Heuristics are developed to predict relative performance. These heuristics predict (a) the execution time under each discipline for one iteration of each loop and (b) the number of iterations taken by that loop; then the heuristics use those predictions to develop a prediction for the execution of the entire loop. Similar calculations are performed for branch statements. The second part of the hypothesis will be tested by comparing the results of the simulated execution with the predictions produced by the heuristics. If the hypothesis is supported, then the door is open for the development of machines that can vary execution discipline to increase performance.

  17. Strategy selection as rational metareasoning.

    PubMed

    Lieder, Falk; Griffiths, Thomas L

    2017-11-01

    Many contemporary accounts of human reasoning assume that the mind is equipped with multiple heuristics that could be deployed to perform a given task. This raises the question of how the mind determines when to use which heuristic. To answer this question, we developed a rational model of strategy selection, based on the theory of rational metareasoning developed in the artificial intelligence literature. According to our model people learn to efficiently choose the strategy with the best cost-benefit tradeoff by learning a predictive model of each strategy's performance. We found that our model can provide a unifying explanation for classic findings from domains ranging from decision-making to arithmetic by capturing the variability of people's strategy choices, their dependence on task and context, and their development over time. Systematic model comparisons supported our theory, and 4 new experiments confirmed its distinctive predictions. Our findings suggest that people gradually learn to make increasingly more rational use of fallible heuristics. This perspective reconciles the 2 poles of the debate about human rationality by integrating heuristics and biases with learning and rationality. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
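The central idea, learning a predictive model of each strategy's performance and choosing by cost-benefit trade-off, can be sketched as a simple bandit-style learner. The two strategies, their accuracies, and their costs below are invented for illustration; the paper's rational-metareasoning model is considerably richer.

```python
import random

random.seed(1)

# Two fallible strategies with different accuracy/cost trade-offs.
strategies = {
    "fast_heuristic": {"p_correct": 0.7, "cost": 0.1},
    "slow_deliberate": {"p_correct": 0.95, "cost": 0.5},
}
value = {name: 0.0 for name in strategies}   # learned expected net payoff
counts = {name: 0 for name in strategies}

def choose(eps=0.1):
    if random.random() < eps:                # occasional exploration
        return random.choice(list(strategies))
    return max(value, key=value.get)         # else exploit best predicted payoff

for _ in range(2000):
    name = choose()
    s = strategies[name]
    reward = (1.0 if random.random() < s["p_correct"] else 0.0) - s["cost"]
    counts[name] += 1
    # Incremental update of the strategy's predicted performance.
    value[name] += (reward - value[name]) / counts[name]

print(max(value, key=value.get))
```

With these illustrative numbers the cheap heuristic has the higher net payoff (0.7 - 0.1 = 0.6 versus 0.95 - 0.5 = 0.45), so the learner comes to prefer it: "rational use of a fallible heuristic" in miniature.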

  18. Fast and Frugal Framing Effects?

    ERIC Educational Resources Information Center

    Mccloy, Rachel; Beaman, C. Philip; Frosch, Caren A.; Goddard, Kate

    2010-01-01

    Using 3 experiments, we examine whether simple pairwise comparison judgments, involving the "recognition heuristic" (Goldstein & Gigerenzer, 2002), are sensitive to implicit cues to the nature of the comparison required. In Experiments 1 and 2, we show that participants frequently choose the recognized option of a pair if asked to make "larger"…

  19. Is Popper's Falsificationist Heuristic a Helpful Resource for Developing Critical Thinking?

    ERIC Educational Resources Information Center

    Lam, Chi-Ming

    2007-01-01

    Based on a rather simple thesis that we can learn from our mistakes, Karl Popper developed a falsificationist epistemology in which knowledge grows through falsifying, or criticizing, our theories. According to him, knowledge, especially scientific knowledge, progresses through conjectures (i.e. tentative solutions to problems) that are controlled…

  20. Diagnosis of Spinal Lesions Using Heuristic and Pharmacokinetic Parameters Measured by Dynamic Contrast-Enhanced MRI.

    PubMed

    Lang, Ning; Yuan, Huishu; Yu, Hon J; Su, Min-Ying

    2017-07-01

    This study aimed to evaluate the diagnostic performance of dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) in differentiation of four spinal lesions by using heuristic and pharmacokinetic parameters analyzed from DCE signal intensity time course. DCE-MRI of 62 subjects with confirmed myeloma (n = 9), metastatic cancer (n = 22), lymphoma (n = 7), and inflammatory tuberculosis (TB) (n = 24) in the spine were analyzed retrospectively. The region of interest was placed on strongly enhanced tissues. The DCE time course was categorized as the "wash-out," "plateau," or "persistent enhancement" pattern. The maximum enhancement, steepest wash-in enhancement, and wash-out slope using the signal intensity at 67 seconds after contrast injection as reference were measured. The Tofts 2-compartmental pharmacokinetic model was applied to obtain Ktrans and kep. Pearson correlation between heuristic and pharmacokinetic parameters was evaluated, and receiver operating characteristic curve analysis was performed for pairwise group differentiation. The mean wash-out slope was -22% ± 10% for myeloma, 1% ± 0.4% for metastatic cancer, 3% ± 3% for lymphoma, and 7% ± 10% for TB, and it could significantly distinguish myeloma from metastasis (area under the curve [AUC] = 0.884), lymphoma (AUC = 1.0), and TB (AUC = 1.0) with P = .001, and distinguish metastasis from TB (AUC = 0.741) with P = .005. The kep and wash-out slope were highly correlated (r = 0.92), and they showed a similar diagnostic performance. The Ktrans was significantly correlated with the maximum enhancement (r = 0.71) and the steepest wash-in enhancement (r = 0.85), but they had inferior diagnostic performance compared to the wash-out slope. DCE-MRI may provide additional diagnostic information, and a simple wash-out slope had the best diagnostic performance. The heuristic and pharmacokinetic parameters were highly correlated. Copyright © 2017. Published by Elsevier Inc.

  1. Reconciliation of Decision-Making Heuristics Based on Decision Trees Topologies and Incomplete Fuzzy Probabilities Sets

    PubMed Central

    Doubravsky, Karel; Dohnal, Mirko

    2015-01-01

    Complex decision making tasks of different natures, e.g. economics, safety engineering, ecology and biology, are based on vague, sparse, partially inconsistent and subjective knowledge. Moreover, decision making economists / engineers are usually not willing to invest too much time into the study of complex formal theories. They require decisions that can be (re)checked by human-like common sense reasoning. One important problem related to realistic decision making tasks is the incomplete data sets required by the chosen decision making algorithm. This paper presents a relatively simple algorithm for generating some missing III (input information items), using mainly decision tree topologies, and integrating them into incomplete data sets. The algorithm is based on easy-to-understand heuristics, e.g. a longer decision tree sub-path is less probable. This heuristic can solve decision problems under total ignorance, i.e. when the decision tree topology is the only information available. In practice, however, isolated information items, e.g. some vaguely known probabilities (e.g. fuzzy probabilities), are usually available. This means that a realistic problem is analysed under partial ignorance. The proposed algorithm reconciles topology-related heuristics and additional fuzzy sets using fuzzy linear programming. The case study, represented by a tree with six lotteries and one fuzzy probability, is presented in detail. PMID:26158662

  2. Reconciliation of Decision-Making Heuristics Based on Decision Trees Topologies and Incomplete Fuzzy Probabilities Sets.

    PubMed

    Doubravsky, Karel; Dohnal, Mirko

    2015-01-01

    Complex decision making tasks of different natures, e.g. economics, safety engineering, ecology and biology, are based on vague, sparse, partially inconsistent and subjective knowledge. Moreover, decision making economists / engineers are usually not willing to invest too much time into the study of complex formal theories. They require decisions that can be (re)checked by human-like common sense reasoning. One important problem related to realistic decision making tasks is the incomplete data sets required by the chosen decision making algorithm. This paper presents a relatively simple algorithm for generating some missing III (input information items), using mainly decision tree topologies, and integrating them into incomplete data sets. The algorithm is based on easy-to-understand heuristics, e.g. a longer decision tree sub-path is less probable. This heuristic can solve decision problems under total ignorance, i.e. when the decision tree topology is the only information available. In practice, however, isolated information items, e.g. some vaguely known probabilities (e.g. fuzzy probabilities), are usually available. This means that a realistic problem is analysed under partial ignorance. The proposed algorithm reconciles topology-related heuristics and additional fuzzy sets using fuzzy linear programming. The case study, represented by a tree with six lotteries and one fuzzy probability, is presented in detail.
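The topology heuristic itself, that a longer decision-tree sub-path is less probable, can be sketched by weighting each leaf inversely to its path length and normalizing. This toy version deliberately omits the paper's reconciliation with fuzzy probabilities via fuzzy linear programming; the example tree is invented.

```python
from fractions import Fraction

# Heuristic: a longer decision-tree sub-path is less probable. Give each
# leaf a weight 1/length(path), then normalize the weights to probabilities.
def path_probabilities(paths):
    weights = {leaf: Fraction(1, len(path)) for leaf, path in paths.items()}
    total = sum(weights.values())
    return {leaf: w / total for leaf, w in weights.items()}

tree = {
    "A": ["root", "x", "A"],          # path of length 3
    "B": ["root", "x", "y", "B"],     # path of length 4
    "C": ["root", "C"],               # path of length 2
}
probs = path_probabilities(tree)
print(probs["C"] > probs["A"] > probs["B"])   # shorter path -> higher probability
# -> True
```

Exact rational arithmetic (`Fraction`) keeps the normalized probabilities summing to exactly 1, which makes the monotonicity of the heuristic easy to verify.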

  3. Beyond Decision Making: Cultural Ideology as Heuristic Paradigmatic Models.

    ERIC Educational Resources Information Center

    Whitley, L. Darrell

    A paradigmatic model of cultural ideology provides a context for understanding the relationship between decision-making and personal and cultural rationality. Cultural rules or heuristics exist which indicate that many decisions can be made on the basis of established strategy rather than continual analytical calculations. When an optimal solution…

  4. Three hybridization models based on local search scheme for job shop scheduling problem

    NASA Astrophysics Data System (ADS)

    Balbi Fraga, Tatiana

    2015-05-01

    This work presents three different hybridization models based on the general schema of Local Search Heuristics, named Hybrid Successive Application, Hybrid Neighborhood, and Hybrid Improved Neighborhood. Although similar approaches may already have been presented in the literature in other contexts, in this work these models are applied to analyze the solution of the job shop scheduling problem with the heuristics Taboo Search and Particle Swarm Optimization. Besides, we investigate some aspects that must be considered in order to achieve better solutions than those obtained by the original heuristics. The results demonstrate that the algorithms derived from these three hybrid models are more robust than the original algorithms and able to obtain better results than those found by Taboo Search alone.

  5. Fluency heuristic: a model of how the mind exploits a by-product of information retrieval.

    PubMed

    Hertwig, Ralph; Herzog, Stefan M; Schooler, Lael J; Reimer, Torsten

    2008-09-01

    Boundedly rational heuristics for inference can be surprisingly accurate and frugal for several reasons. They can exploit environmental structures, co-opt complex capacities, and elude effortful search by exploiting information that automatically arrives on the mental stage. The fluency heuristic is a prime example of a heuristic that makes the most of an automatic by-product of retrieval from memory, namely, retrieval fluency. In 4 experiments, the authors show that retrieval fluency can be a proxy for real-world quantities, that people can discriminate between two objects' retrieval fluencies, and that people's inferences are in line with the fluency heuristic (in particular fast inferences) and with experimentally manipulated fluency. The authors conclude that the fluency heuristic may be one tool in the mind's repertoire of strategies that artfully probes memory for encapsulated frequency information that can veridically reflect statistical regularities in the world. (c) 2008 APA, all rights reserved.
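A minimal sketch of the fluency heuristic as described: compare two retrieval times and infer the faster-retrieved object, falling back when the difference is too small to discriminate. The 100 ms threshold and the return conventions are illustrative assumptions.

```python
def fluency_heuristic(rt_a, rt_b, threshold_ms=100):
    """Infer which of two recognized objects has the larger criterion
    value: pick the one retrieved faster, provided the retrieval-time
    difference is discriminable; otherwise signal that a fallback
    (e.g. guessing or knowledge-based inference) is needed."""
    if abs(rt_a - rt_b) < threshold_ms:
        return None          # fluencies indistinguishable -> fall back
    return "A" if rt_a < rt_b else "B"

print(fluency_heuristic(350, 620))   # -> "A"
print(fluency_heuristic(400, 450))   # -> None (difference below threshold)
```

The `None` branch matters: the heuristic is only applicable when the two fluencies can actually be told apart, which is one of the conditions the experiments test.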

  6. Biology as population dynamics: heuristics for transmission risk.

    PubMed

    Keebler, Daniel; Walwyn, David; Welte, Alex

    2013-02-01

    Population-type models, accounting for phenomena such as population lifetimes, mixing patterns, recruitment patterns, genetic evolution and environmental conditions, can be usefully applied to the biology of HIV infection and viral replication. A simple dynamic model can explore the effect of a vaccine-like stimulus on the mortality and infectiousness, which formally looks like fertility, of invading virions; the mortality of freshly infected cells; and the availability of target cells, all of which impact on the probability of infection. Variations on this model could capture the importance of the timing and duration of different key events in viral transmission, and hence be applied to questions of mucosal immunology. The dynamical insights and assumptions of such models are compatible with the continuum of between- and within-individual risks in sexual violence and may be helpful in making sense of the sparse data available on the association between HIV transmission and sexual violence. © 2012 John Wiley & Sons A/S.

  7. Parameter estimation with bio-inspired meta-heuristic optimization: modeling the dynamics of endocytosis.

    PubMed

    Tashkova, Katerina; Korošec, Peter; Silc, Jurij; Todorovski, Ljupčo; Džeroski, Sašo

    2011-10-11

    We address the task of parameter estimation in models of the dynamics of biological systems based on ordinary differential equations (ODEs) from measured data, where the models are typically non-linear and have many parameters, the measurements are imperfect due to noise, and the studied system can often be only partially observed. A representative task is to estimate the parameters in a model of the dynamics of endocytosis, i.e., endosome maturation, reflected in a cut-out switch transition between the Rab5 and Rab7 domain protein concentrations, from experimental measurements of these concentrations. The general parameter estimation task and the specific instance considered here are challenging optimization problems, calling for the use of advanced meta-heuristic optimization methods, such as evolutionary or swarm-based methods. We apply three global-search meta-heuristic algorithms for numerical optimization, i.e., differential ant-stigmergy algorithm (DASA), particle-swarm optimization (PSO), and differential evolution (DE), as well as a local-search derivative-based algorithm 717 (A717) to the task of estimating parameters in ODEs. We evaluate their performance on the considered representative task along a number of metrics, including the quality of reconstructing the system output and the complete dynamics, as well as the speed of convergence, both on real-experimental data and on artificial pseudo-experimental data with varying amounts of noise. We compare the four optimization methods under a range of observation scenarios, where data of different completeness and accuracy of interpretation are given as input. Overall, the global meta-heuristic methods (DASA, PSO, and DE) clearly and significantly outperform the local derivative-based method (A717). Among the three meta-heuristics, differential evolution (DE) performs best in terms of the objective function, i.e., reconstructing the output, and in terms of convergence. 
These results hold for both real and artificial data, for all observability scenarios considered, and for all amounts of noise added to the artificial data. In sum, the meta-heuristic methods considered are suitable for estimating the parameters in the ODE model of the dynamics of endocytosis under a range of conditions: with the model and conditions being representative of parameter estimation tasks in ODE models of biochemical systems, our results clearly highlight the promise of bio-inspired meta-heuristic methods for parameter estimation in dynamic system models within systems biology.
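
    A minimal sketch of one of the methods named above, differential evolution (the classic DE/rand/1/bin scheme), applied to a toy parameter estimation task: recovering the rate and initial value of a simple exponential-decay ODE from simulated data. The one-equation model, bounds, and control parameters are illustrative stand-ins, not the paper's Rab5/Rab7 endocytosis model.

```python
import random

def simulate(k, y0, dt=0.1, steps=50):
    """Forward-Euler integration of the stand-in ODE dy/dt = -k*y."""
    y, traj = y0, []
    for _ in range(steps):
        y += dt * (-k * y)
        traj.append(y)
    return traj

def sse(params, data):
    """Sum of squared errors between simulated and observed trajectories."""
    return sum((a - b) ** 2 for a, b in zip(simulate(*params), data))

def differential_evolution(obj, bounds, np_=20, F=0.6, CR=0.9, gens=100, seed=1):
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(np_)]
    for _ in range(gens):
        for i, x in enumerate(pop):
            a, b, c = rng.sample([p for j, p in enumerate(pop) if j != i], 3)
            # DE/rand/1 mutation with binomial crossover:
            trial = [ai + F * (bi - ci) if rng.random() < CR else xi
                     for xi, ai, bi, ci in zip(x, a, b, c)]
            trial = [min(max(t, lo), hi) for t, (lo, hi) in zip(trial, bounds)]
            if obj(trial) <= obj(x):        # greedy selection
                pop[i] = trial
    return min(pop, key=obj)

true_k, true_y0 = 0.8, 5.0
data = simulate(true_k, true_y0)
k_hat, y0_hat = differential_evolution(lambda p: sse(p, data),
                                       bounds=[(0.0, 2.0), (1.0, 10.0)])
```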

  8. Parameter estimation with bio-inspired meta-heuristic optimization: modeling the dynamics of endocytosis

    PubMed Central

    2011-01-01

    Background We address the task of parameter estimation in models of the dynamics of biological systems based on ordinary differential equations (ODEs) from measured data, where the models are typically non-linear and have many parameters, the measurements are imperfect due to noise, and the studied system can often be only partially observed. A representative task is to estimate the parameters in a model of the dynamics of endocytosis, i.e., endosome maturation, reflected in a cut-out switch transition between the Rab5 and Rab7 domain protein concentrations, from experimental measurements of these concentrations. The general parameter estimation task and the specific instance considered here are challenging optimization problems, calling for the use of advanced meta-heuristic optimization methods, such as evolutionary or swarm-based methods. Results We apply three global-search meta-heuristic algorithms for numerical optimization, i.e., differential ant-stigmergy algorithm (DASA), particle-swarm optimization (PSO), and differential evolution (DE), as well as a local-search derivative-based algorithm 717 (A717) to the task of estimating parameters in ODEs. We evaluate their performance on the considered representative task along a number of metrics, including the quality of reconstructing the system output and the complete dynamics, as well as the speed of convergence, both on real-experimental data and on artificial pseudo-experimental data with varying amounts of noise. We compare the four optimization methods under a range of observation scenarios, where data of different completeness and accuracy of interpretation are given as input. Conclusions Overall, the global meta-heuristic methods (DASA, PSO, and DE) clearly and significantly outperform the local derivative-based method (A717). Among the three meta-heuristics, differential evolution (DE) performs best in terms of the objective function, i.e., reconstructing the output, and in terms of convergence. 
These results hold for both real and artificial data, for all observability scenarios considered, and for all amounts of noise added to the artificial data. In sum, the meta-heuristic methods considered are suitable for estimating the parameters in the ODE model of the dynamics of endocytosis under a range of conditions: with the model and conditions being representative of parameter estimation tasks in ODE models of biochemical systems, our results clearly highlight the promise of bio-inspired meta-heuristic methods for parameter estimation in dynamic system models within systems biology. PMID:21989196

  9. Spatial competition and price formation

    NASA Astrophysics Data System (ADS)

    Nagel, Kai; Shubik, Martin; Paczuski, Maya; Bak, Per

    2000-12-01

    We look at price formation in a retail setting, that is, companies set prices, and consumers either accept the prices or go someplace else. In contrast to most other models in this context, we use a two-dimensional spatial structure for information transmission: consumers can only learn from their nearest neighbors. Many aspects of this can be understood in terms of generalized evolutionary dynamics. Consequently, we first look at spatial competition and cluster formation without price. This leads to establishment size distributions, which we compare to reality. After some theoretical considerations, which at least heuristically explain our simulation results, we finally return to price formation, where we demonstrate that our simple model, with nearly no organized planning or rationality on the part of any of the agents, indeed leads to an economically plausible price.

  10. A Heuristic Approach to Examining Volatile Equilibrium at Titan's Surface

    NASA Technical Reports Server (NTRS)

    Samuelson, Robert E.

    1999-01-01

    R. D. Lorenz, J. I. Lunine, and C. P. McKay have shown in a manuscript accepted for publication that, for a given ethane abundance and surface temperature, the nitrogen and methane abundances in Titan's atmosphere can be calculated, yielding a surface pressure that can be compared with the observed value. This is potentially a very valuable tool for examining the evolution of Titan's climatology. Its validity does depend on two important assumptions, however: 1) that the atmosphere of Titan is in global radiative equilibrium, and 2) that volatiles present are in vapor equilibrium with the surface. The former assumption has been shown to be likely, but the latter has not. Water vapor in the Earth's atmosphere, in fact, is generally not very close to equilibrium in a global sense. In the present work a heuristic approach is used to examine the likelihood that methane vapor is in equilibrium with Titan's surface. Plausible climate scenarios are examined that are consistent with methane vapor abundances derived from Voyager IRIS data. Simple precipitation and surface diffusion models are incorporated into the analysis. It is tentatively inferred that methane may be in surface equilibrium near the poles, but that equilibrium at low latitudes is more difficult to establish.

  11. Combining local search with co-evolution in a remarkably simple way

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boettcher, S.; Percus, A.

    2000-05-01

    The authors explore a new general-purpose heuristic for finding high-quality solutions to hard optimization problems. The method, called extremal optimization, is inspired by self-organized criticality, a concept introduced to describe emergent complexity in physical systems. In contrast to genetic algorithms, which operate on an entire gene pool of possible solutions, extremal optimization successively replaces extremely undesirable elements of a single sub-optimal solution with new, random ones. Large fluctuations, or avalanches, ensue that efficiently explore many local optima. Drawing upon models used to simulate far-from-equilibrium dynamics, extremal optimization complements heuristics inspired by equilibrium statistical physics, such as simulated annealing. With only one adjustable parameter, its performance has proved competitive with more elaborate methods, especially near phase transitions. Phase transitions are found in many combinatorial optimization problems, and have been conjectured to occur in the region of parameter space containing the hardest instances. We demonstrate how extremal optimization can be implemented for a variety of hard optimization problems. We believe that this will be a useful tool in the investigation of phase transitions in combinatorial optimization, thereby helping to elucidate the origin of computational complexity.
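
    The core loop of extremal optimization is short enough to sketch: evaluate a local fitness for each element of the current solution, replace the worst element with a random value, and track the best configuration seen. The toy objective below (driving a bit string toward alternation) is an illustrative stand-in, not one of the problems studied in the report.

```python
import random

def local_fitness(s, i):
    """Number of neighbors bit i differs from (want an alternating string)."""
    n = len(s)
    return sum(s[i] != s[j] for j in (i - 1, i + 1) if 0 <= j < n)

def cost(s):
    """Global cost: count of adjacent equal pairs (zero when alternating)."""
    return sum(s[i] == s[i + 1] for i in range(len(s) - 1))

def extremal_optimization(n=32, steps=5000, seed=0):
    rng = random.Random(seed)
    s = [rng.randint(0, 1) for _ in range(n)]
    best, best_cost = s[:], cost(s)
    for _ in range(steps):
        # Select the element with the worst local fitness (random tie-break)
        worst = min(range(n), key=lambda i: (local_fitness(s, i), rng.random()))
        s[worst] = rng.randint(0, 1)        # replace it with a random value
        if cost(s) < best_cost:
            best, best_cost = s[:], cost(s)
    return best, best_cost

best, c = extremal_optimization()
```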

  12. Approaches to eliminate waste and reduce cost for recycling glass.

    PubMed

    Chao, Chien-Wen; Liao, Ching-Jong

    2011-12-01

    In recent years, the issue of environmental protection has received considerable attention. This paper adds to the literature by investigating a scheduling problem in the manufacturing of a glass recycling factory in Taiwan. The objective is to minimize the sum of the total holding cost and loss cost. We first represent the problem as an integer programming (IP) model, and then develop two heuristics based on the IP model to find near-optimal solutions for the problem. To validate the proposed heuristics, comparisons between optimal solutions from the IP model and solutions from the current method are conducted. The comparisons involve two problem sizes, small and large, where the small problems range from 15 to 45 jobs, and the large problems from 50 to 100 jobs. Finally, a genetic algorithm is applied to evaluate the proposed heuristics. Computational experiments show that the proposed heuristics can find good solutions in a reasonable time for the considered problem. Copyright © 2011 Elsevier Ltd. All rights reserved.

  13. Whole School English Learner Reform: A Heuristic Approach to Professional Learning in Middle Schools

    ERIC Educational Resources Information Center

    Plough, Bobbie; Garcia, Ray

    2015-01-01

    This work highlights a heuristic model for professional learning while examining the implementation of a reform initiative. The researchers used longitudinal data collected from surveys to develop and fit a model of professional learning where patterns of interaction among teachers changed the discussion about English learner instruction. Data…

  14. Academic Freedom in Classroom Speech: A Heuristic Model for U.S. Catholic Higher Education

    ERIC Educational Resources Information Center

    Jacobs, Richard M.

    2010-01-01

    As the nation's Catholic universities and colleges continually clarify their identity, this article examines academic freedom in classroom speech, offering a heuristic model for use as board members, academic administrators, and faculty leaders discuss, evaluate, and judge allegations of misconduct in classroom speech. Focusing upon the practice…

  15. The fallacy of financial heuristics.

    PubMed

    Langabeer, James

    2007-01-01

    In turbulent times, the financial policies and decisions about cash and debt make or break hospitals' financial condition. Decisions about whether to continue saving cash or to reduce debt burdens are probably the most vital policy decisions for the hospital CFO. Unfortunately, my research shows that most administrators are relying on judgment, or best-guess heuristics, to address these policy issues. This article explores one of the most common heuristics in health finance: ratios gauging debt and cash on hand. The subject is explored through the research and analysis of over 40 hospitals in a very competitive marketplace: the boroughs of New York City. Analyses of financial strength, through various statistical models, were conducted to explore the linkages between traditional heuristics and long-term economic results. Data were collected for 30 operational and financial indicators. Findings suggest that organizations require different cash-debt positions based on their overall financial health, and that a one-number heuristic does not fit all. Extremely financially constrained hospitals (those approaching bankruptcy) should be building free cash flow and minimizing debt service, while financially secure hospitals should minimize cash on hand while reducing debt. If all hospitals continue to try to meet an arbitrary days-of-cash heuristic, this simplification could cripple an organization. A much more effective approach requires each organization to model decisions more comprehensively.

  16. Community-aware task allocation for social networked multiagent systems.

    PubMed

    Wang, Wanyuan; Jiang, Yichuan

    2014-09-01

    In this paper, we propose a novel community-aware task allocation model for social networked multiagent systems (SN-MASs), where each agent's cooperation domain is constrained to its community and each agent can negotiate only with its intracommunity member agents. Under such community-aware scenarios, we prove that maximizing the system's overall profit remains NP-hard. To solve this problem effectively, we present a heuristic algorithm composed of three phases: 1) task selection: select the desirable task to be allocated preferentially; 2) allocation to community: allocate the selected task to communities based on a significant-task-first heuristic; and 3) allocation to agent: negotiate resources for the selected task based on a nonoverlap-agent-first, breadth-first resource negotiation mechanism. Theoretical analyses and experiments validate the advantages of the presented heuristic algorithm and community-aware task allocation model. 1) The presented heuristic algorithm performs very closely to the benchmark exponential brute-force optimal algorithm and to the network-flow-based greedy algorithm in terms of system overall profit in small-scale applications. Moreover, in large-scale applications, the presented heuristic algorithm achieves approximately the same overall system profit but significantly reduces the computational load compared with the greedy algorithm. 2) The presented community-aware task allocation model reduces the system communication cost compared with the previous global-aware task allocation model and greatly improves the system's overall profit compared with the previous local neighbor-aware task allocation model.

  17. User Interface Problems of a Nationwide Inpatient Information System: A Heuristic Evaluation.

    PubMed

    Atashi, Alireza; Khajouei, Reza; Azizi, Amirabbas; Dadashi, Ali

    2016-01-01

    While studies have shown that usability evaluation can uncover many design problems of health information systems, the usability of health information systems in developing countries using their native language is poorly studied. The objective of this study was to evaluate the usability of a nationwide inpatient information system used in many academic hospitals in Iran. Three trained usability evaluators independently evaluated the system using Nielsen's 10 usability heuristics. The evaluators combined the identified problems into a single list and independently rated the severity of the problems. We statistically compared the number and severity of problems identified by HIS-experienced and non-experienced evaluators. A total of 158 usability problems were identified. After removing duplicates, 99 unique problems remained. The highest mismatch with usability principles was related to the "Consistency and standards" heuristic (25%) and the lowest to "Flexibility and efficiency of use" (4%). The average severity of problems ranged from 2.4 (major problem) to 3.3 (catastrophe). The HIS-experienced evaluator identified significantly more problems and assigned higher severities to them (p<0.02). Heuristic Evaluation identified a high number of usability problems in an inpatient information system widely used in academic hospitals. These problems, if they remain unsolved, may waste users' and patients' time, increase errors, and ultimately threaten patient safety. Many of them can be fixed with simple redesign solutions such as clear labels and better layouts. This study suggests conducting further studies to confirm the findings concerning the effect of evaluator experience on the results of Heuristic Evaluation.

  18. User Interface Problems of a Nationwide Inpatient Information System: A Heuristic Evaluation

    PubMed Central

    Atashi, Alireza; Azizi, Amirabbas; Dadashi, Ali

    2016-01-01

    Summary Introduction While studies have shown that usability evaluation can uncover many design problems of health information systems, the usability of health information systems in developing countries using their native language is poorly studied. The objective of this study was to evaluate the usability of a nationwide inpatient information system used in many academic hospitals in Iran. Material and Methods Three trained usability evaluators independently evaluated the system using Nielsen’s 10 usability heuristics. The evaluators combined the identified problems into a single list and independently rated the severity of the problems. We statistically compared the number and severity of problems identified by HIS-experienced and non-experienced evaluators. Results A total of 158 usability problems were identified. After removing duplicates, 99 unique problems remained. The highest mismatch with usability principles was related to the “Consistency and standards” heuristic (25%) and the lowest to “Flexibility and efficiency of use” (4%). The average severity of problems ranged from 2.4 (major problem) to 3.3 (catastrophe). The HIS-experienced evaluator identified significantly more problems and assigned higher severities to them (p<0.02). Discussion Heuristic Evaluation identified a high number of usability problems in an inpatient information system widely used in academic hospitals. These problems, if they remain unsolved, may waste users’ and patients’ time, increase errors, and ultimately threaten patient safety. Many of them can be fixed with simple redesign solutions such as clear labels and better layouts. This study suggests conducting further studies to confirm the findings concerning the effect of evaluator experience on the results of Heuristic Evaluation. PMID:27081409

  19. Memory-based decision-making with heuristics: evidence for a controlled activation of memory representations.

    PubMed

    Khader, Patrick H; Pachur, Thorsten; Meier, Stefanie; Bien, Siegfried; Jost, Kerstin; Rösler, Frank

    2011-11-01

    Many of our daily decisions are memory based, that is, the attribute information about the decision alternatives has to be recalled. Behavioral studies suggest that for such decisions we often use simple strategies (heuristics) that rely on controlled and limited information search. It is assumed that these heuristics simplify decision-making by activating long-term memory representations of only those attributes that are necessary for the decision. However, from behavioral studies alone, it is unclear whether using heuristics is indeed associated with limited memory search. The present study tested this assumption by monitoring the activation of specific long-term-memory representations with fMRI while participants made memory-based decisions using the "take-the-best" heuristic. For different decision trials, different numbers and types of information had to be retrieved and processed. The attributes consisted of visual information known to be represented in different parts of the posterior cortex. We found that the amount of information required for a decision was mirrored by a parametric activation of the dorsolateral PFC. Such a parametric pattern was also observed in all posterior areas, suggesting that activation was not limited to those attributes required for a decision. However, the posterior increases were systematically modulated by the relative importance of the information for making a decision. These findings suggest that memory-based decision-making is mediated by the dorsolateral PFC, which selectively controls posterior storage areas. In addition, the systematic modulations of the posterior activations indicate a selective boosting of activation of decision-relevant attributes.
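
    The "take-the-best" heuristic referenced above has a compact algorithmic form: examine cues in order of descending validity and let the first cue that discriminates between the two alternatives decide, ignoring all remaining attributes. A minimal sketch with hypothetical cue profiles and validities:

```python
def take_the_best(cues_a, cues_b, validities):
    """Decide between objects a and b from binary cue profiles.

    Cues are examined in descending order of validity; the first cue that
    discriminates (positive for one object, negative for the other) decides,
    and search stops there. Returns 'a', 'b', or 'guess'.
    """
    order = sorted(range(len(validities)), key=lambda i: -validities[i])
    for i in order:
        if cues_a[i] != cues_b[i]:
            return "a" if cues_a[i] else "b"
    return "guess"          # no cue discriminates

# Hypothetical city-size cues: [has-team, is-capital, has-university]
choice = take_the_best(cues_a=[1, 0, 1], cues_b=[1, 1, 1],
                       validities=[0.9, 0.8, 0.6])
```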

  20. A Heuristic Bioinspired for 8-Piece Puzzle

    NASA Astrophysics Data System (ADS)

    Machado, M. O.; Fabres, P. A.; Melo, J. C. L.

    2017-10-01

    This paper investigates a mathematical model inspired by nature and presents a meta-heuristic that improves the performance of an informed search using the A* strategy with a general search tree as the data structure. The working hypothesis is that the investigated meta-heuristic, being optimal in nature, may be promising for minimizing the computational resources required by a goal-based agent in solving problems of high computational complexity (the n-piece puzzle), as well as for optimizing objective functions for local search agents. The objective of this work is to describe qualitatively the characteristics and properties of the investigated mathematical model, correlating the main concepts of the A* function with the significant variables of the meta-heuristic used. The article shows that the amount of memory required to perform the search is smaller when using the meta-heuristic than when using the A* evaluation function on the nodes of a general search tree for the eight-piece puzzle. It is concluded that the meta-heuristic must be parameterized according to the chosen heuristic and the level of the tree that contains the possible solutions to the chosen problem.
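
    For reference, the baseline being improved upon, A* search on the eight-piece puzzle, can be sketched directly. The Manhattan-distance heuristic used below is a standard textbook choice for this puzzle, not necessarily the evaluation function used in the paper.

```python
import heapq

GOAL = (1, 2, 3, 4, 5, 6, 7, 8, 0)   # 0 is the blank

def manhattan(state):
    """Sum of tile distances from their goal positions (admissible)."""
    return sum(abs(i // 3 - (t - 1) // 3) + abs(i % 3 - (t - 1) % 3)
               for i, t in enumerate(state) if t)

def neighbors(state):
    """Yield states reachable by sliding a tile into the blank."""
    b = state.index(0)
    r, c = divmod(b, 3)
    for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
        nr, nc = r + dr, c + dc
        if 0 <= nr < 3 and 0 <= nc < 3:
            j = nr * 3 + nc
            s = list(state)
            s[b], s[j] = s[j], s[b]
            yield tuple(s)

def astar(start):
    """Return the number of moves in a shortest solution (None if unsolvable)."""
    open_heap = [(manhattan(start), 0, start)]
    g_best = {start: 0}
    while open_heap:
        f, g, s = heapq.heappop(open_heap)
        if s == GOAL:
            return g
        if g > g_best.get(s, float("inf")):
            continue                      # stale heap entry
        for n in neighbors(s):
            if g + 1 < g_best.get(n, float("inf")):
                g_best[n] = g + 1
                heapq.heappush(open_heap, (g + 1 + manhattan(n), g + 1, n))
    return None

moves = astar((1, 2, 3, 4, 5, 6, 0, 7, 8))   # two slides from the goal
```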

  1. The Counter-Intuitive Non-Informative Prior for the Bernoulli Family

    ERIC Educational Resources Information Center

    Zhu, Mu; Lu, Arthur Y.

    2004-01-01

    In Bayesian statistics, the choice of the prior distribution is often controversial. Different rules for selecting priors have been suggested in the literature, which, sometimes, produce priors that are difficult for the students to understand intuitively. In this article, we use a simple heuristic to illustrate to the students the rather…

  2. Toward Intelligent Systems for Testing. Technical Report LSP-1.

    ERIC Educational Resources Information Center

    Lesgold, Alan; And Others

    This report illustrates one way in which the technologies of testing might combine with cognitive science techniques to help steer instruction. Steering testing is brief diagnostic testing that steers, or individualizes, the course of instruction. Steering testing uses simple heuristics for reasoning about the level of a student's competence in a…

  3. The Light Side of Dark Matter

    NASA Astrophysics Data System (ADS)

    Cisneros, Sophia

    2013-04-01

    We present a new, heuristic, two-parameter model for predicting the rotation curves of disc galaxies. The model is tested on 22 randomly chosen galaxies, represented in 35 data sets. This Lorentz Convolution [LC] model is derived from a non-linear, relativistic solution of a Kerr-type wave equation, where small changes in the photons' frequencies, resulting from the curved spacetime, are convolved into a sequence of Lorentz transformations. The LC model is parametrized with only the diffuse, luminous stellar and gaseous masses reported with each data set of observations used. The LC model predicts observed rotation curves across a wide range of disk galaxies. The LC model was constructed to occupy the same place in the explanation of rotation curves that Dark Matter does, so that a simple investigation of the relation between luminous and dark matter might be made via a parameter (a). We find the parameter (a) to demonstrate interesting structure. We compare the new model's predictions to both NFW model and MOND fits when available.

  4. Decision support for hospital bed management using adaptable individual length of stay estimations and shared resources.

    PubMed

    Schmidt, Robert; Geisler, Sandra; Spreckelsen, Cord

    2013-01-07

    Elective patient admission and assignment planning is an important task of the strategic and operational management of a hospital and early on became a central topic of clinical operations research. The management of hospital beds is an important subtask. Various approaches have been proposed, involving the computation of efficient assignments with regard to the patients' condition, the necessity of the treatment, and the patients' preferences. However, these approaches are mostly based on static, unadaptable estimates of the length of stay and, thus, do not take into account the uncertainty of the patient's recovery. Furthermore, the effect of aggregated bed capacities have not been investigated in this context. Computer supported bed management, combining an adaptable length of stay estimation with the treatment of shared resources (aggregated bed capacities) has not yet been sufficiently investigated. The aim of our work is: 1) to define a cost function for patient admission taking into account adaptable length of stay estimations and aggregated resources, 2) to define a mathematical program formally modeling the assignment problem and an architecture for decision support, 3) to investigate four algorithmic methodologies addressing the assignment problem and one base-line approach, and 4) to evaluate these methodologies w.r.t. cost outcome, performance, and dismissal ratio. The expected free ward capacity is calculated based on individual length of stay estimates, introducing Bernoulli distributed random variables for the ward occupation states and approximating the probability densities. The assignment problem is represented as a binary integer program. Four strategies for solving the problem are applied and compared: an exact approach, using the mixed integer programming solver SCIP; and three heuristic strategies, namely the longest expected processing time, the shortest expected processing time, and random choice. 
A baseline approach serves to compare these optimization strategies with a simple model of the status quo. All the approaches are evaluated by a realistic discrete event simulation: the outcomes are the ratio of successful assignments and dismissals, the computation time, and the model's cost factors. A discrete event simulation of 226,000 cases shows a reduction of the dismissal rate compared to the baseline by more than 30 percentage points (from a mean dismissal ratio of 74.7% to 40.06% when comparing the status quo with the optimization strategies). Each of the optimization strategies leads to an improved assignment. The exact approach has only a marginal advantage over the heuristic strategies in the model's cost factors (≤3%). Moreover, this marginal advantage was only achieved at the price of a computation time fifty times that of the heuristic models (an average computing time of 141 s using the exact method vs. 2.6 s for the heuristic strategy). In terms of its performance and the quality of its solution, the heuristic strategy RAND is the preferred method for bed assignment in the case of shared resources. Future research is needed to investigate whether an equally marked improvement can be achieved in a large-scale clinical application study, ideally one comprising all the departments involved in admission and assignment planning.
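
    The expected free ward capacity described above follows from modeling each occupied bed as a Bernoulli random variable: the expected occupancy is then simply the sum of the individual stay probabilities. A minimal sketch with hypothetical numbers, not the paper's full probability-density approximation:

```python
def expected_free_beds(capacity, stay_probs):
    """Expected number of free beds on a ward at a future date.

    stay_probs[i] is the probability that current patient i still occupies
    a bed on that date (occupation as a Bernoulli variable), so by linearity
    of expectation the expected occupancy is the sum of the probabilities.
    """
    return capacity - sum(stay_probs)

# Hypothetical ward of 10 beds with 6 current patients:
free = expected_free_beds(10, [0.9, 0.7, 0.4, 0.2, 0.1, 0.05])
```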

  5. Statistical methods for astronomical data with upper limits. I - Univariate distributions

    NASA Technical Reports Server (NTRS)

    Feigelson, E. D.; Nelson, P. I.

    1985-01-01

    The statistical treatment of univariate censored data is discussed. A heuristic derivation of the Kaplan-Meier maximum-likelihood estimator from first principles is presented which results in an expression amenable to analytic error analysis. Methods for comparing two or more censored samples are given along with simple computational examples, stressing the fact that most astronomical problems involve upper limits while the standard mathematical methods require lower limits. The application of univariate survival analysis to six data sets in the recent astrophysical literature is described, and various aspects of the use of survival analysis in astronomy, such as the limitations of various two-sample tests and the role of parametric modelling, are discussed.
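
    The Kaplan-Meier product-limit estimator mentioned above can be sketched in a few lines for right-censored data; astronomical upper limits are left-censored and, as the abstract notes, are typically mapped to the right-censored form the standard methods require (e.g., by reversing the measurement axis). The sample values below are hypothetical.

```python
def kaplan_meier(times, observed):
    """Kaplan-Meier estimate of the survival function S(t).

    `times` are event/censoring times; `observed[i]` is True for a real
    detection and False for a censored point (a limit). Returns the list of
    (t, S(t)) steps at the distinct event times, using the product-limit
    rule S(t) = prod over event times t_j <= t of (1 - d_j / n_j).
    """
    data = sorted(zip(times, observed))
    n = len(data)
    at_risk, s, steps = n, 1.0, []
    i = 0
    while i < n:
        t = data[i][0]
        deaths = sum(1 for tt, obs in data[i:] if tt == t and obs)
        ties = sum(1 for tt, _ in data[i:] if tt == t)
        if deaths:
            s *= 1.0 - deaths / at_risk
            steps.append((t, s))
        at_risk -= ties
        i += ties
    return steps

# Toy sample with two censored points (False marks a limit):
steps = kaplan_meier([1, 2, 2, 3, 4, 5],
                     [True, True, False, True, False, True])
```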

  6. Psychological Plausibility of the Theory of Probabilistic Mental Models and the Fast and Frugal Heuristics

    ERIC Educational Resources Information Center

    Dougherty, Michael R.; Franco-Watkins, Ana M.; Thomas, Rick

    2008-01-01

    The theory of probabilistic mental models (PMM; G. Gigerenzer, U. Hoffrage, & H. Kleinbolting, 1991) has had a major influence on the field of judgment and decision making, with the most recent important modifications to PMM theory being the identification of several fast and frugal heuristics (G. Gigerenzer & D. G. Goldstein, 1996). These…

  7. The Art of Snaring Dragons. Artificial Intelligence Memo Number 338. Revised.

    ERIC Educational Resources Information Center

    Cohen, Harvey A.

    Several models for problem solving are discussed, and the idea of a heuristic frame is developed. This concept provides a description of the evolution of problem-solving skills in terms of the growth of the number of algorithms available and increased sophistication in their use. The heuristic frame model is applied to two sets of physical…

  8. Heuristics for Planning University Study at a Distance.

    ERIC Educational Resources Information Center

    Dodds, Agnes E.; Lawrence, Jeanette A.

    A model to describe how adults work on university courses at a distance from campus was developed at an Australian university. The model was designed to describe how students define the task/goal and plan their study, based on G. Polya's (1957) heuristic and A. Newell and H. A. Simon's (1972) General Problem Solver. Verbal reports were obtained…

  9. Improving Critical Thinking Skills Using Learning Model Logan Avenue Problem Solving (LAPS)-Heuristic

    ERIC Educational Resources Information Center

    Anggrianto, Desi; Churiyah, Madziatul; Arief, Mohammad

    2016-01-01

    This research was conducted in order to know the effect of Logan Avenue Problem Solving (LAPS)-Heuristic learning model towards critical thinking skills of students of class X Office Administration (APK) in SMK Negeri 1 Ngawi, East Java, Indonesia on material curve and equilibrium of demand and supply, subject Introduction to Economics and…

  10. Displacements Of Brownian Particles In Terms Of Marian Von Smoluchowski's Heuristic Model

    ERIC Educational Resources Information Center

    Klein, Hermann; Woermann, Dietrich

    2005-01-01

    Albert Einstein's theory of Brownian motion, Marian von Smoluchowski's heuristic model, and Perrin's experimental results helped to elevate molecules from a useful hypothesis in chemistry to objects existing in reality. Central to the theory of Brownian motion is the relation between mean particle displacement and…

  11. Stick or Switch: A Selection Heuristic Predicts when People Take the Perspective of Others or Communicate Egocentrically.

    PubMed

    Rogers, Shane L; Fay, Nicolas

    2016-01-01

    This paper examines a cognitive mechanism that drives perspective-taking and egocentrism in interpersonal communication. Using a conceptual referential communication task, in which participants describe a range of abstract geometric shapes, Experiment 1 shows that perspective-taking and egocentric communication are frequent communication strategies. Experiment 2 tests a selection heuristic account of perspective-taking and egocentric communication. It uses participants' shape description ratings to predict their communication strategy. Participants' communication strategy was predicted by how informative they perceived the different shape descriptions to be. When participants' personal shape description was perceived to be more informative than their addressee's shape description, there was a strong bias to communicate egocentrically. By contrast, when their addressee's shape description was perceived to be more informative, there was a strong bias to take their addressee's perspective. When the shape descriptions were perceived to be equally informative, there was a moderate bias to communicate egocentrically. This simple, but powerful, selection heuristic may be critical to the cumulative cultural evolution of human communication systems, and cumulative cultural evolution more generally.

  12. 1-D DC Resistivity Modeling and Interpretation in Anisotropic Media Using Particle Swarm Optimization

    NASA Astrophysics Data System (ADS)

    Pekşen, Ertan; Yas, Türker; Kıyak, Alper

    2014-09-01

    We examine the one-dimensional direct current method in anisotropic earth formations. We derive an analytic expression for a simple, two-layered anisotropic earth model. We also compute the response of a horizontally layered anisotropic earth with the digital filter method, which yields a quasi-analytic solution over anisotropic media. These analytic and quasi-analytic solutions are useful tests for numerical codes. A two-dimensional finite difference earth model in anisotropic media is presented in order to generate a synthetic data set for a simple one-dimensional earth. Further, we propose a particle swarm optimization method for estimating the parameters of a layered anisotropic earth model, such as horizontal and vertical resistivities and thickness. Particle swarm optimization is a nature-inspired metaheuristic algorithm. The proposed method recovers model parameters quite successfully from synthetic and field data. However, adding 5 % Gaussian noise to the synthetic data increases the ambiguity in the model parameter values. For this reason, the results should be checked with a number of statistical tests. In this study, we use the probability density function within a 95 % confidence interval, the parameter variation across iterations, and the frequency distribution of the model parameters to reduce the ambiguity. The results are promising, and the proposed method can be used for evaluating one-dimensional direct current data in anisotropic media.
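The particle swarm step used in this record can be sketched generically. The misfit function, bounds, and parameter values below are toy stand-ins, not the paper's resistivity forward model:

```python
import random

random.seed(0)  # reproducible toy run

# Minimal global-best PSO with inertia weight w and accelerations c1, c2.
def pso(objective, bounds, n_particles=30, n_iter=200, w=0.7, c1=1.5, c2=1.5):
    dim = len(bounds)
    pos = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(n_iter):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # clamp to the search box
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]), bounds[d][1])
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Toy inversion: recover (rho_h, rho_v, h) minimizing a squared misfit
# to a hypothetical target model (no real forward modelling involved).
target = [50.0, 120.0, 10.0]
best, misfit = pso(lambda m: sum((a - b) ** 2 for a, b in zip(m, target)),
                   bounds=[(1, 500), (1, 500), (1, 50)])
```

In a real application the objective would be the misfit between observed apparent resistivities and the forward response of the layered anisotropic model, and the ambiguity checks described in the abstract would be run on repeated PSO trials.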

  13. Models of SOL transport and their relation to scaling of the divertor heat flux width in DIII-D

    DOE PAGES

    Makowski, M. A.; Lasnier, C. J.; Leonard, A. W.; ...

    2014-10-06

    Strong support for the critical pressure gradient model for the heat flux width has been obtained, in that the measured separatrix pressure gradient lies below and scales similarly to the pressure gradient limit obtained from the ideal, infinite-n stability codes, BALOO and 2DX, in all cases that have been examined. Predictions of a heuristic drift model for the heat flux width are also in qualitative agreement with the measurements. We obtained these results by using an improved high rep-rate and higher edge spatial resolution Thomson scattering system on DIII-D to measure the upstream electron temperature and density profiles. In order to compare theory and experiment, profiles of density, temperature, and pressure for both electrons and ions are needed, as well as the values of these quantities at the separatrix. We also developed a simple method to identify a proxy for the separatrix.

  14. Outbreak Column 16: Cognitive errors in outbreak decision making.

    PubMed

    Curran, Evonne T

    2015-01-01

    During outbreaks, decisions must be made without all the required information. People who have to make decisions under uncertainty, including infection prevention and control teams (IPCTs), use heuristics to fill the missing data gaps. Heuristics are mental-model shortcuts that by and large enable us to make good decisions quickly. However, these heuristics contain biases and effects that at times lead to cognitive (thinking) errors. These cognitive errors are not made to deliberately misrepresent any given situation; we are subject to heuristic biases even when we are trying to perform optimally. The science of decision making is large; there are over 100 different biases recognised and described. Outbreak Column 16 discusses and relates these heuristics and biases to decision making during outbreak prevention, preparedness and management. Insights as to how we might recognise and avoid them are offered.

  15. K-Partite RNA Secondary Structures

    NASA Astrophysics Data System (ADS)

    Jiang, Minghui; Tejada, Pedro J.; Lasisi, Ramoni O.; Cheng, Shanhong; Fechser, D. Scott

    RNA secondary structure prediction is a fundamental problem in structural bioinformatics. The prediction problem is difficult because RNA secondary structures may contain pseudoknots formed by crossing base pairs. We introduce k-partite secondary structures as a simple classification of RNA secondary structures with pseudoknots. An RNA secondary structure is k-partite if it is the union of k pseudoknot-free sub-structures. Most known RNA secondary structures are either bipartite or tripartite. We show that there exists a constant number k such that any secondary structure can be modified into a k-partite secondary structure with approximately the same free energy. This offers a partial explanation of the prevalence of k-partite secondary structures with small k. We give a complete characterization of the computational complexities of recognizing k-partite secondary structures for all k ≥ 2, and show that this recognition problem is essentially the same as the k-colorability problem on circle graphs. We present two simple heuristics, iterated peeling and first-fit packing, for finding k-partite RNA secondary structures. For maximizing the number of base pair stackings, our iterated peeling heuristic achieves a constant approximation ratio of at most k for 2 ≤ k ≤ 5, and at most 6/(1-(1-6/k)^k) ≤ 6/(1-e^{-6}) < 6.01491 for k ≥ 6. Experiments on sequences from PseudoBase show that our first-fit packing heuristic outperforms the leading method HotKnots in predicting RNA secondary structures with pseudoknots. Source code, data set, and experimental results are available at http://www.cs.usu.edu/~mjiang/rna/kpartite/.
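The first-fit packing idea can be sketched as greedy arc layering: treat each base pair as an arc and assign it to the first pseudoknot-free layer in which it crosses no previously placed arc. The pair list below is illustrative, and the paper's energy model and stacking objective are omitted:

```python
# Two arcs (i, j) and (k, l) cross iff one endpoint of the second arc
# lies strictly inside the first and the other strictly outside.
def crosses(a, b):
    (i, j), (k, l) = sorted([a, b])
    return i < k < j < l

# First-fit packing: greedily place each arc in the first layer where it
# fits; each layer is then a pseudoknot-free (non-crossing) sub-structure.
def first_fit(pairs):
    layers = []
    for arc in sorted(pairs):
        for layer in layers:
            if not any(crosses(arc, other) for other in layer):
                layer.append(arc)
                break
        else:
            layers.append([arc])
    return layers

# (0,5) and (3,8) cross, forming a pseudoknot, so two layers are needed;
# (1,4) nests inside (0,5) and shares its layer.
layers = first_fit([(0, 5), (3, 8), (1, 4)])
```

The number of layers produced is an upper bound on the k for which the input is k-partite, matching the circle-graph coloring view described in the abstract.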

  16. Plying Your Craft: Instructional Development and the Use of Heuristics.

    ERIC Educational Resources Information Center

    Noel, Kent L.; Hewlett, Brent

    1981-01-01

    Examines an instructional systems design (ISD) model used by Bell Laboratories as an illustration of how heuristics can be brought to bear upon the design and development of instructional materials. Ten references are listed. (Author/MER)

  17. NEST: a comprehensive model for scintillation yield in liquid xenon

    DOE PAGES

    Szydagis, M.; Barry, N.; Kazkaz, K.; ...

    2011-10-03

    Here, a comprehensive model for explaining scintillation yield in liquid xenon is introduced. We unify the various definitions of work function which abound in the literature and incorporate all available data on electron recoil scintillation yield. This results in a better understanding of electron recoil, and facilitates an improved description of nuclear recoil. An incident gamma energy range of O(1 keV) to O(1 MeV) and electric fields between 0 and O(10 kV/cm) are incorporated into this heuristic model. We show results from a Geant4 implementation, but because the model has a few free parameters, implementation in any simulation package should be simple. We use a quasi-empirical approach, with an objective of improving detector calibrations and performance verification. The model will aid in the design and optimization of future detectors. This model is also easy to extend to other noble elements. In this paper we lay the foundation for an exhaustive simulation code which we call NEST (Noble Element Simulation Technique).

  18. Of mental models, assumptions and heuristics: The case of acids and acid strength

    NASA Astrophysics Data System (ADS)

    McClary, Lakeisha Michelle

    This study explored what cognitive resources (i.e., units of knowledge necessary to learn) first-semester organic chemistry students used to make decisions about acid strength, and how those resources guided the prediction, explanation, and justification of trends in acid strength. We were specifically interested in identifying and characterizing the mental models, assumptions, and heuristics that students relied upon to make their decisions, in most cases under time constraints. The views about acids and acid strength were investigated for twenty undergraduate students. Data sources for this study included written responses and individual interviews. The data were analyzed using a qualitative methodology to answer five research questions. Data analysis was based on existing theoretical frameworks: problem representation (Chi, Feltovich & Glaser, 1981), mental models (Johnson-Laird, 1983), intuitive assumptions (Talanquer, 2006), and heuristics (Evans, 2008). These frameworks were combined to develop the framework from which our data were analyzed. Results indicated that first-semester organic chemistry students' use of cognitive resources was complex and dependent on their understanding of the behavior of acids. Expressed mental models were generated using prior knowledge and assumptions about acids and acid strength; these models were then employed to make decisions. Explicit and implicit features of the compounds in each task mediated participants' attention, which triggered the use of a very limited number of heuristics, or shortcut reasoning strategies. Many students, however, were able to apply more effortful analytic reasoning, though correct trends were predicted infrequently. Most students continued to use their mental models, assumptions, and heuristics to explain a given trend in acid strength and to justify their predicted trends, but the tasks influenced a few students to shift from one model to another. An emergent finding from this project was that the problem representation greatly influenced students' ability to make correct predictions of acid strength.

  19. Analytic and heuristic processes in the detection and resolution of conflict.

    PubMed

    Ferreira, Mário B; Mata, André; Donkin, Christopher; Sherman, Steven J; Ihmels, Max

    2016-10-01

    Previous research with the ratio-bias task found larger response latencies for conflict trials, where the heuristic- and analytic-based responses are assumed to be in opposition (e.g., choosing between 1/10 and 9/100 ratios of success), than for no-conflict trials, where both processes converge on the same response (e.g., choosing between 1/10 and 11/100). This pattern is consistent with parallel dual-process models, which assume that there is effective, rather than lax, monitoring of the output of heuristic processing. It is, however, unclear why conflict resolution sometimes fails. Ratio-biased choices may increase because of a decline in analytical reasoning (leaving heuristic-based responses unopposed) or a rise in heuristic processing (making it more difficult for analytic processes to override the heuristic preferences). Using the process-dissociation procedure, we found that instructions to respond logically and response speed affected analytic (controlled) processing (C), leaving heuristic processing (H) unchanged, whereas the intuitive preference for large numerators (as assessed by responses to equal-ratio trials) affected H but not C. These findings create new challenges for the debate between dual-process and single-process accounts, which are discussed.

  20. What Is behind the Priority Heuristic? A Mathematical Analysis and Comment on Brandstatter, Gigerenzer, and Hertwig (2006)

    ERIC Educational Resources Information Center

    Rieger, Marc Oliver; Wang, Mei

    2008-01-01

    Comments on the article by E. Brandstatter, G. Gigerenzer, and R. Hertwig (2006). The authors discuss the priority heuristic, a recent model for decisions under risk. They reanalyze the experimental validity of this approach and discuss how these results compare with cumulative prospect theory, the currently most established model in behavioral…

  1. Heuristic Model Of The Composite Quality Index Of Environmental Assessment

    NASA Astrophysics Data System (ADS)

    Khabarov, A. N.; Knyaginin, A. A.; Bondarenko, D. V.; Shepet, I. P.; Korolkova, L. N.

    2017-01-01

    The goal of the paper is to present a heuristic model of the composite environmental quality index based on the integrated application of elements of utility theory, multidimensional scaling, expert evaluation, and decision-making. The composite index is synthesized in linear-quadratic form, which makes the assessment results more adequate to the preferences of experts and decision-makers.

  2. Fluent, Fast, and Frugal? A Formal Model Evaluation of the Interplay between Memory, Fluency, and Comparative Judgments

    ERIC Educational Resources Information Center

    Hilbig, Benjamin E.; Erdfelder, Edgar; Pohl, Rudiger F.

    2011-01-01

    A new process model of the interplay between memory and judgment processes was recently suggested, assuming that retrieval fluency--that is, the speed with which objects are recognized--will determine inferences concerning such objects in a single-cue fashion. This aspect of the fluency heuristic, an extension of the recognition heuristic, has…

  3. The Role of Source Confusion in Cultivation Effects May Depend on Processing Strategy: A Comment on Mares (1996).

    ERIC Educational Resources Information Center

    Shrum, L. J.

    1997-01-01

    States M.L. Mares presents evidence that source confusions play a role in the cultivation effect. Clarifies some of Mares' findings that have implications for the heuristic model of cultivation effects and shows that Mares' findings are compatible with and can be integrated into the heuristic processing model. Discusses implications of Mares'…

  4. Negations in syllogistic reasoning: evidence for a heuristic-analytic conflict.

    PubMed

    Stupple, Edward J N; Waterhouse, Eleanor F

    2009-08-01

    An experiment utilizing response time measures was conducted to test dominant processing strategies in syllogistic reasoning with the expanded quantifier set proposed by Roberts (2005). Through adding negations to existing quantifiers it is possible to change problem surface features without altering logical validity. Biases based on surface features such as atmosphere, matching, and the probability heuristics model (PHM; Chater & Oaksford, 1999; Wetherick & Gilhooly, 1995) would not be expected to show variance in response latencies, but participant responses should be highly sensitive to changes in the surface features of the quantifiers. In contrast, according to analytic accounts such as mental models theory and mental logic (e.g., Johnson-Laird & Byrne, 1991; Rips, 1994) participants should exhibit increased response times for negated premises, but not be overly impacted upon by the surface features of the conclusion. Data indicated that the dominant response strategy was based on a matching heuristic, but also provided evidence of a resource-demanding analytic procedure for dealing with double negatives. The authors propose that dual-process theories offer a stronger account of these data whereby participants employ competing heuristic and analytic strategies and fall back on a heuristic response when analytic processing fails.

  5. Sensor fusion V; Proceedings of the Meeting, Boston, MA, Nov. 15-17, 1992

    NASA Technical Reports Server (NTRS)

    Schenker, Paul S. (Editor)

    1992-01-01

    Topics addressed include 3D object perception, human-machine interface in multisensor systems, sensor fusion architecture, fusion of multiple and distributed sensors, interface and decision models for sensor fusion, computational networks, simple sensing for complex action, multisensor-based control, and metrology and calibration of multisensor systems. Particular attention is given to controlling 3D objects by sketching 2D views, the graphical simulation and animation environment for flexible structure robots, designing robotic systems from sensorimotor modules, cylindrical object reconstruction from a sequence of images, an accurate estimation of surface properties by integrating information using Bayesian networks, an adaptive fusion model for a distributed detection system, multiple concurrent object descriptions in support of autonomous navigation, robot control with multiple sensors and heuristic knowledge, and optical array detectors for image sensors calibration. (No individual items are abstracted in this volume)

  6. Interactions of timing and prediction error learning.

    PubMed

    Kirkpatrick, Kimberly

    2014-01-01

    Timing and prediction error learning have historically been treated as independent processes, but growing evidence has indicated that they are not orthogonal. Timing emerges at the earliest time point when conditioned responses are observed, and temporal variables modulate prediction error learning in both simple conditioning and cue competition paradigms. In addition, prediction errors, through changes in reward magnitude or value, alter the timing of behavior. Thus, there appears to be a bi-directional interaction between timing and prediction error learning. Modern theories have attempted to integrate the two processes with mixed success. A neurocomputational approach to theory development is espoused, which draws on neurobiological evidence to guide and constrain computational model development. Heuristics for future model development are presented with the goal of sparking new approaches to theory development in the timing and prediction error fields. Copyright © 2013 Elsevier B.V. All rights reserved.

  7. Utility-free heuristic models of two-option choice can mimic predictions of utility-stage models under many conditions

    PubMed Central

    Piantadosi, Steven T.; Hayden, Benjamin Y.

    2015-01-01

    Economists often model choices as if decision-makers assign each option a scalar value variable, known as utility, and then select the option with the highest utility. It remains unclear whether as-if utility models describe real mental and neural steps in choice. Although choices alone cannot prove the existence of a utility stage, utility transformations are often taken to provide the most parsimonious or psychologically plausible explanation for choice data. Here, we show that it is possible to mathematically transform a large set of common utility-stage two-option choice models (specifically, ones in which dimensions can be decomposed into additive functions) into a heuristic model (specifically, a dimensional prioritization heuristic) that has no utility computation stage. We then show that under a range of plausible assumptions, both classes of model predict similar neural responses. These results highlight the difficulties in using neuroeconomic data to infer the existence of a value stage in choice. PMID:25914613
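The mimicry claim can be illustrated in a toy environment (this is not the paper's general transformation): when one attribute's weighted utility range dominates the other's, an additive utility model and a dimensional-prioritization heuristic agree on every pairwise choice, so choice data alone cannot distinguish them. All weights and option values below are invented:

```python
import itertools
import random

# Additive utility model: weighted sum over two attributes, pick the max.
def utility_choice(a, b, w=(10.0, 1.0)):
    u = lambda x: w[0] * x[0] + w[1] * x[1]
    return a if u(a) > u(b) else b

# Dimensional-prioritization heuristic: decide on the dominant attribute
# first; fall through to the second attribute only on a tie.
def priority_heuristic(a, b):
    if a[0] != b[0]:
        return a if a[0] > b[0] else b
    return a if a[1] > b[1] else b

random.seed(1)
# Attribute 0 is an integer in 0..9 (weighted difference >= 10 when unequal);
# attribute 1 is a float in [0, 1) (weighted difference < 1), so attribute 0
# always dominates the additive utility whenever it differs.
options = [(random.randint(0, 9), random.random()) for _ in range(20)]
agree = all(utility_choice(a, b) == priority_heuristic(a, b)
            for a, b in itertools.combinations(options, 2))
```

In this constructed environment the two models are behaviorally identical, which is the kind of indistinguishability the abstract argues extends, under plausible assumptions, to neural predictions as well.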

  8. Health on impulse: when low self-control promotes healthy food choices.

    PubMed

    Salmon, Stefanie J; Fennis, Bob M; de Ridder, Denise T D; Adriaanse, Marieke A; de Vet, Emely

    2014-02-01

    Food choices are often made mindlessly, when individuals are not able or willing to exert self-control. Under low self-control, individuals have difficulties to resist palatable but unhealthy food products. In contrast to previous research aiming to foster healthy choices by promoting high self-control, this study exploits situations of low self-control, by strategically using the tendency under these conditions to rely on heuristics (simple decision rules) as quick guides to action. More specifically, the authors associated healthy food products with the social proof heuristic (i.e., normative cues that convey majority endorsement for those products). One hundred seventy-seven students (119 men), with an average age of 20.47 years (SD = 2.25) participated in the experiment. This study used a 2 (low vs. high self-control) × 2 (social proof vs. no heuristic) × 2 (trade-off vs. control choice) design, with the latter as within-subjects factor. The dependent variable was the number of healthy food choices in a food-choice task. In line with previous studies, people made fewer healthy food choices under low self-control. However, this negative effect of low self-control on food choice was reversed when the healthy option was associated with the social proof heuristic. In that case, people made more healthy choices under conditions of low self-control. Low self-control may be even more beneficial for healthy food choices than high self-control in the presence of a heuristic. Exploiting situations of low self-control is a new and promising method to promote health on impulse. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  9. Diving into traversable wormholes

    NASA Astrophysics Data System (ADS)

    Maldacena, Juan; Stanford, Douglas; Yang, Zhenbin

    2017-05-01

    We study various aspects of wormholes that are made traversable by an interaction between the two asymptotic boundaries. We concentrate on the case of nearly-$AdS_2$ gravity and discuss a very simple mechanical picture for the gravitational dynamics. We derive a formula for the two-sided correlators that includes the effect of gravitational backreaction, which limits the amount of information we can send through the wormhole. We emphasize that the process can be viewed as a teleportation protocol where the teleportee feels nothing special as he/she goes through the wormhole. We discuss some applications to the cloning paradox for old black holes. We point out that the same formula we derived for $AdS_2$ gravity is also valid for the simple SYK quantum mechanical theory, around the thermofield double state. We present a heuristic picture for this phenomenon in terms of an operator growth model. Finally, we show that a similar effect is present in a completely classical chaotic system with a large number of degrees of freedom.

  10. Risk-Based Sampling: I Don't Want to Weight in Vain.

    PubMed

    Powell, Mark R

    2015-12-01

    Recently, there has been considerable interest in developing risk-based sampling for food safety and animal and plant health for efficient allocation of inspection and surveillance resources. The problem of risk-based sampling allocation presents a challenge similar to financial portfolio analysis. Markowitz (1952) laid the foundation for modern portfolio theory based on mean-variance optimization. However, a persistent challenge in implementing portfolio optimization is the problem of estimation error, leading to false "optimal" portfolios and unstable asset weights. In some cases, portfolio diversification based on simple heuristics (e.g., equal allocation) has better out-of-sample performance than complex portfolio optimization methods due to estimation uncertainty. Even for portfolios with a modest number of assets, the estimation window required for true optimization may imply an implausibly long stationary period. The implications for risk-based sampling are illustrated by a simple simulation model of lot inspection for a small, heterogeneous group of producers. © 2015 Society for Risk Analysis.
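The out-of-sample advantage of equal allocation under estimation error can be illustrated with a toy simulation (statistically identical assets and invented parameters, not the article's lot-inspection model): when assets are i.i.d., a plug-in "optimizer" that concentrates on the best-looking sample asset matches 1/N in expected return but carries much more out-of-sample variance:

```python
import random

random.seed(42)
N_ASSETS, WINDOW, TRIALS = 5, 12, 4000

# i.i.d. returns, identical across assets: there is nothing to optimize,
# but a plug-in estimator does not know that.
def draw():
    return [random.gauss(0.05, 0.2) for _ in range(N_ASSETS)]

conc, equal = [], []
for _ in range(TRIALS):
    history = [draw() for _ in range(WINDOW)]          # estimation window
    est_means = [sum(col) / WINDOW for col in zip(*history)]
    best = est_means.index(max(est_means))             # plug-in "optimal" pick
    future = draw()                                    # out-of-sample period
    conc.append(future[best])                          # concentrated portfolio
    equal.append(sum(future) / N_ASSETS)               # 1/N heuristic

def var(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)
```

Because the estimation window is short relative to the noise, the sample-best asset is effectively chosen at random, so concentration buys no return while forfeiting the diversification that the simple 1/N rule gets for free.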

  11. Selection of actuator locations for static shape control of large space structures by heuristic integer programing

    NASA Technical Reports Server (NTRS)

    Haftka, R. T.; Adelman, H. M.

    1984-01-01

    Orbiting spacecraft such as large space antennas have to maintain a highly accurate shape to operate satisfactorily. Such structures require active and passive controls to maintain an accurate shape under a variety of disturbances. Methods for the optimum placement of control actuators for correcting static deformations are described. In particular, attention is focused on the case where control locations have to be selected from a large set of available sites, so that integer programming methods are called for. The effectiveness of three heuristic techniques for obtaining a near-optimal site selection is compared. In addition, efficient reanalysis techniques for the rapid assessment of control effectiveness are presented. Two examples are used to demonstrate the methods: a simple beam structure and a 55-m space-truss parabolic antenna.

  12. Solving large-scale fixed cost integer linear programming models for grid-based location problems with heuristic techniques

    NASA Astrophysics Data System (ADS)

    Noor-E-Alam, Md.; Doucette, John

    2015-08-01

    Grid-based location problems (GBLPs) can be used to solve location problems in business, engineering, resource exploitation, and even in the field of medical sciences. To solve these decision problems, an integer linear programming (ILP) model is designed and developed to provide the optimal solution for GBLPs considering fixed cost criteria. Preliminary results show that the ILP model is efficient in solving small to moderate-sized problems. However, this ILP model becomes intractable in solving large-scale instances. Therefore, a decomposition heuristic is proposed to solve these large-scale GBLPs, which demonstrates significant reduction of solution runtimes. To benchmark the proposed heuristic, results are compared with the exact solution via ILP. The experimental results show that the proposed method significantly outperforms the exact method in runtime with minimal (and in most cases, no) loss of optimality.
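The decomposition idea can be sketched schematically: partition the grid into independent blocks, solve each block's subproblem, and merge the block solutions. The greedy covering subsolver and the two-block example below are illustrative stand-ins, not the authors' ILP formulation:

```python
# Greedy covering subsolver for one block: pick candidate sites until all
# demand cells in the block are covered (a stand-in for solving the block's
# smaller ILP exactly).
def greedy_cover(demands, candidates, covers):
    uncovered, chosen = set(demands), []
    while uncovered:
        best = max(candidates, key=lambda s: len(covers[s] & uncovered))
        if not covers[best] & uncovered:
            break  # remaining demand cannot be covered by any candidate
        chosen.append(best)
        uncovered -= covers[best]
    return chosen

# Decomposition driver: blocks are (demands, candidates, covers) triples
# assumed independent, so their solutions can simply be concatenated.
def solve_by_blocks(blocks):
    solution = []
    for demands, candidates, covers in blocks:
        solution.extend(greedy_cover(demands, candidates, covers))
    return solution

# Tiny 2-block example: each site covers a set of demand cells.
block1 = ({1, 2, 3}, ["a", "b"], {"a": {1, 2}, "b": {3}})
block2 = ({4, 5}, ["c", "d"], {"c": {4, 5}, "d": {4}})
sites = solve_by_blocks([block1, block2])
```

The runtime benefit reported in the abstract comes from exactly this structure: many small subproblems are far cheaper than one intractable large one, at the cost of ignoring interactions across block boundaries.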

  13. Characterising bias in regulatory risk and decision analysis: An analysis of heuristics applied in health technology appraisal, chemicals regulation, and climate change governance.

    PubMed

    MacGillivray, Brian H

    2017-08-01

    In many environmental and public health domains, heuristic methods of risk and decision analysis must be relied upon, either because problem structures are ambiguous, reliable data are lacking, or decisions are urgent. This introduces an additional source of uncertainty beyond model and measurement error: uncertainty stemming from relying on inexact inference rules. Here we identify and analyse heuristics used to prioritise risk objects, to discriminate between signal and noise, to weight evidence, to construct models, to extrapolate beyond datasets, and to make policy. Some of these heuristics are based on causal generalisations, yet can misfire when these relationships are presumed rather than tested (e.g. surrogates in clinical trials). Others are conventions designed to confer stability to decision analysis, yet may introduce serious error when applied ritualistically (e.g. significance testing). Some heuristics can be traced back to formal justifications, but only subject to strong assumptions that are often violated in practical applications. Heuristic decision rules (e.g. feasibility rules) in principle act as surrogates for utility maximisation or distributional concerns, yet in practice may neglect costs and benefits, be based on arbitrary thresholds, and be prone to gaming. We highlight the problem of rule-entrenchment, where analytical choices that are in principle contestable are arbitrarily fixed in practice, masking uncertainty and potentially introducing bias. Strategies for making risk and decision analysis more rigorous include: formalising the assumptions and scope conditions under which heuristics should be applied; testing rather than presuming their underlying empirical or theoretical justifications; using sensitivity analysis, simulations, multiple bias analysis, and deductive systems of inference (e.g. directed acyclic graphs) to characterise rule uncertainty and refine heuristics; adopting "recovery schemes" to correct for known biases; and basing decision rules on clearly articulated values and evidence, rather than convention. Copyright © 2017. Published by Elsevier Ltd.

  14. Fluent, fast, and frugal? A formal model evaluation of the interplay between memory, fluency, and comparative judgments.

    PubMed

    Hilbig, Benjamin E; Erdfelder, Edgar; Pohl, Rüdiger F

    2011-07-01

    A new process model of the interplay between memory and judgment processes was recently suggested, assuming that retrieval fluency (that is, the speed with which objects are recognized) will determine inferences concerning such objects in a single-cue fashion. This aspect of the fluency heuristic, an extension of the recognition heuristic, has remained largely untested due to methodological difficulties. To overcome the latter, we propose a measurement model from the class of multinomial processing tree models that can estimate true single-cue reliance on recognition and retrieval fluency. We applied this model to aggregate and individual data from a probabilistic inference experiment and considered both goodness of fit and model complexity to evaluate different hypotheses. The results were relatively clear-cut, revealing that the fluency heuristic is an unlikely candidate for describing comparative judgments concerning recognized objects. These findings are discussed in light of a broader theoretical view on the interplay of memory and judgment processes.

  15. Homo heuristicus: why biased minds make better inferences.

    PubMed

    Gigerenzer, Gerd; Brighton, Henry

    2009-01-01

    Heuristics are efficient cognitive processes that ignore information. In contrast to the widely held view that less processing reduces accuracy, the study of heuristics shows that less information, computation, and time can in fact improve accuracy. We review the major progress made so far: (a) the discovery of less-is-more effects; (b) the study of the ecological rationality of heuristics, which examines in which environments a given strategy succeeds or fails, and why; (c) an advancement from vague labels to computational models of heuristics; (d) the development of a systematic theory of heuristics that identifies their building blocks and the evolved capacities they exploit, and views the cognitive system as relying on an "adaptive toolbox;" and (e) the development of an empirical methodology that accounts for individual differences, conducts competitive tests, and has provided evidence for people's adaptive use of heuristics. Homo heuristicus has a biased mind and ignores part of the available information, yet a biased mind can handle uncertainty more efficiently and robustly than an unbiased mind relying on more resource-intensive and general-purpose processing strategies. Copyright © 2009 Cognitive Science Society, Inc.

  16. A Scalable Heuristic for Viral Marketing Under the Tipping Model

    DTIC Science & Technology

    2013-09-01

    removal of high-degree nodes. The rest of the paper is organized as follows. In Section 2, we provide formal definitions of the tipping model. This is...that must be activated for it to become active as well. Definition 1 (Threshold...returns a set of active nodes after one time step. Definition 2 (Activation Function) Given a threshold function, θ, an activation function Aθ maps
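
    The threshold activation sketched in this record's definitions can be written as a short simulation. This is a minimal sketch under our own assumptions (function names, and the choice to sweep until a fixed point, are ours, not the paper's):

```python
def activate(neighbors, thresholds, seeds):
    """Tipping (threshold) model: a node becomes active once at least
    thresholds[v] of its neighbors are active. Sweeps repeatedly until
    no further node changes state, then returns the active set."""
    active = set(seeds)
    changed = True
    while changed:
        changed = False
        for v, nbrs in neighbors.items():
            if v not in active and sum(n in active for n in nbrs) >= thresholds[v]:
                active.add(v)
                changed = True
    return active

# Tiny example: a path a-b-c where each node needs one active neighbor.
neighbors = {"a": ["b"], "b": ["a", "c"], "c": ["b"]}
thresholds = {"a": 1, "b": 1, "c": 1}
print(sorted(activate(neighbors, thresholds, {"a"})))  # ['a', 'b', 'c']
```

    Raising a single threshold (e.g. requiring two active neighbors for "b") stops the cascade at the seed, which is the behavior a seed-selection heuristic must reason about.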

  17. A linked simulation-optimization model for solving the unknown groundwater pollution source identification problems.

    PubMed

    Ayvaz, M Tamer

    2010-09-20

    This study proposes a linked simulation-optimization model for solving unknown groundwater pollution source identification problems. In the proposed model, the MODFLOW and MT3DMS packages are used to simulate the flow and transport processes in the groundwater system. These models are then integrated with an optimization model based on the heuristic harmony search (HS) algorithm. In the proposed simulation-optimization model, the locations and release histories of the pollution sources are treated as the explicit decision variables and determined through the optimization model. In addition, an implicit solution procedure is proposed to determine the optimum number of pollution sources, which is an advantage of this model. The performance of the proposed model is evaluated on two hypothetical examples covering simple and complex aquifer geometries, measurement error conditions, and different HS solution parameter sets. The results indicate that the proposed simulation-optimization model is effective and may be used to solve inverse pollution source identification problems. Copyright (c) 2010 Elsevier B.V. All rights reserved.
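
    The harmony search metaheuristic at the core of this record can be sketched in a few lines. This is a generic minimal sketch, not the paper's implementation; the objective below (recovering a 2-D "source" location by least squares) and all parameter values are illustrative assumptions:

```python
import random

def harmony_search(f, bounds, hms=10, hmcr=0.9, par=0.3, bw=0.1, iters=2000, seed=0):
    """Minimal harmony search: keep a memory of hms candidate solutions;
    build each new harmony by drawing every variable either from memory
    (prob. hmcr, with optional pitch adjustment, prob. par) or uniformly
    at random; replace the worst memory member whenever improved."""
    rng = random.Random(seed)
    memory = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(hms)]
    scores = [f(x) for x in memory]
    for _ in range(iters):
        new = []
        for j, (lo, hi) in enumerate(bounds):
            if rng.random() < hmcr:
                x = memory[rng.randrange(hms)][j]
                if rng.random() < par:  # pitch adjustment: small local move
                    x = min(hi, max(lo, x + rng.uniform(-bw, bw)))
            else:
                x = rng.uniform(lo, hi)
            new.append(x)
        worst = max(range(hms), key=scores.__getitem__)
        s = f(new)
        if s < scores[worst]:
            memory[worst], scores[worst] = new, s
    best = min(range(hms), key=scores.__getitem__)
    return memory[best], scores[best]

# Illustrative stand-in for source identification: recover a hidden location.
target = (3.0, -1.5)
f = lambda x: (x[0] - target[0]) ** 2 + (x[1] - target[1]) ** 2
x, err = harmony_search(f, [(-10, 10), (-10, 10)])
print(x, err)  # converges near (3.0, -1.5)
```

    In the linked scheme, `f` would instead invoke the flow/transport simulation and return the misfit between simulated and observed concentrations.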

  18. Identifying influential spreaders in complex networks through local effective spreading paths

    NASA Astrophysics Data System (ADS)

    Wang, Xiaojie; Zhang, Xue; Yi, Dongyun; Zhao, Chengli

    2017-05-01

    How to effectively identify a set of influential spreaders in complex networks is of great theoretical and practical value; it can help to inhibit the rapid spread of epidemics, promote the sales of products by word-of-mouth advertising, and so on. A naive strategy is to select the top-ranked nodes as identified by some centrality index; other strategies are mainly based on greedy and heuristic methods. However, most of these approaches do not consider the connections between the selected nodes. Usually, the distances between the selected spreaders are very close, leading to a serious overlap of their influence. As a consequence, the global influence of the spreaders in the network is greatly reduced, which largely restricts the performance of those methods. In this paper, a simple and efficient method is proposed to identify a set of discrete yet influential spreaders. By analyzing the spreading paths in the network, we present the concept of effective spreading paths and measure the influence of nodes via expectation calculation. Numerical analyses on both undirected and directed networks show that our proposed method outperforms many centrality-based and heuristic benchmarks, especially in large-scale networks. Besides, experimental results on different spreading models and parameters demonstrate the stability and wide applicability of our method.

  19. Mapping small molecule binding data to structural domains

    PubMed Central

    2012-01-01

    Background Large-scale bioactivity/SAR Open Data has recently become available, and this has allowed new analyses and approaches to be developed to help address the productivity and translational gaps of current drug discovery. One of the current limitations of these data is the relative sparsity of reported interactions per protein target, and complexities in establishing clear relationships between bioactivity and targets using bioinformatics tools. We detail in this paper the indexing of targets by the structural domains that bind (or are likely to bind) the ligand within a full-length protein. Specifically, we present a simple heuristic to map small molecule binding to Pfam domains. This profiling can be applied to all proteins within a genome to give some indications of the potential pharmacological modulation and regulation of all proteins. Results In this implementation of our heuristic, ligand binding to protein targets from the ChEMBL database was mapped to structural domains as defined by profiles contained within the Pfam-A database. Our mapping suggests that the majority of assay targets within the current version of the ChEMBL database bind ligands through a small number of highly prevalent domains, and conversely the majority of Pfam domains sampled by our data play no currently established role in ligand binding. Validation studies, carried out firstly against Uniprot entries with expert binding-site annotation and secondly against entries in the wwPDB repository of crystallographic protein structures, demonstrate that our simple heuristic maps ligand binding to the correct domain in about 90 percent of all assessed cases. Using the mappings obtained with our heuristic, we have assembled ligand sets associated with each Pfam domain. Conclusions Small molecule binding has been mapped to Pfam-A domains of protein targets in the ChEMBL bioactivity database. 
The result of this mapping is an enriched annotation of small molecule bioactivity data and a grouping of activity classes following the Pfam-A specifications of protein domains. This is valuable for data-focused approaches in drug discovery, for example when extrapolating potential targets of a small molecule with known activity against one or few targets, or in the assessment of a potential target for drug discovery or screening studies. PMID:23282026

  20. The Priority Heuristic: Making Choices without Trade-Offs

    ERIC Educational Resources Information Center

    Brandstatter, Eduard; Gigerenzer, Gerd; Hertwig, Ralph

    2006-01-01

    Bernoulli's framework of expected utility serves as a model for various psychological processes, including motivation, moral sense, attitudes, and decision making. To account for evidence at variance with expected utility, the authors generalize the framework of fast and frugal heuristics from inferences to preferences. The priority heuristic…

  1. The Robust Beauty of Ordinary Information

    ERIC Educational Resources Information Center

    Katsikopoulos, Konstantinos V.; Schooler, Lael J.; Hertwig, Ralph

    2010-01-01

    Heuristics embodying limited information search and noncompensatory processing of information can yield robust performance relative to computationally more complex models. One criticism raised against heuristics is the argument that complexity is hidden in the calculation of the cue order used to make predictions. We discuss ways to order cues…

  2. An Illustrative Case Study of the Heuristic Practices of a High-Performing Research Department: Toward Building a Model Applicable in the Context of Large Urban Districts

    ERIC Educational Resources Information Center

    Munoz, Marco A.; Rodosky, Robert J.

    2011-01-01

    This case study provides an illustration of the heuristic practices of a high-performing research department, which in turn, will help build much needed models applicable in the context of large urban districts. This case study examines the accountability, planning, evaluation, testing, and research functions of a research department in a large…

  3. Universal Fragment Descriptors for Predicting Electronic and Mechanical Properties of Inorganic Crystals

    NASA Astrophysics Data System (ADS)

    Oses, Corey; Isayev, Olexandr; Toher, Cormac; Curtarolo, Stefano; Tropsha, Alexander

    Historically, materials discovery has been driven by a laborious trial-and-error process. The growth of materials databases and emerging informatics approaches finally offer the opportunity to transform this practice into data- and knowledge-driven rational design, accelerating the discovery of novel materials exhibiting desired properties. Using data from the AFLOW repository of high-throughput, ab-initio calculations, we have generated Quantitative Materials Structure-Property Relationship (QMSPR) models to predict critical materials properties, including metal/insulator classification, band gap energy, and bulk modulus. The prediction accuracy obtained with these QMSPR models approaches that of the training data for virtually any stoichiometric inorganic crystalline material. We attribute the success and universality of these models to the construction of new materials descriptors, referred to as universal Property-Labeled Material Fragments (PLMF). This representation affords straightforward model interpretation in terms of simple heuristic design rules that could guide rational materials design. This proof-of-concept study demonstrates the power of materials informatics to dramatically accelerate the search for new materials.

  4. A guide to the use of theoretical models of the solar nebula for the interpretation of the meteoritic record

    NASA Technical Reports Server (NTRS)

    Cassen, Pat

    1991-01-01

    Attempts to derive a theoretical framework for the interpretation of the meteoritic record have been frustrated by our incomplete understanding of the fundamental processes that controlled the evolution of the primitive solar nebula. Nevertheless, it is possible to develop qualitative models of the nebula that illuminate its dynamic character, as well as the roles of some key parameters. These models draw on the growing body of observational data on the properties of disks around young, solar-type stars, and are constructed by applying the results of known solutions of protostellar collapse problems; making simple assumptions about the radial variations of nebular variables; and imposing the integral constraints demanded by conservation of mass, angular momentum, and energy. The models so constructed are heuristic, rather than predictive; they are intended to help us think about the nebula in realistic ways, but they cannot provide a definitive description of conditions in the nebula.

  5. Fear and Loving in Las Vegas: Evolution, Emotion, and Persuasion

    PubMed Central

    Griskevicius, Vladas; Goldstein, Noah J.; Mortensen, Chad R.; Sundie, Jill M.; Cialdini, Robert B.; Kenrick, Douglas T.

    2009-01-01

    How do arousal-inducing contexts, such as frightening or romantic television programs, influence the effectiveness of basic persuasion heuristics? Different predictions are made by three theoretical models: A general arousal model predicts that arousal should increase effectiveness of heuristics; an affective valence model predicts that effectiveness should depend on whether the context elicits positive or negative affect; an evolutionary model predicts that persuasiveness should depend on both the specific emotion that is elicited and the content of the particular heuristic. Three experiments examined how fear-inducing versus romantic contexts influenced the effectiveness of two widely used heuristics—social proof (e.g., “most popular”) and scarcity (e.g., “limited edition”). Results supported predictions from an evolutionary model, showing that fear can lead scarcity appeals to be counter-persuasive, and that romantic desire can lead social proof appeals to be counter-persuasive. The findings highlight how an evolutionary theoretical approach can lead to novel theoretical and practical marketing insights. PMID:19727416

  6. Risky choice with heuristics: reply to Birnbaum (2008), Johnson, Schulte-Mecklenbeck, and Willemsen (2008), and Rieger and Wang (2008).

    PubMed

    Brandstätter, Eduard; Gigerenzer, Gerd; Hertwig, Ralph

    2008-01-01

    E. Brandstätter, G. Gigerenzer, and R. Hertwig (2006) showed that the priority heuristic matches or outperforms modifications of expected utility theory in predicting choice in 4 diverse problem sets. M. H. Birnbaum (2008) argued that sets exist in which the opposite is true. The authors agree--but stress that all choice strategies have regions of good and bad performance. The accuracy of various strategies systematically depends on choice difficulty, which the authors consider a triggering variable underlying strategy selection. Agreeing with E. J. Johnson, M. Schulte-Mecklenbeck, and M. C. Willemsen (2008) that process (not "as-if") models need to be formulated, the authors show how quantitative predictions can be derived and test them. Finally, they demonstrate that many of Birnbaum's and M. O. Rieger and M. Wang's (2008) case studies championing their preferred models involved biased tests in which the priority heuristic predicted data, whereas the parameterized models were fitted to the same data. The authors propose an adaptive toolbox approach of risky choice, according to which people first seek a no-conflict solution before resorting to conflict-resolving strategies such as the priority heuristic. (c) 2008 APA, all rights reserved

  7. Multicriteria meta-heuristics for AGV dispatching control based on computational intelligence.

    PubMed

    Naso, David; Turchiano, Biagio

    2005-04-01

    In many manufacturing environments, automated guided vehicles are used to move the processed materials between various pickup and delivery points. The assignment of vehicles to unit loads is a complex problem that is often solved in real time with simple dispatching rules. This paper proposes an automated guided vehicle dispatching approach based on computational intelligence. We adopt a fuzzy multicriteria decision strategy to simultaneously take into account multiple aspects in every dispatching decision. Since the typical short-term view of dispatching rules is one of the main limitations of such real-time assignment heuristics, we also incorporate in the multicriteria algorithm a specific heuristic rule that takes into account empty-vehicle travel on a longer time horizon. Moreover, we adopt a genetic algorithm to tune the weights associated with each decision criterion in the global decision algorithm. The proposed approach is validated by means of a comparison with other dispatching rules and with other recently proposed multicriteria dispatching strategies also based on computational intelligence. The analysis of the results obtained by the proposed dispatching approach in both nominal and perturbed operating conditions (congestion, faults) confirms its effectiveness.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Firestone, Ryan; Marnay, Chris

    The on-site generation of electricity can offer building owners and occupiers financial benefits as well as social benefits such as reduced grid congestion, improved energy efficiency, and reduced greenhouse gas emissions. Combined heat and power (CHP), or cogeneration, systems make use of the waste heat from the generator for site heating needs. Real-time optimal dispatch of CHP systems is difficult to determine because of complicated electricity tariffs and uncertainty in CHP equipment availability, energy prices, and system loads. Typically, CHP systems use simple heuristic control strategies. This paper describes a method of determining optimal control in real-time and applies it to a light industrial site in San Diego, California, to examine: 1) the added benefit of optimal over heuristic controls, 2) the price elasticity of the system, and 3) the site-attributable greenhouse gas emissions, all under three different tariff structures. Results suggest that heuristic controls are adequate under the current tariff structure and relatively high electricity prices, capturing 97 percent of the value of the distributed generation system. Even more value could be captured by simply not running the CHP system during times of unusually high natural gas prices. Under hypothetical real-time pricing of electricity, heuristic controls would capture only 70 percent of the value of distributed generation.

  9. The Use of Recognition in Group Decision-Making

    ERIC Educational Resources Information Center

    Reimer, Torsten; Katsikopoulos, Konstantinos V.

    2004-01-01

    Goldstein and Gigerenzer (2002) [Models of ecological rationality: The recognition heuristic. "Psychological Review," 109 (1), 75-90] found evidence for the use of the recognition heuristic. For example, if an individual recognizes only one of two cities, they tend to infer that the recognized city has a larger population. A prediction…
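
    The recognition heuristic described in this record reduces to a one-line decision rule. A minimal sketch (the function name, the knowledge fallback, and the city names are illustrative assumptions, not from the paper):

```python
def recognition_choice(a, b, recognized, knowledge=None):
    """Recognition heuristic for a two-alternative inference such as
    'which city has the larger population?': if exactly one object is
    recognized, infer that it scores higher on the criterion; otherwise
    fall back on whatever knowledge is available (or default to a)."""
    ra, rb = a in recognized, b in recognized
    if ra and not rb:
        return a
    if rb and not ra:
        return b
    if knowledge:  # both (or neither) recognized: use criterion knowledge
        return max((a, b), key=lambda x: knowledge.get(x, 0))
    return a       # placeholder for guessing

recognized = {"Munich"}
print(recognition_choice("Munich", "Herne", recognized))  # Munich
```

    The group setting studied here then becomes a question of how to aggregate such single-member inferences, e.g. by majority vote.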

  10. A Heuristic for the Teaching of Persuasion.

    ERIC Educational Resources Information Center

    Schell, John F.

    Interpreting Aristotle's criteria for persuasive writing--ethos, logos, and pathos--as a concern for writer, language, and audience creates both an effective model for persuasive writing and a structure around which to organize discussions of relevant rhetorical issues. Use of this heuristic to analyze writing style, organization, and content…

  11. Wishful Thinking? Inside the Black Box of Exposure Assessment.

    PubMed

    Money, Annemarie; Robinson, Christine; Agius, Raymond; de Vocht, Frank

    2016-05-01

    Decision-making processes used by experts when undertaking occupational exposure assessment are relatively unknown, but it is often assumed that there is a common underlying method that experts employ. However, differences in training and experience of assessors make it unlikely that one general method for expert assessment would exist. Therefore, there are concerns about formalizing, validating, and comparing expert estimates within and between studies that are difficult, if not impossible, to characterize. Heuristics, on the other hand (the processes involved in decision making), have been extensively studied. Heuristics are deployed by everyone as short-cuts to make the often complex process of decision-making simpler, quicker, and less burdensome. Experts' assessments are often subject to various simplifying heuristics as a way to reach a decision in the absence of sufficient data. Therefore, investigating the underlying heuristics or decision-making processes involved may help to shed light on the 'black box' of exposure assessment. A mixed-method study was conducted utilizing both a web-based exposure assessment exercise incorporating quantitative and semiqualitative elements of data collection, and qualitative semi-structured interviews with exposure assessors. Qualitative data were analyzed using thematic analysis. Twenty-five experts completed the web-based exposure assessment exercise, and 8 of these 25 were randomly selected to participate in the follow-up interview. Familiar key themes relating to the exposure assessment exercise emerged: 'intensity', 'probability', 'agent', 'process', and 'duration' of exposure. However, the detailed follow-up interviews revealed a lack of structure and order in how participants described their decision making. Participants mostly described some form of an iterative process, heavily relying on the anchoring and adjustment heuristic, which differed between experts. 
In spite of having undertaken comparable training (in occupational hygiene or exposure assessment), experts use different methods to assess exposure. Decision making appears to be an iterative process with heavy reliance on the key heuristic of anchoring and adjustment. Using multiple experts to assess exposure while providing some form of anchoring scenario to build from, and additional training in understanding the impact of simple heuristics on the process of decision making, is likely to produce a more methodical approach to assessment; thereby improving consistency and transparency in expert exposure assessment. © The Author 2016. Published by Oxford University Press on behalf of the British Occupational Hygiene Society.

  12. Wishful Thinking? Inside the Black Box of Exposure Assessment

    PubMed Central

    Money, Annemarie; Robinson, Christine; Agius, Raymond; de Vocht, Frank

    2016-01-01

    Background: Decision-making processes used by experts when undertaking occupational exposure assessment are relatively unknown, but it is often assumed that there is a common underlying method that experts employ. However, differences in training and experience of assessors make it unlikely that one general method for expert assessment would exist. Therefore, there are concerns about formalizing, validating, and comparing expert estimates within and between studies that are difficult, if not impossible, to characterize. Heuristics, on the other hand (the processes involved in decision making), have been extensively studied. Heuristics are deployed by everyone as short-cuts to make the often complex process of decision-making simpler, quicker, and less burdensome. Experts’ assessments are often subject to various simplifying heuristics as a way to reach a decision in the absence of sufficient data. Therefore, investigating the underlying heuristics or decision-making processes involved may help to shed light on the ‘black box’ of exposure assessment. Methods: A mixed-method study was conducted utilizing both a web-based exposure assessment exercise incorporating quantitative and semiqualitative elements of data collection, and qualitative semi-structured interviews with exposure assessors. Qualitative data were analyzed using thematic analysis. Results: Twenty-five experts completed the web-based exposure assessment exercise, and 8 of these 25 were randomly selected to participate in the follow-up interview. Familiar key themes relating to the exposure assessment exercise emerged: ‘intensity’, ‘probability’, ‘agent’, ‘process’, and ‘duration’ of exposure. However, the detailed follow-up interviews revealed a lack of structure and order in how participants described their decision making. 
Participants mostly described some form of an iterative process, heavily relying on the anchoring and adjustment heuristic, which differed between experts. Conclusion: In spite of having undertaken comparable training (in occupational hygiene or exposure assessment), experts use different methods to assess exposure. Decision making appears to be an iterative process with heavy reliance on the key heuristic of anchoring and adjustment. Using multiple experts to assess exposure while providing some form of anchoring scenario to build from, and additional training in understanding the impact of simple heuristics on the process of decision making, is likely to produce a more methodical approach to assessment; thereby improving consistency and transparency in expert exposure assessment. PMID:26764244

  13. Guided Iterative Substructure Search (GI-SSS) - A New Trick for an Old Dog.

    PubMed

    Weskamp, Nils

    2016-07-01

    Substructure search (SSS) is a fundamental technique supported by various chemical information systems. Many users apply it in an iterative manner: they modify their queries to shape the composition of the retrieved hit sets according to their needs. We propose and evaluate two heuristic extensions of SSS aimed at simplifying these iterative query modifications by collecting additional information during query processing and visualizing this information in an intuitive way. This gives the user a convenient feedback on how certain changes to the query would affect the retrieved hit set and reduces the number of trial-and-error cycles needed to generate an optimal search result. The proposed heuristics are simple, yet surprisingly effective and can be easily added to existing SSS implementations. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. A Conceptual Model of the Cognitive Processing of Environmental Distance Information

    NASA Astrophysics Data System (ADS)

    Montello, Daniel R.

    I review theories and research on the cognitive processing of environmental distance information by humans, particularly that acquired via direct experience in the environment. The cognitive processes I consider for acquiring and thinking about environmental distance information include working-memory, nonmediated, hybrid, and simple-retrieval processes. Based on my review of the research literature, and additional considerations about the sources of distance information and the situations in which it is used, I propose an integrative conceptual model to explain the cognitive processing of distance information that takes account of the plurality of possible processes and information sources, and describes conditions under which particular processes and sources are likely to operate. The mechanism of summing vista distances is identified as widely important in situations with good visual access to the environment. Heuristics based on time, effort, or other information are likely to play their most important role when sensory access is restricted.

  15. Two-dimensional nanoscale correlations in the strong negative thermal expansion material ScF3

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Handunkanda, Sahan U.; Occhialini, Connor A.; Said, Ayman H.

    We present diffuse x-ray scattering data on the strong negative thermal expansion (NTE) material ScF3 and find that two-dimensional nanoscale correlations exist at momentum-space regions associated with possibly rigid rotations of the perovskite octahedra. We address the extent to which rigid octahedral motion describes the dynamical fluctuations behind NTE by generalizing a simple model supporting a single floppy mode that is often used to heuristically describe instances of NTE. We find this model has tendencies toward dynamic inhomogeneities, and its application to recent and existing experimental data suggests an intricate link between the nanometer correlation length scale, the energy scale for octahedral tilt fluctuations, and the coefficient of thermal expansion in ScF3. We then investigate the breakdown of the rigid limit and propose a resolution to an outstanding debate concerning the role of molecular rigidity in strong NTE materials.

  16. Robust position estimation of a mobile vehicle

    NASA Astrophysics Data System (ADS)

    Conan, Vania; Boulanger, Pierre; Elgazzar, Shadia

    1994-11-01

    The ability to estimate the position of a mobile vehicle is a key task for navigation over large distances in complex indoor environments such as nuclear power plants. Schematics of the plants are available, but they are incomplete, as real settings contain many objects, such as pipes, cables or furniture, that mask part of the model. The position estimation method described in this paper matches 3-D data with a simple schematic of a plant. It is basically independent of odometry information and viewpoint, robust to noisy data and spurious points and largely insensitive to occlusions. The method is based on a hypothesis/verification paradigm and its complexity is polynomial: it runs in O(m^4 n^4), where m represents the number of model patches and n the number of scene patches. Heuristics are presented to speed up the algorithm. Results on real 3-D data show good behavior even when the scene is very occluded.

  17. Efficient computation paths for the systematic analysis of sensitivities

    NASA Astrophysics Data System (ADS)

    Greppi, Paolo; Arato, Elisabetta

    2013-01-01

    A systematic sensitivity analysis requires computing the model on all points of a multi-dimensional grid covering the domain of interest, defined by the ranges of variability of the inputs. The issues to efficiently perform such analyses on algebraic models are handling solution failures within and close to the feasible region and minimizing the total iteration count. Scanning the domain in the obvious order is sub-optimal in terms of total iterations and is likely to cause many solution failures. The problem of choosing a better order can be translated geometrically into finding Hamiltonian paths on certain grid graphs. This work proposes two paths, one based on a mixed-radix Gray code and the other, a quasi-spiral path, produced by a novel heuristic algorithm. Some simple, easy-to-visualize examples are presented, followed by performance results for the quasi-spiral algorithm and the practical application of the different paths in a process simulation tool.
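
    One concrete Hamiltonian path of the kind this record discusses is a boustrophedon ("snake") scan, a mixed-radix reflected ordering in which consecutive grid points differ in exactly one coordinate by one step, so each model solve can be warm-started from its neighbor. A minimal sketch (our own construction, not the paper's quasi-spiral algorithm):

```python
def snake_path(radices):
    """Enumerate all points of a mixed-radix grid (dimension d has
    radices[d] levels) so that consecutive points differ in exactly
    one coordinate by +/-1. Built dimension by dimension, reversing
    the sweep direction on every other prefix."""
    path = [[]]
    for r in radices:
        new = []
        for i, prefix in enumerate(path):
            idxs = range(r) if i % 2 == 0 else range(r - 1, -1, -1)
            new.extend(prefix + [k] for k in idxs)
        path = new
    return path

pts = snake_path([3, 2])
print(pts)
# Every consecutive pair differs in one coordinate by exactly 1.
assert all(sum(abs(a - b) for a, b in zip(p, q)) == 1
           for p, q in zip(pts, pts[1:]))
```

    Scanning in the obvious row-major order instead would make large jumps at row boundaries, which is exactly the warm-start penalty the paper's better orderings avoid.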

  18. Monitoring methods and predictive models for water status in Jonathan apples.

    PubMed

    Trincă, Lucia Carmen; Căpraru, Adina-Mirela; Arotăriţei, Dragoş; Volf, Irina; Chiruţă, Ciprian

    2014-02-01

    Evaluation of water status in Jonathan apples was performed for 20 days. Loss of moisture content (LMC) was determined through slow drying of whole apples, and moisture content (MC) was determined through oven drying and lyophilisation of apple samples (chunks, crushed, and juice). We developed a non-destructive method to evaluate the LMC and MC of apples using image processing and a multilayer neural network (NN) predictor. We propose a new, simple algorithm that selects texture descriptors from a heuristically chosen initial set. Both the structure and weights of the NN are optimised by a genetic algorithm with variable-length genotype, which led to a high precision of the predictive model (R(2)=0.9534). In our opinion, the development of this non-destructive method for the assessment of LMC and MC (and of other chemical parameters) is very promising for online inspection of food quality. Copyright © 2013 Elsevier Ltd. All rights reserved.

  19. Approximation algorithms for a genetic diagnostics problem.

    PubMed

    Kosaraju, S R; Schäffer, A A; Biesecker, L G

    1998-01-01

    We define and study a combinatorial problem called WEIGHTED DIAGNOSTIC COVER (WDC) that models the use of a laboratory technique called genotyping in the diagnosis of an important class of chromosomal aberrations. An optimal solution to WDC would enable us to define a genetic assay that maximizes the diagnostic power for a specified cost of laboratory work. We develop approximation algorithms for WDC by making use of the well-known problem SET COVER for which the greedy heuristic has been extensively studied. We prove worst-case performance bounds on the greedy heuristic for WDC and for another heuristic we call directional greedy. We implemented both heuristics. We also implemented a local search heuristic that takes the solutions obtained by greedy and dir-greedy and applies swaps until they are locally optimal. We report their performance on a real data set that is representative of the options that a clinical geneticist faces for the real diagnostic problem. Many open problems related to WDC remain, both of theoretical interest and practical importance.
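
    The well-studied greedy heuristic for weighted SET COVER that this record builds on can be stated compactly: repeatedly pick the set with the lowest cost per newly covered element. A minimal sketch (the toy instance is illustrative, not from the paper):

```python
def greedy_weighted_cover(universe, sets, cost):
    """Greedy heuristic for weighted set cover: while elements remain
    uncovered, choose the set minimizing cost per newly covered
    element. Achieves the classic logarithmic approximation bound."""
    uncovered = set(universe)
    chosen = []
    while uncovered:
        best = min(
            (s for s in sets if sets[s] & uncovered),
            key=lambda s: cost[s] / len(sets[s] & uncovered),
        )
        chosen.append(best)
        uncovered -= sets[best]
    return chosen

sets = {"A": {1, 2, 3}, "B": {3, 4}, "C": {4, 5}, "D": {1, 2, 3, 4, 5}}
cost = {"A": 1.0, "B": 1.0, "C": 1.0, "D": 2.1}
print(greedy_weighted_cover({1, 2, 3, 4, 5}, sets, cost))  # ['A', 'C']
```

    The WDC heuristics in the record refine this template, e.g. directional greedy restricts which sets may be chosen at each step, and local search then swaps sets until no single exchange improves the solution.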

  20. Does interaction matter? Testing whether a confidence heuristic can replace interaction in collective decision-making.

    PubMed

    Bang, Dan; Fusaroli, Riccardo; Tylén, Kristian; Olsen, Karsten; Latham, Peter E; Lau, Jennifer Y F; Roepstorff, Andreas; Rees, Geraint; Frith, Chris D; Bahrami, Bahador

    2014-05-01

    In a range of contexts, individuals arrive at collective decisions by sharing confidence in their judgements. This tendency to evaluate the reliability of information by the confidence with which it is expressed has been termed the 'confidence heuristic'. We tested two ways of implementing the confidence heuristic in the context of a collective perceptual decision-making task: either directly, by opting for the judgement made with higher confidence, or indirectly, by opting for the faster judgement, exploiting an inverse correlation between confidence and reaction time. We found that the success of these heuristics depends on how similar individuals are in terms of the reliability of their judgements and, more importantly, that for dissimilar individuals such heuristics are dramatically inferior to interaction. Interaction allows individuals to alleviate, but not fully resolve, differences in the reliability of their judgements. We discuss the implications of these findings for models of confidence and collective decision-making. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
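    The direct confidence heuristic can be illustrated with a small signal-detection simulation. All parameters below are illustrative, not the paper's perceptual task: the stimulus is +1 or -1, each observer sees it through Gaussian noise, and the dyad adopts the judgement made with the higher confidence (larger absolute evidence).

```python
import random

def simulate(sigma_a, sigma_b, trials=20000, seed=0):
    """Signal-detection sketch of the confidence heuristic: the dyad
    takes the more confident observer's judgement on each trial."""
    rng = random.Random(seed)
    hits = [0, 0, 0]  # observer A, observer B, confidence-heuristic dyad
    for _ in range(trials):
        s = rng.choice((-1, 1))
        ea = s + rng.gauss(0, sigma_a)
        eb = s + rng.gauss(0, sigma_b)
        dyad = ea if abs(ea) >= abs(eb) else eb
        for i, e in enumerate((ea, eb, dyad)):
            hits[i] += (e > 0) == (s > 0)
    return [h / trials for h in hits]

acc_a, acc_b, acc_dyad = simulate(1.0, 1.0)      # equally reliable observers
acc_a2, acc_b2, acc_dyad2 = simulate(1.0, 3.0)   # dissimilar observers
```

    For equally reliable observers the heuristic beats either individual, while for very unequal observers it tends to fall below the better individual, mirroring the paper's finding that such heuristics are inferior when judgement reliabilities differ.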

  1. Evaluation of a Heuristic Model for Tropical Cyclone Resilience

    DTIC Science & Technology

    2015-01-26

    ...in which the effective static stability vanishes in rising and sinking regions, the heuristic model yields a poor approximation to the simulated...tilt configuration. However, in the moist-neutral limit, in which the effective static stability vanishes in rising and sinking regions, the...larger, leading to more effective damping of the tilt mode (e.g., Schecter and Montgomery 2007; see their Figs. 10 and 11 and accompanying...

  2. Using decision tree models to depict primary care physicians CRC screening decision heuristics.

    PubMed

    Wackerbarth, Sarah B; Tarasenko, Yelena N; Curtis, Laurel A; Joyce, Jennifer M; Haist, Steven A

    2007-10-01

    The purpose of this study was to identify decision heuristics utilized by primary care physicians in formulating colorectal cancer screening recommendations. Qualitative research using in-depth semi-structured interviews. We interviewed 66 primary care internists and family physicians evenly drawn from academic and community practices. A majority of physicians were male, and almost all were white, non-Hispanic. Three researchers independently reviewed each transcript to determine the physician's decision criteria and developed decision trees. Final trees were developed by consensus. The constant comparative methodology was used to define the categories. Physicians were found to use 1 of 4 heuristics ("age 50," "age 50, if family history, then earlier," "age 50, if family history, then screen at age 40," or "age 50, if family history, then adjust relative to reference case") for the timing recommendation and 5 heuristics ["fecal occult blood test" (FOBT), "colonoscopy," "if not colonoscopy, then...," "FOBT and another test," and "a choice between options"] for the type decision. No connection was found between timing and screening type heuristics. We found evidence of heuristic use. Further research is needed to determine the potential impact on quality of care.

  3. Learning and inference using complex generative models in a spatial localization task.

    PubMed

    Bejjanki, Vikranth R; Knill, David C; Aslin, Richard N

    2016-01-01

    A large body of research has established that, under relatively simple task conditions, human observers integrate uncertain sensory information with learned prior knowledge in an approximately Bayes-optimal manner. However, in many natural tasks, observers must perform this sensory-plus-prior integration when the underlying generative model of the environment consists of multiple causes. Here we ask if the Bayes-optimal integration seen with simple tasks also applies to such natural tasks when the generative model is more complex, or whether observers rely instead on a less efficient set of heuristics that approximate ideal performance. Participants localized a "hidden" target whose position on a touch screen was sampled from a location-contingent bimodal generative model with different variances around each mode. Over repeated exposure to this task, participants learned the a priori locations of the target (i.e., the bimodal generative model), and integrated this learned knowledge with uncertain sensory information on a trial-by-trial basis in a manner consistent with the predictions of Bayes-optimal behavior. In particular, participants rapidly learned the locations of the two modes of the generative model, but the relative variances of the modes were learned much more slowly. Taken together, our results suggest that human performance in a more complex localization task, which requires the integration of sensory information with learned knowledge of a bimodal generative model, is consistent with the predictions of Bayes-optimal behavior, but involves a much longer time-course than in simpler tasks.
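    The Bayes-optimal computation the participants approximated can be sketched numerically: combine a bimodal (mixture-of-Gaussians) prior over target location with a Gaussian sensory likelihood around the noisy cue, and take the posterior mean as the estimate. The modes, variances, and cue below are illustrative, not the experiment's actual parameters.

```python
import math

def gauss(x, mu, sd):
    return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

# Grid over possible target locations (illustrative numbers throughout).
n = 4001
xs = [-10.0 + 20.0 * i / (n - 1) for i in range(n)]
dx = xs[1] - xs[0]
cue, sensory_sd = 2.0, 3.0

# Unnormalised posterior = bimodal prior * likelihood of the cue at each location.
post = [
    (0.5 * gauss(x, -4.0, 1.0) + 0.5 * gauss(x, 4.0, 2.0)) * gauss(cue, x, sensory_sd)
    for x in xs
]
z = sum(post) * dx
post = [p / z for p in post]
estimate = sum(x * p for x, p in zip(xs, post)) * dx  # posterior-mean estimate
```

    With these numbers the estimate is pulled from the cue (2.0) toward the nearer, more likely prior mode at +4, weighted by how well each mode explains the cue.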

  4. Review of finite fields: Applications to discrete Fourier, transforms and Reed-Solomon coding

    NASA Technical Reports Server (NTRS)

    Wong, J. S. L.; Truong, T. K.; Benjauthrit, B.; Mulhall, B. D. L.; Reed, I. S.

    1977-01-01

    An attempt is made to provide a step-by-step approach to the subject of finite fields. Rigorous proofs and highly theoretical materials are avoided. The simple concepts of groups, rings, and fields are discussed and developed more or less heuristically. Examples are used liberally to illustrate the meaning of definitions and theories. Applications include discrete Fourier transforms and Reed-Solomon coding.

  5. The Mean Distance to the nth Neighbour in a Uniform Distribution of Random Points: An Application of Probability Theory

    ERIC Educational Resources Information Center

    Bhattacharyya, Pratip; Chakrabarti, Bikas K.

    2008-01-01

    We study different ways of determining the mean distance (r[subscript n]) between a reference point and its nth neighbour among random points distributed with uniform density in a D-dimensional Euclidean space. First, we present a heuristic method; though this method provides only a crude mathematical result, it shows a simple way of estimating…
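    The quantity in question is easy to estimate by Monte Carlo, which also serves as a check on the heuristic and exact methods the paper compares. The sketch below works in two dimensions, with the reference point at the centre of a unit square to keep edge effects small; the sample sizes are illustrative.

```python
import math
import random

def mean_nth_neighbour_distance(n, num_points=500, trials=1000, seed=0):
    """Monte Carlo estimate of the mean distance from a central reference
    point to its nth nearest neighbour, for points scattered uniformly in
    the unit square."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        dists = sorted(
            math.hypot(rng.random() - 0.5, rng.random() - 0.5)
            for _ in range(num_points)
        )
        total += dists[n - 1]
    return total / trials

r1 = mean_nth_neighbour_distance(1)
r3 = mean_nth_neighbour_distance(3)
```

    For n = 1 and density rho = 500 this reproduces the known planar result <r_1> = 1/(2*sqrt(rho)) ≈ 0.0224 closely.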

  6. Lattice fermions

    NASA Technical Reports Server (NTRS)

    Wilczek, Frank

    1987-01-01

    A simple heuristic proof of the Nielsen-Ninomiya theorem is given. A method is proposed whereby the multiplication of fermion species on a lattice is reduced to the minimal doubling, in any dimension, with retention of appropriate chiral symmetries. Also, it is suggested that use of spatially thinned fermion fields is likely to be a useful and appropriate approximation in QCD - in any case, it is a self-checking one.

  7. Some observations on boundary conditions for numerical conservation laws

    NASA Technical Reports Server (NTRS)

    Kamowitz, David

    1988-01-01

    Four choices of outflow boundary conditions are considered for numerical conservation laws. All four methods are stable for linear problems, for which examples are presented where either a boundary layer forms or the numerical scheme, together with the boundary condition, is unstable due to the formation of a reflected shock. A simple heuristic argument is presented for determining the suitability of the boundary condition.

  8. Decision support for hospital bed management using adaptable individual length of stay estimations and shared resources

    PubMed Central

    2013-01-01

    Background Elective patient admission and assignment planning is an important task of the strategic and operational management of a hospital and early on became a central topic of clinical operations research. The management of hospital beds is an important subtask. Various approaches have been proposed, involving the computation of efficient assignments with regard to the patients’ condition, the necessity of the treatment, and the patients’ preferences. However, these approaches are mostly based on static, unadaptable estimates of the length of stay and, thus, do not take into account the uncertainty of the patient’s recovery. Furthermore, the effect of aggregated bed capacities has not been investigated in this context. Computer-supported bed management, combining an adaptable length-of-stay estimation with the treatment of shared resources (aggregated bed capacities), has not yet been sufficiently investigated. The aim of our work is: 1) to define a cost function for patient admission taking into account adaptable length of stay estimations and aggregated resources, 2) to define a mathematical program formally modeling the assignment problem and an architecture for decision support, 3) to investigate four algorithmic methodologies addressing the assignment problem and one base-line approach, and 4) to evaluate these methodologies w.r.t. cost outcome, performance, and dismissal ratio. Methods The expected free ward capacity is calculated based on individual length of stay estimates, introducing Bernoulli distributed random variables for the ward occupation states and approximating the probability densities. The assignment problem is represented as a binary integer program. Four strategies for solving the problem are applied and compared: an exact approach, using the mixed integer programming solver SCIP; and three heuristic strategies, namely the longest expected processing time, the shortest expected processing time, and random choice. 
A baseline approach serves to compare these optimization strategies with a simple model of the status quo. All the approaches are evaluated by a realistic discrete event simulation: the outcomes are the ratio of successful assignments and dismissals, the computation time, and the model’s cost factors. Results A discrete event simulation of 226,000 cases shows a reduction of the dismissal rate compared to the baseline by more than 30 percentage points (from a mean dismissal ratio of 74.7% to 40.06% comparing the status quo with the optimization strategies). Each of the optimization strategies leads to an improved assignment. The exact approach has only a marginal advantage over the heuristic strategies in the model’s cost factors (≤3%). Moreover, this marginal advantage was only achieved at the price of a computational time fifty times that of the heuristic models (an average computing time of 141 s using the exact method, vs. 2.6 s for the heuristic strategy). Conclusions In terms of its performance and the quality of its solution, the heuristic strategy RAND is the preferred method for bed assignment in the case of shared resources. Future research is needed to investigate whether an equally marked improvement can be achieved in a large scale clinical application study, ideally one comprising all the departments involved in admission and assignment planning. PMID:23289448
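    One ingredient of the cost function, the expected free ward capacity derived from individual length-of-stay estimates, can be sketched as a Poisson-binomial computation: each current patient still occupies a bed tomorrow with an individual probability p_i, and a dynamic program over patients gives the distribution of occupied beds. The capacity and probabilities below are illustrative, not values from the study.

```python
def free_bed_distribution(capacity, stay_probs):
    """Distribution of occupied beds on a ward when patient i still
    occupies a bed with probability stay_probs[i] (Poisson-binomial),
    plus the resulting expected free capacity."""
    dist = [1.0]  # dist[k] = P(k beds occupied)
    for p in stay_probs:
        new = [0.0] * (len(dist) + 1)
        for k, q in enumerate(dist):
            new[k] += q * (1 - p)   # patient discharged
            new[k + 1] += q * p     # patient stays
        dist = new
    expected_free = capacity - sum(k * q for k, q in enumerate(dist))
    return dist, expected_free

dist, exp_free = free_bed_distribution(4, [0.9, 0.5, 0.2])
```

    Here the expected occupancy is 0.9 + 0.5 + 0.2 = 1.6 beds, so 2.4 of the 4 beds are expected to be free; the full distribution supports risk-aware assignment rather than a point estimate.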

  9. Compensatory Reading among ESL Learners: A Reading Strategy Heuristic

    ERIC Educational Resources Information Center

    Ismail, Shaik Abdul Malik Mohamed; Petras, Yusof Ede; Mohamed, Abdul Rashid; Eng, Lin Siew

    2015-01-01

    This paper aims to gain an insight to the relationship of two different concepts about reading comprehension, namely, the linear model of comprehension and the interactive compensatory theory. Drawing on both the above concepts, a heuristic was constructed about three different reading strategies determined by the specific ways the literal,…

  10. One-Reason Decision Making Unveiled: A Measurement Model of the Recognition Heuristic

    ERIC Educational Resources Information Center

    Hilbig, Benjamin E.; Erdfelder, Edgar; Pohl, Rudiger F.

    2010-01-01

    The fast-and-frugal recognition heuristic (RH) theory provides a precise process description of comparative judgments. It claims that, in suitable domains, judgments between pairs of objects are based on recognition alone, whereas further knowledge is ignored. However, due to the confound between recognition and further knowledge, previous…

  11. Heuristics: A Step Toward Getting There.

    ERIC Educational Resources Information Center

    Anderson, G. Ernest, Jr.

    This paper describes a series of heuristic approaches to helping schools analyze problems by the use of a teletype time-sharing computer terminal. The examples detailed include 1) a Delphi exercise for students; 2) a budgeting model which examines the results of various levels of funding and of changes of relative priorities; 3) a school…

  12. Scientific Message Translation and the Heuristic Systematic Model: Insights for Designing Educational Messages About Progesterone and Breast Cancer Risks

    PubMed Central

    Perrault, Evan; Smith, Sandi; Keating, David M.; Nazione, Samantha; Silk, Kami; Russell, Jessica

    2017-01-01

    Results of ongoing scientific research on environmental determinants of breast cancer are not typically presented to the public in ways they can easily understand and use to take preventive actions. In this study, results of scientific studies on progesterone exposure as a risk factor for breast cancer were translated into high and low literacy level messages. Using the heuristic systematic model, this study examined how ability, motivation, and message processing (heuristic and systematic) influenced perceptions of risk beliefs and negative attitudes about progesterone exposure among women who read the translated scientific messages. Among the 1254 participants, those given the higher literacy level message had greater perceptions of risk about progesterone. Heuristic message cues of source credibility and perceived message quality, as well as motivation, also predicted risk beliefs. Finally, risk beliefs were a strong predictor of negative attitudes about exposure to progesterone. The results can help improve health education message design in terms of practitioners having better knowledge of message features that are the most persuasive to the target audiences on this topic. PMID:25903053

  13. A comparative study of the A* heuristic search algorithm used to solve efficiently a puzzle game

    NASA Astrophysics Data System (ADS)

    Iordan, A. E.

    2018-01-01

    The puzzle game presented in this paper consists of polyhedra (prisms, pyramids, or pyramidal frustums) that can be moved into the available free spaces. The problem is to find the minimum number of moves that takes the game from an initial configuration to a goal configuration. Because the problem is complex, the principal difficulty in solving it is the size of the search space, which necessitates heuristic search. The search method is improved by a strong estimate from the heuristic function, which guides the search toward the most promising side of the search tree. A comparative study is carried out between the Manhattan and Hamming heuristics using an A* search algorithm implemented in Java. The paper also presents the stages in the object-oriented development of the software used to solve this puzzle game efficiently. The software is modelled with UML diagrams covering analysis, design, and implementation, describing the system in a clear and practical manner. To confirm the theoretical result that the Manhattan heuristic is more efficient, a space-complexity criterion was used: space complexity was measured by the number of nodes generated in the search tree, the number of expanded nodes, and the effective branching factor. The experimental results obtained with the Manhattan heuristic show improvements in the space complexity of the A* algorithm compared with the Hamming heuristic.
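    The paper's polyhedra puzzle is not reproduced here, but the same Manhattan-versus-Hamming comparison can be illustrated on the classic 8-puzzle, where both heuristics are admissible and Manhattan distance dominates Hamming distance tile by tile.

```python
import heapq

GOAL = (1, 2, 3, 4, 5, 6, 7, 8, 0)  # 0 marks the blank

def hamming(state):
    """Number of misplaced tiles (blank excluded)."""
    return sum(1 for i, v in enumerate(state) if v != 0 and v != GOAL[i])

def manhattan(state):
    """Sum of city-block distances of each tile from its goal cell."""
    d = 0
    for i, v in enumerate(state):
        if v:
            g = v - 1  # tile v belongs at index v - 1
            d += abs(i // 3 - g // 3) + abs(i % 3 - g % 3)
    return d

def neighbours(state):
    i = state.index(0)
    r, c = divmod(i, 3)
    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        nr, nc = r + dr, c + dc
        if 0 <= nr < 3 and 0 <= nc < 3:
            j = nr * 3 + nc
            s = list(state)
            s[i], s[j] = s[j], s[i]
            yield tuple(s)

def astar(start, h):
    """Returns (optimal solution length, number of expanded nodes)."""
    heap = [(h(start), 0, start)]  # (f, -g, state): deeper first on ties
    best_g = {start: 0}
    expanded = 0
    while heap:
        f, neg_g, state = heapq.heappop(heap)
        g = -neg_g
        if g > best_g.get(state, float("inf")):
            continue  # stale heap entry
        if state == GOAL:
            return g, expanded
        expanded += 1
        for nxt in neighbours(state):
            ng = g + 1
            if ng < best_g.get(nxt, float("inf")):
                best_g[nxt] = ng
                heapq.heappush(heap, (ng + h(nxt), -ng, nxt))
    return None, expanded

START = (0, 1, 3, 4, 2, 5, 7, 8, 6)  # four moves from the goal
len_m, exp_m = astar(START, manhattan)
len_h, exp_h = astar(START, hamming)
```

    Both heuristics find the optimal 4-move solution; because Manhattan distance is never smaller than Hamming distance, A* with Manhattan never needs to expand more nodes on deeper instances, which is the space-complexity effect the paper measures.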

  14. Colloid Transport in Saturated Porous Media: Elimination of Attachment Efficiency in a New Colloid Transport Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Landkamer, Lee L.; Harvey, Ronald W.; Scheibe, Timothy D.

    A new colloid transport model is introduced that is conceptually simple but captures the essential features of complicated attachment and detachment behavior of colloids when conditions of secondary minimum attachment exist. This model eliminates the empirical concept of collision efficiency; the attachment rate is computed directly from colloid filtration theory. Also, a new paradigm for colloid detachment based on colloid population heterogeneity is introduced. Assuming the dispersion coefficient can be estimated from tracer behavior, this model has only two fitting parameters: (1) the fraction of colloids that attach irreversibly and (2) the rate at which reversibly attached colloids leave the surface. These two parameters were correlated to physical parameters that control colloid transport such as the depth of the secondary minimum and pore water velocity. Given this correlation, the model serves as a heuristic tool for exploring the influence of physical parameters such as surface potential and fluid velocity on colloid transport. This model can be extended to heterogeneous systems characterized by both primary and secondary minimum deposition by simply increasing the fraction of colloids that attach irreversibly.
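    The two-parameter picture described in the abstract can be sketched as a small kinetic model: a fraction of attaching colloids sticks irreversibly, while the rest attach reversibly and detach at a fixed rate. The rates below are illustrative, not fitted values from the paper, and the batch (well-mixed) setting stands in for the full transport problem.

```python
def simulate_batch(f_irrev, k_att, k_det, t_end=200.0, dt=0.01):
    """Forward-Euler sketch of a two-population attachment model:
    mobile colloids attach at rate k_att; a fraction f_irrev of each
    attachment is irreversible, the rest detaches at rate k_det."""
    c, s_rev, s_irr = 1.0, 0.0, 0.0  # mobile, reversibly, irreversibly attached
    for _ in range(int(t_end / dt)):
        attach = k_att * c * dt
        detach = k_det * s_rev * dt
        c += detach - attach
        s_rev += (1 - f_irrev) * attach - detach
        s_irr += f_irrev * attach
    return c, s_rev, s_irr

c, s_rev, s_irr = simulate_batch(f_irrev=0.3, k_att=1.0, k_det=0.5)
```

    Because every attach-detach cycle diverts a fraction f_irrev of the population to the irreversible pool, essentially all mass ends up irreversibly attached at long times, while the transient partitioning between mobile and reversibly attached colloids is what shapes breakthrough behavior.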

  15. POCO-MOEA: Using Evolutionary Algorithms to Solve the Controller Placement Problem

    DTIC Science & Technology

    2016-03-24

    ...to gather data on POCO-MOEA performance on a series of model networks. The algorithm's behavior is then evaluated and compared to exhaustive...evaluation of a third heuristic based on a Multi-Objective Evolutionary Algorithm (MOEA). This heuristic is modeled after one of the most well-known MOEAs...researchers to extend into more realistic evaluations of the performance characteristics of SDN controllers, such as the use of simulators or live...

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Campbell, Philip LaRoche

    At the end of his life, Stephen Jay Kline, longtime professor of mechanical engineering at Stanford University, completed a book on how to address complex systems. The title of the book is 'Conceptual Foundations of Multi-Disciplinary Thinking' (1995), but the topic of the book is systems. Kline first establishes certain limits that are characteristic of our conscious minds. Kline then establishes a complexity measure for systems and uses that complexity measure to develop a hierarchy of systems. Kline then argues that our minds, due to their characteristic limitations, are unable to model the complex systems in that hierarchy. Computers are of no help to us here. Our attempts at modeling these complex systems are based on the way we successfully model some simple systems, in particular, 'inert, naturally-occurring' objects and processes, such as what is the focus of physics. But complex systems overwhelm such attempts. As a result, the best we can do in working with these complex systems is to use a heuristic, what Kline calls the 'Guideline for Complex Systems.' Kline documents the problems that have developed due to 'oversimple' system models and from the inappropriate application of a system model from one domain to another. One prominent such problem is the Procrustean attempt to make the disciplines that deal with complex systems be 'physics-like.' Physics deals with simple systems, not complex ones, using Kline's complexity measure. The models that physics has developed are inappropriate for complex systems. Kline documents a number of the wasteful and dangerous fallacies of this type.

  17. Modeling adsorption with lattice Boltzmann equation

    PubMed Central

    Guo, Long; Xiao, Lizhi; Shan, Xiaowen; Zhang, Xiaoling

    2016-01-01

    The research of adsorption theory has recently gained renewed attention due to its critical relevance to a number of trending industrial applications, hydrogen storage and shale gas exploration for instance. The existing theoretical foundation, laid mostly in the early twentieth century, was largely based on simple heuristic molecular interaction models and static interaction potentials which, although insightful in illuminating the fundamental mechanisms, are insufficient for computations with realistic adsorbent structure and adsorbate hydrodynamics, both critical for real-life applications. Here we present and validate a novel lattice Boltzmann model incorporating both adsorbate-adsorbate and adsorbate-adsorbent interactions with hydrodynamics which, for the first time, allows adsorption to be computed with real-life details. Connection with the classic Ono-Kondo lattice theory is established, and various adsorption isotherms, both within and beyond the IUPAC classification, are observed as a pseudo-potential is varied. This new approach not only enables an important physical process to be simulated for real-life applications, but also provides an enabling theoretical framework within which the fundamentals of adsorption can be studied. PMID:27256325

  18. A heuristic model for working memory deficit in schizophrenia.

    PubMed

    Qi, Zhen; Yu, Gina P; Tretter, Felix; Pogarell, Oliver; Grace, Anthony A; Voit, Eberhard O

    2016-11-01

    The life of schizophrenia patients is severely affected by deficits in working memory. In various brain regions, the reciprocal interactions between excitatory glutamatergic neurons and inhibitory GABAergic neurons are crucial. Other neurotransmitters, in particular dopamine, serotonin, acetylcholine, and norepinephrine, modulate the local balance between glutamate and GABA and therefore regulate the function of brain regions. Persistent alterations in the balances between the neurotransmitters can result in working memory deficits. Here we present a heuristic computational model that accounts for interactions among neurotransmitters across various brain regions. The model is based on the concept of a neurochemical interaction matrix at the biochemical level and combines this matrix with a mobile model representing physiological dynamic balances among neurotransmitter systems associated with working memory. The comparison of clinical and simulation results demonstrates that the model output is qualitatively very consistent with the available data. In addition, the model captured how perturbations migrated through different neurotransmitters and brain regions. Results showed that chronic administration of ketamine can cause a variety of imbalances, and application of an antagonist of the D2 receptor in PFC can also induce imbalances but in a very different manner. The heuristic computational model permits a variety of assessments of genetic, biochemical, and pharmacological perturbations and serves as an intuitive tool for explaining clinical and biological observations. The heuristic model is more intuitive than biophysically detailed models. It can serve as an important tool for interdisciplinary communication and even for psychiatric education of patients and relatives. This article is part of a Special Issue entitled "System Genetics" Guest Editor: Dr. Yudong Cai and Dr. Tao Huang. Copyright © 2016 Elsevier B.V. All rights reserved.

  19. Better ILP models for haplotype assembly.

    PubMed

    Etemadi, Maryam; Bagherian, Mehri; Chen, Zhi-Zhong; Wang, Lusheng

    2018-02-19

    The haplotype assembly problem for diploids is to find a pair of haplotypes from a given set of aligned Single Nucleotide Polymorphism (SNP) fragments (reads). It has many applications in association studies, drug design, and genetic research. Since this problem is computationally hard, both heuristic and exact algorithms have been designed for it. Although exact algorithms are much slower, they are still of great interest because they usually output significantly better solutions than heuristic algorithms in terms of popular measures such as the Minimum Error Correction (MEC) score, the number of switch errors, and the QAN50 score. Exact algorithms are also valuable because they can be used to witness how good a heuristic algorithm is. The best known exact algorithm is based on integer linear programming (ILP), and it is known that ILP can also be used to improve the output quality of every heuristic algorithm with a small loss in speed. Therefore, faster ILP models for the problem are in high demand. As in previous studies, we consider not only the general case of the problem but also its all-heterozygous case, where we assume that if a column of the input read matrix contains at least one 0 and one 1, then it corresponds to a heterozygous SNP site. For both cases, we design new ILP models for the haplotype assembly problem that aim at minimizing the MEC score. The new models are theoretically better because they contain significantly fewer constraints. More importantly, our experimental results show that for both simulated and real datasets, the new model for the all-heterozygous (respectively, general) case can usually be solved via CPLEX (an ILP solver) at least 5 times (respectively, 2 times) faster than the previous best. Indeed, the running time can sometimes be 41 times better. This paper thus proposes new ILP models for the general and all-heterozygous cases of the haplotype assembly problem. 
Experiments with both real and simulated datasets show that the new models can be solved within much shorter time by CPLEX than the previous bests. We believe that the models can be used to improve heuristic algorithms as well.
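    The MEC objective the ILP models minimize is simple to state in code: each read is assigned to whichever of the two haplotypes it matches best, and the score counts the entries that must be corrected. The reads and haplotypes below are a toy instance, not data from the paper.

```python
def mec_score(reads, h1, h2):
    """Minimum Error Correction score: each read is charged the number
    of mismatches against its better-matching haplotype. '-' marks a
    SNP site the read does not cover."""
    def mismatches(read, hap):
        return sum(1 for r, h in zip(read, hap) if r != '-' and r != h)
    return sum(min(mismatches(r, h1), mismatches(r, h2)) for r in reads)

# Toy instance over 4 SNP sites: two reads fit h1 perfectly, two reads
# each need one correction to fit h2.
reads = ["01-0", "0110", "-101", "1011"]
h1, h2 = "0110", "1001"
score = mec_score(reads, h1, h2)
```

    The hard part solved by the ILP is the inverse problem: searching over haplotype pairs (and read assignments) for the pair minimizing this score.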

  20. Universal fragment descriptors for predicting properties of inorganic crystals

    NASA Astrophysics Data System (ADS)

    Isayev, Olexandr; Oses, Corey; Toher, Cormac; Gossett, Eric; Curtarolo, Stefano; Tropsha, Alexander

    2017-06-01

    Although historically materials discovery has been driven by a laborious trial-and-error process, knowledge-driven materials design can now be enabled by the rational combination of Machine Learning methods and materials databases. Here, data from the AFLOW repository for ab initio calculations is combined with Quantitative Materials Structure-Property Relationship models to predict important properties: metal/insulator classification, band gap energy, bulk/shear moduli, Debye temperature and heat capacities. The prediction's accuracy compares well with the quality of the training data for virtually any stoichiometric inorganic crystalline material, reciprocating the available thermomechanical experimental data. The universality of the approach is attributed to the construction of the descriptors: Property-Labelled Materials Fragments. The representations require only minimal structural input allowing straightforward implementations of simple heuristic design rules.

  1. Universal fragment descriptors for predicting properties of inorganic crystals.

    PubMed

    Isayev, Olexandr; Oses, Corey; Toher, Cormac; Gossett, Eric; Curtarolo, Stefano; Tropsha, Alexander

    2017-06-05

    Although historically materials discovery has been driven by a laborious trial-and-error process, knowledge-driven materials design can now be enabled by the rational combination of Machine Learning methods and materials databases. Here, data from the AFLOW repository for ab initio calculations is combined with Quantitative Materials Structure-Property Relationship models to predict important properties: metal/insulator classification, band gap energy, bulk/shear moduli, Debye temperature and heat capacities. The prediction's accuracy compares well with the quality of the training data for virtually any stoichiometric inorganic crystalline material, reciprocating the available thermomechanical experimental data. The universality of the approach is attributed to the construction of the descriptors: Property-Labelled Materials Fragments. The representations require only minimal structural input allowing straightforward implementations of simple heuristic design rules.

  2. Taking Aim at the Cognitive Side of Learning in Sensorimotor Adaptation Tasks.

    PubMed

    McDougle, Samuel D; Ivry, Richard B; Taylor, Jordan A

    2016-07-01

    Sensorimotor adaptation tasks have been used to characterize processes responsible for calibrating the mapping between desired outcomes and motor commands. Research has focused on how this form of error-based learning takes place in an implicit and automatic manner. However, recent work has revealed the operation of multiple learning processes, even in this simple form of learning. This review focuses on the contribution of cognitive strategies and heuristics to sensorimotor learning, and how these processes enable humans to rapidly explore and evaluate novel solutions to enable flexible, goal-oriented behavior. This new work points to limitations in current computational models, and how these must be updated to describe the conjoint impact of multiple processes in sensorimotor learning. Copyright © 2016 Elsevier Ltd. All rights reserved.

  3. [Application of optimized parameters SVM based on photoacoustic spectroscopy method in fault diagnosis of power transformer].

    PubMed

    Zhang, Yu-xin; Cheng, Zhi-feng; Xu, Zheng-ping; Bai, Jing

    2015-01-01

    In order to solve problems of the traditional power transformer fault diagnosis approach based on dissolved gas analysis (DGA), such as complex operation, carrier-gas consumption, and long test periods, this paper proposes a new method that detects the content of five characteristic gases in transformer oil (CH4, C2H2, C2H4, C2H6 and H2) by photoacoustic spectroscopy and computes the three ratios C2H2/C2H4, CH4/H2, and C2H4/C2H6. Support vector machine models were constructed using cross-validation over five support vector machine formulations and four kernel functions; heuristic algorithms were used to optimize the penalty factor c and kernel parameter g, so as to establish the SVM model with the highest fault-diagnosis accuracy and fastest computing speed. Two heuristic algorithms, particle swarm optimization and a genetic algorithm, were compared in this paper for optimization accuracy and speed. The simulation results show that the SVM model composed of C-SVC, the RBF kernel, and the genetic algorithm attains 97.5% accuracy on the test sample set and 98.3333% accuracy on the training sample set, and that the genetic algorithm was about twice as fast as particle swarm optimization. The method described in this paper has many advantages, such as simple operation, non-contact measurement, no carrier-gas consumption, a short test period, and high stability and sensitivity; the results show that it can replace traditional transformer fault diagnosis by gas chromatography and meets practical engineering needs in transformer fault diagnosis.
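    The GA-based search over the penalty factor c and kernel parameter g can be sketched without an SVM library by substituting a toy fitness surface for the cross-validated accuracy. The peak location, search ranges, and all GA settings below are illustrative assumptions, not values from the study.

```python
import math
import random

def cv_accuracy(c, g):
    """Stand-in for cross-validated SVM accuracy: a real implementation
    would train an RBF-kernel SVM with penalty c and kernel width g on
    each fold. This toy surface peaks at (c, g) = (10, 0.1)."""
    return math.exp(-((math.log10(c) - 1) ** 2 + (math.log10(g) + 1) ** 2))

def genetic_search(fitness, log10_bounds, pop_size=30, generations=60, seed=1):
    """Truncation-selection GA over positive parameters sampled and
    mutated on a log10 scale."""
    rng = random.Random(seed)
    def rand_ind():
        return [10 ** rng.uniform(lo, hi) for lo, hi in log10_bounds]
    pop = [rand_ind() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda ind: fitness(*ind), reverse=True)
        parents = pop[: pop_size // 2]          # elitist truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            child = [x if rng.random() < 0.5 else y for x, y in zip(a, b)]
            child = [v * 10 ** rng.gauss(0, 0.1) for v in child]  # log-scale mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=lambda ind: fitness(*ind))

# search log10(c) in [-2, 3] and log10(g) in [-4, 1]
best_c, best_g = genetic_search(cv_accuracy, [(-2, 3), (-4, 1)])
```

    With elitism the best fitness never degrades, so the search converges to the neighbourhood of the accuracy peak; swapping in a real cross-validation loop as the fitness function recovers the paper's scheme.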

  4. Fluency of pharmaceutical drug names predicts perceived hazardousness, assumed side effects and willingness to buy.

    PubMed

    Dohle, Simone; Siegrist, Michael

    2014-10-01

    The impact of pharmaceutical drug names on people's evaluations and behavioural intentions is still uncertain. According to the representativeness heuristic, evaluations should be more positive for complex drug names; in contrast, fluency theory suggests that evaluations should be more positive for simple drug names. Results of three experimental studies showed that complex drug names were perceived as more hazardous than simple drug names and negatively influenced willingness to buy. The results are of particular importance given the fact that there is a worldwide trend to make more drugs available for self-medication. © The Author(s) 2013.

  5. Quantifying the origins of life on a planetary scale.

    PubMed

    Scharf, Caleb; Cronin, Leroy

    2016-07-19

    A simple, heuristic formula with parallels to the Drake Equation is introduced to help focus discussion on open questions for the origins of life in a planetary context. This approach indicates a number of areas where quantitative progress can be made on parameter estimation for determining origins of life probabilities, based on constraints from Bayesian approaches. We discuss a variety of "microscale" factors and their role in determining "macroscale" abiogenesis probabilities on suitable planets. We also propose that impact ejecta exchange between planets with parallel chemistries and chemical evolution could in principle amplify the development of molecular complexity and abiogenesis probabilities. This amplification could be very significant, and both bias our conclusions about abiogenesis probabilities based on the Earth and provide a major source of variance in the probability of life arising in planetary systems. We use our heuristic formula to suggest a number of observational routes for improving constraints on origins of life probabilities.
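The abstract does not state the formula itself, so the sketch below is only an illustration of the Drake-like structure it describes: an expected number of abiogenesis events expressed as a product of loosely constrained factors. The factor names and toy values are assumptions for illustration, not the authors' published equation.

```python
# Illustrative Drake-like product for abiogenesis, in the spirit of the
# heuristic formula described above. Every factor name below is a
# hypothetical placeholder, not the paper's notation.

def abiogenesis_events(n_building_blocks, f_complexity, f_available, p_assembly):
    """Expected number of abiogenesis events on a planet, as a product of:
    the number of candidate molecular building blocks, the fraction that can
    reach sufficient complexity, the fraction actually available to
    prebiotic chemistry, and a per-set assembly probability."""
    return n_building_blocks * f_complexity * f_available * p_assembly

# Toy numbers only: each factor is enormously uncertain, which is the point
# of framing the question this way, as with the Drake Equation.
estimate = abiogenesis_events(1e20, 1e-6, 0.1, 1e-10)
```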

  6. Quantifying the origins of life on a planetary scale

    NASA Astrophysics Data System (ADS)

    Scharf, Caleb; Cronin, Leroy

    2016-07-01

    A simple, heuristic formula with parallels to the Drake Equation is introduced to help focus discussion on open questions for the origins of life in a planetary context. This approach indicates a number of areas where quantitative progress can be made on parameter estimation for determining origins of life probabilities, based on constraints from Bayesian approaches. We discuss a variety of “microscale” factors and their role in determining “macroscale” abiogenesis probabilities on suitable planets. We also propose that impact ejecta exchange between planets with parallel chemistries and chemical evolution could in principle amplify the development of molecular complexity and abiogenesis probabilities. This amplification could be very significant, and both bias our conclusions about abiogenesis probabilities based on the Earth and provide a major source of variance in the probability of life arising in planetary systems. We use our heuristic formula to suggest a number of observational routes for improving constraints on origins of life probabilities.

  7. The slow-scale linear noise approximation: an accurate, reduced stochastic description of biochemical networks under timescale separation conditions

    PubMed Central

    2012-01-01

    Background It is well known that the deterministic dynamics of biochemical reaction networks can be more easily studied if timescale separation conditions are invoked (the quasi-steady-state assumption). In this case the deterministic dynamics of a large network of elementary reactions are well described by the dynamics of a smaller network of effective reactions. Each of the latter represents a group of elementary reactions in the large network and has associated with it an effective macroscopic rate law. A popular method to achieve model reduction in the presence of intrinsic noise consists of using the effective macroscopic rate laws to heuristically deduce effective probabilities for the effective reactions which then enables simulation via the stochastic simulation algorithm (SSA). The validity of this heuristic SSA method is a priori doubtful because the reaction probabilities for the SSA have only been rigorously derived from microscopic physics arguments for elementary reactions. Results We here obtain, by rigorous means and in closed-form, a reduced linear Langevin equation description of the stochastic dynamics of monostable biochemical networks in conditions characterized by small intrinsic noise and timescale separation. The slow-scale linear noise approximation (ssLNA), as the new method is called, is used to calculate the intrinsic noise statistics of enzyme and gene networks. The results agree very well with SSA simulations of the non-reduced network of elementary reactions. In contrast the conventional heuristic SSA is shown to overestimate the size of noise for Michaelis-Menten kinetics, considerably under-estimate the size of noise for Hill-type kinetics and in some cases even miss the prediction of noise-induced oscillations. Conclusions A new general method, the ssLNA, is derived and shown to correctly describe the statistics of intrinsic noise about the macroscopic concentrations under timescale separation conditions. 
The ssLNA provides a simple and accurate means of performing stochastic model reduction and hence it is expected to be of widespread utility in studying the dynamics of large noisy reaction networks, as is common in computational and systems biology. PMID:22583770

  8. Heuristics for Understanding the Concepts of Interaction, Polynomial Trend, and the General Linear Model.

    ERIC Educational Resources Information Center

    Thompson, Bruce

    The relationship between analysis of variance (ANOVA) methods and their analogs (analysis of covariance and multiple analyses of variance and covariance--collectively referred to as OVA methods) and the more general analytic case is explored. A small heuristic data set is used, with a hypothetical sample of 20 subjects, randomly assigned to five…

  9. Ignorance- versus Evidence-Based Decision Making: A Decision Time Analysis of the Recognition Heuristic

    ERIC Educational Resources Information Center

    Hilbig, Benjamin E.; Pohl, Rudiger F.

    2009-01-01

    According to part of the adaptive toolbox notion of decision making known as the recognition heuristic (RH), the decision process in comparative judgments--and its duration--is determined by whether recognition discriminates between objects. By contrast, some recently proposed alternative models predict that choices largely depend on the amount of…

  10. A heuristic model of stone comminution in shock wave lithotripsy

    PubMed Central

    Smith, Nathan B.; Zhong, Pei

    2013-01-01

    A heuristic model is presented to describe the overall progression of stone comminution in shock wave lithotripsy (SWL), accounting for the effects of shock wave dose and the average peak pressure, P+(avg), incident on the stone during the treatment. The model is developed through adaptation of the Weibull theory for brittle fracture, incorporating threshold values in dose and P+(avg) that are required to initiate fragmentation. The model is validated against experimental data of stone comminution from two stone types (hard and soft BegoStone) obtained at various positions in lithotripter fields produced by two shock wave sources of different beam width and pulse profile both in water and in 1,3-butanediol (which suppresses cavitation). Subsequently, the model is used to assess the performance of a newly developed acoustic lens for electromagnetic lithotripters in comparison with its original counterpart both under static and simulated respiratory motion. The results have demonstrated the predictive value of this heuristic model in elucidating the physical basis for improved performance of the new lens. The model also provides a rationale for the selection of SWL treatment protocols to achieve effective stone comminution without elevating the risk of tissue injury. PMID:23927195
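The abstract names the ingredients of the model (a Weibull-type fragmentation curve with threshold values in dose and in P+(avg)) but not the fitted functional form. The sketch below shows one plausible Weibull-style curve with those two thresholds; the form and all constants are assumptions for illustration, not the paper's fitted model.

```python
import math

def comminution(dose, p_avg, dose_th=200.0, p_th=10.0, k=1e-5, m=2.0):
    """Illustrative fraction of stone mass comminuted after `dose` shocks at
    average peak pressure `p_avg` (MPa). Below either threshold the model
    predicts no fragmentation; above both, a Weibull-type saturating curve."""
    if dose <= dose_th or p_avg <= p_th:
        return 0.0
    return 1.0 - math.exp(-k * (p_avg - p_th) ** m * (dose - dose_th))

# Comminution should increase with both shock wave dose and peak pressure.
low = comminution(500, 20.0)
high = comminution(2000, 40.0)
```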

  11. Finding higher order Darboux polynomials for a family of rational first order ordinary differential equations

    NASA Astrophysics Data System (ADS)

    Avellar, J.; Claudino, A. L. G. C.; Duarte, L. G. S.; da Mota, L. A. C. P.

    2015-10-01

    For the Darbouxian methods we are studying here, in order to solve first order rational ordinary differential equations (1ODEs), the most costly (computational) step is finding the needed Darboux polynomials. This cost can be so severe that it renders the whole approach impractical. Here we introduce a simple heuristic to speed up this process for a class of 1ODEs.

  12. Humans make efficient use of natural image statistics when performing spatial interpolation.

    PubMed

    D'Antona, Anthony D; Perry, Jeffrey S; Geisler, Wilson S

    2013-12-16

    Visual systems learn through evolution and experience over the lifespan to exploit the statistical structure of natural images when performing visual tasks. Understanding which aspects of this statistical structure are incorporated into the human nervous system is a fundamental goal in vision science. To address this goal, we measured human ability to estimate the intensity of missing image pixels in natural images. Human estimation accuracy is compared with various simple heuristics (e.g., local mean) and with optimal observers that have nearly complete knowledge of the local statistical structure of natural images. Human estimates are more accurate than those of simple heuristics, and they match the performance of an optimal observer that knows the local statistical structure of relative intensities (contrasts). This optimal observer predicts the detailed pattern of human estimation errors and hence the results place strong constraints on the underlying neural mechanisms. However, humans do not reach the performance of an optimal observer that knows the local statistical structure of the absolute intensities, which reflect both local relative intensities and local mean intensity. As predicted from a statistical analysis of natural images, human estimation accuracy is negligibly improved by expanding the context from a local patch to the whole image. Our results demonstrate that the human visual system exploits efficiently the statistical structure of natural images.
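The simplest baseline mentioned above, the local mean, can be made concrete in a few lines: estimate a missing pixel as the mean of its available neighbours. An optimal observer would instead weight neighbours according to the local statistical structure of natural-image contrasts, which is what this heuristic fails to capture.

```python
def local_mean_estimate(img, r, c):
    """Estimate missing pixel (r, c) as the mean of its available
    8-neighbours; `img` is a list of rows of intensities."""
    vals = []
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if (dr, dc) == (0, 0):
                continue  # skip the missing pixel itself
            rr, cc = r + dr, c + dc
            if 0 <= rr < len(img) and 0 <= cc < len(img[0]):
                vals.append(img[rr][cc])
    return sum(vals) / len(vals)

img = [[1.0, 2.0, 3.0],
       [4.0, 0.0, 6.0],   # centre pixel treated as missing
       [7.0, 8.0, 9.0]]
estimate = local_mean_estimate(img, 1, 1)
```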

  13. A Simple but Powerful Heuristic Method for Accelerating k-Means Clustering of Large-Scale Data in Life Science.

    PubMed

    Ichikawa, Kazuki; Morishita, Shinichi

    2014-01-01

    K-means clustering has been widely used to gain insight into biological systems from large-scale life science data. To quantify the similarities among biological data sets, Pearson correlation distance and standardized Euclidean distance are used most frequently; however, optimization methods have been largely unexplored. These two distance measurements are equivalent in the sense that they yield the same k-means clustering result for identical sets of k initial centroids. Thus, an efficient algorithm used for one is applicable to the other. Several optimization methods are available for the Euclidean distance and can be used for processing the standardized Euclidean distance; however, they are not customized for this context. We instead approached the problem by studying the properties of the Pearson correlation distance, and we invented a simple but powerful heuristic method for markedly pruning unnecessary computation while retaining the final solution. Tests using real biological data sets with 50-60K vectors of dimensions 10-2001 (~400 MB in size) demonstrated marked reduction in computation time for k = 10-500 in comparison with other state-of-the-art pruning methods such as Elkan's and Hamerly's algorithms. The BoostKCP software is available at http://mlab.cb.k.u-tokyo.ac.jp/~ichikawa/boostKCP/.
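The equivalence claimed above rests on a simple identity: after z-scoring each vector, the squared Euclidean distance is an affine function of the Pearson correlation, ||z(x) - z(y)||^2 = 2 d (1 - r) for dimension d. Because the two distances are monotonically related, k-means produces the same clustering under either one given identical initial centroids. A short numerical check:

```python
import math, random

def zscore(v):
    # Standardize to mean 0 and (population) SD 1, so that ||z(v)||^2 = len(v).
    d = len(v)
    mean = sum(v) / d
    sd = math.sqrt(sum((x - mean) ** 2 for x in v) / d)
    return [(x - mean) / sd for x in v]

def pearson(x, y):
    zx, zy = zscore(x), zscore(y)
    return sum(a * b for a, b in zip(zx, zy)) / len(x)

rng = random.Random(1)
x = [rng.random() for _ in range(10)]
y = [rng.random() for _ in range(10)]

zx, zy = zscore(x), zscore(y)
sq_euclid = sum((a - b) ** 2 for a, b in zip(zx, zy))
# Identity: squared standardized Euclidean distance equals 2*d*(1 - r).
identity_gap = abs(sq_euclid - 2 * len(x) * (1 - pearson(x, y)))
```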

  14. Numerical evidences of universal trap-like aging dynamics

    NASA Astrophysics Data System (ADS)

    Cammarota, Chiara; Marinari, Enzo

    2018-04-01

    Trap models were initially proposed as toy models for dynamical relaxation in extremely simplified rough potential energy landscapes. Their importance has recently grown considerably thanks to the discovery that the trap-like aging mechanism directly controls the out-of-equilibrium relaxation processes of more sophisticated spin models, which are considered the solvable counterpart of real disordered systems. Further establishing the connection between these spin models, out-of-equilibrium behavior, and the trap-like aging mechanism could shed new light on the still largely mysterious properties of the activated out-of-equilibrium dynamics of disordered systems. In this work we discuss numerical evidence, based on computations of permanence times, of an emergent trap-like aging behavior in a variety of very simple disordered models developed from the trap model paradigm. Our numerical results are backed by analytic derivations and heuristic discussions. This exploration highlights some of the techniques needed to expose trap-like behavior despite the occurrence of secondary processes, the presence of dynamical correlations, and strong finite-size effects.

  15. Simple and multiple linear regression: sample size considerations.

    PubMed

    Hanley, James A

    2016-11-01

    The suggested "two subjects per variable" (2SPV) rule of thumb in the Austin and Steyerberg article is a chance to bring out some long-established and quite intuitive sample size considerations for both simple and multiple linear regression. This article distinguishes two of the major uses of regression models that imply very different sample size considerations, neither served well by the 2SPV rule. The first is etiological research, which contrasts mean Y levels at differing "exposure" (X) values and thus tends to focus on a single regression coefficient, possibly adjusted for confounders. The second research genre guides clinical practice. It addresses Y levels for individuals with different covariate patterns or "profiles." It focuses on the profile-specific (mean) Y levels themselves, estimating them via linear compounds of regression coefficients and covariates. By drawing on long-established closed-form variance formulae that lie beneath the standard errors in multiple regression, and by rearranging them for heuristic purposes, one arrives at quite intuitive sample size considerations for both research genres. Copyright © 2016 Elsevier Inc. All rights reserved.
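The closed-form variance formulae alluded to above are standard. For simple linear regression with residual SD sigma, SE(b1) = sigma / sqrt(Sxx) and the SE of the estimated mean at x0 is sigma * sqrt(1/n + (x0 - xbar)^2 / Sxx), where Sxx is the sum of squared deviations of x. The first quantity drives sample size for etiological (slope-focused) research, the second for profile-specific prediction; note how the second grows as x0 moves away from xbar.

```python
import math

def se_slope(xs, sigma):
    """Standard error of the slope in simple linear regression."""
    xbar = sum(xs) / len(xs)
    sxx = sum((x - xbar) ** 2 for x in xs)
    return sigma / math.sqrt(sxx)

def se_mean_at(x0, xs, sigma):
    """Standard error of the estimated mean response at covariate value x0."""
    n = len(xs)
    xbar = sum(xs) / n
    sxx = sum((x - xbar) ** 2 for x in xs)
    return sigma * math.sqrt(1.0 / n + (x0 - xbar) ** 2 / sxx)

xs = list(range(20))                    # 20 subjects, evenly spaced exposure
slope_se = se_slope(xs, sigma=1.0)
edge_se = se_mean_at(0.0, xs, sigma=1.0)    # precision is worst at the extremes
centre_se = se_mean_at(9.5, xs, sigma=1.0)  # and best at the mean exposure
```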

  16. The Breslow estimator of the nonparametric baseline survivor function in Cox's regression model: some heuristics.

    PubMed

    Hanley, James A

    2008-01-01

    Most survival analysis textbooks explain how the hazard ratio parameters in Cox's life table regression model are estimated. Fewer explain how the components of the nonparametric baseline survivor function are derived. Those that do often relegate the explanation to an "advanced" section and merely present the components as algebraic or iterative solutions to estimating equations. None comment on the structure of these estimators. This note brings out a heuristic representation that may help to de-mystify the structure.
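The structure in question is compact enough to state: the Breslow baseline cumulative hazard is H0(t) = sum over event times t_i <= t of d_i / sum_{j in R(t_i)} exp(x_j' beta), with S0(t) = exp(-H0(t)). A minimal sketch, using per-subject event indicators (so each d_i = 1); with beta = 0 each denominator is just the risk-set size, the heuristic special case that de-mystifies the estimator:

```python
import math

def breslow_s0(t, times, events, scores):
    """Breslow baseline survivor estimate at time t.
    times: follow-up times; events: 1 = event, 0 = censored;
    scores: exp(x'beta) for each subject (all 1.0 when beta = 0)."""
    h0 = 0.0
    for ti, di in zip(times, events):
        if di and ti <= t:
            # Risk set R(ti): everyone still under observation at ti.
            risk = sum(s for tj, s in zip(times, scores) if tj >= ti)
            h0 += 1.0 / risk
    return math.exp(-h0)

times = [2.0, 3.0, 4.0, 5.0]
events = [1, 0, 1, 1]                       # one subject censored at t = 3
s0 = breslow_s0(4.0, times, events, scores=[1.0] * 4)   # beta = 0 case
```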

  17. Heuristics in primary care for recognition of unreported vision loss in older people: a technology development study.

    PubMed

    Wijeyekoon, Skanda; Kharicha, Kalpa; Iliffe, Steve

    2015-09-01

    To evaluate heuristics (rules of thumb) for recognition of undetected vision loss in older patients in primary care. Vision loss is associated with ageing, and its prevalence is increasing. Visual impairment has a broad impact on health, functioning and well-being. Unrecognised vision loss remains common, and screening interventions have yet to reduce its prevalence. An alternative approach is to enhance practitioners' skills in recognising undetected vision loss, by building a more detailed picture of those who are likely not to act on vision changes, report symptoms or have eye tests. This paper describes a qualitative technology development study to evaluate heuristics for recognition of undetected vision loss in older patients in primary care. Using a previous modelling study, two heuristics in the form of mnemonics were developed to aid pattern recognition and allow general practitioners to identify potential cases of unreported vision loss. These heuristics were then analysed with experts. Findings: it was concluded that their implementation in modern general practice was unsuitable and that an alternative solution should be sought.

  18. Approach to design neural cryptography: a generalized architecture and a heuristic rule.

    PubMed

    Mu, Nankun; Liao, Xiaofeng; Huang, Tingwen

    2013-06-01

    Neural cryptography, a type of public key exchange protocol, is widely considered an effective method for sharing a common secret key between two neural networks over public channels. How to design neural cryptography remains a great challenge. In this paper, to provide an approach to this challenge, a generalized network architecture and a significant heuristic rule are designed. The proposed generic framework, named the tree state classification machine (TSCM), extends and unifies the existing structures, i.e., the tree parity machine (TPM) and the tree committee machine (TCM). Furthermore, we find that the heuristic rule can improve the security of TSCM-based neural cryptography. TSCM and the heuristic rule can therefore guide the design of many effective neural cryptography candidates, among which more secure instances can be achieved. Significantly, in the light of TSCM and the heuristic rule, we further show that the proposed neural cryptography outperforms TPM (currently the most secure model) in security. Finally, a series of numerical simulation experiments verifies the validity and applicability of our results.

  19. Utility functions and resource management in an oversubscribed heterogeneous computing environment

    DOE PAGES

    Khemka, Bhavesh; Friese, Ryan; Briceno, Luis Diego; ...

    2014-09-26

    We model an oversubscribed heterogeneous computing system where tasks arrive dynamically and a scheduler maps the tasks to machines for execution. The environment and workloads are based on those being investigated by the Extreme Scale Systems Center at Oak Ridge National Laboratory. Utility functions that are designed based on specifications from the system owner and users are used to create a metric for the performance of resource allocation heuristics. Each task has a time-varying utility (importance) that the enterprise will earn based on when the task successfully completes execution. We design multiple heuristics, which include a technique to drop low utility-earning tasks, to maximize the total utility that can be earned by completing tasks. The heuristics are evaluated using simulation experiments with two levels of oversubscription. The results show the benefit of having fast heuristics that account for the importance of a task and the heterogeneity of the environment when making allocation decisions in an oversubscribed environment. Furthermore, the ability to drop low utility-earning tasks allows the heuristics to tolerate the high oversubscription and still earn significant utility.
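The dropping technique described above can be sketched in a few lines: decline tasks whose time-decayed utility no longer justifies their execution time, and serve the rest in order of utility earned per unit of machine time. The decay law, threshold, and task data below are illustrative assumptions, not the heuristics evaluated in the paper.

```python
import math

def utility(initial, age, half_life=10.0):
    # Time-varying utility: exponential decay while the task waits in queue.
    return initial * math.exp(-math.log(2.0) * age / half_life)

def schedule(tasks, min_rate=0.5):
    """tasks: list of (name, initial_utility, age, exec_time) tuples.
    Keep a task only if its current utility per unit execution time clears
    the threshold; serve the kept tasks highest-rate first."""
    kept = [(utility(u, a) / t, name)
            for name, u, a, t in tasks
            if utility(u, a) / t >= min_rate]
    return [name for _, name in sorted(kept, reverse=True)]

tasks = [("A", 10.0, 0.0, 2.0),    # fresh and valuable
         ("B", 10.0, 40.0, 2.0),   # stale: utility has decayed 16-fold
         ("C", 4.0, 0.0, 1.0)]     # modest but quick
order = schedule(tasks)
```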

  20. Biomechanical interpretation of a free-breathing lung motion model

    NASA Astrophysics Data System (ADS)

    Zhao, Tianyu; White, Benjamin; Moore, Kevin L.; Lamb, James; Yang, Deshan; Lu, Wei; Mutic, Sasa; Low, Daniel A.

    2011-12-01

    The purpose of this paper is to develop a biomechanical model for free-breathing motion and compare it to a published heuristic five-dimensional (5D) free-breathing lung motion model. An ab initio biomechanical model was developed to describe the motion of lung tissue during free breathing by analyzing the stress-strain relationship inside lung tissue. The first-order approximation of the biomechanical model was equivalent to a heuristic 5D free-breathing lung motion model proposed by Low et al in 2005 (Int. J. Radiat. Oncol. Biol. Phys. 63 921-9), in which the motion was broken down to a linear expansion component and a hysteresis component. To test the biomechanical model, parameters that characterize expansion, hysteresis and angles between the two motion components were reported independently and compared between two models. The biomechanical model agreed well with the heuristic model within 5.5% in the left lungs and 1.5% in the right lungs for patients without lung cancer. The biomechanical model predicted that a histogram of angles between the two motion components should have two peaks at 39.8° and 140.2° in the left lungs and 37.1° and 142.9° in the right lungs. The data from the 5D model verified the existence of those peaks at 41.2° and 148.2° in the left lungs and 40.1° and 140° in the right lungs for patients without lung cancer. Similar results were also observed for the patients with lung cancer, but with greater discrepancies. The maximum-likelihood estimation of hysteresis magnitude was reported to be 2.6 mm for the lung cancer patients. The first-order approximation of the biomechanical model fit the heuristic 5D model very well. The biomechanical model provided new insights into breathing motion with specific focus on motion trajectory hysteresis.
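The heuristic 5D model referenced above decomposes tissue displacement into a linear expansion term driven by tidal volume v and a hysteresis term driven by airflow f, roughly r(v, f) = r0 + alpha*v + beta*f. A minimal sketch of that first-order form; the coefficient vectors are illustrative, not fitted patient data.

```python
def displacement(r0, alpha, beta, v, f):
    """Tissue position as reference position plus expansion (alpha*v)
    plus hysteresis (beta*f) components, per coordinate."""
    return tuple(r + a * v + b * f for r, a, b in zip(r0, alpha, beta))

r0 = (0.0, 0.0, 0.0)
alpha = (0.5, 0.1, 2.0)   # mm per unit tidal volume (expansion direction)
beta = (0.2, -0.3, 0.4)   # mm per unit airflow (hysteresis direction)

# Same lung volume during inhalation (f > 0) and exhalation (f < 0)
# yields different positions: the signature of hysteresis, and the source
# of the angle between the two motion components analyzed in the paper.
inhale = displacement(r0, alpha, beta, v=0.5, f=1.0)
exhale = displacement(r0, alpha, beta, v=0.5, f=-1.0)
```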

  1. The psychology of nutrition messages.

    PubMed

    Schofield, Heather; Mullainathan, Sendhil

    2008-01-01

    The purpose of this paper is to explore consumer thinking about nutrition decisions and how firms can use consumers' awareness of the links between nutrients and health, generated by public health messages, to market products, including ones that have little nutritional value. We approach this issue by tracking the development of public health messages based on scientific research, the dissemination of those messages in the popular press, and the use of nutrition claims in food advertisements to assess whether firms are timing the use of nutrition claims to take advantage of heuristic-based decision-making. Our findings suggest that the timing of the development of nutrition information, its dissemination in the press, and its use in advertising accords well with a heuristic processing model in which firms take advantage of associations between nutrient information and health in their advertisements. However, the demonstrated relationships may not be causal. Further research will be needed to provide stronger and more comprehensive evidence regarding the proposed message hijacking process. If the message hijacking framework is borne out: (1) simple overall health rating scales could significantly improve consumer decision-making, (2) the impact of misleading advertisements could be mitigated by encouraging a multidimensional view of nutrition, and (3) more intensive regulation of product labeling could limit the impact of hijacked messages. Overall, this paper considers a novel hypothesis about the impact of public health messages on nutrition and health.

  2. Approximating Optimal Behavioural Strategies Down to Rules-of-Thumb: Energy Reserve Changes in Pairs of Social Foragers

    PubMed Central

    Rands, Sean A.

    2011-01-01

    Functional explanations of behaviour often propose optimal strategies for organisms to follow. These ‘best’ strategies could be difficult to perform given biological constraints such as neural architecture and physiological constraints. Instead, simple heuristics or ‘rules-of-thumb’ that approximate these optimal strategies may instead be performed. From a modelling perspective, rules-of-thumb are also useful tools for considering how group behaviour is shaped by the behaviours of individuals. Using simple rules-of-thumb reduces the complexity of these models, but care needs to be taken to use rules that are biologically relevant. Here, we investigate the similarity between the outputs of a two-player dynamic foraging game (which generated optimal but complex solutions) and a computational simulation of the behaviours of the two members of a foraging pair, who instead followed a rule-of-thumb approximation of the game's output. The original game generated complex results, and we demonstrate here that the simulations following the much-simplified rules-of-thumb also generate complex results, suggesting that the rule-of-thumb was sufficient to make some of the model outcomes unpredictable. There was some agreement between both modelling techniques, but some differences arose – particularly when pair members were not identical in how they gained and lost energy. We argue that exploring how rules-of-thumb perform in comparison to their optimal counterparts is an important exercise for biologically validating the output of agent-based models of group behaviour. PMID:21765938

  3. Approximating optimal behavioural strategies down to rules-of-thumb: energy reserve changes in pairs of social foragers.

    PubMed

    Rands, Sean A

    2011-01-01

    Functional explanations of behaviour often propose optimal strategies for organisms to follow. These 'best' strategies could be difficult to perform given biological constraints such as neural architecture and physiological constraints. Instead, simple heuristics or 'rules-of-thumb' that approximate these optimal strategies may instead be performed. From a modelling perspective, rules-of-thumb are also useful tools for considering how group behaviour is shaped by the behaviours of individuals. Using simple rules-of-thumb reduces the complexity of these models, but care needs to be taken to use rules that are biologically relevant. Here, we investigate the similarity between the outputs of a two-player dynamic foraging game (which generated optimal but complex solutions) and a computational simulation of the behaviours of the two members of a foraging pair, who instead followed a rule-of-thumb approximation of the game's output. The original game generated complex results, and we demonstrate here that the simulations following the much-simplified rules-of-thumb also generate complex results, suggesting that the rule-of-thumb was sufficient to make some of the model outcomes unpredictable. There was some agreement between both modelling techniques, but some differences arose - particularly when pair members were not identical in how they gained and lost energy. We argue that exploring how rules-of-thumb perform in comparison to their optimal counterparts is an important exercise for biologically validating the output of agent-based models of group behaviour.

  4. Heuristic Green's function of the time dependent radiative transfer equation for a semi-infinite medium.

    PubMed

    Martelli, Fabrizio; Sassaroli, Angelo; Pifferi, Antonio; Torricelli, Alessandro; Spinelli, Lorenzo; Zaccanti, Giovanni

    2007-12-24

    The Green's function of the time dependent radiative transfer equation for the semi-infinite medium is derived for the first time by a heuristic approach based on the extrapolated boundary condition and on an almost exact solution for the infinite medium. Monte Carlo simulations performed both in the simple case of isotropic scattering and of an isotropic point-like source, and in the more realistic case of anisotropic scattering and pencil beam source, are used to validate the heuristic Green's function. Except for the very early times, the proposed solution has an excellent accuracy (> 98 % for the isotropic case, and > 97 % for the anisotropic case) significantly better than the diffusion equation. The use of this solution could be extremely useful in the biomedical optics field where it can be directly employed in conditions where the use of the diffusion equation is limited, e.g. small volume samples, high absorption and/or low scattering media, short source-receiver distances and early times. Also it represents a first step to derive tools for other geometries (e.g. slab and slab with inhomogeneities inside) of practical interest for noninvasive spectroscopy and diffuse optical imaging. Moreover the proposed solution can be useful to several research fields where the study of a transport process is fundamental.

  5. The Aggregate Representation of Terrestrial Land Covers Within Global Climate Models (GCM)

    NASA Technical Reports Server (NTRS)

    Shuttleworth, W. James; Sorooshian, Soroosh

    1996-01-01

    This project had four initial objectives: (1) to create a realistic coupled surface-atmosphere model to investigate the aggregate description of heterogeneous surfaces; (2) to develop a simple heuristic model of surface-atmosphere interactions; (3) using the above models, to test aggregation rules for a variety of realistic cover and meteorological conditions; and (4) to reconcile biosphere-atmosphere transfer scheme (BATS) land covers with those that can be recognized from space. Our progress in meeting these objectives can be summarized as follows. Objective 1: The first objective was achieved in the first year of the project by coupling the Biosphere-Atmosphere Transfer Scheme (BATS) with a proven two-dimensional model of the atmospheric boundary layer. The resulting model, BATS-ABL, is described in detail in a Master's thesis and reported in a paper in the Journal of Hydrology. Objective 2: The potential value of the heuristic model was re-evaluated early in the project and a decision was made to focus subsequent research around modeling studies with the BATS-ABL model. The value of using such coupled surface-atmosphere models in this research area was further confirmed by the success of the Tucson Aggregation Workshop. Objective 3: There was excellent progress in using the BATS-ABL model to test aggregation rules for a variety of realistic covers. The foci of attention were the site of the First International Satellite Land Surface Climatology Project Field Experiment (FIFE) in Kansas and one of the study sites of the Anglo-Brazilian Amazonian Climate Observational Study (ABRACOS) near the city of Manaus, Amazonas, Brazil. These two sites were selected because of the ready availability of relevant field data to validate and initialize the BATS-ABL model. The results of these tests are given in a Master's thesis and reported in two papers.
Objective 4: Progress far exceeded original expectations not only in reconciling BATS land covers with those that can be recognized from space, but also in then applying remotely-sensed land cover data to map aggregate values of BATS parameters for heterogeneous covers and interpreting these parameters in terms of surface-atmosphere exchanges.

  6. Constructing high-quality bounding volume hierarchies for N-body computation using the acceptance volume heuristic

    NASA Astrophysics Data System (ADS)

    Olsson, O.

    2018-01-01

    We present a novel heuristic derived from a probabilistic cost model for approximate N-body simulations. We show that this new heuristic can be used to guide tree construction towards higher quality trees with improved performance over current N-body codes. This represents an important step beyond the current practice of using spatial partitioning for N-body simulations, and enables adoption of a range of state-of-the-art algorithms developed for computer graphics applications to yield further improvements in N-body simulation performance. We outline directions for further developments and review the most promising such algorithms.

  7. Multi-Criteria Optimization of the Deployment of a Grid for Rural Electrification Based on a Heuristic Method

    NASA Astrophysics Data System (ADS)

    Ortiz-Matos, L.; Aguila-Tellez, A.; Hincapié-Reyes, R. C.; González-Sanchez, J. W.

    2017-07-01

    Recent mathematical models for designing electrification systems solve the problem of component location, the choice of electrification components, and the design of possible distribution microgrids. However, as the number of points to be electrified increases, solving these models requires high computational times, making them impractical. This study proposes a new heuristic method for the electrification of rural areas to address this problem. The heuristic algorithm supports the deployment of rural electrification microgrids by finding optimal routes for the placement of lines and transformers in transmission and distribution microgrids. The challenge is to obtain a deployment with equitable losses, considering the capacity constraints of the devices and the topology of the land, at minimal economic cost. An optimal scenario ensures the electrification of all neighbourhoods at minimum investment cost in terms of the distance between electric conductors and the number of transformation devices.

  8. Leveraging social system networks in ubiquitous high-data-rate health systems.

    PubMed

    Massey, Tammara; Marfia, Gustavo; Stoelting, Adam; Tomasi, Riccardo; Spirito, Maurizio A; Sarrafzadeh, Majid; Pau, Giovanni

    2011-05-01

    Social system networks with high data rates and limited storage will discard data if the system cannot connect and upload the data to a central server. We address the challenge of limited storage capacity in mobile health systems during network partitions with a heuristic that achieves efficiency in storage capacity by modifying the granularity of the medical data during long intercontact periods. Patterns in the connectivity, reception rate, distance, and location are extracted from the social system network and leveraged in the global algorithm and online heuristic. In the global algorithm, the stochastic nature of the data is modeled with maximum likelihood estimation based on the distribution of the reception rates. In the online heuristic, the correlation between system position and the reception rate is combined with patterns in human mobility to estimate the intracontact and intercontact time. The online heuristic performs well with a low data loss of 2.1%-6.1%.

  9. Plan-graph Based Heuristics for Conformant Probabilistic Planning

    NASA Technical Reports Server (NTRS)

    Ramakrishnan, Salesh; Pollack, Martha E.; Smith, David E.

    2004-01-01

In this paper, we introduce plan-graph based heuristics to solve a variation of the conformant probabilistic planning (CPP) problem. In many real-world problems, sensors are unreliable or consume too many resources to provide knowledge about the environment. These domains are better modeled as conformant planning problems. POMDP-based techniques are currently the most successful approach for solving CPP but suffer from state-space explosion. Recent advances in deterministic and conformant planning have shown that plan-graphs can be used to enhance performance significantly. We show that this enhancement also translates to CPP. We describe our process for developing the plan-graph heuristics and estimating the probability of a partial plan. We compare the performance of our planner PVHPOP when used with different heuristics. We also perform a comparison with a POMDP solver to show over an order of magnitude improvement in performance.

  10. Predicting Vaccination Intention and Benefit and Risk Perceptions: The Incorporation of Affect, Trust, and Television Influence in a Dual-Mode Model.

    PubMed

    Chen, Nien-Tsu Nancy

    2015-07-01

    Major health behavior change models tend to consider health decisions as primarily resulting from a systematic appraisal of relevant beliefs, such as the perceived benefits and risks of a pharmacological intervention. Drawing on research from the disciplines of risk management, communication, and psychology, this study proposed the inclusion of a heuristic route in established theory and tested the direction of influence between heuristic and systematic process variables. Affect and social trust were included as key heuristics in the proposed dual-mode framework of health decision making. Furthermore, exposure to health-related coverage on television was considered potentially influential over both heuristic and systematic process variables. To test this framework, data were collected from a national probability sample of 584 adults in the United States in 2012 regarding their decision to vaccinate against a hypothetical avian flu. The results provided some support for the bidirectional influence between heuristic and systematic processing. Affect toward flu vaccination and trust in the Food and Drug Administration were found to be powerful predictors of vaccination intention, enhancing intention both directly and indirectly via certain systematic process variables. The direction of influence between perceived susceptibility and severity, on the one hand, and affect, on the other, is less clear, suggesting the need for further research. Contrary to the opinion of media critics, exposure to televised health coverage was negatively associated with the perceived risks of vaccination. Results from this study carry theoretical and practical implications, and applying this model to the acceptance of different health interventions constitutes an area for future inquiries. © 2015 Society for Risk Analysis.

  11. Finding the Two-Way Street: Women from Mother-Present/Father-Absent Homes and Their Ability to Make Close Female Friendships

    ERIC Educational Resources Information Center

    Marote, Melissa A.

    2011-01-01

    This heuristic study involving seven coresearchers, which included the author, explores the experiences of women from mother-present/father-absent homes and their ability to form and maintain close female friendships. The heuristic research model was chosen to provide the opportunity to conduct research in a very personalized, collaborative way…

  12. A three-stage heuristic for harvest scheduling with access road network development

    Treesearch

    Mark M. Clark; Russell D. Meller; Timothy P. McDonald

    2000-01-01

    In this article we present a new model for the scheduling of forest harvesting with spatial and temporal constraints. Our approach is unique in that we incorporate access road network development into the harvest scheduling selection process. Due to the difficulty of solving the problem optimally, we develop a heuristic that consists of a solution construction stage...

  13. New optimization model for routing and spectrum assignment with nodes insecurity

    NASA Astrophysics Data System (ADS)

    Xuan, Hejun; Wang, Yuping; Xu, Zhanqi; Hao, Shanshan; Wang, Xiaoli

    2017-04-01

By adopting the orthogonal frequency division multiplexing technology, elastic optical networks can provide flexible and variable bandwidth allocation to each connection request and achieve higher spectrum utilization. The routing and spectrum assignment problem in elastic optical networks is a well-known NP-hard problem. In addition, information security has received worldwide attention. We combine these two problems to investigate the routing and spectrum assignment problem with guaranteed security in elastic optical networks, and establish a new optimization model that minimizes the maximum index of the used frequency slots and is used to determine an optimal routing and spectrum assignment scheme. To solve the model effectively, a hybrid genetic algorithm framework integrating a heuristic algorithm into a genetic algorithm is proposed. The heuristic algorithm is first used to sort the connection requests, and then the genetic algorithm is designed to look for an optimal routing and spectrum assignment scheme. In the genetic algorithm, tailor-made crossover, mutation and local search operators are designed. Moreover, simulation experiments are conducted with three heuristic strategies, and the experimental results indicate the effectiveness of the proposed model and algorithm framework.
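The sort-then-assign structure of the heuristic stage can be illustrated with a first-fit spectrum assignment sketch. The network, demands, and largest-demand-first ordering below are illustrative assumptions, not the paper's tailor-made operators; the sketch only enforces the standard continuity and contiguity constraints and tracks the maximum used slot index.

```python
def first_fit_spectrum(requests, n_slots=100):
    """First-fit spectrum assignment sketch: each request is a
    (path_links, n_needed) pair and must receive the same contiguous
    block of slots on every link of its path."""
    used = {}        # link -> set of occupied slot indices
    max_index = -1   # objective: maximum index of the used frequency slots
    # heuristic ordering: largest demands first (an assumption, not the paper's rule)
    for links, need in sorted(requests, key=lambda r: -r[1]):
        for start in range(n_slots - need + 1):
            block = range(start, start + need)
            if all(s not in used.setdefault(l, set())
                   for l in links for s in block):
                for l in links:
                    used[l].update(block)
                max_index = max(max_index, start + need - 1)
                break
    return max_index

# Hypothetical 3-link network with three connection requests
m = first_fit_spectrum([({"a", "b"}, 3), ({"b", "c"}, 2), ({"a"}, 2)])
```

In the paper's framework, a genetic algorithm would search over request orderings instead of fixing one, scoring each ordering by the resulting maximum slot index.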

  14. SPARSE: quadratic time simultaneous alignment and folding of RNAs without sequence-based heuristics.

    PubMed

    Will, Sebastian; Otto, Christina; Miladi, Milad; Möhl, Mathias; Backofen, Rolf

    2015-08-01

RNA-Seq experiments have revealed a multitude of novel ncRNAs. The gold standard for their analysis based on simultaneous alignment and folding suffers from extreme time complexity of [Formula: see text]. Subsequently, numerous faster 'Sankoff-style' approaches have been suggested. Commonly, the performance of such methods relies on sequence-based heuristics that restrict the search space to optimal or near-optimal sequence alignments; however, the accuracy of sequence-based methods breaks down for RNAs with sequence identities below 60%. Alignment approaches like LocARNA that do not require sequence-based heuristics have been limited to high complexity ([Formula: see text] quartic time). Breaking this barrier, we introduce the novel Sankoff-style algorithm 'sparsified prediction and alignment of RNAs based on their structure ensembles (SPARSE)', which runs in quadratic time without sequence-based heuristics. To achieve this low complexity, on par with sequence alignment algorithms, SPARSE features strong sparsification based on structural properties of the RNA ensembles. Following PMcomp, SPARSE gains further speed-up from lightweight energy computation. Although all existing lightweight Sankoff-style methods restrict Sankoff's original model by disallowing loop deletions and insertions, SPARSE transfers the Sankoff algorithm to the lightweight energy model completely for the first time. Compared with LocARNA, SPARSE achieves similar alignment and better folding quality in significantly less time (speedup: 3.7). At similar run-time, it aligns low sequence identity instances substantially more accurately than RAF, which uses sequence-based heuristics. © The Author 2015. Published by Oxford University Press.

  15. Motion generation of peristaltic mobile robot with particle swarm optimization algorithm

    NASA Astrophysics Data System (ADS)

    Homma, Takahiro; Kamamichi, Norihiro

    2015-03-01

In robot development, bio-mimetics is attracting attention as a technology for designing structures and functions inspired by biological systems. There are many examples of bio-mimetics in robotics, such as legged robots, flapping robots, insect-type robots, and fish-type robots. In this study, we focus on the motion of the earthworm and aim to develop a peristaltic mobile robot. The earthworm is a slender animal that moves in soil. It has a segmented body, and each segment can be shortened and lengthened by muscular actions. It moves forward by propagating expansion waves along its segments from front to rear. By mimicking the structure and motion of the earthworm, we can construct a robot with high locomotive performance on irregular ground or in narrow spaces. In this paper, to investigate the motion analytically, a dynamical model is introduced that consists of a series-connected multi-mass model. Simple periodic patterns that mimic the motions of earthworms are applied in an open-loop fashion, and the resulting moving patterns are verified through numerical simulations. Furthermore, to generate efficient motion of the robot, a particle swarm optimization algorithm, a meta-heuristic optimization method, is applied. The optimized results are evaluated by comparison with the simple periodic patterns.
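The particle swarm update rule the authors apply is compact enough to sketch. The gait-optimization objective and parameters used in the paper are not specified here, so a simple sphere function and textbook inertia/attraction coefficients stand in as assumptions.

```python
import random

def pso(f, bounds, n=20, iters=200, w=0.7, c1=1.5, c2=1.5, seed=1):
    """Bare-bones particle swarm optimization: each particle's velocity blends
    inertia (w), attraction to its personal best (c1), and attraction to the
    swarm's global best (c2). Minimizes f over the box `bounds`."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=f)[:]
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if f(pos[i]) < f(pbest[i]):          # update personal best
                pbest[i] = pos[i][:]
                if f(pos[i]) < f(gbest):         # update global best
                    gbest = pos[i][:]
    return gbest

best = pso(lambda x: sum(v * v for v in x), [(-10, 10)] * 3)
```

In the paper's setting, `f` would instead run the multi-mass simulation for a candidate segment-actuation pattern and return, for example, the negated travel distance.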

  16. Extension of the firefly algorithm and preference rules for solving MINLP problems

    NASA Astrophysics Data System (ADS)

    Costa, M. Fernanda P.; Francisco, Rogério B.; Rocha, Ana Maria A. C.; Fernandes, Edite M. G. P.

    2017-07-01

An extension of the firefly algorithm (FA) for solving mixed-integer nonlinear programming (MINLP) problems is presented. Although penalty functions are nowadays frequently used to handle integrality conditions and inequality and equality constraints, this paper proposes the implementation within the FA of a simple rounding-based heuristic and four preference rules to find and converge to MINLP feasible solutions. Preliminary numerical experiments are carried out to validate the proposed methodology.
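A rounding-based heuristic of the kind described can be sketched as follows. The variable partition and the constraint are hypothetical, and the paper's four preference rules are not reproduced; this only shows the basic round-then-check-feasibility step.

```python
def round_to_feasible(x, integer_idx, constraints):
    """Round the integer-constrained components of a continuous candidate x
    and accept the rounded point only if all inequality constraints
    g(x) <= 0 hold. A sketch, not the paper's full procedure."""
    y = list(x)
    for i in integer_idx:
        y[i] = round(y[i])           # nearest-integer rounding
    feasible = all(g(y) <= 0 for g in constraints)
    return y, feasible

# Hypothetical MINLP candidate: x0 integer, x1 continuous, constraint x0 + x1 <= 5
y, ok = round_to_feasible([2.4, 1.3], integer_idx=[0],
                          constraints=[lambda v: v[0] + v[1] - 5])
```

Inside an FA loop, infeasible rounded points would be penalized or repaired; the preference rules then decide which of several candidates to keep.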

  17. Localizing Ground Penetrating RADAR: A Step Towards Robust Autonomous Ground Vehicle Localization

    DTIC Science & Technology

    2016-07-14

localization designed to complement existing approaches with a low sensitivity to failure modes of LIDAR, camera, and GPS/INS sensors due to its low... the detailed design and results from highway testing, which uses a simple heuristic for fusing LGPR estimates with a GPS/INS system. Cross-track... designed to enable a priori map-based localization. LGPR offers complementary capabilities to traditional optics-based approaches to map-based

  18. Scheduling in the Face of Uncertain Resource Consumption and Utility

    NASA Technical Reports Server (NTRS)

    Frank, Jeremy; Dearden, Richard

    2003-01-01

We discuss the problem of scheduling tasks that consume uncertain amounts of a resource with known capacity and where the tasks have uncertain utility. In these circumstances, we would like to find schedules that exceed a lower bound on the expected utility when executed. We show that the problems are NP-complete, and present some results that characterize the behavior of some simple heuristics over a variety of problem classes.

  19. Task Assignment Heuristics for Parallel and Distributed CFD Applications

    NASA Technical Reports Server (NTRS)

    Lopez-Benitez, Noe; Djomehri, M. Jahed; Biswas, Rupak

    2003-01-01

This paper proposes a task graph (TG) model to represent a single discrete step of multi-block overset grid computational fluid dynamics (CFD) applications. The TG model is then used to not only balance the computational workload across the overset grids but also to reduce inter-grid communication costs. We have developed a set of task assignment heuristics based on the constraints inherent in this class of CFD problems. Two basic assignments, the smallest task first (STF) and the largest task first (LTF), are first presented; they are then systematically extended to reduce communication costs. To predict the performance of the proposed task assignment heuristics, extensive performance evaluations are conducted on a synthetic TG with tasks defined in terms of the number of grid points in predetermined overlapping grids. A TG derived from a realistic problem with eight million grid points is also used as a test case.
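The two basic assignments admit a short greedy sketch. The grid names, grid-point counts, and processor count below are hypothetical, and communication costs are ignored here.

```python
def largest_task_first(task_sizes, n_procs):
    """Largest Task First (LTF) sketch: walk tasks in decreasing size and
    assign each to the currently least-loaded processor. Task sizes stand
    in for grid-point counts of the overset grids."""
    loads = [0.0] * n_procs
    assignment = {}
    for task, size in sorted(task_sizes.items(), key=lambda kv: -kv[1]):
        p = min(range(n_procs), key=loads.__getitem__)  # least-loaded processor
        assignment[task] = p
        loads[p] += size
    return assignment, loads

assignment, loads = largest_task_first({"g1": 8, "g2": 5, "g3": 4, "g4": 3}, 2)
```

STF is the same loop with ascending order (`key=lambda kv: kv[1]`); LTF generally balances better because the small tasks placed last can smooth out the residual imbalance.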

  20. How do people judge risks: availability heuristic, affect heuristic, or both?

    PubMed

    Pachur, Thorsten; Hertwig, Ralph; Steinmann, Florian

    2012-09-01

    How does the public reckon which risks to be concerned about? The availability heuristic and the affect heuristic are key accounts of how laypeople judge risks. Yet, these two accounts have never been systematically tested against each other, nor have their predictive powers been examined across different measures of the public's risk perception. In two studies, we gauged risk perception in student samples by employing three measures (frequency, value of a statistical life, and perceived risk) and by using a homogeneous (cancer) and a classic set of heterogeneous causes of death. Based on these judgments of risk, we tested precise models of the availability heuristic and the affect heuristic and different definitions of availability and affect. Overall, availability-by-recall, a heuristic that exploits people's direct experience of occurrences of risks in their social network, conformed to people's responses best. We also found direct experience to carry a high degree of ecological validity (and one that clearly surpasses that of affective information). However, the relative impact of affective information (as compared to availability) proved more pronounced in value-of-a-statistical-life and perceived-risk judgments than in risk-frequency judgments. Encounters with risks in the media, in contrast, played a negligible role in people's judgments. Going beyond the assumption of exclusive reliance on either availability or affect, we also found evidence for mechanisms that combine both, either sequentially or in a composite fashion. We conclude with a discussion of policy implications of our results, including how to foster people's risk calibration and the success of education campaigns.
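Availability-by-recall, the best-conforming model above, has a very direct computational reading: tally occurrences of each risk among people in one's own social network and judge the risk with the higher tally as more frequent. A sketch with hypothetical names and events:

```python
def availability_by_recall(occurrences, network, risk_a, risk_b):
    """Decide which of two risks is more frequent by counting recalled
    occurrences within one's own social network; media encounters are
    ignored, matching the finding that they played a negligible role."""
    tally = {risk_a: 0, risk_b: 0}
    for person, risk in occurrences:
        if person in network and risk in tally:
            tally[risk] += 1
    if tally[risk_a] == tally[risk_b]:
        return None  # heuristic does not discriminate; fall back or guess
    return risk_a if tally[risk_a] > tally[risk_b] else risk_b

network = {"ana", "ben", "eva"}
events = [("ana", "heart disease"), ("ben", "heart disease"),
          ("eva", "accident"), ("zoe", "accident")]  # zoe is outside the network
choice = availability_by_recall(events, network, "heart disease", "accident")
```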

  1. Dynamic Resource Management for Parallel Tasks in an Oversubscribed Energy-Constrained Heterogeneous Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Imam, Neena; Koenig, Gregory A; Machovec, Dylan

    2016-01-01

The worth of completing parallel tasks is modeled using utility functions, which monotonically decrease with time and represent the importance and urgency of a task. These functions define the utility earned by a task at the time of its completion. The performance of such a system is measured as the total utility earned by all completed tasks over some interval of time (e.g., 24 hours). To maximize system performance when scheduling dynamically arriving parallel tasks onto a high performance computing (HPC) system that is oversubscribed and energy-constrained, we have designed, analyzed, and compared different heuristic techniques. Four utility-aware heuristics (i.e., Max Utility, Max Utility-per-Time, Max Utility-per-Resource, and Max Utility-per-Energy), three FCFS-based heuristics (Conservative Backfilling, EASY Backfilling, and FCFS with Multiple Queues), and a Random heuristic were examined in this study. A technique that is often used with the FCFS-based heuristics is the concept of a permanent reservation. We compare the performance of permanent reservations with temporary place-holders to demonstrate the advantages that place-holders can provide. We also present a novel energy filtering technique that constrains the maximum energy-per-resource used by each task. We conducted a simulation study to evaluate the performance of these heuristics and techniques in an energy-constrained oversubscribed HPC environment. With place-holders, energy filtering, and dropping tasks with low potential utility, our utility-aware heuristics are able to significantly outperform the existing FCFS-based techniques.
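A minimal sketch of how a time-decaying utility function can drive a Max Utility-per-Time ranking. The linear decay model, the task tuples, and all numbers are illustrative assumptions, not the study's actual utility functions or workload.

```python
def max_utility_per_time(tasks, now=0.0):
    """Greedy ranking in the spirit of Max Utility-per-Time: order tasks by
    the utility they would earn at completion divided by their runtime.
    `tasks` holds (name, base_utility, decay_per_sec, runtime) tuples and
    utility decays linearly with completion time (an assumed model)."""
    def score(t):
        name, base, decay, runtime = t
        finish = now + runtime
        utility = max(0.0, base - decay * finish)  # utility earned at completion
        return utility / runtime
    return sorted(tasks, key=score, reverse=True)

# Hypothetical workload: (name, base utility, decay rate, runtime in seconds)
tasks = [("A", 100.0, 1.0, 10.0), ("B", 50.0, 0.1, 2.0), ("C", 80.0, 5.0, 8.0)]
order = [t[0] for t in max_utility_per_time(tasks)]
```

Swapping the denominator for requested resources or estimated energy gives the Max Utility-per-Resource and Max Utility-per-Energy variants; dropping the denominator gives plain Max Utility.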

  2. Meta-heuristic algorithms as tools for hydrological science

    NASA Astrophysics Data System (ADS)

    Yoo, Do Guen; Kim, Joong Hoon

    2014-12-01

In this paper, meta-heuristic optimization techniques and their applications to water resources engineering, particularly hydrological science, are introduced. In recent years, meta-heuristic optimization techniques have emerged that can overcome the problems inherent in iterative simulations. These methods are able to find good solutions with limited computation time and memory use, and without requiring complex derivatives. Simulation-based meta-heuristic methods such as Genetic Algorithms (GAs) and Harmony Search (HS) have powerful searching abilities, which can overcome several drawbacks of traditional mathematical methods. For example, the HS algorithm is conceptualized from a musical performance process in which players improvise toward a better harmony; such optimization algorithms analogously seek a near-global optimum as measured by an objective function. In this paper, meta-heuristic algorithms and their applications (with a focus on GAs and HS) in hydrological science are discussed by subject, including a review of the existing literature in the field. Recent trends in optimization are then presented, and a relatively new technique, the Smallest Small World Cellular Harmony Search (SSWCHS), is briefly introduced, with a summary of promising results obtained in previous studies. Overall, previous studies have demonstrated that meta-heuristic algorithms are effective tools for the development of hydrological models and the management of water resources.
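The musical analogy maps onto a compact optimizer. A minimal Harmony Search sketch, assuming the standard parameters (harmony memory size `hms`, memory considering rate `hmcr`, pitch adjusting rate `par`) and a toy objective rather than any specific hydrological model:

```python
import random

def harmony_search(f, bounds, hms=10, hmcr=0.9, par=0.3, iters=2000, seed=0):
    """Minimal Harmony Search: improvise a new harmony by drawing each
    component from memory (prob. hmcr), optionally pitch-adjusting it
    (prob. par), or sampling at random; keep it if it beats the worst
    harmony in memory. Minimizes f over the box `bounds`."""
    rng = random.Random(seed)
    rand_x = lambda: [rng.uniform(lo, hi) for lo, hi in bounds]
    memory = sorted((rand_x() for _ in range(hms)), key=f)
    for _ in range(iters):
        new = []
        for d, (lo, hi) in enumerate(bounds):
            if rng.random() < hmcr:                       # memory consideration
                v = rng.choice(memory)[d]
                if rng.random() < par:                    # pitch adjustment
                    v += rng.uniform(-1, 1) * 0.05 * (hi - lo)
            else:                                         # random improvisation
                v = rng.uniform(lo, hi)
            new.append(min(hi, max(lo, v)))
        if f(new) < f(memory[-1]):                        # replace worst harmony
            memory[-1] = new
            memory.sort(key=f)
    return memory[0]

best = harmony_search(lambda x: sum(v * v for v in x), [(-5, 5)] * 2)
```

In a hydrological application, `f` would instead score a candidate parameter set by running the simulation model and returning its calibration error.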

  3. Distributed deep learning networks among institutions for medical imaging.

    PubMed

    Chang, Ken; Balachandar, Niranjan; Lam, Carson; Yi, Darvin; Brown, James; Beers, Andrew; Rosen, Bruce; Rubin, Daniel L; Kalpathy-Cramer, Jayashree

    2018-03-29

    Deep learning has become a promising approach for automated support for clinical diagnosis. When medical data samples are limited, collaboration among multiple institutions is necessary to achieve high algorithm performance. However, sharing patient data often has limitations due to technical, legal, or ethical concerns. In this study, we propose methods of distributing deep learning models as an attractive alternative to sharing patient data. We simulate the distribution of deep learning models across 4 institutions using various training heuristics and compare the results with a deep learning model trained on centrally hosted patient data. The training heuristics investigated include ensembling single institution models, single weight transfer, and cyclical weight transfer. We evaluated these approaches for image classification in 3 independent image collections (retinal fundus photos, mammography, and ImageNet). We find that cyclical weight transfer resulted in a performance that was comparable to that of centrally hosted patient data. We also found that there is an improvement in the performance of cyclical weight transfer heuristic with a high frequency of weight transfer. We show that distributing deep learning models is an effective alternative to sharing patient data. This finding has implications for any collaborative deep learning study.

  4. Neural Signatures of Rational and Heuristic Choice Strategies: A Single Trial ERP Analysis.

    PubMed

    Wichary, Szymon; Magnuski, Mikołaj; Oleksy, Tomasz; Brzezicka, Aneta

    2017-01-01

In multi-attribute choice, people use heuristics to simplify decision problems. We studied the use of heuristic and rational strategies and their electrophysiological correlates. Since previous work linked the P3 ERP component to attention and decision making, we were interested in whether the amplitude of this component is associated with decision strategy use. To this end, we recorded EEG when participants performed a two-alternative choice task, where they could acquire decision cues in a sequential manner and use them to make choices. We classified participants' choices as consistent with a rational Weighted Additive rule (WADD) or a simple heuristic Take The Best (TTB). Participants differed in their preference for WADD and TTB. Using a permutation-based single trial approach, we analyzed EEG responses to consecutive decision cues and their relation to the individual strategy preference. The preference for WADD over TTB was associated with overall higher signal amplitudes to decision cues in the P3 time window. Moreover, the preference for WADD was associated with similar P3 amplitudes to consecutive cues, whereas the preference for TTB was associated with substantial decreases in P3 amplitudes to consecutive cues. We also found that the preference for TTB was associated with an enhanced N1 component to cues that discriminated decision alternatives, suggesting very early attention allocation to such cues by TTB users. Our results suggest that preference for either WADD or TTB has an early neural signature reflecting differences in attentional weighting of decision cues. In light of recent findings and hypotheses regarding P3, we interpret these results as indicating the involvement of catecholamine arousal systems in shaping predecisional information processing and strategy selection.

  6. Combining heuristic and statistical techniques in landslide hazard assessments

    NASA Astrophysics Data System (ADS)

    Cepeda, Jose; Schwendtner, Barbara; Quan, Byron; Nadim, Farrokh; Diaz, Manuel; Molina, Giovanni

    2014-05-01

As a contribution to the Global Assessment Report 2013 - GAR2013, coordinated by the United Nations International Strategy for Disaster Reduction - UNISDR, a drill-down exercise for landslide hazard assessment was carried out by entering the results of both heuristic and statistical techniques into a new but simple combination rule. The data available for this evaluation included landslide inventories, both historical and event-based. In addition to the application of a heuristic method used in the previous editions of GAR, the availability of inventories motivated the use of statistical methods. The heuristic technique is largely based on the Mora & Vahrson method, which estimates hazard as the product of susceptibility and triggering factors, where classes are weighted based on expert judgment and experience. Two statistical methods were also applied: the landslide index method, which estimates weights of the classes for the susceptibility and triggering factors based on the evidence provided by the density of landslides in each class of the factors; and the weights of evidence method, which extends the previous technique to include both positive and negative evidence of landslide occurrence in the estimation of weights for the classes. One key aspect during the hazard evaluation was the choice of methodology for the final assessment. Instead of opting for a single methodology, it was decided to combine the results of the three implemented techniques using a combination rule based on a normalization of the results of each method. The hazard evaluation was performed for both earthquake- and rainfall-induced landslides. The country chosen for the drill-down exercise was El Salvador. The results indicate that the highest hazard levels are concentrated along the central volcanic chain and at the centre of the northern mountains.

  7. The enduring value of Gánti's chemoton model and life criteria: Heuristic pursuit of exact theoretical biology.

    PubMed

    Griesemer, James

    2015-09-21

    Gánti's chemoton model of the minimal chemical organization of living systems and life criteria for the living state and a living world are characterized. It is argued that these are better interpreted as part of a heuristic pursuit of an exact theoretical biology than as a "definition of life." Several problems with efforts to define life are discussed. Clarifying the proper use of Gánti's ideas to serve constructive engineering idealizations helps to show their enduring value. Copyright © 2015 Elsevier Ltd. All rights reserved.

  8. Optimization Techniques for Clustering,Connectivity, and Flow Problems in Complex Networks

    DTIC Science & Technology

    2012-10-01

discrete optimization and for analysis of performance of algorithm portfolios; introducing a metaheuristic framework of variable objective search that... The results of empirical evaluation of the proposed algorithm are also included. Theoretical analysis of heuristics and designing new metaheuristic... analysis of heuristics for inapproximable problems and designing new metaheuristic approaches for the problems of interest; (IV) Developing new models

  9. A heuristic simulation model of Lake Ontario circulation and mass balance transport

    USGS Publications Warehouse

    McKenna, J.E.; Chalupnicki, M.A.

    2011-01-01

The redistribution of suspended organisms and materials by large-scale currents is part of natural ecological processes in large aquatic systems but can contribute to ecosystem disruption when exotic elements are introduced into the system. Toxic compounds and planktonic organisms spend various lengths of time in suspension before settling to the bottom or otherwise being removed. We constructed a simple physical simulation model, including the influence of major tributaries, to qualitatively examine circulation patterns in Lake Ontario. We used a simple mass balance approach to estimate the relative water input to and export from each of 10 depth regime-specific compartments (nearshore vs. offshore) comprising Lake Ontario. Despite its simplicity, our model produced circulation patterns similar to those reported by more complex studies in the literature. A three-gyre pattern, with the classic large counterclockwise central lake circulation, and a simpler two-gyre system were both observed. These qualitative simulations indicate little offshore transport along the south shore, except near the mouths of the Niagara River and Oswego River. Complex flow structure was evident, particularly near the Niagara River mouth and in offshore waters of the eastern basin. Average Lake Ontario residence time is 8 years, but the fastest model pathway indicated potential transport of plankton through the lake in as little as 60 days. This simulation illustrates potential invasion pathways and provides rough estimates of planktonic larval dispersal or chemical transport among nearshore and offshore areas of Lake Ontario. © 2011 Taylor & Francis.
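The compartment bookkeeping behind such a mass-balance model reduces to volumes and flows: at steady state, a compartment's residence time is its volume divided by its total outflow. A toy sketch with two compartments; the volumes and flows are illustrative, not the study's Lake Ontario data.

```python
def residence_times(volumes, outflows):
    """Steady-state mass-balance sketch: residence time of each compartment
    is volume / total outflow (inflow assumed equal to outflow).
    Units: volumes in km^3, outflows in km^3/yr, result in years."""
    return {name: volumes[name] / outflows[name] for name in volumes}

# Hypothetical nearshore/offshore compartments
rt = residence_times({"nearshore": 100.0, "offshore": 1600.0},
                     {"nearshore": 50.0, "offshore": 200.0})
```

Chaining such compartments with inter-compartment flows gives the kind of transport-pathway estimates the study reports, e.g. the fastest route for plankton through the lake.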

  10. MetaPIGA v2.0: maximum likelihood large phylogeny estimation using the metapopulation genetic algorithm and other stochastic heuristics.

    PubMed

    Helaers, Raphaël; Milinkovitch, Michel C

    2010-07-15

The development, in the last decade, of stochastic heuristics implemented in robust application software has made large phylogeny inference a key step in most comparative studies involving molecular sequences. Still, the choice of a phylogeny inference software is often dictated by a combination of parameters not related to the raw performance of the implemented algorithm(s) but rather by practical issues such as ergonomics and/or the availability of specific functionalities. Here, we present MetaPIGA v2.0, a robust implementation of several stochastic heuristics for large phylogeny inference (under maximum likelihood), including a Simulated Annealing algorithm, a classical Genetic Algorithm, and the Metapopulation Genetic Algorithm (metaGA) together with complex substitution models, discrete Gamma rate heterogeneity, and the possibility to partition data. MetaPIGA v2.0 also implements the Likelihood Ratio Test, the Akaike Information Criterion, and the Bayesian Information Criterion for automated selection of substitution models that best fit the data. Heuristics and substitution models are highly customizable through manual batch files and command line processing. However, MetaPIGA v2.0 also offers an extensive graphical user interface for parameters setting, generating and running batch files, following run progress, and manipulating result trees. MetaPIGA v2.0 uses standard formats for data sets and trees, is platform independent, runs on 32- and 64-bit systems, and takes advantage of multiprocessor and multicore computers. The metaGA resolves the major problem inherent to classical Genetic Algorithms by maintaining high inter-population variation even under strong intra-population selection. Implementation of the metaGA together with additional stochastic heuristics into a single software package will allow rigorous optimization of each heuristic as well as a meaningful comparison of performances among these algorithms.
MetaPIGA v2.0 gives access both to high customization for the phylogeneticist and to an ergonomic interface and functionalities assisting the non-specialist in sound inference of large phylogenetic trees using nucleotide sequences. MetaPIGA v2.0 and its extensive user-manual are freely available to academics at http://www.metapiga.org.

  11. MetaPIGA v2.0: maximum likelihood large phylogeny estimation using the metapopulation genetic algorithm and other stochastic heuristics

    PubMed Central

    2010-01-01

Background The development, in the last decade, of stochastic heuristics implemented in robust application software has made large phylogeny inference a key step in most comparative studies involving molecular sequences. Still, the choice of phylogeny inference software is often dictated by a combination of parameters not related to the raw performance of the implemented algorithm(s) but rather by practical issues such as ergonomics and/or the availability of specific functionalities. Results Here, we present MetaPIGA v2.0, a robust implementation of several stochastic heuristics for large phylogeny inference (under maximum likelihood), including a Simulated Annealing algorithm, a classical Genetic Algorithm, and the Metapopulation Genetic Algorithm (metaGA), together with complex substitution models, discrete Gamma rate heterogeneity, and the possibility to partition data. MetaPIGA v2.0 also implements the Likelihood Ratio Test, the Akaike Information Criterion, and the Bayesian Information Criterion for automated selection of the substitution models that best fit the data. Heuristics and substitution models are highly customizable through manual batch files and command-line processing. However, MetaPIGA v2.0 also offers an extensive graphical user interface for setting parameters, generating and running batch files, following run progress, and manipulating result trees. MetaPIGA v2.0 uses standard formats for data sets and trees, is platform independent, runs on 32- and 64-bit systems, and takes advantage of multiprocessor and multicore computers. Conclusions The metaGA resolves the major problem inherent to classical Genetic Algorithms by maintaining high inter-population variation even under strong intra-population selection. Implementing the metaGA together with additional stochastic heuristics in a single software package allows rigorous optimization of each heuristic as well as a meaningful comparison of performance among these algorithms. MetaPIGA v2.0 offers both high customization for the phylogeneticist and an ergonomic interface with functionalities that assist the non-specialist in sound inference of large phylogenetic trees using nucleotide sequences. MetaPIGA v2.0 and its extensive user manual are freely available to academics at http://www.metapiga.org. PMID:20633263

  12. A Sharp methodology for VLSI layout

    NASA Astrophysics Data System (ADS)

    Bapat, Shekhar

    1993-01-01

The layout problem for VLSI circuits is recognized as a very difficult problem and has traditionally been decomposed into several seemingly independent sub-problems of placement, global routing, and detailed routing. Although this structure achieves a reduction in programming complexity, it is also typically accompanied by a reduction in solution quality. Most current placement research recognizes that the separation is artificial and that the placement and routing problems should ideally be solved in tandem. We propose a new interconnection model, Sharp, and an associated partitioning algorithm. The Sharp interconnection model uses a partitioning shape that roughly resembles the musical sharp (number sign) and makes extensive use of pre-computed rectilinear Steiner trees. The model is designed to generate strategic routing information along with the partitioning results. Additionally, the Sharp model generates estimates of the routing congestion. We also propose the Sharp layout heuristic, which solves the layout problem in its entirety and makes extensive use of the Sharp partitioning model. The use of precomputed Steiner tree forms enables the method to accurately model net characteristics. For example, the Steiner tree forms can model both the length of a net and, more importantly, its route. In fact, the tree forms are also appropriate for modeling the timing delays of nets. The Sharp heuristic works to minimize both the total layout area, by minimizing total net length (thus reducing the total wiring area), and the congestion imbalances in the various channels (thus reducing the unused or wasted channel area). Our heuristic uses circuit element movements amongst the different partitioning blocks and the selection of alternate minimal Steiner tree forms to achieve this goal. The objective function for the algorithm can readily be modified to include other important circuit constraints such as propagation delays. 
The layout technique first computes a very high-level approximation of the layout solution (i.e., the positions of the circuit elements and the associated net routes). The approximate solution is then iteratively refined with respect to the objective function. The technique creates well-defined sub-problems and intermediary steps that can be solved in parallel, as well as a parallel mechanism for merging the sub-problem solutions.

  13. Intuitive Physics: Current Research and Controversies.

    PubMed

    Kubricht, James R; Holyoak, Keith J; Lu, Hongjing

    2017-10-01

    Early research in the field of intuitive physics provided extensive evidence that humans succumb to common misconceptions and biases when predicting, judging, and explaining activity in the physical world. Recent work has demonstrated that, across a diverse range of situations, some biases can be explained by the application of normative physical principles to noisy perceptual inputs. However, it remains unclear how knowledge of physical principles is learned, represented, and applied to novel situations. In this review we discuss theoretical advances from heuristic models to knowledge-based, probabilistic simulation models, as well as recent deep-learning models. We also consider how recent work may be reconciled with earlier findings that favored heuristic models. Copyright © 2017 Elsevier Ltd. All rights reserved.

  14. Critical Factors Analysis for Offshore Software Development Success by Structural Equation Modeling

    NASA Astrophysics Data System (ADS)

    Wada, Yoshihisa; Tsuji, Hiroshi

In order to analyze the success and failure factors in offshore software development services by structural equation modeling, this paper proposes to follow two approaches together: domain-knowledge-based heuristic analysis and factor-analysis-based rational analysis. The former serves to generate and verify hypotheses in order to identify factors and causalities. The latter serves to verify factors introduced by theory, building the model without heuristics. Applying the combined approaches to questionnaire responses from skilled project managers, this paper finds that vendor properties have a stronger causal effect on success than software and project properties.

  15. Weak-noise limit of a piecewise-smooth stochastic differential equation.

    PubMed

    Chen, Yaming; Baule, Adrian; Touchette, Hugo; Just, Wolfram

    2013-11-01

    We investigate the validity and accuracy of weak-noise (saddle-point or instanton) approximations for piecewise-smooth stochastic differential equations (SDEs), taking as an illustrative example a piecewise-constant SDE, which serves as a simple model of Brownian motion with solid friction. For this model, we show that the weak-noise approximation of the path integral correctly reproduces the known propagator of the SDE at lowest order in the noise power, as well as the main features of the exact propagator with higher-order corrections, provided the singularity of the path integral associated with the nonsmooth SDE is treated with some heuristics. We also show that, as in the case of smooth SDEs, the deterministic paths of the noiseless system correctly describe the behavior of the nonsmooth SDE in the low-noise limit. Finally, we consider a smooth regularization of the piecewise-constant SDE and study to what extent this regularization can rectify some of the problems encountered when dealing with discontinuous drifts and singularities in SDEs.
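For orientation, the piecewise-constant SDE studied in this line of work is typically written in the standard dry-friction form (a plausible reconstruction for the reader, not quoted from the abstract):

```latex
\mathrm{d}v_t = -\mu\,\operatorname{sgn}(v_t)\,\mathrm{d}t + \sqrt{2D}\,\mathrm{d}W_t
```

where the drift $-\mu\,\operatorname{sgn}(v_t)$ is a constant-magnitude friction force opposing the velocity $v_t$, $W_t$ is a Wiener process, and the weak-noise limit corresponds to $D \to 0$.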

  16. Customization of ¹³C-MFA strategy according to cell culture system.

    PubMed

    Quek, Lake-Ee; Nielsen, Lars K

    2014-01-01

(13)C-MFA is far from being a simple assay for quantifying metabolic activity. It requires considerable up-front experimental planning and familiarity with the cell culture system in question, as well as optimized analytics and adequate computational frameworks. The success of a (13)C-MFA experiment is ultimately rated by the ability to accurately quantify the flux of one or more reactions of interest. In this chapter, we describe the different (13)C-MFA strategies that have been developed for various fermentation and cell culture systems, as well as the limitations of the respective strategies. The strategies are affected by many factors, and the (13)C-MFA modeling and experimental strategy must be tailored to the conditions at hand. The prevailing philosophy in the computation process is that any metabolic process that produces significant systematic bias in the labeling pattern of the measured metabolites must be described in the model. It is equally important to plan a labeling strategy, whether by analytical screening or by heuristics.

  17. The Uses and Dependency Model of Mass Communication.

    ERIC Educational Resources Information Center

    Rubin, Alan M.; Windahl, Sven

    1986-01-01

    Responds to criticism of the uses and gratification model by proposing a modified model integrating the dependency perspective. Suggests that this integrated model broadens the heuristic application of the earlier model. (MS)

  18. On the distribution of interspecies correlation for Markov models of character evolution on Yule trees.

    PubMed

    Mulder, Willem H; Crawford, Forrest W

    2015-01-07

    Efforts to reconstruct phylogenetic trees and understand evolutionary processes depend fundamentally on stochastic models of speciation and mutation. The simplest continuous-time model for speciation in phylogenetic trees is the Yule process, in which new species are "born" from existing lineages at a constant rate. Recent work has illuminated some of the structural properties of Yule trees, but it remains mostly unknown how these properties affect sequence and trait patterns observed at the tips of the phylogenetic tree. Understanding the interplay between speciation and mutation under simple models of evolution is essential for deriving valid phylogenetic inference methods and gives insight into the optimal design of phylogenetic studies. In this work, we derive the probability distribution of interspecies covariance under Brownian motion and Ornstein-Uhlenbeck models of phenotypic change on a Yule tree. We compute the probability distribution of the number of mutations shared between two randomly chosen taxa in a Yule tree under discrete Markov mutation models. Our results suggest summary measures of phylogenetic information content, illuminate the correlation between site patterns in sequences or traits of related organisms, and provide heuristics for experimental design and reconstruction of phylogenetic trees. Copyright © 2014 Elsevier Ltd. All rights reserved.
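The Yule process mentioned above is simple enough to simulate directly: with k lineages alive, the waiting time to the next speciation is exponential with rate k times the per-lineage birth rate. A minimal sketch (illustrative, not from the paper):

```python
import random

def simulate_yule_times(birth_rate, n_tips, seed=None):
    """Draw the successive speciation times of a pure-birth (Yule) tree.

    Starting from a single lineage, each of the k current lineages
    speciates independently at rate `birth_rate`, so the time to the
    next event is exponential with total rate k * birth_rate.
    """
    rng = random.Random(seed)
    t = 0.0
    times = []
    for k in range(1, n_tips):  # k = number of lineages currently alive
        t += rng.expovariate(k * birth_rate)
        times.append(t)
    return times

times = simulate_yule_times(birth_rate=1.0, n_tips=10, seed=42)
```

Each run yields n_tips - 1 strictly increasing event times; pairing each event with a uniformly chosen lineage to split would extend this to a full tree topology.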

  19. Drift-based scrape-off particle width in X-point geometry

    NASA Astrophysics Data System (ADS)

    Reiser, D.; Eich, T.

    2017-04-01

The Goldston heuristic estimate of the scrape-off layer width (Goldston 2012 Nucl. Fusion 52 013009) is reconsidered using a fluid description of the plasma dynamics. The basic ingredient is the inclusion of a compressible diamagnetic drift for the particle cross-field transport. Instead of testing the heuristic model in a sophisticated numerical simulation including several physical mechanisms working together, the purpose of this work is to point out basic consequences of drift-dominated cross-field transport using a reduced fluid model. To evaluate the model equations and prepare them for subsequent numerical solution, a specific analytical model for 2D magnetic field configurations with X-points is employed. In a first step, parameter scans on high-resolution grids for isothermal plasmas are performed to assess the basic formulas of the heuristic model with respect to the functional dependence of the scrape-off width on the poloidal magnetic field and plasma temperature. Particular features of the 2D fluid calculations, especially the appearance of supersonic parallel flows and shock-wave-like bifurcational jumps, are discussed and can be partly understood in the framework of a reduced 1D model. The resulting semi-analytical findings may offer guidance for experimental verification and for implementation in more elaborate fluid simulations.

  20. Cue reliability and a landmark stability heuristic determine relative weighting between egocentric and allocentric visual information in memory-guided reach.

    PubMed

    Byrne, Patrick A; Crawford, J Douglas

    2010-06-01

It is not known how egocentric visual information (location of a target relative to the self) and allocentric visual information (location of a target relative to external landmarks) are integrated to form reach plans. Based on behavioral data from rodents and humans, we hypothesized that the degree of stability in visual landmarks would influence the relative weighting. Furthermore, based on numerous cue-combination studies, we hypothesized that the reach system would act like a maximum-likelihood estimator (MLE), in which the reliability of both cues determines their relative weighting. To predict how these factors might interact, we developed an MLE model that weighs egocentric and allocentric information based on their respective reliabilities and on an additional stability heuristic. We tested the predictions of this model in 10 human subjects by manipulating landmark stability and reliability (via variable-amplitude vibration of the landmarks and variable-amplitude gaze shifts) in three reach-to-touch tasks: an egocentric control (reaching without landmarks), an allocentric control (reaching relative to landmarks), and a cue-conflict task (involving a subtle landmark "shift" during the memory interval). Variability from all three experiments was used to derive parameters for the MLE model, which was then used to simulate egocentric-allocentric weighting in the cue-conflict experiment. As predicted by the model, landmark vibration, despite its lack of influence on pointing variability (and thus allocentric reliability) in the control experiment, had a strong influence on egocentric-allocentric weighting. A reduced model without the stability heuristic was unable to reproduce this effect. These results suggest that heuristics for extrinsic cue stability are at least as important as reliability in determining cue weighting in memory-guided reaching.
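The maximum-likelihood combination underlying such models weights each cue by its inverse variance; a stability heuristic can be grafted on as an additional discount on the landmark cue. The sketch below is a generic illustration of that idea, with hypothetical function names and a simple multiplicative stability factor, not the authors' actual model:

```python
def mle_weights(var_ego, var_allo, stability=1.0):
    """Inverse-variance (maximum-likelihood) cue weights, with an extra
    multiplicative `stability` factor in [0, 1] down-weighting the
    allocentric (landmark) cue when landmarks appear unstable.
    stability=1 recovers the standard MLE combination.
    """
    r_ego = 1.0 / var_ego                  # reliability = inverse variance
    r_allo = stability * (1.0 / var_allo)  # landmark cue discounted by stability
    total = r_ego + r_allo
    return r_ego / total, r_allo / total

def combine(x_ego, x_allo, var_ego, var_allo, stability=1.0):
    """Weighted average of the two position estimates."""
    w_ego, w_allo = mle_weights(var_ego, var_allo, stability)
    return w_ego * x_ego + w_allo * x_allo

w = mle_weights(1.0, 1.0)  # equal reliabilities -> equal weights
```

With stability near zero the combination collapses to the egocentric estimate regardless of landmark reliability, which is the qualitative behavior the full model needs to reproduce the vibration effect.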

  1. Symbolic Heuristic Search for Factored Markov Decision Processes

    NASA Technical Reports Server (NTRS)

    Morris, Robert (Technical Monitor); Feng, Zheng-Zhu; Hansen, Eric A.

    2003-01-01

    We describe a planning algorithm that integrates two approaches to solving Markov decision processes with large state spaces. State abstraction is used to avoid evaluating states individually. Forward search from a start state, guided by an admissible heuristic, is used to avoid evaluating all states. We combine these two approaches in a novel way that exploits symbolic model-checking techniques and demonstrates their usefulness for decision-theoretic planning.

  2. Storage Costs and Heuristics Interact to Produce Patterns of Aphasic Sentence Comprehension Performance

    PubMed Central

    Clark, David Glenn

    2012-01-01

    Background: Despite general agreement that aphasic individuals exhibit difficulty understanding complex sentences, the nature of sentence complexity itself is unresolved. In addition, aphasic individuals appear to make use of heuristic strategies for understanding sentences. This research is a comparison of predictions derived from two approaches to the quantification of sentence complexity, one based on the hierarchical structure of sentences, and the other based on dependency locality theory (DLT). Complexity metrics derived from these theories are evaluated under various assumptions of heuristic use. Method: A set of complexity metrics was derived from each general theory of sentence complexity and paired with assumptions of heuristic use. Probability spaces were generated that summarized the possible patterns of performance across 16 different sentence structures. The maximum likelihood of comprehension scores of 42 aphasic individuals was then computed for each probability space and the expected scores from the best-fitting points in the space were recorded for comparison to the actual scores. Predictions were then compared using measures of fit quality derived from linear mixed effects models. Results: All three of the metrics that provide the most consistently accurate predictions of patient scores rely on storage costs based on the DLT. Patients appear to employ an Agent–Theme heuristic, but vary in their tendency to accept heuristically generated interpretations. Furthermore, the ability to apply the heuristic may be degraded in proportion to aphasia severity. Conclusion: DLT-derived storage costs provide the best prediction of sentence comprehension patterns in aphasia. Because these costs are estimated by counting incomplete syntactic dependencies at each point in a sentence, this finding suggests that aphasia is associated with reduced availability of cognitive resources for maintaining these dependencies. PMID:22590462

  3. Storage costs and heuristics interact to produce patterns of aphasic sentence comprehension performance.

    PubMed

    Clark, David Glenn

    2012-01-01

    Despite general agreement that aphasic individuals exhibit difficulty understanding complex sentences, the nature of sentence complexity itself is unresolved. In addition, aphasic individuals appear to make use of heuristic strategies for understanding sentences. This research is a comparison of predictions derived from two approaches to the quantification of sentence complexity, one based on the hierarchical structure of sentences, and the other based on dependency locality theory (DLT). Complexity metrics derived from these theories are evaluated under various assumptions of heuristic use. A set of complexity metrics was derived from each general theory of sentence complexity and paired with assumptions of heuristic use. Probability spaces were generated that summarized the possible patterns of performance across 16 different sentence structures. The maximum likelihood of comprehension scores of 42 aphasic individuals was then computed for each probability space and the expected scores from the best-fitting points in the space were recorded for comparison to the actual scores. Predictions were then compared using measures of fit quality derived from linear mixed effects models. All three of the metrics that provide the most consistently accurate predictions of patient scores rely on storage costs based on the DLT. Patients appear to employ an Agent-Theme heuristic, but vary in their tendency to accept heuristically generated interpretations. Furthermore, the ability to apply the heuristic may be degraded in proportion to aphasia severity. DLT-derived storage costs provide the best prediction of sentence comprehension patterns in aphasia. Because these costs are estimated by counting incomplete syntactic dependencies at each point in a sentence, this finding suggests that aphasia is associated with reduced availability of cognitive resources for maintaining these dependencies.

  4. Does interaction matter? Testing whether a confidence heuristic can replace interaction in collective decision-making

    PubMed Central

    Bang, Dan; Fusaroli, Riccardo; Tylén, Kristian; Olsen, Karsten; Latham, Peter E.; Lau, Jennifer Y.F.; Roepstorff, Andreas; Rees, Geraint; Frith, Chris D.; Bahrami, Bahador

    2014-01-01

    In a range of contexts, individuals arrive at collective decisions by sharing confidence in their judgements. This tendency to evaluate the reliability of information by the confidence with which it is expressed has been termed the ‘confidence heuristic’. We tested two ways of implementing the confidence heuristic in the context of a collective perceptual decision-making task: either directly, by opting for the judgement made with higher confidence, or indirectly, by opting for the faster judgement, exploiting an inverse correlation between confidence and reaction time. We found that the success of these heuristics depends on how similar individuals are in terms of the reliability of their judgements and, more importantly, that for dissimilar individuals such heuristics are dramatically inferior to interaction. Interaction allows individuals to alleviate, but not fully resolve, differences in the reliability of their judgements. We discuss the implications of these findings for models of confidence and collective decision-making. PMID:24650632

  5. The effects of heuristic cues, motivation, and ability on systematic processing of information about breast cancer environmental factors.

    PubMed

    Smith, Sandi W; Hitt, Rose; Nazione, Samantha; Russell, Jessica; Silk, Kami; Atkin, Charles K

    2013-01-01

The heuristic-systematic model is used to investigate how ability, motivation, and heuristic message cues predict knowledge scores for individuals receiving messages, written for different literacy levels, about three environmental risk factors for breast cancer: the roles of genetics, progesterone, and ingestion of perfluorooctanoic acid. In this study, more than 4,000 women participated in an online survey. The results supported the hypotheses that ability (measured as education, number of science courses, and confidence in scientific ability) predicts knowledge gain and that individuals who were presented with the lower-literacy-level message had significantly higher knowledge scores across all three message topics. There was little support for motivation or heuristic cues as direct predictors of knowledge gain across the three message topics, although they served as moderators for the perfluorooctanoic acid topic. The authors provide implications for health communication practitioners.

  6. Meta-RaPS Algorithm for the Aerial Refueling Scheduling Problem

    NASA Technical Reports Server (NTRS)

    Kaplan, Sezgin; Arin, Arif; Rabadi, Ghaith

    2011-01-01

The Aerial Refueling Scheduling Problem (ARSP) can be defined as determining the refueling completion times for each fighter aircraft (job) on multiple tankers (machines). ARSP assumes that jobs have different release times and due dates. The total weighted tardiness is used to evaluate a schedule's quality. Therefore, ARSP can be modeled as parallel machine scheduling with release times and due dates to minimize the total weighted tardiness. Since ARSP is NP-hard, it is more appropriate to develop an approximate or heuristic algorithm to obtain solutions in reasonable computation times. In this paper, the Meta-RaPS-ATC algorithm is implemented to create high-quality solutions. Meta-RaPS (Metaheuristic for Randomized Priority Search) is a recent and promising metaheuristic that is applied by introducing randomness into a construction heuristic. The Apparent Tardiness Cost (ATC) rule, which is well suited to scheduling problems with a tardiness objective, is used to construct initial solutions, which are then improved by an exchange operation. Results are presented for generated instances.
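The ATC index itself is a standard scheduling priority: a job's weight-to-processing-time ratio, discounted exponentially by its slack. A minimal sketch under assumed parameter names (each job is a tuple of weight w, processing time p, due date d; k is a look-ahead parameter), not the paper's implementation:

```python
import math

def atc_priority(job, t, k, p_bar):
    """Apparent Tardiness Cost index of a job at time t.

    job = (w, p, d); p_bar is the average processing time over all jobs,
    and k scales how quickly priority decays with slack.
    """
    w, p, d = job
    slack = max(d - p - t, 0.0)           # time to spare before tardiness
    return (w / p) * math.exp(-slack / (k * p_bar))

def next_job(jobs, t, k=2.0):
    """Pick the job with the highest ATC priority at time t."""
    p_bar = sum(p for _, p, _ in jobs) / len(jobs)
    return max(jobs, key=lambda job: atc_priority(job, t, k, p_bar))

# Illustrative instance: the high-weight, tight-due-date job should win.
urgent = next_job([(1.0, 2.0, 10.0), (5.0, 2.0, 3.0), (1.0, 2.0, 100.0)], t=0.0)
```

Meta-RaPS would randomize this construction step, e.g. by sometimes picking among the top-priority candidates at random rather than always taking the maximum.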

  7. Inhibition of misleading heuristics as a core mechanism for typical cognitive development: evidence from behavioural and brain-imaging studies.

    PubMed

    Borst, Grégoire; Aïte, Ania; Houdé, Olivier

    2015-04-01

Cognitive development is generally conceived as incremental, with knowledge of increasing complexity acquired throughout childhood and adolescence. However, several studies have now demonstrated not only that infants possess complex cognitive abilities but also that older children, adolescents, and adults tend to make systematic errors even in simple logical reasoning tasks. Therefore, one of the main issues for any theory of typical cognitive development is to explain why, at some ages and in some contexts, children, adolescents, and adults fail to express knowledge or a cognitive principle that they already acquired when they were younger. In this review, we present convergent behavioural and neurocognitive evidence that cognitive development is more similar to a non-linear dynamic system than to a linear, stage-like system. In this theoretical framework, errors can emerge, in problems similar to ones that infants or young children can solve, when older children, adolescents, and adults rely on a misleading heuristic rather than on the correct logical algorithm. The core mechanism for overcoming these errors is inhibitory control (i.e. the ability to inhibit misleading heuristics). Therefore, typical cognitive development relies not only on the ability to acquire knowledge of incremental complexity but also on the ability to inhibit previously acquired knowledge. © 2015 The Authors. Developmental Medicine & Child Neurology © 2015 Mac Keith Press.

  8. A usability evaluation exploring the design of American Nurses Association state web sites.

    PubMed

    Alexander, Gregory L; Wakefield, Bonnie J; Anbari, Allison B; Lyons, Vanessa; Prentice, Donna; Shepherd, Marilyn; Strecker, E Bradley; Weston, Marla J

    2014-08-01

National leaders are calling for opportunities to facilitate the Future of Nursing. Such opportunities can be encouraged through state nurses association Web sites, part of the American Nurses Association, that are well designed, have appropriate content, and use language professional nurses understand. The American Nurses Association and constituent state nurses associations provide information about nursing practice, ethics, credentialing, and health on their Web sites. We conducted usability evaluations to determine compliance with heuristic and ethical principles for Web site design. We purposefully sampled 27 nursing association Web sites and used 68 heuristic and ethical criteria to perform systematic usability assessments of them. The analysis was performed by seven double experts, all RNs trained in usability analysis. The extent to which heuristic and ethical criteria were met ranged widely, from one state that met 0% of the criteria for "help and documentation" to states that met greater than 92% of the criteria for "visibility of system status" and "aesthetic and minimalist design." The suggested improvements are simple yet make an impact on a first-time visitor's impression of a Web site. For example, adding internal navigation and tracking features and providing more details about the application process through help and frequently-asked-questions documentation would facilitate better use. Improved usability will improve effectiveness, efficiency, and consumer satisfaction with these Web sites.

  9. Choice Rules and Accumulator Networks

    PubMed Central

    2015-01-01

    This article presents a preference accumulation model that can be used to implement a number of different multi-attribute heuristic choice rules, including the lexicographic rule, the majority of confirming dimensions (tallying) rule and the equal weights rule. The proposed model differs from existing accumulators in terms of attribute representation: Leakage and competition, typically applied only to preference accumulation, are also assumed to be involved in processing attribute values. This allows the model to perform a range of sophisticated attribute-wise comparisons, including comparisons that compute relative rank. The ability of a preference accumulation model composed of leaky competitive networks to mimic symbolic models of heuristic choice suggests that these 2 approaches are not incompatible, and that a unitary cognitive model of preferential choice, based on insights from both these approaches, may be feasible. PMID:28670592
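The kind of leaky competitive dynamics described here can be sketched in a few lines: each unit is driven by its input, decays through leakage, is inhibited by the other units' summed activity, and is floored at zero. The parameter values below are arbitrary illustrations, and noise is omitted for clarity; this is a generic leaky competing accumulator, not the article's specific network:

```python
def step(acts, inputs, leak=0.1, comp=0.2, dt=0.1):
    """One Euler step of a leaky competing accumulator layer."""
    total = sum(acts)
    new = []
    for a, inp in zip(acts, inputs):
        # drive minus self-leakage minus inhibition from the other units
        da = inp - leak * a - comp * (total - a)
        new.append(max(0.0, a + dt * da))  # activations floored at zero
    return new

def run(inputs, steps=200):
    """Iterate the dynamics from rest and return final activations."""
    acts = [0.0] * len(inputs)
    for _ in range(steps):
        acts = step(acts, inputs)
    return acts

acts = run([1.0, 0.5])
```

Because the inhibition here is stronger than the leak, the difference between units grows over time and the layer behaves as a winner-take-all: the unit with the larger input ends up dominating, which is the kind of attribute-wise comparison the model exploits.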

  10. Fracture heuristics: surgical decision for approaches to distal radius fractures. A surgeon's perspective.

    PubMed

    Wichlas, Florian; Tsitsilonis, Serafim; Kopf, Sebastian; Krapohl, Björn Dirk; Manegold, Sebastian

    2017-01-01

Introduction: The aim of the present study is to develop a heuristic that could replace the surgeon's analysis in deciding on the operative approach for distal radius fractures, based on simple fracture characteristics. Patients and methods: Five hundred distal radius fractures operated on between 2011 and 2014 were analyzed with respect to the surgeon's decision on the approach used. The 500 distal radius fractures were treated with open reduction and internal fixation through palmar, dorsal, and dorsopalmar approaches with 2.4 mm locking plates or underwent percutaneous fixation. The parameters that should replace the surgeon's analysis were the fractured palmar cortex and the frontal and sagittal splits of the articular surface of the distal radius. Results: The palmar approach was used for 422 (84.4%) fractures, the dorsal approach for 39 (7.8%), and the combined dorsopalmar approach for 30 (6.0%). Nine (1.8%) fractures were treated percutaneously. The correlation between the fractured palmar cortex and the use of the palmar approach was moderate (r=0.464; p<0.0001). The correlation between the frontal split and the dorsal approach, including the dorsopalmar approach, was strong (r=0.715; p<0.0001). The sagittal split had only a weak correlation with the dorsal and dorsopalmar approaches (r=0.300; p<0.0001). Discussion: The study shows that the surgical decision on the preferred approach is dictated by two simple factors, even in complex fractures. Conclusion: When the palmar cortex is displaced in a distal radius fracture, a palmar approach should be used. When there is a displaced frontal split of the articular surface, a dorsal approach should be used. When both are present, a dorsopalmar approach should be used. These two simple parameters could replace the surgeon's analysis in choosing the surgical approach.
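The concluding rule lends itself to a direct transcription. The sketch below paraphrases it; the boolean inputs and the percutaneous default for undisplaced fractures are illustrative assumptions, not the paper's data model:

```python
def choose_approach(palmar_cortex_displaced, frontal_split_displaced):
    """Surgical-approach heuristic paraphrased from the article's conclusion."""
    if palmar_cortex_displaced and frontal_split_displaced:
        return "dorsopalmar"   # both displaced -> combined approach
    if frontal_split_displaced:
        return "dorsal"        # displaced frontal split -> dorsal
    if palmar_cortex_displaced:
        return "palmar"        # displaced palmar cortex -> palmar
    return "percutaneous"      # assumption: neither displaced
```

Encoding such a rule makes its coverage explicit: every combination of the two parameters maps to exactly one approach.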

  11. Quantifying the origins of life on a planetary scale

    PubMed Central

    Scharf, Caleb; Cronin, Leroy

    2016-01-01

    A simple, heuristic formula with parallels to the Drake Equation is introduced to help focus discussion on open questions for the origins of life in a planetary context. This approach indicates a number of areas where quantitative progress can be made on parameter estimation for determining origins of life probabilities, based on constraints from Bayesian approaches. We discuss a variety of “microscale” factors and their role in determining “macroscale” abiogenesis probabilities on suitable planets. We also propose that impact ejecta exchange between planets with parallel chemistries and chemical evolution could in principle amplify the development of molecular complexity and abiogenesis probabilities. This amplification could be very significant, and both bias our conclusions about abiogenesis probabilities based on the Earth and provide a major source of variance in the probability of life arising in planetary systems. We use our heuristic formula to suggest a number of observational routes for improving constraints on origins of life probabilities. PMID:27382156
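In the spirit of the Drake Equation, such a formula multiplies independent factors to yield an expected number of events. The sketch below is purely illustrative; all factor names are hypothetical stand-ins, not the paper's actual parameters:

```python
def expected_abiogenesis_events(n_building_blocks, origin_rate_per_set,
                                n_suitable_sites, time_available):
    """Drake-style product: expected number of origin-of-life events.

    Every factor is a hypothetical stand-in: a count of available
    chemical building-block sets, a per-set origin probability per unit
    time, the number of suitable microenvironments, and the time window.
    """
    return (n_building_blocks * origin_rate_per_set
            * n_suitable_sites * time_available)

estimate = expected_abiogenesis_events(2, 0.5, 3, 4)
```

The point of such a factorization is diagnostic rather than predictive: each factor isolates one quantity on which observational or Bayesian constraints can be sought independently.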

  12. Sniffer Channel Selection for Monitoring Wireless LANs

    NASA Astrophysics Data System (ADS)

    Song, Yuan; Chen, Xian; Kim, Yoo-Ah; Wang, Bing; Chen, Guanling

    Wireless sniffers are often used to monitor APs in wireless LANs (WLANs) for network management, fault detection, traffic characterization, and optimizing deployment. It is cost-effective to deploy single-radio sniffers that can monitor multiple nearby APs. However, since nearby APs often operate on orthogonal channels, a sniffer needs to switch among multiple channels to monitor its nearby APs. In this paper, we formulate and solve two optimization problems on sniffer channel selection. Both problems require that each AP be monitored by at least one sniffer. In addition, one optimization problem requires minimizing the maximum number of channels that a sniffer listens to, and the other requires minimizing the total number of channels that the sniffers listen to. We propose a novel LP-relaxation based algorithm, and two simple greedy heuristics for the above two optimization problems. Through simulation, we demonstrate that all the algorithms are effective in achieving their optimization goals, and the LP-based algorithm outperforms the greedy heuristics.
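The "minimize total channels" variant is closely related to set cover, for which a greedy heuristic is the textbook approach. The sketch below is a generic greedy set-cover pass over (sniffer, channel) pairs, not the authors' specific algorithms:

```python
def greedy_channel_selection(coverage):
    """coverage: dict mapping (sniffer, channel) -> set of APs that the
    sniffer can overhear on that channel.  Greedily pick the pair that
    covers the most still-unmonitored APs until every AP is monitored,
    and return the chosen (sniffer, channel) pairs.  A generic greedy
    set-cover sketch under the constraint that each AP be monitored."""
    uncovered = set().union(*coverage.values())
    chosen = []
    while uncovered:
        best = max(coverage, key=lambda pc: len(coverage[pc] & uncovered))
        if not coverage[best] & uncovered:
            raise ValueError("some APs cannot be monitored by any sniffer")
        chosen.append(best)
        uncovered -= coverage[best]
    return chosen

# Hypothetical deployment: two sniffers, APs a-d spread over channels 1 and 6.
coverage = {("s1", 1): {"a", "b"}, ("s1", 6): {"c"}, ("s2", 6): {"c", "d"}}
picks = greedy_channel_selection(coverage)
```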

  13. Data analysis using scale-space filtering and Bayesian probabilistic reasoning

    NASA Technical Reports Server (NTRS)

    Kulkarni, Deepak; Kutulakos, Kiriakos; Robinson, Peter

    1991-01-01

    This paper describes a program for the analysis of output curves from a Differential Thermal Analyzer (DTA). The program first extracts probabilistic qualitative features from the DTA curve of a soil sample, and then uses Bayesian probabilistic reasoning to infer the minerals in the soil. The qualifier module employs a simple and efficient extension of scale-space filtering suitable for handling DTA data. We have observed that points can vanish from contours in the scale-space image when filtering operations are not highly accurate. To handle the problem of vanishing points, perceptual organization heuristics are used to group the points into lines. Next, these lines are grouped into contours by using additional heuristics. Probabilities are associated with these contours using domain-specific correlations. A Bayes tree classifier processes probabilistic features to infer the presence of different minerals in the soil. Experiments show that the algorithm that uses domain-specific correlations to infer qualitative features outperforms a domain-independent algorithm that does not.

  14. Decision Making in Paediatric Cardiology. Are We Prone to Heuristics, Biases and Traps?

    PubMed

    Ryan, Aedin; Duignan, Sophie; Kenny, Damien; McMahon, Colin J

    2018-01-01

    Hidden traps in decision making have long been recognised in the behavioural economics community. Yet we spend very limited time, if any, analysing our decision-making processes in medicine and paediatric cardiology. System 1 and System 2 thought processes differentiate between rapid emotional thoughts and slow deliberate rational thoughts. For fairly clear-cut medical decisions, in-depth analysis may not be needed, but in our field of paediatric cardiology it is not uncommon for challenging cases, and occasionally 'simple' cases, to generate significant debate and uncertainty as to the best decision. Although morbidity and mortality meetings frequently highlight poor outcomes for our patients, they often neglect to analyse the thought processes that underlay the decisions taken. This article reviews commonly acknowledged traps in decision making in the behavioural economics world to ascertain whether these heuristics translate to decision making in the paediatric cardiology environment. We also discuss potential individual and collective solutions to pitfalls in decision making.

  15. Mixed Integer Programming and Heuristic Scheduling for Space Communication Networks

    NASA Technical Reports Server (NTRS)

    Cheung, Kar-Ming; Lee, Charles H.

    2012-01-01

    We developed a framework and the mathematical formulation for optimizing a communication network using mixed integer programming. The design yields a system with a much smaller search space than the earlier approach. Our constrained network optimization takes into account the dynamics of link performance within the network along with mission and operation requirements. A unique penalty function is introduced to transform the mixed integer program into the more manageable problem of searching in a continuous space. We propose to solve the constrained optimization problem in two stages: first using the heuristic Particle Swarm Optimization algorithm to get a good initial starting point, and then feeding the result into the Sequential Quadratic Programming algorithm to achieve the final optimal schedule. We demonstrate the above planning and scheduling methodology with a scenario of 20 spacecraft and 3 ground stations of a Deep Space Network site. Our approach and framework are simple and flexible, so that problems with larger numbers of constraints and larger networks can be easily adapted and solved.

  16. A heuristic evaluation of long-term global sea level acceleration

    NASA Astrophysics Data System (ADS)

    Spada, Giorgio; Olivieri, Marco; Galassi, Gaia

    2015-05-01

    In view of the scientific and social implications, the global mean sea level rise (GMSLR), its possible causes, and its future trend have long been a challenge. For the twentieth century, reconstructions generally indicate a rate of GMSLR in the range of 1.5 to 2.0 mm yr-1. However, the existence of nonlinear trends is still debated, and current estimates of the secular acceleration are subject to ample uncertainties. Here we use various GMSLR estimates published in scholarly journals since the 1940s for a heuristic assessment of global sea level acceleration. The approach, an alternative to sea level reconstructions, is based on simple statistical methods and exploits the principles of meta-analysis. Our results point to a global sea level acceleration of 0.54 ± 0.27 mm/yr/century (1σ) between 1898 and 1975. This supports independent estimates and suggests that a sea level acceleration since the early 1900s is more likely than currently believed.
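The meta-analytic idea, fitting a trend to published rate estimates so that the slope of rate versus epoch gives the acceleration, can be sketched with synthetic numbers (the values below are illustrative, not the study's data):

```python
import numpy as np

# Synthetic (epoch, GMSLR rate in mm/yr) pairs -- illustrative values only,
# not the published estimates the study aggregates.
epochs = np.array([1900.0, 1920.0, 1940.0, 1960.0, 1980.0])
rates = np.array([1.4, 1.5, 1.65, 1.75, 1.85])

# If the rate itself drifts linearly with time, its slope is the sea level
# acceleration; convert from mm/yr per year to mm/yr per century.
slope, intercept = np.polyfit(epochs, rates, 1)
acceleration_per_century = slope * 100.0
```

A real meta-analysis would also weight each estimate by its reported uncertainty, which `np.polyfit` supports through its `w` argument.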

  17. The coupling of cerebral blood flow and oxygen metabolism with brain activation is similar for simple and complex stimuli in human primary visual cortex.

    PubMed

    Griffeth, Valerie E M; Simon, Aaron B; Buxton, Richard B

    2015-01-01

    Quantitative functional MRI (fMRI) experiments to measure blood flow and oxygen metabolism coupling in the brain typically rely on simple repetitive stimuli. Here we compared such stimuli with a more naturalistic stimulus. Previous work on the primary visual cortex showed that direct attentional modulation evokes a blood flow (CBF) response with a relatively large oxygen metabolism (CMRO2) response in comparison to an unattended stimulus, which evokes a much smaller metabolic response relative to the flow response. We hypothesized that a similar effect would be associated with a more engaging stimulus, and tested this by measuring the primary human visual cortex response to two contrast levels of a radial flickering checkerboard in comparison to the response to free viewing of brief movie clips. We did not find a significant difference in the blood flow-metabolism coupling (n=%ΔCBF/%ΔCMRO2) between the movie stimulus and the flickering checkerboards employing two different analysis methods: a standard analysis using the Davis model and a new analysis using a heuristic model dependent only on measured quantities. This finding suggests that in the primary visual cortex a naturalistic stimulus (in comparison to a simple repetitive stimulus) is either not sufficient to provoke a change in flow-metabolism coupling by attentional modulation as hypothesized, that the experimental design disrupted the cognitive processes underlying the response to a more natural stimulus, or that the technique used is not sensitive enough to detect a small difference. Copyright © 2014 Elsevier Inc. All rights reserved.

  18. Heuristics to Evaluate Interactive Systems for Children with Autism Spectrum Disorder (ASD).

    PubMed

    Khowaja, Kamran; Salim, Siti Salwah; Asemi, Adeleh

    2015-01-01

    In this paper, we adapted and expanded a set of usability evaluation guidelines, also known as heuristics, to make them appropriate for software aimed at children with autism spectrum disorder (ASD). We started from the heuristics developed by Nielsen in 1990 and developed a modified set of 15 heuristics. The first 5 heuristics of this set are the same as those of the original Nielsen set, the next 5 heuristics are improved versions of Nielsen's, and the last 5 heuristics are new. We present two evaluation studies of our new heuristics. In the first, two groups compared Nielsen's set with the modified set of heuristics, with each group evaluating two interactive systems. Nielsen's heuristics were assigned to the control group while the experimental group was given the modified set of heuristics, and a statistical analysis was conducted to determine the effectiveness of the modified set, the contribution of the 5 new heuristics and the impact of the 5 improved heuristics. The results show that the modified set is significantly more effective than the original, and we found a significant difference between the five improved heuristics and their corresponding heuristics in the original set. The five new heuristics are effective in problem identification using the modified set. The second study was conducted using a system which was developed to ascertain if the modified set was effective at identifying usability problems that could be fixed before the release of software. The post-study analysis revealed that the majority of the usability problems identified by the experts were fixed in the updated version of the system.

  19. Novel Methods for Analysing Bacterial Tracks Reveal Persistence in Rhodobacter sphaeroides

    PubMed Central

    Rosser, Gabriel; Fletcher, Alexander G.; Wilkinson, David A.; de Beyer, Jennifer A.; Yates, Christian A.; Armitage, Judith P.; Maini, Philip K.; Baker, Ruth E.

    2013-01-01

    Tracking bacteria using video microscopy is a powerful experimental approach to probe their motile behaviour. The trajectories obtained contain much information relating to the complex patterns of bacterial motility. However, methods for the quantitative analysis of such data are limited. Most swimming bacteria move in approximately straight lines, interspersed with random reorientation phases. It is therefore necessary to segment observed tracks into swimming and reorientation phases to extract useful statistics. We present novel robust analysis tools to discern these two phases in tracks. Our methods comprise a simple and effective protocol for removing spurious tracks from tracking datasets, followed by analysis based on a two-state hidden Markov model, taking advantage of the availability of mutant strains that exhibit swimming-only or reorientating-only motion to generate an empirical prior distribution. Using simulated tracks with varying levels of added noise, we validate our methods and compare them with an existing heuristic method. To our knowledge this is the first example of a systematic assessment of analysis methods in this field. The new methods are substantially more robust to noise and introduce less systematic bias than the heuristic method. We apply our methods to tracks obtained from the bacterial species Rhodobacter sphaeroides and Escherichia coli. Our results demonstrate that R. sphaeroides exhibits persistence over the course of a tumbling event, which is a novel result with important implications in the study of this and similar species. PMID:24204227
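The two-state segmentation can be illustrated with a minimal Viterbi decoder for a two-state HMM (a generic sketch; the authors' pipeline additionally uses empirical priors derived from swimming-only and reorientating-only mutants):

```python
import numpy as np

def viterbi_two_state(log_emission, log_trans, log_init):
    """Most likely state sequence for a two-state HMM.
    log_emission: (T, 2) log-likelihood of each observation under each state;
    log_trans: (2, 2) log transition matrix; log_init: (2,) log initial probs."""
    T = log_emission.shape[0]
    score = np.zeros((T, 2))
    back = np.zeros((T, 2), dtype=int)
    score[0] = log_init + log_emission[0]
    for t in range(1, T):
        for s in range(2):
            cand = score[t - 1] + log_trans[:, s]
            back[t, s] = np.argmax(cand)
            score[t, s] = cand[back[t, s]] + log_emission[t, s]
    states = np.zeros(T, dtype=int)
    states[-1] = np.argmax(score[-1])
    for t in range(T - 2, -1, -1):
        states[t] = back[t + 1, states[t + 1]]
    return states

# Toy observations that strongly favour state 0 (swim) then state 1
# (reorient), with sticky transitions -- hypothetical numbers.
log_e = np.log(np.array([[0.9, 0.1]] * 5 + [[0.1, 0.9]] * 5))
log_t = np.log(np.array([[0.9, 0.1], [0.1, 0.9]]))
log_i = np.log(np.array([0.5, 0.5]))
states = viterbi_two_state(log_e, log_t, log_i)
```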

  20. Parental investment: how an equity motive can produce inequality.

    PubMed

    Hertwig, Ralph; Davis, Jennifer Nerissa; Sulloway, Frank J

    2002-09-01

    The equity heuristic is a decision rule specifying that parents should attempt to subdivide resources more or less equally among their children. This investment rule coincides with the prescription from optimality models in economics and biology in cases in which expected future return for each offspring is equal. In this article, the authors present a counterintuitive implication of the equity heuristic: Whereas an equity motive produces a fair distribution at any given point in time, it yields a cumulative distribution of investments that is unequal. The authors test this analytical observation against evidence reported in studies exploring parental investment and show how the equity heuristic can provide an explanation of why the literature reports a diversity of birth order effects with respect to parental resource allocation.
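The counterintuitive result, equal treatment at every moment yet unequal cumulative shares, is easy to reproduce in a toy simulation. Birth spacing and the years-at-home window below are hypothetical parameters, not values from the article:

```python
def cumulative_investment(birth_years, years_at_home=18, budget_per_year=1.0):
    """Apply the equity heuristic: each year, split the parental budget
    equally among all children currently at home, and accumulate each
    child's lifetime share.  A toy model of the article's analytical point."""
    leave = [b + years_at_home for b in birth_years]
    totals = [0.0] * len(birth_years)
    for year in range(min(birth_years), max(leave)):
        at_home = [i for i, b in enumerate(birth_years) if b <= year < leave[i]]
        for i in at_home:
            totals[i] += budget_per_year / len(at_home)
    return totals

# Three children born 3 years apart: treatment is equal at every moment,
# yet the middleborn's cumulative share ends up smallest, because the
# firstborn and lastborn each enjoy a period without siblings at home.
shares = cumulative_investment([0, 3, 6])
```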

  1. Prediction-based dynamic load-sharing heuristics

    NASA Technical Reports Server (NTRS)

    Goswami, Kumar K.; Devarakonda, Murthy; Iyer, Ravishankar K.

    1993-01-01

    The authors present dynamic load-sharing heuristics that use predicted resource requirements of processes to manage workloads in a distributed system. A previously developed statistical pattern-recognition method is employed for resource prediction. While nonprediction-based heuristics depend on a rapidly changing system status, the new heuristics depend on slowly changing program resource usage patterns. Furthermore, prediction-based heuristics can be more effective since they use future requirements rather than just the current system state. Four prediction-based heuristics, two centralized and two distributed, are presented. Using trace-driven simulations, they are compared against random scheduling and two effective nonprediction-based heuristics. Results show that the prediction-based centralized heuristics achieve up to 30 percent better response times than the nonprediction centralized heuristic, and that the prediction-based distributed heuristics achieve up to 50 percent improvements relative to their nonprediction counterparts.
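A minimal illustration of the core idea, placing a process using its predicted resource demands rather than only the instantaneous system state, might look like this (host names, resource types, and the min-max placement rule are our own illustration, not one of the paper's four heuristics):

```python
def place_process(predicted_demand, hosts):
    """Prediction-based placement sketch: send the process to the host that
    minimizes the worst per-resource utilization after adding the process's
    predicted resource demands."""
    return min(hosts, key=lambda h: max(hosts[h][r] + predicted_demand[r]
                                        for r in predicted_demand))

# Hypothetical current loads: host "a" is CPU-bound, host "b" is I/O-bound.
hosts = {"a": {"cpu": 0.6, "io": 0.1}, "b": {"cpu": 0.2, "io": 0.7}}
# A process predicted to be CPU-hungry fits host "b" better, and an
# I/O-hungry one fits host "a" -- the prediction changes the placement.
```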

  2. Glass microneedles for force measurements: a finite-element analysis model

    PubMed Central

    Ayittey, Peter N.; Walker, John S.; Rice, Jeremy J.; de Tombe, Pieter P.

    2010-01-01

    Changes in developed force (0.1–3.0 μN) observed during contraction of single myofibrils in response to rapidly changing calcium concentrations can be measured using glass microneedles. These microneedles are calibrated for stiffness and deflect on response to developed myofibril force. The precision and accuracy of kinetic measurements are highly dependent on the structural and mechanical characteristics of the microneedles, which are generally assumed to have a linear force–deflection relationship. We present a finite-element analysis (FEA) model used to simulate the effects of measurable geometry on stiffness as a function of applied force and validate our model with actual measured needle properties. In addition, we developed a simple heuristic constitutive equation that best describes the stiffness of our range of microneedles used and define limits of geometry parameters within which our predictions hold true. Our model also maps a relation between the geometry parameters and natural frequencies in air, enabling optimum parametric combinations for microneedle fabrication that would reflect more reliable force measurement in fluids and physiological environments. We propose a use for this model to aid in the design of microneedles to improve calibration time, reproducibility, and precision for measuring myofibrillar, cellular, and supramolecular kinetic forces. PMID:19104827

  3. Multiscale modeling and distributed computing to predict cosmesis outcome after a lumpectomy

    NASA Astrophysics Data System (ADS)

    Garbey, M.; Salmon, R.; Thanoon, D.; Bass, B. L.

    2013-07-01

    Surgery for early stage breast carcinoma is either total mastectomy (complete breast removal) or surgical lumpectomy (tumor removal only). The lumpectomy, or partial mastectomy, is intended to preserve a breast that satisfies the woman's cosmetic, emotional and physical needs. But in a fairly large number of cases the cosmetic outcome is not satisfactory. Today, prediction of that surgical outcome is essentially based on heuristics. Modeling such a complex process must encompass multiple scales, in space from cells to tissue, as well as in time, from minutes for the tissue mechanics to months for healing. The goal of this paper is to present a first step in multiscale modeling of the long-time-scale prediction of breast shape after tumor resection. This task requires coupling very different mechanical and biological models with very different computing needs. We provide a simple illustration of the application of heterogeneous distributed computing and modular software design to speed up model development. Our computational framework currently serves to test hypotheses on breast tissue healing in a pilot study with women who have elected to undergo BCT and are being treated at the Methodist Hospital in Houston, TX.

  4. A model for diagnosing and explaining multiple disorders.

    PubMed

    Jamieson, P W

    1991-08-01

    The ability to diagnose multiple interacting disorders and explain them in a coherent causal framework has only partially been achieved in medical expert systems. This paper proposes a causal model for diagnosing and explaining multiple disorders whose key elements are: physician-directed hypothesis generation, object-oriented knowledge representation, and novel explanation heuristics. The heuristics modify and link the explanations to make the physician aware of diagnostic complexities. A computer program incorporating the model is currently in use for diagnosing peripheral nerve and muscle disorders. The program successfully diagnoses and explains interactions between diseases in terms of underlying pathophysiologic concepts. The model offers a new architecture for medical domains where reasoning from first principles is difficult but explanation of disease interactions is crucial for the system's operation.

  5. A comparison of Heuristic method and Llewellyn’s rules for identification of redundant constraints

    NASA Astrophysics Data System (ADS)

    Estiningsih, Y.; Farikhin; Tjahjana, R. H.

    2018-03-01

    An important technique in linear programming is the modelling and solving of practical optimization problems. Redundant constraints are considered for their effects on general linear programming problems. Identifying and removing redundant constraints avoids all the calculations associated with them when solving an associated linear programming problem. Many methods have been proposed for the identification of redundant constraints. This paper compares the Heuristic method and Llewellyn's rules for the identification of redundant constraints.
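A simple way to see what redundancy identification involves in two dimensions: enumerate the feasible vertices and flag constraints that are never tight. This is an illustrative check under non-degeneracy assumptions, not Llewellyn's rules or the Heuristic method compared in the paper:

```python
import itertools
import numpy as np

def redundant_constraints_2d(A, b, tol=1e-9):
    """Heuristic redundancy check for a 2-D system A x <= b: enumerate the
    feasible intersection points (vertices) of constraint pairs; a constraint
    that is tight at no vertex does not bound the feasible region and is
    flagged redundant.  Assumes a bounded, non-degenerate region."""
    A, b = np.asarray(A, float), np.asarray(b, float)
    m = len(b)
    tight = [False] * m
    for i, j in itertools.combinations(range(m), 2):
        M = A[[i, j]]
        if abs(np.linalg.det(M)) < tol:
            continue  # parallel constraint pair: no intersection point
        x = np.linalg.solve(M, b[[i, j]])
        if np.all(A @ x <= b + tol):  # intersection is a feasible vertex
            tight[i] = tight[j] = True
    return [k for k in range(m) if not tight[k]]
```

For example, with x >= 0, y >= 0, x + y <= 4, x <= 3, the extra constraint x + 2y <= 10 never binds and is reported redundant.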

  6. A Transferrable Belief Model Representation for Physical Security of Nuclear Materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    David Gerts

    This work analyzed various probabilistic methods such as classic statistics, Bayesian inference, possibilistic theory, and the Dempster-Shafer theory of belief functions for the potential insight offered into the physical security of nuclear materials, as well as broader application to automated decision making in nuclear non-proliferation. A review of the fundamental heuristic and basic limitations of each of these methods suggested that the Dempster-Shafer theory of belief functions may offer significant capability. Further examination of the various interpretations of Dempster-Shafer theory, such as random set, generalized Bayesian, and upper/lower probability, demonstrates some limitations. Compared to the other heuristics, the transferrable belief model (TBM), one of the leading interpretations of Dempster-Shafer theory, can improve the automated detection of violations of physical security using sensors and human judgment. The improvement is shown to give a significant heuristic advantage over other probabilistic options by demonstrating significant successes for several classic gedanken experiments.

  7. A Hidden Markov Model Approach to the Problem of Heuristic Selection in Hyper-Heuristics with a Case Study in High School Timetabling Problems.

    PubMed

    Kheiri, Ahmed; Keedwell, Ed

    2017-01-01

    Operations research is a well-established field that uses computational systems to support decisions in business and public life. Good solutions to operations research problems can make a large difference to the efficient running of businesses and organisations and so the field often searches for new methods to improve these solutions. The high school timetabling problem is an example of an operations research problem and is a challenging task which requires assigning events and resources to time slots subject to a set of constraints. In this article, a new sequence-based selection hyper-heuristic is presented that produces excellent results on a suite of high school timetabling problems. In this study, we present an easy-to-implement, easy-to-maintain, and effective sequence-based selection hyper-heuristic to solve high school timetabling problems using a benchmark of unified real-world instances collected from different countries. We show that with sequence-based methods, it is possible to discover new best known solutions for a number of the problems in the timetabling domain. Through this investigation, the usefulness of sequence-based selection hyper-heuristics has been demonstrated and the capability of these methods has been shown to exceed the state of the art.
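A sequence-based selection hyper-heuristic can be sketched as sampling short heuristic sequences from a learned transition matrix and reinforcing sequences that improve the incumbent. The toy problem and update rule below are our own minimal illustration, not the authors' method:

```python
import random

def sequence_hyper_heuristic(cost, init, heuristics, iters=200, seq_len=3, seed=0):
    """Sample sequences of low-level heuristics from a first-order transition
    matrix, apply them to the incumbent, accept improvements, and reinforce
    transitions that produced an improvement."""
    rng = random.Random(seed)
    n = len(heuristics)
    trans = [[1.0] * n for _ in range(n)]  # additively smoothed counts
    best, best_cost, prev = init, cost(init), 0
    for _ in range(iters):
        seq, h = [], prev
        for _ in range(seq_len):
            h = rng.choices(range(n), weights=trans[h])[0]
            seq.append(h)
        cand = best
        for h in seq:
            cand = heuristics[h](cand, rng)
        c = cost(cand)
        if c < best_cost:
            best, best_cost = cand, c
            a = prev
            for h in seq:  # reinforce the successful sequence
                trans[a][h] += 1.0
                a = h
        prev = seq[-1]
    return best, best_cost

# Toy domain standing in for a timetable: minimize a sum of squared
# violations over an integer vector, with two low-level move operators.
perturb = lambda x, rng: [v + rng.choice([-1, 0, 1]) for v in x]
halve = lambda x, rng: [v // 2 for v in x]
best, best_cost = sequence_hyper_heuristic(
    lambda x: sum(v * v for v in x), [7, -5, 3], [perturb, halve])
```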

  8. Decision heuristic or preference? Attribute non-attendance in discrete choice problems.

    PubMed

    Heidenreich, Sebastian; Watson, Verity; Ryan, Mandy; Phimister, Euan

    2018-01-01

    This paper investigates whether respondents' choice not to consider all characteristics of a multiattribute health service may represent preferences. Over the last decade, an increasing number of studies account for attribute non-attendance (ANA) when using discrete choice experiments to elicit individuals' preferences. Most studies assume such behaviour is a heuristic and therefore uninformative. This assumption may result in misleading welfare estimates if ANA reflects preferences. This is the first paper to assess whether ANA is a heuristic or a genuine preference without relying on respondents' self-stated motivation, and the first study to explore this question within a health context. Based on findings from cognitive psychology, we expect that familiar respondents are less likely than unfamiliar respondents to use a decision heuristic to simplify choices. We employ a latent class model of discrete choice experiment data concerned with National Health Service managers' preferences for support services that assist with performance concerns. We present quantitative and qualitative evidence that in our study ANA mostly represents preferences. We also show that wrong assumptions about ANA result in inadequate welfare measures that can lead to suboptimal policy advice. Future research should proceed with caution when assuming that ANA is a heuristic. Copyright © 2017 John Wiley & Sons, Ltd.

  9. Multi-objective Decision Based Available Transfer Capability in Deregulated Power System Using Heuristic Approaches

    NASA Astrophysics Data System (ADS)

    Pasam, Gopi Krishna; Manohar, T. Gowri

    2016-09-01

    Determination of available transfer capability (ATC) requires experience, intuition and exact judgment in order to address several significant aspects of the deregulated environment. Based on these points, this paper proposes two heuristic approaches to compute ATC. The first proposed heuristic algorithm integrates five methods, namely continuation repeated power flow, repeated optimal power flow, radial basis function neural network, back propagation neural network and adaptive neuro-fuzzy inference system, to obtain ATC. The second proposed heuristic model is used to obtain multiple ATC values. Out of these, a specific ATC value is selected based on a number of social, economic and deregulated environmental constraints, and on specific applications such as optimization, on-line monitoring, and ATC forecasting; this is known as multi-objective decision based optimal ATC. The validity of the results obtained through these proposed methods is rigorously verified on various buses of the IEEE 24-bus reliable test system. The results and conclusions presented in this paper are very useful for planning, operating, and maintaining reliable power in any power system, and for its monitoring in an on-line environment of a deregulated power system. In this way, the proposed heuristic methods provide the best possible approach to assessing multi-objective ATC using integrated methods.

  10. Heuristics to Evaluate Interactive Systems for Children with Autism Spectrum Disorder (ASD)

    PubMed Central

    Khowaja, Kamran; Salim, Siti Salwah

    2015-01-01

    In this paper, we adapted and expanded a set of guidelines, also known as heuristics, to evaluate the usability of software to now be appropriate for software aimed at children with autism spectrum disorder (ASD). We started from the heuristics developed by Nielsen in 1990 and developed a modified set of 15 heuristics. The first 5 heuristics of this set are the same as those of the original Nielsen set, the next 5 heuristics are improved versions of Nielsen's, whereas the last 5 heuristics are new. We present two evaluation studies of our new heuristics. In the first, two groups compared Nielsen’s set with the modified set of heuristics, with each group evaluating two interactive systems. The Nielsen’s heuristics were assigned to the control group while the experimental group was given the modified set of heuristics, and a statistical analysis was conducted to determine the effectiveness of the modified set, the contribution of 5 new heuristics and the impact of 5 improved heuristics. The results show that the modified set is significantly more effective than the original, and we found a significant difference between the five improved heuristics and their corresponding heuristics in the original set. The five new heuristics are effective in problem identification using the modified set. The second study was conducted using a system which was developed to ascertain if the modified set was effective at identifying usability problems that could be fixed before the release of software. The post-study analysis revealed that the majority of the usability problems identified by the experts were fixed in the updated version of the system. PMID:26196385

  11. Real time algorithms for sharp wave ripple detection.

    PubMed

    Sethi, Ankit; Kemere, Caleb

    2014-01-01

    Neural activity during sharp wave ripples (SWR), short bursts of co-ordinated oscillatory activity in the CA1 region of the rodent hippocampus, is implicated in a variety of memory functions from consolidation to recall. Detection of these events in an algorithmic framework has thus far relied on simple thresholding techniques with heuristically derived parameters. This study is an investigation into testing and improving the current methods for detection of SWR events in neural recordings. We propose and profile methods to reduce latency in ripple detection. The proposed algorithms are tested on simulated ripple data. The findings show that simple real-time algorithms can improve upon existing power thresholding methods and can detect ripple activity with latencies in the range of 10-20 ms.
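The baseline the study improves on, power thresholding with heuristically chosen parameters, can be sketched as follows (the threshold rule and toy trace are illustrative, not the authors' detector):

```python
import numpy as np

def detect_ripples(power, k=3.0):
    """Simple SWR power-threshold detector: flag samples where ripple-band
    power exceeds mean + k standard deviations, and return (start, end)
    sample-index pairs (end exclusive) of contiguous supra-threshold runs.
    `power` is assumed already band-filtered and smoothed; k is the kind of
    heuristically derived parameter the abstract refers to."""
    power = np.asarray(power, float)
    thresh = power.mean() + k * power.std()
    above = np.concatenate(([False], power > thresh, [False]))
    d = np.diff(above.astype(int))
    starts = np.flatnonzero(d == 1)
    ends = np.flatnonzero(d == -1)
    return [(int(s), int(e)) for s, e in zip(starts, ends)]

# Toy power trace: flat baseline with one burst of elevated power.
power = np.zeros(100)
power[40:45] = 10.0
events = detect_ripples(power)
```

Note that a detector like this must see much of the event before the threshold crossing is confirmed, which is one source of the latency the paper targets.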

  12. From Heuristic to Mathematical Modeling of Drugs Dissolution Profiles: Application of Artificial Neural Networks and Genetic Programming

    PubMed Central

    Mendyk, Aleksander; Güres, Sinan; Szlęk, Jakub; Wiśniowska, Barbara; Kleinebudde, Peter

    2015-01-01

    The purpose of this work was to develop a mathematical model of the drug dissolution (Q) from solid lipid extrudates based on an empirical approach. Artificial neural networks (ANNs) and genetic programming (GP) tools were used. Sensitivity analysis of the ANNs provided a reduction of the original input vector. GP allowed creation of the mathematical equation in two major approaches: (1) direct modeling of Q versus extrudate diameter (d) and the time variable (t) and (2) indirect modeling through the Weibull equation. ANNs also provided information about the minimum achievable generalization error and the way to enhance the original dataset used for adjustment of the equations' parameters. Two inputs were found important for the drug dissolution: d and t. The extrudate length (L) was found not important. Both GP modeling approaches allowed creation of relatively simple equations with predictive performance comparable to the ANNs (root mean squared error (RMSE) from 2.19 to 2.33). The direct mode of GP modeling of Q versus d and t resulted in the most robust model. The idea of how to combine ANNs and GP in order to escape ANNs' black-box drawback without losing their superior predictive performance was demonstrated. Open Source software was used to deliver the state-of-the-art models and modeling strategies. PMID:26101544

  13. From Heuristic to Mathematical Modeling of Drugs Dissolution Profiles: Application of Artificial Neural Networks and Genetic Programming.

    PubMed

    Mendyk, Aleksander; Güres, Sinan; Jachowicz, Renata; Szlęk, Jakub; Polak, Sebastian; Wiśniowska, Barbara; Kleinebudde, Peter

    2015-01-01

    The purpose of this work was to develop a mathematical model of the drug dissolution (Q) from solid lipid extrudates based on an empirical approach. Artificial neural networks (ANNs) and genetic programming (GP) tools were used. Sensitivity analysis of the ANNs provided a reduction of the original input vector. GP allowed creation of the mathematical equation in two major approaches: (1) direct modeling of Q versus extrudate diameter (d) and the time variable (t) and (2) indirect modeling through the Weibull equation. ANNs also provided information about the minimum achievable generalization error and the way to enhance the original dataset used for adjustment of the equations' parameters. Two inputs were found important for the drug dissolution: d and t. The extrudate length (L) was found not important. Both GP modeling approaches allowed creation of relatively simple equations with predictive performance comparable to the ANNs (root mean squared error (RMSE) from 2.19 to 2.33). The direct mode of GP modeling of Q versus d and t resulted in the most robust model. The idea of how to combine ANNs and GP in order to escape ANNs' black-box drawback without losing their superior predictive performance was demonstrated. Open Source software was used to deliver the state-of-the-art models and modeling strategies.
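The indirect route through the Weibull equation, approach (2) above, can be illustrated by linearizing Q(t) = Qmax·(1 − exp(−(t/λ)^k)) and fitting with least squares. The parameter values below are synthetic for illustration, not data from the paper:

```python
import numpy as np

def fit_weibull_dissolution(t, q, q_max=100.0):
    """Fit the Weibull dissolution profile Q(t) = q_max*(1 - exp(-(t/lam)**k))
    by linearizing: log(-log(1 - Q/q_max)) = k*log(t) - k*log(lam), then a
    straight-line least-squares fit recovers k and lam.  An illustrative
    sketch of indirect Weibull modeling, not the paper's GP-derived model."""
    y = np.log(-np.log(1.0 - np.asarray(q) / q_max))
    k, c = np.polyfit(np.log(t), y, 1)
    lam = np.exp(-c / k)
    return k, lam

# Recover known parameters from noiseless synthetic data (illustrative).
t = np.array([1.0, 2.0, 4.0, 8.0, 16.0])
q_true = 100.0 * (1 - np.exp(-(t / 5.0) ** 1.3))
k_fit, lam_fit = fit_weibull_dissolution(t, q_true)
```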

  14. A Heuristic Fast Method to Solve the Nonlinear Schroedinger Equation in Fiber Bragg Gratings with Arbitrary Shape Input Pulse

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Emami, F.; Hatami, M.; Keshavarz, A. R.

    2009-08-13

    Using a combination of the Runge-Kutta and Jacobi iterative methods, we solve the nonlinear Schroedinger equation describing pulse propagation in FBGs. By decomposing the electric field into forward and backward components in the fiber Bragg grating and utilizing Fourier series analysis, the boundary value problem of a set of coupled equations governing pulse propagation in the FBG is transformed into an initial-value problem of coupled equations, which can be solved by the simple Runge-Kutta method.
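The "simple Runge-Kutta method" the abstract refers to is the classical fourth-order scheme; a generic step (shown here on a scalar ODE, without the Jacobi iteration or the FBG coupled equations) looks like:

```python
def rk4_step(f, t, y, h):
    """One classical fourth-order Runge-Kutta step for y' = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

# Sanity check on y' = -y from y(0) = 1: ten steps of h = 0.1 track
# the exact solution exp(-t) to within roughly 1e-6.
y, t, h = 1.0, 0.0, 0.1
for _ in range(10):
    y = rk4_step(lambda t, y: -y, t, y, h)
    t += h
```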

  15. Observer properties for understanding dynamical displays: Capacities, limitations, and defaults

    NASA Technical Reports Server (NTRS)

    Proffitt, Dennis R.; Kaiser, Mary K.

    1991-01-01

    People's ability to extract relevant information while viewing ongoing events is discussed in terms of human capabilities, limitations, and defaults. A taxonomy of event complexity is developed that predicts which dynamical events people can and cannot construe. This taxonomy is related to the distinction drawn in classical mechanics between particle and extended-body motions. People's commonsense understandings of simple mechanical systems are affected little by formal training; rather, they reflect heuristic simplifications that focus on a single dimension of perceived dynamical relevance.

  16. Familiarity and Recollection in Heuristic Decision Making

    PubMed Central

    Schwikert, Shane R.; Curran, Tim

    2014-01-01

    Heuristics involve the ability to utilize memory to make quick judgments by exploiting fundamental cognitive abilities. In the current study we investigated the memory processes that contribute to the recognition heuristic and the fluency heuristic, which are both presumed to capitalize on the by-products of memory to make quick decisions. In Experiment 1, we used a city-size comparison task while recording event-related potentials (ERPs) to investigate the potential contributions of familiarity and recollection to the two heuristics. ERPs were markedly different for recognition heuristic-based decisions and fluency heuristic-based decisions, suggesting a role for familiarity in the recognition heuristic and recollection in the fluency heuristic. In Experiment 2, we coupled the same city-size comparison task with measures of subjective pre-experimental memory for each stimulus in the task. Although previous literature suggests the fluency heuristic relies on recognition speed alone, our results suggest differential contributions of recognition speed and recollected knowledge to these decisions, whereas the recognition heuristic relies on familiarity. Based on these results, we created a new theoretical framework that explains decisions attributed to both heuristics based on the underlying memory associated with the choice options. PMID:25347534
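
    The two decision rules compared in this study have simple canonical forms: the recognition heuristic applies when exactly one option is recognized, and the fluency heuristic applies when both are recognized but one was retrieved noticeably faster. A minimal sketch of those rules, with a hypothetical retrieval-time `threshold` for what counts as a discriminable fluency difference:

```python
def recognition_heuristic(recognized_a, recognized_b):
    """If exactly one option is recognized, infer it has the larger
    criterion value (e.g., the larger city); otherwise abstain."""
    if recognized_a and not recognized_b:
        return "A"
    if recognized_b and not recognized_a:
        return "B"
    return None  # heuristic does not apply

def fluency_heuristic(rt_a, rt_b, threshold=0.1):
    """Both options recognized: choose the one retrieved faster,
    provided the retrieval-time difference is discriminable."""
    if abs(rt_a - rt_b) < threshold:
        return None
    return "A" if rt_a < rt_b else "B"

print(recognition_heuristic(True, False))  # A
print(fluency_heuristic(0.4, 0.9))         # A
```

    The present experiments suggest the underlying memory signals differ: familiarity supports the first rule, while recollected knowledge also feeds the second.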

  17. Familiarity and recollection in heuristic decision making.

    PubMed

    Schwikert, Shane R; Curran, Tim

    2014-12-01

    Heuristics involve the ability to utilize memory to make quick judgments by exploiting fundamental cognitive abilities. In the current study we investigated the memory processes that contribute to the recognition heuristic and the fluency heuristic, which are both presumed to capitalize on the byproducts of memory to make quick decisions. In Experiment 1, we used a city-size comparison task while recording event-related potentials (ERPs) to investigate the potential contributions of familiarity and recollection to the 2 heuristics. ERPs were markedly different for recognition heuristic-based decisions and fluency heuristic-based decisions, suggesting a role for familiarity in the recognition heuristic and recollection in the fluency heuristic. In Experiment 2, we coupled the same city-size comparison task with measures of subjective preexperimental memory for each stimulus in the task. Although previous literature suggests the fluency heuristic relies on recognition speed alone, our results suggest differential contributions of recognition speed and recollected knowledge to these decisions, whereas the recognition heuristic relies on familiarity. Based on these results, we created a new theoretical framework that explains decisions attributed to both heuristics based on the underlying memory associated with the choice options. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  18. Incentives for Optimal Multi-level Allocation of HIV Prevention Resources

    PubMed Central

    Malvankar, Monali M.; Zaric, Gregory S.

    2013-01-01

    HIV/AIDS prevention funds are often allocated at multiple levels of decision making. An optimal allocation of HIV prevention funds maximizes the number of HIV infections averted; however, decision makers often allocate using simple heuristics such as proportional allocation. We evaluate the impact of using incentives to encourage optimal allocation in a two-level decision-making process. We model an incentive-based decision-making process consisting of an upper-level decision maker allocating funds to a single lower-level decision maker, who then distributes funds to local programs. We assume that the lower-level utility function is linear in the amount of the budget received from the upper level, the fraction of funds reserved for proportional allocation, and the number of infections averted. We assume that the upper-level objective is to maximize the number of infections averted. We illustrate with an example using data from California, U.S. PMID:23766551
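
    With a linear objective, the gap between optimal and proportional allocation is easy to illustrate. The sketch below is not the authors' model: the effectiveness values and spending caps are invented, and "optimal" here simply means greedy funding in order of cost-effectiveness, which is optimal for a linear objective with per-program caps.

```python
def averted_linear(alloc, effectiveness):
    """Infections averted under a linear production function."""
    return sum(a * e for a, e in zip(alloc, effectiveness))

budget = 100.0
eff = [0.8, 0.5, 0.2]      # hypothetical infections averted per unit of funding
cap = [60.0, 60.0, 60.0]   # hypothetical spending cap per program

# Optimal for a linear model: fund programs in order of effectiveness.
remaining, optimal = budget, [0.0] * 3
for i in sorted(range(3), key=lambda i: -eff[i]):
    optimal[i] = min(cap[i], remaining)
    remaining -= optimal[i]

# Proportional heuristic: split the budget evenly across programs.
proportional = [budget / 3] * 3

print(averted_linear(optimal, eff), averted_linear(proportional, eff))
```

    The difference between the two printed totals is the cost of the heuristic, which is what an upper-level incentive scheme would try to recover.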

  19. Efficient prediction designs for random fields.

    PubMed

    Müller, Werner G; Pronzato, Luc; Rendas, Joao; Waldl, Helmut

    2015-03-01

    For estimation and prediction of random fields, it is increasingly acknowledged that the kriging variance may be a poor representative of the true uncertainty. Experimental designs based on more elaborate criteria that are appropriate for empirical kriging (EK) are then often non-space-filling and very costly to determine. In this paper, we investigate the possibility of using a compound criterion, inspired by an equivalence-theorem type relation, to build designs that are quasi-optimal for the EK variance when space-filling designs become unsuitable. Two algorithms are proposed: one relies on stochastic optimization to explicitly identify the Pareto front, whereas the second uses the surrogate criterion as a local heuristic to choose the points at which the (costly) true EK variance is actually computed. We illustrate the performance of the algorithms on both a simple simulated example and a real oceanographic dataset. © 2014 The Authors. Applied Stochastic Models in Business and Industry published by John Wiley & Sons, Ltd.

  20. SPARSE: quadratic time simultaneous alignment and folding of RNAs without sequence-based heuristics

    PubMed Central

    Will, Sebastian; Otto, Christina; Miladi, Milad; Möhl, Mathias; Backofen, Rolf

    2015-01-01

    Motivation: RNA-Seq experiments have revealed a multitude of novel ncRNAs. The gold standard for their analysis, based on simultaneous alignment and folding, suffers from extreme time complexity of O(n^6). Subsequently, numerous faster ‘Sankoff-style’ approaches have been suggested. Commonly, the performance of such methods relies on sequence-based heuristics that restrict the search space to optimal or near-optimal sequence alignments; however, the accuracy of sequence-based methods breaks down for RNAs with sequence identities below 60%. Alignment approaches like LocARNA, which do not require sequence-based heuristics, have been limited to high complexity (≥ quartic time). Results: Breaking this barrier, we introduce the novel Sankoff-style algorithm ‘sparsified prediction and alignment of RNAs based on their structure ensembles (SPARSE)’, which runs in quadratic time without sequence-based heuristics. To achieve this low complexity, on par with sequence alignment algorithms, SPARSE features strong sparsification based on structural properties of the RNA ensembles. Following PMcomp, SPARSE gains further speed-up from lightweight energy computation. Although all existing lightweight Sankoff-style methods restrict Sankoff’s original model by disallowing loop deletions and insertions, SPARSE transfers the Sankoff algorithm to the lightweight energy model completely for the first time. Compared with LocARNA, SPARSE achieves similar alignment and better folding quality in significantly less time (speedup: 3.7). At similar run-time, it aligns low-sequence-identity instances substantially more accurately than RAF, which uses sequence-based heuristics. Availability and implementation: SPARSE is freely available at http://www.bioinf.uni-freiburg.de/Software/SPARSE. Contact: backofen@informatik.uni-freiburg.de Supplementary information: Supplementary data are available at Bioinformatics online. PMID:25838465

  1. Establishing usability heuristics for heuristics evaluation in a specific domain: Is there a consensus?

    PubMed

    Hermawati, Setia; Lawson, Glyn

    2016-09-01

    Heuristic evaluation is frequently employed to evaluate usability. While general heuristics are suitable for evaluating most user interfaces, there is still a need to establish heuristics for specific domains to ensure that domain-specific usability issues are identified. This paper presents a comprehensive review of 70 studies related to usability heuristics for specific domains. The aim of this paper is to review the processes that were applied to establish heuristics in specific domains and to identify gaps, in order to provide recommendations for future research and areas for improvement. The most urgent issue found is the deficiency of validation effort following heuristics proposition and the lack of robustness and rigour of the validation methods adopted. Whether domain-specific heuristics perform better or worse than general ones remains inconclusive, owing to the lack of validation quality and of clarity on how to assess the effectiveness of heuristics for specific domains. The lack of validation quality also hampers efforts to improve existing domain-specific heuristics, as their weaknesses are not addressed. Copyright © 2016 Elsevier Ltd. All rights reserved.

  2. Heuristics structure and pervade formal risk assessment.

    PubMed

    MacGillivray, Brian H

    2014-04-01

    Lay perceptions of risk appear rooted more in heuristics than in reason. A major concern of the risk regulation literature is that such "error-strewn" perceptions may be replicated in policy, as governments respond to the (mis)fears of the citizenry. This has led many to advocate a relatively technocratic approach to regulating risk, characterized by heavy reliance on formal risk and cost-benefit analysis. However, through two studies of chemicals regulation, we show that the formal assessment of risk is pervaded by its own set of heuristics. These include rules to categorize potential threats, define what constitutes valid data, guide causal inference, and select and apply formal models. Some of these heuristics lay claim to theoretical or empirical justifications, others are more back-of-the-envelope calculations, while still others purport not to reflect some truth but simply to constrain discretion or perform a desk-clearing function. These heuristics can be understood as a way of authenticating or formalizing risk assessment as a scientific practice, representing a series of rules for bounding problems, collecting data, and interpreting evidence (a methodology). Heuristics are indispensable elements of induction, and so they are not problematic per se; but they can become so when treated as laws rather than as contingent and provisional rules. Pitfalls include the potential for systematic error, masking of uncertainties, strategic manipulation, and entrenchment. Our central claim is that by studying the rules of risk assessment qua rules, we develop a novel representation of the methods, conventions, and biases of the prior art. © 2013 Society for Risk Analysis.

  3. Evolutionary Artificial Neural Network Weight Tuning to Optimize Decision Making for an Abstract Game

    DTIC Science & Technology

    2010-03-01

    separate LoA heuristic. If any of the examined heuristics produced a competitive player, then the final measurement was a success. Barring that, a... if offline training actually results in a successful player. Whereas offline learning plays many games and then trains as many networks as desired... a competitive Lines of Action player, shedding light on the difficulty of developing a neural network to model such a large and complex solution

  4. Pitfalls in Teaching Judgment Heuristics

    ERIC Educational Resources Information Center

    Shepperd, James A.; Koch, Erika J.

    2005-01-01

    Demonstrations of judgment heuristics typically focus on how heuristics can lead to poor judgments. However, exclusive focus on the negative consequences of heuristics can prove problematic. We illustrate the problem with the representativeness heuristic and present a study (N = 45) that examined how examples influence understanding of the…

  5. From anomalies to forecasts: Toward a descriptive model of decisions under risk, under ambiguity, and from experience.

    PubMed

    Erev, Ido; Ert, Eyal; Plonsky, Ori; Cohen, Doron; Cohen, Oded

    2017-07-01

    Experimental studies of choice behavior document distinct, and sometimes contradictory, deviations from maximization. For example, people tend to overweight rare events in 1-shot decisions under risk, and to exhibit the opposite bias when they rely on past experience. The common explanations of these results assume that the contradicting anomalies reflect situation-specific processes that involve the weighting of subjective values and the use of simple heuristics. The current article analyzes 14 choice anomalies that have been described by different models, including the Allais, St. Petersburg, and Ellsberg paradoxes, and the reflection effect. Next, it uses a choice prediction competition methodology to clarify the interaction between the different anomalies. It focuses on decisions under risk (known payoff distributions) and under ambiguity (unknown probabilities), with and without feedback concerning the outcomes of past choices. The results demonstrate that it is not necessary to assume situation-specific processes. The distinct anomalies can be captured by assuming high sensitivity to the expected return and 4 additional tendencies: pessimism, bias toward equal weighting, sensitivity to payoff sign, and an effort to minimize the probability of immediate regret. Importantly, feedback increases sensitivity to probability of regret. Simple abstractions of these assumptions, variants of the model Best Estimate and Sampling Tools (BEAST), allow surprisingly accurate ex ante predictions of behavior. Unlike the popular models, BEAST does not assume subjective weighting functions or cognitive shortcuts. Rather, it assumes the use of sampling tools and reliance on small samples, in addition to the estimation of the expected values. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
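
    One of the assumptions above, reliance on small samples, already reproduces the experience-based underweighting of rare events. The toy simulation below is not BEAST itself: each option is judged solely by the mean of a few simulated draws, so a rare large loss is often absent from the sample and the risky option looks attractive despite its lower expected value. All prospect values are invented.

```python
import random

def sample_mean(outcomes, probs, k, rng):
    """Estimate a prospect's value from a small sample of k simulated draws."""
    draws = rng.choices(outcomes, weights=probs, k=k)
    return sum(draws) / k

def choose(prospect_a, prospect_b, k=5, trials=10000, seed=0):
    """Fraction of trials on which A is chosen when each option is judged
    by a small sample of its outcomes."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        if sample_mean(*prospect_a, k, rng) > sample_mean(*prospect_b, k, rng):
            wins += 1
    return wins / trials

# Rare large loss: -20 with p = .1, else +1 (EV = -1.1), versus a sure 0.
risky = ([-20, 1], [0.1, 0.9])
safe = ([0], [1.0])
p_risky = choose(risky, safe)
# With k = 5, the loss is missing from the sample about 0.9**5 ~ 59% of the time.
print(p_risky)
```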

  6. Arguments, contradictions, resistances, and conceptual change in students' understanding of atomic structure

    NASA Astrophysics Data System (ADS)

    Niaz, Mansoor; Aguilera, Damarys; Maza, Arelys; Liendo, Gustavo

    2002-07-01

    Most general chemistry courses and textbooks emphasize experimental details and lack a history and philosophy of science perspective. The objective of this study is to facilitate freshman general chemistry students' understanding of atomic structure based on the work of Thomson, Rutherford, and Bohr. It is hypothesized that classroom discussions based on arguments/counterarguments of the heuristic principles, on which these scientists based their atomic models, can facilitate students' conceptual understanding. This study is based on 160 freshman students enrolled in six sections of General Chemistry I (three sections formed part of the experimental group). All three models (Thomson, Rutherford, and Bohr) were presented to the experimental and control group students in the traditional manner, as found in most textbooks. After this, the three sections of the experimental group participated in the discussion of six items with alternative responses. Students were first asked to select a response and then participate in classroom discussions leading to arguments in favor or against the selected response and finally select a new response. Three weeks after having discussed the six items, both the experimental and control groups presented a monthly exam (based on the three models) and after another 3 weeks a semester exam. Results obtained show that given the opportunity to argue and discuss, students' understanding can go beyond the simple regurgitation of experimental details. Performance of the experimental group showed contradictions, resistances, and progressive conceptual change with considerable and consistent improvement in the last item. 
It is concluded that if we want our students to understand scientific progress and practice, then it is important that we include the experimental details not as a rhetoric of conclusions (Schwab, 1962, The teaching of science as enquiry, Cambridge, MA, Harvard University Press; Schwab, 1974, Conflicting conceptions of curriculum, Berkeley, CA, McCutchan) but as heuristic principles (Lakatos, 1970, Criticism and the growth of knowledge, Cambridge, UK, Cambridge University Press, pp. 91-195), which were based on arguments, controversies, and interpretations of the scientists.

  7. A single cognitive heuristic process meets the complexity of domain-specific moral heuristics.

    PubMed

    Dubljević, Veljko; Racine, Eric

    2014-10-01

    The inherence heuristic (a) offers modest insights into the complex nature of both the is-ought tension in moral reasoning and moral reasoning per se, and (b) does not reflect the complexity of domain-specific moral heuristics. Formal and general in nature, we contextualize the process described as "inherence heuristic" in a web of domain-specific heuristics (e.g., agent specific; action specific; consequences specific).

  8. Transnational gestational surrogacy: does it have to be exploitative?

    PubMed

    Kirby, Jeffrey

    2014-01-01

    This article explores the controversial practice of transnational gestational surrogacy and poses a provocative question: Does it have to be exploitative? Various existing models of exploitation are considered and a novel exploitation-evaluation heuristic is introduced to assist in the analysis of the potentially exploitative dimensions/elements of complex health-related practices. On the basis of application of the heuristic, I conclude that transnational gestational surrogacy, as currently practiced in low-income country settings (such as rural, western India), is exploitative of surrogate women. Arising out of consideration of the heuristic's exploitation conditions, a set of public education and enabled choice, enhanced protections, and empowerment reforms to transnational gestational surrogacy practice is proposed that, if incorporated into a national regulatory framework and actualized within a low income country, could possibly render such practice nonexploitative.

  9. Discussion and revision of the mathematical modeling tool described in the previously published article "Modeling HIV Transmission risk among Mozambicans prior to their initiating highly active antiretroviral therapy".

    PubMed

    Cassels, Susan; Pearson, Cynthia R; Kurth, Ann E; Martin, Diane P; Simoni, Jane M; Matediana, Eduardo; Gloyd, Stephen

    2009-07-01

    Mathematical models are increasingly used in social and behavioral studies of HIV transmission; however, model structures must be chosen carefully to best answer the question at hand and conclusions must be interpreted cautiously. In Pearson et al. (2007), we presented a simple analytically tractable deterministic model to estimate the number of secondary HIV infections stemming from a population of HIV-positive Mozambicans and to evaluate how the estimate would change under different treatment and behavioral scenarios. In a subsequent application of the model with a different data set, we observed that the model produced an unduly conservative estimate of the number of new HIV-1 infections. In this brief report, our first aim is to describe a revision of the model to correct for this underestimation. Specifically, we recommend adjusting the population-level sexually transmitted infection (STI) parameters to be applicable to the individual-level model specification by accounting for the proportion of individuals uninfected with an STI. In applying the revised model to the original data, we noted an estimated 40 infections/1000 HIV-positive persons per year (versus the original 23 infections/1000 HIV-positive persons per year). In addition, the revised model estimated that highly active antiretroviral therapy (HAART) along with syphilis and herpes simplex virus type 2 (HSV-2) treatments combined could reduce HIV-1 transmission by 72% (versus 86% according to the original model). The second aim of this report is to discuss the advantages and disadvantages of mathematical models in the field and the implications of model interpretation. We caution that simple models should be used for heuristic purposes only. 
Since these models do not account for heterogeneity in the population and significantly simplify HIV transmission dynamics, they should be used to describe general characteristics of the epidemic and demonstrate the importance or sensitivity of parameters in the model.

  10. An OpenEarth Framework (OEF) for Integrating and Visualizing Earth Science Data

    NASA Astrophysics Data System (ADS)

    Moreland, J. L.; Nadeau, D. R.; Baru, C.; Crosby, C. J.

    2009-12-01

    The integration of data is essential to make transformative progress in understanding the complex processes operating at the Earth's surface and within its interior. While our current ability to collect massive amounts of data, develop structural models, and generate high-resolution dynamics models is well developed, our ability to quantitatively integrate these data and models into holistic interpretations of Earth systems is poorly developed. We lack the basic tools to realize a first-order goal of Earth science: developing integrated 4D models of Earth structure and processes using the complete range of available constraints, at a time when the research agendas of major efforts such as EarthScope demand such a capability. Among the challenges to 3D data integration are data that may be in different coordinate spaces, units, value ranges, file formats, and data structures. While several file format standards exist, they are infrequently or incorrectly used. Metadata are often missing, misleading, or relegated to README text files alongside the data. This leaves much of the integration effort bogged down in simple data management tasks. The OpenEarth Framework (OEF) being developed by GEON addresses these data management difficulties. The software incorporates file format parsers, data interpretation heuristics, user interfaces that prompt for missing information, and visualization techniques that merge data into a common visual model. The OEF's data access libraries parse formal and de facto standard file formats and map their data into a common data model. The software handles file format quirks, storage details, caching, local and remote file access, and web service protocols. Heuristics are used to determine coordinate spaces, units, and other key data features. Where multiple data structure, naming, and file organization conventions exist, the heuristics check for each convention's use to find a high-confidence interpretation of the data. 
When no convention or embedded data yields a suitable answer, the user is prompted to fill in the blanks. The OEF's interaction libraries assist in the construction of user interfaces for data management. These libraries support data import, data prompting, data introspection, management of the contents of a common data model, and creation of derived data to support visualization. Finally, visualization libraries provide interactive visualization using an extended version of NASA WorldWind. The OEF viewer supports visualization of terrains, point clouds, 3D volumes, imagery, cutting planes, isosurfaces, and more. Data may be color-coded, shaded, and displayed above or below the terrain, always registered into a common coordinate space. The OEF architecture is open, and its cross-platform software libraries are available separately for use in other software projects, while modules from other projects may be integrated into the OEF to extend its features. The OEF is currently being used to visualize data from EarthScope-related research in the Western US.
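
    Format-detection heuristics of this kind typically begin with magic-number checks. The sketch below is illustrative, not the OEF code: the HDF5 and classic NetCDF signatures are real published magic bytes, while the "prompt the user" fallback is reduced to returning "unknown".

```python
# Published magic-number signatures for two common Earth-science formats.
MAGIC = {
    b"\x89HDF\r\n\x1a\n": "HDF5",
    b"CDF\x01": "NetCDF classic",
    b"CDF\x02": "NetCDF 64-bit offset",
}

def sniff_format(header: bytes) -> str:
    """Guess a file's format from its leading bytes; fall back to 'unknown'
    so a UI layer can prompt the user to fill in the blanks."""
    for magic, name in MAGIC.items():
        if header.startswith(magic):
            return name
    return "unknown"

print(sniff_format(b"\x89HDF\r\n\x1a\n" + b"\x00" * 8))  # HDF5
print(sniff_format(b"not a known format"))               # unknown
```

    A fuller sniffer would also try extension conventions and embedded metadata before giving up, mirroring the layered heuristics described above.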

  11. Properties of heuristic search strategies

    NASA Technical Reports Server (NTRS)

    Vanderbrug, G. J.

    1973-01-01

    A directed graph is used to model the search space of a state space representation with single-input operators, an AND/OR graph is used for problem reduction representations, and a theorem-proving graph is used for state space representations with multiple-input operators. These three graph models and heuristic strategies for searching them are surveyed. The completeness, admissibility, and optimality properties of search strategies that use the evaluation function f = (1 - omega)g + omega h are presented and interpreted using a representation of the search process in the plane. The use of multiple-output operators to imply dependent successors, and thus obtain a formalism which includes all three types of representations, is discussed.
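
    The evaluation function f = (1 - omega)g + omega h interpolates between uniform-cost search (omega = 0) and greedy best-first search (omega = 1), with omega = 0.5 equivalent to A* up to a constant factor. A minimal sketch on a unit-cost grid with the Manhattan distance as the heuristic; all names and the example problem are illustrative:

```python
import heapq

def weighted_search(start, goal, neighbors, h, w=0.5):
    """Best-first search ordered by f = (1 - w) * g + w * h(n).
    w = 0 is uniform-cost search; w = 1 is greedy best-first."""
    frontier = [(w * h(start), 0, start)]
    best_g = {start: 0}
    while frontier:
        _, g, node = heapq.heappop(frontier)
        if node == goal:
            return g
        for nxt, cost in neighbors(node):
            ng = g + cost
            if ng < best_g.get(nxt, float("inf")):
                best_g[nxt] = ng
                heapq.heappush(frontier, ((1 - w) * ng + w * h(nxt), ng, nxt))
    return None

# 2D grid with unit moves; Manhattan distance is admissible here.
def neighbors(p):
    x, y = p
    return [((x + 1, y), 1), ((x - 1, y), 1), ((x, y + 1), 1), ((x, y - 1), 1)]

h = lambda p: abs(p[0] - 3) + abs(p[1] - 4)
print(weighted_search((0, 0), (3, 4), neighbors, h))  # 7
```

    With w = 0.5 the ordering is a positive rescaling of g + h, so the admissibility guarantees of A* carry over; pushing w toward 1 trades optimality for speed.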

  12. Heuristic use of perceptual evidence leads to dissociation between performance and metacognitive sensitivity.

    PubMed

    Maniscalco, Brian; Peters, Megan A K; Lau, Hakwan

    2016-04-01

    Zylberberg, Barttfeld, and Sigman (Frontiers in Integrative Neuroscience, 6:79, 2012) found that confidence decisions, but not perceptual decisions, are insensitive to evidence against a selected perceptual choice. We present a signal detection theoretic model to formalize this insight, which gives rise to a counterintuitive empirical prediction: that depending on the observer's perceptual choice, increasing task performance can be associated with decreasing metacognitive sensitivity (i.e., the trial-by-trial correspondence between confidence and accuracy). The model also explains why metacognitive sensitivity tends to be less than optimal in actual subjects. These predictions were confirmed robustly in a psychophysics experiment. In a second experiment we found that, in at least some subjects, the effects replicated even under performance feedback designed to encourage optimal behavior. However, some subjects did show improvement under feedback, suggesting that the tendency to ignore evidence against a selected perceptual choice may be a heuristic adopted by the perceptual decision-making system rather than an inherent biological limitation. We present a Bayesian modeling framework that explains why this heuristic strategy may be advantageous in real-world contexts.

  13. Reconciling intuitive physics and Newtonian mechanics for colliding objects.

    PubMed

    Sanborn, Adam N; Mansinghka, Vikash K; Griffiths, Thomas L

    2013-04-01

    People have strong intuitions about the influence objects exert upon one another when they collide. Because people's judgments appear to deviate from Newtonian mechanics, psychologists have suggested that people depend on a variety of task-specific heuristics. This leaves open the question of how these heuristics could be chosen, and how to integrate them into a unified model that can explain human judgments across a wide range of physical reasoning tasks. We propose an alternative framework, in which people's judgments are based on optimal statistical inference over a Newtonian physical model that incorporates sensory noise and intrinsic uncertainty about the physical properties of the objects being viewed. This noisy Newton framework can be applied to a multitude of judgments, with people's answers determined by the uncertainty they have for physical variables and the constraints of Newtonian mechanics. We investigate a range of effects in mass judgments that have been taken as strong evidence for heuristic use and show that they are well explained by the interplay between Newtonian constraints and sensory uncertainty. We also consider an extended model that handles causality judgments, and obtain good quantitative agreement with human judgments across tasks that involve different judgment types with a single consistent set of parameters.
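
    The noisy Newton idea can be made concrete with a one-dimensional collision: momentum conservation constrains the mass ratio r = m1/m2 via r(u1 - v1) = v2 - u2 (u = pre-collision, v = post-collision velocities), and sensory noise turns that hard constraint into a likelihood. The grid inference below is a toy illustration under an assumed Gaussian noise model, not the authors' model; the observed velocities and noise level are invented.

```python
import math

def posterior_mass_ratio(obs, sigma, ratios):
    """Grid posterior over the mass ratio r = m1/m2 for a 1D collision,
    scoring each candidate r by the residual of the momentum constraint
    r * (u1 - v1) = v2 - u2 under Gaussian observation noise."""
    u1, u2, v1, v2 = obs
    weights = []
    for r in ratios:
        resid = r * (u1 - v1) - (v2 - u2)
        weights.append(math.exp(-resid ** 2 / (2 * sigma ** 2)))
    z = sum(weights)
    return [w / z for w in weights]

ratios = [0.5 + 0.1 * i for i in range(20)]  # candidate ratios 0.5 .. 2.4
obs = (1.0, 0.0, 0.2, 1.6)                   # observed (u1, u2, v1, v2); consistent with r = 2
post = posterior_mass_ratio(obs, sigma=0.3, ratios=ratios)
print(ratios[post.index(max(post))])         # ratio with the highest posterior
```

    Judgment biases then fall out of where the noise sits: larger sigma flattens the posterior, mimicking the systematic mass-judgment effects the abstract attributes to sensory uncertainty rather than to heuristics.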

  14. The dynamics of decision making in risky choice: an eye-tracking analysis.

    PubMed

    Fiedler, Susann; Glöckner, Andreas

    2012-01-01

    In recent years, research on risky choice has moved beyond analyzing choices alone. Models have been suggested that aim to describe the underlying cognitive processes, and some studies have tested the process predictions of these models. Prominent approaches are evidence accumulation models such as decision field theory (DFT), simple serial heuristic models such as the adaptive toolbox, and connectionist approaches such as the parallel constraint satisfaction (PCS) model. In two studies involving measures of attention and pupil dilation, we investigate hypotheses derived from these models in choices between two gambles with two outcomes each. We show that attention to an outcome of a gamble increases with its probability and its value, and that attention shifts toward the subsequently favored gamble after about two thirds of the decision process, indicating a gaze-cascade effect. Information search occurs mostly within gambles, and the direction of search does not change over the course of decision making. Pupil dilation, which reflects both cognitive effort and arousal, increases during the decision process and increases with mean expected value. Overall, the results support aspects of automatic integration models for risky choice such as DFT and PCS, but in their current specification none of them can account for the full pattern of results.

  15. EDNA: Expert fault digraph analysis using CLIPS

    NASA Technical Reports Server (NTRS)

    Dixit, Vishweshwar V.

    1990-01-01

    Traditionally, fault models are represented by trees. Recently, digraph models have been proposed (Sack). Digraph models closely imitate the real system dependencies and hence are easy to develop, validate, and maintain. However, they can also contain directed cycles, and analysis algorithms are hard to find; available algorithms tend to be complicated and slow. On the other hand, tree analysis (VGRH, Tayl) is well understood and rooted in a vast research effort and analytical techniques; the tree analysis algorithms are sophisticated and orders of magnitude faster. Transformation of a (cyclic) digraph into trees (CLP, LP) is a viable approach to blend the advantages of the two representations. Neither the digraphs nor the trees provide the ability to handle heuristic knowledge; an expert system is essential to capture the engineering knowledge. We propose an approach here, namely expert network analysis, which combines the digraph representation with tree algorithms. The models are augmented by probabilistic and heuristic knowledge. CLIPS, an expert system shell from NASA-JSC, will be used to develop a tool. The technique provides the ability to handle probabilities and heuristic knowledge, and mixed analysis (some nodes with probabilities) is possible. The tool provides a graphics interface for input, query, and update. With the combined approach, it is expected to be a valuable tool in the design process as well as in the capture of final design knowledge.

  16. Causal strength induction from time series data.

    PubMed

    Soo, Kevin W; Rottman, Benjamin M

    2018-04-01

    One challenge when inferring the strength of cause-effect relations from time series data is that the cause and/or effect can exhibit temporal trends. If temporal trends are not accounted for, a learner could infer that a causal relation exists when it does not, or even infer that there is a positive causal relation when the relation is negative, or vice versa. We propose that learners use a simple heuristic to control for temporal trends: they focus not on the states of the cause and effect at a given instant, but on how the cause and effect change from one observation to the next, which we call transitions. Six experiments were conducted to understand how people infer causal strength from time series data. We found that participants indeed use transitions in addition to states, which helps them to reach more accurate causal judgments (Experiments 1A and 1B). Participants use transitions more when the stimuli are presented in a naturalistic visual format than a numerical format (Experiment 2), and the effect of transitions is not driven by primacy or recency effects (Experiment 3). Finally, we found that participants primarily use the direction in which variables change rather than the magnitude of the change for estimating causal strength (Experiments 4 and 5). Collectively, these studies provide evidence that people often use a simple yet effective heuristic for inferring causal strength from time series data. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
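
    The state/transition contrast is easy to make concrete: when cause and effect share an upward trend, their states correlate positively even though their observation-to-observation changes move in opposite directions. A small sketch with invented series, where a plain Pearson correlation stands in for whatever strength estimate a learner computes:

```python
def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def transitions(series):
    """Observation-to-observation changes, the quantity the heuristic tracks."""
    return [b - a for a, b in zip(series, series[1:])]

# Invented series: both trend upward, but their step-by-step changes
# are out of phase, so the transition-based relation is negative.
cause = [1, 3, 2, 5, 4, 7, 6, 9]
effect = [2, 1, 4, 3, 6, 5, 8, 7]

print(round(pearson(cause, effect), 2))                            # state-based: positive
print(round(pearson(transitions(cause), transitions(effect)), 2))  # transition-based: negative
```

    A state-based learner would report a positive causal relation here; a transition-based learner controls for the shared trend and reports the opposite sign, which is the error the heuristic protects against.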

  17. A scalable method for identifying frequent subtrees in sets of large phylogenetic trees.

    PubMed

    Ramu, Avinash; Kahveci, Tamer; Burleigh, J Gordon

    2012-10-03

    We consider the problem of finding the maximum frequent agreement subtrees (MFASTs) in a collection of phylogenetic trees. Existing methods for this problem often do not scale beyond datasets with around 100 taxa. Our goal is to address this problem for datasets with over a thousand taxa and hundreds of trees. We develop a heuristic solution that aims to find MFASTs in sets of many, large phylogenetic trees. Our method works in multiple phases. In the first phase, it identifies small candidate subtrees from the set of input trees which serve as the seeds of larger subtrees. In the second phase, it combines these small seeds to build larger candidate MFASTs. In the final phase, it performs a post-processing step that ensures that we find a frequent agreement subtree that is not contained in a larger frequent agreement subtree. We demonstrate that this heuristic can easily handle data sets with 1000 taxa, greatly extending the estimation of MFASTs beyond current methods. Although this heuristic does not guarantee to find all MFASTs or the largest MFAST, it found the MFAST in all of our synthetic datasets where we could verify the correctness of the result. It also performed well on large empirical data sets. Its performance is robust to the number and size of the input trees. Overall, this method provides a simple and fast way to identify strongly supported subtrees within large phylogenetic hypotheses.
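    The three-phase seed/combine/filter structure described above can be sketched schematically. This is a toy stand-in, not the paper's method: here a "subtree" is reduced to a cluster (a frozenset of taxa), and it "occurs" in a tree if it is one of that tree's clusters; the real algorithm tests topological agreement, which this sketch does not.

    ```python
    def maximal_frequent_clusters(trees, min_freq=0.6):
        """trees: list of sets of frozensets (each tree given by its clusters)."""
        def freq(c):
            return sum(c in t for t in trees) / len(trees)

        # Phase 1: seeds = small candidates that are individually frequent
        seeds = {c for t in trees for c in t if freq(c) >= min_freq}

        # Phase 2: combine pairs of seeds into larger candidates, keeping
        # only unions that are still frequent
        candidates = set(seeds)
        for a in seeds:
            for b in seeds:
                u = a | b
                if freq(u) >= min_freq:
                    candidates.add(u)

        # Phase 3: post-process so that no reported candidate is contained
        # in a larger frequent candidate
        return {c for c in candidates if not any(c < d for d in candidates)}
    ```

    For example, if two of three input trees contain the cluster {a, b}, that union survives all three phases while its frequent sub-clusters {a} and {b} are filtered out as non-maximal.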

  18. A scalable method for identifying frequent subtrees in sets of large phylogenetic trees

    PubMed Central

    2012-01-01

    Background We consider the problem of finding the maximum frequent agreement subtrees (MFASTs) in a collection of phylogenetic trees. Existing methods for this problem often do not scale beyond datasets with around 100 taxa. Our goal is to address this problem for datasets with over a thousand taxa and hundreds of trees. Results We develop a heuristic solution that aims to find MFASTs in sets of many, large phylogenetic trees. Our method works in multiple phases. In the first phase, it identifies small candidate subtrees from the set of input trees which serve as the seeds of larger subtrees. In the second phase, it combines these small seeds to build larger candidate MFASTs. In the final phase, it performs a post-processing step that ensures that we find a frequent agreement subtree that is not contained in a larger frequent agreement subtree. We demonstrate that this heuristic can easily handle data sets with 1000 taxa, greatly extending the estimation of MFASTs beyond current methods. Conclusions Although this heuristic does not guarantee to find all MFASTs or the largest MFAST, it found the MFAST in all of our synthetic datasets where we could verify the correctness of the result. It also performed well on large empirical data sets. Its performance is robust to the number and size of the input trees. Overall, this method provides a simple and fast way to identify strongly supported subtrees within large phylogenetic hypotheses. PMID:23033843

  19. Exploring the quantum speed limit with computer games

    NASA Astrophysics Data System (ADS)

    Sørensen, Jens Jakob W. H.; Pedersen, Mads Kock; Munch, Michael; Haikka, Pinja; Jensen, Jesper Halkjær; Planke, Tilo; Andreasen, Morten Ginnerup; Gajdacz, Miroslav; Mølmer, Klaus; Lieberoth, Andreas; Sherson, Jacob F.

    2016-04-01

    Humans routinely solve problems of immense computational complexity by intuitively forming simple, low-dimensional heuristic strategies. Citizen science (or crowd sourcing) is a way of exploiting this ability by presenting scientific research problems to non-experts. ‘Gamification’—the application of game elements in a non-game context—is an effective tool with which to enable citizen scientists to provide solutions to research problems. The citizen science games Foldit, EteRNA and EyeWire have been used successfully to study protein and RNA folding and neuron mapping, but so far gamification has not been applied to problems in quantum physics. Here we report on Quantum Moves, an online platform gamifying optimization problems in quantum physics. We show that human players are able to find solutions to difficult problems associated with the task of quantum computing. Players succeed where purely numerical optimization fails, and analyses of their solutions provide insights into the problem of optimization of a more profound and general nature. Using player strategies, we have thus developed a few-parameter heuristic optimization method that efficiently outperforms the most prominent established numerical methods. The numerical complexity associated with time-optimal solutions increases for shorter process durations. To understand this better, we produced a low-dimensional rendering of the optimization landscape. This rendering reveals why traditional optimization methods fail near the quantum speed limit (that is, the shortest process duration with perfect fidelity). Combined analyses of optimization landscapes and heuristic solution strategies may benefit wider classes of optimization problems in quantum physics and beyond.

  20. Exploring the quantum speed limit with computer games.

    PubMed

    Sørensen, Jens Jakob W H; Pedersen, Mads Kock; Munch, Michael; Haikka, Pinja; Jensen, Jesper Halkjær; Planke, Tilo; Andreasen, Morten Ginnerup; Gajdacz, Miroslav; Mølmer, Klaus; Lieberoth, Andreas; Sherson, Jacob F

    2016-04-14

    Humans routinely solve problems of immense computational complexity by intuitively forming simple, low-dimensional heuristic strategies. Citizen science (or crowd sourcing) is a way of exploiting this ability by presenting scientific research problems to non-experts. 'Gamification'--the application of game elements in a non-game context--is an effective tool with which to enable citizen scientists to provide solutions to research problems. The citizen science games Foldit, EteRNA and EyeWire have been used successfully to study protein and RNA folding and neuron mapping, but so far gamification has not been applied to problems in quantum physics. Here we report on Quantum Moves, an online platform gamifying optimization problems in quantum physics. We show that human players are able to find solutions to difficult problems associated with the task of quantum computing. Players succeed where purely numerical optimization fails, and analyses of their solutions provide insights into the problem of optimization of a more profound and general nature. Using player strategies, we have thus developed a few-parameter heuristic optimization method that efficiently outperforms the most prominent established numerical methods. The numerical complexity associated with time-optimal solutions increases for shorter process durations. To understand this better, we produced a low-dimensional rendering of the optimization landscape. This rendering reveals why traditional optimization methods fail near the quantum speed limit (that is, the shortest process duration with perfect fidelity). Combined analyses of optimization landscapes and heuristic solution strategies may benefit wider classes of optimization problems in quantum physics and beyond.

  1. Magnitude comparison extended: how lack of knowledge informs comparative judgments under uncertainty.

    PubMed

    Schweickart, Oliver; Brown, Norman R

    2014-02-01

    How do people compare quantitative attributes of real-world objects? (e.g., Which country has the higher per capita GDP, Mauritania or Nepal?). The research literature on this question is divided: Although researchers in the 1970s and 1980s assumed that a 2-stage magnitude comparison process underlies these types of judgments (Banks, 1977), more recent approaches emphasize the role of probabilistic cues and simple heuristics (Gigerenzer, Todd, & The ABC Research Group, 1999). In this article, we review the magnitude comparison literature and propose a framework for magnitude comparison under uncertainty (MaC). Predictions from this framework were tested in a choice context involving one recognized and one unrecognized object, and were contrasted with those based on the recognition heuristic (Goldstein & Gigerenzer, 2002). This was done in 2 paired-comparison studies. In both, participants were timed as they decided which of 2 countries had the higher per capita gross domestic product (GDP). Consistent with the MaC account, we found that response times (RTs) displayed a classic symbolic distance effect: RTs were inversely related to the difference between the subjective per capita GDPs of the compared countries. Furthermore, choice of the recognized country became more frequent as subjective difference increased. These results indicate that the magnitude comparison process extends to choice contexts that have previously been associated only with cue-based strategies. We end by discussing how several findings reported in the recent heuristics literature relate to the MaC framework.

  2. Heuristics Applied in the Development of Advanced Space Mission Concepts

    NASA Technical Reports Server (NTRS)

    Nilsen, Erik N.

    1998-01-01

    Advanced mission studies are the first step in determining the feasibility of a given space exploration concept. A space scientist develops a science goal in the exploration of space. This may be a new observation method, a new instrument or a mission concept to explore a solar system body. In order to determine the feasibility of a deep space mission, a concept study is convened to determine the technology needs and estimated cost of performing that mission. Heuristics are one method of defining viable mission and systems architectures that can be assessed for technology readiness and cost. Developing a viable architecture depends to a large extent upon extending the existing body of knowledge, and applying it in new and novel ways. These heuristics have evolved over time to include methods for estimating technical complexity, technology development, cost modeling and mission risk in the unique context of deep space missions. This paper examines the processes involved in performing these advanced concepts studies, and analyzes the application of heuristics in the development of an advanced in-situ planetary mission. The Venus Surface Sample Return mission study provides a context for the examination of the heuristics applied in the development of the mission and systems architecture. This study is illustrative of the effort involved in the initial assessment of an advance mission concept, and the knowledge and tools that are applied.

  3. Heuristics for Relevancy Ranking of Earth Dataset Search Results

    NASA Astrophysics Data System (ADS)

    Lynnes, C.; Quinn, P.; Norton, J.

    2016-12-01

    As the Variety of Earth science datasets increases, science researchers find it more challenging to discover and select the datasets that best fit their needs. The most common way for search providers to address this problem is to rank the datasets returned for a query by their likely relevance to the user. Large web page search engines typically use text matching supplemented with reverse link counts, semantic annotations and user intent modeling. However, this produces uneven results when applied to dataset metadata records simply externalized as a web page. Fortunately, data and search providers have decades of experience in serving data user communities, allowing them to form heuristics that leverage the structure in the metadata together with knowledge about the user community. Some of these heuristics include specific ways of matching the user input to the essential measurements in the dataset and determining overlaps of time range and spatial areas. Heuristics based on the novelty of the datasets can prioritize later, better versions of data over similar predecessors. And knowledge of how different user types and communities use data can be brought to bear in cases where characteristics of the user (discipline, expertise) or their intent (applications, research) can be divined. The Earth Observing System Data and Information System has begun implementing some of these heuristics in the relevancy algorithm of its Common Metadata Repository search engine.
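    Heuristics like these can be combined into a single relevance score. The sketch below is a hypothetical illustration only: the field names, weights, and scoring terms are assumptions for demonstration, not the Common Metadata Repository algorithm.

    ```python
    def relevance(query, dataset):
        """Toy relevance score: measurement-term matches, temporal overlap
        with the query range, and a boost for later dataset versions."""
        score = 0.0
        # heuristic 1: match query terms against the dataset's measurements
        score += 2.0 * len(set(query["terms"]) & set(dataset["measurements"]))
        # heuristic 2: fraction of the query time range the dataset covers
        q0, q1 = query["time"]
        d0, d1 = dataset["time"]
        overlap = max(0, min(q1, d1) - max(q0, d0))
        score += overlap / (q1 - q0)
        # heuristic 3: novelty -- prefer later versions of similar data
        score += 0.5 * dataset["version"]
        return score

    query = {"terms": ["sst"], "time": (2000, 2010)}
    dataset = {"measurements": ["sst", "wind"], "time": (2005, 2015), "version": 2}
    print(relevance(query, dataset))  # prints 3.5
    ```

    A real implementation would add spatial overlap and user-community terms in the same additive style, which is what makes such heuristic scores easy to tune incrementally.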

  4. Heuristics for Relevancy Ranking of Earth Dataset Search Results

    NASA Technical Reports Server (NTRS)

    Lynnes, Christopher; Quinn, Patrick; Norton, James

    2016-01-01

    As the Variety of Earth science datasets increases, science researchers find it more challenging to discover and select the datasets that best fit their needs. The most common way for search providers to address this problem is to rank the datasets returned for a query by their likely relevance to the user. Large web page search engines typically use text matching supplemented with reverse link counts, semantic annotations and user intent modeling. However, this produces uneven results when applied to dataset metadata records simply externalized as a web page. Fortunately, data and search providers have decades of experience in serving data user communities, allowing them to form heuristics that leverage the structure in the metadata together with knowledge about the user community. Some of these heuristics include specific ways of matching the user input to the essential measurements in the dataset and determining overlaps of time range and spatial areas. Heuristics based on the novelty of the datasets can prioritize later, better versions of data over similar predecessors. And knowledge of how different user types and communities use data can be brought to bear in cases where characteristics of the user (discipline, expertise) or their intent (applications, research) can be divined. The Earth Observing System Data and Information System has begun implementing some of these heuristics in the relevancy algorithm of its Common Metadata Repository search engine.

  5. A method and implementation for incorporating heuristic knowledge into a state estimator through the use of a fuzzy model

    NASA Astrophysics Data System (ADS)

    Swanson, Steven Roy

    The objective of the dissertation is to improve state estimation performance, as compared to a Kalman filter, when non-constant, or changing, biases exist in the measurement data. The state estimation performance increase will come from the use of a fuzzy model to determine the position and velocity gains of a state estimator. A method is proposed for incorporating heuristic knowledge into a state estimator through the use of a fuzzy model. This method consists of using a fuzzy model to determine the gains of the state estimator, converting the heuristic knowledge into the fuzzy model, and then optimizing the fuzzy model with a genetic algorithm. This method is applied to the problem of state estimation of a cascaded global positioning system (GPS)/inertial reference unit (IRU) navigation system. The GPS position data contains two major sources for position bias. The first bias is due to satellite errors and the second is due to the time delay or lag from when the GPS position is calculated until it is used in the state estimator. When a change in the bias of the measurement data occurs, a state estimator will converge on the new measurement data solution. This will introduce errors into a Kalman filter's estimated state velocities, which in turn will cause a position overshoot as it converges. By using a fuzzy model to determine the gains of a state estimator, the velocity errors and their associated deficiencies can be reduced.
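    The core idea of scheduling estimator gains with a fuzzy model can be shown in miniature. The membership shapes, gain values, and single-rule structure below are illustrative assumptions, not the dissertation's tuned model: a large measurement residual is treated as a suspected bias jump and receives a small gain.

    ```python
    def membership_small(r, lo=0.5, hi=2.0):
        """Degree to which residual |r| counts as 'small': 1 below lo,
        0 above hi, linear in between."""
        r = abs(r)
        if r <= lo:
            return 1.0
        if r >= hi:
            return 0.0
        return (hi - r) / (hi - lo)

    def fuzzy_gain(residual, k_small=0.8, k_large=0.1):
        """Defuzzify two rules ('small residual -> trust measurement',
        'large residual -> distrust it') by a weighted average of gains."""
        w = membership_small(residual)
        return w * k_small + (1 - w) * k_large

    # Scalar position estimate; the 5.0 measurement looks like a bias jump,
    # so the fuzzy gain damps its effect instead of converging onto it.
    x_est = 0.0
    for z in [0.1, 0.2, 5.0, 0.3]:
        r = z - x_est
        x_est += fuzzy_gain(r) * r
    ```

    A fixed-gain filter would chase the biased measurement and then overshoot on the way back; the fuzzy gain trades some responsiveness for robustness to changing biases, which is the behavior the abstract targets.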

  6. Network Aggregation in Transportation Planning : Volume I : Summary and Survey

    DOT National Transportation Integrated Search

    1978-04-01

    Volume 1 summarizes research on network aggregation in transportation models. It includes a survey of network aggregation practices, definition of an extraction aggregation model, computational results on a heuristic implementation of the model, and ...

  7. A Hierarchy of Heuristic-Based Models of Crowd Dynamics

    NASA Astrophysics Data System (ADS)

    Degond, P.; Appert-Rolland, C.; Moussaïd, M.; Pettré, J.; Theraulaz, G.

    2013-09-01

    We derive a hierarchy of kinetic and macroscopic models from a noisy variant of the heuristic behavioral Individual-Based Model of Ngai et al. (Disaster Med. Public Health Prep. 3:191-195, 2009) where pedestrians are supposed to have constant speeds. This IBM supposes that pedestrians seek the best compromise between navigation towards their target and collision avoidance. We first propose a kinetic model for the probability distribution function of pedestrians. Then, we derive fluid models and propose three different closure relations. The first two closures assume that the velocity distribution function is either a Dirac delta or a von Mises-Fisher distribution respectively. The third closure results from a hydrodynamic limit associated to a Local Thermodynamical Equilibrium. We develop an analogy between this equilibrium and Nash equilibria in a game theoretic framework. In each case, we discuss the features of the models and their suitability for practical use.

  8. Valiant load-balanced robust routing under hose model for WDM mesh networks

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaoning; Li, Lemin; Wang, Sheng

    2006-09-01

    In this paper, we propose a Valiant Load-Balanced robust routing scheme for WDM mesh networks under the model of polyhedral uncertainty (i.e., the hose model), and the proposed routing scheme is implemented with a traffic-grooming approach. Our objective is to maximize the hose model throughput. A mathematical formulation of Valiant Load-Balanced robust routing is presented and three fast heuristic algorithms are also proposed. When implementing the Valiant Load-Balanced robust routing scheme in WDM mesh networks, a novel traffic-grooming algorithm called MHF (minimizing hop first) is proposed. We compare the three heuristic algorithms with the VPN tree under the hose model. Finally, we demonstrate in the simulation results that MHF with the Valiant Load-Balanced robust routing scheme outperforms the traditional traffic-grooming algorithm in terms of throughput for uniform/non-uniform traffic matrices under the hose model.

  9. Heuristic Evaluation of E-Learning Courses: A Comparative Analysis of Two E-Learning Heuristic Sets

    ERIC Educational Resources Information Center

    Zaharias, Panagiotis; Koutsabasis, Panayiotis

    2012-01-01

    Purpose: The purpose of this paper is to discuss heuristic evaluation as a method for evaluating e-learning courses and applications and more specifically to investigate the applicability and empirical use of two customized e-learning heuristic protocols. Design/methodology/approach: Two representative e-learning heuristic protocols were chosen…

  10. Heuristics can produce surprisingly rational probability estimates: Comment on Costello and Watts (2014).

    PubMed

    Nilsson, Håkan; Juslin, Peter; Winman, Anders

    2016-01-01

    Costello and Watts (2014) present a model assuming that people's knowledge of probabilities adheres to probability theory, but that their probability judgments are perturbed by a random noise in the retrieval from memory. Predictions for the relationships between probability judgments for constituent events and their disjunctions and conjunctions, as well as for sums of such judgments were derived from probability theory. Costello and Watts (2014) report behavioral data showing that subjective probability judgments accord with these predictions. Based on the finding that subjective probability judgments follow probability theory, Costello and Watts (2014) conclude that the results imply that people's probability judgments embody the rules of probability theory and thereby refute theories of heuristic processing. Here, we demonstrate the invalidity of this conclusion by showing that all of the tested predictions follow straightforwardly from an account assuming heuristic probability integration (Nilsson, Winman, Juslin, & Hansson, 2009). We end with a discussion of a number of previous findings that harmonize very poorly with the predictions by the model suggested by Costello and Watts (2014). (c) 2015 APA, all rights reserved.

  11. A Case Study of Controlling Crossover in a Selection Hyper-heuristic Framework Using the Multidimensional Knapsack Problem.

    PubMed

    Drake, John H; Özcan, Ender; Burke, Edmund K

    2016-01-01

    Hyper-heuristics are high-level methodologies for solving complex problems that operate on a search space of heuristics. In a selection hyper-heuristic framework, a heuristic is chosen from an existing set of low-level heuristics and applied to the current solution to produce a new solution at each point in the search. The use of crossover low-level heuristics is possible in an increasing number of general-purpose hyper-heuristic tools such as HyFlex and Hyperion. However, little work has been undertaken to assess how best to utilise it. Since a single-point search hyper-heuristic operates on a single candidate solution, and two candidate solutions are required for crossover, a mechanism is required to control the choice of the other solution. The frameworks we propose maintain a list of potential solutions for use in crossover. We investigate the use of such lists at two conceptual levels. First, crossover is controlled at the hyper-heuristic level where no problem-specific information is required. Second, it is controlled at the problem domain level where problem-specific information is used to produce good-quality solutions to use in crossover. A number of selection hyper-heuristics are compared using these frameworks over three benchmark libraries with varying properties for an NP-hard optimisation problem: the multidimensional 0-1 knapsack problem. It is shown that allowing crossover to be managed at the domain level outperforms managing crossover at the hyper-heuristic level in this problem domain.
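    The single-point search loop with a crossover memory can be sketched on a stand-in domain. Everything below is an illustrative assumption rather than the paper's HyFlex/Hyperion setup: the domain is one-max (maximize the number of 1-bits), the low-level heuristic set is just a bit flip and uniform crossover, and the memory policy keeps the most recent accepted solutions as potential crossover partners.

    ```python
    import random

    def bit_flip(s, rng):
        i = rng.randrange(len(s))
        return s[:i] + [1 - s[i]] + s[i + 1:]

    def uniform_crossover(s, partner, rng):
        return [a if rng.random() < 0.5 else b for a, b in zip(s, partner)]

    def hyper_heuristic(n=20, iters=500, mem_size=5, seed=0):
        rng = random.Random(seed)
        sol = [rng.randint(0, 1) for _ in range(n)]
        memory = [sol[:]]                  # list of potential crossover partners
        for _ in range(iters):
            if rng.random() < 0.3:         # select the crossover low-level heuristic
                cand = uniform_crossover(sol, rng.choice(memory), rng)
            else:                          # select a mutational low-level heuristic
                cand = bit_flip(sol, rng)
            if sum(cand) >= sum(sol):      # accept non-worsening moves
                sol = cand
                memory = (memory + [sol[:]])[-mem_size:]  # remember good solutions
        return sol
    ```

    Here the memory is managed at the hyper-heuristic level (no problem-specific information); the paper's second framework would instead fill the list with domain-informed solutions, which is the design choice its experiments compare.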

  12. Algorithms for Automatic Alignment of Arrays

    NASA Technical Reports Server (NTRS)

    Chatterjee, Siddhartha; Gilbert, John R.; Oliker, Leonid; Schreiber, Robert; Sheffler, Thomas J.

    1996-01-01

    Aggregate data objects (such as arrays) are distributed across the processor memories when compiling a data-parallel language for a distributed-memory machine. The mapping determines the amount of communication needed to bring operands of parallel operations into alignment with each other. A common approach is to break the mapping into two stages: an alignment that maps all the objects to an abstract template, followed by a distribution that maps the template to the processors. This paper describes algorithms for solving the various facets of the alignment problem: axis and stride alignment, static and mobile offset alignment, and replication labeling. We show that optimal axis and stride alignment is NP-complete for general program graphs, and give a heuristic method that can explore the space of possible solutions in a number of ways. We show that some of these strategies can give better solutions than a simple greedy approach proposed earlier. We also show how local graph contractions can reduce the size of the problem significantly without changing the best solution. This allows more complex and effective heuristics to be used. We show how to model the static offset alignment problem using linear programming, and we show that loop-dependent mobile offset alignment is sometimes necessary for optimum performance. We describe an algorithm for determining mobile alignments for objects within do loops. We also identify situations in which replicated alignment is either required by the program itself or can be used to improve performance. We describe an algorithm based on network flow that replicates objects so as to minimize the total amount of broadcast communication in replication.

  13. Learning process mapping heuristics under stochastic sampling overheads

    NASA Technical Reports Server (NTRS)

    Ieumwananonthachai, Arthur; Wah, Benjamin W.

    1991-01-01

    A statistical method was developed previously for improving process mapping heuristics. The method systematically explores the space of possible heuristics under a specified time constraint. Its goal is to get the best possible heuristics while trading between the solution quality of the process mapping heuristics and their execution time. The statistical selection method is extended to take into consideration the variations in the amount of time used to evaluate heuristics on a problem instance. The improvement in performance under this more realistic assumption is presented, along with some methods that alleviate the additional complexity.

  14. Analysis of the type II robotic mixed-model assembly line balancing problem

    NASA Astrophysics Data System (ADS)

    Çil, Zeynel Abidin; Mete, Süleyman; Ağpak, Kürşad

    2017-06-01

    In recent years, there has been an increasing trend towards using robots in production systems. Robots are used in different areas such as packaging, transportation, loading/unloading and especially assembly lines. One important step in taking advantage of robots on the assembly line is considering them while balancing the line. On the other hand, market conditions have increased the importance of mixed-model assembly lines. Therefore, in this article, the robotic mixed-model assembly line balancing problem is studied. The aim of this study is to develop a new efficient heuristic algorithm based on beam search in order to minimize the sum of cycle times over all models. In addition, mathematical models of the problem are presented for comparison. The proposed heuristic is tested on benchmark problems and compared with the optimal solutions. The results show that the algorithm is very competitive and is a promising tool for further research.
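    The beam-search skeleton such a heuristic builds on can be shown on a toy version of line balancing. This is a sketch under stated assumptions, not the article's algorithm: the toy problem assigns single-model task times to a fixed number of stations to minimize the largest station load (a proxy for cycle time), and a state is just the sorted tuple of station loads.

    ```python
    def beam_search_balance(times, stations, width=3):
        """Assign each task time to a station, keeping only the `width` most
        promising partial assignments at each level of the search."""
        beams = [(0,) * stations]          # one empty partial assignment
        for t in times:
            children = set()
            for loads in beams:
                for s in range(stations):  # branch: put task t on station s
                    new = list(loads)
                    new[s] += t
                    children.add(tuple(sorted(new)))  # sort to merge symmetric states
            # prune: keep the `width` states with the smallest peak load
            beams = sorted(children, key=max)[:width]
        return max(beams[0])
    ```

    With width 1 this degenerates to a greedy rule, while a large width approaches exhaustive search; choosing the beam width is the usual trade between solution quality and run time that makes beam search competitive with exact models on benchmarks.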

  15. Tuning Parameters in Heuristics by Using Design of Experiments Methods

    NASA Technical Reports Server (NTRS)

    Arin, Arif; Rabadi, Ghaith; Unal, Resit

    2010-01-01

    With the growing complexity of today's large scale problems, it has become more difficult to find optimal solutions by using exact mathematical methods. The need to find near-optimal solutions in an acceptable time frame requires heuristic approaches. In many cases, however, most heuristics have several parameters that need to be "tuned" before they can reach good results. The problem then turns into "finding the best parameter setting" for the heuristics to solve the problems efficiently and timely. The One-Factor-At-a-Time (OFAT) approach for parameter tuning neglects the interactions between parameters. Design of Experiments (DOE) tools can instead be employed to tune the parameters more effectively. In this paper, we seek the best parameter setting for a Genetic Algorithm (GA) to solve the single machine total weighted tardiness problem in which n jobs must be scheduled on a single machine without preemption, and the objective is to minimize the total weighted tardiness. Benchmark instances for the problem are available in the literature. To fine-tune the GA parameters in the most efficient way, we compare multiple DOE models including 2-level (2^k) full factorial design, orthogonal array design, central composite design, D-optimal design and signal-to-noise (S/N) ratios. In each DOE method, a mathematical model is created using regression analysis, and solved to obtain the best parameter setting. After verification runs using the tuned parameter setting, the preliminary results show that optimal solutions for multiple instances were found efficiently.
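    A 2^k full factorial design of the kind compared above can be sketched briefly. The factor names, their two levels, and the toy response function are assumptions for illustration; a real study would run the GA at each corner point and record the total weighted tardiness.

    ```python
    from itertools import product

    factors = {
        "pop_size":       (50, 200),
        "crossover_rate": (0.6, 0.9),
        "mutation_rate":  (0.01, 0.1),
    }

    def toy_response(pop_size, crossover_rate, mutation_rate):
        # stand-in for "tardiness after one GA run" (lower is better)
        return 1000 / pop_size + 50 * mutation_rate - 20 * crossover_rate

    names = list(factors)
    runs = []
    for levels in product((0, 1), repeat=len(names)):  # all 2^3 = 8 corner points
        setting = {n: factors[n][lv] for n, lv in zip(names, levels)}
        runs.append((levels, toy_response(**setting)))

    # Main effect of each factor: mean response at its high level minus its
    # low level, averaged over all settings of the other factors -- this is
    # what OFAT misses when factors interact.
    effects = {}
    for j, n in enumerate(names):
        hi = [y for lv, y in runs if lv[j] == 1]
        lo = [y for lv, y in runs if lv[j] == 0]
        effects[n] = sum(hi) / len(hi) - sum(lo) / len(lo)
    ```

    Because every combination of levels is run, the same 8 responses also yield all two- and three-way interaction estimates, which is the advantage of a full factorial over one-factor-at-a-time tuning.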

  16. "The Gaze Heuristic:" Biography of an Adaptively Rational Decision Process.

    PubMed

    Hamlin, Robert P

    2017-04-01

    This article is a case study that describes the natural and human history of the gaze heuristic. The gaze heuristic is an interception heuristic that utilizes a single input (deviation from a constant angle of approach) repeatedly as a task is performed. Its architecture, advantages, and limitations are described in detail. A history of the gaze heuristic is then presented. In natural history, the gaze heuristic is the only known technique used by predators to intercept prey. In human history the gaze heuristic was discovered accidentally by Royal Air Force (RAF) fighter command just prior to World War II. As it was never discovered by the Luftwaffe, the technique conferred a decisive advantage upon the RAF throughout the war. After the end of the war in America, German technology was combined with the British heuristic to create the Sidewinder AIM9 missile, the most successful autonomous weapon ever built. There are no plans to withdraw it or replace its guiding gaze heuristic. The case study demonstrates that the gaze heuristic is a specific heuristic type that takes a single best input at the best time (take-the-best). Its use is an adaptively rational response to specific, rapidly evolving decision environments that has allowed those animals/humans/machines who use it to survive, prosper, and multiply relative to those who do not. Copyright © 2017 Cognitive Science Society, Inc.
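    The constant-bearing principle behind the gaze heuristic can be simulated in a few lines. This is a minimal kinematic sketch, not the article's analysis: the pursuer nulls the target's velocity component across the line of sight (so the bearing stays constant) and spends its remaining speed closing range; the scenario, step size, and capture radius are illustrative assumptions.

    ```python
    import math

    def intercept(px, py, tx, ty, tvx, tvy, speed, dt=0.1, steps=500):
        """Pursuer at (px, py), target at (tx, ty) with constant velocity
        (tvx, tvy). Returns (caught, step)."""
        for step in range(steps):
            dx, dy = tx - px, ty - py
            dist = math.hypot(dx, dy)
            if dist < speed * dt:
                return True, step                # within one step: interception
            ux, uy = dx / dist, dy / dist        # unit line-of-sight (LOS) vector
            # target speed across the LOS; clamp if the pursuer is too slow to match
            perp = max(-speed, min(speed, -tvx * uy + tvy * ux))
            closing = math.sqrt(speed * speed - perp * perp)
            # hold the bearing (perp term across LOS), close range (along LOS)
            pvx, pvy = closing * ux - perp * uy, closing * uy + perp * ux
            px, py = px + pvx * dt, py + pvy * dt
            tx, ty = tx + tvx * dt, ty + tvy * dt
        return False, steps
    ```

    Note the single repeated input: only the deviation from a constant bearing is measured; the target's range, speed, and future path never need to be estimated, which is what makes the heuristic so cheap for predators, pilots, and missiles alike.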

  17. Understanding the science of portion control and the art of downsizing.

    PubMed

    Hetherington, Marion M; Blundell-Birtill, Pam; Caton, Samantha J; Cecil, Joanne E; Evans, Charlotte E; Rolls, Barbara J; Tang, Tang

    2018-05-24

    Offering large portions of high-energy-dense (HED) foods increases overall intake in children and adults. This is known as the portion size effect (PSE). It is robust, reliable and enduring. Over time, the PSE may facilitate overeating and ultimately positive energy balance. Therefore, it is important to understand what drives the PSE and what might be done to counter the effects of an environment promoting large portions, especially in children. Explanations for the PSE are many and diverse, ranging from consumer error in estimating portion size to simple heuristics such as cleaning the plate or eating in accordance with consumption norms. However, individual characteristics and hedonic processes influence the PSE, suggesting a more complex explanation than error or heuristics. Here PSE studies are reviewed to identify interventions that can be used to downsize portions of HED foods, with a focus on children who are still learning about social norms for portion size. Although the scientific evidence for the PSE is robust, there is still a need for creative downsizing solutions to facilitate portion control as children and adolescents establish their eating habits.

  18. Assessing the use of cognitive heuristic representativeness in clinical reasoning.

    PubMed

    Payne, Velma L; Crowley, Rebecca S

    2008-11-06

    We performed a pilot study to investigate use of the cognitive heuristic Representativeness in clinical reasoning. We tested a set of tasks and assessments to determine whether subjects used the heuristics in reasoning, to obtain initial frequencies of heuristic use and related cognitive errors, and to collect cognitive process data using think-aloud techniques. The study investigates two aspects of the Representativeness heuristic - judging by perceived frequency and representativeness as causal beliefs. Results show that subjects apply both aspects of the heuristic during reasoning, and make errors related to misapplication of these heuristics. Subjects in this study rarely used base rates, showed significant variability in their recall of base rates, demonstrated limited ability to use provided base rates, and favored causal data in diagnosis. We conclude that the tasks and assessments we have developed provide a suitable test-bed to study the cognitive processes underlying heuristic errors.

  19. Assessing Use of Cognitive Heuristic Representativeness in Clinical Reasoning

    PubMed Central

    Payne, Velma L.; Crowley, Rebecca S.

    2008-01-01

    We performed a pilot study to investigate use of the cognitive heuristic Representativeness in clinical reasoning. We tested a set of tasks and assessments to determine whether subjects used the heuristics in reasoning, to obtain initial frequencies of heuristic use and related cognitive errors, and to collect cognitive process data using think-aloud techniques. The study investigates two aspects of the Representativeness heuristic - judging by perceived frequency and representativeness as causal beliefs. Results show that subjects apply both aspects of the heuristic during reasoning, and make errors related to misapplication of these heuristics. Subjects in this study rarely used base rates, showed significant variability in their recall of base rates, demonstrated limited ability to use provided base rates, and favored causal data in diagnosis. We conclude that the tasks and assessments we have developed provide a suitable test-bed to study the cognitive processes underlying heuristic errors. PMID:18999140

  20. Network clustering and community detection using modulus of families of loops.

    PubMed

    Shakeri, Heman; Poggi-Corradini, Pietro; Albin, Nathan; Scoglio, Caterina

    2017-01-01

    We study the structure of loops in networks using the notion of modulus of loop families. We introduce an alternate measure of network clustering by quantifying the richness of families of (simple) loops. Modulus tries to minimize the expected overlap among loops by spreading the expected link usage optimally. We propose weighting networks using these expected link usages to improve classical community detection algorithms. We show that the proposed method enhances the performance of certain algorithms, such as spectral partitioning and modularity maximization heuristics, on standard benchmarks.

  1. On the Formation of Emotions.

    ERIC Educational Resources Information Center

    Montada, Leo

    1989-01-01

    Asserts that emotions are based on cognitive appraisals of occurrences. Argues that cognitive models have heuristic value for research and practice and examines objections concerning the validity of those models. Discusses the usefulness of these models for several educational and developmental goals. (KO)

  2. A Generalized Framework for Modeling Next Generation 911 Implementations.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

Kelic, Andjelka; Aamir, Munaf Syed

    This document summarizes the current state of Sandia 911 modeling capabilities and then addresses key aspects of Next Generation 911 (NG911) architectures for expansion of existing models. Analysis of three NG911 implementations was used to inform heuristics, associated key data requirements, and assumptions needed to capture NG911 architectures in the existing models. Modeling of NG911 necessitates careful consideration of its complexity and the diversity of implementations. Draft heuristics for constructing NG911 models are presented based on the analysis, along with a summary of current challenges and ways to improve future NG911 modeling efforts. We found that NG911 relies on Enhanced 911 (E911) assets such as 911 selective routers to route calls originating from traditional telephony service, which constitute a majority of 911 calls. We also found that the diversity and transitional nature of NG911 implementations necessitates significant and frequent data collection to ensure that adequate models are available for crisis action support.

  3. The role of grazer predation strategies in the dynamics of consumer-resource based ecological models

    NASA Astrophysics Data System (ADS)

    Cropp, Roger; Moroz, Irene; Norbury, John

    2017-07-01

    We analyse a simple plankton system to provide a heuristic for more complex models such as Dynamic Green Ocean Models (DGOMs). Zooplankton foraging is either by generalist grazers that consume whatever they bump into or specialist grazers that actively seek particular prey. The zooplankton may further be classified as either facultative grazers that can survive on any of their prey or obligate grazers that depend on the presence of specific prey. A key result is that different prey dependencies can result in dramatically different impacts of grazing strategies on system outcomes. The grazing strategy can determine whether a system with obligate grazers will be stable, have regular, predictable cycles or be chaotic. Conversely, whether facultative zooplankton functioned as specialist or generalist grazers makes no qualitative difference to the dynamics of the system. These results demonstrate that the effect of different grazing strategies can be critically dependent on the grazer's dependency on specific prey. Great care must be taken when choosing functional forms for population interactions in DGOMs, particularly in scenarios such as climate change where parameters such as mortality and growth coefficients may change. A robust theoretical framework supporting model development and analysis is key to understanding how such choices can affect model properties and hence predictions.

  4. Beyond pain: modeling decision-making deficits in chronic pain

    PubMed Central

    Hess, Leonardo Emanuel; Haimovici, Ariel; Muñoz, Miguel Angel; Montoya, Pedro

    2014-01-01

    Risky decision-making seems to be markedly disrupted in patients with chronic pain, probably due to the high cost that pain and negative mood impose on executive control functions. Patients’ behavioral performance on decision-making tasks such as the Iowa Gambling Task (IGT) is characterized by selecting cards more frequently from disadvantageous than from advantageous decks, and by switching often between competing responses in comparison with healthy controls (HCs). In the present study, we developed a simple heuristic model to simulate individuals’ choice behavior by varying the level of decision randomness and the importance given to gains and losses. The findings revealed that the model was able to differentiate the behavioral performance of patients with chronic pain and HCs at the group, as well as at the individual level. The best fit of the model in patients with chronic pain was yielded when decisions were not based on previous choices and when gains were considered more relevant than losses. By contrast, the best account of the available data in HCs was obtained when decisions were based on previous experiences and losses loomed larger than gains. In conclusion, our model seems to provide useful information to measure each individual participant extensively, and to deal with the data on a participant-by-participant basis. PMID:25136301

  5. Beyond pain: modeling decision-making deficits in chronic pain.

    PubMed

    Hess, Leonardo Emanuel; Haimovici, Ariel; Muñoz, Miguel Angel; Montoya, Pedro

    2014-01-01

    Risky decision-making seems to be markedly disrupted in patients with chronic pain, probably due to the high cost that pain and negative mood impose on executive control functions. Patients' behavioral performance on decision-making tasks such as the Iowa Gambling Task (IGT) is characterized by selecting cards more frequently from disadvantageous than from advantageous decks, and by switching often between competing responses in comparison with healthy controls (HCs). In the present study, we developed a simple heuristic model to simulate individuals' choice behavior by varying the level of decision randomness and the importance given to gains and losses. The findings revealed that the model was able to differentiate the behavioral performance of patients with chronic pain and HCs at the group, as well as at the individual level. The best fit of the model in patients with chronic pain was yielded when decisions were not based on previous choices and when gains were considered more relevant than losses. By contrast, the best account of the available data in HCs was obtained when decisions were based on previous experiences and losses loomed larger than gains. In conclusion, our model seems to provide useful information to measure each individual participant extensively, and to deal with the data on a participant-by-participant basis.
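    A minimal sketch of such a two-parameter choice model can be written as a weighted-utility softmax. All parameter values and deck statistics below are invented for illustration, not the authors' fitted values.

    ```python
    import math

    # Minimal sketch (not the authors' fitted model): deck valuation on an
    # IGT-like task from two parameters, a gain/loss weight w and a decision-
    # randomness parameter beta (an inverse temperature in a softmax rule).
    def utilities(avg_gain, avg_loss, w):
        """Weighted deck values: w * gain - (1 - w) * loss."""
        return [w * g - (1 - w) * l for g, l in zip(avg_gain, avg_loss)]

    def choice_probs(values, beta):
        """Softmax: small beta -> near-random choice, large beta -> deterministic."""
        exps = [math.exp(beta * v) for v in values]
        total = sum(exps)
        return [e / total for e in exps]

    # Hypothetical deck statistics: decks 0-1 are "bad" (large gains, larger losses).
    avg_gain = [100, 100, 50, 50]
    avg_loss = [125, 125, 25, 25]

    # Gains weighted over losses with noisy choice (patient-like fit) ...
    patient = choice_probs(utilities(avg_gain, avg_loss, w=0.8), beta=0.01)
    # ... versus losses weighted over gains (control-like fit).
    control = choice_probs(utilities(avg_gain, avg_loss, w=0.3), beta=0.05)
    print([round(p, 2) for p in patient])   # slight preference for the bad decks
    print([round(p, 2) for p in control])   # strong preference for the good decks
    ```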

  6. A biologically inspired two-species exclusion model: effects of RNA polymerase motor traffic on simultaneous DNA replication

    NASA Astrophysics Data System (ADS)

    Ghosh, Soumendu; Mishra, Bhavya; Patra, Shubhadeep; Schadschneider, Andreas; Chowdhury, Debashish

    2018-04-01

    We introduce a two-species exclusion model to describe the key features of the conflict between the RNA polymerase (RNAP) motor traffic, engaged in the transcription of a segment of DNA, concomitant with the progress of two DNA replication forks on the same DNA segment. One of the species of particles (P) represents RNAP motors while the other (R) represents the replication forks. Motivated by the biological phenomena that this model is intended to capture, a maximum of two R particles only are allowed to enter the lattice from two opposite ends whereas the unrestricted number of P particles constitutes a totally asymmetric simple exclusion process (TASEP) in a segment in the middle of the lattice. The model captures three distinct pathways for resolving the co-directional as well as head-on collision between the P and R particles. Using Monte Carlo simulations and heuristic analytical arguments that combine exact results for the TASEP with mean-field approximations, we predict the possible outcomes of the conflict between the traffic of RNAP motors (P particles engaged in transcription) and the replication forks (R particles). In principle, the model can be adapted to experimental conditions to account for the data quantitatively.
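    For readers unfamiliar with TASEP dynamics, a minimal single-species open-boundary TASEP can be simulated in a few lines. This sketch omits the paper's second species and its collision-resolution rules; rates and lattice size are arbitrary.

    ```python
    import random

    # Illustrative sketch: a standard single-species TASEP with open boundaries
    # (the paper's model adds a second particle species and replication rules).
    def tasep_sweep(lattice, alpha, beta, rng):
        """One random-sequential-update sweep; alpha = entry rate, beta = exit rate."""
        L = len(lattice)
        for _ in range(L + 1):
            i = rng.randint(-1, L - 1)            # -1 encodes an injection attempt
            if i == -1:
                if lattice[0] == 0 and rng.random() < alpha:
                    lattice[0] = 1                # particle enters at the left
            elif i == L - 1:
                if lattice[i] == 1 and rng.random() < beta:
                    lattice[i] = 0                # particle exits at the right
            elif lattice[i] == 1 and lattice[i + 1] == 0:
                lattice[i], lattice[i + 1] = 0, 1 # hop one site to the right

    rng = random.Random(1)
    lattice = [0] * 50
    for _ in range(20000):
        tasep_sweep(lattice, alpha=0.8, beta=0.8, rng=rng)
    density = sum(lattice) / len(lattice)
    print(round(density, 2))  # fluctuates around 0.5 (maximal-current phase)
    ```

    With both boundary rates above 1/2 the system sits in the maximal-current phase, so the bulk density hovers around one half.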

  7. Discrete bivariate population balance modelling of heteroaggregation processes.

    PubMed

    Rollié, Sascha; Briesen, Heiko; Sundmacher, Kai

    2009-08-15

    Heteroaggregation in binary particle mixtures was simulated with a discrete population balance model in terms of two internal coordinates describing the particle properties. The considered particle species are of different size and zeta-potential. Property space is reduced with a semi-heuristic approach to enable an efficient solution. Aggregation rates are based on deterministic models for Brownian motion and stability, under consideration of DLVO interaction potentials. A charge-balance kernel is presented, relating the electrostatic surface potential to the property space by a simple charge balance. Parameter sensitivity with respect to the fractal dimension, aggregate size, hydrodynamic correction, ionic strength and absolute particle concentration was assessed. Results were compared to simulations with the literature kernel based on geometric coverage effects for clusters with heterogeneous surface properties. In both cases electrostatic phenomena, which dominate the aggregation process, show identical trends: impeded cluster-cluster aggregation at low particle mixing ratio (1:1), restabilisation at high mixing ratios (100:1) and formation of complex clusters for intermediate ratios (10:1). The particle mixing ratio controls the surface coverage extent of the larger particle species. Simulation results are compared to experimental flow cytometric data and show very satisfactory agreement.

  8. Policy improvement by a model-free Dyna architecture.

    PubMed

    Hwang, Kao-Shing; Lo, Chia-Yue

    2013-05-01

    The objective of this paper is to accelerate the process of policy improvement in reinforcement learning. The proposed Dyna-style system combines two learning schemes, one of which utilizes a temporal difference method for direct learning; the other uses relative values for indirect learning in planning between two successive direct learning cycles. Instead of establishing a complicated world model, the approach introduces a simple predictor of average rewards to actor-critic architecture in the simulation (planning) mode. The relative value of a state, defined as the accumulated differences between immediate reward and average reward, is used to steer the improvement process in the right direction. The proposed learning scheme is applied to control a pendulum system for tracking a desired trajectory to demonstrate its adaptability and robustness. Through reinforcement signals from the environment, the system takes the appropriate action to drive an unknown dynamic to track desired outputs in few learning cycles. Comparisons are made between the proposed model-free method, a connectionist adaptive heuristic critic, and an advanced method of Dyna-Q learning in the experiments of labyrinth exploration. The proposed method outperforms its counterparts in terms of elapsed time and convergence rate.

  9. Heuristic Models of Elder Abuse: Implications for the Practitioner.

    ERIC Educational Resources Information Center

    Galbraith, Michael W.; Zdorkowski, R. Todd

    1984-01-01

    Lists some of the more interesting hypotheses that lie unexamined in the elder abuse literature and specifies the kinds of explanations that these hypotheses demand. Includes models that may be useful to practitioners. (JOW)

  10. Heuristic and analytic processing in online sports betting.

    PubMed

    d'Astous, Alain; Di Gaspero, Marc

    2015-06-01

    This article presents the results of two studies that examine the occurrence of heuristic (i.e., intuitive and fast) and analytic (i.e., deliberate and slow) processes among people who engage in online sports betting on a regular basis. The first study was qualitative and was conducted with a convenience sample of 12 regular online sports gamblers who described the processes by which they arrive at a sports betting decision. The results of this study showed that betting online on sports events involves a mix of heuristic and analytic processes. The second study consisted of a survey of 161 online sports gamblers in which performance in terms of monetary gains, experience in online sports betting, propensity to collect and analyze relevant information prior to betting, and use of bookmaker odds were measured. This study showed that heuristic and analytic processes act as mediators of the relationship between experience and performance. The findings stemming from these two studies give some insights into gamblers' modes of thinking and behaviors in an online sports betting context and show the value of the dual mediation process model for research that looks at gambling activities from a judgment and decision making perspective.

  11. Exact and heuristic algorithms for Space Information Flow.

    PubMed

    Uwitonze, Alfred; Huang, Jiaqing; Ye, Yuanqing; Cheng, Wenqing; Li, Zongpeng

    2018-01-01

    Space Information Flow (SIF) is a new promising research area that studies network coding in geometric space, such as Euclidean space. The design of algorithms that compute the optimal SIF solutions remains one of the key open problems in SIF. This work proposes the first exact SIF algorithm and a heuristic SIF algorithm that compute min-cost multicast network coding for N (N ≥ 3) given terminal nodes in 2-D Euclidean space. Furthermore, we find that the Butterfly network in Euclidean space is the second example besides the Pentagram network where SIF is strictly better than the Euclidean Steiner minimal tree. The exact algorithm design is based on two key techniques: Delaunay triangulation and linear programming. The Delaunay triangulation technique helps to find practically good candidate relay nodes, after which a min-cost multicast linear programming model is solved over the terminal nodes and the candidate relay nodes, to compute the optimal multicast network topology, including the optimal relay nodes selected by linear programming from all the candidate relay nodes and the flow rates on the connection links. The heuristic algorithm design is also based on Delaunay triangulation and linear programming techniques. The exact algorithm can achieve the optimal SIF solution with an exponential computational complexity, while the heuristic algorithm can achieve a sub-optimal SIF solution with a polynomial computational complexity. We prove the correctness of the exact SIF algorithm. The simulation results show the effectiveness of the heuristic SIF algorithm.

  12. Labor union members play an OLG repeated game

    PubMed Central

    Kandori, Michihiro; Obayashi, Shinya

    2014-01-01

    Humans are capable of cooperating with one another even when it is costly and a deviation provides an immediate gain. An important reason is that cooperation is reciprocated or rewarded and deviations are penalized in later stages. For cooperation to be sustainable, not only must rewards and penalties be strong enough but individuals should also have the right incentives to provide rewards and punishments. Codes of conduct with such properties have been studied extensively in game theory (as repeated game equilibria), and the literature on the evolution of cooperation shows how equilibrium behavior might emerge and proliferate in society. We found that community unions, a subclass of labor unions that admits individual affiliations, are ideal to corroborate these theories with reality, because (i) their activities are simple and (ii) they have a structure that closely resembles a theoretical model, the overlapping generations repeated game. A detailed case study of a community union revealed a possible equilibrium that can function under the very limited observability in the union. The equilibrium code of conduct appears to be a natural focal point based on simple heuristic reasoning. The union we studied was created out of necessity for cooperation, without knowing or anticipating how cooperation might be sustained. The union has successfully resolved about 3,000 labor disputes and created a number of offspring. PMID:25024211

  13. The memory state heuristic: A formal model based on repeated recognition judgments.

    PubMed

    Castela, Marta; Erdfelder, Edgar

    2017-02-01

    The recognition heuristic (RH) theory predicts that, in comparative judgment tasks, if one object is recognized and the other is not, the recognized one is chosen. The memory-state heuristic (MSH) extends the RH by assuming that choices are not affected by recognition judgments per se, but by the memory states underlying these judgments (i.e., recognition certainty, uncertainty, or rejection certainty). Specifically, the larger the discrepancy between memory states, the larger the probability of choosing the object in the higher state. The typical RH paradigm does not allow estimation of the underlying memory states because it is unknown whether the objects were previously experienced or not. Therefore, we extended the paradigm by repeating the recognition task twice. In line with high threshold models of recognition, we assumed that inconsistent recognition judgments result from uncertainty whereas consistent judgments most likely result from memory certainty. In Experiment 1, we fitted 2 nested multinomial models to the data: an MSH model that formalizes the relation between memory states and binary choices explicitly and an approximate model that ignores the (unlikely) possibility of consistent guesses. Both models provided converging results. As predicted, reliance on recognition increased with the discrepancy in the underlying memory states. In Experiment 2, we replicated these results and found support for choice consistency predictions of the MSH. Additionally, recognition and choice latencies were in agreement with the MSH in both experiments. Finally, we validated critical parameters of our MSH model through a cross-validation method and a third experiment. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
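    The core MSH prediction can be caricatured in a few lines (toy parameters, not the fitted multinomial model): map the two repeated recognition judgments to a memory state, then let the choice probability grow with the discrepancy between the two objects' states.

    ```python
    # Illustrative sketch only (invented slope, not the paper's estimates):
    # consistent "yes" judgments -> recognition certainty, consistent "no"
    # -> rejection certainty, inconsistent judgments -> uncertainty.
    STATE = {("yes", "yes"): 2,   # recognition certainty
             ("yes", "no"): 1,    # uncertainty (inconsistent judgments)
             ("no", "yes"): 1,
             ("no", "no"): 0}     # rejection certainty

    def p_choose_first(judgments_a, judgments_b, slope=0.2):
        """P(choose A over B) rises with the memory-state gap (slope is hypothetical)."""
        gap = STATE[judgments_a] - STATE[judgments_b]
        return 0.5 + slope * gap  # gap ranges over -2..2

    print(round(p_choose_first(("yes", "yes"), ("no", "no")), 2))  # 0.9: maximal gap
    print(round(p_choose_first(("yes", "no"), ("no", "no")), 2))   # 0.7: smaller gap
    ```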

  14. The source of the truth bias: Heuristic processing?

    PubMed

    Street, Chris N H; Masip, Jaume

    2015-06-01

    People believe others are telling the truth more often than they actually are; this is called the truth bias. Surprisingly, when a speaker is judged at multiple points across their statement the truth bias declines. Previous claims argue this is evidence of a shift from (biased) heuristic processing to (reasoned) analytical processing. In four experiments we contrast the heuristic-analytic model (HAM) with alternative accounts. In Experiment 1, the decrease in truth responding was not the result of speakers appearing more deceptive, but was instead attributable to the rater's processing style. Yet contrary to HAMs, across three experiments we found the decline in bias was not related to the amount of processing time available (Experiments 1-3) or the communication channel (Experiment 2). In Experiment 4 we found support for a new account: that the bias reflects whether raters perceive the statement to be internally consistent. © 2015 Scandinavian Psychological Associations and John Wiley & Sons Ltd.

  15. Heuristic and analytic processes in reasoning: an event-related potential study of belief bias.

    PubMed

    Banks, Adrian P; Hope, Christopher

    2014-03-01

    Human reasoning involves both heuristic and analytic processes. This study of belief bias in relational reasoning investigated whether the two processes occur serially or in parallel. Participants evaluated the validity of problems in which the conclusions were either logically valid or invalid and either believable or unbelievable. Problems in which the conclusions presented a conflict between the logically valid response and the believable response elicited a more positive P3 than problems in which there was no conflict. This shows that P3 is influenced by the interaction of belief and logic rather than either of these factors on its own. These findings indicate that belief and logic influence reasoning at the same time, supporting models in which belief-based and logical evaluations occur in parallel but not theories in which belief-based heuristic evaluations precede logical analysis.

  16. Reexamining our bias against heuristics.

    PubMed

    McLaughlin, Kevin; Eva, Kevin W; Norman, Geoff R

    2014-08-01

    Using heuristics offers several cognitive advantages, such as increased speed and reduced effort when making decisions, in addition to allowing us to make decisions in situations where missing data do not allow for formal reasoning. But the traditional view of heuristics is that they trade accuracy for efficiency. Here the authors discuss sources of bias in the literature implicating the use of heuristics in diagnostic error and highlight the fact that there are also data suggesting that under certain circumstances using heuristics may lead to better decisions than formal analysis. They suggest that diagnostic error is frequently misattributed to the use of heuristics and propose an alternative view whereby content knowledge is the root cause of diagnostic performance and heuristics lie on the causal pathway between knowledge and diagnostic error or success.

  17. Focus of attention in an activity-based scheduler

    NASA Technical Reports Server (NTRS)

    Sadeh, Norman; Fox, Mark S.

    1989-01-01

    Earlier research in job shop scheduling has demonstrated the advantages of opportunistically combining order-based and resource-based scheduling techniques. An even more flexible approach is investigated where each activity is considered a decision point by itself. Heuristics to opportunistically select the next decision point on which to focus attention (i.e., variable ordering heuristics) and the next decision to be tried at this point (i.e., value ordering heuristics) are described that probabilistically account for both activity precedence and resource requirement interactions. Preliminary experimental results indicate that the variable ordering heuristic greatly increases search efficiency. While least constraining value ordering heuristics have been advocated in the literature, the experimental results suggest that other value ordering heuristics combined with our variable-ordering heuristic can produce much better schedules without significantly increasing search.
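    The variable-ordering idea generalizes beyond job-shop scheduling. A generic illustration (a toy map-coloring problem, not the authors' scheduler) picks the most constrained decision point, the variable with the fewest legal values, before branching:

    ```python
    # Generic backtracking search over a tiny constraint problem that applies
    # a variable-ordering heuristic: expand the most constrained variable first.
    def legal_values(var, assignment, domains, neighbors):
        """Values for var that conflict with no already-assigned neighbor."""
        return [v for v in domains[var]
                if all(assignment.get(n) != v for n in neighbors[var])]

    def solve(assignment, domains, neighbors):
        unassigned = [v for v in domains if v not in assignment]
        if not unassigned:
            return assignment
        # Variable-ordering heuristic: fewest remaining legal values first.
        var = min(unassigned,
                  key=lambda v: len(legal_values(v, assignment, domains, neighbors)))
        for value in legal_values(var, assignment, domains, neighbors):
            result = solve({**assignment, var: value}, domains, neighbors)
            if result:
                return result
        return None

    # Hypothetical instance: color a 3-node path with 2 colors.
    domains = {v: ["red", "green"] for v in "ABC"}
    neighbors = {"A": ["B"], "B": ["A", "C"], "C": ["B"]}
    solution = solve({}, domains, neighbors)
    print(solution["A"] != solution["B"] and solution["B"] != solution["C"])  # True
    ```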

  18. Probabilistic representation in syllogistic reasoning: A theory to integrate mental models and heuristics.

    PubMed

    Hattori, Masasi

    2016-12-01

    This paper presents a new theory of syllogistic reasoning. The proposed model assumes there are probabilistic representations of given signature situations. Instead of conducting an exhaustive search, the model constructs an individual-based "logical" mental representation that expresses the most probable state of affairs, and derives a necessary conclusion that is not inconsistent with the model using heuristics based on informativeness. The model is a unification of previous influential models. Its descriptive validity has been evaluated against existing empirical data and two new experiments, and by qualitative analyses based on previous empirical findings, all of which supported the theory. The model's behavior is also consistent with findings in other areas, including working memory capacity. The results indicate that people assume the probabilities of all target events mentioned in a syllogism to be almost equal, which suggests links between syllogistic reasoning and other areas of cognition. Copyright © 2016 The Author(s). Published by Elsevier B.V. All rights reserved.

  19. Optimal rail container shipment planning problem in multimodal transportation

    NASA Astrophysics Data System (ADS)

    Cao, Chengxuan; Gao, Ziyou; Li, Keping

    2012-09-01

    The optimal rail container shipment planning problem in multimodal transportation is studied in this article. The characteristics of the multi-period planning problem are presented and the problem is formulated as a large-scale 0-1 integer programming model, which maximizes the total profit generated by all freight bookings accepted in a multi-period planning horizon subject to the limited capacities. Two heuristic algorithms are proposed to obtain an approximate optimal solution of the problem. Finally, numerical experiments are conducted to demonstrate the proposed formulation and heuristic algorithms.
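    One plausible heuristic for such a capacity-constrained acceptance problem (a hypothetical sketch, not one of the authors' two algorithms) greedily accepts bookings in order of profit per unit of capacity consumed:

    ```python
    # Hedged sketch: greedy profit-density rule for accepting freight bookings
    # under a capacity limit. Bookings and the capacity value are invented.
    def greedy_accept(bookings, capacity):
        """bookings: list of (profit, slots); returns total profit of the accepted set."""
        accepted_profit, used = 0, 0
        for profit, slots in sorted(bookings, key=lambda b: b[0] / b[1], reverse=True):
            if used + slots <= capacity:      # accept only if capacity remains
                accepted_profit += profit
                used += slots
        return accepted_profit

    bookings = [(60, 10), (100, 20), (120, 30)]  # hypothetical (profit, slots) pairs
    print(greedy_accept(bookings, capacity=50))  # 160: the 30-slot booking is rejected
    ```

    Like the 0-1 formulation it approximates, this rule is not guaranteed optimal; it simply trades solution quality for speed, which is the usual role of such heuristics.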

  20. A dynamic multiarmed bandit-gene expression programming hyper-heuristic for combinatorial optimization problems.

    PubMed

    Sabar, Nasser R; Ayob, Masri; Kendall, Graham; Qu, Rong

    2015-02-01

    Hyper-heuristics are search methodologies that aim to provide high-quality solutions across a wide variety of problem domains, rather than developing tailor-made methodologies for each problem instance/domain. A traditional hyper-heuristic framework has two levels, namely, the high level strategy (heuristic selection mechanism and the acceptance criterion) and low level heuristics (a set of problem specific heuristics). Due to the different landscape structures of different problem instances, the high level strategy plays an important role in the design of a hyper-heuristic framework. In this paper, we propose a new high level strategy for a hyper-heuristic framework. The proposed high-level strategy utilizes a dynamic multiarmed bandit-extreme value-based reward as an online heuristic selection mechanism to select the appropriate heuristic to be applied at each iteration. In addition, we propose a gene expression programming framework to automatically generate the acceptance criterion for each problem instance, instead of using human-designed criteria. Two well-known, and very different, combinatorial optimization problems, one static (exam timetabling) and one dynamic (dynamic vehicle routing) are used to demonstrate the generality of the proposed framework. Compared with state-of-the-art hyper-heuristics and other bespoke methods, empirical results demonstrate that the proposed framework is able to generalize well across both domains. We obtain competitive, if not better results, when compared to the best known results obtained from other methods that have been presented in the scientific literature. We also compare our approach against the recently released hyper-heuristic competition test suite. We again demonstrate the generality of our approach when we compare against other methods that have utilized the same six benchmark datasets from this test suite.
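    The heuristic-selection step can be sketched with a standard UCB1 bandit. This is a simplification: the paper uses a dynamic multiarmed bandit with an extreme-value-based reward, and the reward probabilities below are invented.

    ```python
    import math
    import random

    # Sketch of the selection idea only (UCB1, not the paper's mechanism):
    # pick the low-level heuristic whose observed mean reward plus an
    # exploration bonus is highest.
    def ucb_select(counts, rewards, t):
        for i, n in enumerate(counts):
            if n == 0:
                return i                      # try every heuristic once first
        return max(range(len(counts)),
                   key=lambda i: rewards[i] / counts[i]
                   + math.sqrt(2 * math.log(t) / counts[i]))

    rng = random.Random(0)
    true_quality = [0.2, 0.8, 0.5]            # hypothetical mean improvements
    counts, rewards = [0, 0, 0], [0.0, 0.0, 0.0]
    for t in range(1, 2001):
        h = ucb_select(counts, rewards, t)
        counts[h] += 1
        rewards[h] += 1.0 if rng.random() < true_quality[h] else 0.0
    print(counts)  # pulls concentrate on the strongest heuristic
    ```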

  1. A Hyper-Heuristic Ensemble Method for Static Job-Shop Scheduling.

    PubMed

    Hart, Emma; Sim, Kevin

    2016-01-01

    We describe a new hyper-heuristic method NELLI-GP for solving job-shop scheduling problems (JSSP) that evolves an ensemble of heuristics. The ensemble adopts a divide-and-conquer approach in which each heuristic solves a unique subset of the instance set considered. NELLI-GP extends an existing ensemble method called NELLI by introducing a novel heuristic generator that evolves heuristics composed of linear sequences of dispatching rules: each rule is represented using a tree structure and is itself evolved. Following a training period, the ensemble is shown to outperform both existing dispatching rules and a standard genetic programming algorithm on a large set of new test instances. In addition, it obtains superior results on a set of 210 benchmark problems from the literature when compared to two state-of-the-art hyper-heuristic approaches. Further analysis of the relationship between heuristics in the evolved ensemble and the instances each solves provides new insights into features that might describe similar instances.

  2. Recursive heuristic classification

    NASA Technical Reports Server (NTRS)

    Wilkins, David C.

    1994-01-01

    The author will describe a new problem-solving approach called recursive heuristic classification, whereby a subproblem of heuristic classification is itself formulated and solved by heuristic classification. This allows the construction of more knowledge-intensive classification programs in a way that yields a clean organization. Further, standard knowledge acquisition and learning techniques for heuristic classification can be used to create, refine, and maintain the knowledge base associated with the recursively called classification expert system. The method of recursive heuristic classification was used in the Minerva blackboard shell for heuristic classification. Minerva recursively calls itself every problem-solving cycle to solve the important blackboard scheduler task, which involves assigning a desirability rating to alternative problem-solving actions. Knowing these ratings is critical to the use of an expert system as a component of a critiquing or apprenticeship tutoring system. One innovation of this research is a method called dynamic heuristic classification, which allows selection among dynamically generated classification categories instead of requiring them to be pre-enumerated.

  3. Comparison of two landslide susceptibility assessments in the Champagne-Ardenne region (France)

    NASA Astrophysics Data System (ADS)

    Den Eeckhaut, M. Van; Marre, A.; Poesen, J.

    2010-02-01

    The vineyards of the Montagne de Reims are mostly planted on steep south-oriented cuesta fronts receiving a maximum of sun radiation. Due to the location of the vineyards on steep hillslopes, the viticultural activity is threatened by slope failures. This study attempts to better understand the spatial patterns of landslide susceptibility in the Champagne-Ardenne region by comparing a heuristic (qualitative) and a statistical (quantitative) model in a 1120 km² study area. The heuristic landslide susceptibility model was adopted from the Bureau de Recherches Géologiques et Minières, the GEGEAA - Reims University and the Comité Interprofessionnel du Vin de Champagne. In this model, expert knowledge of the region was used to assign weights to all slope classes and lithologies present in the area, but the final susceptibility map was never evaluated with the location of mapped landslides. For the statistical landslide susceptibility assessment, logistic regression was applied to a dataset of 291 'old' (Holocene) landslides. The robustness of the logistic regression model was evaluated and ROC curves were used for model calibration and validation. With regard to the variables assumed to be important environmental factors controlling landslides, the two models are in agreement. They both indicate that present and future landslides are mainly controlled by slope gradient and lithology. However, the comparison of the two landslide susceptibility maps through (1) an evaluation with the location of mapped 'old' landslides and through (2) a temporal validation with spatial data of 'recent' (1960-1999; n = 48) and 'very recent' (2000-2008; n = 46) landslides showed a better prediction capacity for the statistical model produced in this study compared to the heuristic model. In total, the statistically-derived landslide susceptibility map succeeded in correctly classifying 81.0% of the 'old' and 91.6% of the 'recent' and 'very recent' landslides. 
On the susceptibility map derived from the heuristic model, on the other hand, only 54.6% of the 'old' and 64.0% of the 'recent' and 'very recent' landslides were correctly classified as unstable. Hence, the landslide susceptibility map obtained from logistic regression is a better tool for regional landslide susceptibility analysis in the study area of the Montagne de Reims. The accurate classification of zones with very high and high susceptibility makes it possible to delineate zones where viticulturists should be informed and where precautionary measures are needed to secure slope stability.
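As a minimal illustration of the statistical approach above, the following pure-Python sketch fits a logistic-regression susceptibility model to synthetic grid cells with two hypothetical predictors (a scaled slope gradient and a weak-lithology flag) and reports the fraction of cells correctly classified. The data, predictors, and training settings are illustrative assumptions, not the study's.

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(X, y, lr=0.1, epochs=2000):
    """Plain stochastic-gradient-descent logistic regression; w[0] is the intercept."""
    w = [0.0] * (len(X[0]) + 1)
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi)))
            err = yi - p
            w[0] += lr * err
            for j, xj in enumerate(xi):
                w[j + 1] += lr * err * xj
    return w

def susceptibility(w, x):
    """Predicted probability that a cell is landslide-prone."""
    return sigmoid(w[0] + sum(wj * xj for wj, xj in zip(w[1:], x)))

# Synthetic cells: (slope gradient in degrees / 10, weak-lithology flag).
random.seed(1)
stable = [(random.uniform(0.0, 1.0), 0) for _ in range(60)]
unstable = [(random.uniform(1.5, 3.0), 1) for _ in range(60)]
X = stable + unstable
y = [0] * 60 + [1] * 60

w = fit_logistic(X, y)
hits = sum((susceptibility(w, xi) > 0.5) == bool(yi) for xi, yi in zip(X, y))
print(f"correctly classified: {100.0 * hits / len(X):.1f}%")
```

In the study itself the 0.5 cutoff would be replaced by thresholds chosen from the ROC curve, and the classification rate would be reported separately for 'old' and 'recent' landslide inventories.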

  4. A Heuristic Model of Consciousness with Applications to the Development of Science and Society

    NASA Technical Reports Server (NTRS)

    Curreri, Peter A.

    2010-01-01

A working model of consciousness is fundamental to understanding the interactions of the observer in science. This paper examines contemporary understanding of consciousness. A heuristic model of consciousness is suggested that is consistent with psychophysics measurements of the bandwidth of consciousness relative to unconscious perception. While the self-referential nature of consciousness confers a survival benefit by ensuring that all points of view regarding a problem are experienced in a sufficiently large population, conscious bandwidth is constrained by design to avoid chaotic behavior. The multiple hypotheses provided by conscious reflection enable the rapid progression of science and technology. The questions of free will and the problem of attention are discussed in relation to the model. Finally, the combination of rapid technology growth with the assurance of many unpredictable points of view is considered with respect to contemporary constraints on the development of society.

  5. Evaluating the usability of an interactive, bi-lingual, touchscreen-enabled breastfeeding educational programme: application of Nielsen's heuristics.

    PubMed

    Joshi, Ashish; Perin, Douglas M Puricelli; Amadi, Chioma; Trout, Kate

    2015-03-05

The study purpose was to conduct a heuristic evaluation of an interactive, bilingual, touchscreen-enabled breastfeeding educational programme for Hispanic women living in rural settings in Nebraska. Three raters conducted the evaluation during May 2013 using the principles of Nielsen's heuristics. A total of 271 screens were evaluated, comprising the interface (n = 5), programme sections (n = 223) and educational content (n = 43). A total of 97 heuristic violations were identified, mostly related to the interface (8 violations/5 screens) and programme components (89 violations/266 screens). The most common heuristic violations reported were recognition rather than recall (62%, n = 60), consistency and standards (14%, n = 14) and match between the system and the real world (9%, n = 9). The majority of the heuristic violations were minor usability issues (73%, n = 71). The only grade 4 heuristic violation reported was due to the visibility of system status in the assessment modules. The results demonstrated that the system was largely consistent with Nielsen's usability heuristics. With Nielsen's usability heuristics, it is possible to identify problems in a timely manner and to facilitate the identification and prioritisation of problems needing urgent attention at an earlier stage, before the final deployment of the system.
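The reported violation shares follow directly from the raw counts. In the hypothetical tally below, the per-heuristic counts are chosen to match the proportions quoted above (97 violations in total); the "other heuristics" bucket is an assumption covering the categories the abstract does not enumerate.

```python
from collections import Counter

# Hypothetical tally of violations per Nielsen heuristic, chosen to match
# the proportions reported in the abstract (97 violations in total).
violations = Counter({
    "recognition rather than recall": 60,
    "consistency and standards": 14,
    "match between system and real world": 9,
    "other heuristics": 14,
})

total = sum(violations.values())
for heuristic, n in violations.most_common():
    print(f"{heuristic}: {n} ({100 * n / total:.0f}%)")
```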

  6. Experimental Matching of Instances to Heuristics for Constraint Satisfaction Problems.

    PubMed

    Moreno-Scott, Jorge Humberto; Ortiz-Bayliss, José Carlos; Terashima-Marín, Hugo; Conant-Pablos, Santiago Enrique

    2016-01-01

Constraint satisfaction problems are of special interest to the artificial intelligence and operations research communities due to their many applications. Although the heuristics involved in solving these problems have been studied extensively in the past, little is known about the relation between instances and the performance of the heuristics used to solve them. This paper focuses on both the exploration of the instance space to identify relations between instances and well-performing heuristics and the use of such relations to improve the search. Firstly, the document describes a methodology to explore the instance space of constraint satisfaction problems and evaluate the corresponding performance of six variable ordering heuristics on such instances, in order to find regions of the instance space where some heuristics outperform the others. Analyzing such regions favors the understanding of how these heuristics work and contributes to their improvement. Secondly, we use the information gathered from the first stage to predict the most suitable heuristic to use according to the features of the instance currently being solved. This approach proved to be competitive when compared against the heuristics applied in isolation on both randomly generated and structured instances of constraint satisfaction problems.
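The second stage above, predicting the most suitable heuristic from instance features, can be sketched as a nearest-neighbour lookup in the instance space. The features, labels, and training instances below are illustrative assumptions, not the paper's data.

```python
import math

# Toy instance features: (constraint density, constraint tightness).
# Each label names the variable-ordering heuristic that performed best on
# that instance; both features and labels are illustrative, not from the paper.
training = [
    ((0.1, 0.2), "min-domain"),
    ((0.2, 0.3), "min-domain"),
    ((0.8, 0.7), "max-degree"),
    ((0.9, 0.6), "max-degree"),
]

def predict_heuristic(features, data=training):
    """1-nearest-neighbour lookup in the instance feature space."""
    return min(data, key=lambda item: math.dist(item[0], features))[1]

print(predict_heuristic((0.15, 0.25)))  # → min-domain
print(predict_heuristic((0.85, 0.65)))  # → max-degree
```

A practical system would learn the regions from many solved instances rather than four hand-picked points, but the lookup step has this shape.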

  7. Experimental Matching of Instances to Heuristics for Constraint Satisfaction Problems

    PubMed Central

    Moreno-Scott, Jorge Humberto; Ortiz-Bayliss, José Carlos; Terashima-Marín, Hugo; Conant-Pablos, Santiago Enrique

    2016-01-01

Constraint satisfaction problems are of special interest to the artificial intelligence and operations research communities due to their many applications. Although the heuristics involved in solving these problems have been studied extensively in the past, little is known about the relation between instances and the performance of the heuristics used to solve them. This paper focuses on both the exploration of the instance space to identify relations between instances and well-performing heuristics and the use of such relations to improve the search. Firstly, the document describes a methodology to explore the instance space of constraint satisfaction problems and evaluate the corresponding performance of six variable ordering heuristics on such instances, in order to find regions of the instance space where some heuristics outperform the others. Analyzing such regions favors the understanding of how these heuristics work and contributes to their improvement. Secondly, we use the information gathered from the first stage to predict the most suitable heuristic to use according to the features of the instance currently being solved. This approach proved to be competitive when compared against the heuristics applied in isolation on both randomly generated and structured instances of constraint satisfaction problems. PMID:26949383

  8. Deriving a Set of Privacy Specific Heuristics for the Assessment of PHRs (Personal Health Records).

    PubMed

    Furano, Riccardo F; Kushniruk, Andre; Barnett, Jeff

    2017-01-01

With the emergence of personal health record (PHR) platforms becoming more widely available, this research focused on the development of privacy heuristics for assessing the privacy of PHRs. Existing sets of heuristics are typically not application-specific and do not address patient-centric privacy as a main concern prior to PHR procurement. A set of privacy-specific heuristics was developed based on a scoping review of the literature. An internet-based, commercially available, vendor-specific PHR application was evaluated using the derived set of privacy-specific heuristics. The proposed set of derived privacy-specific heuristics is explored in detail in relation to ISO 29100. The assessment of the internet-based, commercially available, vendor-specific PHR application indicated numerous violations, which were noted within the study. It is argued that the newly derived privacy heuristics should be used in addition to Nielsen's well-established set of heuristics. Privacy-specific heuristics could be used to assess PHR portal system-level privacy mechanisms during the procurement of a PHR application and may prove to be a beneficial form of assessment, preventing the selection of a PHR platform with a poor privacy-specific interface design.

  9. Mean-field theory of baryonic matter for QCD in the large Nc and heavy quark mass limits

    NASA Astrophysics Data System (ADS)

    Adhikari, Prabal; Cohen, Thomas D.

    2013-11-01

    We discuss theoretical issues pertaining to baryonic matter in the combined heavy-quark and large Nc limits of QCD. Witten's classic argument that baryons and interacting systems of baryons can be described in a mean-field approximation with each of the quarks moving in an average potential due to the remaining quarks is heuristic. It is important to justify this heuristic description for the case of baryonic matter since systems of interacting baryons are intrinsically more complicated than single baryons due to the possibility of hidden color states—states in which the subsystems making up the entire baryon crystal are not color-singlet nucleons but rather colorful states coupled together to make a color-singlet state. In this work, we provide a formal justification of this heuristic prescription. In order to do this, we start by taking the heavy quark limit, thus effectively reducing the problem to a many-body quantum mechanical system. This problem can be formulated in terms of integrals over coherent states, which for this problem are simple Slater determinants. We show that for the many-body problem, the support region for these integrals becomes narrow at large Nc, yielding an energy which is well approximated by a single coherent state—that is a mean-field description. Corrections to the energy are of relative order 1/Nc. While hidden color states are present in the exact state of the heavy quark system, they only influence the interaction energy below leading order in 1/Nc.

  10. A Variable-Selection Heuristic for K-Means Clustering.

    ERIC Educational Resources Information Center

    Brusco, Michael J.; Cradit, J. Dennis

    2001-01-01

Presents a variable-selection heuristic for nonhierarchical (K-means) cluster analysis based on the adjusted Rand index for measuring cluster recovery. Subjected the heuristic to Monte Carlo testing across more than 2,200 datasets. Results indicate that the heuristic is extremely effective at eliminating masking variables. (SLD)
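The adjusted Rand index used above to score cluster recovery can be computed directly from the contingency counts of two partitions. A minimal pure-Python version (the example partitions are illustrative):

```python
from collections import Counter
from math import comb

def adjusted_rand_index(labels_a, labels_b):
    """Adjusted Rand index between two partitions of the same items:
    1.0 for identical partitions (up to relabelling), ~0 for chance agreement."""
    n = len(labels_a)
    pair_counts = Counter(zip(labels_a, labels_b))
    sum_ij = sum(comb(c, 2) for c in pair_counts.values())
    sum_a = sum(comb(c, 2) for c in Counter(labels_a).values())
    sum_b = sum(comb(c, 2) for c in Counter(labels_b).values())
    expected = sum_a * sum_b / comb(n, 2)
    max_index = (sum_a + sum_b) / 2
    return (sum_ij - expected) / (max_index - expected)

truth = [0, 0, 0, 1, 1, 1]
recovered = [1, 1, 1, 0, 0, 0]   # same clustering, merely relabelled
print(adjusted_rand_index(truth, recovered))  # → 1.0
```

A variable-selection heuristic in this spirit would tentatively drop each candidate variable, re-cluster, and keep the deletion whenever this recovery score does not degrade.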

  11. SYSTEMATIZATION OF MASS LEVELS OF PARTICLES AND RESONANCES ON HEURISTIC BASIS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Takabayasi, T.

    1963-12-16

Once more a scheme of simple mass rules and formulas for particles and resonant levels is investigated and organized, based on some general hypotheses. The essential ingredients in the scheme are, on one hand, the equal-interval rule governing the isosinglet meson series, associated with a particularly simple mass ratio between the 2⁺⁺ level f and the 0⁺⁺ level ABC, and on the other a new basic mass formula that unifies some of the meson and baryon levels. The whole set of baryon levels is arranged in a table analogous to the periodic table; correspondences between different series and the equivalence between spin and hypercharge, when properly applied, then fix the whole baryon mass spectrum in good agreement with observations. Connections with the scheme of mass formulas formerly given are also shown.

  12. INDEXABILITY AND OPTIMAL INDEX POLICIES FOR A CLASS OF REINITIALISING RESTLESS BANDITS.

    PubMed

    Villar, Sofía S

    2016-01-01

Motivated by a class of Partially Observable Markov Decision Processes with application in surveillance systems, in which a set of imperfectly observed state processes is to be inferred from a subset of available observations through a Bayesian approach, we formulate and analyze a special family of multi-armed restless bandit problems. We consider the problem of finding an optimal policy for observing the processes that maximizes the total expected net rewards over an infinite time horizon subject to the resource availability. From the Lagrangian relaxation of the original problem, an index policy can be derived, as long as the existence of the Whittle index is ensured. We demonstrate that this class of reinitializing bandits, in which a project's state deteriorates while active and resets to its initial state when passive until its completion, possesses the structural property of indexability, and we further show how to compute the index in closed form. In general, the Whittle index rule for restless bandit problems does not achieve optimality. However, we show that the proposed Whittle index rule is optimal for the problem under study in the case of stochastically heterogeneous arms under the expected total criterion, and that it is recovered by a simple tractable rule referred to as the 1-limited Round Robin rule. Moreover, we illustrate the significant suboptimality of another widely used heuristic, the Myopic index rule, by computing its suboptimality gap in closed form. We present numerical studies which illustrate, for more general instances, the performance advantages of the Whittle index rule over other simple heuristics.
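The 1-limited Round Robin rule mentioned above activates the arms cyclically, giving each arm one activation per round until it completes. The sketch below assumes a deliberately simplified completion model, in which each arm needs a fixed known number of activations, rather than the paper's stochastic dynamics.

```python
from collections import deque

def one_limited_round_robin(jobs):
    """Cycle through the arms, one activation per arm per round, until all
    complete.  `jobs` maps arm name -> activations needed (an assumption of
    this sketch; the paper's arms evolve stochastically)."""
    queue = deque(jobs.items())
    schedule = []
    while queue:
        name, remaining = queue.popleft()
        schedule.append(name)          # activate this arm for one slot
        if remaining > 1:
            queue.append((name, remaining - 1))   # back of the cycle
    return schedule

print(one_limited_round_robin({"A": 2, "B": 1, "C": 3}))
```

Note how no arm receives a second activation before every unfinished arm has received its next one, which is the "1-limited" property.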

  13. INDEXABILITY AND OPTIMAL INDEX POLICIES FOR A CLASS OF REINITIALISING RESTLESS BANDITS

    PubMed Central

    Villar, Sofía S.

    2016-01-01

Motivated by a class of Partially Observable Markov Decision Processes with application in surveillance systems, in which a set of imperfectly observed state processes is to be inferred from a subset of available observations through a Bayesian approach, we formulate and analyze a special family of multi-armed restless bandit problems. We consider the problem of finding an optimal policy for observing the processes that maximizes the total expected net rewards over an infinite time horizon subject to the resource availability. From the Lagrangian relaxation of the original problem, an index policy can be derived, as long as the existence of the Whittle index is ensured. We demonstrate that this class of reinitializing bandits, in which a project's state deteriorates while active and resets to its initial state when passive until its completion, possesses the structural property of indexability, and we further show how to compute the index in closed form. In general, the Whittle index rule for restless bandit problems does not achieve optimality. However, we show that the proposed Whittle index rule is optimal for the problem under study in the case of stochastically heterogeneous arms under the expected total criterion, and that it is recovered by a simple tractable rule referred to as the 1-limited Round Robin rule. Moreover, we illustrate the significant suboptimality of another widely used heuristic, the Myopic index rule, by computing its suboptimality gap in closed form. We present numerical studies which illustrate, for more general instances, the performance advantages of the Whittle index rule over other simple heuristics. PMID:27212781

  14. Conflict and Bias in Heuristic Judgment

    ERIC Educational Resources Information Center

    Bhatia, Sudeep

    2017-01-01

    Conflict has been hypothesized to play a key role in recruiting deliberative processing in reasoning and judgment tasks. This claim suggests that changing the task so as to add incorrect heuristic responses that conflict with existing heuristic responses can make individuals less likely to respond heuristically and can increase response accuracy.…

  15. Ideology in Writing Instruction: Reconsidering Invention Heuristics.

    ERIC Educational Resources Information Center

    Byard, Vicki

    Modern writing textbooks tend to offer no heuristics, treat heuristics as if they do not have different impacts on inquiry, or take the view that heuristics are ideologically neutral pedagogies. Yet theory about language demonstrates that ideological neutrality is impossible. Any use of language in attempting to represent reality will inevitably…

  16. An Effective Exercise for Teaching Cognitive Heuristics

    ERIC Educational Resources Information Center

    Swinkels, Alan

    2003-01-01

    This article describes a brief heuristics demonstration and offers suggestions for personalizing examples of heuristics by making them relevant to students. Students complete a handout asking for 4 judgments illustrative of such heuristics. The decisions are cast in the context of students' daily lives at their particular university. After the…

  17. Stochastic Time Models of Syllable Structure

    PubMed Central

    Shaw, Jason A.; Gafos, Adamantios I.

    2015-01-01

    Drawing on phonology research within the generative linguistics tradition, stochastic methods, and notions from complex systems, we develop a modelling paradigm linking phonological structure, expressed in terms of syllables, to speech movement data acquired with 3D electromagnetic articulography and X-ray microbeam methods. The essential variable in the models is syllable structure. When mapped to discrete coordination topologies, syllabic organization imposes systematic patterns of variability on the temporal dynamics of speech articulation. We simulated these dynamics under different syllabic parses and evaluated simulations against experimental data from Arabic and English, two languages claimed to parse similar strings of segments into different syllabic structures. Model simulations replicated several key experimental results, including the fallibility of past phonetic heuristics for syllable structure, and exposed the range of conditions under which such heuristics remain valid. More importantly, the modelling approach consistently diagnosed syllable structure proving resilient to multiple sources of variability in experimental data including measurement variability, speaker variability, and contextual variability. Prospects for extensions of our modelling paradigm to acoustic data are also discussed. PMID:25996153

  18. On use of image quality metrics for perceptual blur modeling: image/video compression case

    NASA Astrophysics Data System (ADS)

    Cha, Jae H.; Olson, Jeffrey T.; Preece, Bradley L.; Espinola, Richard L.; Abbott, A. Lynn

    2018-02-01

Linear system theory is employed to make target acquisition performance predictions for electro-optical/infrared imaging systems where the modulation transfer function (MTF) may be imposed by a nonlinear degradation process. Previous research relying on image quality metrics (IQM) methods, which heuristically estimate the perceived MTF, has supported the claim that an average perceived MTF can be used to model some types of degradation, such as image compression. Here, we discuss the validity of the IQM approach by mathematically analyzing the associated heuristics from the perspective of reliability, robustness, and tractability. Experiments with standard images compressed by x264 encoding suggest that the compression degradation can be estimated by a perceived MTF within boundaries defined by well-behaved curves with marginal error. Our results confirm that the IQM linearizer methodology provides a credible tool for sensor performance modeling.

  19. Enriching mission planning approach with state transition graph heuristics for deep space exploration

    NASA Astrophysics Data System (ADS)

    Jin, Hao; Xu, Rui; Xu, Wenming; Cui, Pingyuan; Zhu, Shengying

    2017-10-01

To support China's Mars exploration mission, automated mission planning is required to enhance the security and robustness of the deep space probe. Deep space mission planning requires modeling of complex operational constraints and focuses on the temporal state transitions of the involved subsystems. State transitions are ubiquitous in physical systems, but have been elusive for knowledge description. We introduce a modeling approach that copes with these difficulties by taking state transitions into consideration. The key technique we build on is the notion of extended states and state transition graphs. Furthermore, a heuristic based on state transition graphs is proposed to avoid redundant work. Finally, we run comprehensive experiments on selected domains, and our techniques show excellent performance.

  20. Walking tree heuristics for biological string alignment, gene location, and phylogenies

    NASA Astrophysics Data System (ADS)

    Cull, P.; Holloway, J. L.; Cavener, J. D.

    1999-03-01

Basic biological information is stored in strings of nucleic acids (DNA, RNA) or amino acids (proteins). Teasing out the meaning of these strings is a central problem of modern biology. Matching and aligning strings brings out their shared characteristics. Although string matching is well-understood in the edit-distance model, biological strings with transpositions and inversions violate this model's assumptions. We propose a family of heuristics called walking trees to align biologically reasonable strings. Both edit-distance and walking tree methods can locate specific genes within a large string when the genes' sequences are given. When we attempt to match whole strings, the walking tree matches most genes, while the edit-distance method fails. We also give examples in which the walking tree matches substrings even if they have been moved or inverted. The edit-distance method was not designed to handle these problems. We include an example in which the walking tree "discovered" a gene. Calculating scores for whole genome matches gives a method for approximating evolutionary distance. We show two evolutionary trees for the picornaviruses which were computed by the walking tree heuristic. Both of these trees show great similarity to previously constructed trees. The point of this demonstration is that WHOLE genomes can be matched and distances calculated. The first tree was created on a Sequent parallel computer and demonstrates that the walking tree heuristic can be efficiently parallelized. The second tree was created using a network of workstations and demonstrates that there is sufficient parallelism in the phylogenetic tree calculation that the sequential walking tree can be used effectively on a network.
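The edit-distance model that the walking tree is contrasted with can be sketched as the classic Levenshtein recurrence. The toy "genes" below illustrate why a transposition defeats it: the distance is nonzero even though both strings contain exactly the same genes, merely in a different order.

```python
def edit_distance(s, t):
    """Classic Levenshtein distance (insert/delete/substitute, unit costs),
    computed row by row in O(len(s) * len(t))."""
    prev = list(range(len(t) + 1))
    for i, cs in enumerate(s, 1):
        curr = [i]
        for j, ct in enumerate(t, 1):
            curr.append(min(prev[j] + 1,          # delete from s
                            curr[j - 1] + 1,      # insert into s
                            prev[j - 1] + (cs != ct)))  # substitute/match
        prev = curr
    return prev[-1]

# Hypothetical "genes": same material, transposed order.
gene_a, gene_b = "ACGTAC", "TTTAAA"
genome = gene_a + gene_b
transposed = gene_b + gene_a

print(edit_distance(genome, genome))      # → 0
print(edit_distance(genome, transposed))  # nonzero despite identical content
```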

  1. Answer first: Applying the heuristic-analytic theory of reasoning to examine student intuitive thinking in the context of physics

    NASA Astrophysics Data System (ADS)

    Kryjevskaia, Mila; Stetzer, MacKenzie R.; Grosz, Nathaniel

    2014-12-01

    We have applied the heuristic-analytic theory of reasoning to interpret inconsistencies in student reasoning approaches to physics problems. This study was motivated by an emerging body of evidence that suggests that student conceptual and reasoning competence demonstrated on one task often fails to be exhibited on another. Indeed, even after instruction specifically designed to address student conceptual and reasoning difficulties identified by rigorous research, many undergraduate physics students fail to build reasoning chains from fundamental principles even though they possess the required knowledge and skills to do so. Instead, they often rely on a variety of intuitive reasoning strategies. In this study, we developed and employed a methodology that allowed for the disentanglement of student conceptual understanding and reasoning approaches through the use of sequences of related questions. We have shown that the heuristic-analytic theory of reasoning can be used to account for, in a mechanistic fashion, the observed inconsistencies in student responses. In particular, we found that students tended to apply their correct ideas in a selective manner that supported a specific and likely anticipated conclusion while neglecting to employ the same ideas to refute an erroneous intuitive conclusion. The observed reasoning patterns were consistent with the heuristic-analytic theory, according to which reasoners develop a "first-impression" mental model and then construct an argument in support of the answer suggested by this model. We discuss implications for instruction and argue that efforts to improve student metacognition, which serves to regulate the interaction between intuitive and analytical reasoning, is likely to lead to improved student reasoning.

  2. Runway Scheduling Using Generalized Dynamic Programming

    NASA Technical Reports Server (NTRS)

    Montoya, Justin; Wood, Zachary; Rathinam, Sivakumar

    2011-01-01

A generalized dynamic programming method for finding a set of Pareto optimal solutions for a runway scheduling problem is introduced. The algorithm generates a set of runway flight sequences that are optimal for both runway throughput and delay. Realistic time-based operational constraints are considered, including miles-in-trail separation, runway crossings, and wake vortex separation. The authors also model divergent runway takeoff operations to allow for reduced wake vortex separation. A modeled Dallas/Fort Worth International airport and three baseline heuristics are used to illustrate preliminary benefits of using the generalized dynamic programming method. Simulated traffic levels ranged from 10 aircraft to 30 aircraft with each test case spanning 15 minutes. The optimal solution shows a 40-70 percent decrease in the expected delay per aircraft over the baseline schedulers. Computational results suggest that the algorithm is promising for real-time application with an average computation time of 4.5 seconds. For even faster computation times, two heuristics are developed. As compared to the optimal, the heuristics are within 5% of the expected delay per aircraft and 1% of the expected number of runway operations per hour and can be 100x faster.

  3. A Two-Echelon Cooperated Routing Problem for a Ground Vehicle and Its Carried Unmanned Aerial Vehicle.

    PubMed

    Luo, Zhihao; Liu, Zhong; Shi, Jianmai

    2017-05-17

    In this paper, a two-echelon cooperated routing problem for the ground vehicle (GV) and its carried unmanned aerial vehicle (UAV) is investigated, where the GV travels on the road network and its UAV travels in areas beyond the road to visit a number of targets unreached by the GV. In contrast to the classical two-echelon routing problem, the UAV has to launch and land on the GV frequently to change or charge its battery while the GV is moving on the road network. A new 0-1 integer programming model is developed to formulate the problem, where the constraints on the spatial and temporal cooperation of GV and UAV routes are included. Two heuristics are proposed to solve the model: the first heuristic (H1) constructs a complete tour for all targets and splits it by GV routes, while the second heuristic (H2) constructs the GV tour and assigns UAV flights to it. Random instances with six different sizes (25-200 targets, 12-80 rendezvous nodes) are used to test the algorithms. Computational results show that H1 performs slightly better than H2, while H2 uses less time and is more stable.

  4. A heuristic for efficient data distribution management in distributed simulation

    NASA Astrophysics Data System (ADS)

    Gupta, Pankaj; Guha, Ratan K.

    2005-05-01

In this paper, we propose an algorithm for reducing the complexity of region matching and for efficient multicasting in the data distribution management component of the High Level Architecture (HLA) Run Time Infrastructure (RTI). Current data distribution management (DDM) techniques rely on computing the intersection between subscription and update regions. When a subscription region and an update region of different federates overlap, RTI establishes communication between the publisher and the subscriber. It subsequently routes the updates from the publisher to the subscriber. The proposed algorithm computes the update/subscription region matching for dynamic allocation of multicast groups. It provides new multicast routines that exploit the connectivity of the federation by communicating updates regarding interactions and routing information only to those federates that require them. The region-matching problem in DDM reduces to the clique-covering problem under the connection-graph abstraction, where the federates represent the vertices and the update/subscribe relations represent the edges. We develop an abstract model based on connection graphs for data distribution management. Using this abstract model, we propose a heuristic for solving the region-matching problem of DDM. We also provide a complexity analysis of the proposed heuristic.
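The core region-matching step above can be sketched as pairwise extent-overlap tests between update and subscription regions, with each publisher's overlapping subscribers forming a candidate multicast group. The federate names and 2-D extents below are illustrative assumptions, not HLA API calls.

```python
def overlaps(r1, r2):
    """Axis-aligned extents overlap iff they intersect on every dimension.
    Each region is a tuple of (lo, hi) extents, one per routing-space dimension."""
    return all(lo1 <= hi2 and lo2 <= hi1
               for (lo1, hi1), (lo2, hi2) in zip(r1, r2))

# Hypothetical 2-D routing-space regions: ((x_lo, x_hi), (y_lo, y_hi)).
updates = {"fed1": ((0, 5), (0, 5)), "fed2": ((10, 15), (10, 15))}
subscriptions = {"fed3": ((3, 8), (3, 8)), "fed4": ((20, 25), (0, 5))}

# One candidate multicast group per publisher: its overlapping subscribers.
groups = {pub: [sub for sub, s in subscriptions.items() if overlaps(u, s)]
          for pub, u in updates.items()}
print(groups)  # → {'fed1': ['fed3'], 'fed2': []}
```

The clique-covering view then merges publishers and subscribers whose overlap pattern coincides, so one multicast group can serve several region pairs.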

  5. Learning to improve iterative repair scheduling

    NASA Technical Reports Server (NTRS)

    Zweben, Monte; Davis, Eugene

    1992-01-01

    This paper presents a general learning method for dynamically selecting between repair heuristics in an iterative repair scheduling system. The system employs a version of explanation-based learning called Plausible Explanation-Based Learning (PEBL) that uses multiple examples to confirm conjectured explanations. The basic approach is to conjecture contradictions between a heuristic and statistics that measure the quality of the heuristic. When these contradictions are confirmed, a different heuristic is selected. To motivate the utility of this approach we present an empirical evaluation of the performance of a scheduling system with respect to two different repair strategies. We show that the scheduler that learns to choose between the heuristics outperforms the same scheduler with any one of two heuristics alone.

  6. Whatever the cost? Information integration in memory-based inferences depends on cognitive effort.

    PubMed

    Hilbig, Benjamin E; Michalkiewicz, Martha; Castela, Marta; Pohl, Rüdiger F; Erdfelder, Edgar

    2015-05-01

    One of the most prominent models of probabilistic inferences from memory is the simple recognition heuristic (RH). The RH theory assumes that judgments are based on recognition in isolation, such that other information is ignored. However, some prior research has shown that available knowledge is not generally ignored. In line with the notion of adaptive strategy selection--and, thus, a trade-off between accuracy and effort--we hypothesized that information integration crucially depends on how easily accessible information beyond recognition is, how much confidence decision makers have in this information, and how (cognitively) costly it is to acquire it. In three experiments, we thus manipulated (a) the availability of information beyond recognition, (b) the subjective usefulness of this information, and (c) the cognitive costs associated with acquiring this information. In line with the predictions, we found that RH use decreased substantially, the more easily and confidently information beyond recognition could be integrated, and increased substantially with increasing cognitive costs.
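The RH decision rule itself is simple to state: it applies only when exactly one of the two objects is recognized, and then infers that the recognized object scores higher on the criterion. A minimal sketch (the city set is illustrative):

```python
def recognition_heuristic(a, b, recognized):
    """If exactly one object is recognized, infer it scores higher on the
    criterion; otherwise the heuristic does not apply and another strategy
    (e.g. knowledge-based inference) must decide."""
    if (a in recognized) != (b in recognized):
        return a if a in recognized else b
    return None  # both or neither recognized: RH makes no prediction

recognized_cities = {"Berlin", "Munich"}
print(recognition_heuristic("Berlin", "Chemnitz", recognized_cities))  # → Berlin
print(recognition_heuristic("Berlin", "Munich", recognized_cities))    # → None
```

The experiments above concern exactly the `None`-adjacent boundary: when knowledge beyond recognition is cheap and trusted, decision makers integrate it instead of following the one-cue rule.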

  7. [Risk of contamination from exposure to Rio Doce water: a case study on the population's perceptions in Tumiritinga, Minas Gerais State, Brazil].

    PubMed

    Guedes, Gilvan Ramalho; Simão, Andréa Branco; Dias, Carlos Alberto; Braga, Eliza de Oliveira

    2015-06-01

The close relationship between local residents and the Rio Doce and the river's recurrent flooding lead to continuous exposure of the population to waterborne diseases. Given the epidemiological importance of such diseases in the region, this study analyzes the association between perceived risk of contamination and river water use, as well as the heuristic mechanisms individuals use to shape their personal perception of risk. Regression models coupled with thematic network analysis were applied to primary data from 352 households in 2012. The data are representative of urban residents of Tumiritinga, Minas Gerais State, Brazil. The results show that while 92.6% of respondents perceived a high risk of waterborne diseases, only 11.4% reported not making direct use of the river. This apparent paradox is explained by a lack of information on transmission mechanisms, which leads residents to underestimate the risk of contamination. Public campaigns to promote preventive behavior should stress how waterborne diseases are transmitted, using simple examples to reach a wider local audience.

  8. The distribution of stars most likely to harbor intelligent life.

    PubMed

    Whitmire, Daniel P; Matese, John J

    2009-09-01

    Simple heuristic models and recent numerical simulations show that the probability of habitable planet formation increases with stellar mass. We combine those results with the distribution of main-sequence stellar masses to obtain the distribution of stars most likely to possess habitable planets as a function of stellar lifetime. We then impose the self-selection condition that intelligent observers can only find themselves around a star with a lifetime greater than the time required for that observer to have evolved, T(i). This allows us to obtain the stellar timescale number distribution for a given value of T(i). Our results show that for habitable planets with a civilization that evolved at time T(i) = 4.5 Gyr the median stellar lifetime is 13 Gyr, corresponding approximately to a stellar type of G5, with two-thirds of the stars having lifetimes between 7 and 30 Gyr, corresponding approximately to spectral types G0-K5. For other values of T(i) the median stellar lifetime changes by less than 50%.

  9. Automated problem scheduling and reduction of synchronization delay effects

    NASA Technical Reports Server (NTRS)

    Saltz, Joel H.

    1987-01-01

    It is anticipated that in order to make effective use of many future high performance architectures, programs will have to exhibit at least a medium grained parallelism. A framework is presented for partitioning very sparse triangular systems of linear equations that is designed to produce favorable performance results in a wide variety of parallel architectures. Efficient methods for solving these systems are of interest because: (1) they provide a useful model problem for exploring heuristics for the aggregation, mapping and scheduling of relatively fine grained computations whose data dependencies are specified by directed acyclic graphs, and (2) such efficient methods can find direct application in the development of parallel algorithms for scientific computation. Simple expressions are derived that describe how to schedule computational work with varying degrees of granularity. The Encore Multimax was used as a hardware simulator to investigate the performance effects of using the partitioning techniques presented in shared memory architectures with varying relative synchronization costs.

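Record 9 partitions a sparse triangular solve whose dependencies form a directed acyclic graph. The standard starting point for such partitionings is level scheduling: rows whose below-diagonal dependencies are all satisfied form one "level" and can be solved in parallel. The sketch below shows only that generic idea, not the paper's aggregation scheme; the `rows` dependency-set representation is an assumption for illustration.

```python
def levels(rows):
    """Group the rows of a sparse lower-triangular system into parallel levels.

    rows: dict mapping row index -> set of row indices it depends on
    (i.e., columns of its strictly lower-triangular nonzeros).
    Assumes rows are numbered in elimination order (deps have lower indices).
    """
    level = {}
    for i in sorted(rows):
        # A row's level is one more than the deepest row it depends on.
        level[i] = 1 + max((level[j] for j in rows[i]), default=0)
    grouped = {}
    for i, l in level.items():
        grouped.setdefault(l, []).append(i)
    # Rows within one level have no mutual dependencies and can run in parallel.
    return [grouped[l] for l in sorted(grouped)]
```

Coarser-grained variants (as studied in the record above) then merge rows across levels into larger tasks to amortize synchronization costs.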
  10. Time-optimal spinup maneuvers of flexible spacecraft

    NASA Technical Reports Server (NTRS)

    Singh, G.; Kabamba, P. T.; Mcclamroch, N. H.

    1990-01-01

    Attitude controllers for spacecraft have been based on the assumption that the bodies being controlled are rigid. Future spacecraft, however, may be quite flexible. Many applications require spinning up/down these vehicles. In this work the minimum time control of these maneuvers is considered. The time-optimal control is shown to possess an important symmetry property. Taking advantage of this property, the necessary and sufficient conditions for optimality are transformed into a system of nonlinear algebraic equations in the control switching times during one half of the maneuver, the maneuver time, and the costates at the mid-maneuver time. These equations can be solved using a homotopy approach. Control spillover measures are introduced and upper bounds on these measures are obtained. For a special case these upper bounds can be expressed in closed form for an infinite dimensional evaluation model. Rotational stiffening effects are ignored in the optimal control analysis. Based on a heuristic argument a simple condition is given which justifies the omission of these nonlinear effects. This condition is validated by numerical simulation.

  11. The probabilistic nature of preferential choice.

    PubMed

    Rieskamp, Jörg

    2008-11-01

    Previous research has developed a variety of theories explaining when and why people's decisions under risk deviate from the standard economic view of expected utility maximization. These theories are limited in their predictive accuracy in that they do not explain the probabilistic nature of preferential choice, that is, why an individual makes different choices in nearly identical situations, or why the magnitude of these inconsistencies varies in different situations. To illustrate the advantage of probabilistic theories, three probabilistic theories of decision making under risk are compared with their deterministic counterparts. The probabilistic theories are (a) a probabilistic version of a simple choice heuristic, (b) a probabilistic version of cumulative prospect theory, and (c) decision field theory. By testing the theories with the data from three experimental studies, the superiority of the probabilistic models over their deterministic counterparts in predicting people's decisions under risk becomes evident. When testing the probabilistic theories against each other, decision field theory provides the best account of the observed behavior.

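Record 11 turns deterministic theories into probabilistic ones. A common way to do this (a generic sketch, not necessarily the specific probabilistic versions tested in the record) is a logit/softmax choice rule: instead of always choosing the higher-valued option, the probability of choosing it grows smoothly with the value difference, governed by a sensitivity parameter. The function name and parameter below are illustrative.

```python
import math

def choice_prob(u_a, u_b, sensitivity=1.0):
    """Logit choice rule: probability of choosing option A over option B.

    u_a, u_b: the options' subjective values under some deterministic theory.
    sensitivity: higher values make choice more deterministic; 0 means
    random choice regardless of the values.
    """
    return 1.0 / (1.0 + math.exp(-sensitivity * (u_a - u_b)))
```

With equal values the rule predicts a 50/50 split, which is exactly the kind of choice inconsistency in near-identical situations that deterministic theories cannot express.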
  12. Investigating the Impacts of Design Heuristics on Idea Initiation and Development

    ERIC Educational Resources Information Center

    Kramer, Julia; Daly, Shanna R.; Yilmaz, Seda; Seifert, Colleen M.; Gonzalez, Richard

    2015-01-01

    This paper presents an analysis of engineering students' use of Design Heuristics as part of a team project in an undergraduate engineering design course. Design Heuristics are an empirically derived set of cognitive "rules of thumb" for use in concept generation. We investigated heuristic use in the initial concept generation phase,…

  13. Heuristics Made Easy: An Effort-Reduction Framework

    ERIC Educational Resources Information Center

    Shah, Anuj K.; Oppenheimer, Daniel M.

    2008-01-01

    In this article, the authors propose a new framework for understanding and studying heuristics. The authors posit that heuristics primarily serve the purpose of reducing the effort associated with a task. As such, the authors propose that heuristics can be classified according to a small set of effort-reduction principles. The authors use this…

  14. Heuristic Diagrams as a Tool to Teach History of Science

    ERIC Educational Resources Information Center

    Chamizo, Jose A.

    2012-01-01

    The graphic organizer here called the heuristic diagram, an improvement on Gowin's Vee heuristic, is proposed as a tool to teach history of science. Heuristic diagrams have the purpose of helping students (or teachers, or researchers) to understand their own research, considering that asking questions and problem-solving are central to scientific activity. The…

  15. Automated detection of heuristics and biases among pathologists in a computer-based system.

    PubMed

    Crowley, Rebecca S; Legowski, Elizabeth; Medvedeva, Olga; Reitmeyer, Kayse; Tseytlin, Eugene; Castine, Melissa; Jukic, Drazen; Mello-Thoms, Claudia

    2013-08-01

    The purpose of this study is threefold: (1) to develop an automated, computer-based method to detect heuristics and biases as pathologists examine virtual slide cases, (2) to measure the frequency and distribution of heuristics and errors across three levels of training, and (3) to examine relationships of heuristics to biases, and biases to diagnostic errors. The authors conducted the study using a computer-based system to view and diagnose virtual slide cases. The software recorded participant responses throughout the diagnostic process, and automatically classified participant actions based on definitions of eight common heuristics and/or biases. The authors measured frequency of heuristic use and bias across three levels of training. Biases studied were detected at varying frequencies, with availability and search satisficing observed most frequently. There were few significant differences by level of training. For representativeness and anchoring, the heuristic was used appropriately as often or more often than it was used in biased judgment. Approximately half of the diagnostic errors were associated with one or more biases. We conclude that heuristic use and biases were observed among physicians at all levels of training using the virtual slide system, although their frequencies varied. The system can be employed to detect heuristic use and to test methods for decreasing diagnostic errors resulting from cognitive biases.

  16. Model Specification Searches Using Ant Colony Optimization Algorithms

    ERIC Educational Resources Information Center

    Marcoulides, George A.; Drezner, Zvi

    2003-01-01

    Ant colony optimization is a recently proposed heuristic procedure inspired by the behavior of real ants. This article applies the procedure to model specification searches in structural equation modeling and reports the results. The results demonstrate the capabilities of ant colony optimization algorithms for conducting automated searches.

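Record 16 applies ant colony optimization to model specification search. As a toy sketch of the general ACO idea only (not the authors' SEM procedure), the search below treats a specification as a binary include/exclude vector: ants sample specifications according to per-item pheromone levels, and pheromone evaporates and is reinforced toward the best specification found so far. All names and parameters (`aco_search`, `rho`, the scoring interface) are illustrative assumptions.

```python
import random

def aco_search(score, n_items, n_ants=20, n_iters=50, rho=0.1, seed=0):
    """Toy ant colony optimization over binary include/exclude decisions.

    score: callable mapping a 0/1 tuple of length n_items to a number
    (higher is better, e.g. a model fit index).
    rho: evaporation rate; tau[i] is the propensity to include item i.
    """
    rng = random.Random(seed)
    tau = [0.5] * n_items
    best, best_s = None, float("-inf")
    for _ in range(n_iters):
        for _ in range(n_ants):
            # Each ant samples a candidate specification from the pheromones.
            sol = tuple(1 if rng.random() < tau[i] else 0 for i in range(n_items))
            s = score(sol)
            if s > best_s:
                best, best_s = sol, s
        # Evaporate pheromone and reinforce the best-so-far specification.
        tau = [(1 - rho) * t + rho * b for t, b in zip(tau, best)]
    return best, best_s
```

Real ACO specification searches replace the toy scoring function with an actual model-fit evaluation, which is where nearly all of the computational cost lies.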
  17. Cognitive Abilities, Monitoring Confidence, and Control Thresholds Explain Individual Differences in Heuristics and Biases

    PubMed Central

    Jackson, Simon A.; Kleitman, Sabina; Howie, Pauline; Stankov, Lazar

    2016-01-01

    In this paper, we investigate whether individual differences in performance on heuristic and biases tasks can be explained by cognitive abilities, monitoring confidence, and control thresholds. Current theories explain individual differences in these tasks by the ability to detect errors and override automatic but biased judgments, and deliberative cognitive abilities that help to construct the correct response. Here we retain cognitive abilities but disentangle error detection, proposing that lower monitoring confidence and higher control thresholds promote error checking. Participants (N = 250) completed tasks assessing their fluid reasoning abilities, stable monitoring confidence levels, and the control threshold they impose on their decisions. They also completed seven typical heuristic and biases tasks such as the cognitive reflection test and Resistance to Framing. Using structural equation modeling, we found that individuals with higher reasoning abilities, lower monitoring confidence, and higher control threshold performed significantly and, at times, substantially better on the heuristic and biases tasks. Individuals with higher control thresholds also showed lower preferences for risky alternatives in a gambling task. Furthermore, residual correlations among the heuristic and biases tasks were reduced to null, indicating that cognitive abilities, monitoring confidence, and control thresholds accounted for their shared variance. Implications include the proposal that the capacity to detect errors does not differ between individuals. Rather, individuals might adopt varied strategies that promote error checking to different degrees, regardless of whether they have made a mistake or not. The results support growing evidence that decision-making involves cognitive abilities that construct actions and monitoring and control processes that manage their initiation. PMID:27790170

  18. Cognitive Abilities, Monitoring Confidence, and Control Thresholds Explain Individual Differences in Heuristics and Biases.

    PubMed

    Jackson, Simon A; Kleitman, Sabina; Howie, Pauline; Stankov, Lazar

    2016-01-01

    In this paper, we investigate whether individual differences in performance on heuristic and biases tasks can be explained by cognitive abilities, monitoring confidence, and control thresholds. Current theories explain individual differences in these tasks by the ability to detect errors and override automatic but biased judgments, and deliberative cognitive abilities that help to construct the correct response. Here we retain cognitive abilities but disentangle error detection, proposing that lower monitoring confidence and higher control thresholds promote error checking. Participants (N = 250) completed tasks assessing their fluid reasoning abilities, stable monitoring confidence levels, and the control threshold they impose on their decisions. They also completed seven typical heuristic and biases tasks such as the cognitive reflection test and Resistance to Framing. Using structural equation modeling, we found that individuals with higher reasoning abilities, lower monitoring confidence, and higher control threshold performed significantly and, at times, substantially better on the heuristic and biases tasks. Individuals with higher control thresholds also showed lower preferences for risky alternatives in a gambling task. Furthermore, residual correlations among the heuristic and biases tasks were reduced to null, indicating that cognitive abilities, monitoring confidence, and control thresholds accounted for their shared variance. Implications include the proposal that the capacity to detect errors does not differ between individuals. Rather, individuals might adopt varied strategies that promote error checking to different degrees, regardless of whether they have made a mistake or not. The results support growing evidence that decision-making involves cognitive abilities that construct actions and monitoring and control processes that manage their initiation.

  19. More than one way to see it: Individual heuristics in avian visual computation

    PubMed Central

    Ravignani, Andrea; Westphal-Fitch, Gesche; Aust, Ulrike; Schlumpp, Martin M.; Fitch, W. Tecumseh

    2015-01-01

    Comparative pattern learning experiments investigate how different species find regularities in sensory input, providing insights into cognitive processing in humans and other animals. Past research has focused either on one species’ ability to process pattern classes or different species’ performance in recognizing the same pattern, with little attention to individual and species-specific heuristics and decision strategies. We trained and tested two bird species, pigeons (Columba livia) and kea (Nestor notabilis, a parrot species), on visual patterns using touch-screen technology. Patterns were composed of several abstract elements and had varying degrees of structural complexity. We developed a model selection paradigm, based on regular expressions, that allowed us to reconstruct the specific decision strategies and cognitive heuristics adopted by a given individual in our task. Individual birds showed considerable differences in the number, type and heterogeneity of heuristic strategies adopted. Birds’ choices also exhibited consistent species-level differences. Kea adopted effective heuristic strategies, based on matching learned bigrams to stimulus edges. Individual pigeons, in contrast, adopted an idiosyncratic mix of strategies that included local transition probabilities and global string similarity. Although performance was above chance and quite high for kea, no individual of either species provided clear evidence of learning exactly the rule used to generate the training stimuli. Our results show that similar behavioral outcomes can be achieved using dramatically different strategies and highlight the dangers of combining multiple individuals in a group analysis. These findings, and our general approach, have implications for the design of future pattern learning experiments, and the interpretation of comparative cognition research more generally. PMID:26113444

  20. PathEdEx – Uncovering High-explanatory Visual Diagnostics Heuristics Using Digital Pathology and Multiscale Gaze Data

    PubMed Central

    Shin, Dmitriy; Kovalenko, Mikhail; Ersoy, Ilker; Li, Yu; Doll, Donald; Shyu, Chi-Ren; Hammer, Richard

    2017-01-01

    Background: Visual heuristics of pathology diagnosis is a largely unexplored area where reported studies have provided only qualitative insight into the subject. Uncovering and quantifying pathology visual and nonvisual diagnostic patterns have great potential to improve clinical outcomes and avoid diagnostic pitfalls. Methods: Here, we present PathEdEx, an informatics computational framework that incorporates whole-slide digital pathology imaging with multiscale gaze-tracking technology to create web-based interactive pathology educational atlases and to datamine visual and nonvisual diagnostic heuristics. Results: We demonstrate the capabilities of PathEdEx for mining visual and nonvisual diagnostic heuristics using the first PathEdEx volume of a hematopathology atlas. We conducted a quantitative study on the time dynamics of zooming and panning operations utilized by experts and novices to come to the correct diagnosis. We then performed association rule mining to determine sets of diagnostic factors that consistently result in a correct diagnosis, and studied differences in diagnostic strategies across different levels of pathology expertise using Markov chain (MC) modeling and MC Monte Carlo simulations. To perform these studies, we translated raw gaze points to high-explanatory semantic labels that represent pathology diagnostic clues. Therefore, the outcome of these studies is readily transformed into narrative descriptors for direct use in pathology education and practice. Conclusion: The PathEdEx framework can be used to capture best practices of pathology visual and nonvisual diagnostic heuristics that can be passed on to the next generation of pathologists, and it has the potential to streamline implementation of precision diagnostics in precision medicine settings. PMID:28828200

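Record 20 models diagnostic strategies as Markov chains over semantic gaze labels. The first step in such modeling is estimating a transition matrix from an observed label sequence; the sketch below shows that standard estimation step only, with hypothetical labels, not the PathEdEx pipeline itself.

```python
def transition_matrix(seq):
    """Estimate a first-order Markov chain from a sequence of labels.

    seq: list of semantic labels (e.g. gaze-derived diagnostic clues).
    Returns (states, probs) where probs[i][j] is the estimated probability
    of moving from states[i] to states[j] (row-stochastic counts).
    """
    states = sorted(set(seq))
    idx = {s: i for i, s in enumerate(states)}
    n = len(states)
    counts = [[0] * n for _ in range(n)]
    for a, b in zip(seq, seq[1:]):
        counts[idx[a]][idx[b]] += 1
    probs = []
    for row in counts:
        total = sum(row)
        probs.append([c / total if total else 0.0 for c in row])
    return states, probs
```

Comparing the estimated matrices of experts and novices is then a natural way to quantify differences in diagnostic strategy, as the record describes.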