Science.gov

Sample records for heuristic search method

  1. Theoretical Analysis of Heuristic Search Methods for Online POMDPs.

    PubMed

    Ross, Stéphane; Pineau, Joelle; Chaib-Draa, Brahim

    2008-01-01

    Planning in partially observable environments remains a challenging problem, despite significant recent advances in offline approximation techniques. A few online methods have also been proposed recently, and proven to be remarkably scalable, but without the theoretical guarantees of their offline counterparts. Thus it seems natural to try to unify offline and online techniques, preserving the theoretical properties of the former, and exploiting the scalability of the latter. In this paper, we provide theoretical guarantees on an anytime algorithm for POMDPs which aims to reduce the error made by approximate offline value iteration algorithms through the use of an efficient online searching procedure. The algorithm uses search heuristics based on an error analysis of lookahead search to guide the online search towards reachable beliefs with the most potential to reduce error. We provide a general theorem showing that these search heuristics are admissible, and lead to complete and ε-optimal algorithms. This is, to the best of our knowledge, the strongest theoretical result available for online POMDP solution methods. We also provide empirical evidence showing that our approach is practical, and can find (provably) near-optimal solutions in reasonable time.
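
    As a companion to this abstract, the following is a minimal, illustrative sketch (not the authors' algorithm) of anytime, depth-limited lookahead from the current belief of a toy POMDP: crude offline lower and upper bounds stand in for approximate value iteration at frontier beliefs, and deepening the search tightens the root bound gap, which bounds the error in the spirit of the error-driven view above. All model data is invented.

```python
GAMMA = 0.95
S, A, O = [0, 1], [0, 1], [0, 1]
T = [[[0.9, 0.1], [0.2, 0.8]],        # T[s][a][s']: transition probabilities
     [[0.5, 0.5], [0.1, 0.9]]]
Z = [[[0.8, 0.2], [0.3, 0.7]],        # Z[a][s'][o]: observation probabilities
     [[0.6, 0.4], [0.1, 0.9]]]
R = [[1.0, 0.0], [0.0, 2.0]]          # R[s][a]: immediate rewards
L_BOUND = 0.0                         # trivial offline lower bound
U_BOUND = max(max(r) for r in R) / (1 - GAMMA)  # trivial offline upper bound

def belief_update(b, a, o):
    """Bayes filter; returns the next belief and P(o | b, a)."""
    nb = [Z[a][sp][o] * sum(T[s][a][sp] * b[s] for s in S) for sp in S]
    p_o = sum(nb)
    return ([x / p_o for x in nb], p_o) if p_o > 0 else (None, 0.0)

def lookahead(b, depth):
    """Return (lower, upper) bounds on the optimal value at belief b."""
    if depth == 0:
        return L_BOUND, U_BOUND
    best_lo = best_up = float("-inf")
    for a in A:
        lo = up = sum(R[s][a] * b[s] for s in S)
        for o in O:
            nb, p_o = belief_update(b, a, o)
            if nb is None:
                continue
            c_lo, c_up = lookahead(nb, depth - 1)
            lo += GAMMA * p_o * c_lo
            up += GAMMA * p_o * c_up
        best_lo, best_up = max(best_lo, lo), max(best_up, up)
    return best_lo, best_up

# Anytime use: keep deepening while time remains; the gap bounds the error.
for d in range(1, 4):
    lo, up = lookahead([0.5, 0.5], d)
    print(f"depth {d}: value in [{lo:.3f}, {up:.3f}], error <= {up - lo:.3f}")
```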

  2. Search and heuristics

    SciTech Connect

    Pearl, J.

    1983-01-01

    This work comprises articles representative of current research on search and heuristics. The general theme is the quest for understanding the workings of heuristic knowledge: how it is acquired, stored and used by people, how it can be represented and utilized by machines, and what makes one heuristic succeed where others fail. Topics covered include the following: search and reasoning in problem solving; theory formation by heuristic search; the nature of heuristics II: background and examples; Eurisko: a program that learns new heuristics and domain concepts; the nature of heuristics III: program design and results; searching for an optimal path in a tree with random costs; search rearrangement backtracking and polynomial average time; consistent-labeling problems and their algorithms: expected-complexities and theory-based heuristics; a general branch-and-bound formulation for understanding and synthesizing AND/OR tree search procedures; a minimax algorithm better than alpha-beta? yes and no; and pathology on game trees revisited, and an alternative to minimaxing.

  3. Learning to Search: From Weak Methods to Domain-Specific Heuristics.

    ERIC Educational Resources Information Center

    Langley, Pat

    1985-01-01

    Examines processes by which general but weak search methods are transformed into powerful, domain-specific search strategies by classifying types of heuristics learning that can occur and components that contribute to such learning. A learning system--SAGE.2--and its structure, behavior in different domains, and future directions are explored. (36…

  4. Heuristic method for searches on large data-sets organised using network models

    NASA Astrophysics Data System (ADS)

    Ruiz-Fernández, D.; Quintana-Pacheco, Y.

    2016-05-01

    Searches on large data-sets have become an important issue in recent years. An alternative which has achieved good results is the use of methods relying on data mining techniques, such as cluster-based retrieval. This paper proposes a heuristic search that is based on an organisational model that reflects similarity relationships among data elements. The search is guided by quality estimators of model nodes, which are obtained by the progressive evaluation of the given target function for the elements associated with each node. The results of the experiments confirm the effectiveness of the proposed algorithm: high-quality solutions are obtained by evaluating a relatively small percentage of elements in the data-sets.

  5. Properties of heuristic search strategies

    NASA Technical Reports Server (NTRS)

    Vanderbrug, G. J.

    1973-01-01

    A directed graph is used to model the search space of a state space representation with single-input operators, an AND/OR graph is used for problem reduction representations, and a theorem-proving graph is used for state space representations with multiple-input operators. These three graph models and heuristic strategies for searching them are surveyed. The completeness, admissibility, and optimality properties of search strategies which use the evaluation function f = (1 - ω)g + ωh are presented and interpreted using a representation of the search process in the plane. The use of multiple-output operators to imply dependent successors, and thus obtain a formalism which includes all three types of representations, is discussed.
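
    The evaluation function above can be made concrete with a short sketch. The grid world, cost model and Manhattan heuristic below are assumptions for illustration; the point is how the weight w (the ω above) interpolates between uniform-cost search (w = 0), A*-like ordering (w = 0.5) and greedy best-first search (w = 1).

```python
import heapq

def weighted_best_first(start, goal, neighbors, h, w=0.5):
    """Best-first search ordered by f = (1 - w) * g + w * h.
    w = 0   orders by g alone (uniform-cost search);
    w = 0.5 gives the same ordering as A* (f proportional to g + h);
    w = 1   orders by h alone (greedy best-first search)."""
    frontier = [(w * h(start), 0, start, [start])]
    best_g = {start: 0}
    while frontier:
        f, g, node, path = heapq.heappop(frontier)
        if node == goal:
            return path, g
        for succ, cost in neighbors(node):
            g2 = g + cost
            if g2 < best_g.get(succ, float("inf")):
                best_g[succ] = g2
                f2 = (1 - w) * g2 + w * h(succ)
                heapq.heappush(frontier, (f2, g2, succ, path + [succ]))
    return None, float("inf")

# Toy 4x4 grid (assumed for illustration): unit moves, Manhattan heuristic.
GOAL = (3, 3)

def nbrs(p):
    x, y = p
    for q in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
        if 0 <= q[0] <= 3 and 0 <= q[1] <= 3:
            yield q, 1

def manhattan(p):
    return abs(p[0] - GOAL[0]) + abs(p[1] - GOAL[1])

for w in (0.0, 0.5, 1.0):
    path, cost = weighted_best_first((0, 0), GOAL, nbrs, manhattan, w=w)
    print(w, cost, len(path))
```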

  6. Learning to Search. From Weak Methods to Domain-Specific Heuristics.

    DTIC Science & Technology

    1984-09-01

    Proceedings of the European Conference on Artificial Intelligence, 1982, 151-157. Cahn, A. A Puzzle with a Goal Recursive Strategy: The Mattress...states. 1. Introduction "The ability to search is central to intelligence, and the ability to direct search down profitable paths is...interest for Artificial Intelligence. In this paper, we examine the process by which general but weak methods are transformed into powerful, domain

  7. A Heuristic Distributed Task Allocation Method for Multivehicle Multitask Problems and Its Application to Search and Rescue Scenario.

    PubMed

    Zhao, Wanqing; Meng, Qinggang; Chung, Paul W H

    2016-04-01

    Using distributed task allocation methods for cooperating multivehicle systems is becoming increasingly attractive. However, most effort has been placed on specific experimental work, and little has been done to systematically analyze the problem of interest and the existing methods. In this paper, a general scenario description and a system configuration are first presented according to a search and rescue scenario. The objective of the problem is then analyzed together with its mathematical formulation extracted from the scenario. Considering the requirement of distributed computing, this paper then proposes a novel heuristic distributed task allocation method for multivehicle multitask assignment problems. The proposed method is simple and effective. It directly aims at optimizing the mathematical objective defined for the problem. A new concept of significance is defined for every task and is measured by the contribution to the local cost generated by a vehicle, which underlies the key idea of the algorithm. The whole algorithm iterates between a task inclusion phase, and a consensus and task removal phase, running concurrently on all the vehicles, with local communication between them. The former phase is used to include tasks in a vehicle's task list to optimize the overall objective, while the latter is used to reach consensus on the significance value of tasks for each vehicle and to remove tasks that have been assigned to other vehicles. Numerical simulations demonstrate that the proposed method provides a conflict-free solution and achieves outstanding performance in comparison with the consensus-based bundle algorithm.
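
    A hedged sketch of the task inclusion phase as described: a task's significance for a vehicle is approximated by its marginal contribution to the vehicle's local route cost, and tasks are included greedily at their cheapest insertion slot. The Euclidean cost model, capacity limit and function names are illustrative assumptions, not the paper's code; the consensus and removal phase is omitted.

```python
import math

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def route_cost(depot, route):
    pts = [depot] + route
    return sum(dist(pts[i], pts[i + 1]) for i in range(len(pts) - 1))

def include_tasks(depot, tasks, max_tasks):
    """Greedily build one vehicle's task list, lowest marginal cost first."""
    route, remaining = [], list(tasks)
    while remaining and len(route) < max_tasks:
        base = route_cost(depot, route)
        best = None                          # (marginal_cost, task, position)
        for t in remaining:
            for pos in range(len(route) + 1):
                trial = route[:pos] + [t] + route[pos:]
                marginal = route_cost(depot, trial) - base
                if best is None or marginal < best[0]:
                    best = (marginal, t, pos)
        _, task, pos = best
        route.insert(pos, task)
        remaining.remove(task)
    return route

print(include_tasks((0, 0), [(2, 1), (5, 5), (1, 4), (6, 2)], max_tasks=3))
```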

  8. BCI Control of Heuristic Search Algorithms

    PubMed Central

    Cavazza, Marc; Aranyi, Gabor; Charles, Fred

    2017-01-01

    The ability to develop Brain-Computer Interfaces (BCI) to Intelligent Systems would offer new perspectives in terms of human supervision of complex Artificial Intelligence (AI) systems, as well as supporting new types of applications. In this article, we introduce a basic mechanism for the control of heuristic search through fNIRS-based BCI. The rationale is that heuristic search is not only a basic AI mechanism but also one still at the heart of many different AI systems. We investigate how users’ mental disposition can be harnessed to influence the performance of heuristic search algorithm through a mechanism of precision-complexity exchange. From a system perspective, we use weighted variants of the A* algorithm which have an ability to provide faster, albeit suboptimal solutions. We use recent results in affective BCI to capture a BCI signal, which is indicative of a compatible mental disposition in the user. It has been established that Prefrontal Cortex (PFC) asymmetry is strongly correlated to motivational dispositions and results anticipation, such as approach or even risk-taking, and that this asymmetry is amenable to Neurofeedback (NF) control. Since PFC asymmetry is accessible through fNIRS, we designed a BCI paradigm in which users vary their PFC asymmetry through NF during heuristic search tasks, resulting in faster solutions. This is achieved through mapping the PFC asymmetry value onto the dynamic weighting parameter of the weighted A* (WA*) algorithm. We illustrate this approach through two different experiments, one based on solving 8-puzzle configurations, and the other on path planning. In both experiments, subjects were able to speed up the computation of a solution through a reduction of search space in WA*. Our results establish the ability of subjects to intervene in heuristic search progression, with effects which are commensurate to their control of PFC asymmetry: this opens the way to new mechanisms for the implementation of hybrid

  9. BCI Control of Heuristic Search Algorithms.

    PubMed

    Cavazza, Marc; Aranyi, Gabor; Charles, Fred

    2017-01-01

    The ability to develop Brain-Computer Interfaces (BCI) to Intelligent Systems would offer new perspectives in terms of human supervision of complex Artificial Intelligence (AI) systems, as well as supporting new types of applications. In this article, we introduce a basic mechanism for the control of heuristic search through fNIRS-based BCI. The rationale is that heuristic search is not only a basic AI mechanism but also one still at the heart of many different AI systems. We investigate how users' mental disposition can be harnessed to influence the performance of heuristic search algorithm through a mechanism of precision-complexity exchange. From a system perspective, we use weighted variants of the A* algorithm which have an ability to provide faster, albeit suboptimal solutions. We use recent results in affective BCI to capture a BCI signal, which is indicative of a compatible mental disposition in the user. It has been established that Prefrontal Cortex (PFC) asymmetry is strongly correlated to motivational dispositions and results anticipation, such as approach or even risk-taking, and that this asymmetry is amenable to Neurofeedback (NF) control. Since PFC asymmetry is accessible through fNIRS, we designed a BCI paradigm in which users vary their PFC asymmetry through NF during heuristic search tasks, resulting in faster solutions. This is achieved through mapping the PFC asymmetry value onto the dynamic weighting parameter of the weighted A* (WA*) algorithm. We illustrate this approach through two different experiments, one based on solving 8-puzzle configurations, and the other on path planning. In both experiments, subjects were able to speed up the computation of a solution through a reduction of search space in WA*. Our results establish the ability of subjects to intervene in heuristic search progression, with effects which are commensurate to their control of PFC asymmetry: this opens the way to new mechanisms for the implementation of hybrid
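
    The precision-complexity exchange described in both records above can be sketched as follows. The normalization range, linear mapping and weight bounds are hypothetical; the study only establishes that a PFC-asymmetry signal modulates the WA* weighting parameter, with f = g + w·h and larger w giving faster but possibly suboptimal search.

```python
# Hypothetical sketch: a neurofeedback signal (PFC asymmetry, assumed
# normalized to [-1, 1]) is mapped onto the weight of weighted A*,
# f = g + w * h. The mapping and bounds are illustrative, not the study's.

W_MIN, W_MAX = 1.0, 3.0  # w = 1 is plain A*; w > 1 trades optimality for speed

def asymmetry_to_weight(asym, w_min=W_MIN, w_max=W_MAX):
    """Linearly map an asymmetry score in [-1, 1] to a WA* weight."""
    asym = max(-1.0, min(1.0, asym))
    return w_min + (asym + 1.0) / 2.0 * (w_max - w_min)

def wa_star_f(g, h, asym):
    """Dynamic WA* evaluation, re-weighted as new asymmetry samples arrive."""
    return g + asymmetry_to_weight(asym) * h

for a in (-1.0, 0.0, 0.8):
    print(a, asymmetry_to_weight(a), wa_star_f(g=10, h=5, asym=a))
```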

  10. A Graph Search Heuristic for Shortest Distance Paths

    SciTech Connect

    Chow, E

    2005-03-24

    This paper presents a heuristic for guiding A* search for finding the shortest distance path between two vertices in a connected, undirected, and explicitly stored graph. The heuristic requires a small amount of data to be stored at each vertex. The heuristic has application to quickly detecting relationships between two vertices in a large information or knowledge network. We compare the performance of this heuristic with breadth-first search on graphs with various topological properties. The results show that one or more orders of magnitude improvement in the number of vertices expanded is possible for large graphs, including Poisson random graphs.
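
    A sketch in the spirit of this abstract, though not necessarily the paper's exact heuristic: precompute shortest-path distances from a few landmark vertices (a small amount of data stored per vertex) and use the triangle-inequality bound h(v) = max_L |d(L,v) - d(L,t)|, which is admissible on undirected graphs, to guide A* so that it expands far fewer vertices than breadth-first search.

```python
import heapq
from collections import deque

def bfs_dist(adj, src):
    """Unweighted shortest-path distances from src (stored per vertex)."""
    d = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in d:
                d[v] = d[u] + 1
                q.append(v)
    return d

def astar_landmarks(adj, s, t, landmark_dists):
    """A* with the admissible landmark bound; returns (distance, expansions)."""
    h = lambda v: max(abs(d[v] - d[t]) for d in landmark_dists)
    frontier, best = [(h(s), 0, s)], {s: 0}
    expanded = 0
    while frontier:
        f, g, u = heapq.heappop(frontier)
        if u == t:
            return g, expanded
        expanded += 1
        for v in adj[u]:
            if g + 1 < best.get(v, float("inf")):
                best[v] = g + 1
                heapq.heappush(frontier, (g + 1 + h(v), g + 1, v))
    return None, expanded

# Toy ring-with-chords graph; landmarks at two arbitrary vertices.
n = 100
adj = {i: {(i - 1) % n, (i + 1) % n, (i + 7) % n, (i - 7) % n} for i in range(n)}
landmark_dists = [bfs_dist(adj, L) for L in (0, 50)]
print(astar_landmarks(adj, 3, 60, landmark_dists))
```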

  11. Remotely sensed image processing service composition based on heuristic search

    NASA Astrophysics Data System (ADS)

    Yang, Xiaoxia; Zhu, Qing; Li, Hai-feng; Zhao, Wen-hao

    2008-12-01

    As remote sensing technology becomes ever more powerful, with multiple platforms and multiple sensors, it has been widely recognized as contributing to geospatial information efforts. Because remotely sensed image processing demands large-scale, collaborative processing and massive storage capabilities to satisfy the increasing demands of various applications, the effectiveness and efficiency of remotely sensed image processing often fall far short of users' expectations. The emergence of Service Oriented Architecture (SOA) may make this challenge manageable: it encapsulates each processing function as a service and recombines services into service chains, and service composition on demand has become a hot topic. Aiming at the success rate, quality and efficiency of processing service composition for remote sensing applications, a remotely sensed image processing service composition method is proposed in this paper. It composes services for a user requirement in two steps: 1) a complete service dependency graph for the user requirement is dynamically constructed on-line; 2) an AO*-based heuristic search finds an optimal valid path in the service dependency graph. The services within the service dependency graph are those considered relevant to the specific request, instead of all registered services. In the second step, heuristic search is a promising approach to automated planning: starting with the initial state, AO* uses a heuristic function to select states until the user requirement is reached. Experimental results show that this method performs well even when the repository contains a large number of processing services.

  12. Automated discovery of local search heuristics for satisfiability testing.

    PubMed

    Fukunaga, Alex S

    2008-01-01

    The development of successful metaheuristic algorithms such as local search for a difficult problem such as satisfiability testing (SAT) is a challenging task. We investigate an evolutionary approach to automating the discovery of new local search heuristics for SAT. We show that several well-known SAT local search algorithms such as Walksat and Novelty are composite heuristics that are derived from novel combinations of a set of building blocks. Based on this observation, we developed CLASS, a genetic programming system that uses a simple composition operator to automatically discover SAT local search heuristics. New heuristics discovered by CLASS are shown to be competitive with the best Walksat variants, including Novelty+. Evolutionary algorithms have previously been applied to directly evolve a solution for a particular SAT instance. We show that the heuristics discovered by CLASS are also competitive with these previous, direct evolutionary approaches for SAT. We also analyze the local search behavior of the learned heuristics using the depth, mobility, and coverage metrics proposed by Schuurmans and Southey.
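
    The "composition of building blocks" idea can be illustrated with a WalkSAT-style local search whose variable picker chains three primitive blocks: a zero-break "freebie" move, a noisy random walk, and a greedy minimum-break choice. This is a generic sketch of the ingredients named above, not the CLASS system.

```python
import random

def break_count(assign, clauses, var):
    """Number of currently satisfied clauses that flipping `var` would break."""
    count = 0
    for clause in clauses:
        sat = [lit for lit in clause if assign[abs(lit)] == (lit > 0)]
        if len(sat) == 1 and abs(sat[0]) == var:
            count += 1
    return count

def pick_variable(assign, clauses, unsat_clause, p=0.3):
    """Composite heuristic: freebie move if possible, else noise, else greedy."""
    variables = [abs(lit) for lit in unsat_clause]
    scores = {v: break_count(assign, clauses, v) for v in variables}
    zero = [v for v in variables if scores[v] == 0]
    if zero:
        return random.choice(zero)        # block 1: zero-break "freebie"
    if random.random() < p:
        return random.choice(variables)   # block 2: random walk
    return min(variables, key=scores.get) # block 3: min-break greedy

def local_search_sat(clauses, n_vars, max_flips=100000):
    assign = {v: random.choice([True, False]) for v in range(1, n_vars + 1)}
    for _ in range(max_flips):
        unsat = [c for c in clauses
                 if not any(assign[abs(l)] == (l > 0) for l in c)]
        if not unsat:
            return assign                 # satisfying assignment found
        v = pick_variable(assign, clauses, random.choice(unsat))
        assign[v] = not assign[v]
    return None

# (x1 or x2) and (not x1 or x3) and (not x2 or not x3)
print(local_search_sat([[1, 2], [-1, 3], [-2, -3]], n_vars=3))
```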

  13. An Improved Heuristic Method for Subgraph Isomorphism Problem

    NASA Astrophysics Data System (ADS)

    Xiang, Yingzhuo; Han, Jiesi; Xu, Haijiang; Guo, Xin

    2017-09-01

    This paper focuses on the subgraph isomorphism (SI) problem. We present an improved genetic algorithm, a heuristic method for searching for the optimal solution. The contribution of this paper is a dedicated crossover algorithm and a new fitness function to measure the evolution process. Experiments show that our improved genetic algorithm performs better than other heuristic methods. For a large graph, such as a subgraph of 40 nodes, our algorithm outperforms traditional tree search algorithms. We find that the performance of our improved genetic algorithm does not decrease as the number of nodes in the prototype graphs increases.
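
    A minimal sketch of the genetic-algorithm formulation: a candidate is an injective mapping from pattern vertices to target vertices, and fitness counts pattern edges preserved under the mapping, so a perfect score certifies an embedding. For brevity this sketch uses mutation only; the paper's dedicated crossover operator is not reproduced here.

```python
import random

def fitness(mapping, pattern_edges, target_edges):
    """Number of pattern edges preserved by the candidate mapping."""
    return sum((mapping[u], mapping[v]) in target_edges or
               (mapping[v], mapping[u]) in target_edges
               for u, v in pattern_edges)

def mutate(mapping, target_nodes):
    """Reassign one pattern vertex to an unused target vertex."""
    child = dict(mapping)
    u = random.choice(list(child))
    unused = [t for t in target_nodes if t not in child.values()]
    if unused:
        child[u] = random.choice(unused)
    return child

def evolve(pattern_nodes, pattern_edges, target_nodes, target_edges,
           pop_size=50, generations=500):
    pop = [dict(zip(pattern_nodes,
                    random.sample(target_nodes, len(pattern_nodes))))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda m: -fitness(m, pattern_edges, target_edges))
        if fitness(pop[0], pattern_edges, target_edges) == len(pattern_edges):
            return pop[0]                 # embedding found
        survivors = pop[:pop_size // 2]
        pop = survivors + [mutate(random.choice(survivors), target_nodes)
                           for _ in range(pop_size - len(survivors))]
    return pop[0]

# Pattern: a triangle; target: a square with one diagonal (contains triangles).
target_edges = {(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)}
print(evolve([0, 1, 2], [(0, 1), (1, 2), (2, 0)], [0, 1, 2, 3], target_edges))
```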

  14. Best-First Heuristic Search for Multicore Machines

    DTIC Science & Technology

    2010-01-01

    w factor of the optimal solution cost) (Davis, Bramanti-Gregor, & Wang, 1988). It is possible to modify AHDA*, BFPSDD, and PBNF to use weights to... Bramanti-Gregor, A., & Wang, J. (1988). The advantages of using depth and breadth components in heuristic search. In Methodologies for Intelligent Systems 3

  15. Heuristics for Relevancy Ranking of Earth Dataset Search Results

    NASA Astrophysics Data System (ADS)

    Lynnes, C.; Quinn, P.; Norton, J.

    2016-12-01

    As the Variety of Earth science datasets increases, science researchers find it more challenging to discover and select the datasets that best fit their needs. The most common way for search providers to address this problem is to rank the datasets returned for a query by their likely relevance to the user. Large web page search engines typically use text matching supplemented with reverse link counts, semantic annotations and user intent modeling. However, this produces uneven results when applied to dataset metadata records simply externalized as a web page. Fortunately, data and search providers have decades of experience in serving data user communities, allowing them to form heuristics that leverage the structure in the metadata together with knowledge about the user community. Some of these heuristics include specific ways of matching the user input to the essential measurements in the dataset and determining overlaps of time range and spatial areas. Heuristics based on the novelty of the datasets can prioritize later, better versions of data over similar predecessors. And knowledge of how different user types and communities use data can be brought to bear in cases where characteristics of the user (discipline, expertise) or their intent (applications, research) can be divined. The Earth Observing System Data and Information System has begun implementing some of these heuristics in the relevancy algorithm of its Common Metadata Repository search engine.

  16. Heuristics for Relevancy Ranking of Earth Dataset Search Results

    NASA Technical Reports Server (NTRS)

    Lynnes, Christopher; Quinn, Patrick; Norton, James

    2016-01-01

    As the Variety of Earth science datasets increases, science researchers find it more challenging to discover and select the datasets that best fit their needs. The most common way for search providers to address this problem is to rank the datasets returned for a query by their likely relevance to the user. Large web page search engines typically use text matching supplemented with reverse link counts, semantic annotations and user intent modeling. However, this produces uneven results when applied to dataset metadata records simply externalized as a web page. Fortunately, data and search providers have decades of experience in serving data user communities, allowing them to form heuristics that leverage the structure in the metadata together with knowledge about the user community. Some of these heuristics include specific ways of matching the user input to the essential measurements in the dataset and determining overlaps of time range and spatial areas. Heuristics based on the novelty of the datasets can prioritize later, better versions of data over similar predecessors. And knowledge of how different user types and communities use data can be brought to bear in cases where characteristics of the user (discipline, expertise) or their intent (applications, research) can be divined. The Earth Observing System Data and Information System has begun implementing some of these heuristics in the relevancy algorithm of its Common Metadata Repository search engine.
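
    The heuristics named in this pair of records can be made concrete with a toy scoring function. The weights, field names and the one-dimensional spatial overlap below are hypothetical illustrations, not the actual Common Metadata Repository relevancy algorithm.

```python
def overlap_fraction(lo1, hi1, lo2, hi2):
    """Fraction of the query interval [lo1, hi1] covered by [lo2, hi2]."""
    inter = max(0.0, min(hi1, hi2) - max(lo1, lo2))
    span = hi1 - lo1
    return inter / span if span > 0 else 0.0

def relevancy(dataset, query):
    score = 0.0
    # 1) match query keywords against the dataset's essential measurements
    kw = set(query["keywords"])
    score += 3.0 * len(kw & set(dataset["measurements"])) / max(1, len(kw))
    # 2) temporal overlap between query range and dataset coverage (years)
    score += 2.0 * overlap_fraction(*query["time"], *dataset["time"])
    # 3) spatial overlap, reduced here to a 1-D latitude band for brevity
    score += 1.0 * overlap_fraction(*query["lat"], *dataset["lat"])
    # 4) novelty: prefer later versions of otherwise similar datasets
    score += 0.5 * dataset["version"]
    return score

datasets = [
    {"measurements": ["sea surface temperature"], "time": (2000, 2020),
     "lat": (-90, 90), "version": 2},
    {"measurements": ["sea surface temperature"], "time": (1990, 2005),
     "lat": (-30, 30), "version": 1},
]
query = {"keywords": ["sea surface temperature"], "time": (2010, 2020),
         "lat": (0, 45)}
print(sorted(datasets, key=lambda d: -relevancy(d, query))[0]["time"])
```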

  17. Engineering applications of heuristic multilevel optimization methods

    NASA Technical Reports Server (NTRS)

    Barthelemy, Jean-Francois M.

    1989-01-01

    Some engineering applications of heuristic multilevel optimization methods are presented and the discussion focuses on the dependency matrix that indicates the relationship between problem functions and variables. Coordination of the subproblem optimizations is shown to be typically achieved through the use of exact or approximate sensitivity analysis. Areas for further development are identified.

  18. Engineering applications of heuristic multilevel optimization methods

    NASA Technical Reports Server (NTRS)

    Barthelemy, Jean-Francois M.

    1988-01-01

    Some engineering applications of heuristic multilevel optimization methods are presented and the discussion focuses on the dependency matrix that indicates the relationship between problem functions and variables. Coordination of the subproblem optimizations is shown to be typically achieved through the use of exact or approximate sensitivity analysis. Areas for further development are identified.

  1. Symbolic Heuristic Search for Factored Markov Decision Processes

    NASA Technical Reports Server (NTRS)

    Morris, Robert (Technical Monitor); Feng, Zheng-Zhu; Hansen, Eric A.

    2003-01-01

    We describe a planning algorithm that integrates two approaches to solving Markov decision processes with large state spaces. State abstraction is used to avoid evaluating states individually. Forward search from a start state, guided by an admissible heuristic, is used to avoid evaluating all states. We combine these two approaches in a novel way that exploits symbolic model-checking techniques and demonstrates their usefulness for decision-theoretic planning.

  2. Cognitive biases and heuristics in medical decision making: a critical review using a systematic search strategy.

    PubMed

    Blumenthal-Barby, J S; Krieger, Heather

    2015-05-01

    The role of cognitive biases and heuristics in medical decision making is of growing interest. The purpose of this study was to determine whether studies on cognitive biases and heuristics in medical decision making are based on actual or hypothetical decisions and are conducted with populations that are representative of those who typically make the medical decision; to categorize the types of cognitive biases and heuristics found and whether they are found in patients or in medical personnel; and to critically review the studies based on standard methodological quality criteria. Data sources were original, peer-reviewed, empirical studies on cognitive biases and heuristics in medical decision making found in Ovid Medline, PsycINFO, and the CINAHL databases published in 1980-2013. Predefined exclusion criteria were used to identify 213 studies. During data extraction, information was collected on type of bias or heuristic studied, respondent population, decision type, study type (actual or hypothetical), study method, and study conclusion. Of the 213 studies analyzed, 164 (77%) were based on hypothetical vignettes, and 175 (82%) were conducted with representative populations. Nineteen types of cognitive biases and heuristics were found. Only 34% of studies (n = 73) investigated medical personnel, and 68% (n = 145) confirmed the presence of a bias or heuristic. Each methodological quality criterion was satisfied by more than 50% of the studies, except for sample size and validated instruments/questions. Limitations are that existing terms were used to inform search terms, and study inclusion criteria focused strictly on decision making. Most of the studies on biases and heuristics in medical decision making are based on hypothetical vignettes, raising concerns about applicability of these findings to actual decision making. Biases and heuristics have been underinvestigated in medical personnel compared with patients. © The Author(s) 2014.

  3. Novel heuristic search for ventricular arrhythmia detection using normalized cut clustering.

    PubMed

    Castro-Ospina, A E; Castro-Hoyos, C; Peluffo-Ordoñez, D; Castellanos-Dominguez, G

    2013-01-01

    Processing of long-term ECG Holter recordings for accurate arrhythmia detection is a problem that has been addressed in several approaches. However, there is no outright method for heartbeat classification able to handle problems such as the large amount of data and highly unbalanced classes. This work introduces a heuristic-search-based clustering to discriminate among ventricular cardiac arrhythmias in Holter recordings. The proposed method is posed under the normalized cut criterion, which iteratively seeks the nodes to be grouped into the same cluster. The searching procedure is carried out in accordance with the introduced maximum similarity value. Since our approach is unsupervised, a procedure for setting the initial algorithm parameters is proposed, fixing the initial nodes using a kernel density estimator. Results are obtained from the MIT/BIH arrhythmia database, which provides heartbeat labelling. The proposed heuristic-search-based clustering shows adequate performance, even in the presence of strongly unbalanced classes.

  4. Minimizing conflicts: A heuristic repair method for constraint-satisfaction and scheduling problems

    NASA Technical Reports Server (NTRS)

    Minton, Steve; Johnston, Mark; Philips, Andrew; Laird, Phil

    1992-01-01

    This paper describes a simple heuristic approach to solving large-scale constraint satisfaction and scheduling problems. In this approach one starts with an inconsistent assignment for a set of variables and searches through the space of possible repairs. The search can be guided by a value-ordering heuristic, the min-conflicts heuristic, that attempts to minimize the number of constraint violations after each step. The heuristic can be used with a variety of different search strategies. We demonstrate empirically that on the n-queens problem, a technique based on this approach performs orders of magnitude better than traditional backtracking techniques. We also describe a scheduling application where the approach has been used successfully. A theoretical analysis is presented both to explain why this method works well on certain types of problems and to predict when it is likely to be most effective.
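
    The repair loop described above is easy to state precisely; here is a minimal min-conflicts sketch for n-queens (a standard rendering of the idea, with illustrative step limits):

```python
import random

def conflicts(cols, row, col):
    """Conflicts for a queen placed at (row, col) against all other rows."""
    return sum(c == col or abs(c - col) == abs(r - row)
               for r, c in enumerate(cols) if r != row)

def min_conflicts(n, max_steps=100000):
    cols = [random.randrange(n) for _ in range(n)]   # one queen per row
    for _ in range(max_steps):
        conflicted = [r for r in range(n) if conflicts(cols, r, cols[r]) > 0]
        if not conflicted:
            return cols                              # consistent solution
        row = random.choice(conflicted)
        # min-conflicts value ordering: pick the least-conflicting column
        cols[row] = min(range(n), key=lambda c: conflicts(cols, row, c))
    return None

print(min_conflicts(50))
```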

  5. A Library of Local Search Heuristics for the Vehicle Routing Problem

    SciTech Connect

    Groer, Christopher S; Golden, Bruce; Edward, Wasil

    2010-01-01

    The vehicle routing problem (VRP) is a difficult and well-studied combinatorial optimization problem. Real-world instances of the VRP can contain hundreds and even thousands of customer locations and can involve many complicating constraints, necessitating the use of heuristic methods. We present a software library of local search heuristics that allow one to quickly generate good solutions to VRP instances. The code has a logical, object-oriented design and uses efficient data structures to store and modify solutions. The core of the library is the implementation of seven local search operators that share a similar interface and are designed to be extended to handle additional options with minimal code change. The code is well-documented, is straightforward to compile, and is freely available for download at http://sites.google.com/site/vrphlibrary/ . The distribution of the code contains several applications that can be used to generate solutions to instances of the capacitated VRP.

  6. A method for extracting drainage networks with heuristic information from digital elevation models.

    PubMed

    Hou, Kun; Yang, Wei; Sun, Jigui; Sun, Tieli

    2011-01-01

    Depression filling and direction assignment over flat areas are critical issues in hydrologic analysis. This paper proposes a method to handle depressions and flat areas in one procedure. Differing from traditional raster neighbourhood processing, which uses little heuristic information, the method is designed to compensate for the inadequate searching information of other methods. The proposed method routes flow through depressions and flat areas by searching for the outlet using heuristic information, which reveals the general trend slope of the DEM (digital elevation model) and helps the method find the outlet accurately. The method is implemented in Pascal and experiments are carried out on actual DEM data. Comparison with four existing methods shows that the proposed method produces a closer match to the ground-truth network. Moreover, the proposed method avoids generating unrealistic parallel drainage lines, false drainage lines and spurious terrain features.

  7. Local search heuristic for the discrete leader-follower problem with multiple follower objectives

    NASA Astrophysics Data System (ADS)

    Kochetov, Yury; Alekseeva, Ekaterina; Mezmaz, Mohand

    2016-10-01

    We study a discrete bilevel problem, also known as a leader-follower problem, with multiple objectives at the lower level. It is assumed that constraints at the upper level can include variables of both levels. For such an ill-posed problem we define feasible and optimal solutions for the pessimistic case. The central point of this work is a two-stage method for obtaining a feasible solution in the pessimistic case, given a leader decision. The target of the first stage is a follower solution that violates the leader constraints. The target of the second stage is a pessimistically feasible solution. Each stage calls a heuristic and a solver for a series of particular mixed integer programs. The method is integrated into a local search based heuristic that is designed to find near-optimal leader solutions.

  8. A learning heuristic for space mapping and searching self-organizing systems using adaptive mesh refinement

    NASA Astrophysics Data System (ADS)

    Phillips, Carolyn L.

    2014-09-01

    In a complex self-organizing system, small changes in the interactions between the system's components can result in different emergent macrostructures or macrobehavior. In chemical engineering and material science, such spontaneously self-assembling systems, using polymers, nanoscale or colloidal-scale particles, DNA, or other precursors, are an attractive way to create materials that are precisely engineered at a fine scale. Changes to the interactions can often be described by a set of parameters. Different contiguous regions in this parameter space correspond to different ordered states. Since these ordered states are emergent, often experiment, not analysis, is necessary to create a diagram of ordered states over the parameter space. By issuing queries to points in the parameter space (e.g., performing a computational or physical experiment), ordered states can be discovered and mapped. Queries can be costly in terms of resources or time, however. In general, one would like to learn the most information using the fewest queries. Here we introduce a learning heuristic for issuing queries to map and search a two-dimensional parameter space. Using a method inspired by adaptive mesh refinement, the heuristic iteratively issues batches of queries to be executed in parallel based on past information. By adjusting the search criteria, different types of searches (for example, a uniform search, exploring boundaries, sampling all regions equally) can be flexibly implemented. We show that this method will densely search the space, while preferentially targeting certain features. Using numerical examples, including a study simulating the self-assembly of complex crystals, we show how this heuristic can discover new regions and map boundaries more accurately than a uniformly distributed set of queries.

  9. Comparative study of heuristic evaluation and usability testing methods.

    PubMed

    Thyvalikakath, Thankam Paul; Monaco, Valerie; Thambuganipalle, Himabindu; Schleyer, Titus

    2009-01-01

    Usability methods, such as heuristic evaluation, cognitive walk-throughs and user testing, are increasingly used to evaluate and improve the design of clinical software applications. There is still some uncertainty, however, as to how those methods can be used to support the development process and evaluation in the most meaningful manner. In this study, we compared the results of a heuristic evaluation with those of formal user tests in order to determine which usability problems were detected by both methods. We conducted heuristic evaluation and usability testing on four major commercial dental computer-based patient records (CPRs), which together cover 80% of the market for chairside computer systems among general dentists. Both methods yielded strong evidence that the dental CPRs have significant usability problems. An average of 50% of empirically-determined usability problems were identified by the preceding heuristic evaluation. Some statements of heuristic violations were specific enough to precisely identify the actual usability problem that study participants encountered. Other violations were less specific, but still manifested themselves in usability problems and poor task outcomes. In this study, heuristic evaluation identified a significant portion of problems found during usability testing. While we make no assumptions about the generalizability of the results to other domains and software systems, heuristic evaluation may, under certain circumstances, be a useful tool to determine design problems early in the development cycle.

  10. Protein sequence-similarity search acceleration using a heuristic algorithm with a sensitive matrix.

    PubMed

    Lim, Kyungtaek; Yamada, Kazunori D; Frith, Martin C; Tomii, Kentaro

    2016-12-01

    Searching public protein databases is a fundamental step in the target selection of proteins in structural and functional genomics, and also in inferring protein structure, function, and evolution. Most database search methods employ amino acid substitution matrices to score amino acid pairs. The choice of substitution matrix strongly affects homology detection performance. We earlier proposed a substitution matrix named MIQS that was optimized for distant protein homology search. Herein we further evaluate MIQS in combination with LAST, a heuristic and fast database search tool with a tunable sensitivity parameter m, where larger m denotes higher sensitivity. Results show that MIQS substantially improves the homology detection and alignment quality performance of LAST across diverse m parameters. Against a protein database consisting of approximately 15 million sequences, LAST with m = 10^5 achieves better homology detection performance than BLASTP, and completes the search 20 times faster. Compared to the most sensitive existing methods being used today, CS-BLAST and SSEARCH, LAST with MIQS and m = 10^6 shows comparable homology detection performance at 2.0 and 3.9 times greater speed, respectively. These results demonstrate that MIQS-powered LAST is a time-efficient method for sensitive and accurate homology search.

  11. An interdisciplinary heuristic evaluation method for universal building design.

    PubMed

    Afacan, Yasemin; Erbug, Cigdem

    2009-07-01

    This study highlights how heuristic evaluation as a usability evaluation method can feed into current building design practice to conform to universal design principles. It provides a definition of universal usability that is applicable to an architectural design context. It takes the seven universal design principles as a set of heuristics and applies an iterative sequence of heuristic evaluation in a shopping mall, aiming to achieve a cost-effective evaluation process. The evaluation was composed of three consecutive sessions. First, five evaluators from different professions were interviewed regarding the construction drawings in terms of universal design principles. Then, each evaluator was asked to perform the predefined task scenarios. In subsequent interviews, the evaluators were asked to re-analyze the construction drawings. The results showed that heuristic evaluation could successfully integrate universal usability into current building design practice in two ways: (i) it promoted an iterative evaluation process combined with multi-sessions rather than relying on one evaluator and on one evaluation session to find the maximum number of usability problems, and (ii) it highlighted the necessity of an interdisciplinary ad hoc committee regarding the heuristic abilities of each profession. A multi-session and interdisciplinary heuristic evaluation method can save both the project budget and the required time, while ensuring a reduced error rate for the universal usage of the built environments.

  12. Heuristic-based tabu search algorithm for folding two-dimensional AB off-lattice model proteins.

    PubMed

    Liu, Jingfa; Sun, Yuanyuan; Li, Gang; Song, Beibei; Huang, Weibo

    2013-12-01

    The protein structure prediction problem is a classical NP-hard problem in bioinformatics. The lack of an effective global optimization method is the key obstacle in solving this problem. As one of the global optimization algorithms, the tabu search (TS) algorithm has been successfully applied to many optimization problems. We define a new neighborhood conformation, tabu object and acceptance criteria for the current conformation based on the original TS algorithm and put forward an improved TS algorithm. By integrating a heuristic initialization mechanism, a heuristic conformation updating mechanism, and the gradient method into the improved TS algorithm, a heuristic-based tabu search (HTS) algorithm is presented for predicting the two-dimensional (2D) protein folding structure in the AB off-lattice model, which consists of hydrophobic (A) and hydrophilic (B) monomers. The tabu search minimization leads to the basins of local minima, near which a local search mechanism is then proposed to further search for lower-energy conformations. To test the performance of the proposed algorithm, experiments are performed on four Fibonacci sequences and two real protein sequences. The experimental results show that the proposed algorithm has found the lowest-energy conformations so far for three shorter Fibonacci sequences and renewed the results for the longest one, as well as for two real protein sequences, demonstrating that the HTS algorithm is quite promising in finding the ground states for AB off-lattice model proteins.
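
    For context, the energy minimized in the 2D AB off-lattice model is commonly stated as E = 1/4 * sum(1 - cos theta_i) + 4 * sum(r_ij^-12 - C_ij * r_ij^-6) over non-adjacent residue pairs, with C = 1 for AA, 1/2 for BB and -1/2 for mixed pairs. The sketch below evaluates that energy for a bend-angle encoding; it illustrates the objective as stated in the literature, not the paper's HTS code.

```python
import math

def coords_from_angles(angles):
    """Chain of unit bonds in the plane; angles are bends at interior residues."""
    pts, heading = [(0.0, 0.0), (1.0, 0.0)], 0.0
    for theta in angles:
        heading += theta
        x, y = pts[-1]
        pts.append((x + math.cos(heading), y + math.sin(heading)))
    return pts

def ab_energy(angles, seq):
    """Energy of a 2D AB conformation; len(seq) must equal len(angles) + 2."""
    pts = coords_from_angles(angles)
    bend = 0.25 * sum(1.0 - math.cos(t) for t in angles)
    nonbond = 0.0
    for i in range(len(seq)):
        for j in range(i + 2, len(seq)):          # non-adjacent residue pairs
            r2 = (pts[i][0] - pts[j][0]) ** 2 + (pts[i][1] - pts[j][1]) ** 2
            c = 1.0 if seq[i] + seq[j] == "AA" else \
                0.5 if seq[i] + seq[j] == "BB" else -0.5
            nonbond += 4.0 * (r2 ** -6 - c * r2 ** -3)  # = r^-12 - C * r^-6
    return bend + nonbond

# A tabu/local search move would perturb one bend angle and re-evaluate:
print(ab_energy([0.3, -0.2, 0.5], "ABABA"))
```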

  13. Hitch-hiking: a parallel heuristic search strategy, applied to the phylogeny problem.

    PubMed

    Charleston, M A

    2001-01-01

    The article introduces a parallel heuristic search strategy ("Hitch-hiking") which can be used in conjunction with other random-walk heuristic search strategies. It is applied to an artificial phylogeny problem, in which character sequences are evolved using pseudo-random numbers from a hypothetical ancestral sequence. The objective function to be minimized is the minimum number of character-state changes required on a binary tree that could account for the sequences observed at the tips (leaves) of the tree -- the Maximum Parsimony criterion. The Hitch-hiking strategy is shown to be useful in that it is robust and that on average the solutions found using the strategy are better than those found without. Also the strategy can dynamically provide information on the characteristics of the landscape of the problem. I argue that Hitch-hiking as a scheme for parallelization of existing heuristic search strategies is of potentially very general use, in many areas of combinatorial optimization.

  14. A simple heuristic for Internet-based evidence search in primary care: a randomized controlled trial

    PubMed Central

    Eberbach, Andreas; Becker, Annette; Rochon, Justine; Finkemeler, Holger; Wagner, Achim; Donner-Banzhoff, Norbert

    2016-01-01

    Background General practitioners (GPs) are confronted with a wide variety of clinical questions, many of which remain unanswered. Methods In order to assist GPs in finding quick, evidence-based answers, we developed a learning program (LP) with a short interactive workshop based on a simple three-step-heuristic to improve their search and appraisal competence (SAC). We evaluated the LP effectiveness with a randomized controlled trial (RCT). Participants (intervention group [IG] n=20; control group [CG] n=31) rated acceptance and satisfaction and also answered 39 knowledge questions to assess their SAC. We controlled for previous knowledge in content areas covered by the test. Results Main outcome – SAC: within both groups, the pre–post test shows significant (P=0.00) improvements in correctness (IG 15% vs CG 11%) and confidence (32% vs 26%) to find evidence-based answers. However, the SAC difference was not significant in the RCT. Other measures Most workshop participants rated “learning atmosphere” (90%), “skills acquired” (90%), and “relevancy to my practice” (86%) as good or very good. The LP-recommendations were implemented by 67% of the IG, whereas 15% of the CG already conformed to LP recommendations spontaneously (odds ratio 9.6, P=0.00). After literature search, the IG showed a (not significantly) higher satisfaction regarding “time spent” (IG 80% vs CG 65%), “quality of information” (65% vs 54%), and “amount of information” (53% vs 47%). Conclusion Long-standing established GPs have a good SAC. Despite high acceptance, strong learning effects, positive search experience, and significant increase of SAC in the pre–post test, the RCT of our LP showed no significant difference in SAC between IG and CG. However, we suggest that our simple decision heuristic merits further investigation. PMID:27563264

  15. Divergence of Scientific Heuristic Method and Direct Algebraic Instruction

    ERIC Educational Resources Information Center

    Calucag, Lina S.

    2016-01-01

    This is an experimental study that made use of non-randomized experimental and control groups in a pretest-posttest design. The experimental and control groups were two separate intact classes in Algebra. For a period of twelve sessions, the experimental group was subjected to the scientific heuristic method, but the control group instead was given…

  16. Heuristics in Problem Solving: The Role of Direction in Controlling Search Space

    ERIC Educational Resources Information Center

    Chu, Yun; Li, Zheng; Su, Yong; Pizlo, Zygmunt

    2010-01-01

    Isomorphs of a puzzle called m+m resulted in faster solution times and an easily reproduced solution path in a labeled version of the problem compared to a more difficult binary version. We conjecture that performance is related to a type of heuristic called direction that not only constrains search space in the labeled version, but also…

  17. A lifelong learning hyper-heuristic method for bin packing.

    PubMed

    Sim, Kevin; Hart, Emma; Paechter, Ben

    2015-01-01

    We describe a novel hyper-heuristic system that continuously learns over time to solve a combinatorial optimisation problem. The system continuously generates new heuristics and samples problems from its environment; and representative problems and heuristics are incorporated into a self-sustaining network of interacting entities inspired by methods in artificial immune systems. The network is plastic in both its structure and content, leading to the following properties: it exploits existing knowledge captured in the network to rapidly produce solutions; it can adapt to new problems with widely differing characteristics; and it is capable of generalising over the problem space. The system is tested on a large corpus of 3,968 new instances of 1D bin-packing problems as well as on 1,370 existing problems from the literature; it shows excellent performance in terms of the quality of solutions obtained across the datasets and in adapting to dynamically changing sets of problem instances compared to previous approaches. As the network self-adapts to sustain a minimal repertoire of both problems and heuristics that form a representative map of the problem space, the system is further shown to be computationally efficient and therefore scalable.
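
    As a generic illustration of what a repertoire of low-level heuristics looks like for 1D bin packing (not the immune-network system itself), the sketch below keeps several classic packing heuristics and, per instance, applies whichever yields the fewest bins:

```python
def first_fit(items, cap):
    bins = []
    for it in items:
        for b in bins:
            if sum(b) + it <= cap:
                b.append(it)
                break
        else:
            bins.append([it])
    return bins

def first_fit_decreasing(items, cap):
    return first_fit(sorted(items, reverse=True), cap)

def best_fit(items, cap):
    bins = []
    for it in items:
        fitting = [b for b in bins if sum(b) + it <= cap]
        if fitting:
            # place into the bin that leaves the least remaining space
            min(fitting, key=lambda b: cap - sum(b) - it).append(it)
        else:
            bins.append([it])
    return bins

REPERTOIRE = {"FF": first_fit, "FFD": first_fit_decreasing, "BF": best_fit}

def hyper_heuristic(items, cap):
    """Apply every low-level heuristic and keep the best packing found."""
    results = {name: h(items, cap) for name, h in REPERTOIRE.items()}
    name, bins = min(results.items(), key=lambda kv: len(kv[1]))
    return name, bins

name, bins = hyper_heuristic([4, 8, 1, 4, 2, 1, 7, 3, 6, 2], cap=10)
print(name, len(bins), bins)
```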

  18. A New Improved Hybrid Meta-Heuristics Method for Unit Commitment with Nonlinear Fuel Cost Function

    NASA Astrophysics Data System (ADS)

    Okawa, Kenta; Mori, Hiroyuki

    In this paper, a new improved hybrid meta-heuristic method is proposed to solve the unit commitment problem effectively. The objective is to minimize operation cost while satisfying the power balance constraints and so on. The problem may be formulated as a nonlinear mixed-integer problem; in other words, the unit commitment problem is hard to solve. Therefore, this paper makes use of a hybrid meta-heuristic method with two layers: Layer 1 determines the on/off conditions of generators with tabu search (TS), while Layer 2 evaluates the output of generators with evolutionary particle swarm optimization (EPSO). The construction phase of the Greedy Randomized Adaptive Search Procedure (GRASP) is used to create initial feasible solutions efficiently. Three kinds of meta-heuristic methods, TS, EPSO and GRASP, are thus combined to solve the problem. In addition, a parallel scheme of EPSO is developed to improve computational efficiency as well as accuracy. The effectiveness of the proposed method is tested on sample systems.

  19. An extended abstract: A heuristic repair method for constraint-satisfaction and scheduling problems

    NASA Technical Reports Server (NTRS)

    Minton, Steven; Johnston, Mark D.; Philips, Andrew B.; Laird, Philip

    1992-01-01

    The work described in this paper was inspired by a surprisingly effective neural network developed for scheduling astronomical observations on the Hubble Space Telescope. Our heuristic constraint satisfaction problem (CSP) method was distilled from an analysis of the network. In the process of carrying out the analysis, we discovered that the effectiveness of the network has little to do with its connectionist implementation. Furthermore, the ideas employed in the network can be implemented very efficiently within a symbolic CSP framework. The symbolic implementation is extremely simple. It also has the advantage that several different search strategies can be employed, although we have found that hill-climbing methods are particularly well-suited for the applications that we have investigated. We begin the paper with a brief review of the neural network. Following this, we describe our symbolic method for heuristic repair.

  1. Amoeba-Inspired Heuristic Search Dynamics for Exploring Chemical Reaction Paths

    NASA Astrophysics Data System (ADS)

    Aono, Masashi; Wakabayashi, Masamitsu

    2015-09-01

    We propose a nature-inspired model for simulating chemical reactions in a computationally resource-saving manner. The model was developed by extending our previously proposed heuristic search algorithm, called "AmoebaSAT [Aono et al. 2013]," which was inspired by the spatiotemporal dynamics of a single-celled amoeboid organism that exhibits sophisticated computing capabilities in adapting to its environment efficiently [Zhu et al. 2013]. AmoebaSAT is used for solving an NP-complete combinatorial optimization problem [Garey and Johnson 1979], "the satisfiability problem," and finds a constraint-satisfying solution at a speed that is dramatically faster than one of the conventionally known fastest stochastic local search methods [Iwama and Tamaki 2004] for a class of randomly generated problem instances [http://www.cs.ubc.ca/~hoos/5/benchm.html]. In cases where the problem has more than one solution, AmoebaSAT exhibits dynamic transition behavior among a variety of the solutions. Inheriting these features of AmoebaSAT, we formulate "AmoebaChem," which explores a variety of metastable molecules in which several constraints determined by input atoms are satisfied and generates dynamic transition processes among the metastable molecules. AmoebaChem and its developed forms will be applied to the study of the origins of life, to discover reaction paths for which expected or unexpected organic compounds may be formed via unknown unstable intermediates and to estimate the likelihood of each of the discovered paths.

  2. Amoeba-Inspired Heuristic Search Dynamics for Exploring Chemical Reaction Paths.

    PubMed

    Aono, Masashi; Wakabayashi, Masamitsu

    2015-09-01

    We propose a nature-inspired model for simulating chemical reactions in a computationally resource-saving manner. The model was developed by extending our previously proposed heuristic search algorithm, called "AmoebaSAT [Aono et al. 2013]," which was inspired by the spatiotemporal dynamics of a single-celled amoeboid organism that exhibits sophisticated computing capabilities in adapting to its environment efficiently [Zhu et al. 2013]. AmoebaSAT is used for solving an NP-complete combinatorial optimization problem [Garey and Johnson 1979], "the satisfiability problem," and finds a constraint-satisfying solution at a speed that is dramatically faster than one of the conventionally known fastest stochastic local search methods [Iwama and Tamaki 2004] for a class of randomly generated problem instances [ http://www.cs.ubc.ca/~hoos/5/benchm.html ]. In cases where the problem has more than one solution, AmoebaSAT exhibits dynamic transition behavior among a variety of the solutions. Inheriting these features of AmoebaSAT, we formulate "AmoebaChem," which explores a variety of metastable molecules in which several constraints determined by input atoms are satisfied and generates dynamic transition processes among the metastable molecules. AmoebaChem and its developed forms will be applied to the study of the origins of life, to discover reaction paths for which expected or unexpected organic compounds may be formed via unknown unstable intermediates and to estimate the likelihood of each of the discovered paths.

  3. Fitness landscapes, heuristics and technological paradigms: A critique on random search models in evolutionary economics

    NASA Astrophysics Data System (ADS)

    Frenken, Koen

    2001-06-01

    The biological evolution of complex organisms, in which the functioning of genes is interdependent, has been analyzed as "hill-climbing" on NK fitness landscapes through random mutation and natural selection. In evolutionary economics, NK fitness landscapes have been used to simulate the evolution of complex technological systems containing elements that are interdependent in their functioning. In these models, economic agents randomly search for new technological designs by trial-and-error and run the risk of ending up in sub-optimal solutions due to interdependencies between the elements in a complex system. These models of random search are legitimate for reasons of modeling simplicity, but remain limited as they ignore the fact that agents can apply heuristics. A specific heuristic is one that sequentially optimises functions according to their ranking by users of the system. To model this heuristic, a generalized NK-model is developed. In this model, core elements that influence many functions can be distinguished from peripheral elements that affect few functions. The concept of paradigmatic search can then be analytically defined as search that leaves core elements intact while concentrating on improving functions by mutation of peripheral elements.
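
    The NK construction referred to above is standard and easy to state in code: each locus contributes a random fitness value that depends on its own state and the states of K other loci, and global fitness is the mean contribution. The sketch below pairs a textbook NK landscape with a random-mutation hill climber; parameters are illustrative.

```python
import random

def make_nk(N, K, seed=0):
    rng = random.Random(seed)
    # neighbors[i]: indices of the K loci that interact with locus i
    neighbors = [rng.sample([j for j in range(N) if j != i], K)
                 for i in range(N)]
    tables = [{} for _ in range(N)]   # lazily filled contribution tables

    def fitness(genome):
        total = 0.0
        for i in range(N):
            key = (genome[i],) + tuple(genome[j] for j in neighbors[i])
            if key not in tables[i]:
                tables[i][key] = rng.random()
            total += tables[i][key]
        return total / N
    return fitness

def hill_climb(fitness, N, steps=2000, seed=1):
    rng = random.Random(seed)
    g = [rng.randint(0, 1) for _ in range(N)]
    f = fitness(g)
    for _ in range(steps):
        i = rng.randrange(N)
        g[i] ^= 1                     # flip one locus (random mutation)
        f2 = fitness(g)
        if f2 >= f:
            f = f2                    # keep improvements
        else:
            g[i] ^= 1                 # revert deleterious mutation
    return f

for K in (0, 2, 6):                   # more interdependence -> more ruggedness
    print(K, round(hill_climb(make_nk(12, K), 12), 4))
```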

  4. Tuning Parameters in Heuristics by Using Design of Experiments Methods

    NASA Technical Reports Server (NTRS)

    Arin, Arif; Rabadi, Ghaith; Unal, Resit

    2010-01-01

    With the growing complexity of today's large-scale problems, it has become more difficult to find optimal solutions by using exact mathematical methods. The need to find near-optimal solutions in an acceptable time frame requires heuristic approaches. In many cases, however, most heuristics have several parameters that need to be "tuned" before they can reach good results. The problem then turns into finding the best parameter setting for the heuristics to solve the problems efficiently and in a timely manner. The One-Factor-At-a-Time (OFAT) approach to parameter tuning neglects the interactions between parameters. Design of Experiments (DOE) tools can instead be employed to tune the parameters more effectively. In this paper, we seek the best parameter setting for a Genetic Algorithm (GA) to solve the single machine total weighted tardiness problem, in which n jobs must be scheduled on a single machine without preemption, and the objective is to minimize the total weighted tardiness. Benchmark instances for the problem are available in the literature. To fine-tune the GA parameters in the most efficient way, we compare multiple DOE models including 2-level (2^k) full factorial design, orthogonal array design, central composite design, D-optimal design and signal-to-noise (S/N) ratios. In each DOE method, a mathematical model is created using regression analysis and solved to obtain the best parameter setting. After verification runs using the tuned parameter setting, the preliminary results show that optimal solutions for multiple instances were found efficiently.
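
    A minimal sketch of the 2-level full factorial piece of this comparison: every combination of low/high levels for k parameters is evaluated (2^k runs, replicated), and the best-scoring setting is kept. The factor names, levels and the stand-in objective below are assumptions for illustration; a real study would plug in the actual GA run and fit a regression model to the design results.

```python
import itertools
import random

FACTORS = {                        # (low, high) levels for each GA parameter
    "pop_size":       (50, 200),
    "crossover_rate": (0.6, 0.9),
    "mutation_rate":  (0.01, 0.1),
}

def run_ga(params, seed=0):
    """Stand-in for one GA run on a benchmark instance; returns a cost
    (lower is better). Replace with the actual GA evaluation."""
    rng = random.Random(str(sorted(params.items())) + str(seed))
    base = 100 - 0.1 * params["pop_size"] - 30 * params["crossover_rate"]
    return base + 100 * params["mutation_rate"] + rng.gauss(0, 1)

# 2^3 = 8 design points, one per combination of factor levels
design = [dict(zip(FACTORS, levels))
          for levels in itertools.product(*FACTORS.values())]

results = [(sum(run_ga(p, seed=s) for s in range(3)) / 3, p) for p in design]
best_score, best_params = min(results, key=lambda r: r[0])
print(f"best setting {best_params} with mean cost {best_score:.2f}")
```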

  5. Index Fund Optimization Using a Genetic Algorithm and a Heuristic Local Search

    NASA Astrophysics Data System (ADS)

    Orito, Yukiko; Inoguchi, Manabu; Yamamoto, Hisashi

    It is well known that index funds are popular passively managed portfolios that have been used extensively for hedge trading. An index fund consists of a certain number of stocks of companies listed on a stock market, chosen such that the fund's return rates follow a path similar to the changing rates of the market index. However, it is hard to make a perfect index fund consisting of all companies included in the given market index. Thus, index fund optimization can be viewed as a combinatorial optimization problem for portfolio management. In this paper, we propose an optimization method that combines a genetic algorithm and a heuristic local search algorithm to obtain a strong linear association between the fund's return rates and the changing rates of the market index. We apply the method to the Tokyo Stock Exchange and construct index funds whose return rates follow a path similar to the changing rates of the Tokyo Stock Price Index (TOPIX). The results show that our proposed method produces index funds with strong linear association to the market index in a small computing time.
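
    The optimization target described here reduces to maximizing the correlation between the fund's return series and the index's. The sketch below shows that fitness together with a simple swap-based local search over stock subsets; the synthetic data, equal weighting and swap move are illustrative assumptions rather than the paper's GA-plus-local-search procedure.

```python
import math
import random

def pearson(xs, ys):
    """Pearson correlation between two equal-length return series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy) if sx and sy else 0.0

def fund_returns(subset, returns):
    """Equally weighted return of the chosen stocks per period."""
    periods = len(next(iter(returns.values())))
    return [sum(returns[s][t] for s in subset) / len(subset)
            for t in range(periods)]

def local_search(stocks, returns, index, k, iters=500, seed=0):
    rng = random.Random(seed)
    subset = set(rng.sample(stocks, k))
    best = pearson(fund_returns(subset, returns), index)
    for _ in range(iters):
        out = rng.choice(sorted(subset))
        cand = rng.choice([s for s in stocks if s not in subset])
        trial = (subset - {out}) | {cand}          # swap one stock
        score = pearson(fund_returns(trial, returns), index)
        if score > best:
            subset, best = trial, score
    return subset, best

# Synthetic data: 8 stocks over 60 periods; index = mean of all stocks.
rng = random.Random(42)
stocks = [f"S{i}" for i in range(8)]
returns = {s: [rng.gauss(0, 0.02) for _ in range(60)] for s in stocks}
index = [sum(returns[s][t] for s in stocks) / len(stocks) for t in range(60)]
print(local_search(stocks, returns, index, k=3))
```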

  6. A Hybrid Tabu Search Heuristic for a Bilevel Competitive Facility Location Model

    NASA Astrophysics Data System (ADS)

    Küçükaydın, Hande; Aras, Necati; Altınel, I. Kuban

    We consider a problem in which a firm or franchise enters a market by locating new facilities where there are existing facilities belonging to a competitor. The firm aims at finding the location and attractiveness of each facility to be opened so as to maximize its profit. The competitor, on the other hand, can react by adjusting the attractiveness of its existing facilities, opening new facilities and/or closing existing ones with the objective of maximizing its own profit. The demand is assumed to be aggregated at certain points in the plane and the facilities of the firm can be located at prespecified candidate sites. We employ Huff's gravity-based rule in modeling the behavior of the customers where the fraction of customers at a demand point that visit a certain facility is proportional to the facility attractiveness and inversely proportional to the distance between the facility site and demand point. We formulate a bilevel mixed-integer nonlinear programming model where the firm entering the market is the leader and the competitor is the follower. In order to find a feasible solution of this model, we develop a hybrid tabu search heuristic which makes use of two exact methods as subroutines: a gradient ascent method and a branch-and-bound algorithm with nonlinear programming relaxation.

  7. Heuristic search in robot configuration space using variable metric

    NASA Technical Reports Server (NTRS)

    Verwer, Ben J. H.

    1987-01-01

    A method to generate obstacle-free trajectories for both mobile robots and linked robots is proposed. The approach generates the shortest paths in a configuration space. The metric in the configuration space can be adjusted to obtain a tradeoff between safety and velocity by imposing extra costs on paths near obstacles.
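
    A minimal sketch of the variable-metric idea, under the assumption of a 2D grid discretization of the configuration space: a plain Dijkstra search over per-cell entry costs that have been inflated near obstacles, trading path length for safety. This is an illustration, not the paper's exact formulation.

```python
import heapq

def plan(grid_cost, start, goal):
    """Dijkstra search over a 2D configuration grid. grid_cost[r][c] is the
    cost of entering a cell; inflating costs near obstacles biases the
    shortest path away from them (the safety/velocity tradeoff)."""
    rows, cols = len(grid_cost), len(grid_cost[0])
    dist = {start: 0.0}
    parent = {}
    frontier = [(0.0, start)]
    while frontier:
        d, cell = heapq.heappop(frontier)
        if cell == goal:
            break
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + grid_cost[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    parent[(nr, nc)] = cell
                    heapq.heappush(frontier, (nd, (nr, nc)))
    if goal != start and goal not in parent:
        return None  # unreachable
    path, cell = [], goal
    while cell != start:
        path.append(cell)
        cell = parent[cell]
    return [start] + path[::-1]
```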

  8. Parameter Identification of Robot Manipulators: A Heuristic Particle Swarm Search Approach

    PubMed Central

    Yan, Danping; Lu, Yongzhong; Levy, David

    2015-01-01

    Parameter identification of robot manipulators is an indispensable pivotal process for achieving accurate dynamic robot models. Since these kinetic models are highly nonlinear, identifying their parameters is not easy. To address this difficulty effectively, we present an intelligent approach: a heuristic particle swarm optimization (PSO) algorithm that hybridizes an elitist learning strategy (ELS) and a proportional integral derivative (PID) controller, which we call ELPIDSO. A specified PID controller is designed to improve particles’ local and global position information together with ELS. Parameter identification of robot manipulators is conducted to evaluate the performance of our proposed approach. Experimental results clearly indicate that, compared with the standard PSO (SPSO) algorithm, ELPIDSO is much improved: it not only enhances the diversity of the swarm, but also offers better search effectiveness and efficiency in solving practical optimization problems. Accordingly, ELPIDSO is superior to the least squares (LS) method, the genetic algorithm (GA), and the SPSO algorithm in estimating the parameters of the kinetic models of robot manipulators. PMID:26039090
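
    For orientation, a sketch of the standard global-best PSO that ELPIDSO builds on; the ELS perturbation and the PID-style correction of the paper are deliberately omitted, and all hyperparameters and bounds here are illustrative assumptions.

```python
import numpy as np

def pso(objective, dim, n_particles=30, iters=200, seed=0):
    """Standard (global-best) PSO minimizing `objective` over R^dim."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n_particles, dim))     # positions
    v = np.zeros_like(x)                           # velocities
    pbest = x.copy()
    pbest_f = np.array([objective(p) for p in x])
    g = pbest[pbest_f.argmin()].copy()             # global best
    w, c1, c2 = 0.7, 1.5, 1.5                      # inertia, cognitive, social
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = x + v
        f = np.array([objective(p) for p in x])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        g = pbest[pbest_f.argmin()].copy()
    return g, float(pbest_f.min())

# Example: minimize the sphere function.
print(pso(lambda p: float((p ** 2).sum()), dim=3))
```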

  10. Finding minimum gene subsets with heuristic breadth-first search algorithm for robust tumor classification

    PubMed Central

    2012-01-01

    Background Previous studies on tumor classification based on gene expression profiles suggest that gene selection plays a key role in improving classification performance. Moreover, finding important tumor-related genes with the highest accuracy is a very important task, because these genes might serve as tumor biomarkers, which is of great benefit not only to tumor molecular diagnosis but also to drug development. Results This paper proposes a novel gene selection method with rich biomedical meaning based on a Heuristic Breadth-first Search Algorithm (HBSA) to find as many optimal gene subsets as possible. Due to the curse of dimensionality, this type of method can suffer from over-fitting and selection bias problems. To address these potential problems, an HBSA-based ensemble classifier is constructed using a majority voting strategy from individual classifiers built on the selected gene subsets, and a novel HBSA-based gene ranking method is designed to find important tumor-related genes by measuring the significance of genes through their occurrence frequencies in the selected gene subsets. The experimental results on nine tumor datasets, including three pairs of cross-platform datasets, indicate that the proposed method can not only obtain better generalization performance but also find many important tumor-related genes. Conclusions The frequencies of the selected genes follow a power-law distribution, indicating that only a few top-ranked genes can be used as potential diagnosis biomarkers. Moreover, the top-ranked genes leading to very high prediction accuracy are closely related to specific tumor subtypes and even hub genes. Compared with other related methods, the proposed method achieves higher prediction accuracy with fewer genes. These findings are further justified by analyzing the top-ranked genes in the context of individual gene function, biological pathways, and protein-protein interaction networks. PMID:22830977
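
    Two small sketches of the ensemble and ranking ideas described above: majority voting over classifiers trained on different gene subsets, and ranking genes by their occurrence frequency across the selected subsets. The function signatures are assumptions for illustration.

```python
from collections import Counter

def majority_vote(classifiers, sample):
    """Ensemble prediction: each classifier was trained on one selected
    gene subset; the most common predicted label wins."""
    votes = [clf(sample) for clf in classifiers]
    return Counter(votes).most_common(1)[0][0]

def rank_genes(selected_subsets):
    """Rank genes by how often they occur across the selected subsets,
    mirroring the occurrence-frequency ranking described above."""
    freq = Counter(g for subset in selected_subsets for g in subset)
    return freq.most_common()  # [(gene, frequency), ...], highest first
```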

  11. Evaluating Simulation Heuristics in Monte-Carlo Tree Search and its Application to a Production Scheduling

    NASA Astrophysics Data System (ADS)

    Matsumoto, Shimpei; Kato, Kosuke; Hirosue, Noriaki; Ishii, Hiroaki

    2010-10-01

    This paper reports on simulation heuristics for Monte-Carlo Tree Search (MCTS) and shows an application example. MCTS, introduced by Coulom, is a best-first search in which pseudorandom simulations guide the solution of a problem. Recent improvements to MCTS have produced strong computer Go programs, a game with a large search space, and this success in selecting the best move has attracted much attention. So far, most reports on MCTS have concerned two-player games, and MCTS has rarely been used for one-player perfect-information games. MCTS does not need an admissible heuristic, so its application to one-player games is an interesting alternative. Additionally, one-player games such as puzzles are operated deterministically by a single player's decisions, so the sequences of state changes can be described as a network diagram with interdependences between operations. If MCTS for one-player games is available as a meta-heuristic algorithm, it can be used not only for combinatorial optimization problems but also for many practical problems. In particular, since MCTS does not depend fully on an evaluation function, solutions based on MCTS remain effective if the objective function is modified. This paper first investigates the application of Single-Player MCTS (SP-MCTS), introduced by Schadd et al., to a puzzle game called Bubble Breaker. It then shows the effectiveness of new simulation strategies for SP-MCTS and considers the differences between parameter settings. Based on the results, the paper discusses the potential of applying SP-MCTS to a scheduling problem.

  12. Hierarchical heuristic search using a Gaussian mixture model for UAV coverage planning.

    PubMed

    Lin, Lanny; Goodrich, Michael A

    2014-12-01

    During unmanned aerial vehicle (UAV) search missions, efficient use of UAV flight time requires flight paths that maximize the probability of finding the desired subject. The probability of detecting the subject from UAV sensor information can vary across search areas due to environmental elements like varying vegetation density or lighting conditions, making it likely that the UAV can only partially detect the subject. This adds another dimension of complexity to the already difficult (NP-hard) problem of finding an optimal search path. We present a new class of algorithms that account for partial detection in the form of a task difficulty map and produce paths that approximate the payoff of optimal solutions. The algorithms rely on the mode goodness ratio heuristic, which uses a Gaussian mixture model to prioritize search subregions, and they search for effective paths through the parameter space at different levels of resolution. We compare the performance of the new algorithms against two published algorithms (Bourgault's algorithm and the LHC-GW-CONV algorithm) in simulated searches with three real search and rescue scenarios, and show that the new algorithms outperform existing algorithms significantly and can find efficient paths whose payoffs are near optimal.
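
    A loose stand-in for the mixture-model prioritization idea, assuming scikit-learn is available: fit a Gaussian mixture to samples drawn from the subject-location distribution and rank components by weight relative to spread. The goodness measure here is a simplification; the paper's mode goodness ratio differs in detail.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def prioritize_subregions(location_samples, n_modes=3):
    """Fit a Gaussian mixture to (n, 2) points sampled from the
    probability distribution of the subject's location, then rank the
    mixture components as candidate search subregions."""
    gmm = GaussianMixture(n_components=n_modes, random_state=0)
    gmm.fit(location_samples)
    # Score each mode by its weight divided by its spread: heavy,
    # compact modes are the most promising subregions to search first.
    spreads = np.array([np.trace(c) for c in gmm.covariances_])
    goodness = gmm.weights_ / spreads
    order = np.argsort(goodness)[::-1]
    return gmm.means_[order], goodness[order]
```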

  13. On optimizing syntactic pattern recognition using tries and AI-based heuristic-search strategies.

    PubMed

    Badr, Ghada; Oommen, B John

    2006-06-01

    This paper deals with the problem of estimating, using enhanced artificial-intelligence (AI) techniques, a transmitted string X* by processing the corresponding string Y, which is a noisy version of X*. It is assumed that Y contains substitution, insertion, and deletion (SID) errors. The best estimate X+ of X* is defined as that element of a dictionary H that minimizes the generalized Levenshtein distance (GLD) D(X, Y) between X and Y, for all X ∈ H. In this paper, it is shown how to evaluate D(X, Y) for every X ∈ H simultaneously, when the edit distances are general and the maximum number of errors is not given a priori, and when H is stored as a trie. A new scheme called clustered beam search (CBS) is first introduced, which is a heuristic-based search approach that enhances the well-known beam-search (BS) techniques used in AI. The new scheme is then applied to the approximate string-matching problem when the dictionary is stored as a trie. The new technique is compared with the benchmark depth-first search (DFS) trie-based technique (with respect to time and accuracy) using large and small dictionaries. The results demonstrate a marked improvement of up to 75% with respect to the total number of operations needed on three benchmark dictionaries, while yielding an accuracy comparable to the optimal. Experiments are also done to show the benefits of the CBS over the BS when the search is done on the trie. The results also demonstrate a marked improvement (more than 91%) for large dictionaries.

  14. Animal, Vegetable, Mineral: A Method for Introducing Heuristics.

    ERIC Educational Resources Information Center

    Rivers, Thomas M.

    Students beginning a freshman composition class tend to regard writing as an editing process rather than as a process which encompasses intelligence, character, and humanity. Helping students understand and master heuristic procedures on the way to developing composition skills can be facilitated by the use of the game Twenty Questions to learn…

  15. Heuristic Search for Planning with Different Forced Goal-Ordering Constraints

    PubMed Central

    Zhang, Weiming; Cui, Jing; Zhu, Cheng; Huang, Jincai; Liu, Zhong

    2013-01-01

    Planning with forced goal-ordering (FGO) constraints has been studied many times over the years, but there are still major difficulties in realizing these FGOs in plan generation. In certain planning domains, all the FGOs exist in the initial state: no matter which approach is adopted to achieve a subgoal, all the subgoals should be achieved in a given sequence from the initial state, otherwise the planning may arrive at a deadlock. In other planning domains, there is no FGO in the initial state, but an FGO may arise during the planning process if a certain subgoal is achieved by an inappropriate approach. This paper illustrates that it is the excludable constraints among the goal achievement operations (GAO) of different subgoals that introduce FGOs into the planning problem, and that planning with FGOs is still a challenge for heuristic-search-based planners. A novel multistep forward search algorithm is then proposed which can solve planning problems with different FGOs efficiently. PMID:23935443

  16. Parsing heuristic and forward search in first-graders' game-play behavior.

    PubMed

    Paz, Luciano; Goldin, Andrea P; Diuk, Carlos; Sigman, Mariano

    2015-07-01

    Seventy-three children between 6 and 7 years of age were presented with a problem having ambiguous subgoal ordering. Performance in this task showed reliable fingerprints: (a) a non-monotonic dependence of performance as a function of the distance between the beginning and the end-states of the problem, (b) very high levels of performance when the first move was correct, and (c) states in which accuracy of the first move was significantly below chance. These features are consistent with a non-Markov planning agent, with an inherently inertial decision process, and that uses heuristics and partial problem knowledge to plan its actions. We applied a statistical framework to fit and test the quality of a proposed planning model (Monte Carlo Tree Search). Our framework allows us to parse out independent contributions to problem-solving based on the construction of the value function and on general mechanisms of the search process in the tree of solutions. We show that the latter are correlated with children's performance on an independent measure of planning, while the former is highly domain specific.

  17. A Hyper-Heuristic Ensemble Method for Static Job-Shop Scheduling.

    PubMed

    Hart, Emma; Sim, Kevin

    2016-01-01

    We describe a new hyper-heuristic method NELLI-GP for solving job-shop scheduling problems (JSSP) that evolves an ensemble of heuristics. The ensemble adopts a divide-and-conquer approach in which each heuristic solves a unique subset of the instance set considered. NELLI-GP extends an existing ensemble method called NELLI by introducing a novel heuristic generator that evolves heuristics composed of linear sequences of dispatching rules: each rule is represented using a tree structure and is itself evolved. Following a training period, the ensemble is shown to outperform both existing dispatching rules and a standard genetic programming algorithm on a large set of new test instances. In addition, it obtains superior results on a set of 210 benchmark problems from the literature when compared to two state-of-the-art hyper-heuristic approaches. Further analysis of the relationship between heuristics in the evolved ensemble and the instances each solves provides new insights into features that might describe similar instances.

  18. An adaptive large neighborhood search heuristic for Two-Echelon Vehicle Routing Problems arising in city logistics

    PubMed Central

    Hemmelmayr, Vera C.; Cordeau, Jean-François; Crainic, Teodor Gabriel

    2012-01-01

    In this paper, we propose an adaptive large neighborhood search heuristic for the Two-Echelon Vehicle Routing Problem (2E-VRP) and the Location Routing Problem (LRP). The 2E-VRP arises in two-level transportation systems such as those encountered in the context of city logistics. In such systems, freight arrives at a major terminal and is shipped through intermediate satellite facilities to the final customers. The LRP can be seen as a special case of the 2E-VRP in which vehicle routing is performed only at the second level. We have developed new neighborhood search operators by exploiting the structure of the two problem classes considered and have also adapted existing operators from the literature. The operators are used in a hierarchical scheme reflecting the multi-level nature of the problem. Computational experiments conducted on several sets of instances from the literature show that our algorithm outperforms existing solution methods for the 2E-VRP and achieves excellent results on the LRP. PMID:23483764
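
    A generic ALNS skeleton of the kind described above: destroy/repair operator pairs drawn with adaptive weights and a simulated-annealing acceptance test. The operators, reward scheme, and cooling schedule are illustrative placeholders, not the authors' configuration.

```python
import math
import random

def alns(initial, destroy_ops, repair_ops, cost, iters=5000, seed=0):
    """Adaptive large neighborhood search: pick destroy/repair operators
    with probability proportional to adaptive weights, accept worsening
    moves by simulated annealing, and reward operators behind new bests."""
    rng = random.Random(seed)
    w_d = [1.0] * len(destroy_ops)
    w_r = [1.0] * len(repair_ops)
    current = best = initial
    temp = 1000.0
    for _ in range(iters):
        i = rng.choices(range(len(destroy_ops)), weights=w_d)[0]
        j = rng.choices(range(len(repair_ops)), weights=w_r)[0]
        candidate = repair_ops[j](destroy_ops[i](current))
        delta = cost(candidate) - cost(current)
        if delta < 0 or rng.random() < math.exp(-delta / temp):
            current = candidate
            if cost(current) < cost(best):
                best = current
                w_d[i] += 1.0  # reward the operator pair behind a new best
                w_r[j] += 1.0
        temp *= 0.999          # cool down
    return best
```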

  20. Development of a core collection for ramie by heuristic search based on SSR markers

    PubMed Central

    Luan, Ming-Bao; Zou, Zi-Zheng; Zhu, Juan-Juan; Wang, Xiao-Fei; Xu, Ying; Ma, Qing-Hua; Sun, Zhi-Min; Chen, Jian-Hua

    2014-01-01

    There are more than 2000 ramie germplasms in the National Ramie Germplasm Nursery affiliated with the Institute of Bast Fiber Crops, Chinese Academy of Agricultural Science, China. As it is difficult to perform effective conservation, management, evaluation, and utilization of redundant genetic resources, it is necessary to construct a core collection by using molecular markers. In this study, a core collection of ramie consisting of 22 germplasms was constructed from 108 accessions by heuristic search based on 21 Simple Sequence Repeat (SSR) marker combinations. The results showed that there is a poor relationship between the core collection and the geographic distribution. The number of amplification bands for the core collection was the same as that for the entire collection. Shannon's index for three of the SSR primers (14%) and Nei's index for nine of the SSR primers (19%) were lower in the core collection than in the entire collection. The true core collection had wider genetic diversity compared with the random core collection. Collectively, the core collection constructed in this study is reliable and represents the genetic diversity of all the 108 accessions. PMID:26019563

  1. DETECTORS AND EXPERIMENTAL METHODS: Heuristic approach for peak regions estimation in gamma-ray spectra measured by a NaI detector

    NASA Astrophysics Data System (ADS)

    Zhu, Meng-Hua; Liu, Liang-Gang; You, Zhong; Xu, Ao-Ao

    2009-03-01

    In this paper, a heuristic approach based on Slavic's peak searching method is employed to estimate the width of peak regions for background removal. Synthetic and experimental data are used to test the method. Using the peak regions estimated by the proposed method over the whole spectrum, we find it simple and effective enough to be used together with the Statistics-sensitive Nonlinear Iterative Peak-Clipping (SNIP) method.
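
    A minimal sketch of the SNIP background-clipping step that the estimated peak-region widths feed into; choosing `max_width` from those estimated widths is the paper's contribution and is assumed given here.

```python
import numpy as np

def snip_background(spectrum, max_width):
    """Statistics-sensitive Nonlinear Iterative Peak-Clipping: at window
    half-width w, replace each channel with the minimum of itself and
    the average of its neighbours w channels away; peaks are clipped
    while the slowly varying background survives."""
    bg = np.asarray(spectrum, dtype=float).copy()
    n = len(bg)
    for w in range(1, max_width + 1):
        clipped = bg.copy()
        for i in range(w, n - w):
            clipped[i] = min(bg[i], 0.5 * (bg[i - w] + bg[i + w]))
        bg = clipped
    return bg  # subtract from the spectrum to remove the background
```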

  2. Heuristic urban transportation network design method, a multilayer coevolution approach

    NASA Astrophysics Data System (ADS)

    Ding, Rui; Ujang, Norsidah; Hamid, Hussain bin; Manan, Mohd Shahrudin Abd; Li, Rong; Wu, Jianjun

    2017-08-01

    The design of urban transportation networks plays a key role in the urban planning process, and the coevolution of urban networks has recently garnered significant attention in the literature. However, most recent articles are based on networks that are essentially planar. In this research, we propose a heuristic multilayer urban network coevolution model in which a lower-layer network and an upper-layer network grow in association and stimulate one another. We first use the relative neighbourhood graph and the Gabriel graph to simulate the structure of rail and road networks, respectively. Through simulation we find that, when a specific number of nodes are added, the total travel cost ratio between an expanded network and the initial lower-layer network reaches its lowest value. The cooperation strength Λ and the average operation speed ratio Θ show that transit users' route choices change dramatically through the coevolution process and that their decisions, in turn, affect the multilayer network structure. We also note that the simulated relation between the Gini coefficient of betweenness centrality, Θ, and Λ has an optimal point for network design. This research could inspire the analysis of urban network topology features and the assessment of urban growth trends.

  3. A heuristic method to compute more accurate TM-scores

    NASA Astrophysics Data System (ADS)

    Li, Shuai Guo; Lim, Yun Kai; Ng, Yen Kaow

    2017-04-01

    Many scoring functions have been proposed to evaluate the similarity between protein structure models. Among these, a popular measure is the template modeling score (TM-score), introduced by Zhang and Skolnick. At present, the TM-score is calculated through a heuristic algorithm with no accuracy guarantee. In this paper, we propose an algorithm that computes more accurate TM-scores through the use of the very fast Kabsch algorithm, which is commonly used to compute the Root Mean Square Deviation (RMSD). Our algorithm first obtains an approximation of the superposition of the protein models that optimizes the TM-score (for example, through OptGDT). It then iteratively refines this superposition using the rotation axes discovered by the Kabsch algorithm. The algorithm is implemented in C++ as a tool that runs in time comparable to Zhang and Skolnick's TM-score software but consistently produces TM-scores that are more accurate. The tool can be downloaded from https://github.com/kalngyk/tm2.
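
    A small sketch of the two building blocks named above: the Kabsch rotation via SVD and the TM-score of a fixed superposition. `d0` is the length-dependent scale from the TM-score definition and is assumed precomputed; the paper's iterative refinement loop is not shown.

```python
import numpy as np

def kabsch(P, Q):
    """Least-RMSD rotation superposing point set P onto Q via SVD of the
    covariance matrix; P and Q are (n, 3) arrays of paired coordinates."""
    P = P - P.mean(axis=0)
    Q = Q - Q.mean(axis=0)
    H = P.T @ Q
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against a reflection
    D = np.diag([1.0, 1.0, d])
    return Vt.T @ D @ U.T                   # rotation R; apply as R @ p

def tm_score_term(P, Q, d0):
    """TM-score of a fixed superposition: mean of 1 / (1 + (d_i/d0)^2)
    over aligned residue pairs (the quantity maximized over rotations)."""
    d = np.linalg.norm(P - Q, axis=1)
    return float(np.mean(1.0 / (1.0 + (d / d0) ** 2)))
```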

  4. PROOF OF CONCEPT FOR A HUMAN RELIABILITY ANALYSIS METHOD FOR HEURISTIC USABILITY EVALUATION OF SOFTWARE

    SciTech Connect

    Ronald L. Boring; David I. Gertman; Jeffrey C. Joe; Julie L. Marble

    2005-09-01

    An ongoing issue within human-computer interaction (HCI) is the need for simplified or “discount” methods. The current economic slowdown has necessitated innovative methods that are results driven and cost effective. The myriad methods of design and usability are currently being cost-justified, and new techniques are actively being explored that meet current budgets and needs. Recent efforts in human reliability analysis (HRA) are highlighted by the ten-year development of the Standardized Plant Analysis Risk HRA (SPAR-H) method. The SPAR-H method has been used primarily for determining human-centered risk at nuclear power plants. The SPAR-H method, however, shares task analysis underpinnings with HCI. Despite this methodological overlap, there is currently no HRA approach deployed in heuristic usability evaluation. This paper presents an extension of the existing SPAR-H method to be used as part of heuristic usability evaluation in HCI.

  5. Utilization of Tabu search heuristic rules in sampling-based motion planning

    NASA Astrophysics Data System (ADS)

    Khaksar, Weria; Hong, Tang Sai; Sahari, Khairul Salleh Mohamed; Khaksar, Mansoor

    2015-05-01

    Path planning in unknown environments is one of the most challenging research areas in robotics. In this class of path planning, the robot acquires information from its sensory system. Sampling-based path planning is one of the best-known approaches, with low memory and computational requirements, and has been studied by many researchers over the past few decades. We propose a sampling-based algorithm for path planning in unknown environments using Tabu search. The Tabu search component of the proposed method guides the sampling toward the most promising areas and makes the sampling procedure more intelligent. The simulation results show the efficient performance of the proposed approach in different types of environments. We also compare the performance of the algorithm with some well-known path planning approaches, including Bug1, Bug2, PRM, RRT and the Visibility Graph. The comparison results support the claim of superiority of the proposed algorithm.

  6. Formal and heuristic system decomposition methods in multidisciplinary synthesis

    NASA Astrophysics Data System (ADS)

    Bloebaum, Christina Lynne

    The multidisciplinary interactions which exist in large scale engineering design problems provide a unique set of difficulties. These difficulties are associated primarily with unwieldy numbers of design variables and constraints, and with the interdependencies of the discipline analysis modules. Such obstacles require design techniques which account for the inherent disciplinary couplings in the analyses and optimizations. The objective of this work is to develop an efficient holistic design synthesis methodology that takes advantage of the synergistic nature of integrated design. Although the design process encompasses several stages in which optimization methods could be applied, the present study addresses the applications of optimization in the preliminary design stage, in which the most capability for positive change exists. A primary concern in this stage involves implementation of an accurate and efficient mathematical representation of large engineering systems. Without such a representation, meaningful design synthesis is impossible. Multilevel decomposition methods provide a systematic approach for decoupling the large complex systems found in multidisciplinary design problems into smaller, more manageable subsystems. These methods account for the couplings between the intrinsically linked disciplinary analysis modules on the basis of a linear sensitivity analysis. In a majority of such efforts, the decomposition is governed either by an obvious hierarchy in the system or on the basis of discipline.

  7. Integrating heuristic evaluation with cognitive walkthrough: development of a hybrid usability inspection method.

    PubMed

    Kushniruk, Andre W; Monkman, Helen; Tuden, Danica; Bellwood, Paule; Borycki, Elizabeth M

    2015-01-01

    Developing more usable healthcare information systems has become an important goal in health informatics. Although methods from usability engineering have appeared and been effectively applied in the design and evaluation of healthcare systems, there continue to be reports of unusable systems being deployed and of issues with the adoption of healthcare IT worldwide. In this paper we propose a new cost-effective usability engineering approach for healthcare IT that integrates two of the major usability inspection approaches, heuristic evaluation and cognitive walkthrough, into one combined approach that leverages the advantages of both. The approach is described, along with a pilot application of the method in evaluating the usability of a well-known electronic health record system. Implications and future work are also described.

  8. A new heuristic method for approximating the number of local minima in partial RNA energy landscapes.

    PubMed

    Albrecht, Andreas A; Day, Luke; Abdelhadi Ep Souki, Ouala; Steinhöfel, Kathleen

    2016-02-01

    The analysis of energy landscapes plays an important role in mathematical modelling, simulation and optimisation. Among the main features of interest are the number and distribution of local minima within the energy landscape. Granier and Kallel proposed in 2002 a new sampling procedure for estimating the number of local minima. In the present paper, we focus on improved heuristic implementations of the general framework devised by Granier and Kallel with regard to run-time behaviour and accuracy of predictions. The new heuristic method is demonstrated for the case of partial energy landscapes induced by RNA secondary structures. While the computation of minimum free energy RNA secondary structures has been studied for a long time, the analysis of folding landscapes has gained momentum over the past years in the context of co-transcriptional folding and deeper insights into cell processes. The new approach has been applied to ten RNA instances of length between 99 nt and 504 nt and their respective partial energy landscapes defined by secondary structures within an energy offset ΔE above the minimum free energy conformation. The number of local minima within the partial energy landscapes ranges from 1440 to 3441. Our heuristic method produces, for the best approximations, an average deviation below 3.0% from the true number of local minima.
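
    A simple stand-in for the sampling idea (not Granier and Kallel's estimator, which additionally accounts for basin sizes): descend from random starting points and apply the Chao1 abundance estimator to the observed minima. `sample_minimum` is an assumed user-supplied routine that descends from a random start and returns a hashable identifier of the local minimum reached.

```python
from collections import Counter

def estimate_num_minima(sample_minimum, n_samples=1000):
    """Estimate the total number of local minima from repeated random
    descents using the Chao1 estimator on observation frequencies."""
    counts = Counter(sample_minimum() for _ in range(n_samples))
    observed = len(counts)
    f1 = sum(1 for c in counts.values() if c == 1)  # minima seen once
    f2 = sum(1 for c in counts.values() if c == 2)  # minima seen twice
    if f2 == 0:
        return observed + f1 * (f1 - 1) / 2.0       # bias-corrected form
    return observed + f1 * f1 / (2.0 * f2)
```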

  9. Monte Carlo method with heuristic adjustment for irregularly shaped food product volume measurement.

    PubMed

    Siswantoro, Joko; Prabuwono, Anton Satria; Abdullah, Azizi; Idrus, Bahari

    2014-01-01

    Volume measurement plays an important role in the production and processing of food products. Various methods have been proposed to measure the volume of irregularly shaped food products based on 3D reconstruction. However, 3D reconstruction comes at a high computational cost, and some volume measurement methods based on it have low accuracy. Another approach is the Monte Carlo method, which measures volume using random points: it only requires information on whether random points fall inside or outside an object and does not require a 3D reconstruction. This paper proposes volume measurement using a computer vision system for irregularly shaped food products, without 3D reconstruction, based on the Monte Carlo method with heuristic adjustment. Five images of a food product were captured using five cameras and processed to produce binary images. Monte Carlo integration with heuristic adjustment was performed to measure the volume based on the information extracted from the binary images. The experimental results show that the proposed method provides high accuracy and precision compared to the water displacement method. In addition, the proposed method is more accurate and faster than the space carving method.
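
    A minimal sketch of plain Monte Carlo volume estimation; the paper's heuristic adjustment and its camera-derived inside/outside test are abstracted into the assumed `inside` predicate.

```python
import random

def monte_carlo_volume(inside, bounds, n=100_000, seed=0):
    """Estimate an object's volume by sampling points uniformly in a
    known bounding box and counting the fraction that fall inside."""
    rng = random.Random(seed)
    (x0, x1), (y0, y1), (z0, z1) = bounds
    box_volume = (x1 - x0) * (y1 - y0) * (z1 - z0)
    hits = sum(
        inside(rng.uniform(x0, x1), rng.uniform(y0, y1), rng.uniform(z0, z1))
        for _ in range(n)
    )
    return box_volume * hits / n

# Example: volume of the unit sphere (true value ~4.18879).
print(monte_carlo_volume(lambda x, y, z: x * x + y * y + z * z <= 1.0,
                         ((-1, 1), (-1, 1), (-1, 1))))
```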

  10. Iterative-deepening heuristic search for optimal and semi-optimal resource allocation

    NASA Technical Reports Server (NTRS)

    Bridges, Susan M.; Johannes, James D.

    1987-01-01

    It is demonstrated that when iterative-deepening A* (IDA*) is applied to one type of resource allocation problem, it uses far less storage than A* but opens far more nodes and thus has unacceptable time complexity. This is shown to be due, at least in part, to the low-valued effective branching factor that is characteristic of problems with real-valued cost functions. The semi-optimal, ε-admissible IDA*ε search algorithm that the authors described is shown to open fewer nodes than both A* and IDA*, with storage complexity proportional to the depth of the search tree.
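
    A compact IDA* skeleton matching the description above: a depth-first search bounded by f = g + h, with the bound raised each pass to the smallest f-value that exceeded it. `neighbors` is assumed to yield `(step_cost, successor)` pairs and `h` to be an admissible heuristic.

```python
def ida_star(start, h, neighbors, is_goal):
    """Iterative-deepening A*: memory proportional to search depth."""
    bound = h(start)

    def dfs(node, g, bound, path):
        f = g + h(node)
        if f > bound:
            return f, None          # report the overshoot for the next bound
        if is_goal(node):
            return f, path
        minimum = float("inf")
        for step_cost, nxt in neighbors(node):
            if nxt in path:         # avoid cycles along the current path
                continue
            t, found = dfs(nxt, g + step_cost, bound, path + [nxt])
            if found is not None:
                return t, found
            minimum = min(minimum, t)
        return minimum, None

    while True:
        t, found = dfs(start, 0.0, bound, [start])
        if found is not None:
            return found
        if t == float("inf"):
            return None             # no solution exists
        bound = t                   # deepen the f-threshold
```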

  11. Job-shop scheduling with a combination of evolutionary and heuristic methods

    NASA Astrophysics Data System (ADS)

    Patkai, Bela; Torvinen, Seppo

    1999-08-01

    Since almost all scheduling problems are NP-hard (they cannot be solved in polynomial time), companies that need a realistic scheduling system face serious limitations in the available methods for finding an optimal schedule, especially if the given environment requires adaptation to dynamic variations. Exact methods do find an optimal schedule, but the size of the problems they can solve is very limited, ruling out the required scalability. The solution presented in this paper is a simple, multi-pass heuristic method that aims to avoid the limitations of other well-known formulations. Even though dispatching rules are fast and provide near-optimal solutions in most cases, they are severely limited in efficiency, especially when the schedule builder must satisfy a significant number of constraints. That is the main motivation for adding a simplified genetic algorithm to the dispatching rules, which, due to its stochastic nature, is also a heuristic. The scheduling problem comes from a mid-sized Finnish factory whose up-to-date manufacturing data was used throughout the investigation for the sake of realistic calculations.

  12. Heuristic-biased stochastic sampling

    SciTech Connect

    Bresina, J.L.

    1996-12-31

    This paper presents a search technique for scheduling problems called Heuristic-Biased Stochastic Sampling (HBSS). The underlying assumption behind the HBSS approach is that strictly adhering to a search heuristic often does not yield the best solution and, therefore, exploration off the heuristic path can prove fruitful. Within the HBSS approach, the balance between heuristic adherence and exploration can be controlled according to the confidence one has in the heuristic. By varying this balance, encoded as a bias function, the HBSS approach encompasses a family of search algorithms of which greedy search and completely random search are extreme members. We present empirical results from an application of HBSS to the real-world problem of observation scheduling. These results show that, with the proper bias function, it can be easy to outperform greedy search.
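
    A minimal sketch of one HBSS decision: candidates are ranked by the heuristic and one is drawn with probability proportional to a bias function of its rank. The `1/rank` bias used as the default is only one example of the family described; a very steep bias approaches greedy search, a flat one approaches random search.

```python
import random

def hbss_choose(candidates, heuristic, bias=lambda rank: 1.0 / rank, rng=random):
    """One heuristic-biased stochastic sampling choice: rank candidates
    by the heuristic (best first), then sample by bias(rank) weights."""
    ranked = sorted(candidates, key=heuristic)
    weights = [bias(r) for r in range(1, len(ranked) + 1)]
    return rng.choices(ranked, weights=weights)[0]

# Example: lower heuristic value = more promising candidate.
print(hbss_choose([4, 2, 7, 1], heuristic=lambda c: c))
```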

  13. An efficient heuristic method for dynamic portfolio selection problem under transaction costs and uncertain conditions

    NASA Astrophysics Data System (ADS)

    Najafi, Amir Abbas; Pourahmadi, Zahra

    2016-04-01

    Selecting the optimal combination of assets in a portfolio is one of the most important decisions in investment management. Because investment is inherently a long-term activity, considering the portfolio optimization problem over only a single period may forfeit opportunities that a long-term view could exploit. Hence, we extend the problem from a single-period to a multi-period model. We include trading costs and uncertain conditions in this model, which makes it more realistic but also more complex, and we propose an efficient heuristic method to tackle it. The efficiency of the method is examined and compared with the results of rolling single-period optimization and the buy-and-hold method, which shows the superiority of the proposed method.

  14. Heuristic status polling

    SciTech Connect

    Archer, Charles J.; Blocksome, Michael A.; Heidelberger, Philip; Kumar, Sameer; Parker, Jeffrey J.; Ratterman, Joseph D.

    2011-06-07

    Methods, compute nodes, and computer program products are provided for heuristic status polling of a component in a computing system. Embodiments include receiving, by a polling module from a requesting application, a status request requesting status of a component; determining, by the polling module, whether an activity history for the component satisfies heuristic polling criteria; polling, by the polling module, the component for status if the activity history for the component satisfies the heuristic polling criteria; and not polling, by the polling module, the component for status if the activity history for the component does not satisfy the heuristic criteria.

  15. Heuristic approach to image registration

    NASA Astrophysics Data System (ADS)

    Gertner, Izidor; Maslov, Igor V.

    2000-08-01

    Image registration, i.e., the correct mapping of images obtained from different sensor readings onto a common reference frame, is a critical part of multi-sensor ATR/AOR systems based on readings from different types of sensors. In order to fuse two different sensor readings of the same object, the readings have to be put into a common coordinate system. This task can be formulated as an optimization problem in the space of all possible affine transformations of an image. In this paper, a combination of heuristic methods is explored to register gray-scale images. A modification of the Genetic Algorithm is used as the first step in the global search for the optimal transformation. It covers the entire search space with (randomly or heuristically) scattered probe points and helps significantly reduce the search space to a subspace of potentially most successful transformations. Due to its discrete character, however, the Genetic Algorithm generally cannot converge once it comes close to the optimum. Its termination point can be specified either as a predefined number of generations or as the achievement of a certain acceptable convergence level. To refine the search, the potentially optimal subspaces are then searched using the Tabu Search and Simulated Annealing methods, which are more delicate and efficient for local search.

  16. SP-100 shield design automation process using expert system and heuristic search techniques

    SciTech Connect

    Marcille, T.F.; Protsik, R.; Deane, N.A.; Hoover, D.G. )

    1993-01-15

    The SP-100 shield subsystem design process has been modified to utilize the GE Corporate Research and Development program ENGINEOUS (Tong 1990). ENGINEOUS is a software system that automates the use of Computer Aided Engineering (CAE) analysis programs in the engineering design process. The shield subsystem design process incorporates a nuclear subsystems design and performance code, a two-dimensional neutral particle transport code, several input processors, and two general-purpose neutronic output processors. Coupling these programs within ENGINEOUS provides automatic transition paths between applications, with no source code modifications. ENGINEOUS captures human design knowledge, as well as information about the specific CAE applications, and stores this information in knowledge base files. The knowledge base information is used by the ENGINEOUS expert system to drive knowledge-directed and knowledge-supplemented search modules to find an optimum shield design for a given reactor definition, ensuring that specified constraints are satisfied. Alternate designs, not accommodated in the optimization design rules, can readily be explored through the use of a parametric study capability.

  18. Managing Heuristics as a Method of Inquiry in Autobiographical Graphic Design Theses

    ERIC Educational Resources Information Center

    Ings, Welby

    2011-01-01

    This article draws on case studies undertaken in postgraduate research at AUT University, Auckland. It seeks to address a number of issues related to heuristic inquiries employed by graphic design students who use autobiographical approaches when developing research-based theses. For this type of thesis, heuristics as a system of inquiry may…

  20. A Tabu-Search Heuristic for Deterministic Two-Mode Blockmodeling of Binary Network Matrices

    ERIC Educational Resources Information Center

    Brusco, Michael; Steinley, Douglas

    2011-01-01

    Two-mode binary data matrices arise in a variety of social network contexts, such as the attendance or non-attendance of individuals at events, the participation or lack of participation of groups in projects, and the votes of judges on cases. A popular method for analyzing such data is two-mode blockmodeling based on structural equivalence, where…

  1. AM: An Artificial Intelligence Approach to Discovery in Mathematics as Heuristic Search

    DTIC Science & Technology

    1976-07-01

    the domain of elementary number theory. AM was not designed to prove anything, but it did conjecture many well-known relationships (e.g., the unique...better research than either could alone. Although most of the concepts AM proposed and developed were already very well known, AM defined some of them...do mathematical research well, I claim that it is necessary and sufficient to have good methods for proposing new concepts from existing ones, and for

  2. Exact and heuristic methods for network completion for time-varying genetic networks.

    PubMed

    Nakajima, Natsu; Akutsu, Tatsuya

    2014-01-01

    Robustness in biological networks can be regarded as an important feature of living systems. A system maintains its functions against internal and external perturbations, leading to topological changes in the network with varying delays. To understand the flexibility of biological networks, we propose a novel approach to analyze time-dependent networks, based on the framework of network completion, which aims to make the minimum amount of modifications to a given network so that the resulting network is most consistent with the observed data. We have developed a novel network completion method for time-varying networks by extending our previous method for the completion of stationary networks. In particular, we introduce a double dynamic programming technique to identify change time points and required modifications. Although this extended method allows us to guarantee the optimality of the solution, this method has relatively low computational efficiency. In order to resolve this difficulty, we developed a heuristic method for speeding up the calculation of minimum least squares errors. We demonstrate the effectiveness of our proposed methods through computational experiments using synthetic data and real microarray gene expression data. The results indicate that our methods exhibit good performance in terms of completing and inferring gene association networks with time-varying structures.

  3. Constraining planetary atmospheric density: application of heuristic search algorithms to aerodynamic modeling of impact ejecta trajectories

    NASA Astrophysics Data System (ADS)

    Liu, Z. Y. C.; Shirzaei, M.

    2015-12-01

    Impact craters on the terrestrial planets are typically surrounded by a continuous ejecta blanket whose initial emplacement is via ballistic sedimentation. Following an impact event, a significant volume of material is ejected, and the falling debris surrounds the crater. Aerodynamic rules govern the flight paths and determine the spatial distribution of the ejecta. Thus, for planets with an atmosphere, the preserved ejecta deposit directly records the interaction of ejecta and atmosphere at the time of impact. In this study, we develop a new framework to establish links between the distribution of the ejecta, the age of the impact, and the properties of the local atmosphere. Given the radial distance of the continuous ejecta extent from the crater, an inverse aerodynamic modeling approach is employed to estimate the local atmospheric drag and density, as well as the lift forces, at the time of impact. Based on earlier studies, we incorporate reasonable value ranges for ejection angle, initial velocity, aerodynamic drag, and lift in the model. In order to solve the trajectory differential equations and obtain the best estimate of atmospheric density along with the associated uncertainties, a genetic algorithm is applied. The method is validated using synthetic data sets as well as detailed maps of impact ejecta associated with five fresh martian and two lunar impact craters, with diameters of 20-50 m and 10-20 m, respectively. The estimated air densities for the martian craters range from 0.014 to 0.028 kg/m3, consistent with the recent surface atmospheric density measurement of 0.015-0.020 kg/m3. This consistency indicates the robustness of the presented methodology. The inversion results for the lunar craters yield air densities of 0.003-0.008 kg/m3, suggesting that the inversion is accurate to the second decimal place. This framework will be applied to older martian craters with preserved ejecta blankets, which is expected to help constrain the long-term evolution of the martian atmosphere.

  4. The Effect of Using the Lakatosian Heuristic Method to Teach the Surface Area of a Cone on Students' Achievement According to Bloom's Taxonomy Levels

    ERIC Educational Resources Information Center

    Dimitriou-Hadjichristou, Chrysoula; Ogbonnaya, Ugorji I.

    2015-01-01

    This paper reports a study on the effect of using the Lakatosian heuristic method to teach the surface area of a cone (SAC) on students' achievement according to Bloom's taxonomy levels. Two groups of students (experimental and control) participated in the study. The experimental group (n = 20) was taught using the Lakatosian heuristic method…

  5. Effective heuristics and meta-heuristics for the quadratic assignment problem with tuned parameters and analytical comparisons

    NASA Astrophysics Data System (ADS)

    Bashiri, Mahdi; Karimi, Hossein

    2012-07-01

    The quadratic assignment problem (QAP) is a well-known problem in facility location and layout, belonging to the NP-complete class. Many heuristic and meta-heuristic methods for the QAP have been presented in the literature. In this paper, we applied 2-opt, greedy 2-opt, 3-opt, greedy 3-opt, and VNZ as heuristic methods, and tabu search (TS), simulated annealing, and particle swarm optimization as meta-heuristic methods, to the QAP. This research compares the relative percentage deviation of these solution qualities from the best known solutions reported in QAPLIB. Furthermore, a tuning method is applied to the meta-heuristic parameters. Results indicate that TS is the best in 31% of QAPs and the IFLS method from the literature is the best in 58% of QAPs; the two methods perform equally in 11% of the test problems. Also, TS has a favorable computational time among the heuristic and meta-heuristic methods.
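
    For concreteness, the QAP objective and a plain 2-opt (pairwise swap) descent of the kind compared above; the full objective is recomputed per swap for clarity rather than speed, whereas practical implementations use an incremental delta.

```python
import numpy as np

def qap_cost(perm, F, D):
    """QAP objective: sum over i, j of F[i, j] * D[perm[i], perm[j]],
    where F is the flow matrix and D the distance matrix."""
    perm = np.asarray(perm)
    return float((F * D[np.ix_(perm, perm)]).sum())

def two_opt(perm, F, D):
    """2-opt descent: keep exchanging the locations assigned to two
    facilities while the objective improves."""
    perm = np.array(perm)
    n = len(perm)
    improved = True
    while improved:
        improved = False
        for i in range(n - 1):
            for j in range(i + 1, n):
                candidate = perm.copy()
                candidate[i], candidate[j] = candidate[j], candidate[i]
                if qap_cost(candidate, F, D) < qap_cost(perm, F, D):
                    perm, improved = candidate, True
    return perm
```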

  6. Recursive heuristic classification

    NASA Technical Reports Server (NTRS)

    Wilkins, David C.

    1994-01-01

    The author will describe a new problem-solving approach called recursive heuristic classification, whereby a subproblem of heuristic classification is itself formulated and solved by heuristic classification. This allows the construction of more knowledge-intensive classification programs in a way that yields a clean organization. Further, standard knowledge acquisition and learning techniques for heuristic classification can be used to create, refine, and maintain the knowledge base associated with the recursively called classification expert system. The method of recursive heuristic classification was used in the Minerva blackboard shell for heuristic classification. Minerva recursively calls itself every problem-solving cycle to solve the important blackboard scheduler task, which involves assigning a desirability rating to alternative problem-solving actions. Knowing these ratings is critical to the use of an expert system as a component of a critiquing or apprenticeship tutoring system. One innovation of this research is a method called dynamic heuristic classification, which allows selection among dynamically generated classification categories instead of requiring them to be pre-enumerated.

  8. Comparison of heuristic and cognitive walkthrough usability evaluation methods for evaluating health information systems.

    PubMed

    Khajouei, Reza; Zahiri Esfahani, Misagh; Jahani, Yunes

    2017-04-01

    There are several user-based and expert-based usability evaluation methods that may perform differently according to the context in which they are used. The objective of this study was to compare 2 expert-based methods, heuristic evaluation (HE) and cognitive walkthrough (CW), for evaluating the usability of health care information systems. Five evaluators independently evaluated a medical office management system using HE and CW. We compared the 2 methods in terms of the number of identified usability problems, their severity, and the coverage of each method. In total, 156 problems were identified using the 2 methods. HE identified a significantly higher number of problems related to the "satisfaction" attribute (P = .002). The number of problems identified using CW concerning the "learnability" attribute was significantly higher than those identified using HE (P = .005). There was no significant difference between the numbers of problems identified by HE based on different usability attributes (P = .232). Results of CW showed a significant difference between the numbers of problems related to usability attributes (P < .0001). The average severity of problems identified using CW was significantly higher than that of HE (P < .0001). This study showed that HE and CW do not differ significantly in terms of the number of usability problems identified, but they differ based on the severity of problems and the coverage of some usability attributes. The results suggest that CW would be the preferred method for evaluating systems intended for novice users and HE for users who have experience with similar systems. However, more studies are needed to support this finding.

  9. Associating optical measurements of MEO and GEO objects using Population-Based Meta-Heuristic methods

    NASA Astrophysics Data System (ADS)

    Zittersteijn, M.; Vananti, A.; Schildknecht, T.; Dolado Perez, J. C.; Martinot, V.

    2016-11-01

    Currently several thousands of objects are being tracked in the MEO and GEO regions through optical means. The problem faced in this framework is that of Multiple Target Tracking (MTT). The MTT problem quickly becomes an NP-hard combinatorial optimization problem. This means that the effort required to solve the MTT problem increases exponentially with the number of tracked objects. In an attempt to find an approximate solution of sufficient quality, several Population-Based Meta-Heuristic (PBMH) algorithms are implemented and tested on simulated optical measurements. These first results show that one of the tested algorithms, namely the Elitist Genetic Algorithm (EGA), consistently displays the desired behavior of finding good approximate solutions before reaching the optimum. The results further suggest that the algorithm possesses a polynomial time complexity, as the computation times are consistent with a polynomial model. With the advent of improved sensors and a heightened interest in the problem of space debris, it is expected that the number of tracked objects will grow by an order of magnitude in the near future. This research aims to provide a method that can treat the association and orbit determination problems simultaneously, and is able to efficiently process large data sets with minimal manual intervention.

  10. Heuristic decision making in medicine

    PubMed Central

    Marewski, Julian N.; Gigerenzer, Gerd

    2012-01-01

    Can less information be more helpful when it comes to making medical decisions? Contrary to the common intuition that more information is always better, the use of heuristics can help both physicians and patients to make sound decisions. Heuristics are simple decision strategies that ignore part of the available information, basing decisions on only a few relevant predictors. We discuss: (i) how doctors and patients use heuristics; and (ii) when heuristics outperform information-greedy methods, such as regressions in medical diagnosis. Furthermore, we outline those features of heuristics that make them useful in health care settings. These features include their surprising accuracy, transparency, and wide accessibility, as well as the low costs and little time required to employ them. We close by explaining one of the statistical reasons why heuristics are accurate, and by pointing to psychiatry as one area for future research on heuristics in health care. PMID:22577307

  11. The high performing backtracking algorithm and heuristic for the sequence-dependent setup times flowshop problem with total weighted tardiness

    NASA Astrophysics Data System (ADS)

    Zheng, Jun-Xi; Zhang, Ping; Li, Fang; Du, Guang-Long

    2016-09-01

    Although the sequence-dependent setup times flowshop problem with the total weighted tardiness minimization objective is widespread in industry, work on the problem has been scant in the existing literature. To the authors' best knowledge, the NEH-EWDD heuristic and the Iterated Greedy (IG) algorithm with descent local search have been regarded as the high-performing heuristic and the state-of-the-art algorithm for the problem; both are based on insertion search. In this article, an efficient backtracking algorithm and a novel heuristic (HPIS) are first presented for insertion search. Accordingly, two heuristics are introduced: one is NEH-EWDD with HPIS for insertion search, and the other is the combination of NEH-EWDD and both methods. Furthermore, the authors improve the IG algorithm with the proposed methods. Finally, experimental results show that both the proposed heuristics and the improved IG (IG*) significantly outperform the original ones.
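
    A sketch of the NEH-style insertion search the record refers to, assuming a caller-supplied priority rule (earliest weighted due date in the case of NEH-EWDD) and a caller-supplied objective; the paper's backtracking algorithm and HPIS refinement are not shown.

```python
def neh(jobs, priority, sequence_cost):
    """NEH-style construction: order jobs by a priority rule, then insert
    each job at the position of the partial sequence that minimizes the
    objective (e.g., total weighted tardiness via `sequence_cost`)."""
    ordered = sorted(jobs, key=priority)
    sequence = []
    for job in ordered:
        best_seq, best_cost = None, float("inf")
        # Try every insertion position in the current partial sequence.
        for pos in range(len(sequence) + 1):
            trial = sequence[:pos] + [job] + sequence[pos:]
            c = sequence_cost(trial)
            if c < best_cost:
                best_seq, best_cost = trial, c
        sequence = best_seq
    return sequence
```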

  12. Iterated sequence databank search methods.

    PubMed

    Taylor, W R; Brown, N P

    1999-06-15

    Iterated sequence databank search methods were assessed from the viewpoint of someone with the sequence of a novel gene product wishing to find distant relatives of their protein and, with the specific searches against the PDB, also hoping to find a relative of known structure. We examined three methods in detail, spanning a range from simple pattern-matching to sophisticated weighted profiles. Rather than apply these methods 'blindly' (with default parameters) to a large number of test queries, we concentrated on the globins, allowing a more detailed investigation of each method on different data subsets with different parameter settings. Despite their widespread use, regular-expression matching proved to be very limited, seldom extending beyond the sub-family from which the pattern was derived. To attain any generality, the patterns had to be 'stripped down' to include only the most highly conserved parts. The QUEST program avoided these problems by introducing a more flexible (weighted) matching. On the PDB sequences this was highly effective, missing only a few globins with probes based on each sub-family or even a single representative from each sub-family. In addition, very few false positives were encountered, and those that did match often only did so for a few cycles before being lost again. On the larger sequence collection, however, QUEST encountered problems with maintaining (or achieving) the alignment of the full globin family. psi-BLAST also recognised almost all the globins when matching against the PDB sequences, typically missing three or four of the most distantly related sequences while picking up a few false positives. In contrast to QUEST, psi-BLAST performed very well on the larger databank, finding almost a full collection of globins while retaining the same proportion of false positives. SAM applied to the PDB sequences performed reasonably well with the myoglobin and hemoglobin families as probes, missing, typically

  13. The double-assignment method for the exponential chaotic tabu search in quadratic assignment problems

    NASA Astrophysics Data System (ADS)

    Shibata, Kazuaki; Horio, Yoshihiko; Aihara, Kazuyuki

    The quadratic assignment problem (QAP) is one of the NP-hard combinatorial optimization problems. An exponential chaotic tabu search using a 2-opt algorithm driven by chaotic neuro-dynamics has been proposed as one heuristic method for solving QAPs. In this paper we first propose a new local search, the double-assignment method, suitable for the exponential chaotic tabu search, which adopts features of the Lin-Kernighan algorithm. We then introduce chaotic neuro-dynamics into the double-assignment method to propose a novel exponential chaotic tabu search. We further improve the proposed exponential chaotic tabu search with the double-assignment method by enhancing the effect of chaotic neuro-dynamics.
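
    For context, the 2-opt move mentioned above exchanges the facility assignments of two locations. Below is a generic first-improvement 2-opt local search for the QAP; the chaotic neuro-dynamics and the paper's double-assignment move are not reproduced, and the names (flow, dist) are illustrative.

```python
import itertools

def qap_cost(perm, flow, dist):
    """QAP objective: sum of flow[i][j] * dist[perm[i]][perm[j]]."""
    n = len(perm)
    return sum(flow[i][j] * dist[perm[i]][perm[j]]
               for i in range(n) for j in range(n))

def two_opt(perm, flow, dist):
    """Repeatedly swap two assignments while any swap improves the cost."""
    perm = list(perm)
    best = qap_cost(perm, flow, dist)
    improved = True
    while improved:
        improved = False
        for i, j in itertools.combinations(range(len(perm)), 2):
            perm[i], perm[j] = perm[j], perm[i]
            cost = qap_cost(perm, flow, dist)
            if cost < best:
                best, improved = cost, True
            else:
                perm[i], perm[j] = perm[j], perm[i]  # undo the swap
    return perm, best
```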

  14. Decentralized Bayesian search using approximate dynamic programming methods.

    PubMed

    Zhao, Yijia; Patek, Stephen D; Beling, Peter A

    2008-08-01

    We consider decentralized Bayesian search problems that involve a team of multiple autonomous agents searching for targets on a network of search points operating under the following constraints: 1) interagent communication is limited; 2) the agents do not have the opportunity to agree in advance on how to resolve equivalent but incompatible strategies; and 3) each agent lacks the ability to control or predict with certainty the actions of the other agents. We formulate the multiagent search-path-planning problem as a decentralized optimal control problem and introduce approximate dynamic programming heuristics that can be implemented in a decentralized fashion. After establishing some analytical properties of the heuristics, we present computational results for a search problem involving two agents on a 5 x 5 grid.

  15. Twilight of the Slogans: A Heuristic Investigation of Linguistic Memes Using Mixed Methods

    ERIC Educational Resources Information Center

    Duffy, Curt Paul

    2013-01-01

    Slogans, or linguistic memes, are short, memorable phrases that are present in commercial, political, and everyday discourse. Slogans propagate similarly to other memes, or cultural units, through an evolutionary mechanism first proposed by Dawkins (1976). Heuristic inquiry, as presented by Moustakas (1990), provided a template from which to…

  16. Heuristics as a Basis for Assessing Creative Potential: Measures, Methods, and Contingencies

    ERIC Educational Resources Information Center

    Vessey, William B.; Mumford, Michael D.

    2012-01-01

    Studies of creative thinking skills have generally measured a single aspect of creativity, divergent thinking. A number of other processes involved in creative thought have been identified. Effective execution of these processes is held to depend on the strategies applied in process execution, or heuristics. In this article, we review prior…

  19. The min-conflicts heuristic: Experimental and theoretical results

    NASA Technical Reports Server (NTRS)

    Minton, Steven; Philips, Andrew B.; Johnston, Mark D.; Laird, Philip

    1991-01-01

    This paper describes a simple heuristic method for solving large-scale constraint satisfaction and scheduling problems. Given an initial assignment for the variables in a problem, the method operates by searching through the space of possible repairs. The search is guided by an ordering heuristic, the min-conflicts heuristic, that attempts to minimize the number of constraint violations after each step. We demonstrate empirically that the method performs orders of magnitude better than traditional backtracking techniques on certain standard problems. For example, the one million queens problem can be solved rapidly using our approach. We also describe practical scheduling applications where the method has been successfully applied. A theoretical analysis is presented to explain why the method works so well on certain types of problems and to predict when it is likely to be most effective.
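
    A minimal sketch of the repair loop described above, applied to n-queens (one queen per row; the variable for row r is its column). This is a generic illustration of the min-conflicts heuristic, not the authors' code; ties are broken by lowest column here, whereas random tie-breaking is also common.

```python
import random

def conflicts(cols, row, col):
    """Number of queens attacking a queen placed at (row, col)."""
    return sum(1 for r, c in enumerate(cols)
               if r != row and (c == col or abs(c - col) == abs(r - row)))

def min_conflicts(n, max_steps=100_000):
    cols = [random.randrange(n) for _ in range(n)]   # random initial assignment
    for _ in range(max_steps):
        conflicted = [r for r in range(n) if conflicts(cols, r, cols[r]) > 0]
        if not conflicted:
            return cols                              # all constraints satisfied
        r = random.choice(conflicted)
        # repair step: move the queen to the value with the fewest conflicts
        cols[r] = min(range(n), key=lambda c: conflicts(cols, r, c))
    return None                                      # give up after max_steps
```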

  20. Multi-Criteria Optimization of the Deployment of a Grid for Rural Electrification Based on a Heuristic Method

    NASA Astrophysics Data System (ADS)

    Ortiz-Matos, L.; Aguila-Tellez, A.; Hincapié-Reyes, R. C.; González-Sanchez, J. W.

    2017-07-01

    In order to design electrification systems, recent mathematical models solve the problem of the location and type of electrification components and the design of possible distribution microgrids. However, as the number of points to be electrified increases, solving these models requires high computational times, making them impractical. This study proposes a new heuristic method for the electrification of rural areas to address the problem. The heuristic algorithm plans the deployment of rural electrification microgrids by finding optimal routes for placing lines and transformers in transmission and distribution microgrids. The challenge is to obtain a deployment with equity in losses, considering the capacity constraints of the devices and the topology of the land, at minimal economic cost. An optimal scenario ensures the electrification of all neighbourhoods at a minimum investment cost in terms of the distance between electric conductors and the number of transformation devices.
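
    The record does not publish its algorithm, but its cost term rewards short conductor runs, and a minimum spanning tree is the classical distance-minimizing way to connect a set of points. The sketch below shows Prim's algorithm as one plausible building block; it is an assumption for illustration, not the paper's method.

```python
import heapq
import math

def prim_mst(points):
    """Minimum spanning tree over (x, y) points; returns conductor segments."""
    n = len(points)
    dist = lambda a, b: math.hypot(a[0] - b[0], a[1] - b[1])
    in_tree, edges, heap = {0}, [], []
    for j in range(1, n):
        heapq.heappush(heap, (dist(points[0], points[j]), 0, j))
    while len(in_tree) < n:
        d, i, j = heapq.heappop(heap)
        if j in in_tree:
            continue                       # stale entry, already connected
        in_tree.add(j)
        edges.append((i, j, d))
        for k in range(n):
            if k not in in_tree:
                heapq.heappush(heap, (dist(points[j], points[k]), j, k))
    return edges                           # list of (from, to, length)
```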

  1. A heuristic nonlinear constructive method for electric power distribution system reconfiguration

    NASA Astrophysics Data System (ADS)

    McDermott, Thomas E.

    1998-12-01

    The electric power distribution system usually operates in a radial configuration, with tie switches between circuits to provide alternate feeds. The losses would be minimized if all switches were closed, but this is not done because it complicates the system's protection against overcurrents. Whenever a component fails, some of the switches must be operated to restore power to as many customers as possible. As loads vary with time, switch operations may reduce losses in the system. Both of these are applications for reconfiguration. The problem is combinatorial, which precludes algorithms that guarantee a global optimum. Most existing reconfiguration algorithms fall into two categories. In the first, branch exchange, the system operates in a feasible radial configuration and the algorithm opens and closes candidate switches in pairs. In the second, loop cutting, the system is completely meshed and the algorithm opens candidate switches to reach a feasible radial configuration. Reconfiguration algorithms based on linearized transshipment, neural networks, heuristics, genetic algorithms, and simulated annealing have also been reported, but not widely used. These existing reconfiguration algorithms work with a simplified model of the power system, and they handle voltage and current constraints approximately, if at all. The algorithm described here is a constructive method, using a full nonlinear power system model that accurately handles constraints. The system starts with all switches open and all failed components isolated. An optional network power flow provides a lower bound on the losses. Then the algorithm closes one switch at a time to minimize the increase in a merit figure, which is the real loss divided by the apparent load served. The merit figure increases with each switch closing. This principle, called discrete ascent optimal programming (DAOP), has been applied to other power system problems, including economic dispatch and phase balancing. For
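
    The DAOP loop itself is simple to sketch. In the outline below, `merit` stands in for the full nonlinear power flow that computes real loss divided by apparent load served, and `is_radial_complete` for the feasibility check; both are assumed callbacks, not part of the original work.

```python
def daop_reconfigure(switches, merit, is_radial_complete):
    """Greedy constructive loop: start with all switches open, then repeatedly
    close the switch that yields the smallest increase in the merit figure."""
    closed = set()
    while not is_radial_complete(closed):
        candidates = [s for s in switches if s not in closed]
        if not candidates:
            break
        best = min(candidates, key=lambda s: merit(closed | {s}))
        closed.add(best)
    return closed
```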

  2. Meta-heuristic algorithms as tools for hydrological science

    NASA Astrophysics Data System (ADS)

    Yoo, Do Guen; Kim, Joong Hoon

    2014-12-01

    In this paper, meta-heuristic optimization techniques and their applications to water resources engineering, particularly hydrological science, are reviewed. In recent years, meta-heuristic optimization techniques have been introduced that can overcome the problems inherent in iterative simulations. These methods are able to find good solutions and require limited computation time and memory use without requiring complex derivatives. Simulation-based meta-heuristic methods such as genetic algorithms (GAs) and Harmony Search (HS) have powerful searching abilities, which can occasionally overcome several drawbacks of traditional mathematical methods. For example, the HS algorithm is conceptualized from a musical performance process: just as musicians seek a better harmony through aesthetic estimation, the algorithm seeks a near-global optimum as determined by the value of an objective function. In this paper, meta-heuristic algorithms and their applications (with a focus on GAs and HS) in hydrological science are discussed by subject, including a review of the existing literature in the field. Then, recent trends in optimization are presented, and a relatively new technique, the Smallest Small World Cellular Harmony Search (SSWCHS), is briefly introduced, with a summary of promising results obtained in previous studies. As a result, previous studies have demonstrated that meta-heuristic algorithms are effective tools for the development of hydrological models and the management of water resources.

  3. Search systems and computer-implemented search methods

    DOEpatents

    Payne, Deborah A.; Burtner, Edwin R.; Hampton, Shawn D.; Gillen, David S.; Henry, Michael J.

    2017-03-07

    Search systems and computer-implemented search methods are described. In one aspect, a search system includes a communications interface configured to access a plurality of data items of a collection, wherein the data items include a plurality of image objects individually comprising image data utilized to generate an image of the respective data item. The search system may include processing circuitry coupled with the communications interface and configured to process the image data of the data items of the collection to identify a plurality of image content facets which are indicative of image content contained within the images and to associate the image objects with the image content facets and a display coupled with the processing circuitry and configured to depict the image objects associated with the image content facets.

  4. Search systems and computer-implemented search methods

    DOEpatents

    Payne, Deborah A.; Burtner, Edwin R.; Bohn, Shawn J.; Hampton, Shawn D.; Gillen, David S.; Henry, Michael J.

    2015-12-22

    Search systems and computer-implemented search methods are described. In one aspect, a search system includes a communications interface configured to access a plurality of data items of a collection, wherein the data items include a plurality of image objects individually comprising image data utilized to generate an image of the respective data item. The search system may include processing circuitry coupled with the communications interface and configured to process the image data of the data items of the collection to identify a plurality of image content facets which are indicative of image content contained within the images and to associate the image objects with the image content facets and a display coupled with the processing circuitry and configured to depict the image objects associated with the image content facets.

  5. A hybrid color space for skin detection using genetic algorithm heuristic search and principal component analysis technique.

    PubMed

    Maktabdar Oghaz, Mahdi; Maarof, Mohd Aizaini; Zainal, Anazida; Rohani, Mohd Foad; Yaghoubyan, S Hadi

    2015-01-01

    Color is one of the most prominent features of an image and used in many skin and face detection applications. Color space transformation is widely used by researchers to improve face and skin detection performance. Despite the substantial research efforts in this area, choosing a proper color space in terms of skin and face classification performance which can address issues like illumination variations, various camera characteristics and diversity in skin color tones has remained an open issue. This research proposes a new three-dimensional hybrid color space termed SKN by employing the Genetic Algorithm heuristic and Principal Component Analysis to find the optimal representation of human skin color in over seventeen existing color spaces. Genetic Algorithm heuristic is used to find the optimal color component combination setup in terms of skin detection accuracy while the Principal Component Analysis projects the optimal Genetic Algorithm solution to a less complex dimension. Pixel wise skin detection was used to evaluate the performance of the proposed color space. We have employed four classifiers including Random Forest, Naïve Bayes, Support Vector Machine and Multilayer Perceptron in order to generate the human skin color predictive model. The proposed color space was compared to some existing color spaces and shows superior results in terms of pixel-wise skin detection accuracy. Experimental results show that by using Random Forest classifier, the proposed SKN color space obtained an average F-score and True Positive Rate of 0.953 and False Positive Rate of 0.0482 which outperformed the existing color spaces in terms of pixel wise skin detection accuracy. The results also indicate that among the classifiers used in this study, Random Forest is the most suitable classifier for pixel wise skin detection applications.

  6. A Hybrid Color Space for Skin Detection Using Genetic Algorithm Heuristic Search and Principal Component Analysis Technique

    PubMed Central

    2015-01-01

    Color is one of the most prominent features of an image and used in many skin and face detection applications. Color space transformation is widely used by researchers to improve face and skin detection performance. Despite the substantial research efforts in this area, choosing a proper color space in terms of skin and face classification performance which can address issues like illumination variations, various camera characteristics and diversity in skin color tones has remained an open issue. This research proposes a new three-dimensional hybrid color space termed SKN by employing the Genetic Algorithm heuristic and Principal Component Analysis to find the optimal representation of human skin color in over seventeen existing color spaces. Genetic Algorithm heuristic is used to find the optimal color component combination setup in terms of skin detection accuracy while the Principal Component Analysis projects the optimal Genetic Algorithm solution to a less complex dimension. Pixel wise skin detection was used to evaluate the performance of the proposed color space. We have employed four classifiers including Random Forest, Naïve Bayes, Support Vector Machine and Multilayer Perceptron in order to generate the human skin color predictive model. The proposed color space was compared to some existing color spaces and shows superior results in terms of pixel-wise skin detection accuracy. Experimental results show that by using Random Forest classifier, the proposed SKN color space obtained an average F-score and True Positive Rate of 0.953 and False Positive Rate of 0.0482 which outperformed the existing color spaces in terms of pixel wise skin detection accuracy. The results also indicate that among the classifiers used in this study, Random Forest is the most suitable classifier for pixel wise skin detection applications. PMID:26267377

  7. Pattern Search Methods for Linearly Constrained Minimization

    NASA Technical Reports Server (NTRS)

    Lewis, Robert Michael; Torczon, Virginia

    1998-01-01

    We extend pattern search methods to linearly constrained minimization. We develop a general class of feasible point pattern search algorithms and prove global convergence to a Karush-Kuhn-Tucker point. As in the case of unconstrained minimization, pattern search methods for linearly constrained problems accomplish this without explicit recourse to the gradient or the directional derivative. Key to the analysis of the algorithms is the way in which the local search patterns conform to the geometry of the boundary of the feasible region.
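
    For readers unfamiliar with the family, a minimal unconstrained compass (pattern) search is sketched below: poll coordinate offsets at the current step size, move on improvement, otherwise contract the step. The paper's contribution, conforming the pattern to the boundary of the linearly constrained feasible region, is not reproduced here.

```python
def compass_search(f, x, step=1.0, tol=1e-6):
    """Derivative-free minimization by polling +/- step along each coordinate."""
    x = list(x)
    fx = f(x)
    while step > tol:
        improved = False
        for i in range(len(x)):
            for s in (step, -step):
                y = x[:]
                y[i] += s
                fy = f(y)
                if fy < fx:
                    x, fx, improved = y, fy, True
                    break
            if improved:
                break
        if not improved:
            step *= 0.5        # no improving poll point: contract the pattern
    return x, fx
```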

  8. Analytical Methods in Search Theory

    DTIC Science & Technology

    1979-11-01


  9. A dynamic multiarmed bandit-gene expression programming hyper-heuristic for combinatorial optimization problems.

    PubMed

    Sabar, Nasser R; Ayob, Masri; Kendall, Graham; Qu, Rong

    2015-02-01

    Hyper-heuristics are search methodologies that aim to provide high-quality solutions across a wide variety of problem domains, rather than developing tailor-made methodologies for each problem instance/domain. A traditional hyper-heuristic framework has two levels, namely, the high-level strategy (heuristic selection mechanism and the acceptance criterion) and low-level heuristics (a set of problem-specific heuristics). Due to the different landscape structures of different problem instances, the high-level strategy plays an important role in the design of a hyper-heuristic framework. In this paper, we propose a new high-level strategy for a hyper-heuristic framework. The proposed high-level strategy utilizes a dynamic multiarmed bandit-extreme value-based reward as an online heuristic selection mechanism to select the appropriate heuristic to be applied at each iteration. In addition, we propose a gene expression programming framework to automatically generate the acceptance criterion for each problem instance, instead of using human-designed criteria. Two well-known, and very different, combinatorial optimization problems, one static (exam timetabling) and one dynamic (dynamic vehicle routing), are used to demonstrate the generality of the proposed framework. Compared with state-of-the-art hyper-heuristics and other bespoke methods, empirical results demonstrate that the proposed framework is able to generalize well across both domains. We obtain competitive, if not better, results when compared to the best known results obtained from other methods that have been presented in the scientific literature. We also compare our approach against the recently released hyper-heuristic competition test suite. We again demonstrate the generality of our approach when we compare against other methods that have utilized the same six benchmark datasets from this test suite.
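
    The heuristic-selection layer can be illustrated with a simpler bandit rule. The sketch below uses a UCB1 index as a stand-in for the paper's dynamic multiarmed bandit-extreme value reward, and greedy acceptance in place of the gene-expression-programming-generated criterion; `heuristics` (functions mapping a solution to a neighbor) and `cost` are assumed callbacks.

```python
import math

def hyper_heuristic(solution, heuristics, cost, iters=1000):
    """Select a low-level heuristic per iteration with a UCB1 bandit index."""
    n = len(heuristics)
    pulls, reward = [0] * n, [0.0] * n
    best, best_cost = solution, cost(solution)
    for t in range(1, iters + 1):
        # favor heuristics with high average reward plus an exploration bonus
        idx = max(range(n), key=lambda i:
                  float("inf") if pulls[i] == 0 else
                  reward[i] / pulls[i] + math.sqrt(2 * math.log(t) / pulls[i]))
        candidate = heuristics[idx](best)
        c = cost(candidate)
        pulls[idx] += 1
        reward[idx] += max(0.0, best_cost - c)   # improvement as reward
        if c < best_cost:                        # greedy acceptance, for brevity
            best, best_cost = candidate, c
    return best, best_cost
```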

  10. Long alternating codes: 2. Practical search method

    NASA Astrophysics Data System (ADS)

    Markkanen, Markku; Nygrén, Tuomo

    1997-01-01

    This paper is the second one in a series explaining a new search method of long alternating codes for incoherent scatter radars. The first paper explains the general idea of the method in terms of a special game of dominoes. This second paper gives an alternative mathematical formalism suitable for computer search. It consists of three rules and a mathematical analysis leading to a formula which can be used in practical search. Although the rules were originally experimental, a mathematical proof of their sufficiency is also given. The method has been used to make a complete search up to a length of 1,048,576 bits. Even longer codes have been found; the longest one known at the moment contains 4,194,304 bits. For demonstration, complete tables of 8-, 16-, 32-, and 64-bit codes and examples of 128- and 256-bit codes are presented.

  11. The TSTS Method in Cultural Heritage Search

    NASA Astrophysics Data System (ADS)

    Stawniak, Mirosław; Cellary, Wojciech

    In cultural heritage content management systems in which cultural objects are described with the use of their semantic, temporal and spatial properties, the search capabilities taking all those properties into consideration are very limited. The difficulty comes from the fact that concepts evolve over time and depend on location. In this paper the TSTS search method is presented, based on the TST similarity measure that allows assessing the similarity factor between different resources in a knowledge base. A ranked search result is generated based on the semantic distance between the fuzzy set created for the user query and fuzzy sets describing potential results in the time-space continuum.

  12. A System for Automatically Generating Scheduling Heuristics

    NASA Technical Reports Server (NTRS)

    Morris, Robert

    1996-01-01

    The goal of this research is to improve the performance of automated schedulers by designing and implementing an algorithm that automatically generates heuristics for selecting a schedule. The particular application selected for applying this method, called the Associate Principal Astronomer, solves the problem of scheduling telescope observations. The input to the APA scheduler is a set of observation requests submitted by one or more astronomers. Each observation request specifies an observation program as well as scheduling constraints and preferences associated with the program. The scheduler employs greedy heuristic search to synthesize a schedule that satisfies all hard constraints of the domain and achieves a good score with respect to soft constraints expressed as an objective function established by an astronomer-user.

  14. Automated detection of heuristics and biases among pathologists in a computer-based system.

    PubMed

    Crowley, Rebecca S; Legowski, Elizabeth; Medvedeva, Olga; Reitmeyer, Kayse; Tseytlin, Eugene; Castine, Melissa; Jukic, Drazen; Mello-Thoms, Claudia

    2013-08-01

    The purpose of this study is threefold: (1) to develop an automated, computer-based method to detect heuristics and biases as pathologists examine virtual slide cases, (2) to measure the frequency and distribution of heuristics and errors across three levels of training, and (3) to examine relationships of heuristics to biases, and biases to diagnostic errors. The authors conducted the study using a computer-based system to view and diagnose virtual slide cases. The software recorded participant responses throughout the diagnostic process, and automatically classified participant actions based on definitions of eight common heuristics and/or biases. The authors measured frequency of heuristic use and bias across three levels of training. Biases studied were detected at varying frequencies, with availability and search satisficing observed most frequently. There were few significant differences by level of training. For representativeness and anchoring, the heuristic was used appropriately as often or more often than it was used in biased judgment. Approximately half of the diagnostic errors were associated with one or more biases. We conclude that heuristic use and biases were observed among physicians at all levels of training using the virtual slide system, although their frequencies varied. The system can be employed to detect heuristic use and to test methods for decreasing diagnostic errors resulting from cognitive biases.

  15. Implied alignment: a synapomorphy-based multiple-sequence alignment method and its use in cladogram search

    NASA Technical Reports Server (NTRS)

    Wheeler, Ward C.

    2003-01-01

    A method to align sequence data based on parsimonious synapomorphy schemes generated by direct optimization (DO; earlier termed optimization alignment) is proposed. DO directly diagnoses sequence data on cladograms without an intervening multiple-alignment step, thereby creating topology-specific, dynamic homology statements. Hence, no multiple-alignment is required to generate cladograms. Unlike general and globally optimal multiple-alignment procedures, the method described here, implied alignment (IA), takes these dynamic homologies and traces them back through a single cladogram, linking the unaligned sequence positions in the terminal taxa via DO transformation series. These "lines of correspondence" link ancestor-descendent states and, when displayed as linearly arrayed columns without hypothetical ancestors, are largely indistinguishable from standard multiple alignment. Since this method is based on synapomorphy, the treatment of certain classes of insertion-deletion (indel) events may be different from that of other alignment procedures. As with all alignment methods, results are dependent on parameter assumptions such as indel cost and transversion:transition ratios. Such an IA could be used as a basis for phylogenetic search, but this would be questionable since the homologies derived from the implied alignment depend on its natal cladogram and any variance, between DO and IA + Search, due to heuristic approach. The utility of this procedure in heuristic cladogram searches using DO and the improvement of heuristic cladogram cost calculations are discussed.

  17. Implied alignment: a synapomorphy-based multiple-sequence alignment method and its use in cladogram search.

    PubMed

    Wheeler, Ward C

    2003-06-01

    A method to align sequence data based on parsimonious synapomorphy schemes generated by direct optimization (DO; earlier termed optimization alignment) is proposed. DO directly diagnoses sequence data on cladograms without an intervening multiple-alignment step, thereby creating topology-specific, dynamic homology statements. Hence, no multiple-alignment is required to generate cladograms. Unlike general and globally optimal multiple-alignment procedures, the method described here, implied alignment (IA), takes these dynamic homologies and traces them back through a single cladogram, linking the unaligned sequence positions in the terminal taxa via DO transformation series. These "lines of correspondence" link ancestor-descendent states and, when displayed as linearly arrayed columns without hypothetical ancestors, are largely indistinguishable from standard multiple alignment. Since this method is based on synapomorphy, the treatment of certain classes of insertion-deletion (indel) events may be different from that of other alignment procedures. As with all alignment methods, results are dependent on parameter assumptions such as indel cost and transversion:transition ratios. Such an IA could be used as a basis for phylogenetic search, but this would be questionable since the homologies derived from the implied alignment depend on its natal cladogram and any variance, between DO and IA + Search, due to heuristic approach. The utility of this procedure in heuristic cladogram searches using DO and the improvement of heuristic cladogram cost calculations are discussed.

  18. Multiobjective Tabu Search method used in chemistry

    NASA Astrophysics Data System (ADS)

    Rusu, T.; Bulacovschi, V.

    The use of a combined artificial intelligence method in macromolecular chemistry design is described. The method combines a back-propagation (BP) neural network, modified for two-dimensional input data, with a genetic algorithm extended by a Tabu Search operator used to incorporate high-level chemical knowledge: thermodynamic polymer properties.
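
    The record gives no implementation detail, so the following is only a generic tabu search loop of the kind such an operator builds on: accept the best non-tabu neighbor, even if worse, and keep recent moves on a fixed-length tabu list. `neighbors` (yielding (move, solution) pairs) and `cost` are assumed callbacks.

```python
from collections import deque

def tabu_search(start, neighbors, cost, tenure=10, iters=500):
    """Best-neighbor tabu search with a fixed-length recency list."""
    current, best = start, start
    tabu = deque(maxlen=tenure)           # forgets moves older than `tenure`
    for _ in range(iters):
        candidates = [(m, s) for m, s in neighbors(current) if m not in tabu]
        if not candidates:
            break
        move, current = min(candidates, key=lambda ms: cost(ms[1]))
        tabu.append(move)                 # forbid reversing the recent move
        if cost(current) < cost(best):
            best = current
    return best
```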

  19. Fixing Dataset Search

    NASA Technical Reports Server (NTRS)

    Lynnes, Chris

    2014-01-01

    Three current search engines are queried for ozone data at the GES DISC. The results range from sub-optimal to counter-intuitive. We propose a method to fix dataset search by implementing a robust relevancy ranking scheme. The relevancy ranking scheme is based on several heuristics culled from more than 20 years of helping users select datasets.
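
    A heuristic relevancy ranking of this kind reduces to a weighted scoring function. The sketch below is a hypothetical example of the idea, not the GES DISC rules; field names and weights are invented for illustration.

```python
def rank_datasets(query_terms, datasets):
    """Sort dataset records by a weighted sum of heuristic relevancy signals."""
    def score(ds):
        terms = set(t.lower() for t in query_terms)
        s = 0.0
        s += 3.0 * len(terms & set(ds["title"].lower().split()))    # title hits
        s += 1.0 * len(terms & set(ds["summary"].lower().split()))  # summary hits
        s += 2.0 if ds.get("is_primary_product") else 0.0           # boost primary data
        s -= 1.0 if ds.get("deprecated") else 0.0                   # demote old versions
        return s
    return sorted(datasets, key=score, reverse=True)
```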

  20. Numerical Solution to Generalized Burgers'-Fisher Equation Using Exp-Function Method Hybridized with Heuristic Computation

    PubMed Central

    Malik, Suheel Abdullah; Qureshi, Ijaz Mansoor; Amir, Muhammad; Malik, Aqdas Naveed; Haq, Ihsanul

    2015-01-01

    In this paper, a new heuristic scheme for the approximate solution of the generalized Burgers'-Fisher equation is proposed. The scheme is based on the hybridization of the Exp-function method with a nature-inspired algorithm. The given nonlinear partial differential equation (NPDE) is converted through substitution into a nonlinear ordinary differential equation (NODE). The travelling wave solution is approximated by the Exp-function method with unknown parameters. The unknown parameters are estimated by transforming the NODE into an equivalent global error minimization problem by using a fitness function. The popular genetic algorithm (GA) is used to solve the minimization problem, and to achieve the unknown parameters. The proposed scheme is successfully implemented to solve the generalized Burgers'-Fisher equation. The comparison of numerical results with the exact solutions, and the solutions obtained using some traditional methods, including the Adomian decomposition method (ADM), homotopy perturbation method (HPM), and optimal homotopy asymptotic method (OHAM), show that the suggested scheme is fairly accurate and viable for solving such problems. PMID:25811858

  1. Numerical solution to generalized Burgers'-Fisher equation using Exp-function method hybridized with heuristic computation.

    PubMed

    Malik, Suheel Abdullah; Qureshi, Ijaz Mansoor; Amir, Muhammad; Malik, Aqdas Naveed; Haq, Ihsanul

    2015-01-01

    In this paper, a new heuristic scheme for the approximate solution of the generalized Burgers'-Fisher equation is proposed. The scheme is based on the hybridization of the Exp-function method with a nature-inspired algorithm. The given nonlinear partial differential equation (NPDE) is converted through substitution into a nonlinear ordinary differential equation (NODE). The travelling wave solution is approximated by the Exp-function method with unknown parameters. The unknown parameters are estimated by transforming the NODE into an equivalent global error minimization problem by using a fitness function. The popular genetic algorithm (GA) is used to solve the minimization problem, and to achieve the unknown parameters. The proposed scheme is successfully implemented to solve the generalized Burgers'-Fisher equation. The comparison of numerical results with the exact solutions, and the solutions obtained using some traditional methods, including the Adomian decomposition method (ADM), homotopy perturbation method (HPM), and optimal homotopy asymptotic method (OHAM), show that the suggested scheme is fairly accurate and viable for solving such problems.

  2. Effectiveness of Rural Job Search Methods

    ERIC Educational Resources Information Center

    Rungeling, Brian; And Others

    1976-01-01

    In an excerpt of a paper, data examining the relative effectiveness of various job search techniques (direct application, friends and relatives, state agencies, and other methods) are presented; the data were obtained from a 1974 survey of 3,357 heads of households in four southern rural counties. (Author/BP)

  3. Scalable metagenomics alignment research tool (SMART): a scalable, rapid, and complete search heuristic for the classification of metagenomic sequences from complex sequence populations.

    PubMed

    Lee, Aaron Y; Lee, Cecilia S; Van Gelder, Russell N

    2016-07-28

    Next generation sequencing technology has enabled characterization of metagenomics through massively parallel genomic DNA sequencing. The complexity and diversity of environmental samples such as the human gut microflora, combined with the sustained exponential growth in sequencing capacity, has led to the challenge of identifying microbial organisms by DNA sequence. We sought to validate a Scalable Metagenomics Alignment Research Tool (SMART), a novel searching heuristic for shotgun metagenomics sequencing results. After retrieving all genomic DNA sequences from the NCBI GenBank, over 1 × 10^11 base pairs of 3.3 × 10^6 sequences from 9.25 × 10^5 species were indexed using 4 base pair hashtable shards. A MapReduce searching strategy was used to distribute the search workload in a computing cluster environment. In addition, a one base pair permutation algorithm was used to account for single nucleotide polymorphisms and sequencing errors. Simulated datasets used to evaluate Kraken, a similar metagenomics classification tool, were used to measure and compare precision and accuracy. Finally, using the same set of training sequences, we compared Kraken, CLARK, and SMART within the same computing environment. Utilizing 12 computational nodes, we completed the classification of all datasets in under 10 min each using exact matching with an average throughput of over 1.95 × 10^6 reads classified per minute. With permutation matching, we achieved sensitivity greater than 83 % and precision greater than 94 % with simulated datasets at the species classification level. We demonstrated the application of this technique applied to conjunctival and gut microbiome metagenomics sequencing results. In our head to head comparison, SMART and CLARK had similar accuracy gains over Kraken at the species classification level, but SMART required approximately half the amount of RAM of CLARK. SMART is the first scalable, efficient, and rapid metagenomics classification algorithm
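
    The core indexing idea, hashing every k-mer of the references and classifying a read by voting, can be shown compactly. The sketch below omits the sharding, MapReduce distribution, and one-base-pair permutation matching described above; k = 21 and the data layout are assumptions for the example.

```python
from collections import Counter, defaultdict

K = 21  # k-mer length (illustrative)

def build_index(references):
    """Map every k-mer of each reference genome to the species containing it.
    references: {species_name: dna_string}"""
    index = defaultdict(set)
    for species, seq in references.items():
        for i in range(len(seq) - K + 1):
            index[seq[i:i + K]].add(species)
    return index

def classify(read, index):
    """Classify a read by majority vote over the species of its k-mers."""
    votes = Counter()
    for i in range(len(read) - K + 1):
        for species in index.get(read[i:i + K], ()):
            votes[species] += 1
    return votes.most_common(1)[0][0] if votes else None
```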

  4. Genetic algorithms as global random search methods

    NASA Technical Reports Server (NTRS)

    Peck, Charles C.; Dhawan, Atam P.

    1995-01-01

    Genetic algorithm behavior is described in terms of the construction and evolution of the sampling distributions over the space of candidate solutions. This novel perspective is motivated by analysis indicating that the schema theory is inadequate for completely and properly explaining genetic algorithm behavior. Based on the proposed theory, it is argued that the similarities of candidate solutions should be exploited directly, rather than encoding candidate solutions and then exploiting their similarities. Proportional selection is characterized as a global search operator, and recombination is characterized as the search process that exploits similarities. Sequential algorithms and many deletion methods are also analyzed. It is shown that by properly constraining the search breadth of recombination operators, convergence of genetic algorithms to a global optimum can be ensured.
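
    The operators characterized above, proportional selection and recombination, fit in a few lines. The sketch below is a generic generational GA over bit strings with the OneMax fitness as a placeholder; it illustrates the operators, not the paper's analysis.

```python
import random

def genetic_algorithm(fitness, length=32, pop_size=50, gens=100, p_mut=0.01):
    """Generational GA: roulette-wheel selection, one-point crossover, mutation."""
    pop = [[random.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(gens):
        fits = [fitness(ind) + 1e-9 for ind in pop]   # epsilon keeps weights valid
        # proportional (roulette-wheel) selection: global sampling of the space
        parents = random.choices(pop, weights=fits, k=pop_size)
        nxt = []
        for a, b in zip(parents[::2], parents[1::2]):
            cut = random.randrange(1, length)         # one-point crossover:
            for child in (a[:cut] + b[cut:], b[:cut] + a[cut:]):
                # bit-flip mutation with probability p_mut per position
                nxt.append([bit ^ (random.random() < p_mut) for bit in child])
        pop = nxt
    return max(pop, key=fitness)

best = genetic_algorithm(fitness=sum)   # OneMax: fitness is the count of 1 bits
```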

  6. A Simple but Powerful Heuristic Method for Accelerating k-Means Clustering of Large-Scale Data in Life Science.

    PubMed

    Ichikawa, Kazuki; Morishita, Shinichi

    2014-01-01

    K-means clustering has been widely used to gain insight into biological systems from large-scale life science data. To quantify the similarities among biological data sets, Pearson correlation distance and standardized Euclidean distance are used most frequently; however, optimization methods have been largely unexplored. These two distance measurements are equivalent in the sense that they yield the same k-means clustering result for identical sets of k initial centroids. Thus, an efficient algorithm used for one is applicable to the other. Several optimization methods are available for the Euclidean distance and can be used for processing the standardized Euclidean distance; however, they are not customized for this context. We instead approached the problem by studying the properties of the Pearson correlation distance, and we invented a simple but powerful heuristic method for markedly pruning unnecessary computation while retaining the final solution. Tests using real biological data sets with 50-60K vectors of dimensions 10-2001 (~400 MB in size) demonstrated marked reduction in computation time for k = 10-500 in comparison with other state-of-the-art pruning methods such as Elkan's and Hamerly's algorithms. The BoostKCP software is available at http://mlab.cb.k.u-tokyo.ac.jp/~ichikawa/boostKCP/.
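
    The claimed equivalence is easy to verify numerically: for vectors standardized to zero mean and unit variance, the squared Euclidean distance equals 2n times the Pearson correlation distance (n the dimension), so the two measures induce the same nearest centroid. A minimal check follows; it is an illustration of the relation, not the authors' BoostKCP code.

```python
import numpy as np

rng = np.random.default_rng(0)
x, y = rng.normal(size=100), rng.normal(size=100)

def zscore(v):
    return (v - v.mean()) / v.std()

pearson_dist = 1.0 - np.corrcoef(x, y)[0, 1]
euclid_sq = np.sum((zscore(x) - zscore(y)) ** 2)

# ||x_hat - y_hat||^2 = 2n(1 - r) for z-scored vectors of dimension n
assert np.isclose(euclid_sq, 2 * len(x) * pearson_dist)
```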

  7. Impacts of Learning Inventive Problem-Solving Principles: Students' Transition from Systematic Searching to Heuristic Problem Solving

    ERIC Educational Resources Information Center

    Barak, Moshe

    2013-01-01

    This paper presents the outcomes of teaching an inventive problem-solving course in junior high schools in an attempt to deal with the current relative neglect of fostering students' creativity and problem-solving capabilities in traditional schooling. The method involves carrying out systematic manipulation with attributes, functions and…

  9. Heuristically Driven Search Methods for Topology Control in Directional Wireless Hybrid Networks

    DTIC Science & Technology

    2007-03-01


  10. On utilizing search methods to select subspace dimensions for kernel-based nonlinear subspace classifiers.

    PubMed

    Kim, Sang-Woon; Oommen, B John

    2005-01-01

    In Kernel-based Nonlinear Subspace (KNS) methods, the subspace dimensions have a strong influence on the performance of the subspace classifier. In order to get a high classification accuracy, a large dimension is generally required. However, if the chosen subspace dimension is too large, it leads to a low performance due to the overlapping of the resultant subspaces and, if it is too small, it increases the classification error due to the poor resulting approximation. The most common approach is of an ad hoc nature, which selects the dimensions based on the so-called cumulative proportion computed from the kernel matrix for each class. In this paper, we propose a new method of systematically and efficiently selecting optimal or near-optimal subspace dimensions for KNS classifiers using a search strategy and a heuristic function termed the Overlapping criterion. The rationale for this function has been motivated in the body of the paper. The task of selecting optimal subspace dimensions is reduced to finding the best ones from a given problem-domain solution space using this criterion as a heuristic function. Thus, the search space can be pruned to very efficiently find the best solution. Our experimental results demonstrate that the proposed mechanism selects the dimensions efficiently without sacrificing the classification accuracy.

  11. Harmony search method: theory and applications.

    PubMed

    Gao, X Z; Govindasamy, V; Xu, H; Wang, X; Zenger, K

    2015-01-01

    The Harmony Search (HS) method is an emerging metaheuristic optimization algorithm, which has been employed to cope with numerous challenging tasks during the past decade. In this paper, the essential theory and applications of the HS algorithm are first described and reviewed. Several typical variants of the original HS are next briefly explained. As an example of case study, a modified HS method inspired by the idea of Pareto-dominance-based ranking is also presented. It is further applied to handle a practical wind generator optimal design problem.
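
    A minimal continuous HS loop is sketched below to make the memory-consideration, pitch-adjustment, and random-improvisation steps concrete. Parameter values (HMS, HMCR, PAR, bandwidth) are illustrative defaults, not those of the reviewed variants.

```python
import random

def harmony_search(f, bounds, hms=20, hmcr=0.9, par=0.3, bw=0.05, iters=5000):
    """Minimize f over box `bounds` with a basic Harmony Search."""
    memory = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(hms)]
    for _ in range(iters):
        new = []
        for d, (lo, hi) in enumerate(bounds):
            if random.random() < hmcr:
                v = random.choice(memory)[d]                 # memory consideration
                if random.random() < par:
                    v += random.uniform(-bw, bw) * (hi - lo)  # pitch adjustment
            else:
                v = random.uniform(lo, hi)                   # random improvisation
            new.append(min(hi, max(lo, v)))
        worst = max(range(hms), key=lambda i: f(memory[i]))
        if f(new) < f(memory[worst]):
            memory[worst] = new                              # replace worst harmony
    return min(memory, key=f)

best = harmony_search(lambda x: sum(v * v for v in x), [(-5, 5)] * 3)
```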

  12. Harmony Search Method: Theory and Applications

    PubMed Central

    Gao, X. Z.; Govindasamy, V.; Xu, H.; Wang, X.; Zenger, K.

    2015-01-01

    The Harmony Search (HS) method is an emerging metaheuristic optimization algorithm, which has been employed to cope with numerous challenging tasks during the past decade. In this paper, the essential theory and applications of the HS algorithm are first described and reviewed. Several typical variants of the original HS are next briefly explained. As an example of case study, a modified HS method inspired by the idea of Pareto-dominance-based ranking is also presented. It is further applied to handle a practical wind generator optimal design problem. PMID:25945083

  13. A Heuristic Automatic and Robust ROI Detection Method for Medical Image Watermarking.

    PubMed

    Mousavi, Seyed Mojtaba; Naghsh, Alireza; Abu-Bakar, S A R

    2015-08-01

    This paper presents an automatic region of interest (ROI) segmentation method for application of watermarking in medical images. The advantage of using this scheme is that the proposed method is robust against different attacks such as median, Wiener, Gaussian, and sharpening filters. In other words, this technique can produce the same result for the ROI before and after these attacks. The proposed algorithm consists of three main parts; suggesting an automatic ROI detection system, evaluating the robustness of the proposed system against numerous attacks, and finally recommending an enhancement part to increase the strength of the composed system against different attacks. Results obtained from the proposed method demonstrated the promising performance of the method.

  14. A Meta-heuristic Approach for Variants of VRP in Terms of Generalized Saving Method

    NASA Astrophysics Data System (ADS)

    Shimizu, Yoshiaki

    Global logistics design is of keen interest for providing essential infrastructure in modern society. Examples include green and/or robust logistics in transportation systems, smart grids in electricity utilization systems, and qualified service in delivery systems. As a key technology for such deployments, we engaged in a practical vehicle routing problem on the basis of the conventional saving method. This paper extends that idea and gives a general framework available for various real-world applications. It covers not only delivery problems but also two kinds of pick-up problems, i.e., straight and drop-by routings. Moreover, the multi-depot problem is considered by a hybrid approach with a graph algorithm, and its solution method is realized in a hierarchical manner. Numerical experiments were carried out to validate the effectiveness of the proposed method.
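
    The conventional saving step that this framework generalizes is the Clarke-Wright rule: merging the routes of customers i and j saves s(i, j) = d(0, i) + d(0, j) - d(i, j), where 0 is the depot. A sketch of the classical sequential merge follows; it shows the baseline, not the paper's generalization, and merges only one route orientation for brevity.

```python
def clarke_wright(n_customers, d, demand, capacity):
    """Classical savings heuristic; d is a symmetric distance matrix with
    depot index 0, demand is indexed by customer (1..n_customers)."""
    routes = {i: [i] for i in range(1, n_customers + 1)}   # one route per customer
    head = {i: i for i in routes}                          # route id containing node i
    savings = sorted(
        ((d[0][i] + d[0][j] - d[i][j], i, j)
         for i in range(1, n_customers + 1)
         for j in range(i + 1, n_customers + 1)),
        reverse=True)
    for s, i, j in savings:
        ri, rj = head[i], head[j]
        if ri == rj:
            continue
        # merge only at route ends, respecting vehicle capacity
        if routes[ri][-1] == i and routes[rj][0] == j and \
           sum(demand[k] for k in routes[ri] + routes[rj]) <= capacity:
            routes[ri].extend(routes[rj])
            for k in routes[rj]:
                head[k] = ri
            del routes[rj]
    return list(routes.values())
```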

  15. Paradigms or Toolkits? Philosophical and Methodological Positions as Heuristics for Mixed Methods Research

    ERIC Educational Resources Information Center

    Maxwell, Joseph A.

    2011-01-01

    In this article, the author challenges the validity and usefulness of the concept of "paradigm," as this term has been used in the social sciences generally, and specifically in the debates over research methods. He emphasizes that in criticizing what he sees as the misuse of the paradigm concept, he is not arguing for dismissing or ignoring…

  16. Formal and heuristic system decomposition methods in multidisciplinary synthesis. Ph.D. Thesis, 1991

    NASA Technical Reports Server (NTRS)

    Bloebaum, Christina L.

    1991-01-01

    The multidisciplinary interactions which exist in large scale engineering design problems provide a unique set of difficulties. These difficulties are associated primarily with unwieldy numbers of design variables and constraints, and with the interdependencies of the discipline analysis modules. Such obstacles require design techniques which account for the inherent disciplinary couplings in the analyses and optimizations. The objective of this work was to develop an efficient holistic design synthesis methodology that takes advantage of the synergistic nature of integrated design. A general decomposition approach for optimization of large engineering systems is presented. The method is particularly applicable for multidisciplinary design problems which are characterized by closely coupled interactions among discipline analyses. The advantage of subsystem modularity allows for implementation of specialized methods for analysis and optimization, computational efficiency, and the ability to incorporate human intervention and decision making in the form of an expert systems capability. The resulting approach is not a method applicable to only a specific situation, but rather, a methodology which can be used for a large class of engineering design problems in which the system is non-hierarchic in nature.

  17. Pan evaporation modeling using six different heuristic computing methods in different climates of China

    NASA Astrophysics Data System (ADS)

    Wang, Lunche; Kisi, Ozgur; Zounemat-Kermani, Mohammad; Li, Hui

    2017-01-01

    Pan evaporation (Ep) plays important roles in agricultural water resources management. One of the basic challenges is modeling Ep using limited climatic parameters because there are a number of factors affecting the evaporation rate. This study investigated the abilities of six different soft computing methods, multi-layer perceptron (MLP), generalized regression neural network (GRNN), fuzzy genetic (FG), least square support vector machine (LSSVM), multivariate adaptive regression spline (MARS), and adaptive neuro-fuzzy inference systems with grid partition (ANFIS-GP), and two regression methods, multiple linear regression (MLR) and the Stephens and Stewart model (SS), in predicting monthly Ep. Long-term climatic data at various sites crossing a wide range of climates during 1961-2000 are used for model development and validation. The results showed that the models have different accuracies in different climates and that the MLP model performed better than the other models in predicting monthly Ep at most stations using local input combinations (for example, the MAE (mean absolute error), RMSE (root mean square error), and determination coefficient (R2) are 0.314 mm/day, 0.405 mm/day, and 0.988, respectively, for the HEB station), while the GRNN model performed better on the Tibetan Plateau (MAE, RMSE, and R2 are 0.459 mm/day, 0.592 mm/day, and 0.932, respectively). The accuracies of the above models ranked as: MLP, GRNN, LSSVM, FG, ANFIS-GP, MARS, and MLR. The overall results indicated that the soft computing techniques generally performed better than the regression methods, but the MLR and SS models may be preferred in some climatic zones over complex nonlinear models, for example, at the BJ (Beijing), CQ (Chongqing), and HK (Haikou) stations. Therefore, it can be concluded that Ep could be successfully predicted using the above models in hydrological modeling studies.

  18. A Heuristic Method of Optimal Generalized Hypercube Encoding for Pictorial Databases.

    DTIC Science & Technology

    1980-01-15


  19. A novel heuristic algorithm for capacitated vehicle routing problem

    NASA Astrophysics Data System (ADS)

    Kır, Sena; Yazgan, Harun Reşit; Tüncel, Emre

    2017-02-01

    The vehicle routing problem with capacity constraints is considered in this paper. It is quite difficult to achieve an optimal solution with traditional optimization methods because of the high computational complexity of large-scale problems. Consequently, new heuristic and metaheuristic approaches have been developed to solve this problem. In this paper, we constructed a new heuristic algorithm based on tabu search and adaptive large neighborhood search (ALNS) with several specifically designed operators and features to solve the capacitated vehicle routing problem (CVRP). The effectiveness of the proposed algorithm is illustrated on benchmark problems. The algorithm provides a better performance on large-scale instances and gains an advantage in terms of CPU time. In addition, we solved a real-life CVRP using the proposed algorithm and found encouraging results in comparison with the company's current situation.

  20. Debris flow susceptibility mapping using a qualitative heuristic method and Flow-R along the Yukon Alaska Highway Corridor, Canada

    NASA Astrophysics Data System (ADS)

    Blais-Stevens, A.; Behnia, P.

    2016-02-01

    This research activity aimed at reducing risk to infrastructure, such as a proposed pipeline route roughly parallel to the Yukon Alaska Highway Corridor (YAHC), by filling geoscience knowledge gaps in geohazards. Hence, the Geological Survey of Canada compiled an inventory of landslides including debris flow deposits, which were subsequently used to validate two different debris flow susceptibility models. A qualitative heuristic debris flow susceptibility model was produced for the northern region of the YAHC, from Kluane Lake to the Alaska border, by integrating data layers with assigned weights and class ratings. These were slope angle, slope aspect, surficial geology, plan curvature, and proximity to drainage system. Validation of the model was carried out by calculating a success rate curve which revealed a good correlation with the susceptibility model and the debris flow deposit inventory compiled from air photos, high-resolution satellite imagery, and field verification. In addition, the quantitative Flow-R method was tested in order to define the potential source and debris flow susceptibility for the southern region of Kluane Lake, an area where documented debris flow events have blocked the highway in the past (e.g. 1988). Trial and error calculations were required for this method because there was not detailed information on the debris flows for the YAHC to allow us to define threshold values for some parameters when calculating source areas, spreading, and runout distance. Nevertheless, correlation with known documented events helped define these parameters and produce a map that captures most of the known events and displays debris flow susceptibility in other, usually smaller, steep channels that had not been previously documented.

  1. Debris flow susceptibility mapping using a qualitative heuristic method and Flow-R along the Yukon Alaska Highway Corridor, Canada

    NASA Astrophysics Data System (ADS)

    Blais-Stevens, A.; Behnia, P.

    2015-05-01

    This research activity aimed at reducing risk to infrastructure, such as a proposed pipeline route roughly parallel to the Yukon Alaska Highway Corridor (YAHC) by filling geoscience knowledge gaps in geohazards. Hence, the Geological Survey of Canada compiled an inventory of landslides including debris flow deposits, which were subsequently used to validate two different debris flow susceptibility models. A qualitative heuristic debris flow susceptibility model was produced for the northern region of the YAHC, from Kluane Lake to the Alaska border, by integrating data layers with assigned weights and class ratings. These were slope angle, slope aspect (derived from a 5 m × 5 m DEM), surficial geology, permafrost distribution, and proximity to drainage system. Validation of the model was carried out by calculating a success rate curve which revealed a good correlation with the susceptibility model and the debris flow deposit inventory compiled from air photos, high resolution satellite imagery, and field verification. In addition, the quantitative Flow-R method was tested in order to define the potential source and debris flow susceptibility for the southern region of Kluane Lake, an area where documented debris flow events have blocked the highway in the past (e.g., 1988). Trial and error calculations were required for this method because there was not detailed information on the debris flows for the YAHC to allow us to define threshold values for some parameters when calculating source areas, spreading, and runout distance. Nevertheless, correlation with known documented events helped define these parameters and produce a map that captures most of the known events and displays debris flow susceptibility in other, usually smaller, steep channels that had not been previously documented.
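
    Both records describe a qualitative heuristic model built by integrating rated data layers with assigned weights and class ratings. That overlay reduces to a weighted sum of rasters, sketched below with invented layer names, ratings, and weights (the YAHC values are not published in the abstracts).

```python
import numpy as np

rng = np.random.default_rng(0)

layers = {  # each entry: (rating raster classed 1-5, layer weight)
    "slope_angle":   (rng.integers(1, 6, (100, 100)), 0.35),
    "surficial_geo": (rng.integers(1, 6, (100, 100)), 0.30),
    "plan_curv":     (rng.integers(1, 6, (100, 100)), 0.15),
    "drainage_prox": (rng.integers(1, 6, (100, 100)), 0.20),
}

# susceptibility is the weighted sum of the rated layers (weights sum to 1)
susceptibility = sum(w * r for r, w in layers.values())

# bin the continuous score into low / medium / high / very high classes
classes = np.digitize(susceptibility, bins=[2.0, 3.0, 4.0])
```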

  2. Automatic Generation of Heuristics for Scheduling

    NASA Technical Reports Server (NTRS)

    Morris, Robert A.; Bresina, John L.; Rodgers, Stuart M.

    1997-01-01

    This paper presents a technique, called GenH, that automatically generates search heuristics for scheduling problems. The impetus for developing this technique is the growing consensus that heuristics encode advice that is, at best, useful in solving most, or typical, problem instances, and, at worst, useful in solving only a narrowly defined set of instances. In either case, heuristic problem solvers, to be broadly applicable, should have a means of automatically adjusting to the idiosyncrasies of each problem instance. GenH generates a search heuristic for a given problem instance by hill-climbing in the space of possible multi-attribute heuristics, where the evaluation of a candidate heuristic is based on the quality of the solution found under its guidance. We present empirical results obtained by applying GenH to the real-world problem of telescope observation scheduling. These results demonstrate that GenH is a simple and effective way of improving the performance of a heuristic scheduler.
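
    A compact sketch of the hill-climbing loop described above, under strong assumptions: candidate heuristics are weight vectors over job attributes, and each candidate is scored by the quality of the schedule built under its guidance. The toy jobs, dispatch rule, and tardiness objective below are invented for illustration and are not GenH's actual attribute set.

      import random

      # Toy instance: jobs given as (duration, priority, deadline); an assumption.
      jobs = [(3, 5, 10), (2, 1, 4), (4, 8, 12), (1, 3, 5)]

      def schedule_quality(weights):
          # Build a schedule with the weighted dispatch rule and return its
          # quality (negative total weighted tardiness on a single machine).
          order = sorted(jobs, key=lambda j: -(weights[0] * j[1]
                                               - weights[1] * j[0]
                                               - weights[2] * j[2]))
          t, penalty = 0, 0.0
          for dur, pri, deadline in order:
              t += dur
              penalty += pri * max(0, t - deadline)
          return -penalty

      def hill_climb(n_attrs=3, steps=200, step_size=0.1):
          w = [random.random() for _ in range(n_attrs)]
          best = schedule_quality(w)
          for _ in range(steps):
              cand = [max(0.0, wi + random.uniform(-step_size, step_size)) for wi in w]
              q = schedule_quality(cand)
              if q > best:          # keep the neighbour only if its schedule improves
                  w, best = cand, q
          return w, best

      print(hill_climb())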

  3. Heuristic method of fabricating counter electrodes in dye-sensitized solar cells based on a PEDOT:PSS layer as a catalytic material

    NASA Astrophysics Data System (ADS)

    Edalati, Sh; Houshangi far, A.; Torabi, N.; Baneshi, Z.; Behjat, A.

    2017-02-01

    Poly(3,4-ethylenedioxythiophene):poly(styrene sulfonate) (PEDOT:PSS) was deposited on a fluorine-doped tin oxide glass substrate using a heuristic method to fabricate platinum-free counter electrodes for dye-sensitized solar cells (DSSCs). In this heuristic method a thin layer of PEDOT:PSS is obtained by spin coating the PEDOT:PSS on a Cu substrate and then removing the substrate with FeCl3. The characteristics of the deposited PEDOT:PSS were studied by energy dispersive x-ray analysis and scanning electron microscopy, which revealed the micro-electronic specifications of the cathode. The aforementioned DSSCs exhibited a solar conversion efficiency of 3.90%, which is far higher than that of DSSCs with pure PEDOT:PSS (1.89%). This enhancement is attributed not only to the micro-electronic specifications but also to the HNO3 treatment in our heuristic method. The results of cyclic voltammetry, electrochemical impedance spectroscopy (EIS) and Tafel polarization plots show that the modified cathode has a dual function, combining excellent conductivity with electrocatalytic activity for iodine reduction.

  4. Heuristic Traversal Of A Free Space Graph

    NASA Astrophysics Data System (ADS)

    Holmes, Peter D.; Jungert, Erland

    1989-01-01

    In order to plan paths within a physical working space, effective data structures must be used for spatial representation. A free space graph is a data structure derived from a systematic decomposition of the unobstructed portions of the working space. For the two-dimensional case, this work describes a heuristic method for traversal and search of one particular type of free space graph. The focus here is the "dialogue" between an A* search process and an inference engine whose rules employ spatial operators to classify local topologies within the free space graph. This knowledge-based technique is used to generate plans which describe admissible sequences of movement between selected start and goal configurations.
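
    The A* component can be sketched generically; the grid world and Manhattan heuristic below stand in for the free space graph and the rule-based classification of local topologies, which are beyond a short example.

      import heapq

      def a_star(start, goal, neighbors, heuristic):
          # Generic A*: neighbors(n) yields (next_node, edge_cost) pairs.
          open_heap = [(heuristic(start, goal), 0.0, start, None)]
          came_from, g_cost = {}, {start: 0.0}
          while open_heap:
              _, g, node, parent = heapq.heappop(open_heap)
              if node in came_from:
                  continue                      # already expanded with a better cost
              came_from[node] = parent
              if node == goal:                  # reconstruct the admissible path
                  path = [node]
                  while came_from[path[-1]] is not None:
                      path.append(came_from[path[-1]])
                  return path[::-1]
              for nxt, cost in neighbors(node):
                  ng = g + cost
                  if ng < g_cost.get(nxt, float("inf")):
                      g_cost[nxt] = ng
                      heapq.heappush(open_heap, (ng + heuristic(nxt, goal), ng, nxt, node))
          return None

      # 4-connected grid as a stand-in for the free space graph (an assumption).
      blocked = {(1, 1), (1, 2)}
      def nbrs(p):
          x, y = p
          for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
              q = (x + dx, y + dy)
              if 0 <= q[0] < 4 and 0 <= q[1] < 4 and q not in blocked:
                  yield q, 1.0

      manhattan = lambda a, b: abs(a[0] - b[0]) + abs(a[1] - b[1])
      print(a_star((0, 0), (3, 3), nbrs, manhattan))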

  5. Prediction-based dynamic load-sharing heuristics

    NASA Technical Reports Server (NTRS)

    Goswami, Kumar K.; Devarakonda, Murthy; Iyer, Ravishankar K.

    1993-01-01

    The authors present dynamic load-sharing heuristics that use predicted resource requirements of processes to manage workloads in a distributed system. A previously developed statistical pattern-recognition method is employed for resource prediction. While nonprediction-based heuristics depend on a rapidly changing system status, the new heuristics depend on slowly changing program resource usage patterns. Furthermore, prediction-based heuristics can be more effective since they use future requirements rather than just the current system state. Four prediction-based heuristics, two centralized and two distributed, are presented. Using trace-driven simulations, they are compared against random scheduling and two effective nonprediction-based heuristics. Results show that the prediction-based centralized heuristics achieve up to 30 percent better response times than the nonprediction-based centralized heuristic, and that the prediction-based distributed heuristics achieve up to 50 percent improvements relative to their nonprediction-based counterpart.

  6. Modern meta-heuristics based on nonlinear physics processes: A review of models and design procedures

    NASA Astrophysics Data System (ADS)

    Salcedo-Sanz, S.

    2016-10-01

    Meta-heuristic algorithms are problem-solving methods which try to find good-enough solutions to very hard optimization problems, at a reasonable computation time, where classical approaches fail or cannot even be applied. Many existing meta-heuristic approaches are nature-inspired techniques, which work by simulating or modeling different natural processes in a computer. Historically, many of the most successful meta-heuristic approaches have had a biological inspiration, such as evolutionary computation or swarm intelligence paradigms, but in the last few years new approaches based on the modeling of nonlinear physics processes have been proposed and applied with success. Nonlinear physics processes, modeled as optimization algorithms, are able to produce completely new search procedures, with extremely effective exploration capabilities in many cases, which are able to outperform existing optimization approaches. In this paper we review the most important optimization algorithms based on nonlinear physics, how they have been constructed from the specific modeling of a real phenomenon, and their novelty in comparison with alternative existing algorithms for optimization. We first review important concepts on optimization problems, search spaces and problem difficulty. Then, the usefulness of heuristic and meta-heuristic approaches for facing hard optimization problems is introduced, and some of the main existing classical versions of these algorithms are reviewed. The mathematical framework of different nonlinear physics processes is then introduced as a preparatory step to reviewing in detail the most important meta-heuristics based on them. A discussion on the novelty of these approaches, their main computational implementation and design issues, and the evaluation of a novel meta-heuristic based on Strange Attractors mutation completes the review of these techniques. We also describe some of the most important application areas, in

  7. Heuristic errors in clinical reasoning.

    PubMed

    Rylander, Melanie; Guerrasio, Jeannette

    2016-08-01

    Errors in clinical reasoning contribute to patient morbidity and mortality. The purpose of this study was to determine the types of heuristic errors made by third-year medical students and first-year residents. This study surveyed approximately 150 clinical educators, inquiring about the types of heuristic errors they observed in third-year medical students and first-year residents. Anchoring and premature closure were the two most common errors observed amongst third-year medical students and first-year residents. There was no difference in the types of errors observed in the two groups. Clinical educators perceived that both third-year medical students and first-year residents committed similar heuristic errors, implying that additional medical knowledge and clinical experience do not affect the types of heuristic errors made. Further work is needed to help identify methods that can be used to reduce heuristic errors early in a clinician's education. © 2015 John Wiley & Sons Ltd.

  8. Heuristics of Twelfth Graders Building Isomorphisms

    ERIC Educational Resources Information Center

    Powell, Arthur B.; Maher, Carolyn A.

    2003-01-01

    This report analyzes the discursive interactions of four students to understand what heuristic methods they develop as well as how and why they build isomorphisms to resolve a combinatorial problem set in a non-Euclidian context. The findings suggest that results of their heuristic actions lead them to build isomorphisms that in turn allow them to…

  9. A Comparison of Usability Evaluation Methods: Heuristic Evaluation versus End-User Think-Aloud Protocol – An Example from a Web-based Communication Tool for Nurse Scheduling

    PubMed Central

    Yen, Po-Yin; Bakken, Suzanne

    2009-01-01

    We evaluated a web-based communication tool for nurse scheduling using two common usability evaluation methods: heuristic evaluation and the end-user think-aloud protocol. We found that heuristic evaluation performed by human-computer interaction (HCI) experts revealed more general interface design problems, while end-users’ think-aloud protocols identified more obstacles to task performance. To provide the most effective and thorough evaluation results, a combination of heuristic evaluation and the end-user think-aloud protocol is recommended. PMID:20351946

  10. Restricted random search method based on taboo search in the multiple minima problem

    NASA Astrophysics Data System (ADS)

    Hong, Seung Do; Jhon, Mu Shik

    1997-03-01

    The restricted random search method is proposed as a simple Monte Carlo sampling method for quickly locating minima in the multiple-minima problem. It is based on taboo search, recently applied to continuous test functions; the concept of a taboo region is used instead of a taboo list, so that sampling near previously visited configurations is restricted. Applied to two-dimensional test functions and to argon clusters, the method is found to be a practical and efficient way of locating near-global configurations.
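
    A minimal sketch of the taboo-region idea, assuming Euclidean balls around previously kept configurations and a Rastrigin-style test function as stand-ins for the paper's settings; the radius and sample budget are invented parameters.

      import math
      import random

      def restricted_random_search(f, bounds, radius=0.3, n_samples=2000):
          # Monte Carlo minimisation that skips samples falling inside a taboo
          # region (a ball of the given radius) around any previously kept point.
          kept, best_x, best_f = [], None, float("inf")
          for _ in range(n_samples):
              x = [random.uniform(lo, hi) for lo, hi in bounds]
              if any(math.dist(x, y) < radius for y in kept):
                  continue          # taboo: too close to an old configuration
              kept.append(x)
              fx = f(x)
              if fx < best_f:
                  best_x, best_f = x, fx
          return best_x, best_f

      # Two-dimensional multiple-minima test function (an assumption).
      f = lambda x: 20 + sum(xi * xi - 10 * math.cos(2 * math.pi * xi) for xi in x)
      print(restricted_random_search(f, [(-5.12, 5.12)] * 2))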

  11. Design space pruning heuristics and global optimization method for conceptual design of low-thrust asteroid tour missions

    NASA Astrophysics Data System (ADS)

    Alemany, Kristina

    Electric propulsion has recently become a viable technology for spacecraft, enabling shorter flight times, fewer required planetary gravity assists, larger payloads, and/or smaller launch vehicles. With the maturation of this technology, however, comes a new set of challenges in the area of trajectory design. Because low-thrust trajectory optimization has historically required long run-times and significant user manipulation, mission design has relied on expert-based knowledge for selecting departure and arrival dates, times of flight, and/or target bodies and gravitational swing-bys. These choices are generally based on known configurations that have worked well in previous analyses or simply on trial and error. At the conceptual design level, however, the ability to explore the full extent of the design space is imperative to locating the best solutions in terms of mass and/or flight times. Beginning in 2005, the Global Trajectory Optimization Competition posed a series of difficult mission design problems, all requiring low-thrust propulsion and visiting one or more asteroids. These problems all had large ranges on the continuous variables (launch date, time of flight, and asteroid stay times, when applicable) and were characterized by millions or even billions of possible asteroid sequences. Even with recent advances in low-thrust trajectory optimization, full enumeration of these problems was not possible within the stringent time limits of the competition. This investigation develops a systematic methodology for determining a broad suite of good solutions to the combinatorial, low-thrust, asteroid tour problem. The target application is conceptual design, where broad exploration of the design space is critical, with the goal being to rapidly identify a reasonable number of promising solutions for future analysis. The proposed methodology has two steps. The first step applies a three-level heuristic sequence developed from the physics of the

  12. Visual tracking method based on cuckoo search algorithm

    NASA Astrophysics Data System (ADS)

    Gao, Ming-Liang; Yin, Li-Ju; Zou, Guo-Feng; Li, Hai-Tao; Liu, Wei

    2015-07-01

    Cuckoo search (CS) is a new meta-heuristic optimization algorithm that is based on the obligate brood-parasitic behavior of some cuckoo species in combination with the Lévy flight behavior of some birds and fruit flies. It has been found to be efficient in solving global optimization problems. An application of CS to the visual tracking problem is presented. The relationship between optimization and visual tracking is comparatively studied, and the sensitivity and adjustment of the CS parameters in the tracking system are experimentally studied. To demonstrate the tracking ability of a CS-based tracker, a comparative study of the tracking accuracy and speed of the CS-based tracker against six state-of-the-art trackers, namely particle filter, mean-shift, PSO, ensemble tracker, fragments tracker, and compressive tracker, is presented. Comparative results show that the CS-based tracker outperforms the other trackers.
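
    The core CS update is easy to sketch. The example below minimises a sphere function as a stand-in for a tracker's appearance-similarity objective; the Lévy steps use Mantegna's algorithm, and all parameter values are illustrative rather than the paper's tuned settings.

      import math
      import random

      def levy_step(beta=1.5):
          # One Lévy-distributed step via Mantegna's algorithm.
          sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
                   / (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
          u, v = random.gauss(0, sigma), random.gauss(0, 1)
          return u / abs(v) ** (1 / beta)

      def cuckoo_search(f, dim=2, n_nests=15, pa=0.25, alpha=0.01, iters=500):
          nests = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n_nests)]
          fit = [f(x) for x in nests]
          for _ in range(iters):
              i = random.randrange(n_nests)
              cand = [xi + alpha * levy_step() for xi in nests[i]]   # Lévy flight
              j = random.randrange(n_nests)
              fc = f(cand)
              if fc < fit[j]:                  # lay the egg in a random nest if better
                  nests[j], fit[j] = cand, fc
              best = min(range(n_nests), key=fit.__getitem__)
              for k in range(n_nests):         # abandon a fraction pa of the nests
                  if k != best and random.random() < pa:
                      nests[k] = [random.uniform(-5, 5) for _ in range(dim)]
                      fit[k] = f(nests[k])
          best = min(range(n_nests), key=fit.__getitem__)
          return nests[best], fit[best]

      print(cuckoo_search(lambda x: sum(xi * xi for xi in x)))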

  13. An intelligent method for geographic Web search

    NASA Astrophysics Data System (ADS)

    Mei, Kun; Yuan, Ying

    2008-10-01

    While the electronically available information in the World-Wide Web is explosively growing, the difficulty of finding relevant information is also increasing for search engine users. In this paper we discuss how to constrain web queries geographically. A number of search queries are associated with geographical locations, either explicitly or implicitly. Accurately and effectively detecting the locations that search queries are truly about has huge potential impact on increasing search relevance, bringing better-targeted search results, and improving search user satisfaction. Our approach is novel both in the way geographic information is extracted from the web and, as far as we can tell, in the way it is integrated into query processing. This paper gives an overview of a spatially aware search engine for semantic querying of web documents. It also illustrates algorithms for extracting locations from web documents and query requests, using location ontologies to encode and reason about the formal semantics of geographic web search. Based on a real-world scenario of tourism guide search, the application of our approach shows that geographic information retrieval can be efficiently supported.

  14. Learning process mapping heuristics under stochastic sampling overheads

    NASA Technical Reports Server (NTRS)

    Ieumwananonthachai, Arthur; Wah, Benjamin W.

    1991-01-01

    A statistical method was developed previously for improving process mapping heuristics. The method systematically explores the space of possible heuristics under a specified time constraint. Its goal is to obtain the best possible heuristics while trading off the solution quality of the process mapping heuristics against their execution time. The statistical selection method is extended here to take into consideration the variations in the amount of time used to evaluate heuristics on a problem instance. The improvement in performance under this more realistic assumption is presented, along with some methods that alleviate the additional complexity.

  15. Comparison tomography relocation hypocenter grid search and guided grid search method in Java island

    NASA Astrophysics Data System (ADS)

    Nurdian, S. W.; Adu, N.; Palupi, I. R.; Raharjo, W.

    2016-11-01

    The main data in this research are earthquake records from 1952 to 2012, comprising 9162 P-wave arrivals from 2426 events recorded by 30 stations located around Java island. Hypocenters were relocated using the grid search and guided grid search methods. The relocated hypocenters then served as input for a pseudo-bending tomographic inversion, which can be used to identify the velocity distribution in the subsurface. The relocation results from the two methods after the tomography process are compared locally and globally. Locally, the grid search result is better, according to the geology of the research area; globally, the guided grid search result is better over a broad area, because its velocity variation is more diverse and in accordance with local geological conditions.
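
    The grid search step can be sketched compactly: scan a lattice of trial hypocentres and keep the one minimising the travel-time residual. The constant-velocity medium, station layout, and picks below are invented; a guided variant would refine the lattice around the current best cell instead of scanning uniformly.

      import itertools
      import math

      # Toy setting: constant P velocity, four surface stations (assumptions).
      VP = 6.0  # km/s
      stations = [(0, 0, 0), (50, 0, 0), (0, 50, 0), (50, 50, 0)]
      true_src = (20.0, 30.0, 10.0)
      obs = [math.dist(true_src, s) / VP for s in stations]  # synthetic picks

      def rms_misfit(src):
          pred = [math.dist(src, s) / VP for s in stations]
          # Solve for origin time in closed form as the mean residual.
          shift = sum(o - p for o, p in zip(obs, pred)) / len(obs)
          return math.sqrt(sum((o - p - shift) ** 2 for o, p in zip(obs, pred)) / len(obs))

      # Exhaustive scan over a 2 km lattice in x, y, z.
      grid = itertools.product(range(0, 51, 2), range(0, 51, 2), range(0, 21, 2))
      best = min(grid, key=rms_misfit)
      print(best, rms_misfit(best))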

  16. The use of geoscience methods for terrestrial forensic searches

    NASA Astrophysics Data System (ADS)

    Pringle, J. K.; Ruffell, A.; Jervis, J. R.; Donnelly, L.; McKinley, J.; Hansen, J.; Morgan, R.; Pirrie, D.; Harrison, M.

    2012-08-01

    Geoscience methods are increasingly being utilised in criminal, environmental and humanitarian forensic investigations, and the use of such methods is supported by a growing body of experimental and theoretical research. Geoscience search techniques can complement traditional methodologies in the search for buried objects, including clandestine graves, weapons, explosives, drugs, hazardous waste and vehicles. This paper details recent advances in search and detection methods, with case studies and reviews. Relevant examples are given, together with a generalised workflow for search and a table of suggested detection techniques. Forensic geoscience techniques continue to evolve rapidly, helping search investigators to detect hitherto difficult-to-locate forensic targets.

  17. Heuristic Evaluation of E-Learning Courses: A Comparative Analysis of Two E-Learning Heuristic Sets

    ERIC Educational Resources Information Center

    Zaharias, Panagiotis; Koutsabasis, Panayiotis

    2012-01-01

    Purpose: The purpose of this paper is to discuss heuristic evaluation as a method for evaluating e-learning courses and applications and more specifically to investigate the applicability and empirical use of two customized e-learning heuristic protocols. Design/methodology/approach: Two representative e-learning heuristic protocols were chosen…

  18. Exhaustive search system and method using space-filling curves

    DOEpatents

    Spires, Shannon V.

    2003-10-21

    A search system and method for one agent or for multiple agents using a space-filling curve provides a way to control one or more agents to cover an area of any space of any dimensionality using an exhaustive search pattern. An example of the space-filling curve is a Hilbert curve. The search area can be a physical geography, a cyberspace search area, or an area searchable by computing resources. The search agents can be one or more physical agents, such as robots, or can be software agents for searching cyberspace.
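
    The coverage idea is easy to demonstrate with the classic index-to-coordinate conversion for the Hilbert curve; the implementation below follows the well-known rotate-and-flip construction and is a sketch of the underlying curve, not the patented system.

      def hilbert_d2xy(order, d):
          # Convert distance d along a Hilbert curve over a 2**order x 2**order
          # grid into (x, y) cell coordinates.
          x = y = 0
          t = d
          s = 1
          while s < (1 << order):
              rx = 1 & (t // 2)
              ry = 1 & (t ^ rx)
              if ry == 0:                      # rotate/flip the quadrant
                  if rx == 1:
                      x, y = s - 1 - x, s - 1 - y
                  x, y = y, x
              x += s * rx
              y += s * ry
              t //= 4
              s *= 2
          return x, y

      # An agent visiting cells in curve order covers the area exhaustively
      # while keeping consecutive waypoints spatially adjacent.
      path = [hilbert_d2xy(3, d) for d in range(64)]
      print(path[:8])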

  1. Heuristic methods for the single machine scheduling problem with different ready times and a common due date

    NASA Astrophysics Data System (ADS)

    Birgin, Ernesto G.; Ronconi, Débora P.

    2012-10-01

    The single machine scheduling problem with a common due date and non-identical ready times for the jobs is examined in this work. Performance is measured by the minimization of the weighted sum of earliness and tardiness penalties of the jobs. Since this problem is NP-hard, the application of constructive heuristics that exploit specific characteristics of the problem to improve their performance is investigated. The proposed approaches are examined through a computational comparative study on a set of 280 benchmark test problems with up to 1000 jobs.
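
    As a toy illustration of a constructive heuristic for this setting (not one of the paper's algorithms), the sketch below sequences jobs by a simple dispatch rule and evaluates the weighted earliness-tardiness cost around a common due date; the jobs, weights, and the rule itself are assumptions.

      # Jobs as (ready_time, processing_time, earliness_weight, tardiness_weight).
      jobs = [(0, 4, 1, 3), (2, 3, 2, 2), (1, 5, 1, 4), (3, 2, 3, 1)]
      DUE = 10

      def weighted_et(sequence):
          t, cost = 0, 0
          for r, p, a, b in sequence:
              t = max(t, r) + p              # wait for the release date, then process
              cost += a * max(0, DUE - t) + b * max(0, t - DUE)
          return cost

      # Dispatch rule (illustrative): jobs that cannot finish by the due date even
      # if started at release go last; otherwise order by descending b/p ratio.
      order = sorted(jobs, key=lambda j: (j[0] + j[1] > DUE, -j[3] / j[1]))
      print(order, weighted_et(order))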

  2. Heuristics for Online Information Retrieval: A Typology and Preliminary Listing.

    ERIC Educational Resources Information Center

    Harter, Stephen P.; Peters, Anne Rogers

    1985-01-01

    Presents typology of online search heuristics consisting of six main classes: philosophical attitudes and overall approach; language of problem description; record and file structure; concept formulation and reformulation; recall and precision; and cost efficiency. Heuristics in each of the six classes are listed and selected examples are briefly…

  3. Heuristic Inquiry: A Personal Journey of Acculturation and Identity Reconstruction

    ERIC Educational Resources Information Center

    Djuraskovic, Ivana; Arthur, Nancy

    2010-01-01

    Heuristic methodology attempts to discover the nature and meaning of a phenomenon through internal self-search, exploration, and discovery. Heuristic methodology encourages the researcher to explore and pursue the creative journey that begins inside one's being and ultimately uncovers its direction and meaning through internal discovery (Douglass &…

  4. Establishing usability heuristics for heuristics evaluation in a specific domain: Is there a consensus?

    PubMed

    Hermawati, Setia; Lawson, Glyn

    2016-09-01

    Heuristic evaluation is frequently employed to evaluate usability. While general heuristics are suitable for evaluating most user interfaces, there is still a need to establish heuristics for specific domains to ensure that their specific usability issues are identified. This paper presents a comprehensive review of 70 studies related to usability heuristics for specific domains. The aim of this paper is to review the processes that were applied to establish heuristics in specific domains and to identify gaps, in order to provide recommendations for future research and areas of improvement. The most urgent issue found is the deficiency of validation effort following the proposition of heuristics, and the lack of robustness and rigour of the validation methods adopted. Whether domain-specific heuristics perform better or worse than general ones remains inconclusive, owing to the lack of validation quality and of clarity on how to assess the effectiveness of heuristics for specific domains. The lack of validation quality also hampers efforts to improve existing domain-specific heuristics, as their weaknesses are not addressed.

  5. Cuckoo search epistasis: a new method for exploring significant genetic interactions.

    PubMed

    Aflakparast, M; Salimi, H; Gerami, A; Dubé, M-P; Visweswaran, S; Masoudi-Nejad, A

    2014-06-01

    The advent of high-throughput sequencing technology has resulted in the ability to measure millions of single-nucleotide polymorphisms (SNPs) from thousands of individuals. Although these high-dimensional data have paved the way for better understanding of the genetic architecture of common diseases, they have also given rise to challenges in developing computational methods for learning epistatic relationships among genetic markers. We propose a new method, named cuckoo search epistasis (CSE), for identifying significant epistatic interactions in population-based association studies with a case-control design. This method combines a computationally efficient Bayesian scoring function with an evolutionary-based heuristic search algorithm, and can be efficiently applied to high-dimensional genome-wide SNP data. The experimental results from synthetic data sets show that CSE outperforms existing methods, including multifactorial dimensionality reduction and Bayesian epistasis association mapping. In addition, on a real genome-wide data set related to Alzheimer's disease, CSE identified SNPs that are consistent with previously reported results, demonstrating the utility of CSE for application to genome-wide data.

  6. Efficient protein structure search using indexing methods

    PubMed Central

    2013-01-01

    Understanding the functions of proteins is one of the most important challenges in many studies of biological processes. The function of a protein can be predicted by analyzing the functions of structurally similar proteins, so finding structurally similar proteins accurately and efficiently in a large set of proteins is crucial. A protein structure can be represented as a vector by the 3D-Zernike Descriptor (3DZD), which compactly represents the surface shape of the protein tertiary structure. This simplified representation accelerates the searching process. However, computing the similarity of two protein structures is still computationally expensive, so it is hard to efficiently process many simultaneous requests for structurally similar protein search. This paper proposes indexing techniques which substantially reduce the search time to find structurally similar proteins. In particular, we first exploit two indexing techniques, i.e., iDistance and iKernel, on the 3DZDs. After that, we extend the techniques to further improve the search speed for protein structures. The extended indexing techniques build and utilize a reduced index constructed from the first few attributes of the 3DZDs of protein structures. To retrieve the top-k similar structures, the top-10 × k similar structures are first found using the reduced index, and the top-k structures are selected among them. We also modify the indexing techniques to support θ-based nearest neighbor search, which returns data points whose distance to the query point is less than θ. The results show that both iDistance and iKernel significantly enhance the searching speed. In top-k nearest neighbor search, the searching time is reduced by 69.6%, 77%, 77.4% and 87.9%, respectively, using iDistance, iKernel, the extended iDistance, and the extended iKernel. In θ-based nearest neighbor search, the searching time is reduced by 80%, 81%, 95.6% and 95.6% using iDistance, iKernel, the extended iDistance, and the extended iKernel, respectively. PMID:23691543

  7. Efficient protein structure search using indexing methods.

    PubMed

    Kim, Sungchul; Sael, Lee; Yu, Hwanjo

    2013-01-01

    Understanding the functions of proteins is one of the most important challenges in many studies of biological processes. The function of a protein can be predicted by analyzing the functions of structurally similar proteins, so finding structurally similar proteins accurately and efficiently in a large set of proteins is crucial. A protein structure can be represented as a vector by the 3D-Zernike Descriptor (3DZD), which compactly represents the surface shape of the protein tertiary structure. This simplified representation accelerates the searching process. However, computing the similarity of two protein structures is still computationally expensive, so it is hard to efficiently process many simultaneous requests for structurally similar protein search. This paper proposes indexing techniques which substantially reduce the search time to find structurally similar proteins. In particular, we first exploit two indexing techniques, i.e., iDistance and iKernel, on the 3DZDs. After that, we extend the techniques to further improve the search speed for protein structures. The extended indexing techniques build and utilize a reduced index constructed from the first few attributes of the 3DZDs of protein structures. To retrieve the top-k similar structures, the top-10 × k similar structures are first found using the reduced index, and the top-k structures are selected among them. We also modify the indexing techniques to support θ-based nearest neighbor search, which returns data points whose distance to the query point is less than θ. The results show that both iDistance and iKernel significantly enhance the searching speed. In top-k nearest neighbor search, the searching time is reduced by 69.6%, 77%, 77.4% and 87.9%, respectively, using iDistance, iKernel, the extended iDistance, and the extended iKernel. In θ-based nearest neighbor search, the searching time is reduced by 80%, 81%, 95.6% and 95.6% using iDistance, iKernel, the extended iDistance, and the extended iKernel, respectively.
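
    The two-stage retrieval idea is straightforward to sketch: prefilter with a reduced index built from the first few descriptor attributes, then re-rank the surviving 10 × k candidates with the full descriptor. The random "descriptors" below are placeholders for real 3DZDs, and the 121-dimensional length is illustrative.

      import numpy as np

      rng = np.random.default_rng(0)
      db = rng.normal(size=(10000, 121))   # full descriptors, one row per structure
      reduced = db[:, :8]                  # reduced index: first 8 attributes only

      def top_k_similar(query, k=5):
          # Stage 1: cheap filter on the reduced index keeps 10 * k candidates.
          cand = np.argsort(np.linalg.norm(reduced - query[:8], axis=1))[:10 * k]
          # Stage 2: exact distances on the candidates pick the final top-k.
          full = np.linalg.norm(db[cand] - query, axis=1)
          return cand[np.argsort(full)[:k]]

      query = rng.normal(size=121)
      print(top_k_similar(query))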

  8. A new methodology for Reynolds-averaged modeling based on the amalgamation of heuristic-modeling and turbulence-theory methods

    NASA Astrophysics Data System (ADS)

    Yoshizawa, Akira; Nisizima, Shoiti; Shimomura, Yutaka; Kobayashi, Hiromichi; Matsuo, Yuichi; Abe, Hiroyuki; Fujiwara, Hitoshi

    2006-03-01

    A new methodology for Reynolds-averaged Navier-Stokes modeling is presented on the basis of the amalgamation of heuristic-modeling and turbulence-theory methods. A characteristic turbulence time scale is synthesized in a heuristic manner through the combination of several characteristic time scales. An algebraic model of turbulent-viscosity type for the Reynolds stress is derived from the Reynolds-stress transport equation with the time scale embedded. It is applied to the state of weak spatial and temporal nonequilibrium, and is compared with its theoretical counterpart derived by the two-scale direct-interaction approximation. The synthesized time scale is justified through the agreement of the two expressions derived by these entirely different methods. The derived model is tested in rotating isotropic, channel, and homogeneous-shear flows. It is extended to a nonlinear algebraic model and a supersonic model. The latter is shown to succeed in reproducing the reduction in the growth rate of a free-shear-layer flow, without causing spurious effects on wall-bounded flows such as channel and boundary-layer flows.

  9. Finite difference methods for reducing numerical diffusion in TEACH-type calculations. [Teaching Elliptic Axisymmetric Characteristics Heuristically

    NASA Technical Reports Server (NTRS)

    Syed, S. A.; Chiappetta, L. M.

    1985-01-01

    A methodological evaluation of two finite-differencing schemes for computer-aided gas turbine design is presented. The two computational schemes are a Bounded Skewed Upwind Differencing Scheme (BSUDS) and a Quadratic Upwind Differencing Scheme (QUDS). In the evaluation, the derivations of the schemes were incorporated into two-dimensional and three-dimensional versions of the Teaching Elliptic Axisymmetric Characteristics Heuristically (TEACH) computer code. Assessments were made according to performance criteria for the solution of problems of turbulent, laminar, and coannular turbulent flow. The specific performance criteria used in the evaluation were simplicity, accuracy, and computational economy. It was found that the BSUDS scheme performed better with respect to the criteria than the QUDS. Some of the reasons for the more successful performance of BSUDS are discussed.

  10. Fluency Heuristic: A Model of How the Mind Exploits a By-Product of Information Retrieval

    ERIC Educational Resources Information Center

    Hertwig, Ralph; Herzog, Stefan M.; Schooler, Lael J.; Reimer, Torsten

    2008-01-01

    Boundedly rational heuristics for inference can be surprisingly accurate and frugal for several reasons. They can exploit environmental structures, co-opt complex capacities, and elude effortful search by exploiting information that automatically arrives on the mental stage. The fluency heuristic is a prime example of a heuristic that makes the…

  11. Job Search Methods: Consequences for Gender-based Earnings Inequality.

    ERIC Educational Resources Information Center

    Huffman, Matt L.; Torres, Lisa

    2001-01-01

    Data from adults in Atlanta, Boston, and Los Angeles (n=1,942) who searched for work using formal (ads, agencies) or informal (networks) methods indicated that type of method used did not contribute to the gender gap in earnings. Results do not support formal job search as a way to reduce gender inequality. (Contains 55 references.) (SK)

  12. Pitfalls in Teaching Judgment Heuristics

    ERIC Educational Resources Information Center

    Shepperd, James A.; Koch, Erika J.

    2005-01-01

    Demonstrations of judgment heuristics typically focus on how heuristics can lead to poor judgments. However, exclusive focus on the negative consequences of heuristics can prove problematic. We illustrate the problem with the representativeness heuristic and present a study (N = 45) that examined how examples influence understanding of the…

  13. An Evaluation of a Method for Improving Search Strategies in a Coordinate Searching System.

    ERIC Educational Resources Information Center

    HEWER, DAVID J.

    Search strategies which can be continuously modified were developed for coordinate searching systems. Using the files of the NASA Technology Utilization Program at the Knowledge Availability Systems Center, University of Pittsburgh, a study was conducted of the retrieval of relevant documents by both manual and machine methods for five questions…

  14. ALMA Pipeline Heuristics

    NASA Astrophysics Data System (ADS)

    Muders, D.; Boone, F.; Wyrowski, F.; Lightfoot, J.; Kosugi, G.; Wilson, C.; Davis, L.; Shepherd, D.

    2007-10-01

    The Atacama Large Millimeter Array / Atacama Compact Array (ALMA / ACA) Pipeline Heuristics system is being developed to automatically reduce data taken with the standard observing modes, such as single fields, mosaics or on-the-fly maps. The goal is to make ALMA user-friendly to astronomers who are not experts in radio interferometry. The Pipeline Heuristics must capture the expert knowledge required to provide data products that can be used without further processing. The Pipeline Heuristics system is being developed as a set of Python scripts that use the Common Astronomy Software Applications (CASA[PY]) libraries and the ATNF Spectral Analysis Package (ASAP) as the data processing engines. The interferometry heuristics scripts currently provide an end-to-end process for the single-field mode, comprising flagging, initial calibration, re-flagging, re-calibration, and imaging of the target data. A Java browser provides user-friendly access to the heuristics results. The initial single-dish heuristics scripts implement automatic spectral line detection, baseline fitting and image gridding. The resulting data cubes are analyzed to detect source emission spectrally and spatially in order to calculate signal-to-noise ratios for comparison against the science goals specified by the observer.

  15. Training and search methods for speech recognition.

    PubMed Central

    Jelinek, F

    1995-01-01

    Speech recognition involves three processes: extraction of acoustic indices from the speech signal, estimation of the probability that the observed index string was caused by a hypothesized utterance segment, and determination of the recognized utterance via a search among hypothesized alternatives. This paper is not concerned with the first process. Estimation of the probability of an index string involves a model of index production by any given utterance segment (e.g., a word). Hidden Markov models (HMMs) are used for this purpose [Makhoul, J. & Schwartz, R. (1995) Proc. Natl. Acad. Sci. USA 92, 9956-9963]. Their parameters are state transition probabilities and output probability distributions associated with the transitions. The Baum algorithm that obtains the values of these parameters from speech data via their successive reestimation will be described in this paper. The recognizer wishes to find the most probable utterance that could have caused the observed acoustic index string. That probability is the product of two factors: the probability that the utterance will produce the string and the probability that the speaker will wish to produce the utterance (the language model probability). Even if the vocabulary size is moderate, it is impossible to search for the utterance exhaustively. One practical algorithm is described [Viterbi, A. J. (1967) IEEE Trans. Inf. Theory IT-13, 260-267] that, given the index string, has a high likelihood of finding the most probable utterance. PMID:7479810
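
    The Viterbi search at the heart of such recognizers can be sketched in a few lines; the two-state model and observation alphabet below are invented for illustration and are far simpler than real acoustic models.

      import math

      def viterbi(obs, states, log_start, log_trans, log_emit):
          # Most probable state path for an HMM, computed in log space.
          V = [{s: log_start[s] + log_emit[s][obs[0]] for s in states}]
          back = []
          for o in obs[1:]:
              col, ptr = {}, {}
              for s in states:
                  prev = max(states, key=lambda r: V[-1][r] + log_trans[r][s])
                  col[s] = V[-1][prev] + log_trans[prev][s] + log_emit[s][o]
                  ptr[s] = prev
              V.append(col)
              back.append(ptr)
          last = max(states, key=lambda s: V[-1][s])
          path = [last]
          for ptr in reversed(back):           # trace back through the pointers
              path.append(ptr[path[-1]])
          return path[::-1], V[-1][last]

      lg = math.log
      states = ("voiced", "silence")
      log_start = {"voiced": lg(0.5), "silence": lg(0.5)}
      log_trans = {"voiced": {"voiced": lg(0.8), "silence": lg(0.2)},
                   "silence": {"voiced": lg(0.3), "silence": lg(0.7)}}
      log_emit = {"voiced": {"hi": lg(0.7), "lo": lg(0.3)},
                  "silence": {"hi": lg(0.1), "lo": lg(0.9)}}
      print(viterbi(("hi", "hi", "lo"), states, log_start, log_trans, log_emit))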

  16. Real-time earthquake monitoring using a search engine method

    PubMed Central

    Zhang, Jie; Zhang, Haijiang; Chen, Enhong; Zheng, Yi; Kuang, Wenhuan; Zhang, Xiong

    2014-01-01

    When an earthquake occurs, seismologists want to use recorded seismograms to infer its location, magnitude and source-focal mechanism as quickly as possible. If such information could be determined immediately, timely evacuations and emergency actions could be undertaken to mitigate earthquake damage. Current advanced methods can report the initial location and magnitude of an earthquake within a few seconds, but estimating the source-focal mechanism may require minutes to hours. Here we present an earthquake search engine, similar to a web search engine, that we developed by applying a computer fast search method to a large seismogram database to find waveforms that best fit the input data. Our method is several thousand times faster than an exact search. For an Mw 5.9 earthquake on 8 March 2012 in Xinjiang, China, the search engine can infer the earthquake’s parameters in <1 s after receiving the long-period surface wave data. PMID:25472861

  17. Real-time earthquake monitoring using a search engine method

    NASA Astrophysics Data System (ADS)

    Zhang, Jie; Zhang, Haijiang; Chen, Enhong; Zheng, Yi; Kuang, Wenhuan; Zhang, Xiong

    2014-12-01

    When an earthquake occurs, seismologists want to use recorded seismograms to infer its location, magnitude and source-focal mechanism as quickly as possible. If such information could be determined immediately, timely evacuations and emergency actions could be undertaken to mitigate earthquake damage. Current advanced methods can report the initial location and magnitude of an earthquake within a few seconds, but estimating the source-focal mechanism may require minutes to hours. Here we present an earthquake search engine, similar to a web search engine, that we developed by applying a computer fast search method to a large seismogram database to find waveforms that best fit the input data. Our method is several thousand times faster than an exact search. For an Mw 5.9 earthquake on 8 March 2012 in Xinjiang, China, the search engine can infer the earthquake’s parameters in <1 s after receiving the long-period surface wave data.

  1. Real-time earthquake monitoring using a search engine method.

    PubMed

    Zhang, Jie; Zhang, Haijiang; Chen, Enhong; Zheng, Yi; Kuang, Wenhuan; Zhang, Xiong

    2014-12-04

    When an earthquake occurs, seismologists want to use recorded seismograms to infer its location, magnitude and source-focal mechanism as quickly as possible. If such information could be determined immediately, timely evacuations and emergency actions could be undertaken to mitigate earthquake damage. Current advanced methods can report the initial location and magnitude of an earthquake within a few seconds, but estimating the source-focal mechanism may require minutes to hours. Here we present an earthquake search engine, similar to a web search engine, that we developed by applying a computer fast search method to a large seismogram database to find waveforms that best fit the input data. Our method is several thousand times faster than an exact search. For an Mw 5.9 earthquake on 8 March 2012 in Xinjiang, China, the search engine can infer the earthquake's parameters in <1 s after receiving the long-period surface wave data.

  2. Method and system for efficiently searching an encoded vector index

    DOEpatents

    Bui, Thuan Quang; Egan, Randy Lynn; Kathmann, Kevin James

    2001-09-04

    Method and system aspects for efficiently searching an encoded vector index are provided. The aspects include the translation of a search query into a candidate bitmap, and the mapping of data from the candidate bitmap into a search result bitmap according to entry values in the encoded vector index. Further, the translation includes the setting of a bit in the candidate bitmap for each entry in a symbol table that corresponds to a candidate of the search query. Also included in the mapping is the identification of the bit value in the candidate bitmap pointed to by an entry in the encoded vector.
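
    The mechanism can be sketched as follows, with invented names: a symbol table maps column values to small codes, the encoded vector stores one code per row, and a query becomes a candidate bitmap indexed by code, so each row is answered by a single bit test.

      # Sketch of encoded-vector-index lookup (names and data are illustrative).
      symbol_table = {"red": 0, "green": 1, "blue": 2}   # value -> code
      evi = [0, 2, 1, 1, 0, 2, 2, 0]                     # one code per table row

      def search(values):
          # Translate the query into a candidate bitmap over the symbol codes.
          candidate = [False] * len(symbol_table)
          for v in values:
              if v in symbol_table:
                  candidate[symbol_table[v]] = True
          # Map each row's code through the candidate bitmap -> result bitmap.
          return [candidate[code] for code in evi]

      print(search({"red", "blue"}))   # True for rows whose column is red or blue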

  3. Path integration mediated systematic search: a Bayesian model.

    PubMed

    Vickerstaff, Robert J; Merkle, Tobias

    2012-08-21

    The systematic search behaviour is a backup system that increases the chances of desert ants finding their nest entrance after foraging when the path integrator has failed to guide them home accurately enough. Here we present a mathematical model of the systematic search that is based on extensive behavioural studies in North African desert ants Cataglyphis fortis. First, a simple search heuristic utilising Bayesian inference and a probability density function is developed. This model, which optimises the short-term nest detection probability, is then compared to three simpler search heuristics and to recorded search patterns of Cataglyphis ants. To compare the different searches a method to quantify search efficiency is established as well as an estimate of the error rate in the ants' path integrator. We demonstrate that the Bayesian search heuristic is able to automatically adapt to increasing levels of positional uncertainty to produce broader search patterns, just as desert ants do, and that it outperforms the three other search heuristics tested. The searches produced by it are also arguably the most similar in appearance to the ant's searches. Copyright © 2012 Elsevier Ltd. All rights reserved.
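
    A minimal sketch of the Bayesian ingredient: a grid prior over possible nest positions (here a Gaussian centred on the path integrator's estimate), updated after each unsuccessful look, with the searcher always visiting the cell of highest posterior mass. As positional uncertainty grows, the same rule automatically produces broader search patterns. All parameter values are invented, not the paper's fitted ones.

      import numpy as np

      side = 21
      yy, xx = np.mgrid[0:side, 0:side]
      # Prior over nest position: Gaussian around the (assumed) PI estimate (10, 10).
      prior = np.exp(-((xx - 10) ** 2 + (yy - 10) ** 2) / (2 * 4.0 ** 2))
      prior /= prior.sum()
      p_detect = 0.8   # assumed chance of spotting the nest when in its cell

      for step in range(5):
          i, j = np.unravel_index(prior.argmax(), prior.shape)
          print("search cell", (i, j))
          # Unsuccessful look: Bayes update moves posterior mass away from (i, j).
          prior[i, j] *= (1 - p_detect)
          prior /= prior.sum()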

  4. Methods for Measuring Search Engine Performance over Time.

    ERIC Educational Resources Information Center

    Bar-Ilan, Judit

    2002-01-01

    Introduces methods for evaluating Web search engine performance over a time period. Describes the necessary setup for such studies, illustrates the use of these measures through a specific example, and recommends the use of the measures as a guideline for testing and improving search engine functionality. (Author/LRW)

  5. Job Search as Goal-Directed Behavior: Objectives and Methods

    ERIC Educational Resources Information Center

    Van Hoye, Greet; Saks, Alan M.

    2008-01-01

    This study investigated the relationship between job search objectives (finding a new job/turnover, staying aware of job alternatives, developing a professional network, and obtaining leverage against an employer) and job search methods (looking at job ads, visiting job sites, networking, contacting employment agencies, contacting employers, and…

  6. Hyper-heuristics with low level parameter adaptation.

    PubMed

    Ren, Zhilei; Jiang, He; Xuan, Jifeng; Luo, Zhongxuan

    2012-01-01

    Recent years have witnessed the great success of hyper-heuristics applied to numerous real-world problems. Hyper-heuristics raise the generality of search methodologies by manipulating a set of low level heuristics (LLHs) to solve problems, and aim to automate the algorithm design process. However, those LLHs are usually parameterized, which may contradict the domain-independent motivation of hyper-heuristics. In this paper, we show how to automatically maintain low level parameters (LLPs) using a hyper-heuristic with LLP adaptation (AD-HH), and exemplify the feasibility of AD-HH by adaptively maintaining the LLPs for two hyper-heuristic models. Furthermore, aiming to tackle the search space expansion due to the LLP adaptation, we apply a heuristic space reduction (SAR) mechanism to improve the AD-HH framework. The integration of the LLP adaptation and the SAR mechanism is able to explore the heuristic space more effectively and efficiently. To evaluate the performance of the proposed algorithms, we choose the p-median problem as a case study. The empirical results show that, with the adaptation of the LLPs and the SAR mechanism, the proposed algorithms are able to achieve competitive results over the three heterogeneous classes of benchmark instances.

  7. Multiobjective hyper heuristic scheme for system design and optimization

    NASA Astrophysics Data System (ADS)

    Rafique, Amer Farhan

    2012-11-01

    As system design is becoming more and more multifaceted, integrated, and complex, the traditional single-objective optimization approach to optimal design is becoming less and less efficient and effective. Single-objective optimization methods present a unique optimal solution, whereas multiobjective methods present a Pareto front. The foremost intent is to predict a reasonably distributed Pareto-optimal solution set, independent of the problem instance, through a multiobjective scheme. A further objective of the intended approach is to improve the worthiness of the outputs of the complex engineering system design process at the conceptual design phase. The process is automated in order to provide the system designer with the leverage of studying and analyzing a large number of possible solutions in a short time. This article presents a Multiobjective Hyper Heuristic Optimization Scheme based on low-level meta-heuristics, developed for application in engineering system design. Herein, we present a stochastic function to manage the low-level meta-heuristics and increase the certainty of reaching a globally optimal solution. Genetic Algorithm, Simulated Annealing and Swarm Intelligence are used as low-level meta-heuristics in this study. The performance of the proposed scheme is investigated through a comprehensive empirical analysis, yielding acceptable results. One of the primary motives for performing multiobjective optimization is that current engineering systems require simultaneous optimization of multiple, conflicting objectives. Random decision making makes the implementation of this scheme attractive and easy. Injecting feasible solutions significantly alters the search direction and also adds diversity to the population, resulting in the accomplishment of the pre-defined goals set in the proposed scheme.

  8. An improvement of a beam search method for warehouse storage allocation planning problems minimizing the number of operations and the aggregated number of products for each customer

    NASA Astrophysics Data System (ADS)

    Nishi, Tatsushi; Yamamoto, Shinichiro; Konishi, Masami

    The storage allocation planning problem in warehouse management is to determine the allocation of products to the storage space and the intermediate operations for retrieving products, so as to minimize the number of operations and maximize the aggregated number of products collected for each customer, when the sequence of requests for inlet and retrieval operations is given. In this paper, we propose an efficient beam search method for generating a near-optimal solution within a reasonable computation time. A heuristic procedure is also proposed to reduce the search space of the beam search method by using the information of subsequent inlet and retrieval requests. The validity of the proposed method is confirmed by comparing its results with the optimal solution derived by solving an MILP problem. The effectiveness of the proposed method is demonstrated by solving an actual large-sized problem consisting of more than 3000 operations.
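
    Generic beam search is easy to sketch: keep only the "width" best partial plans at each depth. The placeholder cost function below stands in for the paper's operation-count and aggregation criteria, and the item data are invented.

      import heapq

      items = [4, 2, 7, 1, 5]   # toy products to place/retrieve (an assumption)

      def expand(plan):
          # All one-step extensions of a partial plan.
          return [plan + [x] for x in items if x not in plan]

      def cost(plan):
          # Placeholder objective: penalise large items placed early.
          return sum(v / (i + 1) for i, v in enumerate(plan))

      def beam_search(width=3):
          beam = [[]]
          for _ in range(len(items)):
              pool = [p for plan in beam for p in expand(plan)]
              beam = heapq.nsmallest(width, pool, key=cost)  # prune to beam width
          return beam[0], cost(beam[0])

      print(beam_search())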

  9. Intelligent process mapping through systematic improvement of heuristics

    NASA Technical Reports Server (NTRS)

    Ieumwananonthachai, Arthur; Aizawa, Akiko N.; Schwartz, Steven R.; Wah, Benjamin W.; Yan, Jerry C.

    1992-01-01

    The present system for the automatic learning and evaluation of novel heuristic methods, applicable to the mapping of communication-process sets on a computer network, has its basis in the testing of a population of competing heuristic methods within a fixed time constraint. The TEACHER 4.1 prototype learning system, implemented for learning new post-game analysis heuristic methods, iteratively generates and refines the mappings of a set of communicating processes on a computer network. A systematic exploration of the space of possible heuristic methods is shown to promise significant improvement.

  10. Method and System for Object Recognition Search

    NASA Technical Reports Server (NTRS)

    Duong, Tuan A. (Inventor); Duong, Vu A. (Inventor); Stubberud, Allen R. (Inventor)

    2012-01-01

    A method for object recognition using shape and color features of the object to be recognized. An adaptive architecture is used to recognize and adapt the shape and color features for moving objects to enable object recognition.

  11. Parallel Heuristics for Scalable Community Detection

    SciTech Connect

    Lu, Howard; Kalyanaraman, Anantharaman; Halappanavar, Mahantesh; Choudhury, Sutanay

    2014-05-17

    Community detection has become a fundamental operation in numerous graph-theoretic applications. It is used to reveal natural divisions that exist within real world networks without imposing prior size or cardinality constraints on the set of communities. Despite its potential for application, there is only limited support for community detection on large-scale parallel computers, largely owing to the irregular and inherently sequential nature of the underlying heuristics. In this paper, we present parallelization heuristics for fast community detection using the Louvain method as the serial template. The Louvain method is an iterative heuristic for modularity optimization. Originally developed by Blondel et al. in 2008, the method has become increasingly popular owing to its ability to detect high modularity community partitions in a fast and memory-efficient manner. However, the method is also inherently sequential, thereby limiting its scalability to problems that can be solved on desktops. Here, we observe certain key properties of this method that present challenges for its parallelization, and consequently propose multiple heuristics that are designed to break the sequential barrier. Our heuristics are agnostic to the underlying parallel architecture. For evaluation purposes, we implemented our heuristics on shared memory (OpenMP) and distributed memory (MapReduce-MPI) machines, and tested them over real world graphs derived from multiple application domains (internet, biological, natural language processing). Experimental results demonstrate the ability of our heuristics to converge to high modularity solutions comparable to those output by the serial algorithm in nearly the same number of iterations, while also drastically reducing time to solution.

  12. Tabu search method with random moves for globally optimal design

    NASA Astrophysics Data System (ADS)

    Hu, Nanfang

    1992-09-01

    Optimum engineering design problems are usually formulated as non-convex optimization problems over continuous variables. Because of the absence of convexity structure, they can have multiple minima, and global optimization becomes difficult. Traditional methods of optimization, such as penalty methods, can often be trapped at a local optimum. The tabu search method with random moves is introduced to solve these problems approximately. Its reliability and efficiency are examined with the help of standard test functions. Analysis of the implementations shows that this method is easy to use and that no derivative information is necessary. It outperforms the random search method and a composite genetic algorithm. In particular, it is applied to minimum-weight design examples of a three-bar truss, coil springs, a Z-section and a channel section. For the channel section, the optimal design using the tabu search method with random moves saved 26.14 percent over the weight obtained by the SUMT method.
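
    A compact sketch of tabu search with random moves for continuous variables, assuming Euclidean tabu balls, a fixed-length tabu list, and an aspiration rule that accepts tabu moves improving the best-so-far; the test function is an invented multiple-minima surface, not one of the paper's design examples.

      import math
      import random

      def tabu_search(f, x0, step=0.5, tabu_radius=0.2, tabu_len=20, iters=500):
          x, fx = x0[:], f(x0)
          best_x, best_f = x[:], fx
          tabu = [x[:]]
          for _ in range(iters):
              cand = [xi + random.uniform(-step, step) for xi in x]  # random move
              fc = f(cand)
              is_tabu = any(math.dist(cand, t) < tabu_radius for t in tabu)
              if is_tabu and fc >= best_f:
                  continue                 # reject tabu moves unless they aspire
              x, fx = cand, fc
              tabu.append(x[:])
              if len(tabu) > tabu_len:     # forget the oldest tabu region
                  tabu.pop(0)
              if fx < best_f:
                  best_x, best_f = x[:], fx
          return best_x, best_f

      # Invented multiple-minima test function.
      f = lambda x: sum(xi * xi - 5 * math.cos(3 * xi) for xi in x)
      print(tabu_search(f, [4.0, -3.0]))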

  13. Hybridisations of Variable Neighbourhood Search and Modified Simplex Elements to Harmony Search and Shuffled Frog Leaping Algorithms for Process Optimisations

    NASA Astrophysics Data System (ADS)

    Aungkulanon, P.; Luangpaiboon, P.

    2010-10-01

    Nowadays, engineering problem systems are large and complicated. Effective finite sequences of instructions for solving these problems can be categorised into optimisation and meta-heuristic algorithms. When the best decision-variable levels cannot be determined exactly from the available alternatives, meta-heuristics offer experience-based techniques that rapidly help in problem solving, learning and discovery, in the hope of obtaining a more efficient or more robust procedure. All meta-heuristics provide auxiliary procedures in terms of their own toolbox functions. It has been shown that the effectiveness of a meta-heuristic depends almost exclusively on these auxiliary functions, and in fact an auxiliary procedure from one meta-heuristic can be implemented in others. The well-known meta-heuristics of harmony search (HSA) and the shuffled frog-leaping algorithm (SFLA) are compared with their hybridisations. HSA is used to produce a near-optimal solution, modelled on the improvisation process by which musicians approach a perfect state of harmony. The SFLA is a population-based cooperative search metaphor inspired by natural memetics; it includes elements of local search and global information exchange. This study presents solution procedures for constrained and unconstrained problems with different natures of single- and multi-peak surfaces, including a curved ridge surface. Both meta-heuristics are modified via the variable neighbourhood search method (VNSM) philosophy, including a modified simplex method (MSM). The basic idea is to change neighbourhoods while searching for a better solution. The hybridisations proceed by a descent method to a local minimum, then explore, systematically or at random, increasingly distant neighbourhoods of this local solution. The results show that the variant of HSA with VNSM and MSM seems to be better in terms of the mean and variance of design points and yields.

  15. A method for representing search results in three dimensions.

    PubMed Central

    Miller, M. H.

    1997-01-01

    This paper presents a new method for representing the results of an information retrieval search in a three-dimensional environment. Aside from the fact that users find 3-D interfaces visually appealing, there are strong practical reasons for developing 3-D representations of search results. Traditional information retrieval systems present results in ordered lists, which are difficult to browse and which exclude useful information. The current method employs a multivariate statistical method called Local Latent Semantic Indexing (LLSI) to create meaningful local dimensions in which to view search results. A prototype Internet-ready system is described which utilizes Virtual Reality Modeling Language (VRML) to display search results. Preliminary tests of this system with a small collection of MEDLINE articles are very encouraging. PMID:9357683

  16. The Convolution Method in Neutrino Physics Searches

    SciTech Connect

    Tsakstara, V.; Kosmas, T. S.; Chasioti, V. C.; Divari, P. C.; Sinatkas, J.

    2007-12-26

    We concentrate on the convolution method used in nuclear and astro-nuclear physics studies and, in particular, in the investigation of the nuclear response of various neutrino detection targets to the energy spectra of specific neutrino sources. Since the cross sections of neutrino reactions with the nuclear detectors employed in experiments are extremely small, very fine and fast convolution techniques are required. Furthermore, sophisticated de-convolution methods are also needed whenever a comparison between calculated unfolded cross sections and existing convoluted results is necessary.

  17. Parallel heuristics for scalable community detection

    DOE PAGES

    Lu, Hao; Halappanavar, Mahantesh; Kalyanaraman, Ananth

    2015-08-14

    Community detection has become a fundamental operation in numerous graph-theoretic applications. Despite its potential for application, there is only limited support for community detection on large-scale parallel computers, largely owing to the irregular and inherently sequential nature of the underlying heuristics. In this paper, we present parallelization heuristics for fast community detection using the Louvain method as the serial template. The Louvain method is an iterative heuristic for modularity optimization. Originally developed in 2008, the method has become increasingly popular owing to its ability to detect high modularity community partitions in a fast and memory-efficient manner. However, the method is also inherently sequential, thereby limiting its scalability. Here, we observe certain key properties of this method that present challenges for its parallelization, and consequently propose heuristics that are designed to break the sequential barrier. For evaluation purposes, we implemented our heuristics using OpenMP multithreading, and tested them over real world graphs derived from multiple application domains. Compared to the serial Louvain implementation, our parallel implementation is able to produce community outputs with a higher modularity for most of the inputs tested, in comparable number or fewer iterations, while providing real speedups of up to 16x using 32 threads.

  18. Accelerated Profile HMM Searches.

    PubMed

    Eddy, Sean R

    2011-10-01

    Profile hidden Markov models (profile HMMs) and probabilistic inference methods have made important contributions to the theory of sequence database homology search. However, practical use of profile HMM methods has been hindered by the computational expense of existing software implementations. Here I describe an acceleration heuristic for profile HMMs, the "multiple segment Viterbi" (MSV) algorithm. The MSV algorithm computes an optimal sum of multiple ungapped local alignment segments using a striped vector-parallel approach previously described for fast Smith/Waterman alignment. MSV scores follow the same statistical distribution as gapped optimal local alignment scores, allowing rapid evaluation of significance of an MSV score and thus facilitating its use as a heuristic filter. I also describe a 20-fold acceleration of the standard profile HMM Forward/Backward algorithms using a method I call "sparse rescaling". These methods are assembled in a pipeline in which high-scoring MSV hits are passed on for reanalysis with the full HMM Forward/Backward algorithm. This accelerated pipeline is implemented in the freely available HMMER3 software package. Performance benchmarks show that the use of the heuristic MSV filter sacrifices negligible sensitivity compared to unaccelerated profile HMM searches. HMMER3 is substantially more sensitive and 100- to 1000-fold faster than HMMER2. HMMER3 is now about as fast as BLAST for protein searches.
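
    The MSV filter is built on optimal ungapped local alignment. As background only (a toy sketch of the building block, not HMMER's striped SIMD implementation or the profile HMM scoring), the single-segment version reduces to running Kadane's maximum-subarray algorithm along each diagonal of the score matrix; MSV generalizes this to an optimal sum of multiple such segments:

        def best_ungapped_segment(s, t, match=2, mismatch=-3):
            """Best-scoring single ungapped local alignment segment of s vs t.

            For each diagonal offset d = i - j, the per-position scores form a
            1-D array, and the best segment on that diagonal is a maximum-
            subarray problem solved by Kadane's algorithm.
            """
            best = 0
            for d in range(-(len(t) - 1), len(s)):   # all diagonal offsets
                run = 0
                i = max(d, 0)
                j = i - d
                while i < len(s) and j < len(t):
                    run += match if s[i] == t[j] else mismatch
                    run = max(run, 0)                # restart a bad segment
                    best = max(best, run)
                    i += 1
                    j += 1
            return best

        print(best_ungapped_segment("ACGTACGT", "TTACGAAT"))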

  19. Experimental Matching of Instances to Heuristics for Constraint Satisfaction Problems

    PubMed Central

    Moreno-Scott, Jorge Humberto; Ortiz-Bayliss, José Carlos; Terashima-Marín, Hugo; Conant-Pablos, Santiago Enrique

    2016-01-01

    Constraint satisfaction problems are of special interest for the artificial intelligence and operations research community due to their many applications. Although heuristics involved in solving these problems have largely been studied in the past, little is known about the relation between instances and the respective performance of the heuristics used to solve them. This paper focuses on both the exploration of the instance space to identify relations between instances and good performing heuristics and how to use such relations to improve the search. Firstly, the document describes a methodology to explore the instance space of constraint satisfaction problems and evaluate the corresponding performance of six variable ordering heuristics for such instances in order to find regions on the instance space where some heuristics outperform the others. Analyzing such regions favors the understanding of how these heuristics work and contribute to their improvement. Secondly, we use the information gathered from the first stage to predict the most suitable heuristic to use according to the features of the instance currently being solved. This approach proved to be competitive when compared against the heuristics applied in isolation on both randomly generated and structured instances of constraint satisfaction problems. PMID:26949383

  20. Experimental Matching of Instances to Heuristics for Constraint Satisfaction Problems.

    PubMed

    Moreno-Scott, Jorge Humberto; Ortiz-Bayliss, José Carlos; Terashima-Marín, Hugo; Conant-Pablos, Santiago Enrique

    2016-01-01

    Constraint satisfaction problems are of special interest for the artificial intelligence and operations research community due to their many applications. Although heuristics involved in solving these problems have largely been studied in the past, little is known about the relation between instances and the respective performance of the heuristics used to solve them. This paper focuses on both the exploration of the instance space to identify relations between instances and good performing heuristics and how to use such relations to improve the search. Firstly, the document describes a methodology to explore the instance space of constraint satisfaction problems and evaluate the corresponding performance of six variable ordering heuristics for such instances in order to find regions on the instance space where some heuristics outperform the others. Analyzing such regions favors the understanding of how these heuristics work and contribute to their improvement. Secondly, we use the information gathered from the first stage to predict the most suitable heuristic to use according to the features of the instance currently being solved. This approach proved to be competitive when compared against the heuristics applied in isolation on both randomly generated and structured instances of constraint satisfaction problems.

  1. ALMA Pipeline Heuristics

    NASA Astrophysics Data System (ADS)

    Lightfoot, J.; Wyrowski, F.; Muders, D.; Boone, F.; Davis, L.; Shepherd, D.; Wilson, C.

    2006-07-01

    The ALMA (Atacama Large Millimeter Array) Pipeline Heuristics system is being developed to automatically reduce data taken with the standard observing modes. The goal is to make ALMA user-friendly to astronomers who are not experts in radio interferometry. The Pipeline Heuristics system must capture the expert knowledge required to provide data products that can be used without further processing. Observing modes to be processed by the system include single field interferometry, mosaics and single dish `on-the-fly' maps, and combinations of these modes. The data will be produced by the main ALMA array, the ALMA Compact Array (ACA) and single dish antennas. The Pipeline Heuristics system is being developed as a set of Python scripts. For interferometry these use as data processing engines the CASA/AIPS++ libraries and their bindings as CORBA objects within the ALMA Common Software (ACS). Initial development has used VLA and Plateau de Bure data sets to build and test a heuristic script capable of reducing single field data. In this paper we describe the reduction datapath and the algorithms used at each stage. Test results are presented. The path for future development is outlined.

  2. Behavior of heuristics on large and hard satisfiability problems

    NASA Astrophysics Data System (ADS)

    Ardelius, John; Aurell, Erik

    2006-09-01

    We study the behavior of a heuristic for solving random satisfiability problems by stochastic local search near the satisfiability threshold. The heuristic for average satisfiability (ASAT) is similar to the Focused Metropolis Search heuristic, and shares the property of being focused, i.e., only variables in unsatisfied clauses are updated in each step. It is significantly simpler than the benchmark WALKSAT heuristic. We show that ASAT solves instances as large as N=10^6 in linear time, on average, up to a ratio of 4.21 clauses per variable in random three-satisfiability. For K higher than 3, ASAT appears to solve instances of K-satisfiability up to the Montanari-Ricci-Tersenghi-Parisi full replica symmetry breaking (FSRB) threshold denoted αs(K) in linear time.

  3. Behavior of heuristics on large and hard satisfiability problems.

    PubMed

    Ardelius, John; Aurell, Erik

    2006-09-01

    We study the behavior of a heuristic for solving random satisfiability problems by stochastic local search near the satisfiability threshold. The heuristic for average satisfiability (ASAT) is similar to the Focused Metropolis Search heuristic, and shares the property of being focused, i.e., only variables in unsatisfied clauses are updated in each step. It is significantly simpler than the benchmark WALKSAT heuristic. We show that ASAT solves instances as large as N=10(6) in linear time, on average, up to a ratio of 4.21 clauses per variable in random three-satisfiability. For K higher than 3, ASAT appears to solve instances of K-satisfiability up to the Montanari-Ricci-Tersenghi-Parisi full replica symmetry breaking (FSRB) threshold denoted alpha(s)(K) in linear time.

  4. A Flexible Transition State Searching Method for Atmospheric Reaction Systems

    SciTech Connect

    Lin, Xiao-Xiao; Liu, Yi-Rong; Huang, Teng; Chen, Jiao; Jiang, Shuai; Huang, Wei

    2015-04-01

    The precise and rapid exploration of transition states (TSs) is a major challenge when studying atmospheric reactions due to their complexity. In this work, a Monte Carlo Transition State Search Method (MCTSSM), which integrates the Monte Carlo sampling technique with transition state optimization methods using an efficient computer script, has been developed for transition state searches. The efficiency and the potential application in atmospheric reactions of this method have been demonstrated by three types of test suites related to the reactions of atmospheric volatile organic compounds (VOCs): (1) OH addition, (2) OH hydrogen-abstraction, and (3) other reactive groups (e.g., Cl, O3, NO3), especially for the reaction of β-pinene-sCI (stabilized Criegee Intermediates) with water. It was shown that the application of this method with effective restricted parameters greatly simplifies the time-consuming and tedious manual search procedure for transition states of bimolecular reaction systems.

  5. A flexible transition state searching method for atmospheric reaction systems

    NASA Astrophysics Data System (ADS)

    Lin, Xiao-Xiao; Liu, Yi-Rong; Huang, Teng; Chen, Jiao; Jiang, Shuai; Huang, Wei

    2015-04-01

    The precise and rapid exploration of transition states (TSs) is a major challenge when studying atmospheric reactions due to their complexity. In this work, a Monte Carlo Transition State Search Method (MCTSSM), which integrates the Monte Carlo sampling technique with transition state optimization methods using an efficient computer script, has been developed for transition state searches. The efficiency and the potential application in atmospheric reactions of this method have been demonstrated by three types of test suites related to the reactions of atmospheric volatile organic compounds (VOCs): (1) OH addition, (2) OH hydrogen-abstraction, and (3) other reactive groups (e.g., Cl, O3, NO3), especially for the reaction of β-pinene-sCI (stabilized Criegee Intermediates) with water. It was shown that the application of this method with effective restricted parameters greatly simplifies the time-consuming and tedious manual search procedure for transition states of bimolecular reaction systems.

  6. A quantum heuristic algorithm for the traveling salesman problem

    NASA Astrophysics Data System (ADS)

    Bang, Jeongho; Ryu, Junghee; Lee, Changhyoup; Yoo, Seokwon; Lim, James; Lee, Jinhyoung

    2012-12-01

    We propose a quantum heuristic algorithm to solve the traveling salesman problem by generalizing the Grover search. Sufficient conditions are derived under which the probability of finding the tours with the cheapest costs is greatly enhanced, approaching unity. These conditions are characterized by the statistical properties of tour costs and are shown to be automatically satisfied in the large-number limit of cities. In particular, for a continuous distribution of tours along the cost axis, we show that the quantum heuristic algorithm exhibits a quadratic speedup over its classical counterpart.

  7. Simple heuristics in over-the-counter drug choices: a new hint for medical education and practice

    PubMed Central

    Riva, Silvia; Monti, Marco; Antonietti, Alessandro

    2011-01-01

    Introduction Over-the-counter (OTC) drugs are widely available and often purchased by consumers without advice from a health care provider. Many people rely on self-management of medications to treat common medical conditions. Although OTC medications are regulated by the National and the International Health and Drug Administration, many people are unaware of proper dosing, side effects, adverse drug reactions, and possible medication interactions. Purpose This study examined how subjects make their decisions to select an OTC drug, evaluating the role of cognitive heuristics which are simple and adaptive rules that help the decision-making process of people in everyday contexts. Subjects and methods By analyzing 70 subjects’ information-search and decision-making behavior when selecting OTC drugs, we examined the heuristics they applied in order to assess whether simple decision-making processes were also accurate and relevant. Subjects were tested with a sequence of two experimental tests based on a computerized Java system devised to analyze participants’ choices in a virtual environment. Results We found that subjects’ information-search behavior reflected the use of fast and frugal heuristics. In addition, although the heuristics which correctly predicted subjects’ decisions implied significantly fewer cues on average than the subjects did in the information-search task, they were accurate in describing order of information search. A simple combination of a fast and frugal tree and a tallying rule predicted more than 78% of subjects’ decisions. Conclusion The current emphasis in health care is to shift some responsibility onto the consumer through expansion of self medication. To know which cognitive mechanisms are behind the choice of OTC drugs is becoming a relevant purpose of current medical education. These findings have implications both for the validity of simple heuristics describing information searches in the field of OTC drug choices and

  8. Heuristics for multiobjective multiple sequence alignment.

    PubMed

    Abbasi, Maryam; Paquete, Luís; Pereira, Francisco B

    2016-07-15

    Aligning multiple sequences arises in many tasks in Bioinformatics. However, the alignments produced by the current software packages are highly dependent on the parameters setting, such as the relative importance of opening gaps with respect to the increase of similarity. Choosing only one parameter setting may provide an undesirable bias in further steps of the analysis and give too simplistic interpretations. In this work, we reformulate multiple sequence alignment from a multiobjective point of view. The goal is to generate several sequence alignments that represent a trade-off between maximizing the substitution score and minimizing the number of indels/gaps in the sum-of-pairs score function. This trade-off gives the practitioner further information about the similarity of the sequences, from which she could analyse and choose the most plausible alignment. We introduce several heuristic approaches, based on local search procedures, that compute a set of sequence alignments, which are representative of the trade-off between the two objectives (substitution score and indels). Several algorithm design options are discussed and analysed, with particular emphasis on the influence of the starting alignment and neighborhood search definitions on the overall performance. A perturbation technique is proposed to improve the local search, which provides a wide range of high-quality alignments. The proposed approach is tested experimentally on a wide range of instances. We performed several experiments with sequences obtained from the benchmark database BAliBASE 3.0. To evaluate the quality of the results, we calculate the hypervolume indicator of the set of score vectors returned by the algorithms. The results obtained allow us to identify reasonably good choices of parameters for our approach. Further, we compared our method in terms of correctly aligned pairs ratio and columns correctly aligned ratio with respect to reference alignments. Experimental results show
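
    To make the multiobjective framing concrete, the following is a minimal sketch (my illustration, not the authors' code) of filtering candidate alignments to the non-dominated trade-off front, where each alignment is scored by a substitution score to maximize and a gap count to minimize:

        def pareto_front(alignments):
            """Keep alignments not dominated in (substitution score, gap count).

            'alignments' is a list of (subst_score, gap_count, payload) tuples.
            Alignment a dominates b if a is at least as good on both objectives
            and strictly better on at least one.
            """
            def dominates(a, b):
                return (a[0] >= b[0] and a[1] <= b[1] and
                        (a[0] > b[0] or a[1] < b[1]))

            return [a for a in alignments
                    if not any(dominates(b, a) for b in alignments if b is not a)]

        # Toy usage with three hypothetical candidate alignments.
        cands = [(95, 12, "aln1"), (90, 14, "aln2"), (88, 7, "aln3")]
        print(pareto_front(cands))   # aln2 is dominated by aln1 and drops out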

  9. Gene selection heuristic algorithm for nutrigenomics studies.

    PubMed

    Valour, D; Hue, I; Grimard, B; Valour, B

    2013-07-15

    Large datasets from -omics studies need to be deeply investigated. The aim of this paper is to provide a new method (LEM method) for the search of transcriptome and metabolome connections. The heuristic algorithm here described extends the classical canonical correlation analysis (CCA) to a high number of variables (without regularization) and combines well-conditioning and fast computing in "R." Reduced CCA models are summarized in PageRank matrices, the product of which gives a stochastic matrix that resumes the self-avoiding walk covered by the algorithm. Then, a homogeneous Markov process applied to this stochastic matrix converges to the probabilities of interconnection between genes, providing a selection of disjointed subsets of genes. This is an alternative to regularized generalized CCA for the determination of blocks within the structure matrix. Each gene subset is thus linked to the whole metabolic or clinical dataset that represents the biological phenotype of interest. Moreover, this selection process reaches the aim of biologists who often need small sets of genes for further validation or extended phenotyping. The algorithm is shown to work efficiently on three published datasets, resulting in meaningfully broadened gene networks.
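
    The convergence step described here, a homogeneous Markov process applied to a stochastic matrix, can be illustrated by plain power iteration. A minimal sketch, assuming a row-stochastic matrix and synthetic data (not the authors' "R" implementation):

        import numpy as np

        def stationary_distribution(P, tol=1e-10, max_iter=10_000):
            """Power iteration for the stationary distribution of a
            row-stochastic matrix P (rows sum to 1): returns pi with
            pi @ P approximately equal to pi."""
            n = P.shape[0]
            pi = np.full(n, 1.0 / n)            # uniform starting vector
            for _ in range(max_iter):
                nxt = pi @ P
                if np.abs(nxt - pi).sum() < tol:
                    return nxt
                pi = nxt
            return pi

        # Toy 3-state chain standing in for gene-interconnection probabilities.
        P = np.array([[0.8, 0.1, 0.1],
                      [0.2, 0.7, 0.1],
                      [0.1, 0.3, 0.6]])
        print(stationary_distribution(P))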

  10. Inferring heuristic classification hierarchies from natural language input

    NASA Technical Reports Server (NTRS)

    Hull, Richard; Gomez, Fernando

    1993-01-01

    A methodology for inferring hierarchies representing heuristic knowledge about the check out, control, and monitoring sub-system (CCMS) of the space shuttle launch processing system from natural language input is explained. Our method identifies failures explicitly and implicitly described in natural language by domain experts and uses those descriptions to recommend classifications for inclusion in the experts' heuristic hierarchies.

  11. A Heuristic Fast Method to Solve the Nonlinear Schroedinger Equation in Fiber Bragg Gratings with Arbitrary Shape Input Pulse

    SciTech Connect

    Emami, F.; Hatami, M.; Keshavarz, A. R.; Jafari, A. H.

    2009-08-13

    Using a combination of the Runge-Kutta and Jacobi iterative methods, we solve the nonlinear Schroedinger equation describing pulse propagation in FBGs. By decomposing the electric field into forward and backward components in the fiber Bragg grating and applying the Fourier series analysis technique, the boundary value problem for the set of coupled equations governing pulse propagation in the FBG is converted into an initial-value problem for coupled equations, which can be solved by a simple Runge-Kutta method.
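
    As a generic illustration of the Runge-Kutta component (the FBG coupled-mode equations themselves are not given in this record, so the toy system below is an assumption, not the paper's model), a classical fourth-order Runge-Kutta step for a coupled complex-valued first-order system:

        import numpy as np

        def rk4_step(f, z, y, h):
            """One classical RK4 step for y' = f(z, y), with y a complex vector
            (e.g., forward/backward field envelopes in a grating)."""
            k1 = f(z, y)
            k2 = f(z + h / 2, y + h / 2 * k1)
            k3 = f(z + h / 2, y + h / 2 * k2)
            k4 = f(z + h, y + h * k3)
            return y + (h / 6) * (k1 + 2 * k2 + 2 * k3 + k4)

        # Toy coupled system standing in for coupled-mode equations:
        # A' = i*kappa*B, B' = -i*kappa*A (illustrative only).
        kappa = 1.0
        f = lambda z, y: np.array([1j * kappa * y[1], -1j * kappa * y[0]])

        y = np.array([1.0 + 0j, 0.0 + 0j])     # all power in the forward wave
        z, h = 0.0, 0.01
        for _ in range(100):
            y = rk4_step(f, z, y, h)
            z += h
        print(y)                                # fields after one unit length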

  12. Search area Expanding Strategy and Dynamic Priority Setting Method in the Improved 2-opt Method

    NASA Astrophysics Data System (ADS)

    Matayoshi, Mitsukuni; Nakamura, Morikazu; Miyagi, Hayao

    We propose a new 2-opt-based method for use in a memetic algorithm, that is, a genetic algorithm (GA) with local search. The basic idea comes from the fast 2-opt method (1) and the improved 2-opt method (20). Our new search method uses the "Priority" employed in the improved 2-opt method, which represents the contribution level of a gene exchange: Matayoshi's improved 2-opt method exchanges genes based on their previous contribution to improving the fitness value. Building on this concept, we propose the search area expanding strategy method in the improved 2-opt method, which enlarges the search area according to the "Priority" (a generic 2-opt sketch follows below). Computer experiments show that the computation time needed to find the exact solution depends on the value of the Priority. Because an appropriate priority cannot always be set beforehand, we also propose a method that adapts it to a suitable value: if no improvement is achieved for a certain number of generations, our dynamic priority setting method modifies the priority by a mutation operation. Experimental results show that the search area expanding strategy method embedded with the dynamic priority setting method finds the exact solution at an earlier generation than the other methods compared.
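
    For readers unfamiliar with the base move, here is a generic 2-opt pass (plain first-improvement version, not the authors' priority-weighted variant): a 2-opt exchange reverses a tour segment and keeps the move when the tour gets shorter.

        def two_opt(tour, dist):
            """First-improvement 2-opt pass for a TSP tour.

            Reversing the segment tour[i:j] replaces edges (tour[i-1], tour[i])
            and (tour[j-1], tour[j % n]) with (tour[i-1], tour[j-1]) and
            (tour[i], tour[j % n]); keep the move when it shortens the tour.
            """
            n = len(tour)
            improved = True
            while improved:
                improved = False
                for i in range(1, n - 1):
                    for j in range(i + 2, n + 1):
                        a, b = tour[i - 1], tour[i]
                        c, d = tour[j - 1], tour[j % n]
                        if dist[a][b] + dist[c][d] > dist[a][c] + dist[b][d]:
                            tour[i:j] = reversed(tour[i:j])
                            improved = True
            return tour

        # Toy usage with a 4-city symmetric distance matrix (hypothetical data).
        D = [[0, 2, 9, 4], [2, 0, 6, 3], [9, 6, 0, 5], [4, 3, 5, 0]]
        print(two_opt([0, 2, 1, 3], D))   # -> [0, 1, 2, 3]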

  13. System, method and apparatus for conducting a keyterm search

    NASA Technical Reports Server (NTRS)

    McGreevy, Michael W. (Inventor)

    2004-01-01

    A keyterm search is a method of searching a database for subsets of the database that are relevant to an input query. First, a number of relational models of subsets of a database are provided. A query is then input. The query can include one or more keyterms. Next, a gleaning model of the query is created. The gleaning model of the query is then compared to each one of the relational models of subsets of the database. The identifiers of the relevant subsets are then output.
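
    The record does not describe how the relational (gleaning) models are built or compared, so the following is only a hypothetical toy of the general shape of such a comparison, applicable to this record and the phrase-search variant that follows: model each text as counts of co-occurring term pairs, then rank subsets by similarity to the query model. All names and the windowed-pair construction are my illustrative assumptions.

        from collections import Counter
        from math import sqrt

        def relational_model(text, window=3):
            """Toy 'relational model': counts of ordered term pairs that
            co-occur within a small window (a stand-in, not the patent's
            actual model construction)."""
            terms = text.lower().split()
            pairs = Counter()
            for i, t in enumerate(terms):
                for u in terms[i + 1 : i + 1 + window]:
                    pairs[(t, u)] += 1
            return pairs

        def cosine(a, b):
            dot = sum(a[k] * b[k] for k in a if k in b)
            na = sqrt(sum(v * v for v in a.values()))
            nb = sqrt(sum(v * v for v in b.values()))
            return dot / (na * nb) if na and nb else 0.0

        # Hypothetical database subsets and a keyterm query.
        subsets = {"r1": "engine stall during climb", "r2": "cabin door warning light"}
        models = {sid: relational_model(txt) for sid, txt in subsets.items()}
        query = relational_model("engine stall")
        print(max(models, key=lambda sid: cosine(query, models[sid])))  # -> r1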

  14. System, method and apparatus for conducting a phrase search

    NASA Technical Reports Server (NTRS)

    McGreevy, Michael W. (Inventor)

    2004-01-01

    A phrase search is a method of searching a database for subsets of the database that are relevant to an input query. First, a number of relational models of subsets of a database are provided. A query is then input. The query can include one or more sequences of terms. Next, a relational model of the query is created. The relational model of the query is then compared to each one of the relational models of subsets of the database. The identifiers of the relevant subsets are then output.

  15. Using Pattern Search Methods for Surface Structure Determination of Nanomaterials

    SciTech Connect

    Zhao, Zhengji; Meza, Juan; Van Hove, Michel

    2006-06-09

    Atomic scale surface structure plays an important role in describing many properties of materials, especially in the case of nanomaterials. One of the most effective techniques for surface structure determination is low-energy electron diffraction (LEED), which can be used in conjunction with optimization to fit simulated LEED intensities to experimental data. This optimization problem has a number of characteristics that make it challenging: it has many local minima, the optimization variables can be either continuous or categorical, the objective function can be discontinuous, there are no exact analytic derivatives (and no derivatives at all for categorical variables), and function evaluations are expensive. In this study, we show how to apply a particular class of optimization methods known as pattern search methods to address these challenges. These methods do not explicitly use derivatives, and are particularly appropriate when categorical variables are present, an important feature that has not been addressed in previous LEED studies. We have found that pattern search methods can produce excellent results, compared to previously used methods, both in terms of performance and locating optimal results.
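
    To make "pattern search" concrete, here is a minimal coordinate (compass) search sketch: derivative-free, polling ±step along each axis and shrinking the step on failure. This is the generic textbook scheme, not the authors' LEED code, and the quadratic objective is a placeholder.

        def compass_search(f, x, step=1.0, shrink=0.5, min_step=1e-6):
            """Derivative-free pattern search: poll x +/- step*e_i for each
            coordinate i; move on improvement, otherwise shrink the step."""
            fx = f(x)
            while step > min_step:
                improved = False
                for i in range(len(x)):
                    for delta in (step, -step):
                        y = x[:]
                        y[i] += delta
                        fy = f(y)
                        if fy < fx:
                            x, fx, improved = y, fy, True
                            break
                    if improved:
                        break
                if not improved:
                    step *= shrink      # refine the mesh around the incumbent
            return x, fx

        # Toy usage on a smooth bowl (a LEED misfit would replace this).
        print(compass_search(lambda v: (v[0] - 1) ** 2 + (v[1] + 2) ** 2,
                             [0.0, 0.0]))    # converges near (1, -2)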

  16. Parameter estimation with bio-inspired meta-heuristic optimization: modeling the dynamics of endocytosis.

    PubMed

    Tashkova, Katerina; Korošec, Peter; Silc, Jurij; Todorovski, Ljupčo; Džeroski, Sašo

    2011-10-11

    We address the task of parameter estimation in models of the dynamics of biological systems based on ordinary differential equations (ODEs) from measured data, where the models are typically non-linear and have many parameters, the measurements are imperfect due to noise, and the studied system can often be only partially observed. A representative task is to estimate the parameters in a model of the dynamics of endocytosis, i.e., endosome maturation, reflected in a cut-out switch transition between the Rab5 and Rab7 domain protein concentrations, from experimental measurements of these concentrations. The general parameter estimation task and the specific instance considered here are challenging optimization problems, calling for the use of advanced meta-heuristic optimization methods, such as evolutionary or swarm-based methods. We apply three global-search meta-heuristic algorithms for numerical optimization, i.e., differential ant-stigmergy algorithm (DASA), particle-swarm optimization (PSO), and differential evolution (DE), as well as a local-search derivative-based algorithm 717 (A717) to the task of estimating parameters in ODEs. We evaluate their performance on the considered representative task along a number of metrics, including the quality of reconstructing the system output and the complete dynamics, as well as the speed of convergence, both on real-experimental data and on artificial pseudo-experimental data with varying amounts of noise. We compare the four optimization methods under a range of observation scenarios, where data of different completeness and accuracy of interpretation are given as input. Overall, the global meta-heuristic methods (DASA, PSO, and DE) clearly and significantly outperform the local derivative-based method (A717). Among the three meta-heuristics, differential evolution (DE) performs best in terms of the objective function, i.e., reconstructing the output, and in terms of convergence. These results hold for both real and
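
    A minimal sketch of the general workflow, a global meta-heuristic fitting ODE parameters to noisy measurements, using SciPy's differential evolution on a toy two-state system rather than the paper's Rab5/Rab7 model (all rates and noise levels below are illustrative):

        import numpy as np
        from scipy.integrate import solve_ivp
        from scipy.optimize import differential_evolution

        def simulate(params, t_eval, y0=(1.0, 0.0)):
            """Toy two-state exchange y0 <-> y1 with rates k1, k2."""
            k1, k2 = params
            rhs = lambda t, y: [-k1 * y[0] + k2 * y[1], k1 * y[0] - k2 * y[1]]
            sol = solve_ivp(rhs, (t_eval[0], t_eval[-1]), y0, t_eval=t_eval)
            return sol.y

        t = np.linspace(0, 10, 50)
        rng = np.random.default_rng(1)
        true = simulate((0.8, 0.3), t)
        data = true + rng.normal(0, 0.02, true.shape)   # pseudo-experimental noise

        def sse(params):
            """Sum of squared errors between model output and measurements."""
            return float(((simulate(params, t) - data) ** 2).sum())

        result = differential_evolution(sse, bounds=[(0.01, 5.0)] * 2, seed=1)
        print(result.x)    # estimated (k1, k2), close to (0.8, 0.3)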

  17. Parameter estimation with bio-inspired meta-heuristic optimization: modeling the dynamics of endocytosis

    PubMed Central

    2011-01-01

    Background We address the task of parameter estimation in models of the dynamics of biological systems based on ordinary differential equations (ODEs) from measured data, where the models are typically non-linear and have many parameters, the measurements are imperfect due to noise, and the studied system can often be only partially observed. A representative task is to estimate the parameters in a model of the dynamics of endocytosis, i.e., endosome maturation, reflected in a cut-out switch transition between the Rab5 and Rab7 domain protein concentrations, from experimental measurements of these concentrations. The general parameter estimation task and the specific instance considered here are challenging optimization problems, calling for the use of advanced meta-heuristic optimization methods, such as evolutionary or swarm-based methods. Results We apply three global-search meta-heuristic algorithms for numerical optimization, i.e., differential ant-stigmergy algorithm (DASA), particle-swarm optimization (PSO), and differential evolution (DE), as well as a local-search derivative-based algorithm 717 (A717) to the task of estimating parameters in ODEs. We evaluate their performance on the considered representative task along a number of metrics, including the quality of reconstructing the system output and the complete dynamics, as well as the speed of convergence, both on real-experimental data and on artificial pseudo-experimental data with varying amounts of noise. We compare the four optimization methods under a range of observation scenarios, where data of different completeness and accuracy of interpretation are given as input. Conclusions Overall, the global meta-heuristic methods (DASA, PSO, and DE) clearly and significantly outperform the local derivative-based method (A717). Among the three meta-heuristics, differential evolution (DE) performs best in terms of the objective function, i.e., reconstructing the output, and in terms of convergence. These

  18. Reliable Transition State Searches Integrated with the Growing String Method.

    PubMed

    Zimmerman, Paul

    2013-07-09

    The growing string method (GSM) is highly useful for locating reaction paths connecting two molecular intermediates. GSM has often been used in a two-step procedure to locate exact transition states (TS), where GSM creates a quality initial structure for a local TS search. This procedure and others like it, however, do not always converge to the desired transition state because the local search is sensitive to the quality of the initial guess. This article describes an integrated technique for simultaneous reaction path and exact transition state search. This is achieved by implementing an eigenvector following optimization algorithm in internal coordinates with Hessian update techniques. After partial convergence of the string, an exact saddle point search begins under the constraint that the maximized eigenmode of the TS node Hessian has significant overlap with the string tangent near the TS. Subsequent optimization maintains connectivity of the string to the TS as well as locks in the TS direction, all but eliminating the possibility that the local search leads to the wrong TS. To verify the robustness of this approach, reaction paths and TSs are found for a benchmark set of more than 100 elementary reactions.

  19. SSAHA: A Fast Search Method for Large DNA Databases

    PubMed Central

    Ning, Zemin; Cox, Anthony J.; Mullikin, James C.

    2001-01-01

    We describe an algorithm, SSAHA (Sequence Search and Alignment by Hashing Algorithm), for performing fast searches on databases containing multiple gigabases of DNA. Sequences in the database are preprocessed by breaking them into consecutive k-tuples of k contiguous bases and then using a hash table to store the position of each occurrence of each k-tuple. Searching for a query sequence in the database is done by obtaining from the hash table the “hits” for each k-tuple in the query sequence and then performing a sort on the results. We discuss the effect of the tuple length k on the search speed, memory usage, and sensitivity of the algorithm and present the results of computational experiments which show that SSAHA can be three to four orders of magnitude faster than BLAST or FASTA, while requiring less memory than suffix tree methods. The SSAHA algorithm is used for high-throughput single nucleotide polymorphism (SNP) detection and very large scale sequence assembly. Also, it provides Web-based sequence search facilities for Ensembl projects. PMID:11591649
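
    The core data structure is simple enough to sketch. The following toy illustrates the hashing idea only; SSAHA itself stores non-overlapping k-tuples and uses a more careful hit-sorting scheme, and the sequences below are made up.

        from collections import defaultdict

        def build_index(db_seqs, k=4):
            """Hash table: k-tuple -> list of (sequence id, offset) occurrences.
            This toy indexes every offset; SSAHA indexes non-overlapping tuples."""
            index = defaultdict(list)
            for sid, seq in db_seqs.items():
                for i in range(len(seq) - k + 1):
                    index[seq[i:i + k]].append((sid, i))
            return index

        def search(index, query, k=4):
            """Collect hits for each query k-tuple, shifted so that hits from
            the same ungapped match share a diagonal; sort to group them."""
            hits = []
            for i in range(len(query) - k + 1):
                for sid, pos in index.get(query[i:i + k], ()):
                    hits.append((sid, pos - i))      # (sequence, diagonal)
            hits.sort()
            return hits

        db = {"s1": "ACGTACGTGGTT", "s2": "TTGGCCAATACG"}
        idx = build_index(db)
        print(search(idx, "GTACGT"))  # repeated (sid, diagonal) pairs flag a match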

  20. Automating the packing heuristic design process with genetic programming.

    PubMed

    Burke, Edmund K; Hyde, Matthew R; Kendall, Graham; Woodward, John

    2012-01-01

    The literature shows that one-, two-, and three-dimensional bin packing and knapsack packing are difficult problems in operational research. Many techniques, including exact, heuristic, and metaheuristic approaches, have been investigated to solve these problems and it is often not clear which method to use when presented with a new instance. This paper presents an approach which is motivated by the goal of building computer systems which can design heuristic methods. The overall aim is to explore the possibilities for automating the heuristic design process. We present a genetic programming system to automatically generate a good quality heuristic for each instance. It is not necessary to change the methodology depending on the problem type (one-, two-, or three-dimensional knapsack and bin packing problems), and it therefore has a level of generality unmatched by other systems in the literature. We carry out an extensive suite of experiments and compare with the best human designed heuristics in the literature. Note that our heuristic design methodology uses the same parameters for all the experiments. The contribution of this paper is to present a more general packing methodology than those currently available, and to show that, by using this methodology, it is possible for a computer system to design heuristics which are competitive with the human designed heuristics from the literature. This represents the first packing algorithm in the literature able to claim human competitive results in such a wide variety of packing domains.

  1. Reexamining our bias against heuristics.

    PubMed

    McLaughlin, Kevin; Eva, Kevin W; Norman, Geoff R

    2014-08-01

    Using heuristics offers several cognitive advantages, such as increased speed and reduced effort when making decisions, in addition to allowing us to make decisions in situations where missing data do not allow for formal reasoning. But the traditional view of heuristics is that they trade accuracy for efficiency. Here the authors discuss sources of bias in the literature implicating the use of heuristics in diagnostic error and highlight the fact that there are also data suggesting that, under certain circumstances, using heuristics may lead to better decisions than formal analysis. They suggest that diagnostic error is frequently misattributed to the use of heuristics and propose an alternative view whereby content knowledge is the root cause of diagnostic performance and heuristics lie on the causal pathway between knowledge and diagnostic error or success.

  2. A novel hybrid meta-heuristic technique applied to the well-known benchmark optimization problems

    NASA Astrophysics Data System (ADS)

    Abtahi, Amir-Reza; Bijari, Afsane

    2017-09-01

    In this paper, a hybrid meta-heuristic algorithm based on the imperialistic competition algorithm (ICA), harmony search (HS), and simulated annealing (SA) is presented. The body of the proposed hybrid algorithm is based on ICA. The hybrid algorithm inherits the harmony-creation process of the HS algorithm to improve the exploitation phase of ICA, and uses SA to balance the exploration and exploitation phases. The proposed hybrid algorithm is compared with several meta-heuristic methods, including the genetic algorithm (GA), HS, and ICA, on several well-known benchmark instances. Comprehensive experiments and statistical analysis on standard benchmark functions certify the superiority of the proposed method over the other algorithms. The efficacy of the proposed hybrid algorithm is promising, and it can be used in several real-life engineering and management problems.

  3. A novel hybrid meta-heuristic technique applied to the well-known benchmark optimization problems

    NASA Astrophysics Data System (ADS)

    Abtahi, Amir-Reza; Bijari, Afsane

    2016-09-01

    In this paper, a hybrid meta-heuristic algorithm based on the imperialistic competition algorithm (ICA), harmony search (HS), and simulated annealing (SA) is presented. The body of the proposed hybrid algorithm is based on ICA. The hybrid algorithm inherits the harmony-creation process of the HS algorithm to improve the exploitation phase of ICA, and uses SA to balance the exploration and exploitation phases. The proposed hybrid algorithm is compared with several meta-heuristic methods, including the genetic algorithm (GA), HS, and ICA, on several well-known benchmark instances. Comprehensive experiments and statistical analysis on standard benchmark functions certify the superiority of the proposed method over the other algorithms. The efficacy of the proposed hybrid algorithm is promising, and it can be used in several real-life engineering and management problems.

  4. Heuristic reusable dynamic programming: efficient updates of local sequence alignment.

    PubMed

    Hong, Changjin; Tewfik, Ahmed H

    2009-01-01

    Recomputation of previously evaluated similarity results between biological sequences becomes inevitable when researchers realize errors in their sequenced data or when they have to compare nearly similar sequences, e.g., in a family of proteins. We present an efficient scheme for updating local sequence alignments with an affine gap model. In principle, using the previous matching result between two amino acid sequences, we perform a forward-backward alignment to generate heuristic search bands which are bounded by a set of suboptimal paths. Given a correctly updated sequence, we initially predict a new score of the alignment path for each contour to select the best candidates among them. Then, we run the Smith-Waterman algorithm in this confined space. Furthermore, our heuristic alignment for an updated sequence can be further accelerated by using reusable dynamic programming (rDP), our prior work. In this study, we successfully validate the "relative node tolerance bound" (RNTB) in the pruned search space. Furthermore, we improve the computational performance by quantifying the successful RNTB tolerance probability and switch to rDP on perturbation-resilient columns only. In our search space derived by a threshold value of 90 percent of the optimal alignment score, we find that 98.3 percent of contours contain correctly updated paths. We also find that our method consumes only 25.36 percent of the runtime of the sparse dynamic programming (sDP) method, and only 2.55 percent of that of normal dynamic programming with the Smith-Waterman algorithm.
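
    For reference, the underlying Smith-Waterman recurrence that the update scheme accelerates; shown with a linear gap penalty for brevity, whereas the paper uses an affine gap model:

        def smith_waterman(s, t, match=2, mismatch=-1, gap=-2):
            """Basic Smith-Waterman local alignment score (linear gap penalty).
            H[i][j] = best local alignment score ending at s[i-1], t[j-1]."""
            rows, cols = len(s) + 1, len(t) + 1
            H = [[0] * cols for _ in range(rows)]
            best = 0
            for i in range(1, rows):
                for j in range(1, cols):
                    sub = match if s[i - 1] == t[j - 1] else mismatch
                    H[i][j] = max(0,
                                  H[i - 1][j - 1] + sub,   # match/mismatch
                                  H[i - 1][j] + gap,       # gap in t
                                  H[i][j - 1] + gap)       # gap in s
                    best = max(best, H[i][j])
            return best

        print(smith_waterman("HEAGAWGHEE", "PAWHEAE"))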

  5. Structural Functionalism as a Heuristic Device.

    ERIC Educational Resources Information Center

    Chilcott, John H.

    1998-01-01

    Argues that structural functionalism as a method for conducting fieldwork and as a format for the analysis of ethnographic data remains a powerful model, one that is easily understood by professional educators. As a heuristic device, functionalist theory can help in the solution of a problem that is otherwise incapable of theoretical…

  6. Describing a Performance Improvement Specialist: The Heurist.

    ERIC Educational Resources Information Center

    Westgaard, Odin

    1997-01-01

    Describes the work of performance improvement specialists and presents a method for determining whether a particular person or position meets the job criteria. Discusses the attributes of being a heurist, or taking a holistic approach to problem solving. Lists 10 steps for a needs assessment and 30 characteristics of successful performance…

  7. Heuristic Classification. Technical Report Number 12.

    ERIC Educational Resources Information Center

    Clancey, William J.

    A broad range of well-structured problems--embracing forms of diagnosis, catalog selection, and skeletal planning--are solved in expert computer systems by the method of heuristic classification. These programs have a characteristic inference structure that systematically relates data to a pre-enumerated set of solutions by abstraction, heuristic…

  8. Exploration of Stellarator Configuration Space with Global Search Methods

    SciTech Connect

    H.E. Mynick; N. Pomphrey; S. Ethier

    2001-09-10

    An exploration of stellarator configuration space z for quasi-axisymmetric stellarator (QAS) designs is discussed, using methods which provide a more global view of that space. To this end, we have implemented a ''differential evolution'' (DE) search algorithm in an existing stellarator optimizer, which is much less prone to become trapped in local, suboptimal minima of the cost function chi than the local search methods used previously. This search algorithm is complemented by mapping studies of chi over z aimed at gaining insight into the results of the automated searches. We find that a wide range of the attractive QAS configurations previously found fall into a small number of classes, with each class corresponding to a basin of chi(z). We develop maps on which these earlier stellarators can be placed, the relations among them seen, and understanding gained into the physics differences between them. It is also found that, while still large, the region of z space containing practically realizable QAS configurations is much smaller than earlier supposed.

  9. The Use of Resistivity Methods in Terrestrial Forensic Searches

    NASA Astrophysics Data System (ADS)

    Wolf, R. C.; Raisuddin, I.; Bank, C.

    2013-12-01

    The increasing use of near-surface geophysical methods in forensic searches has demonstrated the need for further studies to identify the ideal physical, environmental and temporal settings for each geophysical method. Previous studies using resistivity methods have shown promising results, but additional work is required to more accurately interpret and analyze survey findings. The Ontario Provincial Police's UCRT (Urban Search and Rescue; Chemical, Biological, Radiological, Nuclear and Explosives; Response Team) is collaborating with the University of Toronto and two additional universities in a multi-year study investigating the applications of near-surface geophysical methods to terrestrial forensic searches. In the summer of 2012, on a test site near Bolton, Ontario, the OPP buried weapons, drums and pigs (naked, tarped, and clothed) to simulate clandestine graves and caches. Our study aims to conduct repeat surveys using an IRIS Syscal Junior resistivity meter with a 48-electrode switching system. These surveys will monitor changes in resistivity reflecting decomposition of the buried objects and identify the strengths and weaknesses of resistivity when used in a rural, clandestine burial setting. Our initial findings indicate the usefulness of this method, as prominent resistivity changes have been observed. We anticipate our results will assist law enforcement agencies in determining the type of resistivity results to expect based on time since burial, depth of burial and state of dress of the body.

  10. Automated Detection of Heuristics and Biases among Pathologists in a Computer-Based System

    ERIC Educational Resources Information Center

    Crowley, Rebecca S.; Legowski, Elizabeth; Medvedeva, Olga; Reitmeyer, Kayse; Tseytlin, Eugene; Castine, Melissa; Jukic, Drazen; Mello-Thoms, Claudia

    2013-01-01

    The purpose of this study is threefold: (1) to develop an automated, computer-based method to detect heuristics and biases as pathologists examine virtual slide cases, (2) to measure the frequency and distribution of heuristics and errors across three levels of training, and (3) to examine relationships of heuristics to biases, and biases to…

  11. A Hidden Markov Model Approach to the Problem of Heuristic Selection in Hyper-heuristics with a Case Study in High School Timetabling Problems.

    PubMed

    Kheiri, Ahmed; Keedwell, Ed

    2016-06-03

    Operations research is a well established field that uses computational systems to support decisions in business and public life. Good solutions to operations research problems can make a large difference to the efficient running of businesses and organisations and so the field often searches for new methods to improve these solutions. The high school timetabling problem is an example of an operations research problem and is a challenging task which requires assigning events and resources to time slots subject to a set of constraints. In this paper a new sequence-based selection hyper-heuristic is presented that produces excellent results on a suite of high school timetabling problems. In this study, we present an easy-to-implement, easy-to-maintain and effective sequence-based selection hyper-heuristic to solve high school timetabling problems using a benchmark of unified real-world instances collected from different countries. We show that with sequence-based methods, it is possible to discover new best known solutions for a number of the problems in the timetabling domain. Through this investigation, the usefulness of sequence-based selection hyper-heuristics has been demonstrated and the capability of these methods has been shown to exceed the state-of-the-art.
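
    A minimal sketch of the sequence-based selection idea (illustrative only; the paper's hidden Markov machinery and acceptance scheme are richer): sample the next low-level heuristic from a learned transition matrix and reinforce transitions that led to improvement. Here `heuristics` is assumed to be a list of functions each mapping a solution to a perturbed copy, and `cost` a function to minimize.

        import random

        def selection_hyper_heuristic(heuristics, solution, cost, iters=1000):
            """Sequence-based selection: choose the next low-level heuristic
            conditioned on the previous one, reinforcing improving transitions."""
            n = len(heuristics)
            W = [[1.0] * n for _ in range(n)]       # transition weights
            prev = 0
            best = cost(solution)
            for _ in range(iters):
                nxt = random.choices(range(n), weights=W[prev])[0]
                candidate = heuristics[nxt](solution)
                c = cost(candidate)
                if c <= best:                       # accept non-worsening moves
                    if c < best:
                        W[prev][nxt] += 1.0         # reinforce useful sequences
                    solution, best = candidate, c
                prev = nxt
            return solution, best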

  12. Obtaining Maxwell's equations heuristically

    NASA Astrophysics Data System (ADS)

    Diener, Gerhard; Weissbarth, Jürgen; Grossmann, Frank; Schmidt, Rüdiger

    2013-02-01

    Starting from the experimental fact that a moving charge experiences the Lorentz force and applying the fundamental principles of simplicity (first order derivatives only) and linearity (superposition principle), we show that the structure of the microscopic Maxwell equations for the electromagnetic fields can be deduced heuristically by using the transformation properties of the fields under space inversion and time reversal. Using the experimental facts of charge conservation and that electromagnetic waves propagate with the speed of light, together with Galilean invariance of the Lorentz force, allows us to finalize Maxwell's equations and to introduce arbitrary electrodynamics units naturally.

  13. Photovoltaic maximum power point search method using a light sensor

    NASA Astrophysics Data System (ADS)

    Ostrowski, Mariusz

    2015-05-01

    The main disadvantage of PV panels is their low efficiency and non-linear current-voltage characteristic, both of which depend on the insolation and the temperature. That is why it is necessary to use maximum power point search systems. Commonly used solutions vary not only in complexity and accuracy but also in the speed of searching the maximum power point. Usually, the measurement of current and voltage is used to determine the maximum power point; the most common methods in the literature are perturb and observe and incremental conductance. The disadvantage of these solutions is the need to search across the whole current-voltage curve, which results in a significant power loss. To prevent this, the techniques mentioned above are combined with other methods that determine the starting point of the search, which shortens the search time. Modern solutions use the temperature measurement to determine the open circuit voltage. Simulations show that the voltage at the maximum power point depends mainly on the temperature of the photovoltaic panel, while the current depends mainly on the lighting conditions. The proposed method uses the measurement of illuminance and calculates the current at the maximum power point, which is used as a reference signal in the power conversion system. Due to the non-linearity of the light sensor and of the photovoltaic panel, the relation between them cannot be determined directly; therefore, the proposed method uses a modified correlation function to calculate the current corresponding to the measured light.
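
    For contrast with the proposed light-sensor method, the perturb-and-observe baseline mentioned above can be sketched in a few lines (generic textbook version; the panel model, starting voltage and step size are illustrative assumptions):

        def perturb_and_observe(measure_pv, v_ref=18.0, step=0.2, iters=100):
            """Classic P&O MPPT: perturb the operating voltage and keep moving
            in the direction that increased power.
            measure_pv(v) -> (current, power) at operating voltage v."""
            _, p_prev = measure_pv(v_ref)
            direction = +1
            for _ in range(iters):
                v_ref += direction * step
                _, p = measure_pv(v_ref)
                if p < p_prev:                # power dropped: reverse direction
                    direction = -direction
                p_prev = p
            return v_ref                      # oscillates around the MPP voltage

        # Toy panel whose power peaks at 17 V (hypothetical stand-in).
        toy = lambda v: (1.0, max(0.0, 60 - (v - 17.0) ** 2))
        print(perturb_and_observe(toy))       # settles near 17 V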

  14. Conformational analysis of macrocycles: finding what common search methods miss.

    PubMed

    Bonnet, Pascal; Agrafiotis, Dimitris K; Zhu, Fangqiang; Martin, Eric

    2009-10-01

    As computational drug design becomes increasingly reliant on virtual screening and on high-throughput 3D modeling, the need for fast, robust, and reliable methods for sampling molecular conformations has become greater than ever. Furthermore, chemical novelty is at a premium, forcing medicinal chemists to explore more complex structural motifs and unusual topologies. This necessitates the use of conformational sampling techniques that work well in all cases. Here, we compare the performance of several popular conformational search algorithms on three broad classes of macrocyclic molecules. These methods include Catalyst, CAESAR, MacroModel, MOE, Omega, Rubicon and two newer self-organizing algorithms known as stochastic proximity embedding (SPE) and self-organizing superimposition (SOS) that have been developed at Johnson & Johnson. Our results show a compelling advantage for the three distance geometry methods (SOS, SPE, and Rubicon) followed to a lesser extent by MacroModel. The remaining techniques, particularly those based on systematic search, often failed to identify any of the lowest energy conformations and are unsuitable for this class of structures. Taken together with our previous study on drug-like molecules (Agrafiotis, D. K.; Gibbs, A.; Zhu, F.; Izrailev, S.; Martin, E. Conformational Sampling of Bioactive Molecules: A Comparative Study. J. Chem. Inf. Model., 2007, 47, 1067-1086), these results suggest that SPE and SOS are two of the most robust and universally applicable conformational search methods, with the latter being preferred because of its superior speed.

  15. An explicit-solvent conformation search method using open software

    PubMed Central

    Gaalswyk, Kari

    2016-01-01

    Computer modeling is a popular tool to identify the most-probable conformers of a molecule. Although the solvent can have a large effect on the stability of a conformation, many popular conformational search methods are only capable of describing molecules in the gas phase or with an implicit solvent model. We have developed a work-flow for performing a conformation search on explicitly-solvated molecules using open source software. This method uses replica exchange molecular dynamics (REMD) to sample the conformational states of the molecule efficiently. Cluster analysis is used to identify the most probable conformations from the simulated trajectory. This work-flow was tested on drug molecules α-amanitin and cabergoline to illustrate its capabilities and effectiveness. The preferred conformations of these molecules in gas phase, implicit solvent, and explicit solvent are significantly different. PMID:27280078

  16. Alpha-beta coordination method for collective search

    DOEpatents

    Goldsmith, Steven Y.

    2002-01-01

    The present invention comprises a decentralized coordination strategy called alpha-beta coordination. The alpha-beta coordination strategy is a family of collective search methods that allow teams of communicating agents to implicitly coordinate their search activities through a division of labor based on self-selected roles and self-determined status. An agent can play one of two complementary roles. An agent in the alpha role is motivated to improve its status by exploring new regions of the search space. An agent in the beta role is also motivated to improve its status, but is conservative and tends to remain aggregated with other agents until alpha agents have clearly identified and communicated better regions of the search space. An agent can select its role dynamically based on its current status value relative to the status values of neighboring team members. Status can be determined by a function of the agent's sensor readings, and can generally be a measurement of source intensity at the agent's current location. An agent's decision cycle can comprise three sequential decision rules: (1) selection of a current role based on the evaluation of the current status data, (2) selection of a specific subset of the current data, and (3) determination of the next heading using the selected data. Variations of the decision rules produce different versions of alpha and beta behaviors that lead to different collective behavior properties.

  17. The Gaussian CLs method for searches of new physics

    DOE PAGES

    Qian, X.; Tan, A.; Ling, J. J.; ...

    2016-04-23

    Here we describe a method based on the CLs approach to present results in searches of new physics, under the condition that the relevant parameter space is continuous. Our method relies on a class of test statistics developed for non-nested hypothesis testing problems, denoted by ΔT, which has a Gaussian approximation to its parent distribution when the sample size is large. This leads to a simple procedure of forming exclusion sets for the parameters of interest, which we call the Gaussian CLs method. Our work provides a self-contained mathematical proof for the Gaussian CLs method that explicitly outlines the required conditions. These conditions are milder than those required by Wilks' theorem to set confidence intervals (CIs). We illustrate the Gaussian CLs method in an example of searching for a sterile neutrino, where the CLs approach was rarely used before. We also compare data analysis results produced by the Gaussian CLs method and various CI methods to showcase their differences.
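
    As background for readers unfamiliar with the baseline (this is the standard definition following A. L. Read, not a result of this paper), the CLs criterion for a test statistic Q is

        CL_s = \frac{CL_{s+b}}{CL_b} = \frac{P(Q \le Q_{\mathrm{obs}} \mid s+b)}{P(Q \le Q_{\mathrm{obs}} \mid b)},

    and a signal hypothesis is excluded at confidence level 1 - \alpha when CL_s \le \alpha (e.g., \alpha = 0.05 for a 95% exclusion). The paper's contribution is the Gaussian approximation of the ΔT statistic, which makes these tail probabilities cheap to evaluate over a continuous parameter space.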

  18. Cumulative query method for influenza surveillance using search engine data.

    PubMed

    Seo, Dong-Woo; Jo, Min-Woo; Sohn, Chang Hwan; Shin, Soo-Yong; Lee, JaeHo; Yu, Maengsoo; Kim, Won Young; Lim, Kyoung Soo; Lee, Sang-Il

    2014-12-16

    Internet search queries have become an important data source in syndromic surveillance systems. However, there is currently no syndromic surveillance system using Internet search query data in South Korea. The objective of this study was to examine correlations between our cumulative query method and national influenza surveillance data. Our study was based on the local search engine, Daum (approximately 25% market share), and influenza-like illness (ILI) data from the Korea Centers for Disease Control and Prevention. A quota sampling survey was conducted with 200 participants to obtain popular queries. We divided the study period into two sets: Set 1 (the 2009/10 epidemiological year for development set 1 and 2010/11 for validation set 1) and Set 2 (2010/11 for development set 2 and 2011/12 for validation set 2). Pearson's correlation coefficients were calculated between the Daum data and the ILI data for the development sets. We selected the combined queries for which the correlation coefficients were .7 or higher and listed them in descending order. Then, we created cumulative query methods, where method n combines the top n queries in descending order of correlation coefficient. In validation set 1, 13 cumulative query methods were applied, and 8 had higher correlation coefficients (min=.916, max=.943) than that of the highest single combined query. Further, 11 of 13 cumulative query methods had an r value of ≥.7, but only 4 of 13 combined queries did. In validation set 2, 8 of 15 cumulative query methods showed higher correlation coefficients (min=.975, max=.987) than that of the highest single combined query. All 15 cumulative query methods had an r value of ≥.7, but only 6 of 15 combined queries did. The cumulative query method showed relatively higher correlation with national influenza surveillance data than combined queries in both the development and validation sets.
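
    The construction lends itself to a short sketch. Whether the study sums or averages the combined query series is not specified in this record, so averaging below is an assumption, and the data are synthetic rather than the study's Daum queries:

        import numpy as np

        def cumulative_query_methods(query_series, ili):
            """query_series: array of shape (q, weeks), rows pre-sorted by their
            individual correlation with ili (descending). Method n averages the
            top n rows; returns the Pearson r of each cumulative method."""
            rs = []
            for n in range(1, len(query_series) + 1):
                combined = query_series[:n].mean(axis=0)
                rs.append(np.corrcoef(combined, ili)[0, 1])
            return rs

        # Synthetic one-year weekly ILI series and three noisy query proxies.
        rng = np.random.default_rng(0)
        ili = rng.random(52)
        queries = np.array([ili + rng.normal(0, s, 52) for s in (0.1, 0.2, 0.4)])
        print(cumulative_query_methods(queries, ili))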

  20. A comparison of fully automated methods of data analysis and computer assisted heuristic methods in an electrode kinetic study of the pathologically variable [Fe(CN)6](3-/4-) process by AC voltammetry.

    PubMed

    Morris, Graham P; Simonov, Alexandr N; Mashkina, Elena A; Bordas, Rafel; Gillow, Kathryn; Baker, Ruth E; Gavaghan, David J; Bond, Alan M

    2013-12-17

    Fully automated and computer assisted heuristic data analysis approaches have been applied to a series of AC voltammetric experiments undertaken on the [Fe(CN)6](3-/4-) process at a glassy carbon electrode in 3 M KCl aqueous electrolyte. The recovered parameters in all forms of data analysis encompass E(0) (reversible potential), k(0) (heterogeneous charge transfer rate constant at E(0)), α (charge transfer coefficient), Ru (uncompensated resistance), and Cdl (double layer capacitance). The automated method of analysis employed time domain optimization and Bayesian statistics. This and all other methods assumed the Butler-Volmer model applies for electron transfer kinetics, planar diffusion for mass transport, Ohm's Law for Ru, and a potential-independent Cdl model. Heuristic approaches utilize combinations of Fourier Transform filtering, sensitivity analysis, and simplex-based forms of optimization applied to resolved AC harmonics and rely on experimenter experience to assist in experiment-theory comparisons. Remarkable consistency of parameter evaluation was achieved, although the fully automated time domain method provided consistently higher α values than those based on frequency domain data analysis. The origin of this difference is that the implemented fully automated method requires a perfect model for the double layer capacitance. In contrast, the importance of imperfections in the double layer model is minimized when analysis is performed in the frequency domain. Substantial variation in k(0) values was found by analysis of the 10 data sets for this highly surface-sensitive pathologically variable [Fe(CN)6](3-/4-) process, but remarkably, all fit the quasi-reversible model satisfactorily.

  1. Reexamining Our Bias against Heuristics

    ERIC Educational Resources Information Center

    McLaughlin, Kevin; Eva, Kevin W.; Norman, Geoff R.

    2014-01-01

    Using heuristics offers several cognitive advantages, such as increased speed and reduced effort when making decisions, in addition to allowing us to make decisions in situations where missing data do not allow for formal reasoning. But the traditional view of heuristics is that they trade accuracy for efficiency. Here the authors discuss sources…

  3. Memory-Based Decision-Making with Heuristics: Evidence for a Controlled Activation of Memory Representations

    ERIC Educational Resources Information Center

    Khader, Patrick H.; Pachur, Thorsten; Meier, Stefanie; Bien, Siegfried; Jost, Kerstin; Rosler, Frank

    2011-01-01

    Many of our daily decisions are memory based, that is, the attribute information about the decision alternatives has to be recalled. Behavioral studies suggest that for such decisions we often use simple strategies (heuristics) that rely on controlled and limited information search. It is assumed that these heuristics simplify decision-making by…

  5. Heuristics in Composition and Literary Criticism.

    ERIC Educational Resources Information Center

    McCarthy, B. Eugene

    1978-01-01

    Describes the "particle, wave, field" heuristic for gathering information, and shows how students can apply that heuristic in analyzing literature and in using procedures of historical criticism. (RL)

  6. A Geographical Heuristic Routing Protocol for VANETs.

    PubMed

    Urquiza-Aguiar, Luis; Tripp-Barba, Carolina; Aguilar Igartua, Mónica

    2016-09-23

    Vehicular ad hoc networks (VANETs) leverage the communication system of Intelligent Transportation Systems (ITS). Recently, Delay-Tolerant Network (DTN) routing protocols have increased their popularity among the research community for being used in non-safety VANET applications and services like traffic reporting. Vehicular DTN protocols use geographical and local information to make forwarding decisions. However, current proposals only consider the selection of the best candidate based on a local search. In this paper, we propose a generic Geographical Heuristic Routing (GHR) protocol that can be applied to any DTN geographical routing protocol that makes forwarding decisions hop by hop. GHR incorporates adaptations of the simulated annealing and Tabu-search meta-heuristics, which have largely been used to improve local-search results in discrete optimization. We include a complete performance evaluation of GHR in a multi-hop VANET simulation scenario for a reporting service. Our study analyzes all of the meaningful configurations of GHR and offers a statistical analysis of our findings by means of MANOVA tests. Our results indicate that the use of a Tabu list contributes to improving the packet delivery ratio by around 5% to 10%. Moreover, if Tabu is used, then the simulated annealing routing strategy performs better than selecting the best node with carry and forwarding (the default operation).
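
    A hedged sketch of a GHR-style forwarding decision follows. Here progress[n] stands for a neighbour's geographic progress toward the destination, and the acceptance rule mirrors simulated annealing; the names and data structures are illustrative assumptions, not the protocol's actual message fields.

        import math, random

        def ghr_next_hop(candidates, progress, tabu, temperature=1.0):
            # Exclude recently used nodes unless that would leave no candidate.
            usable = [n for n in candidates if n not in tabu] or list(candidates)
            best = max(usable, key=lambda n: progress[n])   # greedy local-search choice
            challenger = random.choice(usable)
            # Simulated-annealing step: sometimes accept a worse next hop to
            # escape local maxima of the purely greedy strategy.
            delta = progress[challenger] - progress[best]
            if delta >= 0 or random.random() < math.exp(delta / temperature):
                choice = challenger
            else:
                choice = best
            tabu.append(choice)          # remember the hop on the Tabu list
            return choice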

  7. A Geographical Heuristic Routing Protocol for VANETs

    PubMed Central

    Urquiza-Aguiar, Luis; Tripp-Barba, Carolina; Aguilar Igartua, Mónica

    2016-01-01

    Vehicular ad hoc networks (VANETs) leverage the communication system of Intelligent Transportation Systems (ITS). Recently, Delay-Tolerant Network (DTN) routing protocols have increased their popularity among the research community for being used in non-safety VANET applications and services like traffic reporting. Vehicular DTN protocols use geographical and local information to make forwarding decisions. However, current proposals only consider the selection of the best candidate based on a local search. In this paper, we propose a generic Geographical Heuristic Routing (GHR) protocol that can be applied to any DTN geographical routing protocol that makes forwarding decisions hop by hop. GHR incorporates adaptations of the simulated annealing and Tabu-search meta-heuristics, which have largely been used to improve local-search results in discrete optimization. We include a complete performance evaluation of GHR in a multi-hop VANET simulation scenario for a reporting service. Our study analyzes all of the meaningful configurations of GHR and offers a statistical analysis of our findings by means of MANOVA tests. Our results indicate that the use of a Tabu list contributes to improving the packet delivery ratio by around 5% to 10%. Moreover, if Tabu is used, then the simulated annealing routing strategy performs better than selecting the best node with carry and forwarding (the default operation). PMID:27669254

  8. A systematic method for search term selection in systematic reviews.

    PubMed

    Thompson, Jenna; Davis, Jacqueline; Mazerolle, Lorraine

    2014-06-01

    The wide variety of readily available electronic media grants anyone the freedom to retrieve published references from almost any area of research around the world. Despite this privilege, keeping up with primary research evidence is almost impossible because of the increase in professional publishing across disciplines. Systematic reviews are a solution to this problem as they aim to synthesize all current information on a particular topic and present a balanced and unbiased summary of the findings. They are fast becoming an important method of research across a number of fields, yet only a small number of guidelines exist on how to define and select terms for a systematic search. This article presents a replicable method for selecting terms in a systematic search using the semantic concept recognition software called leximancer (Leximancer, University of Queensland, Brisbane, Australia). We use this software to construct a set of terms from a corpus of literature pertaining to transborder interventions for drug control and discuss the applicability of this method to systematic reviews in general. This method aims to contribute a more 'systematic' approach for selecting terms in a manner that is entirely replicable for any user.

  9. How the twain can meet: Prospect theory and models of heuristics in risky choice.

    PubMed

    Pachur, Thorsten; Suter, Renata S; Hertwig, Ralph

    2017-03-01

    Two influential approaches to modeling choice between risky options are algebraic models (which focus on predicting the overt decisions) and models of heuristics (which are also concerned with capturing the underlying cognitive process). Because they rest on fundamentally different assumptions and algorithms, the two approaches are usually treated as antithetical, or even incommensurable. Drawing on cumulative prospect theory (CPT; Tversky & Kahneman, 1992) as the currently most influential instance of a descriptive algebraic model, we demonstrate how the two modeling traditions can be linked. CPT's algebraic functions characterize choices in terms of psychophysical (diminishing sensitivity to probabilities and outcomes) as well as psychological (risk aversion and loss aversion) constructs. Models of heuristics characterize choices as rooted in simple information-processing principles such as lexicographic and limited search. In computer simulations, we estimated CPT's parameters for choices produced by various heuristics. The resulting CPT parameter profiles portray each of the choice-generating heuristics in psychologically meaningful ways, capturing, for instance, differences in how the heuristics process probability information. Furthermore, CPT parameters can reflect a key property of many heuristics, lexicographic search, and track the environment-dependent behavior of heuristics. Finally, we show, both in an empirical and a model recovery study, how CPT parameter profiles can be used to detect the operation of heuristics. We also address the limits of CPT's ability to capture choices produced by heuristics. Our results highlight an untapped potential of CPT as a measurement tool to characterize the information processing underlying risky choice.
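
    For reference, CPT's standard algebraic pieces (Tversky & Kahneman, 1992) are compact enough to state in code; the default parameter values below are their published median estimates.

        def cpt_value(x, alpha=0.88, beta=0.88, lam=2.25):
            # Value function: concave for gains, convex and steeper (loss
            # aversion, lambda) for losses.
            return x ** alpha if x >= 0 else -lam * (-x) ** beta

        def cpt_weight(p, gamma=0.61):
            # Inverse-S-shaped probability weighting: overweights small p,
            # underweights moderate-to-large p.
            return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

        # Subjective value of a simple gamble: "win 100 with p = .3, else 0".
        v = cpt_weight(0.3) * cpt_value(100.0)
        print(v)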

  10. Search a methane hydrate in the Arctic with photonics methods

    NASA Astrophysics Data System (ADS)

    Grishkanich, Alexsandr S.; Polyakov, Vadim; Sidorov, Igor; Kascheev, Sergey; Elizarov, Valentin; Zhevlakov, Aleksandr; Mak, Andrey

    2016-04-01

    Identifying the methane anomalies responsible for the temperature increase by hiking trails in the Arctic requires great human labor. Lidar methods are therefore needed for the search and identification of methane from permafrost, and a Raman lidar must be created for monitoring emissions of methane hydrate from the permafrost. Hyperspectral resolution would resolve the isotope shifts in the Stokes spectra, making it possible to determine the isotopic carbon ratio C14/C12 of the CH4 emissions and thereby identify the source under study (permafrost or oil deposits).

  11. Improving Nearest Neighbour Search in 3d Spatial Access Method

    NASA Astrophysics Data System (ADS)

    Suhaibaha, A.; Rahman, A. A.; Uznir, U.; Anton, F.; Mioc, D.

    2016-10-01

    Nearest Neighbour (NN) is one of the important queries and analyses for spatial applications. In normal practice, a spatial access method structure is used during Nearest Neighbour query execution to retrieve information from the database. However, most spatial access method structures still face unresolved issues such as overlap among nodes and repetitive data entries. This situation leads to excessive Input/Output (I/O) operations, which is inefficient for data retrieval, and it becomes more critical when dealing with 3D data, whose size is usually large due to its detailed geometry and other attached information. In this research, a clustered 3D hierarchical structure is introduced as a 3D spatial access method structure. The structure is expected to improve the retrieval of Nearest Neighbour information for 3D objects. Several tests are performed for single Nearest Neighbour search and k Nearest Neighbour (kNN) search. The tests indicate that the clustered hierarchical structure handles Nearest Neighbour queries more efficiently than its competitor: it reduces repetitive data entries and the number of accessed pages, requires minimal Input/Output operations, and achieves better query response time. For the future outlook of this research, several possible applications are discussed and summarized.
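
    For contrast with the clustered structure proposed here, a conventional baseline for 3D nearest-neighbour queries is a KD-tree; the snippet below is that baseline, not the paper's method.

        import numpy as np
        from scipy.spatial import cKDTree

        points = np.random.rand(10000, 3)              # 3D object centroids
        tree = cKDTree(points)                         # build the index once
        dist, idx = tree.query([0.5, 0.5, 0.5], k=5)   # 5 nearest neighbours
        print(dist, idx)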

  12. Maximal area and conformal welding heuristics for optimal slice selection in splenic volume estimation

    NASA Astrophysics Data System (ADS)

    Gutenko, Ievgeniia; Peng, Hao; Gu, Xianfeng; Barish, Mathew; Kaufman, Arie

    2016-03-01

    Accurate estimation of splenic volume is crucial for the determination of disease progression and response to treatment for diseases that result in enlargement of the spleen. However, there is no consensus with respect to the use of single or multiple one-dimensional, or volumetric measurement. Existing methods for human reviewers focus on measurement of cross diameters on a representative axial slice and craniocaudal length of the organ. We propose two heuristics for the selection of the optimal axial plane for splenic volume estimation: the maximal area axial measurement heuristic and the novel conformal welding shape-based heuristic. We evaluate these heuristics on time-variant data derived from both healthy and sick subjects and contrast them to established heuristics. Under certain conditions our heuristics are superior to standard practice volumetric estimation methods. We conclude by providing guidance on selecting the optimal heuristic for splenic volume estimation.

  13. The Hybrid Search: A Mass Spectral Library Search Method for Discovery of Modifications in Proteomics.

    PubMed

    Burke, Meghan Catherine; Mirokhin, Yuri A; Tchekhovskoi, Dmitrii V; Markey, Sanford P; Heidbrink Thompson, Jenny L; Larkin, Christopher; Stein, Stephen E

    2017-04-03

    We present a mass spectral library based method to identify tandem mass spectra of peptides that contain unanticipated modifications and amino acid variants. We describe this as a 'hybrid' method because it combines matching both ion m/z and mass losses. The losses are differences in mass between an ion peak and its precursor mass. This difference, termed DeltaMass, is used to shift the product ions in the library spectrum that contain the modification, thereby allowing library product ions that contain the unexpected modification to match the query spectrum. Clustered unidentified spectra from the Clinical Proteomic Tumor Analysis Consortium (CPTAC) and Chinese hamster ovary cells were used to evaluate this method. Results demonstrate the ability of the hybrid method to identify unanticipated modifications, insertions, and deletions, which may include those due to an incomplete protein sequence database or to search settings that exclude the correct identification, in high resolution tandem mass spectra without regard to their precursor mass. This has been made possible by indexing the m/z values of each fragment ion and their difference in mass from their precursor ion.
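
    A toy sketch of the shifted-matching idea follows; real implementations index fragment m/z values and score with intensities, both of which this illustration omits.

        def hybrid_match(query_peaks, lib_peaks, delta_mass, tol=0.01):
            # Count library product ions matching the query either directly
            # (fragment does not carry the modification) or after shifting by
            # DeltaMass (fragment carries the unexpected modification).
            def close(a, b):
                return abs(a - b) <= tol

            matches = 0
            for mz in lib_peaks:
                direct = any(close(mz, q) for q in query_peaks)
                shifted = any(close(mz + delta_mass, q) for q in query_peaks)
                matches += direct or shifted
            return matches

        # Example: a +79.966 Da shift (phosphorylation-sized DeltaMass).
        print(hybrid_match([300.16, 480.27, 560.24], [300.16, 480.27], 79.97))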

  14. Heuristic scenario builder for power system operator training

    SciTech Connect

    Irisarri, G.; Rafian, M.; Miller, B.N.; Dobrowolski, E.J.

    1992-05-01

    The Heuristic Scenario Builder (HSB), a knowledge-based training scenario builder for the EPRI Operator Training Simulator (OTS), is described in this paper. Expert systems and heuristic searches are used in the HSB to find training scenarios that closely fit trainee profiles and that address particular training requirements. Expert knowledge obtained from instructors and other operations personnel is used throughout the HSB to determine the scenarios. The HSB is an integral part of the OTS and is currently in operation at Philadelphia Electric's OTS installation.

  15. Critical Systems Heuristics

    NASA Astrophysics Data System (ADS)

    Ulrich, Werner; Reynolds, Martin

    Critical systems heuristics (CSH) is a framework for reflective professional practice organised around the central tool of boundary critique. This paper, written jointly by the original developer, Werner Ulrich, and Martin Reynolds, an experienced practitioner of CSH, offers a systematic introduction to the idea and use of boundary critique. Its core concepts are explained in detail and their use is illustrated by means of two case studies from the domain of environmental planning and management. A particular focus is on working constructively with tensions between opposing perspectives as they arise in many situations of professional intervention. These include tensions such as ‘situation' versus ‘system', ‘is' versus ‘ought' judgements, concerns of ‘those involved' versus ‘those affected but not involved', stakeholders' ‘stakes' versus ‘stakeholding issues', and others. Accordingly, boundary critique is presented as a participatory process of unfolding and questioning boundary judgements rather than as an expert-driven process of boundary setting. The paper concludes with a discussion of some essential skills and considerations regarding the practice of boundary critique.

  16. Heuristics for Job-Shop Scheduling

    DTIC Science & Technology

    1988-01-01

    Keywords: scheduling; job-shop; heuristic; geometric. From a thesis submitted in partial fulfillment of the requirements for the degree of Doctor of Science in Mechanical Engineering; the abstract introduces two methods of obtaining approximate... (remainder of the record is documentation-form and table-of-contents residue)

  17. SPARSE: quadratic time simultaneous alignment and folding of RNAs without sequence-based heuristics.

    PubMed

    Will, Sebastian; Otto, Christina; Miladi, Milad; Möhl, Mathias; Backofen, Rolf

    2015-08-01

    RNA-Seq experiments have revealed a multitude of novel ncRNAs. The gold standard for their analysis based on simultaneous alignment and folding suffers from extreme time complexity of O(n⁶). Subsequently, numerous faster 'Sankoff-style' approaches have been suggested. Commonly, the performance of such methods relies on sequence-based heuristics that restrict the search space to optimal or near-optimal sequence alignments; however, the accuracy of sequence-based methods breaks down for RNAs with sequence identities below 60%. Alignment approaches like LocARNA, which do not require sequence-based heuristics, have been limited to high complexity (≥ quartic time). Breaking this barrier, we introduce the novel Sankoff-style algorithm 'sparsified prediction and alignment of RNAs based on their structure ensembles (SPARSE)', which runs in quadratic time without sequence-based heuristics. To achieve this low complexity, on par with sequence alignment algorithms, SPARSE features strong sparsification based on structural properties of the RNA ensembles. Following PMcomp, SPARSE gains further speed-up from lightweight energy computation. Although all existing lightweight Sankoff-style methods restrict Sankoff's original model by disallowing loop deletions and insertions, SPARSE transfers the Sankoff algorithm to the lightweight energy model completely for the first time. Compared with LocARNA, SPARSE achieves similar alignment and better folding quality in significantly less time (speedup: 3.7). At similar run-time, it aligns low sequence identity instances substantially more accurately than RAF, which uses sequence-based heuristics. © The Author 2015. Published by Oxford University Press.

  18. SPARSE: quadratic time simultaneous alignment and folding of RNAs without sequence-based heuristics

    PubMed Central

    Will, Sebastian; Otto, Christina; Miladi, Milad; Möhl, Mathias; Backofen, Rolf

    2015-01-01

    Motivation: RNA-Seq experiments have revealed a multitude of novel ncRNAs. The gold standard for their analysis based on simultaneous alignment and folding suffers from extreme time complexity of O(n⁶). Subsequently, numerous faster ‘Sankoff-style’ approaches have been suggested. Commonly, the performance of such methods relies on sequence-based heuristics that restrict the search space to optimal or near-optimal sequence alignments; however, the accuracy of sequence-based methods breaks down for RNAs with sequence identities below 60%. Alignment approaches like LocARNA, which do not require sequence-based heuristics, have been limited to high complexity (≥ quartic time). Results: Breaking this barrier, we introduce the novel Sankoff-style algorithm ‘sparsified prediction and alignment of RNAs based on their structure ensembles (SPARSE)’, which runs in quadratic time without sequence-based heuristics. To achieve this low complexity, on par with sequence alignment algorithms, SPARSE features strong sparsification based on structural properties of the RNA ensembles. Following PMcomp, SPARSE gains further speed-up from lightweight energy computation. Although all existing lightweight Sankoff-style methods restrict Sankoff’s original model by disallowing loop deletions and insertions, SPARSE transfers the Sankoff algorithm to the lightweight energy model completely for the first time. Compared with LocARNA, SPARSE achieves similar alignment and better folding quality in significantly less time (speedup: 3.7). At similar run-time, it aligns low sequence identity instances substantially more accurately than RAF, which uses sequence-based heuristics. Availability and implementation: SPARSE is freely available at http://www.bioinf.uni-freiburg.de/Software/SPARSE. Contact: backofen@informatik.uni-freiburg.de Supplementary information: Supplementary data are available at Bioinformatics online. PMID:25838465

  19. The use of JOIS through the CAPTAIN : Provision of simple searching methods

    NASA Astrophysics Data System (ADS)

    Takano, Katsuhiro

    JICST, in cooperation with A.M.S. Co. Ltd. and Kokusai Sogo Database K.K., has started services for two systems that make it possible to use JOIS through CAPTAIN. They provide three types of searching methods that incorporate features of CAPTAIN. The first is theme searching: a theme can be searched simply by selecting a science and technology subject area number or a theme number registered on the CAPTAIN screen. The second is instructed searching: searches are performed by entering particular numbers or search terms according to the searching instructions that appear on the screen. The last is direct searching: searches are performed by directly entering commands that correspond to JOIS commands. This paper outlines the systems that connect JOIS to CAPTAIN, centering on these searching methods.

  20. Automating the search of molecular motor templates by evolutionary methods.

    PubMed

    Fernández, Jose D; Vico, Francisco J

    2011-11-01

    Biological molecular motors are nanoscale devices capable of transforming chemical energy into mechanical work, which are being researched in many scientific disciplines. From a computational point of view, the characteristics and dynamics of these motors are studied at multiple time scales, ranging from very detailed and complex molecular dynamics simulations spanning a few microseconds, to extremely simple and coarse-grained theoretical models of their working cycles. However, this research is performed only in the (relatively few) instances known from molecular biology. In this work, results from elastic network analysis and behaviour-finding methods are applied to explore a subset of the configuration space of template molecular structures that are able to transform chemical energy into directed movement, for a fixed instance of working cycle. While using methods based on elastic networks limits the scope of our results, it enables the implementation of computationally lightweight methods, in a way that evolutionary search techniques can be applied to discover novel molecular motor templates. The results show that molecular motion can be attained from a variety of structural configurations, when a functional working cycle is provided. Additionally, these methods enable a new computational way to test hypotheses about molecular motors. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  1. Complex Chemical Reaction Networks from Heuristics-Aided Quantum Chemistry.

    PubMed

    Rappoport, Dmitrij; Galvin, Cooper J; Zubarev, Dmitry Yu; Aspuru-Guzik, Alán

    2014-03-11

    While structures and reactivities of many small molecules can be computed efficiently and accurately using quantum chemical methods, heuristic approaches remain essential for modeling complex structures and large-scale chemical systems. Here, we present a heuristics-aided quantum chemical methodology applicable to complex chemical reaction networks such as those arising in cell metabolism and prebiotic chemistry. Chemical heuristics offer an expedient way of traversing high-dimensional reactive potential energy surfaces and are combined here with quantum chemical structure optimizations, which yield the structures and energies of the reaction intermediates and products. Application of heuristics-aided quantum chemical methodology to the formose reaction reproduces the experimentally observed reaction products, major reaction pathways, and autocatalytic cycles.

  2. Search space reduction with multiset for effectively solving the container pre-marshalling problem

    NASA Astrophysics Data System (ADS)

    Koike, Hidekatsu

    2017-09-01

    The task of finding the minimum sequence of container movements that transforms an initial bay into a bay that does not require redundant relocations during an actual loading operation is referred to as the container pre-marshalling problem (CPMP). A majority of existing approaches focus only on improving computational efficiency through heuristics, which may not guarantee the optimality of obtained solutions. This paper discusses an efficient search method for obtaining optimal solutions of the CPMP. Unlike most existing approaches, which introduce heuristics for efficiency, this paper pursues an optimization method that preserves optimality without heuristics. Further, through experimental results, we demonstrate how the proposed approach surpasses heuristic approaches in both optimality and efficiency in large-scale problem instances.

  3. When decision heuristics and science collide.

    PubMed

    Yu, Erica C; Sprenger, Amber M; Thomas, Rick P; Dougherty, Michael R

    2014-04-01

    The ongoing discussion among scientists about null-hypothesis significance testing and Bayesian data analysis has led to speculation about the practices and consequences of "researcher degrees of freedom." This article advances this debate by asking the broader questions that we, as scientists, should be asking: How do scientists make decisions in the course of doing research, and what is the impact of these decisions on scientific conclusions? We asked practicing scientists to collect data in a simulated research environment, and our findings show that some scientists use data collection heuristics that deviate from prescribed methodology. Monte Carlo simulations show that data collection heuristics based on p values lead to biases in estimated effect sizes and Bayes factors and to increases in both false-positive and false-negative rates, depending on the specific heuristic. We also show that using Bayesian data collection methods does not eliminate these biases. Thus, our study highlights the little appreciated fact that the process of doing science is a behavioral endeavor that can bias statistical description and inference in a manner that transcends adherence to any particular statistical framework.
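
    The flavor of the Monte Carlo demonstration can be reproduced in a few lines: test after every batch of observations and stop at the first p < .05. The specific batch sizes below are arbitrary choices for illustration, not the authors' design.

        import numpy as np
        from scipy.stats import ttest_1samp

        rng = np.random.default_rng(0)

        def optional_stopping(start=10, step=5, n_max=100, alpha=0.05):
            # Null is true (mean 0), yet we check after every batch and stop
            # as soon as the test is "significant".
            data = list(rng.normal(0.0, 1.0, start))
            while len(data) <= n_max:
                if ttest_1samp(data, 0.0).pvalue < alpha:
                    return True              # a false positive gets reported
                data.extend(rng.normal(0.0, 1.0, step))
            return False

        rate = sum(optional_stopping() for _ in range(2000)) / 2000
        print(rate)   # well above the nominal 0.05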

  4. A navigation flow map method of representing students' searching behaviors and strategies on the web, with relation to searching outcomes.

    PubMed

    Lin, Chia-Ching; Tsai, Chin-Chung

    2007-10-01

    To acquire a better understanding of the online search strategies that students employ to use the Internet, this study investigated six university students' approaches to Web-based information searches. A new method, called navigation flow map (NFM), is presented that graphically displays the fluid and multilayered relationships between Web navigation and information retrieval that students use while navigating the Web. To document the application of NFM, the Web search strategies of six university students were analyzed as they used the Internet to perform two different tasks: scientific-based and social studies-based information searches. Through protocol analyses using the NFM method, the students' searching strategies were categorized into two types: Match or Exploration. The findings revealed that participants with an Exploration approach had more complicated and richer task-specific ways of searching information than those with a Match approach; and further, through between-task comparisons, we found that participants appeared to use different searching strategies to process natural science information compared to social studies information. Finally, the participants in the Exploration group also exhibited better task performance on the criterion measures than those in the Match group.

  5. Précis of Simple heuristics that make us smart.

    PubMed

    Todd, P M; Gigerenzer, G

    2000-10-01

    How can anyone be rational in a world where knowledge is limited, time is pressing, and deep thought is often an unattainable luxury? Traditional models of unbounded rationality and optimization in cognitive science, economics, and animal behavior have tended to view decision-makers as possessing supernatural powers of reason, limitless knowledge, and endless time. But understanding decisions in the real world requires a more psychologically plausible notion of bounded rationality. In Simple heuristics that make us smart (Gigerenzer et al. 1999), we explore fast and frugal heuristics--simple rules in the mind's adaptive toolbox for making decisions with realistic mental resources. These heuristics can enable both living organisms and artificial systems to make smart choices quickly and with a minimum of information by exploiting the way that information is structured in particular environments. In this précis, we show how simple building blocks that control information search, stop search, and make decisions can be put together to form classes of heuristics, including: ignorance-based and one-reason decision making for choice, elimination models for categorization, and satisficing heuristics for sequential search. These simple heuristics perform comparably to more complex algorithms, particularly when generalizing to new data--that is, simplicity leads to robustness. We present evidence regarding when people use simple heuristics and describe the challenges to be addressed by this research program.
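
    As one concrete instance of the "stop search" building block, a satisficing rule for sequential search can be stated in a few lines (the aspiration level is assumed to be given):

        def satisfice(options, aspiration):
            # Take the first option meeting the aspiration level instead of
            # examining all options in search of the best one.
            for opt in options:
                if opt >= aspiration:
                    return opt       # stop immediately; later options are never seen
            return None              # search exhausted without a satisfactory option

        print(satisfice([0.2, 0.5, 0.9, 0.97, 0.4], aspiration=0.8))   # -> 0.9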

  6. Climate adaptation heuristics and the science/policy divide

    SciTech Connect

    Preston, Benjamin L.; Mustelin, Johanna; Maloney, Megan C.

    2013-09-05

    The adaptation science enterprise has expanded rapidly in recent years, presumably in response to growth in demand for knowledge that can facilitate adaptation policy and practice. However, evidence suggests such investments in adaptation science have not necessarily translated into adaptation implementation. One potential constraint on adaptation may be the underlying heuristics that are used as the foundation for both adaptation research and practice. In this paper, we explore the adaptation academic literature with the objective of identifying adaptation heuristics, assessing the extent to which they have become entrenched within the adaptation discourse, and discussing potential weaknesses in their framing that could undermine adaptation efforts. This investigation is supported by a multi-method analysis that includes both a quantitative content analysis of the adaptation literature that evidences the use of adaptation heuristics and a qualitative analysis of the implications of such heuristics for enhancing or hindering the implementation of adaptation. Results demonstrate that a number of heuristic devices are commonly used in both the peer-reviewed adaptation literature as well as within grey literature designed to inform adaptation practitioners. Furthermore, the apparent lack of critical reflection upon the robustness of these heuristics for diverse contexts may contribute to potential cognitive bias with respect to the framing of adaptation by both researchers and practitioners. Finally, we discuss this phenomenon by drawing upon heuristic-analytic theory, which has explanatory utility in understanding both the origins of such heuristics as well as the measures that can be pursued toward the co-generation of more robust approaches to adaptation problem-solving.

  7. Climate adaptation heuristics and the science/policy divide

    DOE PAGES

    Preston, Benjamin L.; Mustelin, Johanna; Maloney, Megan C.

    2013-09-05

    The adaptation science enterprise has expanded rapidly in recent years, presumably in response to growth in demand for knowledge that can facilitate adaptation policy and practice. However, evidence suggests such investments in adaptation science have not necessarily translated into adaptation implementation. One potential constraint on adaptation may be the underlying heuristics that are used as the foundation for both adaptation research and practice. In this paper, we explore the adaptation academic literature with the objective of identifying adaptation heuristics, assessing the extent to which they have become entrenched within the adaptation discourse, and discussing potential weaknesses in their framing that could undermine adaptation efforts. This investigation is supported by a multi-method analysis that includes both a quantitative content analysis of the adaptation literature that evidences the use of adaptation heuristics and a qualitative analysis of the implications of such heuristics for enhancing or hindering the implementation of adaptation. Results demonstrate that a number of heuristic devices are commonly used in both the peer-reviewed adaptation literature as well as within grey literature designed to inform adaptation practitioners. Furthermore, the apparent lack of critical reflection upon the robustness of these heuristics for diverse contexts may contribute to potential cognitive bias with respect to the framing of adaptation by both researchers and practitioners. Finally, we discuss this phenomenon by drawing upon heuristic-analytic theory, which has explanatory utility in understanding both the origins of such heuristics as well as the measures that can be pursued toward the co-generation of more robust approaches to adaptation problem-solving.

  8. Generating effective project scheduling heuristics by abstraction and reconstitution

    NASA Technical Reports Server (NTRS)

    Janakiraman, Bhaskar; Prieditis, Armand

    1992-01-01

    A project scheduling problem consists of a finite set of jobs, each with fixed integer duration, requiring one or more resources such as personnel or equipment, and each subject to a set of precedence relations, which specify allowable job orderings, and a set of mutual exclusion relations, which specify jobs that cannot overlap. No job can be interrupted once started. The objective is to minimize project duration. This objective arises in nearly every large construction project--from software to hardware to buildings. Because such project scheduling problems are NP-hard, they are typically solved by branch-and-bound algorithms. In these algorithms, lower-bound duration estimates (admissible heuristics) are used to improve efficiency. One way to obtain an admissible heuristic is to remove (abstract) all resources and mutual exclusion constraints and then obtain the minimal project duration for the abstracted problem; this minimal duration is the admissible heuristic. Although such abstracted problems can be solved efficiently, they yield inaccurate admissible heuristics precisely because those constraints that are central to solving the original problem are abstracted. This paper describes a method to reconstitute the abstracted constraints back into the solution to the abstracted problem while maintaining efficiency, thereby generating better admissible heuristics. Our results suggest that reconstitution can make good admissible heuristics even better.
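
    The abstraction step alone already yields a familiar admissible heuristic: dropping resource and mutual exclusion constraints leaves a precedence DAG whose critical path length lower-bounds the project duration. A minimal sketch:

        from functools import lru_cache

        def critical_path_lower_bound(durations, preds):
            # durations: job -> integer duration; preds: job -> predecessor jobs.
            # With only precedences left, the longest chain of jobs is a lower
            # bound (admissible heuristic) on the true project duration.
            @lru_cache(maxsize=None)
            def finish(job):
                return durations[job] + max((finish(p) for p in preds.get(job, [])),
                                            default=0)
            return max(finish(j) for j in durations)

        jobs = {"a": 3, "b": 2, "c": 4, "d": 1}
        prec = {"c": ["a", "b"], "d": ["c"]}
        print(critical_path_lower_bound(jobs, prec))   # 3 + 4 + 1 = 8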

  9. Fluency heuristic: a model of how the mind exploits a by-product of information retrieval.

    PubMed

    Hertwig, Ralph; Herzog, Stefan M; Schooler, Lael J; Reimer, Torsten

    2008-09-01

    Boundedly rational heuristics for inference can be surprisingly accurate and frugal for several reasons. They can exploit environmental structures, co-opt complex capacities, and elude effortful search by exploiting information that automatically arrives on the mental stage. The fluency heuristic is a prime example of a heuristic that makes the most of an automatic by-product of retrieval from memory, namely, retrieval fluency. In 4 experiments, the authors show that retrieval fluency can be a proxy for real-world quantities, that people can discriminate between two objects' retrieval fluencies, and that people's inferences are in line with the fluency heuristic (in particular fast inferences) and with experimentally manipulated fluency. The authors conclude that the fluency heuristic may be one tool in the mind's repertoire of strategies that artfully probes memory for encapsulated frequency information that can veridically reflect statistical regularities in the world.

  10. A hybrid approach using chaotic dynamics and global search algorithms for combinatorial optimization problems

    NASA Astrophysics Data System (ADS)

    Igeta, Hideki; Hasegawa, Mikio

    Chaotic dynamics have been effectively applied to improve various heuristic algorithms for combinatorial optimization problems in many studies. Currently, the most widely used chaotic optimization scheme is to drive heuristic solution-search algorithms applicable to large-scale problems by chaotic neurodynamics that include the tabu effect of the tabu search. Alternatively, meta-heuristic algorithms are used for combinatorial optimization by combining a neighboring-solution search algorithm, such as a tabu, gradient, or other search method, with a global search algorithm, such as genetic algorithms (GA), ant colony optimization (ACO), or others. Among these hybrid approaches, ACO has effectively optimized the solutions of many benchmark problems in the quadratic assignment problem library. In this paper, we propose a novel hybrid method that combines an effective chaotic search algorithm, which performs better than the tabu search, with global search algorithms such as ACO and GA. Our results show that the proposed chaotic hybrid algorithm has better performance than the conventional chaotic search and conventional hybrid algorithms. In addition, we show that the chaotic search algorithm combined with ACO performs better than when it is combined with GA.

  11. Phylogenomic Methods to Guide Paleontological Searches for the Early Cyanobacteria

    NASA Astrophysics Data System (ADS)

    Blank, C. E.

    2004-12-01

    Phylogenomic methods can help paleontologists target their searches for early microbial microfossils and potentially help them better interpret the early fossil record. In this study, the deep-branching relationships in the cyanobacteria were resolved using whole genome sequences, multiple genes for taxa lacking genomes, and intein presence/absence in the DnaE protein. Once a framework tree was produced, characters were mapped onto the tree. Characters included morphology (unicellular vs. filamentous), habitat (marine vs. freshwater), metabolism (use of sulfide as electron donor, nitrogen fixation), presence/absence of complex morphological traits (akinetes, heterocysts, hormogonia), salt tolerance, and thermal tolerance. It was found that the earliest cyanobacteria were unicellular coccoids, with cell diameters < 2 microns, that lived in freshwater environments. This suggests that paleontologists should focus their searches for the earliest cyanobacteria on freshwater deposits (lakes, streams) and on small-diameter coccoids (not mats, not filaments). The earliest "cyanobacterial" microfossils (Eosynechococcus and Eoentophysalis) are large-diameter coccoids found in shallow marine platform carbonates. Because these cells have large diameters, if they were cyanobacteria one would also expect to see their sister taxa in the fossil record (i.e., large-diameter filamentous forms with sheaths, also akinetes). Because these are not found until 2.0 Ga (and akinetes until 1.5 Ga), this suggests that these earliest microfossils are not cyanobacteria. There are several instances in the cyanobacterial tree where ancestors with low salt tolerance gave rise to lineages that grow in brackish, marine, and/or hypersaline environments. This suggests that either the cyanobacteria first originated on continents and later colonized more saline environments, or that the cyanobacteria first originated in shallow "seas" that were not very saline but gradually became more saline by about

  12. Regarding Chilcott's "Structural Functionalism as a Heuristic Device" Heuristically.

    ERIC Educational Resources Information Center

    Blot, Richard K.

    1998-01-01

    The heuristic value of Chilcott's essay lies less in its support for structural functionalism and more in its concern to reexamine theory in the work of earlier educational anthropologists for what earlier theories and practices can add to current research. (SLD)

  13. Searching for Truth: Internet Search Patterns as a Method of Investigating Online Responses to a Russian Illicit Drug Policy Debate

    PubMed Central

    Zheluk, Andrey; Gillespie, James A; Quinn, Casey

    2012-01-01

    Background: This is a methodological study investigating the online responses to a national debate over an important health and social problem in Russia. Russia is the largest Internet market in Europe, exceeding Germany in the absolute number of users. However, Russia is unusual in that the main search provider is not Google, but Yandex. Objective: This study had two main objectives. First, to validate Yandex search patterns against those provided by Google, and second, to test this method's adequacy for investigating online interest in a 2010 national debate over Russian illicit drug policy. We hoped to learn what search patterns and specific search terms could reveal about the relative importance and geographic distribution of interest in this debate. Methods: A national drug debate, centering on the anti-drug campaigner Egor Bychkov, was one of the main Russian domestic news events of 2010. Public interest in this episode was accompanied by increased Internet search. First, we measured the search patterns for 13 search terms related to the Bychkov episode and concurrent domestic events by extracting data from Google Insights for Search (GIFS) and Yandex WordStat (YaW). We conducted Spearman Rank Correlation of GIFS and YaW search data series. Second, we coded all 420 primary posts from Bychkov's personal blog between March 2010 and March 2012 to identify the main themes. Third, we compared GIFS and Yandex policies concerning the public release of search volume data. Finally, we established the relationship between salient drug issues and the Bychkov episode. Results: We found a consistent pattern of strong to moderate positive correlations between Google and Yandex for the terms "Egor Bychkov" (r(s) = 0.88, P < .001), “Bychkov” (r(s) = .78, P < .001) and “Khimki” (r(s) = 0.92, P < .001). Peak search volumes for the Bychkov episode were comparable to other prominent domestic political events during 2010. Monthly search counts were 146,689 for “Bychkov” and

  14. Heuristic reconstructions of neutron penumbral images

    SciTech Connect

    Nozaki, Shinya; Chen Yenwei

    2004-10-01

    Penumbral imaging is a technique of coded aperture imaging proposed for imaging of highly penetrating radiations. To date, the penumbral imaging technique has been successfully applied to neutron imaging in laser fusion experiments. Since the reconstruction of penumbral images is based on linear deconvolution methods, such as the inverse filter and Wiener filter, the point spread function of apertures should be space invariant; it is also sensitive to the noise contained in penumbral images. In this article, we propose a new heuristic reconstruction method for neutron penumbral imaging, which can be used for a space-variant imaging system and is also very tolerant to the noise.

  15. Searching for truth: internet search patterns as a method of investigating online responses to a Russian illicit drug policy debate.

    PubMed

    Zheluk, Andrey; Gillespie, James A; Quinn, Casey

    2012-12-13

    This is a methodological study investigating the online responses to a national debate over an important health and social problem in Russia. Russia is the largest Internet market in Europe, exceeding Germany in the absolute number of users. However, Russia is unusual in that the main search provider is not Google, but Yandex. This study had two main objectives. First, to validate Yandex search patterns against those provided by Google, and second, to test this method's adequacy for investigating online interest in a 2010 national debate over Russian illicit drug policy. We hoped to learn what search patterns and specific search terms could reveal about the relative importance and geographic distribution of interest in this debate. A national drug debate, centering on the anti-drug campaigner Egor Bychkov, was one of the main Russian domestic news events of 2010. Public interest in this episode was accompanied by increased Internet search. First, we measured the search patterns for 13 search terms related to the Bychkov episode and concurrent domestic events by extracting data from Google Insights for Search (GIFS) and Yandex WordStat (YaW). We conducted Spearman Rank Correlation of GIFS and YaW search data series. Second, we coded all 420 primary posts from Bychkov's personal blog between March 2010 and March 2012 to identify the main themes. Third, we compared GIFS and Yandex policies concerning the public release of search volume data. Finally, we established the relationship between salient drug issues and the Bychkov episode. We found a consistent pattern of strong to moderate positive correlations between Google and Yandex for the terms "Egor Bychkov" (r(s) = 0.88, P < .001), "Bychkov" (r(s) = .78, P < .001) and "Khimki"(r(s) = 0.92, P < .001). Peak search volumes for the Bychkov episode were comparable to other prominent domestic political events during 2010. Monthly search counts were 146,689 for "Bychkov" and 48,084 for "Egor Bychkov", compared to 53

  16. Fast or Frugal, but Not Both: Decision Heuristics Under Time Pressure.

    PubMed

    Bobadilla-Suarez, Sebastian; Love, Bradley C

    2017-05-29

    Heuristics are simple, yet effective, strategies that people use to make decisions. Because heuristics do not require all available information, they are thought to be easy to implement and to not tax limited cognitive resources, which has led heuristics to be characterized as fast-and-frugal. We question this monolithic conception of heuristics by contrasting the cognitive demands of two popular heuristics, Tallying and Take-the-Best. We contend that heuristics that are frugal in terms of information usage may not always be fast because of the attentional control required to implement this focus in certain contexts. In support of this hypothesis, we find that Take-the-Best, while being more frugal in terms of information usage, is slower to implement and fares worse under time pressure manipulations than Tallying. This effect is then reversed when search costs for Take-the-Best are reduced by changing the format of the stimuli. These findings suggest that heuristics are heterogeneous and should be unpacked according to their cognitive demands to determine the circumstances in which a heuristic best applies. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
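
    The contrast is easy to see in code: Take-the-Best inspects cues one at a time in validity order and decides on the first discriminating cue, whereas Tallying is an unordered unit-weight count. A minimal sketch with binary cues:

        def take_the_best(cues_a, cues_b, validity_order):
            # Lexicographic search: frugal in cues, but requires ordered,
            # sequential inspection (the attentional cost discussed above).
            for i in validity_order:
                if cues_a[i] != cues_b[i]:
                    return "A" if cues_a[i] else "B"
            return "tie"

        def tallying(cues_a, cues_b):
            # Unit-weight tally: uses all cues, but needs no ordering.
            a, b = sum(cues_a), sum(cues_b)
            return "A" if a > b else "B" if b > a else "tie"

        a, b = [1, 0, 1, 0], [0, 1, 1, 1]
        print(take_the_best(a, b, validity_order=[0, 1, 2, 3]))   # "A": first cue decides
        print(tallying(a, b))                                     # "B": 2 vs. 3 positive cues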

  17. How Forgetting Aids Heuristic Inference

    ERIC Educational Resources Information Center

    Schooler, Lael J.; Hertwig, Ralph

    2005-01-01

    Some theorists, ranging from W. James (1890) to contemporary psychologists, have argued that forgetting is the key to proper functioning of memory. The authors elaborate on the notion of beneficial forgetting by proposing that loss of information aids inference heuristics that exploit mnemonic information. To this end, the authors bring together 2…

  18. A heuristic approach to incremental and reactive scheduling

    NASA Technical Reports Server (NTRS)

    Odubiyi, Jide B.; Zoch, David R.

    1989-01-01

    A heuristic approach to incremental and reactive scheduling is described. Incremental scheduling is the process of modifying an existing schedule if the initial schedule does not meet its stated initial goals. Reactive scheduling occurs in near real-time in response to changes in available resources or the occurrence of targets of opportunity. Only minor changes are made during both incremental and reactive scheduling because a goal of re-scheduling procedures is to minimally impact the schedule. The described heuristic search techniques, which are employed by the Request Oriented Scheduling Engine (ROSE), a prototype generic scheduler, efficiently approximate the cost of reaching a goal from a given state and provide effective mechanisms for controlling search.

  19. Variable neighborhood search for reverse engineering of gene regulatory networks.

    PubMed

    Nicholson, Charles; Goodwin, Leslie; Clark, Corey

    2017-01-01

    A new search heuristic, Divided Neighborhood Exploration Search, designed to be used with inference algorithms such as Bayesian networks to improve the reverse engineering of gene regulatory networks, is presented. The approach systematically moves through the search space to find topologies representative of gene regulatory networks that are more likely to explain microarray data. In empirical testing it is demonstrated that the novel method is superior to the widely employed greedy search techniques in both the quality of the inferred networks and computational time.
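
    Divided Neighborhood Exploration Search belongs to the variable neighborhood search family named in the title; the skeleton below shows the general scheme (not the paper's specific variant): shake in progressively larger neighborhoods and restart from the smallest after each improvement.

        import random

        def vns(initial, cost, neighborhoods, iters=1000):
            # neighborhoods: list of functions mapping a solution to a random
            # neighbor, ordered from smallest to largest neighborhood.
            best, best_cost = initial, cost(initial)
            for _ in range(iters):
                k = 0
                while k < len(neighborhoods):
                    candidate = neighborhoods[k](best)
                    c = cost(candidate)
                    if c < best_cost:
                        best, best_cost = candidate, c
                        k = 0               # improvement: back to the smallest neighborhood
                    else:
                        k += 1              # no luck: try a larger neighborhood
            return best, best_cost

        # Toy usage: minimize the number of 1-bits, shaking by flipping n random bits.
        def flip(n):
            def neighbor(s):
                s = s[:]
                for _ in range(n):
                    s[random.randrange(len(s))] ^= 1
                return s
            return neighbor

        print(vns([1] * 8, cost=sum, neighborhoods=[flip(1), flip(2), flip(3)]))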

  20. Conflict and Bias in Heuristic Judgment

    ERIC Educational Resources Information Center

    Bhatia, Sudeep

    2017-01-01

    Conflict has been hypothesized to play a key role in recruiting deliberative processing in reasoning and judgment tasks. This claim suggests that changing the task so as to add incorrect heuristic responses that conflict with existing heuristic responses can make individuals less likely to respond heuristically and can increase response accuracy.…

  2. A Heuristic Approach to the Theater Distribution Problem

    DTIC Science & Technology

    2014-03-27

    An integer programming model exists to search for optimal solutions to these problems, but it is fairly time consuming and produces only one of potentially... solutions are compared to those obtained by the integer programming approach. The heuristic models implemented in this research develop feasible... (remainder of the record is acknowledgment residue)

  3. Analysis Methods for the DRIFT Dark Matter Search

    NASA Astrophysics Data System (ADS)

    Ayad, R.; Hyatt, M.; Hanson-Hart, Z.; Katz-Hyman, M.; Maher, P.; Posner, A.; Martoff, C. J.; Kirkpatrick, J.; Snowden-Ifft, D. P.; Lawson, T. B.; Lightfoot, P. K.; Morgan, B.; Paling, S. M.; Roberts, J. W.; Robinson, M.; Spooner, N. J. C.

    2003-04-01

    The DRIFT Experiment [1] is an underground search for WIMP Dark Matter using a novel detector invented for this purpose: the Negative Ion TPC (NITPC). Data is collected in the form of digitized time-records of signals received on each active anode wire of the NITPC endcap. Analysis procedures developed to characterize this data and discriminate backgrounds (x-rays, gamma rays, alpha particles) from potential Dark Matter signals (simulated with neutron elastic scattering) will be discussed. [1] Low Pressure Negative Ion TPC for Dark Matter Search. D. P. Snowden-Ifft, C. J. Martoff, J. M. Burwell, Phys Rev. D. Rapid Comm. 61, 101301 (2000)

  4. Hegemony, hermeneutics, and the heuristic of hope.

    PubMed

    Dorcy, Kathleen Shannon

    2010-01-01

    Hope has become a commodity, one that society expects those who suffer to invest in and one that healthcare providers are expected to promote as an outcome. In nursing research, a single hegemonic epistemology/ontology has been implemented through an exclusive hermeneutic (interpretation of data) and has resulted in hope being designated as an external objective heuristic for those who suffer. Evidence is articulated in this article for adopting a broader method of analysis and interpretation (genealogy) that can facilitate fuller apprehension of hope in the human experience of suffering and despair.

  5. Non-uniform cosine modulated filter banks using meta-heuristic algorithms in CSD space

    PubMed Central

    Kalathil, Shaeen; Elias, Elizabeth

    2014-01-01

    This paper presents an efficient design of non-uniform cosine modulated filter banks (CMFB) using canonic signed digit (CSD) coefficients. CMFB has an easy and efficient design approach: non-uniform decomposition can be obtained simply by merging the appropriate filters of a uniform filter bank, and only the prototype filter needs to be designed and optimized. In this paper, the prototype filter is designed using the window method, weighted Chebyshev approximation, and weighted constrained least squares approximation. The coefficients are quantized into CSD using a look-up table. Finite-precision CSD rounding deteriorates the filter bank's performance. The performance of the filter bank is improved using suitably modified meta-heuristic algorithms. The different meta-heuristic algorithms which are modified and used in this paper are the Artificial Bee Colony algorithm, the Gravitational Search algorithm, the Harmony Search algorithm, and the Genetic algorithm; they result in filter banks with less implementation complexity, power consumption, and area requirements when compared with those of the conventional continuous-coefficient non-uniform CMFB. PMID:26644921
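
    The CSD representation itself is easy to compute. The conversion below handles integer coefficients (fixed-point values can be scaled to integers first) and produces signed digits with no two adjacent non-zeros, which is what keeps adder counts low in multiplierless filter implementations.

        def to_csd(n):
            # Canonic signed digit (non-adjacent) form: digits in {-1, 0, +1},
            # no two adjacent non-zeros, returned least-significant first.
            digits = []
            while n != 0:
                if n % 2:
                    d = 2 - (n % 4)    # +1 if n = 1 (mod 4), -1 if n = 3 (mod 4)
                    n -= d
                else:
                    d = 0
                digits.append(d)
                n //= 2
            return digits

        print(to_csd(23))   # [-1, 0, 0, -1, 0, 1]: 23 = 32 - 8 - 1, three adders at most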

  6. A new conjugate gradient method and its global convergence under the exact line search

    NASA Astrophysics Data System (ADS)

    Omer, Osman; Rivaie, Mohd; Mamat, Mustafa; Abdalla, Awad

    2014-12-01

    Conjugate gradient methods are widely used for solving nonlinear unconstrained optimization problems, especially large-scale ones, owing to their simplicity and low memory requirements. To analyze conjugate gradient methods, two types of line search are used: exact and inexact. In this paper, we present a new nonlinear conjugate gradient method under the exact line search. The theoretical analysis shows that the new method generates a descent direction in each iteration and is globally convergent under the exact line search. Moreover, numerical experiments comparing the new method with other well-known conjugate gradient methods show that the new method is efficient for some unconstrained optimization problems.
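
    A minimal sketch of the kind of nonlinear CG loop described above, assuming the classical Fletcher-Reeves coefficient as a stand-in (the abstract does not give the authors' new formula) and approximating the exact line search numerically:

        # Generic nonlinear conjugate gradient with an (approximately) exact
        # line search; the beta formula is Fletcher-Reeves, an illustrative choice.
        import numpy as np
        from scipy.optimize import minimize_scalar

        def conjugate_gradient(f, grad, x0, tol=1e-6, max_iter=500):
            x = np.asarray(x0, dtype=float)
            g = grad(x)
            d = -g                                   # start with steepest descent
            for _ in range(max_iter):
                if np.linalg.norm(g) < tol:
                    break
                # "exact" line search: minimize the 1-D restriction of f along d
                alpha = minimize_scalar(lambda a: f(x + a * d)).x
                x = x + alpha * d
                g_new = grad(x)
                beta = (g_new @ g_new) / (g @ g)     # Fletcher-Reeves coefficient
                d = -g_new + beta * d                # next conjugate direction
                g = g_new
            return x

        # example: Rosenbrock function
        f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
        grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                                   200 * (x[1] - x[0]**2)])
        print(conjugate_gradient(f, grad, [-1.2, 1.0]))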

  7. A Case Study of Controlling Crossover in a Selection Hyper-heuristic Framework Using the Multidimensional Knapsack Problem.

    PubMed

    Drake, John H; Özcan, Ender; Burke, Edmund K

    2016-01-01

    Hyper-heuristics are high-level methodologies for solving complex problems that operate on a search space of heuristics. In a selection hyper-heuristic framework, a heuristic is chosen from an existing set of low-level heuristics and applied to the current solution to produce a new solution at each point in the search. The use of crossover low-level heuristics is possible in an increasing number of general-purpose hyper-heuristic tools such as HyFlex and Hyperion. However, little work has been undertaken to assess how best to utilise it. Since a single-point search hyper-heuristic operates on a single candidate solution, and two candidate solutions are required for crossover, a mechanism is required to control the choice of the other solution. The frameworks we propose maintain a list of potential solutions for use in crossover. We investigate the use of such lists at two conceptual levels. First, crossover is controlled at the hyper-heuristic level where no problem-specific information is required. Second, it is controlled at the problem domain level where problem-specific information is used to produce good-quality solutions to use in crossover. A number of selection hyper-heuristics are compared using these frameworks over three benchmark libraries with varying properties for an NP-hard optimisation problem: the multidimensional 0-1 knapsack problem. It is shown that allowing crossover to be managed at the domain level outperforms managing crossover at the hyper-heuristic level in this problem domain.
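
    The control flow of a single-point selection hyper-heuristic with a crossover partner list can be sketched compactly. Everything below (the toy 0-1 knapsack instance, the two low-level heuristics, the memory size of 5, and the non-worsening acceptance rule) is an illustrative assumption, not the paper's HyFlex setup:

        # Single-point selection hyper-heuristic keeping a list of good
        # solutions to supply the second parent when crossover is chosen.
        import random

        values  = [60, 100, 120, 75, 90]     # toy knapsack instance (invented)
        weights = [10, 20, 30, 15, 25]
        CAP = 60

        def fitness(sol):
            w = sum(wi for wi, s in zip(weights, sol) if s)
            v = sum(vi for vi, s in zip(values, sol) if s)
            return v if w <= CAP else 0      # infeasible solutions score zero

        def flip(sol, _partner):             # low-level mutation heuristic
            s = sol[:]
            i = random.randrange(len(s))
            s[i] ^= 1
            return s

        def crossover(sol, partner):         # low-level crossover heuristic
            cut = random.randrange(1, len(sol))
            return sol[:cut] + partner[cut:]

        heuristics = [flip, crossover]
        current = [0] * len(values)
        memory = [current[:]]                # partner list managed by the framework

        for _ in range(2000):
            h = random.choice(heuristics)    # simple random heuristic selection
            candidate = h(current, random.choice(memory))
            if fitness(candidate) >= fitness(current):   # accept non-worsening moves
                current = candidate
                memory = sorted(memory + [current[:]], key=fitness, reverse=True)[:5]

        print(current, fitness(current))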

  8. Familiarity and Recollection in Heuristic Decision Making

    PubMed Central

    Schwikert, Shane R.; Curran, Tim

    2014-01-01

    Heuristics involve the ability to utilize memory to make quick judgments by exploiting fundamental cognitive abilities. In the current study we investigated the memory processes that contribute to the recognition heuristic and the fluency heuristic, which are both presumed to capitalize on the by-products of memory to make quick decisions. In Experiment 1, we used a city-size comparison task while recording event-related potentials (ERPs) to investigate the potential contributions of familiarity and recollection to the two heuristics. ERPs were markedly different for recognition heuristic-based decisions and fluency heuristic-based decisions, suggesting a role for familiarity in the recognition heuristic and recollection in the fluency heuristic. In Experiment 2, we coupled the same city-size comparison task with measures of subjective pre-experimental memory for each stimulus in the task. Although previous literature suggests the fluency heuristic relies on recognition speed alone, our results suggest differential contributions of recognition speed and recollected knowledge to these decisions, whereas the recognition heuristic relies on familiarity. Based on these results, we created a new theoretical framework that explains decisions attributed to both heuristics based on the underlying memory associated with the choice options. PMID:25347534

  9. Familiarity and recollection in heuristic decision making.

    PubMed

    Schwikert, Shane R; Curran, Tim

    2014-12-01

    Heuristics involve the ability to utilize memory to make quick judgments by exploiting fundamental cognitive abilities. In the current study we investigated the memory processes that contribute to the recognition heuristic and the fluency heuristic, which are both presumed to capitalize on the byproducts of memory to make quick decisions. In Experiment 1, we used a city-size comparison task while recording event-related potentials (ERPs) to investigate the potential contributions of familiarity and recollection to the 2 heuristics. ERPs were markedly different for recognition heuristic-based decisions and fluency heuristic-based decisions, suggesting a role for familiarity in the recognition heuristic and recollection in the fluency heuristic. In Experiment 2, we coupled the same city-size comparison task with measures of subjective preexperimental memory for each stimulus in the task. Although previous literature suggests the fluency heuristic relies on recognition speed alone, our results suggest differential contributions of recognition speed and recollected knowledge to these decisions, whereas the recognition heuristic relies on familiarity. Based on these results, we created a new theoretical framework that explains decisions attributed to both heuristics based on the underlying memory associated with the choice options.

  10. A Pattern Search Filter Method for Nonlinear Programming Without Derivatives

    DTIC Science & Technology

    2003-06-12

    this context. Optimality conditions for a differentiable function can be stated in terms of the cone generated by the convex hull of a set S, i.e...Corollary 5.10. It gives conditions for the limit point of a refining sequence to satisfy optimality conditions on problem (1.1). It is that the convex ...useful division into global SEARCH and local POLL steps. It is shown here that the algorithm identifies limit points at which optimality conditions

  11. Web of science: a unique method of cited reference searching.

    PubMed Central

    Sevinc, Alper

    2004-01-01

    The number of times an article is acknowledged as a reference in another article reflects its scientific impact. Citation analysis is one of the parameters for assessing the quality of research published in scientific, technology and social science journals. Web of Science enables users to search current and retrospective multidisciplinary information. Parameters and practical applications evaluating journal and article citation characteristics available through the Science Citation Index are summarized. PMID:15253331

  12. Web of science: a unique method of cited reference searching.

    PubMed

    Sevinc, Alper

    2004-07-01

    The number of times an article is acknowledged as a reference in another article reflects its scientific impact. Citation analysis is one of the parameters for assessing the quality of research published in scientific, technology and social science journals. Web of Science enables users to search current and retrospective multidisciplinary information. Parameters and practical applications evaluating journal and article citation characteristics available through the Science Citation Index are summarized.

  13. A Computer Simulation Study of a Sensor-Based Heuristic Navigation for Three-Dimensional Rough Terrain with Obstacles

    DTIC Science & Technology

    1989-06-01

    utilizing heuristics adopted from human behavior. Simulation results produce a near-optimal path solution in a very short time. Simulation results also prove that this...or the hill climb search strategy. PATH PLAN adopts flavors of the hill climb and the A* search strategy as well as human heuristics. Though it

  14. A Heuristic Evaluation of the Generalized Intelligent Framework for Tutoring (GIFT) Authoring Tools

    DTIC Science & Technology

    2016-03-01

    Report No.: ARL-SR-0333. Nielsen J. Heuristic evaluation. In: Nielsen J, Mack RL, editors. Usability inspection methods. New York (NY): John Wiley...http://www.nngroup.com/articles/ten-usability-heuristics/. Norman DA. Cognitive engineering. In: Norman DA, Draper SW, editors. User centered

  15. Social Outcomes in Childhood Brain Disorder: A Heuristic Integration of Social Neuroscience and Developmental Psychology

    ERIC Educational Resources Information Center

    Yeates, Keith Owen; Bigler, Erin D.; Dennis, Maureen; Gerhardt, Cynthia A.; Rubin, Kenneth H.; Stancin, Terry; Taylor, H. Gerry; Vannatta, Kathryn

    2007-01-01

    The authors propose a heuristic model of the social outcomes of childhood brain disorder that draws on models and methods from both the emerging field of social cognitive neuroscience and the study of social competence in developmental psychology/psychopathology. The heuristic model characterizes the relationships between social adjustment, peer…

  16. Social Outcomes in Childhood Brain Disorder: A Heuristic Integration of Social Neuroscience and Developmental Psychology

    ERIC Educational Resources Information Center

    Yeates, Keith Owen; Bigler, Erin D.; Dennis, Maureen; Gerhardt, Cynthia A.; Rubin, Kenneth H.; Stancin, Terry; Taylor, H. Gerry; Vannatta, Kathryn

    2007-01-01

    The authors propose a heuristic model of the social outcomes of childhood brain disorder that draws on models and methods from both the emerging field of social cognitive neuroscience and the study of social competence in developmental psychology/psychopathology. The heuristic model characterizes the relationships between social adjustment, peer…

  17. A Multi-Start Evolutionary Local Search for the Two-Echelon Location Routing Problem

    NASA Astrophysics Data System (ADS)

    Nguyen, Viet-Phuong; Prins, Christian; Prodhon, Caroline

    This paper presents a new hybrid metaheuristic between a greedy randomized adaptive search procedure (GRASP) and an evolutionary/iterated local search (ELS/ILS), using a tabu list, to solve the two-echelon location routing problem (LRP-2E). The GRASP uses in turn three constructive heuristics followed by local search to generate the initial solutions. From a GRASP solution, an intensification strategy is carried out by dynamically alternating between ELS and ILS. In this phase, each child is obtained by mutation and evaluated through a giant-tour splitting procedure followed by a local search. The tabu list, defined by two characteristics of a solution (total cost and number of trips), is used to avoid searching a space already explored. The results show that our metaheuristic clearly outperforms all previously published methods on LRP-2E benchmark instances. Furthermore, it is competitive with the best meta-heuristic published for the single-echelon LRP.
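
    A bare-bones GRASP skeleton showing the construction-plus-local-search pattern referred to above; the cost function, the trivial local search, and the restricted-candidate-list parameter alpha are illustrative placeholders, and the paper's routing-specific moves, ELS/ILS alternation, and tabu list are omitted:

        # Greedy randomized construction + local improvement, repeated restarts.
        import random

        def grasp(items, cost, local_search, alpha=0.3, starts=20):
            best = None
            for _ in range(starts):
                solution, pool = [], list(items)
                while pool:                          # greedy randomized construction
                    pool.sort(key=lambda c: cost(solution + [c]))
                    rcl = pool[:max(1, int(alpha * len(pool)))]   # restricted candidate list
                    pick = random.choice(rcl)
                    solution.append(pick)
                    pool.remove(pick)
                solution = local_search(solution)    # improvement phase
                if best is None or cost(solution) < cost(best):
                    best = solution
            return best

        # toy usage: order points on a line to minimize total tour length
        pts = [3, 7, 1, 9, 4]
        tour_len = lambda s: sum(abs(a - b) for a, b in zip(s, s[1:]))
        print(grasp(pts, tour_len, local_search=sorted))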

  18. A hybrid cuckoo search algorithm with Nelder Mead method for solving global optimization problems.

    PubMed

    Ali, Ahmed F; Tawhid, Mohamed A

    2016-01-01

    The cuckoo search algorithm is a promising population-based metaheuristic that has been applied to many real-life problems. In this paper, we propose a new cuckoo search algorithm by combining cuckoo search with the Nelder-Mead method in order to solve integer and minimax optimization problems. We call the proposed algorithm the hybrid cuckoo search and Nelder-Mead method (HCSNM). HCSNM starts by applying the standard cuckoo search for a number of iterations; the best solution obtained is then passed to the Nelder-Mead algorithm as an intensification process, to accelerate the search and overcome the slow convergence of the standard cuckoo search algorithm. The proposed algorithm balances the global exploration of cuckoo search with the deep exploitation of the Nelder-Mead method. We test HCSNM on seven integer programming problems and ten minimax problems, comparing against eight algorithms for solving integer programming problems and seven algorithms for solving minimax problems. The experimental results show the efficiency of the proposed algorithm and its ability to solve integer and minimax optimization problems in reasonable time.
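
    An illustrative two-phase hybrid in the spirit of HCSNM, assuming a simplified cuckoo search (Lévy flights via Mantegna's algorithm) whose best nest seeds scipy's Nelder-Mead; the population size, iteration count, step scaling, and abandonment fraction below are guesses, not the paper's settings:

        # Phase 1: simplified cuckoo search; Phase 2: Nelder-Mead refinement.
        import math
        import numpy as np
        from scipy.optimize import minimize

        def levy_step(dim, beta=1.5):
            # Mantegna's algorithm for symmetric Levy-stable step lengths
            num = math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
            den = math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2)
            sigma = (num / den) ** (1 / beta)
            u = np.random.normal(0, sigma, dim)
            v = np.random.normal(0, 1, dim)
            return u / np.abs(v) ** (1 / beta)

        def cuckoo_then_nelder_mead(f, bounds, n_nests=15, iters=200, pa=0.25):
            lo = np.array([b[0] for b in bounds])
            hi = np.array([b[1] for b in bounds])
            nests = np.random.uniform(lo, hi, (n_nests, len(bounds)))
            fit = np.array([f(x) for x in nests])
            for _ in range(iters):
                i = np.random.randint(n_nests)       # new cuckoo via Levy flight
                step = 0.01 * levy_step(len(bounds)) * (nests[i] - nests[fit.argmin()])
                new = np.clip(nests[i] + step, lo, hi)
                j = np.random.randint(n_nests)       # replace a random worse nest
                if f(new) < fit[j]:
                    nests[j], fit[j] = new, f(new)
                worst = fit.argsort()[-int(pa * n_nests):]   # abandon worst nests
                nests[worst] = np.random.uniform(lo, hi, (len(worst), len(bounds)))
                fit[worst] = [f(x) for x in nests[worst]]
            best = nests[fit.argmin()]
            return minimize(f, best, method="Nelder-Mead")   # intensification phase

        res = cuckoo_then_nelder_mead(lambda x: (x[0] - 1)**2 + (x[1] + 2)**2,
                                      [(-5, 5), (-5, 5)])
        print(res.x)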

  19. A Relatively Painless Method of Introduction to the Psychological Literature Search

    ERIC Educational Resources Information Center

    Gardner, Louis E.

    1977-01-01

    Described is an innovative teaching method for developing student psychological literature search skills. The method involves empirical analysis of cliches and writing abstracts of psychological studies used in support or rejection of cliche hypotheses. (Author/DB)

  20. A Relatively Painless Method of Introduction to the Psychological Literature Search

    ERIC Educational Resources Information Center

    Gardner, Louis E.

    1977-01-01

    Described is an innovative teaching method for developing student psychological literature search skills. The method involves empirical analysis of cliches and writing abstracts of psychological studies used in support or rejection of cliche hypotheses. (Author/DB)

  1. A systematic-heuristic approach for space trajectory design.

    PubMed

    Vasile, Massimiliano

    2004-05-01

    In this paper a novel algorithm is proposed for space trajectory design that combines a systematic and a heuristic method for global optimization. For the systematic part of the algorithm a branching technique is used, whereas a particular implementation of evolution programming forms the core of the heuristic part. The idea is to use a limited population evolving for a small number of generations, according to specific evolution rules, in subregions of the solution space defined by a branching procedure. On the other hand the branching rules are functions of the outcome from the evolution optimization. The proposed combined systematic-heuristic global optimization performs quite well on the cases analyzed in this paper, suggesting the possibility of more complex applications.

  2. Automated unit-level testing with heuristic rules

    NASA Technical Reports Server (NTRS)

    Carlisle, W. Homer; Chang, Kai-Hsiung; Cross, James H.; Keleher, William; Shackelford, Keith

    1990-01-01

    Software testing plays a significant role in the development of complex software systems. Current testing methods generally require significant effort to generate meaningful test cases. The QUEST/Ada system is a prototype system designed using CLIPS to experiment with expert system based test case generation. The prototype is designed to test for condition coverage, and attempts to generate test cases to cover all feasible branches contained in an Ada program. This paper reports on heuristics used by the system. These heuristics vary according to the amount of knowledge obtained by preprocessing and execution of the boolean conditions in the program.

  3. Social biases determine spatiotemporal sparseness of ciliate mating heuristics

    PubMed Central

    2012-01-01

    Ciliates become highly social, even displaying animal-like qualities, in the joint presence of aroused conspecifics and nonself mating pheromones. Pheromone detection putatively helps trigger instinctual and learned courtship and dominance displays from which social judgments are made about the availability, compatibility, and fitness representativeness or likelihood of prospective mates and rivals. In earlier studies, I demonstrated the heterotrich Spirostomum ambiguum improves mating competence by effecting preconjugal strategies and inferences in mock social trials via behavioral heuristics built from Hebbian-like associative learning. Heuristics embody serial patterns of socially relevant action that evolve into ordered, topologically invariant computational networks supporting intra- and intermate selection. S. ambiguum employs heuristics to acquire, store, plan, compare, modify, select, and execute sets of mating propaganda. One major adaptive constraint over formation and use of heuristics involves a ciliate’s initial subjective bias, responsiveness, or preparedness, as defined by Stevens’ Law of subjective stimulus intensity, for perceiving the meaningfulness of mechanical pressures accompanying cell-cell contacts and additional perimating events. This bias controls durations and valences of nonassociative learning, search rates for appropriate mating strategies, potential net reproductive payoffs, levels of social honesty and deception, successful error diagnosis and correction of mating signals, use of insight or analysis to solve mating dilemmas, bioenergetics expenditures, and governance of mating decisions by classical or quantum statistical mechanics. I now report this same social bias also differentially affects the spatiotemporal sparseness, as measured with metric entropy, of ciliate heuristics. Sparseness plays an important role in neural systems through optimizing the specificity, efficiency, and capacity of memory representations. The

  4. Meta-Heuristic Combining Prior Online and Offline Information for the Quadratic Assignment Problem.

    PubMed

    Sun, Jianyong; Zhang, Qingfu; Yao, Xin

    2014-03-01

    The construction of promising solutions for NP-hard combinatorial optimization problems (COPs) in meta-heuristics is usually based on three types of information: a priori information, a posteriori information learned from visited solutions during the search procedure, and online information collected in the solution construction process. Prior information reflects our domain knowledge about the COPs. Extensive domain knowledge can surely make the search effective, yet it is not always available. Posterior information can guide the meta-heuristic to globally explore promising search areas, but it lacks local guidance capability. In contrast, online information can capture local structures, and its application can help exploit the search space. In this paper, we studied the effects of using these types of information on a meta-heuristic's algorithmic performance for COPs. The study was illustrated by a set of heuristic algorithms developed for the quadratic assignment problem. We first proposed an improved scheme to extract online local information, then developed a unified framework under which all types of information can be combined readily. Finally, we studied the benefits of the three types of information to meta-heuristics. Conclusions were drawn from the comprehensive study, which can be used as principles to guide the design of effective meta-heuristics in the future.

  5. Text mining for search term development in systematic reviewing: A discussion of some methods and challenges.

    PubMed

    Stansfield, Claire; O'Mara-Eves, Alison; Thomas, James

    2017-09-01

    Using text mining to aid the development of database search strings for topics described by diverse terminology has potential benefits for systematic reviews; however, methods and tools for accomplishing this are poorly covered in the research methods literature. We briefly review the literature on applications of text mining for search term development for systematic reviewing. We found that the tools can be used in 5 overarching ways: improving the precision of searches; identifying search terms to improve search sensitivity; aiding the translation of search strategies across databases; searching and screening within an integrated system; and developing objectively derived search strategies. Using a case study and selected examples, we then reflect on the utility of certain technologies (term frequency-inverse document frequency and Termine, term frequency, and clustering) in improving the precision and sensitivity of searches. Challenges in using these tools are discussed. The utility of these tools is influenced by the different capabilities of the tools, the way the tools are used, and the text that is analysed. Increased awareness of how the tools perform facilitates the further development of methods for their use in systematic reviews. Copyright © 2017 John Wiley & Sons, Ltd.
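
    As a toy illustration of one technique named above, term frequency-inverse document frequency can rank candidate terms mined from a seed set of known-relevant records; the three mini-abstracts below are invented:

        # Rank candidate search terms by aggregate TF-IDF over relevant records.
        from sklearn.feature_extraction.text import TfidfVectorizer

        relevant_abstracts = [
            "heuristic search methods for online planning in POMDPs",
            "meta-heuristic algorithms for filter bank optimization",
            "hyper-heuristics operating on a search space of heuristics",
        ]
        vec = TfidfVectorizer(ngram_range=(1, 2), stop_words="english")
        tfidf = vec.fit_transform(relevant_abstracts)
        scores = tfidf.sum(axis=0).A1          # aggregate score per term
        terms = vec.get_feature_names_out()
        top = sorted(zip(scores, terms), reverse=True)[:8]
        print([t for _, t in top])             # candidates for the search string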

  6. A bicriteria heuristic for an elective surgery scheduling problem.

    PubMed

    Marques, Inês; Captivo, M Eugénia; Vaz Pato, Margarida

    2015-09-01

    Resource rationalization and reduction of waiting lists for surgery are two main guidelines for hospital units outlined in the Portuguese National Health Plan. This work is dedicated to an elective surgery scheduling problem arising in a Lisbon public hospital. In order to increase the surgical suite's efficiency and to reduce the waiting lists for surgery, two objectives are considered: maximize surgical suite occupation and maximize the number of surgeries scheduled. This elective surgery scheduling problem consists of assigning an intervention date, an operating room and a starting time for elective surgeries selected from the hospital waiting list. Accordingly, a bicriteria surgery scheduling problem arising in the hospital under study is presented. To search for efficient solutions of the bicriteria optimization problem, the minimization of a weighted Chebyshev distance to a reference point is used. A constructive and improvement heuristic procedure specially designed to address the objectives of the problem is developed and results of computational experiments obtained with empirical data from the hospital are presented. This study shows that by using the bicriteria approach presented here it is possible to build surgical plans with very good performance levels. This method can be used within an interactive approach with the decision maker. It can also be easily adapted to other hospitals with similar scheduling conditions.
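
    The weighted Chebyshev scalarization used to search for efficient solutions can be sketched in a few lines; the reference point, weights, and normalization below are hypothetical, not the hospital study's values:

        # Score a schedule by its maximum weighted distance to a reference
        # (ideal) point on the two objectives; smaller scores are better.
        def chebyshev_score(occupation, n_surgeries,
                            ref=(1.0, 40),        # hypothetical ideal point
                            weights=(0.6, 0.4)):
            d1 = weights[0] * abs(ref[0] - occupation)            # occupation in [0, 1]
            d2 = weights[1] * abs(ref[1] - n_surgeries) / ref[1]  # normalized count
            return max(d1, d2)

        # the schedule closer to the reference point in the Chebyshev sense wins
        print(chebyshev_score(0.85, 32) < chebyshev_score(0.95, 20))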

  7. The Saccharomyces Genome Database: Advanced Searching Methods and Data Mining.

    PubMed

    Cherry, J Michael

    2015-12-02

    At the core of the Saccharomyces Genome Database (SGD) are chromosomal features that encode a product. These include protein-coding genes and major noncoding RNA genes, such as tRNA and rRNA genes. The basic entry point into SGD is a gene or open-reading frame name that leads directly to the locus summary information page. A keyword describing function, phenotype, selective condition, or text from abstracts will also provide a door into the SGD. A DNA or protein sequence can be used to identify a gene or a chromosomal region using BLAST. Protein and DNA sequence identifiers, PubMed and NCBI IDs, author names, and function terms are also valid entry points. The information in SGD has been gathered and is maintained by a group of scientific biocurators and software developers who are devoted to providing researchers with up-to-date information from the published literature, connections to all the major research resources, and tools that allow the data to be explored. All the collected information cannot be represented or summarized for every possible question; therefore, it is necessary to be able to search the structured data in the database. This protocol describes the YeastMine tool, which provides an advanced search capability via an interactive tool. The SGD also archives results from microarray expression experiments, and a strategy designed to explore these data using the SPELL (Serial Pattern of Expression Levels Locator) tool is provided.

  8. New hybrid conjugate gradient methods with the generalized Wolfe line search.

    PubMed

    Xu, Xiao; Kong, Fan-Yu

    2016-01-01

    The conjugate gradient method is an efficient technique for solving unconstrained optimization problems. In this paper, we form a linear combination, with parameter β_k, of the DY method and the HS method, and put forward a hybrid DY-HS method. We also propose a hybrid of FR and PRP in the same manner. Additionally, for the two hybrid methods we generalize the Wolfe line search to compute the step size α_k. With the new Wolfe line search, the descent property and the global convergence of the two hybrid methods can be proved.
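
    For reference, the standard DY and HS coefficients and a generic convex combination of them, in LaTeX; the abstract does not state the authors' exact combination, so the mixing parameter θ_k below is only a placeholder:

        % y_k = g_{k+1} - g_k; d_k is the current search direction
        \beta_k^{DY} = \frac{\|g_{k+1}\|^2}{d_k^{\top} y_k}, \qquad
        \beta_k^{HS} = \frac{g_{k+1}^{\top} y_k}{d_k^{\top} y_k}, \qquad
        \beta_k^{hyb} = (1 - \theta_k)\,\beta_k^{DY} + \theta_k\,\beta_k^{HS}.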

  9. Earthdata Search: Methods for Improving Data Discovery, Visualization, and Access

    NASA Astrophysics Data System (ADS)

    Quinn, P.; Pilone, D.; Crouch, M.; Siarto, J.; Sun, B.

    2015-12-01

    In a landscape of heterogeneous data from diverse sources and disciplines, producing useful tools poses a significant challenge. NASA's Earthdata Search application tackles this challenge, enabling discovery and inter-comparison of data across the wide array of scientific disciplines that use NASA Earth observation data. During this talk, we will give a brief overview of the application, and then share our approach for understanding and satisfying the needs of users from several disparate scientific communities. Our approach involves: - Gathering fine-grained metrics to understand user behavior - Using metrics to quantify user success - Combining metrics, feedback, and user research to understand user needs - Applying professional design toward addressing user needs - Using metrics and A/B testing to evaluate the viability of changes - Providing enhanced features for services to promote adoption - Encouraging good metadata quality and soliciting feedback for metadata issues - Open sourcing the application and its components to allow it to serve more users

  10. Heuristical Strategies on the Study Theme "The Unsaturated Hydrocarbons -- Alkenes"

    ERIC Educational Resources Information Center

    Naumescu, Adrienne Kozan; Pasca, Roxana-Diana

    2011-01-01

    The influence of heuristical strategies upon the level of two experimental classes is studied in this paper. The didactic experiment took place at secondary school in Cluj-Napoca, in 2008-2009 school year. The study theme "The Unsaturated Hydrocarbons--Alkenes" has been efficiently learned by using the most active methods: laboratory…

  11. A heuristic solution for the disassembly line balancing problem incorporating sequence dependent costs

    NASA Astrophysics Data System (ADS)

    Lambert, A. J. D.; Gupta, Surendra M.

    2005-11-01

    This paper deals with disassembly sequencing problems subject to sequence-dependent disassembly costs. We present a heuristic and an iterative method based on the partial branch-and-bound concept to solve such problems. Since heuristic methods intrinsically generate suboptimal solutions, we compared the heuristically obtained solutions with the exact solutions to see whether they are reasonably good. This comparison, however, is limited to small and perhaps medium-sized problems, as the CPU time required by exact methods tends to increase exponentially with problem size. For the problems tested, we observed that the methods described in this paper generate surprisingly good results using an almost negligible amount of CPU time.

  12. A single cognitive heuristic process meets the complexity of domain-specific moral heuristics.

    PubMed

    Dubljević, Veljko; Racine, Eric

    2014-10-01

    The inherence heuristic (a) offers modest insights into the complex nature of both the is-ought tension in moral reasoning and moral reasoning per se, and (b) does not reflect the complexity of domain-specific moral heuristics. Formal and general in nature, we contextualize the process described as "inherence heuristic" in a web of domain-specific heuristics (e.g., agent specific; action specific; consequences specific).

  13. A spectral KRMI conjugate gradient method under the strong-Wolfe line search

    NASA Astrophysics Data System (ADS)

    Khadijah, Wan; Rivaie, Mohd.; Mamat, Mustafa; Jusoh, Ibrahim

    2016-06-01

    In this paper, a modification of the spectral conjugate gradient (CG) method is proposed which combines the advantages of the spectral CG method and the RMIL method, named spectral Khadijah-Rivaie-Mustafa-Ibrahim (SKRMI), to solve unconstrained optimization problems. Under inexact line searches, the proposed method generates a sufficient descent direction, and its global convergence has been proved. Moreover, the method reduces to the standard RMIL method if an exact line search is applied. Numerical results are also presented to examine the efficiency of the proposed method.

  14. Heuristics Applied in the Development of Advanced Space Mission Concepts

    NASA Technical Reports Server (NTRS)

    Nilsen, Erik N.

    1998-01-01

    Advanced mission studies are the first step in determining the feasibility of a given space exploration concept. A space scientist develops a science goal in the exploration of space. This may be a new observation method, a new instrument or a mission concept to explore a solar system body. In order to determine the feasibility of a deep space mission, a concept study is convened to determine the technology needs and estimated cost of performing that mission. Heuristics are one method of defining viable mission and systems architectures that can be assessed for technology readiness and cost. Developing a viable architecture depends to a large extent upon extending the existing body of knowledge, and applying it in new and novel ways. These heuristics have evolved over time to include methods for estimating technical complexity, technology development, cost modeling and mission risk in the unique context of deep space missions. This paper examines the processes involved in performing these advanced concept studies, and analyzes the application of heuristics in the development of an advanced in-situ planetary mission. The Venus Surface Sample Return mission study provides a context for the examination of the heuristics applied in the development of the mission and systems architecture. This study is illustrative of the effort involved in the initial assessment of an advanced mission concept, and the knowledge and tools that are applied.

  15. A method of searching for related literature on protein structure analysis by considering a user's intention

    PubMed Central

    2015-01-01

    Background In recent years, with advances in techniques for protein structure analysis, knowledge about protein structure and function has been published in a vast number of articles, and a method to search for specific publications in such a large pool is needed. In this paper, we propose a method to search for related articles on protein structure analysis by using an article itself as a query. Results Each article is represented as a set of concepts in the proposed method. Then, using similarities among concepts formulated from databases such as Gene Ontology, similarities between articles are evaluated. In this framework, the desired search results vary depending on the user's search intention, because a variety of information is included in a single article. Therefore, the proposed method takes as an input query not only one input article (the primary article) but also additional articles related to it, to determine the search intention of the user based on the relationship between the query articles. In other words, based on the concepts contained in the input article and the additional articles, we realize a related-literature search that considers user intention by varying the degree of attention given to each concept and modifying the concept hierarchy graph. Conclusions We performed an experiment to retrieve relevant papers from articles on protein structure analysis registered in the Protein Data Bank by using three query datasets. The experimental results yielded better accuracy than when user intention was not considered, confirming the effectiveness of the proposed method. PMID:25952498

  16. The Stanford Cluster Search: Scope, Method, and Preliminary Results

    NASA Astrophysics Data System (ADS)

    Willick, Jeffrey A.; Thompson, Keith L.; Mathiesen, Benjamin F.; Perlmutter, Saul; Knop, Robert A.; Hill, Gary J.

    2001-06-01

    We describe the scientific motivation behind, and the methodology of, the Stanford Cluster Search (StaCS), a program to compile a catalog of optically selected galaxy clusters at intermediate and high (0.3 ≲ z ≲ 1) redshifts. The clusters are identified using a matched filter algorithm applied to deep CCD images covering ~60 deg² of sky. These images are obtained from several data archives, principally that of the Berkeley Supernova Cosmology Project of Perlmutter et al. Potential clusters are confirmed with spectroscopic observations at the 9.2 m Hobby-Eberly Telescope. Follow-up observations at optical, submillimeter, and X-ray wavelengths are planned in order to estimate cluster masses. Our long-term scientific goal is to measure the cluster number density as a function of mass and redshift, n(M, z), which is sensitive to the cosmological density parameter Ωm and the amplitude of density fluctuations σ8. The combined data set will contain clusters ranging over an order of magnitude in mass and allow constraints on these parameters accurate to ~10%. We present our first spectroscopically confirmed cluster candidates and describe how to access them electronically. The Hobby-Eberly Telescope (HET) is a joint project of the University of Texas at Austin, the Pennsylvania State University, Stanford University, Ludwig-Maximillians-Universität München, and Georg-August-Universität Göttingen. The HET is named in honor of its principal benefactors, William P. Hobby and Robert E. Eberly.

  17. Global Search Capabilities of Indirect Methods for Impulsive Transfers

    NASA Astrophysics Data System (ADS)

    Shen, Hong-Xin; Casalino, Lorenzo; Luo, Ya-Zhong

    2015-09-01

    An optimization method which combines an indirect method with a homotopic approach is proposed and applied to impulsive trajectories. Minimum-fuel, multiple-impulse solutions, with either fixed or open time, are obtained. The homotopic approach at hand is relatively straightforward to implement and does not require an initial guess of the adjoints, unlike previous adjoint-estimation methods. A multiple-revolution Lambert solver is used to find multiple starting solutions for the homotopic procedure; this approach guarantees that multiple local solutions are obtained without relying on the user's intuition, thus efficiently exploring the solution space to find the global optimum. The indirect/homotopic approach proves to be quite effective and efficient in finding optimal solutions, and outperforms the joint use of evolutionary algorithms and deterministic methods in the test cases.

  18. A comparison of field-based similarity searching methods: CatShape, FBSS, and ROCS.

    PubMed

    Moffat, Kirstin; Gillet, Valerie J; Whittle, Martin; Bravi, Gianpaolo; Leach, Andrew R

    2008-04-01

    Three field-based similarity methods are compared in retrospective virtual screening experiments. The methods are the CatShape module of CATALYST, ROCS, and an in-house program developed at the University of Sheffield called FBSS. The programs are used in both rigid and flexible searches carried out in the MDL Drug Data Report. UNITY 2D fingerprints are also used to provide a comparison with a more traditional approach to similarity searching, and similarity based on simple whole-molecule properties is used to provide a baseline for the more sophisticated searches. Overall, UNITY 2D fingerprints and ROCS with the chemical force field option gave comparable performance and were superior to the shape-only 3D methods. When the flexible methods were compared with the rigid methods, it was generally found that the flexible methods gave slightly better results than their respective rigid methods; however, the increased performance did not justify the additional computational cost required.

  19. Hide and vanish: data sets where the most parsimonious tree is known but hard to find, and their implications for tree search methods.

    PubMed

    Goloboff, Pablo A

    2014-10-01

    Three different types of data sets, for which the uniquely most parsimonious tree can be known exactly but is hard to find with heuristic tree search methods, are studied. Tree searches are complicated more by the shape of the tree landscape (i.e. the distribution of homoplasy on different trees) than by the sheer abundance of homoplasy or character conflict. Data sets of Type 1 are those constructed by Radel et al. (2013). Data sets of Type 2 present a very rugged landscape, with narrow peaks and valleys, but relatively low amounts of homoplasy. For such a tree landscape, subjecting the trees to TBR and saving suboptimal trees produces much better results when the sequence of clipping for the tree branches is randomized instead of fixed. An unexpected finding for data sets of Types 1 and 2 is that starting a search from a random tree instead of a random addition sequence Wagner tree may increase the probability that the search finds the most parsimonious tree; a small artificial example where these probabilities can be calculated exactly is presented. Data sets of Type 3, the most difficult data sets studied here, comprise only congruent characters, and a single island with only one most parsimonious tree. Even if there is a single island, missing entries create a very flat landscape which is difficult to traverse with tree search algorithms because the number of equally parsimonious trees that need to be saved and swapped to effectively move around the plateaus is too large. Minor modifications of the parameters of tree drifting, ratchet, and sectorial searches allow travelling around these plateaus much more efficiently than saving and swapping large numbers of equally parsimonious trees with TBR. For these data sets, two new related criteria for selecting taxon addition sequences in Wagner trees (the "selected" and "informative" addition sequences) produce much better results than the standard random or closest addition sequences. These new methods for Wagner

  20. Searching for Suicide Methods: Accessibility of Information About Helium as a Method of Suicide on the Internet.

    PubMed

    Gunnell, David; Derges, Jane; Chang, Shu-Sen; Biddle, Lucy

    2015-01-01

    Helium gas suicides have increased in England and Wales; easy-to-access descriptions of this method on the Internet may have contributed to this rise. To investigate the availability of information on using helium as a method of suicide and trends in searching about this method on the Internet. We analyzed trends in (a) Google searching (2004-2014) and (b) hits on a Wikipedia article describing helium as a method of suicide (2013-2014). We also investigated the extent to which helium was described as a method of suicide on web pages and discussion forums identified via Google. We found no evidence of rises in Internet searching about suicide using helium. News stories about helium suicides were associated with increased search activity. The Wikipedia article may have been temporarily altered to increase awareness of suicide using helium around the time of a celebrity suicide. Approximately one third of the links retrieved using Google searches for suicide methods mentioned helium. Information about helium as a suicide method is readily available on the Internet; the Wikipedia article describing its use was highly accessed following celebrity suicides. Availability of online information about this method may contribute to rises in helium suicides.

  1. A speaker change detection method based on coarse searching

    NASA Astrophysics Data System (ADS)

    Zhang, Xue-yuan; He, Qian-hua; Li, Yan-xiong; He, Jun

    2013-03-01

    The conventional speaker change detection (SCD) method based on the Bayesian Information Criterion (BIC) has been widely used. However, its performance relies on the choice of penalty factor, and it is computationally expensive. The two-step SCD is less time-consuming but generates more detection errors. The conventional method's performance is limited by its use of two directly adjacent data windows. We propose a strategy that inserts an interval between the two adjacent fixed-size data windows in each analysis window. The dissimilarity value between the data windows is regarded as the probability of a speaker identity change within the interval area. This analysis window is then slid along the audio in large steps to locate the areas where speaker change points may appear. Afterwards we focus only on these areas and locate precisely where the change points are; areas where a speaker change point is unlikely to appear are abandoned. The proposed method is computationally efficient and more robust to noise and to the penalty factor than the conventional method. Evaluated on a corpus of China Central Television (CCTV) news, the proposed method obtains a 74.18% reduction in calculation time and a 22.24% improvement in F1-measure compared with the conventional approach.
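
    The classical single-Gaussian ΔBIC test that the conventional method applies between adjacent windows can be written down directly; the feature dimensionality, window sizes, and penalty weight lam below are illustrative:

        # Delta-BIC between two feature windows X and Y (rows = frames,
        # columns = feature dimensions); a positive score favours a change.
        import numpy as np

        def delta_bic(X, Y, lam=1.0):
            Z = np.vstack([X, Y])                     # pooled window
            n, nx, ny = len(Z), len(X), len(Y)
            d = Z.shape[1]
            logdet = lambda A: np.linalg.slogdet(np.cov(A, rowvar=False))[1]
            penalty = 0.5 * lam * (d + 0.5 * d * (d + 1)) * np.log(n)
            return 0.5 * (n * logdet(Z) - nx * logdet(X) - ny * logdet(Y)) - penalty

        rng = np.random.default_rng(0)
        a = rng.normal(0, 1, (200, 12))               # "speaker A" frames
        b = rng.normal(3, 1, (200, 12))               # "speaker B" frames
        print(delta_bic(a, b) > 0)                    # change detected: True
        print(delta_bic(a, rng.normal(0, 1, (200, 12))) > 0)   # same speaker: False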

  2. A climbing string method for saddle point search.

    PubMed

    Ren, Weiqing; Vanden-Eijnden, Eric

    2013-04-07

    The string method originally proposed for the computation of minimum energy paths (MEPs) is modified to find saddle points around a given minimum on a potential energy landscape using the location of this minimum as only input. In the modified method the string is evolved by gradient flow in path space, with one of its end points fixed at the minimum and the other end point (the climbing image) evolving towards a saddle point according to a modified potential force in which the component of the potential force in the tangent direction of the string is reversed. The use of a string allows us to monitor the evolution of the climbing image and prevent its escape from the basin of attraction of the minimum. This guarantees that the string always converges towards a MEP connecting the minimum to a saddle point lying on the boundary of the basin of attraction of this minimum. The convergence of the climbing image to the saddle point can also be accelerated by an inexact Newton method in the late stage of the computation. The performance of the numerical method is illustrated using the example of a 7-atom cluster on a substrate. Comparison is made with the dimer method.
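
    The modified force on the climbing image x_c described above (the tangential component of the potential force reversed) can be stated compactly in LaTeX; τ̂ denotes the unit tangent of the string at its moving end:

        F(x_c) = -\nabla V(x_c) + 2\,\bigl(\nabla V(x_c)\cdot\hat{\tau}\bigr)\,\hat{\tau}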

  3. A new method to improve network topological similarity search: applied to fold recognition

    PubMed Central

    Lhota, John; Hauptman, Ruth; Hart, Thomas; Ng, Clara; Xie, Lei

    2015-01-01

    Motivation: Similarity search is the foundation of bioinformatics. It plays a key role in establishing structural, functional and evolutionary relationships between biological sequences. Although the power of the similarity search has increased steadily in recent years, a high percentage of sequences remain uncharacterized in the protein universe. Thus, new similarity search strategies are needed to efficiently and reliably infer the structure and function of new sequences. The existing paradigm for studying protein sequence, structure, function and evolution has been established based on the assumption that the protein universe is discrete and hierarchical. Cumulative evidence suggests that the protein universe is continuous. As a result, conventional sequence homology search methods may be not able to detect novel structural, functional and evolutionary relationships between proteins from weak and noisy sequence signals. To overcome the limitations in existing similarity search methods, we propose a new algorithmic framework—Enrichment of Network Topological Similarity (ENTS)—to improve the performance of large scale similarity searches in bioinformatics. Results: We apply ENTS to a challenging unsolved problem: protein fold recognition. Our rigorous benchmark studies demonstrate that ENTS considerably outperforms state-of-the-art methods. As the concept of ENTS can be applied to any similarity metric, it may provide a general framework for similarity search on any set of biological entities, given their representation as a network. Availability and implementation: Source code freely available upon request Contact: lxie@iscb.org PMID:25717198

  4. "A Heuristic for Visual Thinking in History"

    ERIC Educational Resources Information Center

    Staley, David J.

    2007-01-01

    This article details a heuristic history teachers can use in assigning and evaluating multimedia projects in history. Using this heuristic successfully requires more than simply following the steps in the list or stages in a recipe: in many ways, it requires a reorientation in what it means to think like an historian. This article, as much as…

  5. SEARCH FOR MITOGENETIC RADIATION BY MEANS OF THE PHOTOELECTRIC METHOD

    PubMed Central

    Lorenz, Egon

    1934-01-01

    The intensity of mitogenetic radiation was estimated from data given by Gurwitsch. The sensitivity of the biological method and of the physical methods were compared. With onion-base pulp and onion roots as mitogenetic inductors, the photographic method gave no perceptible blackening for exposures up to 184 hours. A photoelectric counter tube was described with cadmium as photoelectric metal. Its sensitivity was such that a radiation intensity of 10 to 15 quanta per cm.2 per second of the Hg line 2536 A was detectable. Spurious effects produced by the counter tube were described and means for their avoidance given. A number of different biological materials, all supposed to be excellent mitogenetic radiators, were investigated by means of the counter tube. No mitogenetic radiation could be detected. PMID:19872817

  6. Beam angle optimization for intensity-modulated radiation therapy using a guided pattern search method

    NASA Astrophysics Data System (ADS)

    Rocha, Humberto; Dias, Joana M.; Ferreira, Brígida C.; Lopes, Maria C.

    2013-05-01

    Generally, the inverse planning of radiation therapy consists mainly of the fluence optimization. The beam angle optimization (BAO) in intensity-modulated radiation therapy (IMRT) consists of selecting appropriate radiation incidence directions and may influence the quality of the IMRT plans, both to enhance better organ sparing and to improve tumor coverage. However, in clinical practice, most of the time, beam directions continue to be manually selected by the treatment planner without objective and rigorous criteria. The goal of this paper is to introduce a novel approach that uses beam’s-eye-view dose ray tracing metrics within a pattern search method framework in the optimization of the highly non-convex BAO problem. Pattern search methods are derivative-free optimization methods that require a few function evaluations to progress and converge and have the ability to better avoid local entrapment. The pattern search method framework is composed of a search step and a poll step at each iteration. The poll step performs a local search in a mesh neighborhood and ensures the convergence to a local minimizer or stationary point. The search step provides the flexibility for a global search since it allows searches away from the neighborhood of the current iterate. Beam’s-eye-view dose metrics assign a score to each radiation beam direction and can be used within the pattern search framework furnishing a priori knowledge of the problem so that directions with larger dosimetric scores are tested first. A set of clinical cases of head-and-neck tumors treated at the Portuguese Institute of Oncology of Coimbra is used to discuss the potential of this approach in the optimization of the BAO problem.
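
    A generic coordinate-direction pattern search showing the poll step and mesh refinement described above; the dose-metric-guided search step of the paper is omitted, and the test function is arbitrary:

        # Opportunistic poll over a positive spanning set of directions;
        # an unsuccessful poll halves the mesh size.
        import numpy as np

        def pattern_search(f, x0, step=1.0, tol=1e-6, max_iter=10_000):
            x, fx = np.asarray(x0, float), f(x0)
            n = len(x)
            directions = np.vstack([np.eye(n), -np.eye(n)])
            for _ in range(max_iter):
                if step < tol:
                    break
                for d in directions:               # poll the mesh neighbours
                    trial = x + step * d
                    if f(trial) < fx:
                        x, fx = trial, f(trial)    # successful poll: move
                        break
                else:
                    step *= 0.5                    # unsuccessful poll: refine mesh
            return x, fx

        print(pattern_search(lambda x: (x[0] - 2)**2 + (x[1] + 1)**2, [0.0, 0.0]))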

  7. (Re)searching Methods: Reading Fiction in Literary Response Groups

    ERIC Educational Resources Information Center

    Janzen, Melanie D.

    2015-01-01

    The trouble with education research is that the research is burdened with trouble before it begins. Working as a poststructural education researcher and engaged in a recent research project that sought to engage with questions of teacher identity, I employed an alternative data elicitation method of literary response groups--similar to that of…

  8. (Re)searching Methods: Reading Fiction in Literary Response Groups

    ERIC Educational Resources Information Center

    Janzen, Melanie D.

    2015-01-01

    The trouble with education research is that the research is burdened with trouble before it begins. Working as a poststructural education researcher and engaged in a recent research project that sought to engage with questions of teacher identity, I employed an alternative data elicitation method of literary response groups--similar to that of…

  9. The method of common search direction of joint inversion

    NASA Astrophysics Data System (ADS)

    Zhao, C.; Tang, R.

    2013-12-01

    In geophysical inversion, the first step is to construct an objective function; the second is to minimize it with an optimization algorithm such as the gradient method or the conjugate gradient method. Compared with the former, the conjugate gradient method can find a better direction, making the error decrease faster, and it has long been widely used. At present, joint inversion generally uses the conjugate gradient method. The most important step in joint inversion is constructing the partial derivative matrix with respect to the different physical properties. Constraints among the different physical properties are then added to the integrated matrix, with the cross gradient used as a constraint on the joint inversion. There are two ways to apply the cross gradient in the inverse process: it can be added to the data function or to the model function. Adding it to the data function doubles the size of the partial derivative matrix and also requires computing the cross gradient on every grid cell, at considerable computational cost.
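
    The cross-gradient function the abstract refers to is, in the usual formulation (e.g., that of Gallardo and Meju), the pointwise cross product of the two model gradients; it vanishes wherever the gradients are parallel, so driving it to zero enforces structural similarity without imposing a fixed relationship between the two properties:

        % m_1, m_2: the two physical-property models on the (x, z) grid
        t(x, z) = \nabla m_1(x, z) \times \nabla m_2(x, z)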

  10. eTACTS: A Method for Dynamically Filtering Clinical Trial Search Results

    PubMed Central

    Miotto, Riccardo; Jiang, Silis; Weng, Chunhua

    2013-01-01

    Objective Information overload is a significant problem facing online clinical trial searchers. We present eTACTS, a novel interactive retrieval framework using common eligibility tags to dynamically filter clinical trial search results. Materials and Methods eTACTS mines frequent eligibility tags from free-text clinical trial eligibility criteria and uses these tags for trial indexing. After an initial search, eTACTS presents to the user a tag cloud representing the current results. When the user selects a tag, eTACTS retains only those trials containing that tag in their eligibility criteria and generates a new cloud based on tag frequency and co-occurrences in the remaining trials. The user can then select a new tag or unselect a previous tag. The process iterates until a manageable number of trials is returned. We evaluated eTACTS in terms of filtering efficiency, diversity of the search results, and user eligibility to the filtered trials using both qualitative and quantitative methods. Results eTACTS (1) rapidly reduced search results from over a thousand trials to ten; (2) highlighted trials that are generally not top-ranked by conventional search engines; and (3) retrieved a greater number of suitable trials than existing search engines. Discussion eTACTS enables intuitive clinical trial searches by indexing eligibility criteria with effective tags. User evaluation was limited to one case study and a small group of evaluators due to the long duration of the experiment. Although a larger-scale evaluation could be conducted, this feasibility study demonstrated significant advantages of eTACTS over existing clinical trial search engines. Conclusion A dynamic eligibility tag cloud can potentially enhance state-of-the-art clinical trial search engines by allowing intuitive and efficient filtering of the search result space. PMID:23916863
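
    The core filtering loop behind such a tag cloud is simple to sketch; the trial identifiers and eligibility tags below are invented:

        # Each trial carries a set of eligibility tags; selecting a tag retains
        # matching trials, and the cloud is rebuilt from what remains.
        from collections import Counter

        trials = {
            "NCT01": {"type 2 diabetes", "metformin", "age 18-65"},
            "NCT02": {"type 2 diabetes", "insulin", "pregnancy excluded"},
            "NCT03": {"hypertension", "age 18-65"},
        }

        def tag_cloud(active_trials):
            return Counter(tag for tags in active_trials.values() for tag in tags)

        def select_tag(active_trials, tag):
            return {tid: tags for tid, tags in active_trials.items() if tag in tags}

        remaining = select_tag(trials, "type 2 diabetes")
        print(sorted(remaining))                   # ['NCT01', 'NCT02']
        print(tag_cloud(remaining).most_common(3))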

  11. General heuristics algorithms for solving capacitated arc routing problem

    NASA Astrophysics Data System (ADS)

    Fadzli, Mohammad; Najwa, Nurul; Masran, Hafiz

    2015-05-01

    In this paper, we try to determine near-optimum solutions for the capacitated arc routing problem (CARP). The NP-hard CARP is a graph-based routing problem that arises from street services such as residential waste collection and road maintenance. The purpose of the CARP model and its solution techniques is to find the optimum (or near-optimum) routing cost for the fleet of vehicles involved in the operation. In other words, finding minimum-cost routing is essential in order to reduce the overall vehicle-related operating cost. In this article, we provide a combination of heuristic algorithms to solve a real-life waste-collection CARP as well as benchmark instances. These heuristics work as a central engine for finding initial or near-optimum solutions in the search space without violating the preset constraints. The results clearly show that these heuristic algorithms provide good initial solutions in both real-life and benchmark instances.

  12. Heuristic evaluation on mobile interfaces: a new checklist.

    PubMed

    Yáñez Gómez, Rosa; Cascado Caballero, Daniel; Sevillano, José-Luis

    2014-01-01

    The rapid evolution and adoption of mobile devices raise new usability challenges, given their limitations (in screen size, battery life, etc.) as well as the specific requirements of this new interaction. Traditional evaluation techniques need to be adapted in order for these requirements to be met. Heuristic evaluation (HE), an Inspection Method based on evaluation conducted by experts over a real system or prototype, is based on checklists which are desktop-centred and do not adequately detect mobile-specific usability issues. In this paper, we propose a compilation of heuristic evaluation checklists taken from the existing bibliography but readapted to new mobile interfaces. Selecting and rearranging these heuristic guidelines offer a tool which works well not just for evaluation but also as a best-practices checklist. The result is a comprehensive checklist which is experimentally evaluated as a design tool. This experimental evaluation involved two software engineers without any specific knowledge about usability, a group of ten users who compared the usability of a first prototype designed without our heuristics, and a second one after applying the proposed checklist. The results of this experiment show the usefulness of the proposed checklist for avoiding usability gaps even with nontrained developers.

  13. Heuristic Evaluation on Mobile Interfaces: A New Checklist

    PubMed Central

    Yáñez Gómez, Rosa; Cascado Caballero, Daniel; Sevillano, José-Luis

    2014-01-01

    The rapid evolution and adoption of mobile devices raise new usability challenges, given their limitations (in screen size, battery life, etc.) as well as the specific requirements of this new interaction. Traditional evaluation techniques need to be adapted in order for these requirements to be met. Heuristic evaluation (HE), an Inspection Method based on evaluation conducted by experts over a real system or prototype, is based on checklists which are desktop-centred and do not adequately detect mobile-specific usability issues. In this paper, we propose a compilation of heuristic evaluation checklists taken from the existing bibliography but readapted to new mobile interfaces. Selecting and rearranging these heuristic guidelines offer a tool which works well not just for evaluation but also as a best-practices checklist. The result is a comprehensive checklist which is experimentally evaluated as a design tool. This experimental evaluation involved two software engineers without any specific knowledge about usability, a group of ten users who compared the usability of a first prototype designed without our heuristics, and a second one after applying the proposed checklist. The results of this experiment show the usefulness of the proposed checklist for avoiding usability gaps even with nontrained developers. PMID:25295300

  14. A new type of descent conjugate gradient method with exact line search

    NASA Astrophysics Data System (ADS)

    Hajar, Nurul; Mamat, Mustafa; Rivaie, Mohd.; Jusoh, Ibrahim

    2016-06-01

    Nowadays, conjugate gradient (CG) methods are attractive for solving nonlinear unconstrained optimization problems. In this paper, a new CG method is proposed and analyzed. The new method satisfies the descent condition, and its global convergence is established using the exact line search. Numerical results show that the new method substantially outperforms previous CG methods; it is robust and efficient, providing faster and more stable convergence.

  15. Track Association Performance of the Best Hypotheses Search Method

    NASA Astrophysics Data System (ADS)

    Siminski, J. A.; Fiedler, H.; Schildknecht, T.

    2013-08-01

    Uncontrolled space objects in the geostationary orbit domain pose a hazardous threat to active satellites. Catalogs need to be built up in order to protect this precious domain. The Swiss ZimSMART telescope, located in Zimmerwald, regularly scans the geostationary ring in order to provide homogeneous coverage. This surveying technique typically yields short measurement arcs, called tracklets. Each tracklet provides information about the line-of-sight direction and its rates of change, but typically not about the full state of the observed object. Computationally intensive multi-hypothesis filter methods have been developed to associate tracklets with each other. An effective implementation of this approach is presented that uses an optimization algorithm to reduce the number of initial hypotheses. The method is tested with a set of real measurements from the aforementioned telescope.

  16. Mesh Adaptive Direct Search Methods for Constrained Nonsmooth Optimization

    DTIC Science & Technology

    2012-02-24

    Only a fragment of the project's publication list survives in this record; the recoverable entries are: […] L.A. Leclaire, "Snow water equivalent estimation using blackbox optimization", technical report, Les Cahiers du GERAD G-2011-09, 2011; […], technical report, Les Cahiers du GERAD G-2011-03, 2011, to appear in Optimization Letters; 3. C. Audet, J.E. Dennis and S. Le Digabel, "Trade-off studies in blackbox optimization", technical report, Les Cahiers du GERAD G-2010-49, 2010, to appear in Optimization Methods and Software; 4. C. Audet, and […]

  17. Applications of Principled Search Methods in Climate Influences and Mechanisms

    NASA Technical Reports Server (NTRS)

    Glymour, Clark

    2005-01-01

    Forest and grass fires cause economic losses in the billions of dollars in the U.S. alone. In addition, boreal forests constitute a large carbon store; it has been estimated that, were no burning to occur, an additional 7 gigatons of carbon would be sequestered in boreal soils each century. Effective wildfire suppression requires anticipation of the locales and times for which wildfire is most probable, preferably with a two- to four-week forecast, so that limited resources can be efficiently deployed. The United States Forest Service (USFS) and other experts and agencies have developed several measures of fire risk combining physical principles and expert judgment, and have used them in automated procedures for forecasting fire risk. Forecasting accuracies for some fire risk indices in combination with climate and other variables have been estimated for specific locations, with the value of fire risk index variables assessed by their statistical significance in regressions. In other cases, for example the MAPSS forecasts [23, 24], forecasting accuracy has been estimated only with simulated data. We describe alternative forecasting methods that predict fire probability by locale and time using statistical or machine learning procedures trained on historical data, and we give comparative assessments of their forecasting accuracy for one fire season, April-October 2003, for all U.S. Forest Service lands. Aside from providing an accuracy baseline for other forecasting methods, the results illustrate the interdependence between the statistical significance of prediction variables and the forecasting method used.

  18. Display format, highlight validity, and highlight method: Their effects on search performance

    NASA Technical Reports Server (NTRS)

    Donner, Kimberly A.; Mckay, Tim D.; Obrien, Kevin M.; Rudisill, Marianne

    1991-01-01

    Display format and highlight validity were shown to affect visual display search performance; however, these studies were conducted on small, artificial displays of alphanumeric stimuli. A study manipulating these variables was conducted using realistic, complex Space Shuttle information displays. A 2x2x3 within-subjects analysis of variance found that search times were faster for items in reformatted displays than in current displays. Responses to validly highlighted items were significantly faster than responses to non-highlighted or invalidly highlighted items. The significant format by highlight validity interaction showed that there was little difference in response time between current and reformatted displays when valid highlighting was applied; however, under the non-highlight or invalid-highlight conditions, search times were faster with reformatted displays. A separate within-subjects analysis of variance of display format, highlight validity, and several highlight methods did not reveal a main effect of highlight method. In addition, observed display search times were compared to search times predicted by Tullis' Display Analysis Program. The benefits of highlighting and reformatting displays to enhance search, and the necessity of considering highlight validity and format characteristics in tandem when predicting search performance, are discussed.

  19. A New Method for Searching for Free Fractional Charge Particles in Bulk Matter

    SciTech Connect

    Perl, Martin

    1999-11-10

    We present a new experimental method for searching for free fractional charge in bulk matter; this new method derives from the traditional Millikan liquid drop method, but allows the use of much larger drops, 20 to 100 µm in diameter, compared to the traditional method that uses drops less than 15 µm in diameter. These larger drops provide the substantial advantage that it is then much easier to consistently generate drops containing liquid suspensions of powdered meteorites and other special minerals. These materials are of great importance in bulk searches for fractional charge particles that may have been produced in the early universe.

  20. An Empirical Study in the Simulation of Heuristic Error Behavior.

    DTIC Science & Technology

    1986-01-01

    In place of an abstract, this record contains only a garbled fragment of Pascal-like source code for the study's search machinery (a graph-descriptor initialization procedure, node and search-tree declarations, and heuristic frequency-profiling arrays).

  1. Feature selection method based on multi-fractal dimension and harmony search algorithm and its application

    NASA Astrophysics Data System (ADS)

    Zhang, Chen; Ni, Zhiwei; Ni, Liping; Tang, Na

    2016-10-01

    Feature selection is an important method of data preprocessing in data mining. In this paper, a novel feature selection method based on multi-fractal dimension and the harmony search algorithm is proposed. Multi-fractal dimension is adopted as the evaluation criterion of a feature subset, which can determine the number of selected features. An improved harmony search algorithm is used as the search strategy to improve the efficiency of feature selection. The performance of the proposed method is compared with that of other feature selection algorithms on UCI datasets. The proposed method is also used to predict the daily average concentration of PM2.5 in China. Experimental results show that the proposed method obtains competitive results in terms of both prediction accuracy and the number of selected features.
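
    As an illustration of the search strategy, here is a minimal binary harmony search sketch for feature selection. The `evaluate` criterion is a toy stand-in for the paper's multi-fractal-dimension measure, and the parameters (hms, hmcr, par) follow common harmony search usage rather than the authors' improved variant:

    ```python
    import random

    def harmony_search_features(n_feat, evaluate, hms=10, hmcr=0.9, par=0.3, iters=200):
        """Binary harmony search for feature selection (generic sketch).
        `evaluate(subset)` scores a boolean feature mask; higher is better."""
        memory = [[random.random() < 0.5 for _ in range(n_feat)] for _ in range(hms)]
        scores = [evaluate(h) for h in memory]
        for _ in range(iters):
            new = []
            for j in range(n_feat):
                if random.random() < hmcr:            # memory consideration
                    bit = random.choice(memory)[j]
                    if random.random() < par:         # pitch adjustment: flip bit
                        bit = not bit
                else:                                 # random consideration
                    bit = random.random() < 0.5
                new.append(bit)
            s = evaluate(new)
            worst = min(range(hms), key=lambda i: scores[i])
            if s > scores[worst]:                     # replace the worst harmony
                memory[worst], scores[worst] = new, s
        best = max(range(hms), key=lambda i: scores[i])
        return memory[best], scores[best]

    # Toy criterion: reward selecting the first three features, penalize size.
    score = lambda sel: sum(sel[:3]) - 0.1 * sum(sel)
    subset, val = harmony_search_features(8, score)
    print([i for i, b in enumerate(subset) if b], val)
    ```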

  2. Search-based optimization.

    PubMed

    Wheeler, Ward C

    2003-08-01

    The problem of determining the minimum cost hypothetical ancestral sequences for a given cladogram is known to be NP-complete (Wang and Jiang, 1994). Traditionally, point estimations of hypothetical ancestral sequences have been used to gain heuristic upper bounds on cladogram cost. These include procedures with such diverse approaches as non-additive optimization of multiple sequence alignment, direct optimization (Wheeler, 1996), and fixed-state character optimization (Wheeler, 1999). A method is proposed here which, by extending fixed-state character optimization, replaces the estimation process with a search. This form of optimization examines a diversity of potential state solutions for cost-efficient hypothetical ancestral sequences and can result in greatly more parsimonious cladograms. Additionally, such an approach can be applied to other NP-complete phylogenetic optimization problems such as genomic break-point analysis. © 2003 The Willi Hennig Society. Published by Elsevier Science (USA). All rights reserved.

  3. Search-based optimization

    NASA Technical Reports Server (NTRS)

    Wheeler, Ward C.

    2003-01-01

    The problem of determining the minimum cost hypothetical ancestral sequences for a given cladogram is known to be NP-complete (Wang and Jiang, 1994). Traditionally, point estimations of hypothetical ancestral sequences have been used to gain heuristic upper bounds on cladogram cost. These include procedures with such diverse approaches as non-additive optimization of multiple sequence alignment, direct optimization (Wheeler, 1996), and fixed-state character optimization (Wheeler, 1999). A method is proposed here which, by extending fixed-state character optimization, replaces the estimation process with a search. This form of optimization examines a diversity of potential state solutions for cost-efficient hypothetical ancestral sequences and can result in greatly more parsimonious cladograms. Additionally, such an approach can be applied to other NP-complete phylogenetic optimization problems such as genomic break-point analysis. © 2003 The Willi Hennig Society. Published by Elsevier Science (USA). All rights reserved.

  5. eTACTS: a method for dynamically filtering clinical trial search results.

    PubMed

    Miotto, Riccardo; Jiang, Silis; Weng, Chunhua

    2013-12-01

    Information overload is a significant problem facing online clinical trial searchers. We present eTACTS, a novel interactive retrieval framework using common eligibility tags to dynamically filter clinical trial search results. eTACTS mines frequent eligibility tags from free-text clinical trial eligibility criteria and uses these tags for trial indexing. After an initial search, eTACTS presents to the user a tag cloud representing the current results. When the user selects a tag, eTACTS retains only those trials containing that tag in their eligibility criteria and generates a new cloud based on tag frequency and co-occurrences in the remaining trials. The user can then select a new tag or unselect a previous tag. The process iterates until a manageable number of trials is returned. We evaluated eTACTS in terms of filtering efficiency, diversity of the search results, and user eligibility to the filtered trials using both qualitative and quantitative methods. eTACTS (1) rapidly reduced search results from over a thousand trials to ten; (2) highlighted trials that are generally not top-ranked by conventional search engines; and (3) retrieved a greater number of suitable trials than existing search engines. eTACTS enables intuitive clinical trial searches by indexing eligibility criteria with effective tags. User evaluation was limited to one case study and a small group of evaluators due to the long duration of the experiment. Although a larger-scale evaluation could be conducted, this feasibility study demonstrated significant advantages of eTACTS over existing clinical trial search engines. A dynamic eligibility tag cloud can potentially enhance state-of-the-art clinical trial search engines by allowing intuitive and efficient filtering of the search result space. Copyright © 2013 The Authors. Published by Elsevier Inc. All rights reserved.
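
    The core filtering loop is easy to sketch. In this hypothetical Python fragment the trial records and tags are invented for illustration; real eTACTS mines its tags from free-text eligibility criteria:

    ```python
    from collections import Counter

    def tag_cloud(trials):
        """Count eligibility-tag frequencies over the current result set."""
        return Counter(tag for t in trials for tag in t["tags"])

    def filter_by_tag(trials, tag):
        """Retain only trials whose eligibility criteria contain the tag."""
        return [t for t in trials if tag in t["tags"]]

    # Hypothetical trial records with pre-mined eligibility tags.
    trials = [
        {"id": "NCT001", "tags": {"diabetes", "age>=18", "nonsmoker"}},
        {"id": "NCT002", "tags": {"diabetes", "age>=18"}},
        {"id": "NCT003", "tags": {"hypertension", "age>=18"}},
    ]
    results = trials
    for selected in ["age>=18", "diabetes"]:   # user clicks tags iteratively
        results = filter_by_tag(results, selected)
        print(selected, "->", [t["id"] for t in results], tag_cloud(results))
    ```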

  6. A review of the scientific rationale and methods used in the search for other planetary systems

    NASA Technical Reports Server (NTRS)

    Black, D. C.

    1985-01-01

    Planetary systems appear to be one of the crucial links in the chain leading from simple molecules to living systems, particularly complex (intelligent?) living systems. Although there is currently no observational proof of the existence of any planetary system other than our own, techniques are now being developed which will permit a comprehensive search for other planetary systems. The scientific rationale for and methods used in such a search effort are reviewed here.

  8. Cooperative system and method using mobile robots for testing a cooperative search controller

    DOEpatents

    Byrne, Raymond H.; Harrington, John J.; Eskridge, Steven E.; Hurtado, John E.

    2002-01-01

    A test system for testing a controller provides a way to use large numbers of miniature mobile robots to test a cooperative search controller in a test area, where each mobile robot has a sensor, a communication device, a processor, and a memory. A method of using a test system provides a way for testing a cooperative search controller using multiple robots sharing information and communicating over a communication network.

  9. Portfolios in Stochastic Local Search: Efficiently Computing Most Probable Explanations in Bayesian Networks

    NASA Technical Reports Server (NTRS)

    Mengshoel, Ole J.; Roth, Dan; Wilkins, David C.

    2001-01-01

    Portfolio methods support the combination of different algorithms and heuristics, including stochastic local search (SLS) heuristics, and have been identified as a promising approach to solve computationally hard problems. While successful in experiments, theoretical foundations and analytical results for portfolio-based SLS heuristics are less developed. This article aims to improve the understanding of the role of portfolios of heuristics in SLS. We emphasize the problem of computing most probable explanations (MPEs) in Bayesian networks (BNs). Algorithmically, we discuss a portfolio-based SLS algorithm for MPE computation, Stochastic Greedy Search (SGS). SGS supports the integration of different initialization operators (or initialization heuristics) and different search operators (greedy and noisy heuristics), thereby enabling new analytical and experimental results. Analytically, we introduce a novel Markov chain model tailored to portfolio-based SLS algorithms including SGS, thereby enabling us to analytically derive expected hitting time results that explain empirical run time results. For a specific BN, we show the benefit of using a homogeneous initialization portfolio. To further illustrate the portfolio approach, we consider novel additive search heuristics for handling determinism in the form of zero entries in conditional probability tables in BNs. Our additive approach adds rather than multiplies probabilities when computing the utility of an explanation. We motivate the additive measure by studying the dramatic impact of zero entries in conditional probability tables on the number of zero-probability explanations, which again complicates the search process. We consider the relationship between MAXSAT and MPE, and show that additive utility (or gain) is a generalization, to the probabilistic setting, of the MAXSAT utility (or gain) used in the celebrated GSAT and WalkSAT algorithms and their descendants. Utilizing our Markov chain framework, we show that
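
    The additive utility idea can be shown in a few lines: when a conditional probability table contains zeros, the multiplicative MPE utility collapses to zero and gives local search no gradient, whereas an additive gain (in the spirit of MAXSAT clause counting) still grades partial explanations. A toy sketch, not the SGS implementation:

    ```python
    def multiplicative_utility(probs):
        """Standard MPE utility: product of CPT entries; one zero kills it."""
        u = 1.0
        for p in probs:
            u *= p
        return u

    def additive_utility(probs):
        """Additive gain in the spirit of the article: sum rather than multiply,
        so explanations touching a few zero entries still get graded credit,
        analogous to counting satisfied clauses in MAXSAT."""
        return sum(probs)

    cpt_entries = [0.9, 0.8, 0.0, 0.7]          # one deterministic zero entry
    print(multiplicative_utility(cpt_entries))  # 0.0 -- search gets no signal
    print(additive_utility(cpt_entries))        # 2.4 -- partial credit guides SLS
    ```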

  10. Instruments and methods to search for extraterrestrial life

    NASA Astrophysics Data System (ADS)

    Hoover, Richard B.

    2015-09-01

    Is life restricted to the planet Earth, or does life exist elsewhere in the Cosmos? The existence of extraterrestrial life is the fundamental question of Astrobiology. Detecting evidence for living organisms beyond our planet is even more difficult than finding fossilized remains of ancient organisms. Microbiological investigations during the past century have established the fundamental physical and chemical requirements and limits for life on Earth. It is now known that life requires only water, a source of energy, and a small suite of biogenic elements under a surprisingly wide range of environmental conditions. The discovery that microbial extremophiles live and grow over a very broad span of temperature, pH, salinity, pressure and radiation levels has greatly enhanced the possibility that life may be present on many bodies of our Solar System. Recent discoveries by Space Missions and Rovers have invalidated many long held paradigms regarding the distribution of water, organic chemicals and the possibility of life elsewhere in the Cosmos. This paper considers the discovery of water, ice and organics on distant planets, moons and comets and evidence for fossil organisms on Mars and in SNC and carbonaceous meteorites. Instruments and methods are considered for spectroscopy and fluorescence of biomolecules (e.g., photosynthetic pigments) for remote detection of conclusive evidence for extraterrestrial life. Optical Video Microscopy is discussed as a direct means for detecting extraterrestrial life using small visible light/UV video microscopes, with ample magnification to record motile bacteria and other living organisms in samples collected by Rovers or Landers. Locomotion of living cells of bacteria and other microbes requires great expenditure of energy, and motile cells can be distinguished by video microscopy from the physico-chemical movements (by Brownian Motion, Diffusion or Current Drift) of dead cells, dust particles and abiotic mineral grains.

  11. Low-Mode Conformational Search Method with Semiempirical Quantum Mechanical Calculations: Application to Enantioselective Organocatalysis.

    PubMed

    Kamachi, Takashi; Yoshizawa, Kazunari

    2016-02-22

    A conformational search program for finding low-energy conformations of large noncovalent complexes has been developed. A quantitatively reliable semiempirical quantum mechanical PM6-DH+ method, which is able to accurately describe noncovalent interactions at a low computational cost, was employed, in contrast to conventional conformational search programs in which molecular mechanical methods are usually adopted. Our approach is based on the low-mode method, whereby an initial structure is perturbed along one of its low-mode eigenvectors to generate new conformations. This method was applied to determine the most stable conformation of the transition state for enantioselective alkylation by the Maruoka and cinchona alkaloid catalysts and for Hantzsch ester hydrogenation of imines by chiral phosphoric acid. Besides successfully reproducing the previously reported most stable DFT conformations, the conformational search with the semiempirical quantum mechanical calculations newly discovered a more stable conformation at a low computational cost.

  12. Exploring Heuristics for the Vehicle Routing Problem with Split Deliveries and Time Windows

    DTIC Science & Technology

    2014-09-18

    This research introduces an ant colony metaheuristic coupled with a local search heuristic, embedded within a dynamic program, seeking to solve a vehicle routing problem with split deliveries and time windows. (Only table-of-contents fragments on Ant Colony Optimization and the MAX-MIN Ant System survive in the remainder of this record.)

  13. Methods and means used in programming intelligent searches of technical documents

    NASA Technical Reports Server (NTRS)

    Gross, David L.

    1993-01-01

    In order to meet the data research requirements of the Safety, Reliability & Quality Assurance activities at Kennedy Space Center (KSC), a new computer search method for technical data documents was developed. By their very nature, technical documents are partially encrypted because of the authors' use of acronyms, abbreviations, and shortcut notations. This problem of computerized searching is compounded at KSC by the volume of documentation that is produced during normal Space Shuttle operations. The Centralized Document Database (CDD) is designed to solve this problem. It provides a common interface to an unlimited number of files of various sizes, with the capability to perform diversified types and levels of data searches. The heart of the CDD is the nature and capability of its search algorithms. The most complex form of search the program performs makes use of a domain-specific database of acronyms, abbreviations, synonyms, and word-frequency tables. This database, along with basic sentence parsing, is used to convert a request for information into a relational network. This network is used as a filter on the original document file to determine the most likely locations for the data requested. This type of search will locate information that traditional techniques (i.e., Boolean-structured keyword searching) would not find.
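
    A simplified sketch of the idea follows; the synonym table and documents are hypothetical, and the scoring is a crude stand-in for the CDD's relational-network filter:

    ```python
    import re

    # Hypothetical domain dictionary; the CDD's real tables are KSC-specific.
    SYNONYMS = {
        "srb": {"srb", "solid rocket booster"},
        "solid rocket booster": {"srb", "solid rocket booster"},
        "anomaly": {"anomaly", "discrepancy", "problem report"},
    }

    def expand_query(query):
        """Expand each query term into its acronym/synonym group."""
        return [SYNONYMS.get(t, {t}) for t in query.lower().split()]

    def score_document(doc, expanded):
        """Score a document by how many expanded term groups it matches."""
        text = doc.lower()
        return sum(any(re.search(re.escape(v), text) for v in group)
                   for group in expanded)

    docs = ["Solid rocket booster discrepancy noted during stacking.",
            "Routine payload checkout, no anomaly."]
    q = expand_query("srb anomaly")
    print(sorted(docs, key=lambda d: -score_document(d, q))[0])
    ```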

  14. Signal search analysis: a new method to localize and characterize functionally important DNA sequences.

    PubMed Central

    Bucher, P; Bryan, B

    1984-01-01

    The generation of "signal search data" represents a general method of describing the common properties of a set of DNA sequences presumed to be functionally analogous. Besides the detailed description of this method we present two computer programs which use signal search data as input data: One that processes them to a "constraint profile" and another one which lists over-represented "signals" of potential functional relevance. To illustrate the possibilities of our method we have analysed a set of transcription initiation sites of sea urchin histone genes. PMID:6546421

  15. Deterministic algorithm with agglomerative heuristic for location problems

    NASA Astrophysics Data System (ADS)

    Kazakovtsev, L.; Stupina, A.

    2015-10-01

    The authors consider the clustering problem solved with the k-means method and the p-median problem with various distance metrics. The p-median problem, with the k-means problem as its special case, is among the most popular models of location theory. These models are used for solving clustering problems and many practically important logistics problems, such as optimal locations for factories or warehouses, oil or gas wells, offshore oil drilling, and steam generators in heavy oil fields. The authors propose a new deterministic heuristic algorithm based on ideas from Information Bottleneck Clustering and genetic algorithms with a greedy heuristic. In this paper, results of running the new algorithm on various data sets are given in comparison with known deterministic and stochastic methods. The new algorithm is shown to be significantly faster than the Information Bottleneck Clustering method while achieving comparable precision.
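
    For contrast with the paper's algorithm, a simple greedy baseline for the p-median problem (repeatedly open the facility that most reduces total assignment distance) can be sketched as follows; this is a generic heuristic, not the authors' agglomerative method:

    ```python
    import numpy as np

    def greedy_p_median(points, p):
        """Greedy p-median heuristic: facilities are chosen among the data
        points; at each step open the one that most reduces the sum of
        distances from every point to its nearest open facility."""
        n = len(points)
        dist = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=2)
        chosen = []
        best_assign = np.full(n, np.inf)     # current nearest-facility distances
        for _ in range(p):
            costs = [np.minimum(best_assign, dist[j]).sum() for j in range(n)]
            j = int(np.argmin(costs))        # facility with the best improvement
            chosen.append(j)
            best_assign = np.minimum(best_assign, dist[j])
        return chosen, best_assign.sum()

    rng = np.random.default_rng(0)
    pts = rng.normal(size=(100, 2))
    medians, cost = greedy_p_median(pts, p=3)
    print(medians, round(cost, 2))
    ```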

  16. Collaborative Relevance Judgment: A Group Consensus Method for Evaluating User Search Performance.

    ERIC Educational Resources Information Center

    Zhang, Xiangmin

    2002-01-01

    Discusses relevance judgments in information retrieval; considers the collaborative nature of information retrieval in a group, organization, or societal context; and proposes a method that measures relevance based on group/peer consensus. Reports results of an experiment using this method to compare the search performance of different types of…

  17. Convergence of the standard RLS method and UDUT factorisation of covariance matrix for solving the algebraic Riccati equation of the DLQR via heuristic approximate dynamic programming

    NASA Astrophysics Data System (ADS)

    Moraes Rêgo, Patrícia Helena; Viana da Fonseca Neto, João; Ferreira, Ernesto M.

    2015-08-01

    The main focus of this article is to present a proposal to solve, via UDUT factorisation, the convergence and numerical stability problems that are related to the covariance matrix ill-conditioning of the recursive least squares (RLS) approach for online approximations of the algebraic Riccati equation (ARE) solution associated with the discrete linear quadratic regulator (DLQR) problem formulated in the actor-critic reinforcement learning and approximate dynamic programming context. The parameterisations of the Bellman equation, utility function and dynamic system, as well as the algebra of the Kronecker product, assemble a framework for the solution of the DLQR problem. The condition number and the positivity parameter of the covariance matrix are associated with statistical metrics for evaluating the approximation performance of the ARE solution via RLS-based estimators. The performance of RLS approximators is also evaluated in terms of consistence and polarisation when associated with reinforcement learning methods. The methodology contemplates realisations of online designs for DLQR controllers, which are evaluated in a multivariable dynamic system model.

  18. A set-covering based heuristic algorithm for the periodic vehicle routing problem.

    PubMed

    Cacchiani, V; Hemmelmayr, V C; Tricoire, F

    2014-01-30

    We present a hybrid optimization algorithm for mixed-integer linear programming, embedding both heuristic and exact components. In order to validate it we use the periodic vehicle routing problem (PVRP) as a case study. This problem consists of determining a set of minimum cost routes for each day of a given planning horizon, with the constraints that each customer must be visited a required number of times (chosen among a set of valid day combinations), must receive every time the required quantity of product, and that the number of routes per day (each respecting the capacity of the vehicle) does not exceed the total number of available vehicles. This is a generalization of the well-known vehicle routing problem (VRP). Our algorithm is based on the linear programming (LP) relaxation of a set-covering-like integer linear programming formulation of the problem, with additional constraints. The LP-relaxation is solved by column generation, where columns are generated heuristically by an iterated local search algorithm. The whole solution method takes advantage of the LP-solution and applies techniques of fixing and releasing of the columns as a local search, making use of a tabu list to avoid cycling. We show the results of the proposed algorithm on benchmark instances from the literature and compare them to the state-of-the-art algorithms, showing the effectiveness of our approach in producing good quality solutions. In addition, we report the results on realistic instances of the PVRP introduced in Pacheco et al. (2011)  [24] and on benchmark instances of the periodic traveling salesman problem (PTSP), showing the efficacy of the proposed algorithm on these as well. Finally, we report the new best known solutions found for all the tested problems.

  19. A set-covering based heuristic algorithm for the periodic vehicle routing problem

    PubMed Central

    Cacchiani, V.; Hemmelmayr, V.C.; Tricoire, F.

    2014-01-01

    We present a hybrid optimization algorithm for mixed-integer linear programming, embedding both heuristic and exact components. In order to validate it we use the periodic vehicle routing problem (PVRP) as a case study. This problem consists of determining a set of minimum cost routes for each day of a given planning horizon, with the constraints that each customer must be visited a required number of times (chosen among a set of valid day combinations), must receive every time the required quantity of product, and that the number of routes per day (each respecting the capacity of the vehicle) does not exceed the total number of available vehicles. This is a generalization of the well-known vehicle routing problem (VRP). Our algorithm is based on the linear programming (LP) relaxation of a set-covering-like integer linear programming formulation of the problem, with additional constraints. The LP-relaxation is solved by column generation, where columns are generated heuristically by an iterated local search algorithm. The whole solution method takes advantage of the LP-solution and applies techniques of fixing and releasing of the columns as a local search, making use of a tabu list to avoid cycling. We show the results of the proposed algorithm on benchmark instances from the literature and compare them to the state-of-the-art algorithms, showing the effectiveness of our approach in producing good quality solutions. In addition, we report the results on realistic instances of the PVRP introduced in Pacheco et al. (2011)  [24] and on benchmark instances of the periodic traveling salesman problem (PTSP), showing the efficacy of the proposed algorithm on these as well. Finally, we report the new best known solutions found for all the tested problems. PMID:24748696

  20. A three-term conjugate gradient method under the strong-Wolfe line search

    NASA Astrophysics Data System (ADS)

    Khadijah, Wan; Rivaie, Mohd; Mamat, Mustafa

    2017-08-01

    Recently, numerous studies have been concerned with conjugate gradient methods for solving large-scale unconstrained optimization problems. In this paper, a three-term conjugate gradient method for unconstrained optimization is proposed which always satisfies the sufficient descent condition, named the Three-Term Rivaie-Mustafa-Ismail-Leong (TTRMIL) method. Under standard conditions, the TTRMIL method is proved to be globally convergent under the strong-Wolfe line search. Finally, numerical results are provided for the purpose of comparison.
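
    The strong-Wolfe conditions used in the convergence proof are straightforward to check numerically. A small sketch, generic and independent of the TTRMIL direction formula, with commonly used (assumed) constants c1 and c2:

    ```python
    import numpy as np

    def satisfies_strong_wolfe(f, grad, x, d, alpha, c1=1e-4, c2=0.1):
        """Check the strong-Wolfe conditions for step alpha along direction d:
          sufficient decrease: f(x + a d) <= f(x) + c1 * a * g(x)^T d
          curvature:          |g(x + a d)^T d| <= c2 * |g(x)^T d|
        with 0 < c1 < c2 < 1."""
        g0_d = grad(x) @ d
        armijo = f(x + alpha * d) <= f(x) + c1 * alpha * g0_d
        curvature = abs(grad(x + alpha * d) @ d) <= c2 * abs(g0_d)
        return armijo and curvature

    f = lambda x: x @ x                   # simple quadratic test function
    grad = lambda x: 2 * x
    x = np.array([1.0, -2.0])
    d = -grad(x)                          # steepest-descent direction
    print(satisfies_strong_wolfe(f, grad, x, d, alpha=0.5))  # True: exact minimizer along d
    ```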

  1. Dual-mode nested search method for categorical uncertain multi-objective optimization

    NASA Astrophysics Data System (ADS)

    Tang, Long; Wang, Hu

    2016-10-01

    Categorical multi-objective optimization is an important issue involved in many matching design problems. Non-numerical variables and their uncertainty are the major challenges of such optimizations. Therefore, this article proposes a dual-mode nested search (DMNS) method. In the outer layer, kriging metamodels are established using standard regular simplex mapping (SRSM) from categorical candidates to numerical values. Assisted by the metamodels, a k-cluster-based intelligent sampling strategy is developed to search Pareto frontier points. The inner layer uses an interval number method to model the uncertainty of categorical candidates. To improve the efficiency, a multi-feature convergent optimization via most-promising-area stochastic search (MFCOMPASS) is proposed to determine the bounds of objectives. Finally, typical numerical examples are employed to demonstrate the effectiveness of the proposed DMNS method.

  2. BetaSearch: a new method for querying β-residue motifs

    PubMed Central

    2012-01-01

    Background: Searching for structural motifs across known protein structures can be useful for identifying unrelated proteins with similar function and characterising secondary structures such as β-sheets. This is infeasible using conventional sequence alignment because linear protein sequences do not contain spatial information. β-residue motifs are β-sheet substructures that can be represented as graphs and queried using existing graph indexing methods; however, these approaches are designed for general graphs that do not incorporate the inherent structural constraints of β-sheets and require computationally expensive filtering and verification procedures. 3D substructure search methods, on the other hand, allow β-residue motifs to be queried in a three-dimensional context but at significant computational cost. Findings: We developed a new method for querying β-residue motifs, called BetaSearch, which leverages the natural planar constraints of β-sheets by indexing them as 2D matrices, thus avoiding much of the computational complexity involved with structural and graph querying. BetaSearch exhibits faster filtering, verification, and overall query times than existing graph indexing approaches whilst producing comparable index sizes. Compared to 3D substructure search methods, BetaSearch achieves 33 and 240 times speedups over index-based and pairwise alignment-based approaches, respectively. Furthermore, we have presented case studies to demonstrate its capability of motif matching in sequentially dissimilar proteins and described a method for using BetaSearch to predict β-strand pairing. Conclusions: We have demonstrated that BetaSearch is a fast method for querying substructure motifs. The improvements in speed over existing approaches make it useful for efficiently performing high-volume exploratory querying of possible protein substructural motifs or conformations. BetaSearch was used to identify a nearly identical β-residue motif between an entirely

  3. Mixed Integer Programming and Heuristic Scheduling for Space Communication

    NASA Technical Reports Server (NTRS)

    Lee, Charles H.; Cheung, Kar-Ming

    2013-01-01

    Optimal planning and scheduling were developed for a communication network in which the nodes communicate at the highest possible rates while meeting mission requirements and operational constraints. The planning and scheduling problem was formulated in the framework of Mixed Integer Programming (MIP); a special penalty function was introduced to convert the MIP problem into a continuous optimization problem, which was then solved using heuristic optimization. The communication network consists of space and ground assets, with the link dynamics between any two assets varying with respect to time, distance, and telecom configurations. One asset could be communicating with another at very high data rates at one time, while at other times communication is impossible, as the asset could be inaccessible from the network due to planetary occultation. Based on the network's geometric dynamics and link capabilities, the start time, end time, and link configuration of each view period are selected to maximize the communication efficiency within the network. Mathematical formulations for the constrained mixed-integer optimization problem were derived, and efficient analytical and numerical techniques were developed to find the optimal solution. By setting up the problem using MIP, the search space for the optimization problem is reduced significantly, thereby speeding up the solution process. The ratio of the dimension of the traditional method over the proposed formulation is approximately an order of N (single) to 2*N (arraying), where N is the number of receiving antennas of a node. By introducing a special penalty function, the MIP problem with a non-differentiable cost function and nonlinear constraints can be converted into a continuous-variable problem, whose solution is possible.
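
    The penalty conversion can be illustrated on a toy allocation problem. The sin^2 integrality penalty below is one common choice, assumed here for illustration; the abstract does not specify the authors' special penalty function:

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def integrality_penalty(x):
        """A sin^2 penalty that vanishes exactly at integer points -- one common
        way (an assumption here) to relax integer variables into continuous ones."""
        return np.sum(np.sin(np.pi * x) ** 2)

    def objective(x, mu=50.0):
        # Toy link-allocation problem: maximize 3*x0 + 2*x1 with x0 + x1 <= 1
        # and x binary; the constraint and integrality enter as penalties.
        cost = -(3 * x[0] + 2 * x[1])
        constraint = max(0.0, x[0] + x[1] - 1.0) ** 2
        return cost + mu * (constraint + integrality_penalty(x))

    res = minimize(objective, x0=np.array([0.4, 0.6]), method="Nelder-Mead")
    print(np.round(res.x, 3))  # settles at a near-binary feasible point, e.g. [0., 1.]
    ```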

  4. Applying heuristic inquiry to nurse migration from the UK to Australia.

    PubMed

    Vafeas, Caroline; Hendricks, Joyce

    2017-01-23

    Background: Heuristic inquiry is a research approach that improves understanding of the essence of an experience. This qualitative method relies on researchers' ability to discover and interpret their own experience while exploring those of others. Aim: To present a discussion of heuristic inquiry's methodology and its application to the experience of nurse migration. Discussion: The researcher's commitment to the research is central to heuristic inquiry. It is an immersive, reflective, reiterative and personally affecting method of gathering knowledge. Researchers are acknowledged as the only people who can validate the findings of the research by exploring their own experiences while also examining those of others with the same experiences, to truly understand the phenomena being researched. This paper presents the ways in which the heuristic process guides this discovery in relation to traditional research steps. Conclusion: Heuristic inquiry is an appropriate method for exploring nurses' experiences of migration because nurse researchers can tell their own stories, and it brings understanding of themselves and of the phenomenon as experienced by others. Implications for practice: Although not a popular method in nursing research, heuristic inquiry offers a depth of exploration and understanding that may not be revealed by other methods.

  5. Heuristic Modeling for TRMM Lifetime Predictions

    NASA Technical Reports Server (NTRS)

    Jordan, P. S.; Sharer, P. J.; DeFazio, R. L.

    1996-01-01

    Analysis time for computing the expected mission lifetimes of proposed frequently maneuvering, tightly altitude-constrained, Earth-orbiting spacecraft has been significantly reduced by means of a heuristic modeling method implemented in a commercial off-the-shelf spreadsheet product (QuattroPro) running on a personal computer (PC). The method uses a look-up table to estimate the maneuver frequency per month as a function of the spacecraft ballistic coefficient and the solar flux index, then computes the associated fuel use by a simple engine model. Maneuver frequency data points are produced by means of a single 1-month run of traditional mission analysis software for each of the 12 to 25 data points required for the table. As the data point computations are required only at mission design start-up and on the occasion of significant mission redesigns, the dependence on time-consuming traditional modeling methods is dramatically reduced. Results to date have agreed with traditional methods to within 1 to 1.5 percent. The spreadsheet approach is applicable to a wide variety of Earth-orbiting spacecraft with tight altitude constraints. It will be particularly useful to such missions as the Tropical Rainfall Measurement Mission scheduled for launch in 1997, whose mission lifetime calculations are heavily dependent on frequently revised solar flux predictions.

  6. Evolutionary Local Search of Fuzzy Rules through a novel Neuro-Fuzzy encoding method.

    PubMed

    Carrascal, A; Manrique, D; Ríos, J; Rossi, C

    2003-01-01

    This paper proposes a new approach for constructing fuzzy knowledge bases using evolutionary methods. We have designed a genetic algorithm that automatically builds neuro-fuzzy architectures based on a new indirect encoding method. The neuro-fuzzy architecture represents the fuzzy knowledge base that solves a given problem; the search for this architecture takes advantage of a local search procedure that improves the chromosomes at each generation. Experiments conducted on both artificially generated and real-world problems confirm the effectiveness of the proposed approach.

  7. New Internet search volume-based weighting method for integrating various environmental impacts

    SciTech Connect

    Ji, Changyoon Hong, Taehoon

    2016-01-15

    Weighting is one of the steps in life cycle impact assessment that integrates various characterized environmental impacts as a single index. Weighting factors should be based on society's preferences. However, most previous studies consider only the opinions of some people. Thus, this research proposes a new weighting method that determines the weighting factors of environmental impact categories by considering public opinion on environmental impacts, using the Internet search volumes for relevant terms. To validate the new weighting method, the weighting factors for six environmental impacts calculated by the new method were compared with existing weighting factors. The resulting Pearson's correlation coefficient between the new and existing weighting factors ranged from 0.8743 to 0.9889, indicating that the new weighting method presents reasonable weighting factors. It also requires less time and lower cost compared to existing methods, and likewise meets the main requirements of weighting methods such as simplicity, transparency, and reproducibility. The new weighting method is expected to be a good alternative for determining weighting factors. Highlights: • A new weighting method using Internet search volume is proposed in this research. • The new weighting method reflects public opinion using Internet search volume. • The correlation coefficient between new and existing weighting factors is over 0.87. • The new weighting method can present reasonable weighting factors. • The proposed method can be a good alternative for determining the weighting factors.
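
    The mechanics reduce to normalizing search volumes into weights and correlating them with an existing weighting set. A sketch with invented numbers (real volumes would come from a service such as Google Trends):

    ```python
    import numpy as np

    # Hypothetical monthly search volumes for six impact-category terms
    # (e.g. "global warming", "ozone depletion", ...); values are illustrative.
    volumes = np.array([90500, 12100, 8100, 22200, 5400, 14800], dtype=float)

    weights = volumes / volumes.sum()   # normalize volumes into weighting factors
    print(np.round(weights, 3))

    # Compare with an existing weighting set (hypothetical numbers), as the
    # paper does, via Pearson's correlation coefficient.
    existing = np.array([0.55, 0.08, 0.05, 0.15, 0.04, 0.13])
    r = np.corrcoef(weights, existing)[0, 1]
    print(round(r, 4))
    ```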

  8. Performance comparison of heuristic algorithms for task scheduling in IaaS cloud computing environment.

    PubMed

    Madni, Syed Hamid Hussain; Abd Latiff, Muhammad Shafie; Abdullahi, Mohammed; Abdulhamid, Shafi'i Muhammad; Usman, Mohammed Joda

    2017-01-01

    Cloud computing infrastructure is suitable for meeting the computational needs of large task sizes. Optimal scheduling of tasks in a cloud computing environment has been proved to be an NP-complete problem, hence the need for the application of heuristic methods. Several heuristic algorithms have been developed and used in addressing this problem, but choosing the appropriate algorithm for solving a task assignment problem of a particular nature is difficult, since the methods are developed under different assumptions. Therefore, six rule-based heuristic algorithms are implemented and used to schedule autonomous tasks in homogeneous and heterogeneous environments, with the aim of comparing their performance in terms of cost, degree of imbalance, makespan and throughput. First Come First Serve (FCFS), Minimum Completion Time (MCT), Minimum Execution Time (MET), Max-min, Min-min and Sufferage are the heuristic algorithms considered for the performance comparison and analysis of task scheduling in cloud computing.
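
    Of the six heuristics compared, Min-min is representative and compact enough to sketch; the expected-time-to-compute matrix below is invented for illustration:

    ```python
    def min_min(etc):
        """Min-min heuristic: etc[i][j] = expected execution time of task i on
        VM j. Repeatedly pick the task whose best completion time is smallest,
        assign it there, and update the machine-ready times."""
        n_tasks, n_vms = len(etc), len(etc[0])
        ready = [0.0] * n_vms
        unscheduled = set(range(n_tasks))
        schedule = {}
        while unscheduled:
            # best VM for each unscheduled task, given current ready times
            best = {t: min(range(n_vms), key=lambda m: ready[m] + etc[t][m])
                    for t in unscheduled}
            t = min(unscheduled, key=lambda t: ready[best[t]] + etc[t][best[t]])
            m = best[t]
            ready[m] += etc[t][m]
            schedule[t] = m
            unscheduled.remove(t)
        return schedule, max(ready)      # task->VM assignment and makespan

    etc = [[4.0, 6.0], [3.0, 5.0], [8.0, 2.0], [5.0, 4.0]]
    print(min_min(etc))
    ```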

  9. Performance comparison of heuristic algorithms for task scheduling in IaaS cloud computing environment

    PubMed Central

    Madni, Syed Hamid Hussain; Abd Latiff, Muhammad Shafie; Abdullahi, Mohammed; Usman, Mohammed Joda

    2017-01-01

    Cloud computing infrastructure is suitable for meeting the computational needs of large task sizes. Optimal scheduling of tasks in a cloud computing environment has been proved to be an NP-complete problem, hence the need for the application of heuristic methods. Several heuristic algorithms have been developed and used in addressing this problem, but choosing the appropriate algorithm for solving a task assignment problem of a particular nature is difficult, since the methods are developed under different assumptions. Therefore, six rule-based heuristic algorithms are implemented and used to schedule autonomous tasks in homogeneous and heterogeneous environments, with the aim of comparing their performance in terms of cost, degree of imbalance, makespan and throughput. First Come First Serve (FCFS), Minimum Completion Time (MCT), Minimum Execution Time (MET), Max-min, Min-min and Sufferage are the heuristic algorithms considered for the performance comparison and analysis of task scheduling in cloud computing. PMID:28467505

  10. A semantics-based method for clustering of Chinese web search results

    NASA Astrophysics Data System (ADS)

    Zhang, Hui; Wang, Deqing; Wang, Li; Bi, Zhuming; Chen, Yong

    2014-01-01

    Information explosion is a critical challenge to the development of modern information systems. In particular, when the application of an information system is over the Internet, the amount of information over the web has been increasing exponentially and rapidly. Search engines, such as Google and Baidu, are essential tools for people to find information on the Internet. Valuable information, however, is still likely to be submerged in the ocean of search results from those tools. By clustering the results into different groups based on subjects automatically, a search engine with a clustering feature allows users to select the most relevant results quickly. In this paper, we propose an online semantics-based method to cluster Chinese web search results. First, we employ the generalised suffix tree to extract the longest common substrings (LCSs) from search snippets. Second, we use the HowNet to calculate the similarities of the words derived from the LCSs, and extract the most representative features by constructing the vocabulary chain. Third, we construct a vector of text features and calculate snippets' semantic similarities. Finally, we improve the Chameleon algorithm to cluster snippets. Extensive experimental results have shown that the proposed algorithm outperforms the suffix tree clustering method and other traditional clustering methods.
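
    The first step, extracting longest common substrings from snippets, can be illustrated with the classic pairwise dynamic program; the paper uses a generalised suffix tree, which scales better over many snippets, but the DP conveys the idea:

    ```python
    def longest_common_substring(a, b):
        """Classic O(len(a)*len(b)) DP for the longest common substring of two
        strings; a simple stand-in for the paper's suffix-tree extraction."""
        m = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
        best, end = 0, 0
        for i in range(1, len(a) + 1):
            for j in range(1, len(b) + 1):
                if a[i - 1] == b[j - 1]:
                    m[i][j] = m[i - 1][j - 1] + 1   # extend the common run
                    if m[i][j] > best:
                        best, end = m[i][j], i      # remember the longest run
        return a[end - best:end]

    s1 = "semantic clustering of web search results"
    s2 = "clustering web search snippets by meaning"
    print(repr(longest_common_substring(s1, s2)))   # ' web search '
    ```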

  11. Free Energy-Based Conformational Search Algorithm Using the Movable Type Sampling Method.

    PubMed

    Pan, Li-Li; Zheng, Zheng; Wang, Ting; Merz, Kenneth M

    2015-12-08

    In this article, we extend the movable type (MT) sampling method to molecular conformational searches (MT-CS) on the free energy surface of the molecule in question. Differing from traditional systematic and stochastic searching algorithms, this method uses Boltzmann energy information to facilitate the selection of the best conformations. The generated ensembles provided good coverage of the available conformational space including available crystal structures. Furthermore, our approach directly provides the solvation free energies and the relative gas and aqueous phase free energies for all generated conformers. The method is validated by a thorough analysis of thrombin ligands as well as against structures extracted from both the Protein Data Bank (PDB) and the Cambridge Structural Database (CSD). An in-depth comparison between OMEGA and MT-CS is presented to illustrate the differences between the two conformational searching strategies, i.e., energy-based versus free energy-based searching. These studies demonstrate that our MT-based ligand conformational search algorithm is a powerful approach to delineate the conformational ensembles of molecular species on free energy surfaces.

  12. A conjugate gradient method with descent properties under strong Wolfe line search

    NASA Astrophysics Data System (ADS)

    Zull, N.; ‘Aini, N.; Shoid, S.; Ghani, N. H. A.; Mohamed, N. S.; Rivaie, M.; Mamat, M.

    2017-09-01

    The conjugate gradient (CG) method is one of the optimization methods most often used in practical applications. The continuous and numerous studies conducted on CG methods have led to vast improvements in their convergence properties and efficiency. In this paper, a new CG method possessing the sufficient descent and global convergence properties is proposed. The efficiency of the new CG algorithm relative to the existing CG methods is evaluated by testing them all on a set of test functions using MATLAB. The tests are measured in terms of iteration numbers and CPU time under the strong Wolfe line search. Overall, this new method performs efficiently and is comparable to the other well-known methods.

  13. Social heuristics shape intuitive cooperation.

    PubMed

    Rand, David G; Peysakhovich, Alexander; Kraft-Todd, Gordon T; Newman, George E; Wurzbacher, Owen; Nowak, Martin A; Greene, Joshua D

    2014-04-22

    Cooperation is central to human societies. Yet relatively little is known about the cognitive underpinnings of cooperative decision making. Does cooperation require deliberate self-restraint? Or is spontaneous prosociality reined in by calculating self-interest? Here we present a theory of why (and for whom) intuition favors cooperation: cooperation is typically advantageous in everyday life, leading to the formation of generalized cooperative intuitions. Deliberation, by contrast, adjusts behaviour towards the optimum for a given situation. Thus, in one-shot anonymous interactions where selfishness is optimal, intuitive responses tend to be more cooperative than deliberative responses. We test this 'social heuristics hypothesis' by aggregating across every cooperation experiment using time pressure that we conducted over a 2-year period (15 studies and 6,910 decisions), as well as performing a novel time pressure experiment. Doing so demonstrates a positive average effect of time pressure on cooperation. We also find substantial variation in this effect, and show that this variation is partly explained by previous experience with one-shot lab experiments.

  14. The Search Conference as a Method in Planning Community Health Promotion Actions.

    PubMed

    Magnus, Eva; Knudtsen, Margunn Skjei; Wist, Guri; Weiss, Daniel; Lillefjell, Monica

    2016-08-19

    Aims: The aim of this article is to describe and discuss how the search conference can be used as a method for planning health promotion actions in local communities. Design and methods: The article draws on experiences with using the method for an innovative project in health promotion in three Norwegian municipalities. The method is described both in general and how it was specifically adopted for the project. Results and conclusions: The search conference as a method was used to develop evidence-based health promotion action plans. With its use of both bottom-up and top-down approaches, this method is a relevant strategy for involving a community in the planning stages of health promotion actions in line with political expectations of participation, ownership, and evidence-based initiatives.

  15. Fault diagnosis of rolling element bearings with a spectrum searching method

    NASA Astrophysics Data System (ADS)

    Li, Wei; Qiu, Mingquan; Zhu, Zhencai; Jiang, Fan; Zhou, Gongbo

    2017-09-01

    Rolling element bearing faults in rotating systems are observed as impulses in the vibration signals, which are usually buried in noise. In order to effectively detect faults in bearings, a novel spectrum searching method is proposed in this paper. The structural information of the spectrum (SIOS) on a predefined frequency grid is constructed through a searching algorithm, such that the harmonics of the impulses generated by faults can be clearly identified and analyzed. Local peaks of the spectrum are projected onto certain components of the frequency grid, and then the SIOS can interpret the spectrum via the number and power of harmonics projected onto components of the frequency grid. Finally, bearings can be diagnosed based on the SIOS by identifying its dominant or significant components. The mathematical formulation is developed to guarantee the correct construction of the SIOS through searching. The effectiveness of the proposed method is verified with both simulated and experimental bearing signals.
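
    A simplified reading of the SIOS construction: pick spectral peaks and project them onto the harmonic grid k*f0 of a candidate fault frequency, counting and summing the harmonics that carry power. The thresholds and the simulated signal below are illustrative assumptions, not the paper's exact formulation:

    ```python
    import numpy as np

    def harmonic_power(signal, fs, f0, n_harmonics=5, tol=1.0):
        """Count harmonics k*f0 carrying strong spectral peaks and sum their
        power -- a simplified stand-in for the SIOS construction."""
        spec = np.abs(np.fft.rfft(signal)) ** 2
        freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
        count, power = 0, 0.0
        for k in range(1, n_harmonics + 1):
            band = (freqs > k * f0 - tol) & (freqs < k * f0 + tol)
            if band.any():
                # call it a harmonic hit if the band peak stands well above noise
                count += 1 if spec[band].max() > 10 * np.median(spec) else 0
                power += spec[band].max()
        return count, power

    fs = 2000
    t = np.arange(0, 2.0, 1.0 / fs)
    # Simulated bearing signal: decaying harmonics of a 57 Hz fault plus noise.
    x = sum(np.sin(2 * np.pi * 57 * k * t) / k for k in range(1, 4))
    x += 0.5 * np.random.default_rng(1).normal(size=t.size)
    print(harmonic_power(x, fs, f0=57.0))   # strong response at the true frequency
    print(harmonic_power(x, fs, f0=43.0))   # weak response off the harmonic grid
    ```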

  16. A new convergent conjugate gradient method under the exact line search

    NASA Astrophysics Data System (ADS)

    Omer, Osman; Mamat, Mustafa; Rivaie, Mohd

    2015-05-01

    Conjugate gradient methods are widely used for unconstrained optimization problems, especially large-scale problems, owing to their simplicity, low memory requirements, and global convergence properties. In this paper, we study the global convergence properties of a new conjugate gradient method under the exact line search. Under some assumptions, the proofs of the sufficient descent property and the global convergence are given. The numerical results show that our new method is efficient for some unconstrained optimization problems.

  17. Heuristic thinking makes a chemist smart.

    PubMed

    Graulich, Nicole; Hopf, Henning; Schreiner, Peter R

    2010-05-01

    We focus on the virtually neglected use of heuristic principles in understanding and teaching of organic chemistry. As human thinking is not comparable to computer systems employing factual knowledge and algorithms--people rarely make decisions through careful considerations of every possible event and its probability, risks or usefulness--research in science and teaching must include psychological aspects of the human decision making processes. Intuitive analogical and associative reasoning and the ability to categorize unexpected findings typically demonstrated by experienced chemists should be made accessible to young learners through heuristic concepts. The psychology of cognition defines heuristics as strategies that guide human problem-solving and deciding procedures, for example with patterns, analogies, or prototypes. Since research in the field of artificial intelligence and current studies in the psychology of cognition have provided evidence for the usefulness of heuristics in discovery, the status of heuristics has grown into something useful and teachable. In this tutorial review, we present a heuristic analysis of a familiar fundamental process in organic chemistry--the cyclic six-electron case, and we show that this approach leads to a more conceptual insight in understanding, as well as in teaching and learning.

  18. Search strategies in a human water maze analogue analyzed with automatic classification methods.

    PubMed

    Schoenfeld, Robby; Moenich, Nadine; Mueller, Franz-Josef; Lehmann, Wolfgang; Leplow, Bernd

    2010-03-17

    Although human spatial cognition is the focus of intense research efforts, experimental evidence on how search strategies differ among age and gender groups remains elusive. To address this problem, we investigated the interaction between age, sex, and strategy usage within a novel virtual water maze-like procedure (VWM). We studied 28 young adults aged 20-29 years (14 males) and 30 middle-aged adults aged 50-59 years (15 males). Younger age groups outperformed older groups with respect to place learning. We also observed a moderate sex effect, with males outperforming females. Unbiased classification of human search behavior within this paradigm was done by means of an exploratory method using sparse non-negative matrix factorization (SNMF) and a parameter-based algorithm as an a priori classifier. Analyses of search behavior with the SNMF and the parameter-based method showed that the older group relied on less efficient search strategies, although performance in females did not decline as dramatically. Place learning was related to the adaptation of elaborated search strategies. Participants using place-directed strategies obtained the highest score on place learning, and the deterioration of place learning in the elderly was due to the use of less efficient, non-specific strategies. A high convergence of the SNMF and the parameter-based classifications could be shown. Furthermore, the SNMF classification was cross-validated with the traditional eyeballing method. As a result of this analysis, we conclude that SNMF is a robust exploratory method for the classification of search behavior in water maze procedures.

  19. Automatic Reaction Pathway Search via Combined Molecular Dynamics and Coordinate Driving Method.

    PubMed

    Yang, Manyi; Zou, Jingxiang; Wang, Guoqiang; Li, Shuhua

    2017-02-16

    We proposed and implemented a combined molecular dynamics and coordinate driving (MD/CD) method for automatically searching multistep reaction pathways of chemical reactions. In this approach, the molecular dynamic (MD) method at the molecular mechanics (MM) or semiempirical quantum mechanical (QM) level is employed to explore the conformational space of the minimum structures, and the modified coordinate driving (CD) method is used to build reaction pathways for representative conformers. The MD/CD method is first applied to two model reactions (the Claisen rearrangement and the intermolecular aldol reaction). By comparing the obtained results with those of the existing methods, we found that the MD/CD method has a comparable performance in searching low-energy reaction pathways. Then, the MD/CD method is further applied to investigate two reactions: the electrocyclic reaction of benzocyclobutene-7-carboxaldehyde and the intramolecular Diels-Alder reaction of ketothioester with 11 effectively rotatable single bonds. For the first reaction, our results can correctly account for its torquoselectivity. For the second one, our method predicts eight reaction channels, leading to eight different stereo- and regioselective products. The MD/CD method is expected to become an efficient and cost-effective theoretical tool for automatically searching low-energy reaction pathways for relatively complex chemical reactions.

  20. The Effects of Presentation Method and Information Density on Visual Search Ability and Working Memory Load

    ERIC Educational Resources Information Center

    Chang, Ting-Wen; Kinshuk; Chen, Nian-Shing; Yu, Pao-Ta

    2012-01-01

    This study investigates the effects of successive and simultaneous information presentation methods on learner's visual search ability and working memory load for different information densities. Since the processing of information in the brain depends on the capacity of visual short-term memory (VSTM), the limited information processing capacity…

  1. A Teaching Approach from the Exhaustive Search Method to the Needleman-Wunsch Algorithm

    ERIC Educational Resources Information Center

    Xu, Zhongneng; Yang, Yayun; Huang, Beibei

    2017-01-01

    The Needleman-Wunsch algorithm has become one of the core algorithms in bioinformatics; however, this algorithm requires more suitable explanations for students from different major backgrounds. By supposing sample sequences and using a simple storage system, the connection between the exhaustive search method and the Needleman-Wunsch algorithm…

  2. The Effects of Presentation Method and Information Density on Visual Search Ability and Working Memory Load

    ERIC Educational Resources Information Center

    Chang, Ting-Wen; Kinshuk; Chen, Nian-Shing; Yu, Pao-Ta

    2012-01-01

    This study investigates the effects of successive and simultaneous information presentation methods on learner's visual search ability and working memory load for different information densities. Since the processing of information in the brain depends on the capacity of visual short-term memory (VSTM), the limited information processing capacity…

  3. A health literacy and usability heuristic evaluation of a mobile consumer health application.

    PubMed

    Monkman, Helen; Kushniruk, Andre

    2013-01-01

    Usability and health literacy are two critical factors in the design and evaluation of consumer health information systems. However, methods for evaluating these two factors in conjunction remain limited. This study adapted a set of existing guidelines for the design of consumer health Web sites into evidence-based evaluation heuristics tailored specifically for mobile consumer health applications. In order to test the approach, a mobile consumer health application (app) was then evaluated using these heuristics. In addition to revealing ways to improve the usability of the system, this analysis identified opportunities to augment the content to make it more understandable by users with limited health literacy. This study successfully demonstrated the utility of converting existing design guidelines into heuristics for the evaluation of usability and health literacy. The heuristics generated could be applied for assessing and revising other existing consumer health information systems.

  4. Social welfare as small-scale help: evolutionary psychology and the deservingness heuristic.

    PubMed

    Petersen, Michael Bang

    2012-01-01

    Public opinion concerning social welfare is largely driven by perceptions of recipient deservingness. Extant research has argued that this heuristic is learned from a variety of cultural, institutional, and ideological sources. The present article provides evidence supporting a different view: that the deservingness heuristic is rooted in psychological categories that evolved over the course of human evolution to regulate small-scale exchanges of help. To test predictions made on the basis of this view, a method designed to measure social categorization is embedded in nationally representative surveys conducted in different countries. Across the national- and individual-level differences that extant research has used to explain the heuristic, people categorize welfare recipients on the basis of whether they are lazy or unlucky. This mode of categorization furthermore induces people to think about large-scale welfare politics as its presumed ancestral equivalent: small-scale help giving. The general implications for research on heuristics are discussed.

  5. Heuristic-based energy landscape paving for the circular packing problem with performance constraints of equilibrium

    NASA Astrophysics Data System (ADS)

    Liu, Jingfa; Jiang, Yucong; Li, Gang; Xue, Yu; Liu, Zhaoxia; Zhang, Zhen

    2015-08-01

    The optimal layout problem of a circle group in a circular container with performance constraints of equilibrium belongs to a class of NP-hard problems. The key obstacle to solving this problem is the lack of an effective global optimization method. We convert the circular packing problem with performance constraints of equilibrium into an unconstrained optimization problem by using a quasi-physical strategy and the penalty function method. By putting forward a new updating mechanism for the histogram function in the energy landscape paving (ELP) method and incorporating heuristic conformation update strategies into the ELP method, we obtain an improved ELP (IELP) method. Subsequently, by combining the IELP method and a local search (LS) procedure, we put forward a hybrid algorithm, denoted IELP-LS, for the circular packing problem with performance constraints of equilibrium. We test three sets of benchmarks consisting of 21 representative instances from the current literature. The proposed algorithm breaks the records of all 10 instances in the first set, and achieves the same or even better results than other methods in the literature for 10 out of 11 instances in the second and third sets. The computational results show that the proposed algorithm is an effective method for solving the circular packing problem with performance constraints of equilibrium.
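
    The core ELP idea, Metropolis sampling on an energy modified by a histogram of previous visits, can be sketched on a toy one-dimensional landscape; the bin widths, penalty weight, and temperature below are illustrative choices, not the paper's IELP-LS settings.

      # Sketch of energy landscape paving (ELP): sample on E~ = E + f(H(E)),
      # where H counts previous visits to an energy bin, pushing the walker
      # out of already-explored basins.
      import numpy as np

      rng = np.random.default_rng(1)
      def E(x):                                     # rugged toy landscape
          return 0.1 * x**2 + np.sin(3.0 * x)

      bins = np.linspace(-5.0, 15.0, 201)
      H = np.zeros(len(bins) - 1)                   # visit histogram over energy

      def paved(e):                                 # ELP-modified energy
          return e + 0.05 * H[np.clip(np.digitize(e, bins) - 1, 0, len(H) - 1)]

      x, T = 4.0, 0.5
      best = (E(x), x)
      for step in range(20000):
          H[np.clip(np.digitize(E(x), bins) - 1, 0, len(H) - 1)] += 1
          x_new = np.clip(x + rng.normal(0.0, 0.3), -8.0, 12.0)
          d = paved(E(x_new)) - paved(E(x))
          if d <= 0 or rng.random() < np.exp(-d / T):
              x = x_new
              if E(x) < best[0]:
                  best = (E(x), x)
      print(f"best energy {best[0]:.3f} at x = {best[1]:.3f}")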

  6. Surfing for suicide methods and help: content analysis of websites retrieved with search engines in Austria and the United States.

    PubMed

    Till, Benedikt; Niederkrotenthaler, Thomas

    2014-08-01

    The Internet provides a variety of resources for individuals searching for suicide-related information. Structured content-analytic approaches to assess intercultural differences in web contents retrieved with method-related and help-related searches are scarce. We used the 2 most popular search engines (Google and Yahoo/Bing) to retrieve US-American and Austrian search results for the term suicide, method-related search terms (e.g., suicide methods, how to kill yourself, painless suicide, how to hang yourself), and help-related terms (e.g., suicidal thoughts, suicide help) on February 11, 2013. In total, 396 websites retrieved with US search engines and 335 websites from Austrian searches were analyzed with content analysis on the basis of current media guidelines for suicide reporting. We assessed the quality of websites and compared findings across search terms and between the United States and Austria. In both countries, protective outweighed harmful website characteristics by approximately 2:1. Websites retrieved with method-related search terms (e.g., how to hang yourself) contained more harmful (United States: P < .001, Austria: P < .05) and fewer protective characteristics (United States: P < .001, Austria: P < .001) compared to the term suicide. Help-related search terms (e.g., suicidal thoughts) yielded more websites with protective characteristics (United States: P = .07, Austria: P < .01). Websites retrieved with U.S. search engines generally had more protective characteristics (P < .001) than searches with Austrian search engines. Resources with harmful characteristics were better ranked than those with protective characteristics (United States: P < .01, Austria: P < .05). The quality of suicide-related websites obtained depends on the search terms used. Preventive efforts to improve the ranking of preventive web content, particularly regarding method-related search terms, seem necessary. © Copyright 2014 Physicians Postgraduate Press, Inc.

  7. Searching for life in the Universe: unconventional methods for an unconventional problem.

    PubMed

    Nealson, K H; Tsapin, A; Storrie-Lombardi, M

    2002-12-01

    The search for life, on and off our planet, can be done by conventional methods with which we are all familiar. These methods are sensitive and specific, and are often capable of detecting even single cells. However, if the search broadens to include life that may be different (even subtly different) in composition, the methods and even the approach must be altered. Here we discuss the development of what we call non-earthcentric life detection--detecting life with methods that could detect life no matter what its form or composition. To develop these methods, we simply ask, can we define life in terms of its general properties and particularly those that can be measured and quantified? Taking such an approach we can search for life using physics and chemistry to ask questions about structure, chemical composition, thermodynamics, and kinetics. Structural complexity can be searched for using computer algorithms that recognize complex structures. Once identified, these structures can be examined for a variety of chemical traits, including elemental composition, chirality, and complex chemistry. A second approach involves defining our environment in terms of energy sources (i.e., reductants), and oxidants (e.g. what is available to eat and breathe), and then looking for areas in which such phenomena are inexplicably out of chemical equilibrium. These disequilibria, when found, can then be examined in detail for the presence of the structural and chemical complexity that presumably characterizes any living systems. By this approach, we move the search for life to one that should facilitate the detection of any earthly life it encountered, as well as any non-conventional life forms that have structure, complex chemistry, and live via some form of redox chemistry.

  8. Differential Evolution Based Intelligent System State Search Method for Composite Power System Reliability Evaluation

    NASA Astrophysics Data System (ADS)

    Bakkiyaraj, Ashok; Kumarappan, N.

    2015-09-01

    This paper presents a new approach for evaluating the reliability indices of a composite power system that adopts the binary differential evolution (BDE) algorithm in the search mechanism to select the system states. These states, also called dominant states, have a large state probability and the higher loss-of-load curtailment necessary to maintain real power balance. A chromosome of the BDE algorithm represents a system state. BDE is not applied in its traditional role of optimizing a non-linear objective function, but is used as a tool for exploring a larger number of dominant states by producing new chromosomes, mutant vectors, and trial vectors based on the fitness function. The searched system states are used to evaluate annualized system and load-point reliability indices. The proposed search methodology is applied to the RBTS and IEEE-RTS test systems and the results are compared with other approaches. This approach evaluates indices similar to those of existing methods while analyzing fewer system states.
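
    A minimal sketch of binary differential evolution used as a state explorer is given below; the sigmoid binarization of the real-valued mutant is one common convention, and the toy fitness function merely stands in for the state-probability and load-curtailment criteria of the paper.

      # Sketch: binary DE exploring bit-string "system states" while logging
      # every visited state (illustrative only).
      import numpy as np

      rng = np.random.default_rng(2)
      n_bits, n_pop, F, CR = 20, 30, 0.8, 0.9
      Q = rng.normal(size=(n_bits, n_bits))
      fitness = lambda x: x @ Q @ x                 # toy objective

      pop = rng.integers(0, 2, size=(n_pop, n_bits)).astype(float)
      fit = np.array([fitness(x) for x in pop])
      archive = set()                               # "dominant states" found
      for gen in range(200):
          for i in range(n_pop):
              r1, r2, r3 = rng.choice([j for j in range(n_pop) if j != i],
                                      3, replace=False)
              v = pop[r1] + F * (pop[r2] - pop[r3])   # real-valued mutant
              p = 1.0 / (1.0 + np.exp(-2.0 * (v - 0.5)))
              trial = (rng.random(n_bits) < p).astype(float)
              cross = rng.random(n_bits) < CR         # binomial crossover
              trial = np.where(cross, trial, pop[i])
              f_t = fitness(trial)
              archive.add(tuple(trial.astype(int)))   # log every visited state
              if f_t > fit[i]:                        # greedy selection
                  pop[i], fit[i] = trial, f_t
      print(len(archive), "distinct states explored; best fitness",
            round(float(fit.max()), 2))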

  9. The constrainedness of search

    SciTech Connect

    Gent, I.P.; MacIntyre, E.; Prosser, P.

    1996-12-31

    We introduce a parameter that measures the "constrainedness" of an ensemble of combinatorial problems. If problems are over-constrained, they are likely to be insoluble. If problems are under-constrained, they are likely to be soluble. This constrainedness parameter generalizes a number of parameters previously used in different NP-complete problem classes. Phase transitions in different NP classes can thus be directly compared. This parameter can also be used in a heuristic to guide search. The heuristic captures the intuition of making the most constrained choice first, since it is often useful to branch into the least constrained subproblem. Many widely disparate heuristics can be seen as minimizing constrainedness.
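
    The paper defines constrainedness as kappa = 1 - log2(<Sol>)/N for an ensemble whose state space has 2^N states and expected number of solutions <Sol>. For random 3-SAT this reduces to a closed form, sketched below (the transition-ratio values printed are illustrative inputs).

      # kappa for random 3-SAT: each 3-clause rules out 1/8 of assignments
      # in expectation, so <Sol> = 2^n * (7/8)^m and
      # kappa = (m/n) * log2(8/7).
      import math

      def kappa_3sat(n_vars: int, n_clauses: int) -> float:
          return (n_clauses / n_vars) * math.log2(8.0 / 7.0)

      for ratio in (2.0, 4.3, 6.0):   # under-, critically, over-constrained
          print(f"m/n = {ratio}: kappa = {kappa_3sat(100, int(100 * ratio)):.2f}")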

  10. Making the Right Connections: Perceptions of Human Resource/Personnel Directors Concerning Electronic Job-Search Methods.

    ERIC Educational Resources Information Center

    Hubbard, Joan C.; North, Alexa B.; Arjomand, H. Lari

    1997-01-01

    Examines methods used to search for entry-level managerial positions and assesses how human resource and personnel directors in Georgia perceive these methods. Findings indicate that few of the directors use electronic technology to fill such positions, but they view positively those applicants who use electronic job searching methods. (RJM)

  11. Frequency-based heuristics for material perception.

    PubMed

    Giesel, Martin; Zaidi, Qasim

    2013-12-06

    People often make rapid visual judgments of the properties of surfaces they are going to walk on or touch. How do they do this when the interactions of illumination geometry with 3-D material structure and object shape result in images that inverse optics algorithms cannot resolve without externally imposed constraints? A possibly effective strategy would be to use heuristics based on information that can be gleaned rapidly from retinal images. By using perceptual scaling of a large sample of images, combined with correspondence and canonical correlation analyses, we discovered that material properties, such as roughness, thickness, and undulations, are characterized by specific scales of luminance variations. Using movies, we demonstrate that observers' percepts of these 3-D qualities vary continuously as a function of the relative energy in corresponding 2-D frequency bands. In addition, we show that judgments of roughness, thickness, and undulations are predictably altered by adaptation to dynamic noise at the corresponding scales. These results establish that the scale of local 3-D structure is critical in perceiving material properties, and that relative contrast at particular spatial frequencies is important for perceiving the critical 3-D structure from shading cues, so that cortical mechanisms for estimating material properties could be constructed by combining the parallel outputs of sets of frequency-selective neurons. These results also provide methods for remote sensing of material properties in machine vision, and rapid synthesis, editing and transfer of material properties for computer graphics and animation.
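
    The basic measurement, relative energy in 2-D spatial-frequency bands, can be sketched with a plain FFT; the band edges and the synthetic texture below are illustrative, not the stimuli or scales identified in the paper.

      # Sketch: relative energy in radial spatial-frequency bands of an image.
      import numpy as np

      rng = np.random.default_rng(3)
      img = rng.standard_normal((256, 256))          # stand-in texture image

      F = np.fft.fftshift(np.fft.fft2(img - img.mean()))
      power = np.abs(F) ** 2
      fy, fx = np.indices(img.shape) - 128
      radius = np.hypot(fx, fy)                      # radial spatial frequency

      bands = [(1, 4), (4, 16), (16, 64), (64, 128)] # cycles/image, coarse->fine
      energy = np.array([power[(radius >= lo) & (radius < hi)].sum()
                         for lo, hi in bands])
      rel = energy / energy.sum()
      for (lo, hi), e in zip(bands, rel):
          print(f"band {lo:>3}-{hi:<3} cycles/image: {e:.2%} of energy")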

  12. Scheduling constrained tools using heuristic techniques

    NASA Astrophysics Data System (ADS)

    Maram, Venkataramana; Rahman, Syariza Abdul; Maram, Sandhya Rani

    2015-12-01

    One of the main challenges in current manufacturing production planning is to provide schedules of operations that maximize resource utilization and yield the highest overall productivity. This is achieved by scheduling available resources to activities. There can be many different real-time scenarios with different combinations of input resources to produce parts. In this paper, the problem is simplified to a single machine with individual process times and due dates to represent the real-world scheduling problem. The main objective function is to minimize the total tardiness or late jobs. The nearest-greedy method of the assignment problem algorithm is used to find the initial solution, followed by the Simulated Annealing (SA) algorithm for the improvement part. Simulated Annealing is one of the meta-heuristic techniques for solving combinatorial optimization problems. The general-purpose Microsoft Visual C++ is used to develop the algorithm for finding the best solution. The proposed hybrid approach is able to generate the best schedule at the 7th iteration and the optimal one at the 170th, with tardiness of 8 and 7 hours, respectively.
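
    A minimal sketch of the greedy-start-plus-SA scheme on invented single-machine data follows; the paper's instance, neighborhood, and cooling schedule are not specified in the abstract.

      # Sketch: earliest-due-date greedy start, then simulated annealing with
      # swap moves for single-machine total tardiness.
      import math, random

      random.seed(4)
      proc = [4, 7, 2, 5, 6, 3, 8, 4]               # processing times (hours)
      due  = [6, 16, 7, 14, 20, 9, 30, 12]          # due dates (hours)

      def tardiness(seq):
          t = total = 0
          for j in seq:
              t += proc[j]
              total += max(0, t - due[j])
          return total

      cur = sorted(range(len(proc)), key=lambda j: due[j])  # EDD dispatch
      best = list(cur)
      T = 10.0
      while T > 0.01:
          i, k = random.sample(range(len(cur)), 2)  # swap neighborhood
          cand = list(cur); cand[i], cand[k] = cand[k], cand[i]
          delta = tardiness(cand) - tardiness(cur)
          if delta <= 0 or random.random() < math.exp(-delta / T):
              cur = cand
              if tardiness(cur) < tardiness(best):
                  best = list(cur)
          T *= 0.995                                 # geometric cooling
      print("best sequence:", best, "total tardiness:", tardiness(best))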

  13. Frequency-based heuristics for material perception

    PubMed Central

    Giesel, Martin; Zaidi, Qasim

    2013-01-01

    People often make rapid visual judgments of the properties of surfaces they are going to walk on or touch. How do they do this when the interactions of illumination geometry with 3-D material structure and object shape result in images that inverse optics algorithms cannot resolve without externally imposed constraints? A possibly effective strategy would be to use heuristics based on information that can be gleaned rapidly from retinal images. By using perceptual scaling of a large sample of images, combined with correspondence and canonical correlation analyses, we discovered that material properties, such as roughness, thickness, and undulations, are characterized by specific scales of luminance variations. Using movies, we demonstrate that observers' percepts of these 3-D qualities vary continuously as a function of the relative energy in corresponding 2-D frequency bands. In addition, we show that judgments of roughness, thickness, and undulations are predictably altered by adaptation to dynamic noise at the corresponding scales. These results establish that the scale of local 3-D structure is critical in perceiving material properties, and that relative contrast at particular spatial frequencies is important for perceiving the critical 3-D structure from shading cues, so that cortical mechanisms for estimating material properties could be constructed by combining the parallel outputs of sets of frequency-selective neurons. These results also provide methods for remote sensing of material properties in machine vision, and rapid synthesis, editing and transfer of material properties for computer graphics and animation. PMID:24317425

  14. An adaptive random search for short term generation scheduling with network constraints.

    PubMed

    Marmolejo, J A; Velasco, Jonás; Selley, Héctor J

    2017-01-01

    This paper presents an adaptive random search approach to address short term generation scheduling with network constraints, which determines the startup and shutdown schedules of thermal units over a given planning horizon. In this model, we consider the transmission network through capacity limits and line losses. The mathematical model is stated in the form of a Mixed Integer Non Linear Problem with binary variables. The proposed heuristic is a population-based method that generates a set of new potential solutions via a random search strategy. The random search is based on the Markov Chain Monte Carlo method. The key feature of the proposed method is that the noise level of the random search is adaptively controlled in order to explore and exploit the entire search space. In order to improve the solutions, we couple a local search into the random search process. Several test systems are presented to evaluate the performance of the proposed heuristic. We use a commercial optimizer to compare the quality of the solutions provided by the proposed method. The proposed algorithm showed a significant reduction in computational effort with respect to the full-scale outer approximation commercial solver. Numerical results show the potential and robustness of our approach.
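
    The adaptive-noise idea can be sketched as follows: per-member Gaussian steps whose scale grows on acceptance and shrinks on rejection. The toy continuous objective below replaces the paper's mixed-integer scheduling model, and the adaptation constants are invented.

      # Sketch: population-based adaptive random search on a toy objective.
      import numpy as np

      rng = np.random.default_rng(5)
      f = lambda x: np.sum(x**2 - 10 * np.cos(2 * np.pi * x) + 10)  # Rastrigin

      n_pop, dim = 20, 5
      X = rng.uniform(-5, 5, size=(n_pop, dim))
      fX = np.array([f(x) for x in X])
      sigma = np.full(n_pop, 1.0)                    # per-member noise level
      for it in range(3000):
          i = rng.integers(n_pop)
          cand = X[i] + rng.normal(0, sigma[i], dim) # random search step
          if f(cand) < fX[i]:
              X[i], fX[i] = cand, f(cand)
              sigma[i] = min(sigma[i] * 1.1, 2.0)    # accepted: explore wider
          else:
              sigma[i] = max(sigma[i] * 0.98, 1e-3)  # rejected: exploit locally
      print("best value:", round(float(fX.min()), 4))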

  15. An adaptive random search for short term generation scheduling with network constraints

    PubMed Central

    Velasco, Jonás; Selley, Héctor J.

    2017-01-01

    This paper presents an adaptive random search approach to address short term generation scheduling with network constraints, which determines the startup and shutdown schedules of thermal units over a given planning horizon. In this model, we consider the transmission network through capacity limits and line losses. The mathematical model is stated in the form of a Mixed Integer Non Linear Problem with binary variables. The proposed heuristic is a population-based method that generates a set of new potential solutions via a random search strategy. The random search is based on the Markov Chain Monte Carlo method. The key feature of the proposed method is that the noise level of the random search is adaptively controlled in order to explore and exploit the entire search space. In order to improve the solutions, we couple a local search into the random search process. Several test systems are presented to evaluate the performance of the proposed heuristic. We use a commercial optimizer to compare the quality of the solutions provided by the proposed method. The proposed algorithm showed a significant reduction in computational effort with respect to the full-scale outer approximation commercial solver. Numerical results show the potential and robustness of our approach. PMID:28234954

  16. Quantifying Heuristic Bias: Anchoring, Availability, and Representativeness.

    PubMed

    Richie, Megan; Josephson, S Andrew

    2017-07-28

    Construct: Authors examined whether a new vignette-based instrument could isolate and quantify heuristic bias. Heuristics are cognitive shortcuts that may introduce bias and contribute to error. There is no standardized instrument available to quantify heuristic bias in clinical decision making, limiting future study of educational interventions designed to improve calibration of medical decisions. This study presents validity data to support a vignette-based instrument quantifying bias due to the anchoring, availability, and representativeness heuristics. Participants completed questionnaires requiring assignment of probabilities to potential outcomes of medical and nonmedical scenarios. The instrument randomly presented scenarios in one of two versions: Version A, encouraging heuristic bias, and Version B, worded neutrally. The primary outcome was the difference in probability judgments for Version A versus Version B scenario options. Of 167 participants recruited, 139 enrolled. Participants assigned significantly higher mean probability values to Version A scenario options (M = 9.56, SD = 3.75) than Version B (M = 8.98, SD = 3.76), t(1801) = 3.27, p = .001. This result remained significant analyzing medical scenarios alone (Version A, M = 9.41, SD = 3.92; Version B, M = 8.86, SD = 4.09), t(1204) = 2.36, p = .02. Analyzing medical scenarios by heuristic revealed a significant difference between Version A and B for availability (Version A, M = 6.52, SD = 3.32; Version B, M = 5.52, SD = 3.05), t(404) = 3.04, p = .003, and representativeness (Version A, M = 11.45, SD = 3.12; Version B, M = 10.67, SD = 3.71), t(396) = 2.28, p = .02, but not anchoring. Stratifying by training level, students maintained a significant difference between Version A and B medical scenarios (Version A, M = 9.83, SD = 3.75; Version B, M = 9.00, SD = 3.98), t(465) = 2.29, p = .02, but not residents or attendings. Stratifying by heuristic and training level, availability maintained

  17. Bflinks: Reliable Bugfix Links via Bidirectional References and Tuned Heuristics

    PubMed Central

    2014-01-01

    Background. Data from software version archives and defect databases can be used for defect insertion circumstance analysis and defect prediction. The first step in such analyses is identifying defect-correcting changes in the version archive (bugfix commits) and enriching them with additional metadata by establishing bugfix links to corresponding entries in the defect database. Candidate bugfix commits are typically identified via heuristic string matching on the commit message. Research Questions. Which filters could be used to obtain a set of bugfix links? How to tune their parameters? What accuracy is achieved? Method. We analyze a modular set of seven independent filters, including new ones that make use of reverse links, and evaluate visual heuristics for setting cutoff parameters. For a commercial repository, a product expert manually verifies over 2500 links to validate the results with unprecedented accuracy. Results. The heuristics pick a very good parameter value for five filters and a reasonably good one for the sixth. The combined filtering, called bflinks, provides 93% precision and only 7% results loss. Conclusion. Bflinks can provide high-quality results and adapts to repositories with different properties. PMID:27433506
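
    The first filtering stage, heuristic string matching on commit messages, can be sketched with two regular expressions; the patterns and cutoff below are illustrative, not the paper's tuned filters.

      # Sketch: flag candidate bugfix commits and extract defect-tracker IDs.
      import re

      BUG_WORDS = re.compile(r"\b(fix(es|ed)?|bug|defect|patch)\b", re.IGNORECASE)
      ISSUE_ID  = re.compile(r"(?:#|\b[A-Z]{2,}-)(\d+)")   # "#1234", "PROJ-42"

      commits = [
          "Fix null pointer in parser, closes #1234",
          "Refactor build scripts",
          "PROJ-42: patch for crash on empty input",
          "Add feature flag for search UI",
      ]

      for msg in commits:
          ids = ISSUE_ID.findall(msg)
          if BUG_WORDS.search(msg) and ids:
              print(f"bugfix link candidate -> issues {ids}: {msg!r}")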

  18. Planning collision free paths for two cooperating robots using a divide-and-conquer C-space traversal heuristic

    NASA Technical Reports Server (NTRS)

    Weaver, Johnathan M.

    1993-01-01

    A method was developed to plan feasible and obstacle-avoiding paths for two spatial robots working cooperatively in a known static environment. Cooperating spatial robots as referred to herein are robots which work in 6D task space while simultaneously grasping and manipulating a common, rigid payload. The approach is configuration space (c-space) based and performs selective rather than exhaustive c-space mapping. No expensive precomputations are required. A novel, divide-and-conquer type of heuristic is used to guide the selective mapping process. The heuristic does not involve any robot, environment, or task specific assumptions. A technique was also developed which enables solution of the cooperating redundant robot path planning problem without requiring the use of inverse kinematics for a redundant robot. The path planning strategy involves first attempting to traverse along the configuration space vector from the start point towards the goal point. If an unsafe region is encountered, an intermediate via point is identified by conducting a systematic search in the hyperplane orthogonal to and bisecting the unsafe region of the vector. This process is repeatedly applied until a solution to the global path planning problem is obtained. The basic concept behind this strategy is that better local decisions at the beginning of the trouble region may be made if a possible way around the 'center' of the trouble region is known. Thus, rather than attempting paths which look promising locally (at the beginning of a trouble region) but which may not yield overall results, the heuristic attempts local strategies that appear promising for circumventing the unsafe region.
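
    The traversal heuristic can be illustrated in a 2-D configuration space with a single circular obstacle: try the straight segment, and if it is unsafe, search the perpendicular bisector for a safe via point and recurse on both halves. All geometry below is an invented toy, not the 6-D cooperating-robot formulation.

      # Sketch of the divide-and-conquer via-point heuristic in 2-D c-space.
      import numpy as np

      OBS_C, OBS_R = np.array([0.5, 0.5]), 0.2       # circular obstacle

      def segment_safe(a, b, n=50):
          pts = a + np.linspace(0, 1, n)[:, None] * (b - a)
          return bool(np.all(np.linalg.norm(pts - OBS_C, axis=1) > OBS_R))

      def plan(a, b, depth=0):
          if segment_safe(a, b) or depth > 8:
              return [a, b]
          mid = 0.5 * (a + b)
          d = b - a
          normal = np.array([-d[1], d[0]]) / np.linalg.norm(d)
          # systematic search outward along the bisecting line for a via point
          for step in np.arange(0.05, 1.0, 0.05):
              for sign in (+1, -1):
                  via = mid + sign * step * normal
                  if np.linalg.norm(via - OBS_C) > OBS_R:
                      left = plan(a, via, depth + 1)
                      right = plan(via, b, depth + 1)
                      return left[:-1] + right
          return [a, b]                               # fallback (not reached here)

      path = plan(np.array([0.0, 0.5]), np.array([1.0, 0.5]))
      print(np.round(np.array(path), 2))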

  19. Study of Fuze Structure and Reliability Design Based on the Direct Search Method

    NASA Astrophysics Data System (ADS)

    Lin, Zhang; Ning, Wang

    2017-03-01

    Redundant design is one of the important methods to improve the reliability of a system, but mutual coupling of multiple factors is often involved in the design. In this study, the Direct Search Method is introduced into the optimum redundancy configuration for design optimization, in which the reliability, cost, structural weight, and other factors can be taken into account simultaneously, and the redundancy allocation and reliability design of an aircraft critical system are computed. The results show that this method is convenient and workable, and, upon appropriate modifications, is applicable to the redundancy configuration and optimization of various designs; it has good practical value.
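
    As a rough illustration of a direct search over redundancy levels, the sketch below performs coordinate-wise exploratory moves on a penalized series-parallel reliability model; the component data, penalty weight, and search variant are assumptions, since the abstract does not specify them.

      # Sketch: direct search over integer redundancy levels with a cost
      # budget handled by a penalty term.
      import itertools

      r = [0.80, 0.90, 0.85]       # component reliabilities per subsystem
      c = [3.0, 5.0, 4.0]          # component costs
      BUDGET = 40.0

      def score(n):
          rel = 1.0
          for ri, ni in zip(r, n):
              rel *= 1.0 - (1.0 - ri) ** ni             # parallel redundancy
          cost = sum(ci * ni for ci, ni in zip(c, n))
          return rel - max(0.0, cost - BUDGET) * 0.05   # penalized objective

      n = [1, 1, 1]
      improved = True
      while improved:                                   # exploratory moves
          improved = False
          for i, d in itertools.product(range(len(n)), (+1, -1)):
              trial = list(n)
              trial[i] = max(1, trial[i] + d)
              if score(trial) > score(n):
                  n, improved = trial, True
      print("redundancy levels:", n, "score:", round(score(n), 4))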

  20. A method of characterizing network topology based on the breadth-first search tree

    NASA Astrophysics Data System (ADS)

    Zhou, Bin; He, Zhe; Wang, Nianxin; Wang, Bing-Hong

    2016-05-01

    A method based on the breadth-first search tree is proposed in this paper to characterize the hierarchical structure of a network. In this method, a similarity coefficient is defined to quantitatively distinguish networks and to quantitatively measure the topological stability of the network generated by a model. Applications of the method are discussed for the ER random network, the WS small-world network, and the BA scale-free network. The method will be helpful for describing network topology in depth and provides a starting point for researching the topological similarity and isomorphism of networks.
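
    A sketch of the general approach is given below: build breadth-first trees from sampled roots and compare their level-size profiles. The cosine-similarity coefficient used here is an illustrative stand-in for the coefficient defined in the paper.

      # Sketch: compare BFS level-size profiles of two networks.
      import networkx as nx
      import numpy as np

      def level_profile(G, root):
          depths = nx.single_source_shortest_path_length(G, root)
          prof = np.bincount(list(depths.values()))
          return prof / prof.sum()

      def similarity(G1, G2, n_roots=20, seed=0):
          rng = np.random.default_rng(seed)
          sims = []
          for _ in range(n_roots):
              p = level_profile(G1, rng.choice(list(G1)))
              q = level_profile(G2, rng.choice(list(G2)))
              m = max(len(p), len(q))
              p = np.pad(p, (0, m - len(p)))
              q = np.pad(q, (0, m - len(q)))
              sims.append(p @ q / (np.linalg.norm(p) * np.linalg.norm(q)))
          return float(np.mean(sims))

      er = nx.erdos_renyi_graph(500, 0.02, seed=1)
      ba = nx.barabasi_albert_graph(500, 5, seed=1)
      print("ER vs BA profile similarity:", round(similarity(er, ba), 3))
      print("ER vs ER profile similarity:",
            round(similarity(er, nx.erdos_renyi_graph(500, 0.02, seed=2)), 3))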

  1. A fast tomographic method for searching the minimum free energy path

    SciTech Connect

    Chen, Changjun; Huang, Yanzhao; Xiao, Yi; Jiang, Xuewei

    2014-10-21

    Minimum Free Energy Path (MFEP) provides a lot of important information about the chemical reactions, like the free energy barrier, the location of the transition state, and the relative stability between reactant and product. With MFEP, one can study the mechanisms of the reaction in an efficient way. Due to a large number of degrees of freedom, searching the MFEP is a very time-consuming process. Here, we present a fast tomographic method to perform the search. Our approach first calculates the free energy surfaces in a sequence of hyperplanes perpendicular to a transition path. Based on an objective function and the free energy gradient, the transition path is optimized in the collective variable space iteratively. Applications of the present method to model systems show that our method is practical. It can be an alternative approach for finding the state-to-state MFEP.

  2. Novel citation-based search method for scientific literature: application to meta-analyses.

    PubMed

    Janssens, A Cecile J W; Gwinn, M

    2015-10-13

    Finding eligible studies for meta-analysis and systematic reviews relies on keyword-based searching as the gold standard, despite its inefficiency. Searching based on direct citations is not sufficiently comprehensive. We propose a novel strategy that ranks articles on their degree of co-citation with one or more "known" articles before reviewing their eligibility. In two independent studies, we aimed to reproduce the results of literature searches for sets of published meta-analyses (n = 10 and n = 42). For each meta-analysis, we extracted co-citations for the randomly selected 'known' articles from the Web of Science database, counted their frequencies and screened all articles with a score above a selection threshold. In the second study, we extended the method by retrieving direct citations for all selected articles. In the first study, we retrieved 82% of the studies included in the meta-analyses while screening only 11% as many articles as were screened for the original publications. Articles that we missed were published in non-English languages, published before 1975, published very recently, or available only as conference abstracts. In the second study, we retrieved 79% of included studies while screening half the original number of articles. Citation searching appears to be an efficient and reasonably accurate method for finding articles similar to one or more articles of interest for meta-analysis and reviews.
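
    The ranking step can be sketched in a few lines: score every article by how often it is co-cited with the "known" articles and screen those above a cutoff. The citation data below is a toy stand-in for the Web of Science records used in the study.

      # Sketch: co-citation scoring against a set of known eligible articles.
      from collections import Counter

      # citing_paper -> set of references it cites (illustrative data)
      citations = {
          "P1": {"known1", "A", "B"},
          "P2": {"known1", "A", "C"},
          "P3": {"known2", "A", "B"},
          "P4": {"B", "C", "D"},
          "P5": {"known2", "C"},
      }
      known = {"known1", "known2"}

      score = Counter()
      for refs in citations.values():
          if refs & known:                   # this paper cites a known article
              for r in refs - known:
                  score[r] += 1              # co-cited with the known set

      CUTOFF = 2
      candidates = [a for a, s in score.most_common() if s >= CUTOFF]
      print("screen these first:", candidates)   # -> ['A', 'B', 'C']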

  3. Active Search on Carcasses versus Pitfall Traps: a Comparison of Sampling Methods.

    PubMed

    Zanetti, N I; Camina, R; Visciarelli, E C; Centeno, N D

    2016-04-01

    The study of insect succession in cadavers and the classification of arthropods have mostly been done by placing a carcass in a cage, protected from vertebrate scavengers, which is then visited periodically. An alternative is to use specific traps. Few studies on carrion ecology and forensic entomology involving the carcasses of large vertebrates have employed pitfall traps. The aims of this study were to compare both sampling methods (active search on a carcass and pitfall trapping) for each coleopteran family, and to establish whether there is a discrepancy (underestimation and/or overestimation) in the presence of each family by either method. A great discrepancy was found for almost all families, with some of them being more abundant in samples obtained through active search on carcasses and others in samples from traps, whereas two families did not show any bias towards a given sampling method. The fact that families may be underestimated or overestimated by the type of sampling technique highlights the importance of combining both methods, active search on carcasses and pitfall traps, in order to obtain more complete information on decomposition, carrion habitat, and cadaveric families or species. Furthermore, a hypothesis is advanced on the reasons why either sampling method underestimates, or shows bias towards, certain families. Information is provided about which sampling technique would be more appropriate for detecting or finding a particular family.

  4. Heuristic algorithm for optical character recognition of Arabic script

    NASA Astrophysics Data System (ADS)

    Yarman-Vural, Fatos T.; Atici, A.

    1996-02-01

    In this paper, a heuristic method is developed for segmentation, feature extraction and recognition of the Arabic script. The study is part of a large project for the transcription of the documents in Ottoman Archives. A geometrical and topological feature analysis method is developed for segmentation and feature extraction stages. Chain code transformation is applied to main strokes of the characters which are then classified by the hidden Markov model (HMM) in the recognition stage. Experimental results indicate that the performance of the proposed method is impressive, provided that the thinning process does not yield spurious branches.
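
    The chain-code transformation applied to the main strokes can be sketched as follows, assuming the ordered boundary points of a stroke are already available (real use would extract them from a thinned glyph).

      # Sketch: 8-direction Freeman chain code of an ordered stroke contour,
      # the kind of representation fed to an HMM classifier.
      DIRS = {(1, 0): 0, (1, 1): 1, (0, 1): 2, (-1, 1): 3,
              (-1, 0): 4, (-1, -1): 5, (0, -1): 6, (1, -1): 7}

      def chain_code(points):
          code = []
          for (x0, y0), (x1, y1) in zip(points, points[1:]):
              code.append(DIRS[(x1 - x0, y1 - y0)])   # unit steps assumed
          return code

      stroke = [(0, 0), (1, 0), (2, 1), (3, 1), (3, 2), (2, 3), (1, 3)]
      print(chain_code(stroke))   # -> [0, 1, 0, 2, 3, 4]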

  5. MAXBAND Version 3.1: Heuristic and optimal approach for setting the left turn phase sequences in signalized networks

    SciTech Connect

    Pillai, R.S.; Rathi, A.K.

    1995-02-01

    The main objective of synchronized signal timing is to keep traffic moving along arterials in platoons throughout the signal system by proper setting of the left turn phase sequence at signals along the arterials/networks. The synchronization of traffic signals located along the urban/suburban arterials in metropolitan areas is perhaps one of the most cost-effective methods for improving traffic flow along these streets. MAXBAND Version 2.1 (formerly known as MAXBAND-86), a progression-based optimization model, is used for generating signal timing plans for urban networks. This model formulates the problem as a mixed integer linear program and uses the Land and Powell branch and bound search to arrive at the optimal solution. The computation time of MAXBAND Version 2.1 tends to be excessive for realistic multiarterial network problems due to the exhaustive nature of the branch and bound search technique. Furthermore, the Land and Powell branch and bound code is known to be numerically unstable, which results in suboptimal solutions for network problems with a range on the cycle time variable. This report presents the development of a new version of MAXBAND called MAXBAND Version 3.1. This new version has a fast heuristic algorithm and a fast optimal algorithm for generating signal timing plans for arterials and networks. MAXBAND 3.1 can generate optimal/near-optimal solutions in a fraction of the time needed to compute the optimal solution by Version 2.1. The heuristic algorithm in the new model is based on restricted search using the branch and bound technique. The algorithm for generating the optimal solution is faster and more efficient than the Version 2.1 algorithm. Furthermore, the new version is numerically stable. The efficiency of the new model is demonstrated by numerical results for a set of test problems.

  6. Fast optimization of binary clusters using a novel dynamic lattice searching method

    NASA Astrophysics Data System (ADS)

    Wu, Xia; Cheng, Wen

    2014-09-01

    Global optimization of binary clusters has been a difficult task despite much effort and many efficient methods. Directed toward the two types of elements (i.e., the homotop problem) in binary clusters, two classes of virtual dynamic lattices are constructed and a modified dynamic lattice searching (DLS) method, i.e., the binary DLS (BDLS) method, is developed. However, it was found that the BDLS can only be utilized for the optimization of binary clusters with small sizes, because the homotop problem is hard to solve without an atomic exchange operation. Therefore, the iterated local search (ILS) method is adopted to solve the homotop problem, and an efficient method based on the BDLS method and ILS, named BDLS-ILS, is presented for global optimization of binary clusters. In order to assess the efficiency of the proposed method, binary Lennard-Jones clusters with up to 100 atoms are investigated. Results show that the method is efficient. Furthermore, the BDLS-ILS method is also adopted to study the geometrical structures of (AuPd)79 clusters with DFT-fitted parameters of the Gupta potential.
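
    The ILS ingredient that addresses the homotop problem can be sketched on a tiny binary Lennard-Jones cluster: relax coordinates with a local minimizer, then propose A/B exchange moves and keep improvements. The pair parameters and move schedule below are invented, and the dynamic-lattice (BDLS) moves are omitted.

      # Sketch: iterated local search with exchange moves on a small binary
      # Lennard-Jones cluster (illustrative parameters).
      import numpy as np
      from scipy.optimize import minimize

      rng = np.random.default_rng(7)
      n = 8
      types = np.array([0] * 4 + [1] * 4)           # 4 A atoms + 4 B atoms
      EPS = np.array([[1.0, 1.5], [1.5, 0.8]])      # assumed pair well depths

      def energy(flat, t):
          x = flat.reshape(-1, 3)
          e = 0.0
          for i in range(len(x)):
              for j in range(i + 1, len(x)):
                  r = np.linalg.norm(x[i] - x[j])
                  e += 4 * EPS[t[i], t[j]] * (r**-12 - r**-6)
          return e

      def relax(flat, t):
          res = minimize(energy, flat, args=(t,), method="L-BFGS-B")
          return res.x, res.fun

      x, e = relax(rng.normal(0.0, 1.5, n * 3), types)
      for it in range(60):                          # iterated local search
          t_new = types.copy()
          i = rng.choice(np.where(t_new == 0)[0])   # exchange an A with a B
          j = rng.choice(np.where(t_new == 1)[0])
          t_new[i], t_new[j] = t_new[j], t_new[i]
          x_new, e_new = relax(x + rng.normal(0.0, 0.05, n * 3), t_new)
          if e_new < e:                             # accept improving homotops
              types, x, e = t_new, x_new, e_new
      print("lowest energy found:", round(e, 4))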

  7. Fast optimization of binary clusters using a novel dynamic lattice searching method.

    PubMed

    Wu, Xia; Cheng, Wen

    2014-09-28

    Global optimization of binary clusters has been a difficult task despite much effort and many efficient methods. Directed toward the two types of elements (i.e., the homotop problem) in binary clusters, two classes of virtual dynamic lattices are constructed and a modified dynamic lattice searching (DLS) method, i.e., the binary DLS (BDLS) method, is developed. However, it was found that the BDLS can only be utilized for the optimization of binary clusters with small sizes, because the homotop problem is hard to solve without an atomic exchange operation. Therefore, the iterated local search (ILS) method is adopted to solve the homotop problem, and an efficient method based on the BDLS method and ILS, named BDLS-ILS, is presented for global optimization of binary clusters. In order to assess the efficiency of the proposed method, binary Lennard-Jones clusters with up to 100 atoms are investigated. Results show that the method is efficient. Furthermore, the BDLS-ILS method is also adopted to study the geometrical structures of (AuPd)79 clusters with DFT-fitted parameters of the Gupta potential.

  8. Fast optimization of binary clusters using a novel dynamic lattice searching method

    SciTech Connect

    Wu, Xia Cheng, Wen

    2014-09-28

    Global optimization of binary clusters has been a difficult task despite much effort and many efficient methods. Directed toward the two types of elements (i.e., the homotop problem) in binary clusters, two classes of virtual dynamic lattices are constructed and a modified dynamic lattice searching (DLS) method, i.e., the binary DLS (BDLS) method, is developed. However, it was found that the BDLS can only be utilized for the optimization of binary clusters with small sizes, because the homotop problem is hard to solve without an atomic exchange operation. Therefore, the iterated local search (ILS) method is adopted to solve the homotop problem, and an efficient method based on the BDLS method and ILS, named BDLS-ILS, is presented for global optimization of binary clusters. In order to assess the efficiency of the proposed method, binary Lennard-Jones clusters with up to 100 atoms are investigated. Results show that the method is efficient. Furthermore, the BDLS-ILS method is also adopted to study the geometrical structures of (AuPd)79 clusters with DFT-fitted parameters of the Gupta potential.

  9. Distribution Planning: An Integration of Constraint Satisfaction & Heuristic Search Techniques

    DTIC Science & Technology

    1990-01-01

    Goyal, Rajay; Sathi, Neena; Elm, Bill; Johnson, Ivan (Carnegie Group Inc.); Fox, Mark (The Robotics Institute, Carnegie Mellon University)

    BACKGROUND: The dynamics and complexity of logistics planning require decision support tools

  10. A new method to search for high-redshift clusters using photometric redshifts

    SciTech Connect

    Castignani, G.; Celotti, A.; Chiaberge, M.; Norman, C.

    2014-09-10

    We describe a new method (Poisson probability method, PPM) to search for high-redshift galaxy clusters and groups by using photometric redshift information and galaxy number counts. The method relies on Poisson statistics and is primarily introduced to search for megaparsec-scale environments around a specific beacon. The PPM is tailored to both the properties of the FR I radio galaxies in the Chiaberge et al. sample, which are selected within the COSMOS survey, and to the specific data set used. We test the efficiency of our method of searching for cluster candidates against simulations. Two different approaches are adopted. (1) We use two z ∼ 1 X-ray detected cluster candidates found in the COSMOS survey and we shift them to higher redshift up to z = 2. We find that the PPM detects the cluster candidates up to z = 1.5, and it correctly estimates both the redshift and size of the two clusters. (2) We simulate spherically symmetric clusters of different size and richness, and we locate them at different redshifts (i.e., z = 1.0, 1.5, and 2.0) in the COSMOS field. We find that the PPM detects the simulated clusters within the considered redshift range with a statistical 1σ redshift accuracy of ∼0.05. The PPM is an efficient alternative method for high-redshift cluster searches that may also be applied to both present and future wide field surveys such as SDSS Stripe 82, LSST, and Euclid. Accurate photometric redshifts and a survey depth similar or better than that of COSMOS (e.g., I < 25) are required.
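
    The Poisson core of such an overdensity search can be sketched in a few lines, using scipy's Poisson survival function; the counts and threshold below are illustrative, not COSMOS measurements.

      # Sketch: probability of finding at least N photo-z-selected galaxies
      # in an aperture given the mean background count.
      from scipy.stats import poisson

      mean_bg = 4.2      # expected galaxies per aperture in the photo-z slice
      observed = 12      # counts around the beacon (e.g., an FR I radio galaxy)

      # P(N >= observed) under the background-only hypothesis
      p = poisson.sf(observed - 1, mean_bg)
      print(f"Poisson tail probability: {p:.2e}")
      if p < 1e-3:       # illustrative detection threshold
          print("candidate megaparsec-scale overdensity")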

  11. Judgment under Uncertainty: Heuristics and Biases.

    PubMed

    Tversky, A; Kahneman, D

    1974-09-27

    This article described three heuristics that are employed in making judgements under uncertainty: (i) representativeness, which is usually employed when people are asked to judge the probability that an object or event A belongs to class or process B; (ii) availability of instances or scenarios, which is often employed when people are asked to assess the frequency of a class or the plausibility of a particular development; and (iii) adjustment from an anchor, which is usually employed in numerical prediction when a relevant value is available. These heuristics are highly economical and usually effective, but they lead to systematic and predictable errors. A better understanding of these heuristics and of the biases to which they lead could improve judgements and decisions in situations of uncertainty.

  12. Combining heuristic and statistical techniques in landslide hazard assessments

    NASA Astrophysics Data System (ADS)

    Cepeda, Jose; Schwendtner, Barbara; Quan, Byron; Nadim, Farrokh; Diaz, Manuel; Molina, Giovanni

    2014-05-01

    As a contribution to the Global Assessment Report 2013 - GAR2013, coordinated by the United Nations International Strategy for Disaster Reduction - UNISDR, a drill-down exercise for landslide hazard assessment was carried out by entering the results of both heuristic and statistical techniques into a new but simple combination rule. The data available for this evaluation included landslide inventories, both historical and event-based. In addition to the application of a heuristic method used in the previous editions of GAR, the availability of inventories motivated the use of statistical methods. The heuristic technique is largely based on the Mora & Vahrson method, which estimates hazard as the product of susceptibility and triggering factors, where classes are weighted based on expert judgment and experience. Two statistical methods were also applied: the landslide index method, which estimates weights of the classes for the susceptibility and triggering factors based on the evidence provided by the density of landslides in each class of the factors; and the weights of evidence method, which extends the previous technique to include both positive and negative evidence of landslide occurrence in the estimation of weights for the classes. One key aspect during the hazard evaluation was the decision on the methodology to be chosen for the final assessment. Instead of opting for a single methodology, it was decided to combine the results of the three implemented techniques using a combination rule based on a normalization of the results of each method. The hazard evaluation was performed for both earthquake- and rainfall-induced landslides. The country chosen for the drill-down exercise was El Salvador. The results indicate that highest hazard levels are concentrated along the central volcanic chain and at the centre of the northern mountains.
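
    The landslide index weighting mentioned in the abstract can be sketched as follows: each class of a factor map receives the log ratio of its landslide density to the map-wide density (toy class areas and counts; the weights-of-evidence extension would add negative evidence as well).

      # Sketch: landslide index (information value) weights for one factor.
      import math

      # (class name, total pixels, landslide pixels), e.g. a slope map
      classes = [("0-10 deg", 50000, 20), ("10-25 deg", 30000, 90),
                 ("25-40 deg", 15000, 120), (">40 deg", 5000, 30)]

      total_pix = sum(c[1] for c in classes)
      total_slides = sum(c[2] for c in classes)
      overall = total_slides / total_pix

      for name, pix, slides in classes:
          density = slides / pix
          w = math.log(density / overall) if slides else 0.0  # landslide index
          print(f"{name:>10}: weight = {w:+.2f}")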

  13. Heuristics for the inversion median problem

    PubMed Central

    2010-01-01

    Background The study of genome rearrangements has become a mainstay of phylogenetics and comparative genomics. Fundamental in such a study is the median problem: given three genomes find a fourth that minimizes the sum of the evolutionary distances between itself and the given three. Many exact algorithms and heuristics have been developed for the inversion median problem, of which the best known is MGR. Results We present a unifying framework for median heuristics, which enables us to clarify existing strategies and to place them in a partial ordering. Analysis of this framework leads to a new insight: the best strategies continue to refer to the input data rather than reducing the problem to smaller instances. Using this insight, we develop a new heuristic for inversion medians that uses input data to the end of its computation and leverages our previous work with DCJ medians. Finally, we present the results of extensive experimentation showing that our new heuristic outperforms all others in accuracy and, especially, in running time: the heuristic typically returns solutions within 1% of optimal and runs in seconds to minutes even on genomes with 25'000 genes--in contrast, MGR can take days on instances of 200 genes and cannot be used beyond 1'000 genes. Conclusion Finding good rearrangement medians, in particular inversion medians, had long been regarded as the computational bottleneck in whole-genome studies. Our new heuristic for inversion medians, ASM, which dominates all others in our framework, puts that issue to rest by providing near-optimal solutions within seconds to minutes on even the largest genomes. PMID:20122203

  14. Optimum tuned mass damper design using harmony search with comparison of classical methods

    NASA Astrophysics Data System (ADS)

    Nigdeli, Sinan Melih; Bekdaş, Gebrail; Sayin, Baris

    2017-07-01

    As is known, tuned mass dampers (TMDs) are added to mechanical systems in order to obtain good vibration damping. The main aim is to reduce the maximum amplitude at the resonance state. In this study, a metaheuristic algorithm called harmony search is employed for the optimum design of TMDs. As the optimization objective, the transfer function of the acceleration of the system with respect to ground acceleration is minimized. The numerical trials were conducted for 4 single-degree-of-freedom systems and the results were compared with classical methods. In conclusion, the proposed method is feasible and more effective than the other documented methods.
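
    A minimal harmony search sketch for TMD tuning is shown below. For brevity it minimizes the peak displacement transfer of the main mass of an SDOF-plus-TMD model rather than the paper's acceleration transfer function, and all parameters (mass ratio, bounds, HMCR, PAR, bandwidth) are illustrative.

      # Sketch: harmony search tuning TMD stiffness kd and damping cd.
      import numpy as np

      rng = np.random.default_rng(8)
      m1, k1, c1 = 1.0, 100.0, 0.6       # main structure
      m2 = 0.05                          # 5% mass-ratio TMD
      w = np.linspace(5.0, 15.0, 600)    # frequency grid (rad/s)

      def peak_transfer(kd, cd):
          M = np.array([[m1, 0.0], [0.0, m2]])
          C = np.array([[c1 + cd, -cd], [-cd, cd]])
          K = np.array([[k1 + kd, -kd], [-kd, kd]])
          H = []
          for wi in w:
              A = K - wi**2 * M + 1j * wi * C
              x = np.linalg.solve(A, -M @ np.ones(2))  # ground-acceleration input
              H.append(abs(x[0]))
          return max(H)

      HMS, HMCR, PAR = 10, 0.9, 0.3
      BW = np.array([0.5, 0.05])                       # pitch-adjust bandwidth
      lo, hi = np.array([1.0, 0.01]), np.array([10.0, 1.0])
      HM = rng.uniform(lo, hi, size=(HMS, 2))          # harmony memory
      cost = np.array([peak_transfer(*h) for h in HM])
      for it in range(300):
          new = np.empty(2)
          for d in range(2):
              if rng.random() < HMCR:                  # draw from memory
                  new[d] = HM[rng.integers(HMS), d]
                  if rng.random() < PAR:               # pitch adjustment
                      new[d] += rng.uniform(-1, 1) * BW[d]
              else:                                    # random consideration
                  new[d] = rng.uniform(lo[d], hi[d])
          new = np.clip(new, lo, hi)
          c = peak_transfer(*new)
          worst = cost.argmax()
          if c < cost[worst]:                          # replace worst harmony
              HM[worst], cost[worst] = new, c
      best = HM[cost.argmin()]
      print(f"kd = {best[0]:.2f}, cd = {best[1]:.3f}, peak |H| = {cost.min():.3f}")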

  15. Search automation of the generalized method of device operational characteristics improvement

    NASA Astrophysics Data System (ADS)

    Petrova, I. Yu; Puchkova, A. A.; Zaripova, V. M.

    2017-01-01

    The article presents brief results of an analysis of existing methods for searching for the closest patents, which can be applied to determine generalized methods of improving device operational characteristics. The most widespread clustering algorithms, and metrics for determining the degree of proximity between two documents, are reviewed. The article proposes a technique for determining generalized methods; it has two implementation variants and consists of 7 steps. This technique has been implemented in the "Patents search" subsystem of the "Intellect" system. The article also gives an example of the use of the proposed technique.

  16. Developing new mathematical method for search of the time series periodicity with deletions and insertions

    NASA Astrophysics Data System (ADS)

    Korotkov, E. V.; Korotkova, M. A.

    2017-01-01

    The purpose of this study was to detect latent periodicity in the presence of deletions or insertions in the analyzed data, when the points of deletions or insertions are unknown. A mathematical method was developed to search for periodicity in numerical series, using dynamic programming and random matrices. The developed method was applied to search for periodicity in the Euro/Dollar exchange rate since 2001. The presence of periodicity with a period length of 24 h was shown in the analyzed financial series. The periodicity can be detected only when insertions and deletions are taken into account. The results of this study show that the periodicity phase shifts depend on the observation time. The reasons for the existence of the periodicity in financial series are discussed.

  17. Protein structure prediction using hybrid AI methods

    SciTech Connect

    Guan, X.; Mural, R.J.; Uberbacher, E.C.

    1993-11-01

    This paper describes a new approach for predicting protein structures based on Artificial Intelligence methods and genetic algorithms. We combine nearest neighbor searching algorithms, neural networks, heuristic rules and genetic algorithms to form an integrated system to predict protein structures from their primary amino acid sequences. First we describe our methods and how they are integrated, and then apply our methods to several protein sequences. The results are very close to the real structures obtained by crystallography. Parallel genetic algorithms are also implemented.

  18. Engineering Bacteria to Search for Specific Concentrations of Molecules by a Systematic Synthetic Biology Design Method.

    PubMed

    Tien, Shin-Ming; Hsu, Chih-Yuan; Chen, Bor-Sen

    2016-01-01

    Bacteria navigate environments full of various chemicals to seek favorable places for survival by controlling the flagella's rotation using a complicated signal transduction pathway. By influencing the pathway, bacteria can be engineered to search for specific molecules, which has great potential for application to biomedicine and bioremediation. In this study, genetic circuits were constructed to make bacteria search for a specific molecule at particular concentrations in their environment through a synthetic biology method. In addition, by replacing the "brake component" in the synthetic circuit with some specific sensitivities, the bacteria can be engineered to locate areas containing specific concentrations of the molecule. Measured by the swarm assay qualitatively and microfluidic techniques quantitatively, the characteristics of each "brake component" were identified and represented by a mathematical model. Furthermore, we established another mathematical model to anticipate the characteristics of the "brake component". Based on this model, an abundant component library can be established to provide adequate component selection for different searching conditions without identifying all components individually. Finally, a systematic design procedure was proposed. Following this systematic procedure, one can design a genetic circuit for bacteria to rapidly search for and locate different concentrations of particular molecules by selecting the most adequate "brake component" in the library. Moreover, following simple procedures, one can also establish an exclusive component library suitable for other cultivated environments, promoter systems, or bacterial strains.

  19. Efficient strategies for genomic searching using the affected-pedigree-member method of linkage analysis

    SciTech Connect

    Brown, D.L.; Gorin, M.B.; Weeks, D.E. )

    1994-03-01

    The affected-pedigree-member (APM) method of linkage analysis is a nonparametric statistic that tests for nonrandom cosegregation of disease and marker loci. The APM statistic is based on the observation that if a marker locus is near a disease-susceptibility locus, then affected individuals within a family should be more similar at the marker locus than is expected by chance. The APM statistic measures marker similarity in terms of identity by state (IBS) of marker alleles; that is, two alleles are IBS if they are the same, regardless of their ancestral origin. Since the APM statistic measures increased marker similarity, it makes no assumptions concerning how the disease is inherited; this can be an advantage when dealing with complex diseases for which the mode of inheritance is difficult to determine. The authors investigate here the power of the APM statistic to detect linkage in the context of a genomewide search. In such a search, the APM statistic is evaluated at a grid of markers. Then regions with high APM statistics are investigated more thoroughly by typing more markers in the region. Using simulated data, they investigate various search strategies and recommend an optimal search strategy that maximizes the power to detect linkage while minimizing the false-positive rate and the number of markers. They determine an optimal series of three increasing cut-points and an independent criterion for significance. 14 refs., 7 figs., 4 tabs.
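
    The IBS similarity at the heart of the APM statistic can be sketched as a genotype-pair score; the full APM statistic additionally weights sharing by allele frequency, which is omitted here.

      # Sketch: count alleles shared by state between two genotypes.
      def ibs_score(gt1, gt2):
          """Alleles shared by state between genotypes, e.g. ('A', 'B')."""
          shared, used = 0, []
          for a in gt1:
              for i, b in enumerate(gt2):
                  if i not in used and a == b:
                      shared += 1
                      used.append(i)
                      break
          return shared            # 0, 1, or 2

      print(ibs_score(("A", "B"), ("B", "B")))   # -> 1
      print(ibs_score(("A", "B"), ("A", "B")))   # -> 2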

  20. Engineering Bacteria to Search for Specific Concentrations of Molecules by a Systematic Synthetic Biology Design Method

    PubMed Central

    Chen, Bor-Sen

    2016-01-01

    Bacteria navigate environments full of various chemicals to seek favorable places for survival by controlling the flagella’s rotation using a complicated signal transduction pathway. By influencing the pathway, bacteria can be engineered to search for specific molecules, which has great potential for application to biomedicine and bioremediation. In this study, genetic circuits were constructed to make bacteria search for a specific molecule at particular concentrations in their environment through a synthetic biology method. In addition, by replacing the “brake component” in the synthetic circuit with some specific sensitivities, the bacteria can be engineered to locate areas containing specific concentrations of the molecule. Measured by the swarm assay qualitatively and microfluidic techniques quantitatively, the characteristics of each “brake component” were identified and represented by a mathematical model. Furthermore, we established another mathematical model to anticipate the characteristics of the “brake component”. Based on this model, an abundant component library can be established to provide adequate component selection for different searching conditions without identifying all components individually. Finally, a systematic design procedure was proposed. Following this systematic procedure, one can design a genetic circuit for bacteria to rapidly search for and locate different concentrations of particular molecules by selecting the most adequate “brake component” in the library. Moreover, following simple procedures, one can also establish an exclusive component library suitable for other cultivated environments, promoter systems, or bacterial strains. PMID:27096615

  1. A new conjugate gradient method with sufficient descent without any line search for unconstrained optimization

    NASA Astrophysics Data System (ADS)

    Omer, Osman; Rivaie, Mohd; Mamat, Mustafa; Amani, Zahrahtul

    2015-02-01

    Conjugate gradient methods are among the most used methods for solving nonlinear unconstrained optimization problems, especially of large scale. Their wide application is due to their simplicity and low memory requirements. The sufficient descent property is an important issue in the analysis and implementation of conjugate gradient methods. In this paper, a new conjugate gradient method is proposed for unconstrained optimization problems. The theoretical analysis shows that the directions generated by the new method always satisfy the sufficient descent property, and this property is independent of the line search used. Furthermore, a numerical experiment comparing the new method with other known conjugate gradient methods shows that the new method is efficient for some unconstrained optimization problems.
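
    For reference, a generic conjugate gradient direction update and the sufficient descent condition the abstract refers to are, in standard notation (the paper's specific beta_k formula is not given in the abstract):

      \[
      d_0 = -g_0, \qquad d_{k+1} = -g_{k+1} + \beta_k\, d_k ,
      \]
      \[
      g_k^{\top} d_k \;\le\; -c\,\lVert g_k \rVert^2
      \quad \text{for some fixed } c > 0 \text{ and all } k ,
      \]

    where g_k is the gradient at iterate k; "independent of the line search" means the second inequality holds for every step size produced by the line search.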

  2. Cryptanalysis of optical encryption: a heuristic approach

    NASA Astrophysics Data System (ADS)

    Gopinathan, Unnikrishnan; Monaghan, David S.; Naughton, Thomas J.; Sheridan, John T.

    2006-10-01

    The Fourier plane encryption algorithm is subjected to a heuristic known-plaintext attack. The simulated annealing algorithm is used to estimate the key using a known plaintext-ciphertext pair which decrypts the ciphertext with arbitrarily low error. The strength of the algorithm is tested by using the key to decrypt a different ciphertext encrypted using the same original key. The Fourier plane encryption algorithm is found to be susceptible to a known-plaintext heuristic attack. It is found that phase only encryption, a variation of Fourier plane encoding algorithm, successfully defends against this attack.
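
    The attack can be sketched on a tiny double-random-phase model: anneal a Fourier-plane phase key until decrypting the known ciphertext reproduces the known plaintext. The image size, perturbation move, and cooling schedule below are illustrative, not the paper's settings.

      # Sketch: known-plaintext simulated-annealing key estimation for
      # Fourier-plane (double random phase) encryption.
      import numpy as np

      rng = np.random.default_rng(9)
      N = 16
      f = rng.random((N, N))                               # known plaintext
      phi1 = np.exp(2j * np.pi * rng.random((N, N)))       # input-plane key
      phi2 = np.exp(2j * np.pi * rng.random((N, N)))       # Fourier-plane key
      cipher = np.fft.ifft2(np.fft.fft2(f * phi1) * phi2)  # known ciphertext

      def decrypt(c, key):
          # Undo the Fourier-plane key; |.| removes the input-plane phase.
          return np.abs(np.fft.ifft2(np.fft.fft2(c) * np.conj(key)))

      key = np.exp(2j * np.pi * rng.random((N, N)))        # initial key guess
      err = np.mean((decrypt(cipher, key) - f) ** 2)
      T = 1e-2
      for step in range(20000):
          i, j = rng.integers(N, size=2)
          trial = key.copy()
          trial[i, j] = np.exp(2j * np.pi * rng.random())  # perturb one phase
          e = np.mean((decrypt(cipher, trial) - f) ** 2)
          if e < err or rng.random() < np.exp(-(e - err) / T):
              key, err = trial, e
          T *= 0.9997
      print(f"final MSE against known plaintext: {err:.2e}")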

  3. The Search Conference as a Method in Planning Community Health Promotion Actions

    PubMed Central

    Magnus, Eva; Knudtsen, Margunn Skjei; Wist, Guri; Weiss, Daniel; Lillefjell, Monica

    2016-01-01

    Aims: The aim of this article is to describe and discuss how the search conference can be used as a method for planning health promotion actions in local communities. Design and methods: The article draws on experiences with using the method for an innovative project in health promotion in three Norwegian municipalities. The method is described both in general and as it was specifically adopted for the project. Results and conclusions: The search conference as a method was used to develop evidence-based health promotion action plans. With its use of both bottom-up and top-down approaches, this method is a relevant strategy for involving a community in the planning stages of health promotion actions, in line with political expectations of participation, ownership, and evidence-based initiatives. Significance for public health: This article describes and discusses how the search conference can be used as a method when working with knowledge-based health promotion actions in local communities. The article describes the sequences of the conference and shows how these were adapted when planning and prioritizing health promotion actions in three Norwegian municipalities. The significance of the article is that it shows how central elements in the planning of health promotion actions, such as participation and involvement as well as evidence, were fundamental to how the conference was conducted. The article goes on to discuss how the method functions as both a top-down and a bottom-up strategy, and in what ways evidence-based work can conflict with a bottom-up strategy. The experiences described can be used as guidance for planning knowledge-based health promotion actions in communities. PMID:27747199

  4. Design of a Pilot-Activated Recovery System Using Genetic Search Methods

    DTIC Science & Technology

    1999-01-01

    Sweriduk, G. D.; Menon, P. K. (Optimal Synthesis Inc., Palo Alto, CA); Steinberg, M. L. (Naval Air Systems Command, Aeromechanics Division). [Only title-page front matter is present in this record; no abstract text is recoverable.]

  5. A new correlation parameter extraction method for searching mode sea clutter restraint

    NASA Astrophysics Data System (ADS)

    Yuan, Xujin; Chen, Yong; Wang, Chao; Yin, Hongcheng; Yao, Jingping; Xu, Zhiming; Lu, Yongge

    2014-11-01

    Correlation characteristics, which are universal in certain respects for both radar and optical images, are important for sea clutter restraint in radar image processing. The spatial-temporal dispersion relation contained in sea clutter, which usually fits the sum of a wind-friction linear term and a gravity-wave term, has recently been proved effective for sea clutter restraint. A sea clutter restraint method for a mobile searching-mode observation point, which is prevalent on maritime airborne platforms, is developed from an earlier restraint method for shore-based stationary radar. A new method for extracting the parameter of the intrinsic linear dispersion term of sea clutter from the range profile of a searching-mode radar system is proposed, and it forms the core of the restraint method. The statistical model is founded on the spatial-temporal dispersion relation and the radar illumination geometry. Simulation results show that the Doppler shift of the measured wind linear term has a bias from the sum of the sea wind and radar platform velocities. A systematic study suggests the bias can be attributed to interaction between the wind-friction and gravity-wave terms, and a rather good fitting result is obtained. CFAR detection processing of a set of experimental clutter data shows that the intrinsic dispersion extraction formula is effective in enhancing detection probability. The restraint method based on the proposed dispersion extraction could be utilized in mobile maritime surveillance equipment.
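
    For orientation, the dispersion relation referred to above is commonly written as the sum of a linear wind-drift term and the deep-water gravity-wave term; this generic textbook form is an assumption here, since the abstract does not give the paper's exact parameterization:

```latex
\omega(k) \;\approx\; \underbrace{a\,k}_{\text{wind-friction (linear) term}} \;+\; \underbrace{\sqrt{g\,k}}_{\text{gravity-wave term}}
```

    with wavenumber k, gravitational acceleration g, and an effective wind-drift speed a.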

  6. The Gaussian CLs method for searches of new physics

    SciTech Connect

    Qian, X.; Tan, A.; Ling, J. J.; Nakajima, Y.; Zhang, C.

    2016-04-23

    Here we describe a method based on the CLs approach to present results in searches for new physics, under the condition that the relevant parameter space is continuous. Our method relies on a class of test statistics developed for non-nested hypothesis testing problems, denoted by ΔT, which has a Gaussian approximation to its parent distribution when the sample size is large. This leads to a simple procedure of forming exclusion sets for the parameters of interest, which we call the Gaussian CLs method. Our work provides a self-contained mathematical proof for the Gaussian CLs method that explicitly outlines the required conditions. These conditions are milder than those required by Wilks' theorem to set confidence intervals (CIs). We illustrate the Gaussian CLs method in an example of searching for a sterile neutrino, where the CLs approach was rarely used before. We also compare data analysis results produced by the Gaussian CLs method and various CI methods to showcase their differences.
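
    A small sketch of how an exclusion test could be formed once the Gaussian approximation of ΔT is in hand; the means and widths under each hypothesis, and the sign convention for ΔT, are assumptions of this illustration rather than details taken from the paper.

```python
from scipy.stats import norm

def gaussian_cls(delta_t_obs, mu_h1, sigma_h1, mu_h0, sigma_h0):
    """CLs from a Gaussian approximation to the Delta-T statistic.

    mu_*/sigma_* are the approximate mean and width of Delta-T under the
    tested hypothesis H1 and the alternative H0; the sign convention
    (larger Delta-T disfavours H1) is an assumption of this sketch.
    """
    cl_sb = norm.sf(delta_t_obs, loc=mu_h1, scale=sigma_h1)  # tail prob under H1
    cl_b = norm.sf(delta_t_obs, loc=mu_h0, scale=sigma_h0)   # tail prob under H0
    return cl_sb / cl_b

# a parameter point is excluded at confidence level CL when CLs < 1 - CL,
# e.g. CLs < 0.05 for 95% CL exclusion
```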

  7. Mixed Integer Programming and Heuristic Scheduling for Space Communication Networks

    NASA Technical Reports Server (NTRS)

    Lee, Charles H.; Cheung, Kar-Ming

    2012-01-01

    In this paper, we propose to solve the constrained optimization problem in two phases. The first phase uses heuristic methods such as the ant colony method, particle swarm optimization, and genetic algorithms to seek a near-optimal solution among a list of feasible initial populations. The final optimal solution is then found by using the solution of the first phase as the initial condition for a sequential quadratic programming (SQP) algorithm. We demonstrate the above problem formulation and optimization schemes with a large-scale network that includes the DSN ground stations and a number of deep space mission spacecraft.
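
    The two-phase idea can be sketched with stock SciPy optimizers, using differential evolution as a stand-in for the GA/PSO/ant-colony phase and SLSQP as the SQP-type polisher; the objective and bounds below are placeholders only, not the scheduling problem of the paper.

```python
import numpy as np
from scipy.optimize import differential_evolution, minimize

# Phase 1: a population-based heuristic finds a near-optimal point.
# Phase 2: a gradient-based SQP-type solver polishes that point.
def objective(x):
    return np.sum((x - 0.3) ** 2) + 0.1 * np.sum(np.sin(12.0 * x))

bounds = [(-1.0, 1.0)] * 4
coarse = differential_evolution(objective, bounds, seed=0, maxiter=100)
polished = minimize(objective, x0=coarse.x, method="SLSQP", bounds=bounds)
print(coarse.fun, polished.fun)   # the phase-2 result should be at least as good
```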

  8. Heuristic optimization in penumbral image for high resolution reconstructed image

    SciTech Connect

    Azuma, R.; Nozaki, S.; Fujioka, S.; Chen, Y. W.; Namihira, Y.

    2010-10-15

    Penumbral imaging is a technique which uses the fact that spatial information can be recovered from the shadow, or penumbra, that an unknown source casts through a simple large circular aperture. The size of the penumbral image on the detector can be determined mathematically from the aperture size, object size, and magnification. Conventional reconstruction methods are very sensitive to noise, whereas the heuristic reconstruction method is very tolerant of it. However, the aperture size influences the accuracy and resolution of the reconstructed image. In this article, we propose an optimization of the aperture size for neutron penumbral imaging.
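
    For the geometric relation mentioned above, a standard point-projection argument (an assumption here; the abstract does not spell out the geometry) gives the penumbral image diameter as

```latex
D \;\approx\; a\,(1 + M) + M\,s, \qquad M = \frac{L_2}{L_1},
```

    where a is the aperture diameter, s the source (object) size, and L1, L2 the source-aperture and aperture-detector distances; enlarging a brightens the image but widens the penumbra that the reconstruction must invert.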

  9. Neighbourhood search feature selection method for content-based mammogram retrieval.

    PubMed

    Chandy, D Abraham; Christinal, A Hepzibah; Theodore, Alwyn John; Selvan, S Easter

    2017-03-01

    Content-based image retrieval plays an increasingly important role in supporting diagnosis in the clinical process. This paper proposes a neighbourhood search method to select near-optimal feature subsets for the retrieval of mammograms from the Mammographic Image Analysis Society (MIAS) database. Features based on the grey-level co-occurrence matrix, Daubechies-4 wavelet, Gabor features, Cohen-Daubechies-Feauveau 9/7 wavelet, and Zernike moments are extracted from mammograms in the MIAS database to form the combined, or fused, feature set for testing various feature selection methods. The performance of the feature selection methods is evaluated using precision, storage requirement, and retrieval time. Using the proposed method, a significant improvement is achieved in mean precision rate and feature dimension. The results show that the proposed method outperforms state-of-the-art feature selection methods.
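
    A minimal bit-flip neighbourhood search over feature subsets conveys the general idea; the scorer `evaluate(mask)` (e.g., mean retrieval precision on the selected feature columns) and the single-flip neighbourhood are assumptions of this sketch, not the paper's exact definitions.

```python
import numpy as np

def neighbourhood_search(n_features, evaluate, iters=500, seed=0):
    """Bit-flip neighbourhood search over feature subsets (illustrative)."""
    rng = np.random.default_rng(seed)
    mask = rng.random(n_features) < 0.5       # random initial subset
    best = evaluate(mask)
    for _ in range(iters):
        j = rng.integers(n_features)
        mask[j] = not mask[j]                 # move to a neighbour: flip one bit
        score = evaluate(mask)
        if score >= best:
            best = score                      # accept the improving neighbour
        else:
            mask[j] = not mask[j]             # revert and try another flip
    return mask, best
```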

  10. On methods for correcting for the look-elsewhere effect in searches for new physics

    NASA Astrophysics Data System (ADS)

    Algeri, S.; van Dyk, D. A.; Conrad, J.; Anderson, B.

    2016-12-01

    The search for new significant peaks over an energy spectrum often involves a statistical multiple hypothesis testing problem. Separate tests of hypothesis are conducted at different locations over a fine grid, producing an ensemble of local p-values, the smallest of which is reported as evidence for the new resonance. Unfortunately, controlling the false detection rate (type I error rate) of such procedures may lead to excessively stringent acceptance criteria. In the recent physics literature, two promising statistical tools have been proposed to overcome these limitations. In 2005, a method to ``find needles in haystacks'' was introduced by Pilla et al. [1], and a second method was later proposed by Gross and Vitells [2] in the context of the ``look-elsewhere effect'' and trial factors. We show that, although the two methods exhibit similar performance for large sample sizes, for relatively small sample sizes the method of Pilla et al. leads to an artificial inflation of statistical power that stems from an increase in the false detection rate. The method of Pilla et al., on the other hand, becomes particularly useful in multidimensional searches, where the Monte Carlo simulations required by Gross and Vitells are often infeasible. We apply the methods to realistic simulations of the Fermi Large Area Telescope data, in particular the search for dark matter annihilation lines. Further, we discuss the counter-intuitive scenario where the look-elsewhere corrections are more conservative than much more computationally efficient corrections for multiple hypothesis testing. Finally, we provide general guidelines for navigating the tradeoffs between statistical and computational efficiency when selecting a statistical procedure for signal detection.
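
    For reference, the Gross and Vitells correction discussed above is usually quoted, for a one-dimensional scan with a chi-square local test statistic with one degree of freedom, as the upcrossing bound (the standard form from the literature, not necessarily the exact notation of [2]):

```latex
p_{\mathrm{global}} \;\le\; P\!\left(\chi^2_1 > c\right) \;+\; \left\langle N(c_0) \right\rangle \, e^{-(c - c_0)/2},
```

    where c is the observed local threshold and ⟨N(c₀)⟩ is the mean number of upcrossings of a lower reference level c₀, estimated from a small number of Monte Carlo scans.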

  11. Improved Shear Wave Group Velocity Estimation Method Based on Spatiotemporal Peak and Thresholding Motion Search.

    PubMed

    Amador Carrascal, Carolina; Chen, Shigao; Manduca, Armando; Greenleaf, James F; Urban, Matthew W

    2017-04-01

    Quantitative ultrasound elastography is increasingly being used in the assessment of chronic liver disease. Many studies have reported ranges of liver shear wave velocity values for healthy individuals and patients with different stages of liver fibrosis. Nonetheless, ongoing efforts exist to stabilize quantitative ultrasound elastography measurements by assessing factors that influence tissue shear wave velocity values, such as food intake, body mass index, ultrasound scanners, scanning protocols, and ultrasound image quality. Time-to-peak (TTP) methods have been routinely used to measure the shear wave velocity. However, there is still a need for methods that can provide robust shear wave velocity estimation in the presence of noisy motion data. The conventional TTP algorithm is limited to searching for the maximum motion in time profiles at different spatial locations. In this paper, two modified shear wave speed estimation algorithms are proposed. The first method searches for the maximum motion in both space and time [spatiotemporal peak (STP)]; the second method applies an amplitude filter [spatiotemporal thresholding (STTH)] to select points with motion amplitude higher than a threshold for shear wave group velocity estimation. The two proposed methods (STP and STTH) showed higher precision in shear wave velocity estimates compared with TTP in phantom. Moreover, in a cohort of 14 healthy subjects, STP and STTH methods improved both the shear wave velocity measurement precision and the success rate of the measurement compared with conventional TTP.
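
    A compact sketch of the spatiotemporal peak-and-threshold idea follows, assuming motion data on a regular (position, time) grid; the threshold fraction and the linear fit of position against arrival time are illustrative, not the authors' implementation.

```python
import numpy as np

def shear_wave_speed(motion, x, t, frac=0.5):
    """Spatiotemporal peak / thresholding sketch (illustrative).

    motion : (n_x, n_t) array of particle-motion amplitude at lateral
    positions x [m] and times t [s].  Points whose amplitude exceeds
    `frac` times the global spatiotemporal peak are kept, and the group
    speed is the slope of a line fit of position against arrival time.
    """
    peak = motion.max()                         # spatiotemporal peak (STP)
    ix, it = np.where(motion >= frac * peak)    # thresholded points (STTH)
    v, _ = np.polyfit(t[it], x[ix], deg=1)      # x ~ v * t + b => slope = speed
    return v
```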

  12. Hybridization of evolutionary algorithms and local search by means of a clustering method.

    PubMed

    Martínez-Estudillo, Alfonso C; Hervás-Martínez, César; Martínez-Estudillo, Francisco J; García-Pedrajas, Nicolás

    2006-06-01

    This paper presents a hybrid evolutionary algorithm (EA) to solve nonlinear regression problems. Although EAs have proven their ability to explore large search spaces, they are comparatively inefficient at fine-tuning the solution. This drawback is usually avoided by means of local optimization algorithms that are applied to the individuals of the population. Algorithms that use local optimization procedures are usually called hybrid algorithms. On the other hand, it is well known that the clustering process enables the creation of groups (clusters) of mutually close points that hopefully correspond to relevant regions of attraction; local-search procedures can then be started once in every such region. This paper proposes the combination of an EA, a clustering process, and a local-search procedure for the evolutionary design of product-unit neural networks. In the methodology presented, only a few individuals are subject to local optimization. Moreover, the local optimization algorithm is only applied at specific stages of the evolutionary process. Our results show a favorable performance when the proposed regression method is compared to other standard methods.
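
    One step of the general scheme can be sketched as follows: cluster the current population, then apply local optimization only to the best individual of each cluster. The k-means call and the unconstrained local optimizer are stand-ins; the paper's product-unit network specifics are not reproduced.

```python
import numpy as np
from scipy.cluster.vq import kmeans2
from scipy.optimize import minimize

def hybrid_step(population, fitness, objective, k=4):
    """One generation's hybrid step: cluster, then refine one point per cluster."""
    _, labels = kmeans2(population, k, minit="++")
    refined = population.copy()
    for c in range(k):
        members = np.flatnonzero(labels == c)
        if members.size == 0:
            continue                                 # empty cluster, skip
        best = members[np.argmin(fitness[members])]  # cluster representative
        refined[best] = minimize(objective, population[best]).x  # fine-tune
    return refined
```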

  13. Go3R - semantic Internet search engine for alternative methods to animal testing.

    PubMed

    Sauer, Ursula G; Wächter, Thomas; Grune, Barbara; Doms, Andreas; Alvers, Michael R; Spielmann, Horst; Schroeder, Michael

    2009-01-01

    Consideration and incorporation of all available scientific information is an important part of the planning of any scientific project. As regards research with sentient animals, EU Directive 86/609/EEC for the protection of laboratory animals requires scientists to consider, before performing an experiment, whether any planned animal experiment can be substituted by other scientifically satisfactory methods not entailing the use of animals, or entailing fewer animals or less animal suffering. Thus, collection of relevant information is indispensable in order to meet this legal obligation. However, no standard procedures or services exist to provide convenient access to the information required to reliably determine whether it is possible to replace, reduce or refine a planned animal experiment in accordance with the 3Rs principle. The search engine Go3R, which is available free of charge at http://Go3R.org, is set to become such a standard service. Go3R is the world's first search engine on alternative methods, building on new semantic technologies that use an expert-knowledge-based ontology to identify relevant documents. Due to Go3R's concept and design, the search engine can be used without lengthy instructions. It enables all those involved in the planning, authorisation and performance of animal experiments to determine the availability of non-animal methodologies in a fast, comprehensive and transparent manner. Thereby, Go3R strives to contribute significantly to the avoidance and replacement of animal experiments.

  14. Cooperative unmanned aerial vehicle (UAV) search in dynamic environments using stochastic methods

    NASA Astrophysics Data System (ADS)

    Flint, Matthew D.

    Within this dissertation, the problem of controlling the decentralized path-planning decision processes of multiple cooperating autonomous aerial vehicles engaged in search of an uncertain environment is considered. The environment is modeled in a probabilistic fashion, such that both a priori and dynamic information about it can be incorporated. The components of the environment include both target information and threat information. Using the information about the environment, a computationally feasible decision process is formulated that can decide, in a near-optimal fashion, which path a searching vehicle should take, using a dynamic programming algorithm with a limited lookahead horizon and with the possibility of extending the horizon using approximate dynamic programming. A planning vehicle must take into account the effects of its (local) actions on meeting global goals. This is accomplished using a passive and predictive cooperation scheme among the vehicles. Lastly, a flexible simulator has been developed, using sound simulation analysis methods, to simulate a UAV search team; it can be used to create statistically valid results demonstrating the effectiveness of the model and solution methods.

  15. Midtrimester termination of pregnancy--search for a better method continues.

    PubMed

    Chhabra, S; Menon, G

    1991-11-01

    The near-daily appearance of new methods for termination of second-trimester pregnancy clearly indicates that we have still not found a simple, safe, effective, and economic method of termination of pregnancy in the second trimester. The present study of 855 cases aims to identify better options among the available modalities. The age-old agents hypertonic saline and ethacridine lactate were used with adjuvants such as hyaluronidase and a preparation containing isapgol husk to reduce the injection-abortion interval and the failure rate. The life-threatening dangers of hypertonic saline are known. Ethacridine lactate seems to be safe, and by giving it intra-amniotically with these adjuvants, its major disadvantages could be minimised. There was no mortality; however, there was morbidity in the series.

  16. Discovery of novel mesangial cell proliferation inhibitors using a three-dimensional database searching method.

    PubMed

    Kurogi, Y; Miyata, K; Okamura, T; Hashimoto, K; Tsutsumi, K; Nasu, M; Moriyasu, M

    2001-07-05

    A three-dimensional pharmacophore model of mesangial cell (MC) proliferation inhibitors was generated from a training set of 4-(diethoxyphosphoryl)methyl-N-(3-phenyl-[1,2,4]thiadiazol-5-yl)benzamide, 2, and its derivatives using the Catalyst/HIPHOP software program. On the basis of the in vitro MC proliferation inhibitory activity, a pharmacophore model was generated with seven features consisting of two hydrophobic regions, two hydrophobic aromatic regions, and three hydrogen bond acceptors. Using this model as a three-dimensional query to search the Maybridge database, 41 structurally novel compounds were identified. Evaluation of the available samples among the 41 identified compounds showed over 50% inhibition of MC proliferation in the 100 nM range. Interestingly, the compounds newly identified by the 3D database searching method exhibited reduced inhibition of normal proximal tubular epithelial cell proliferation compared to the training set compounds.

  17. Local search methods based on variable focusing for random K-satisfiability

    NASA Astrophysics Data System (ADS)

    Lemoy, Rémi; Alava, Mikko; Aurell, Erik

    2015-01-01

    We introduce variable-focused local search algorithms for satisfiability problems. Usual approaches focus uniformly on unsatisfied clauses; the methods described here work by focusing on random variables in unsatisfied clauses. Variants are considered where variables are selected uniformly and randomly, or with a bias towards picking variables participating in several unsatisfied clauses. These are studied in the case of the random 3-SAT problem, together with an alternative energy definition, the number of variables in unsatisfied constraints. The variable-based focused Metropolis search (V-FMS) is found to be quite close in performance to the standard clause-based FMS at optimal noise. At infinite noise, instead, the threshold for the linearity of solution times with instance size is improved by preferentially picking variables in several UNSAT clauses. Consequences for algorithmic design are discussed.
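
    A simplified sketch of variable-focused search for 3-SAT follows; drawing a variable by occurrence in unsatisfied clauses automatically biases the choice towards variables in several unsatisfied clauses, and the acceptance rule below (revert worsening flips except with probability `noise`) is a simplification of the Metropolis rule studied in the paper.

```python
import random

def v_fms(clauses, n_vars, noise=0.4, max_flips=10**6, seed=0):
    """Variable-focused local search sketch for random 3-SAT.

    `clauses` is a list of tuples of signed literals, e.g. (1, -4, 7).
    """
    rng = random.Random(seed)
    assign = {v: rng.random() < 0.5 for v in range(1, n_vars + 1)}
    sat = lambda lit: assign[abs(lit)] == (lit > 0)
    unsat = lambda: [c for c in clauses if not any(sat(l) for l in c)]

    for _ in range(max_flips):
        bad = unsat()
        if not bad:
            return assign                       # all clauses satisfied
        v = abs(rng.choice([l for c in bad for l in c]))  # focused, biased pick
        before = len(bad)
        assign[v] = not assign[v]               # tentative flip
        if len(unsat()) > before and rng.random() >= noise:
            assign[v] = not assign[v]           # reject worsening flip
    return None
```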

  18. A Heuristic for the Teaching of Persuasion.

    ERIC Educational Resources Information Center

    Schell, John F.

    Interpreting Aristotle's criteria for persuasive writing--ethos, logos, and pathos--as a concern for writer, language, and audience creates both an effective model for persuasive writing and a structure around which to organize discussions of relevant rhetorical issues. Use of this heuristic to analyze writing style, organization, and content…

  1. Fourth Graders' Heuristic Problem-Solving Behavior.

    ERIC Educational Resources Information Center

    Lee, Kil S.

    1982-01-01

    Eight boys and eight girls from a rural elementary school participated in the investigation. Specific heuristics were adopted from Polya; and the students selected represented two substages of Piaget's concrete operational stage. Five hypotheses were generated, based on observed results and the study's theoretical rationale. (MP)

  2. Investigating Heuristic Evaluation: A Case Study.

    ERIC Educational Resources Information Center

    Goldman, Kate Haley; Bendoly, Laura

    When museum professionals speak of evaluating a web site, they primarily mean formative evaluation, and by that they primarily mean testing the usability of the site. In the for-profit world, usability testing is a multi-million dollar industry, while non-profits often rely on far too few dollars to do too much. Hence, heuristic evaluation is one…

  3. The Heuristic Interpretation of Box Plots

    ERIC Educational Resources Information Center

    Lem, Stephanie; Onghena, Patrick; Verschaffel, Lieven; Van Dooren, Wim

    2013-01-01

    Box plots are frequently used, but are often misinterpreted by students. Especially the area of the box in box plots is often misinterpreted as representing number or proportion of observations, while it actually represents their density. In a first study, reaction time evidence was used to test whether heuristic reasoning underlies this…

  4. Teaching a Heuristic Approach to Information Retrieval.

    ERIC Educational Resources Information Center

    Ury, Connie Jo; And Others

    1997-01-01

    Discusses lifelong learning and the need for information retrieval skills, and describes how Northwest Missouri State University incorporates a heuristic model of library instruction in which students continually evaluate and refine information-seeking practices while progressing through all levels of courses in diverse disciplines. (Author/LRW)

  5. Evaluating Persuasive Messages: Systematic and Heuristic Strategies.

    ERIC Educational Resources Information Center

    White, H. Allen; Miller, M. Mark

    One hundred undergraduate students at a large southern university were the subjects of a study to determine whether the persuasion process encompasses two mutually exclusive strategies--systematic or heuristic processing of information--or whether the two processes are, in fact, independent. Subjects participated in groups of about 15 and were…

  6. Search Control Algorithm Based on Random Step Size Hill-Climbing Method for Adaptive PMD Compensation

    NASA Astrophysics Data System (ADS)

    Tanizawa, Ken; Hirose, Akira

    Adaptive polarization mode dispersion (PMD) compensation is required for the speed-up and advancement of present optical communications. The combination of a tunable PMD compensator and an adaptive control method achieves adaptive PMD compensation. In this paper, we report an effective search control algorithm for the feedback control of the PMD compensator. The algorithm is based on the hill-climbing method, but the step size changes randomly to prevent the convergence from being trapped at a local maximum or on a flat region, unlike in the conventional hill-climbing method. The random step sizes are drawn from Gaussian probability density functions. We conducted transmission simulations at 160 Gb/s, and the results show that the proposed method provides compensator control closer to the optimum than the conventional hill-climbing method.
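
    A toy version of hill climbing with Gaussian-random step sizes is sketched below; the feedback signal `measure(x)` (e.g., monitored signal quality behind the compensator) and the reverse-on-failure rule are assumptions of this sketch, not the reported algorithm's exact control law.

```python
import numpy as np

def random_step_hill_climb(measure, x0, sigma=0.1, iters=2000, seed=0):
    """Hill climbing with Gaussian-random step sizes (illustrative)."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    best = measure(x)
    direction = np.ones_like(x)                        # current climb direction
    for _ in range(iters):
        step = np.abs(rng.normal(0.0, sigma, x.shape)) # Gaussian random size
        candidate = x + direction * step               # randomness helps escape
        value = measure(candidate)                     # local maxima and flats
        if value > best:
            x, best = candidate, value                 # keep climbing this way
        else:
            direction = -direction                     # reverse on failure
    return x, best
```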

  7. Heuristics Made Easy: An Effort-Reduction Framework

    ERIC Educational Resources Information Center

    Shah, Anuj K.; Oppenheimer, Daniel M.

    2008-01-01

    In this article, the authors propose a new framework for understanding and studying heuristics. The authors posit that heuristics primarily serve the purpose of reducing the effort associated with a task. As such, the authors propose that heuristics can be classified according to a small set of effort-reduction principles. The authors use this…

  8. Heuristic Diagrams as a Tool to Teach History of Science

    ERIC Educational Resources Information Center

    Chamizo, Jose A.

    2012-01-01

    The graphic organizer called here a heuristic diagram, proposed as an improvement of Gowin's Vee heuristic, is put forward as a tool to teach the history of science. Heuristic diagrams have the purpose of helping students (or teachers, or researchers) to understand their own research, considering that asking questions and problem solving are central to scientific activity. The…

  9. A Variable-Selection Heuristic for K-Means Clustering.

    ERIC Educational Resources Information Center

    Brusco, Michael J.; Cradit, J. Dennis

    2001-01-01

    Presents a variable selection heuristic for nonhierarchical (K-means) cluster analysis based on the adjusted Rand index for measuring cluster recovery. Subjected the heuristic to Monte Carlo testing across more than 2,200 datasets. Results indicate that the heuristic is extremely effective at eliminating masking variables. (SLD)
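
    A greedy sketch of the adjusted-Rand-index idea: drop a variable when removing it leaves the K-means partition essentially unchanged. The published heuristic is more elaborate; the threshold and the greedy order here are illustrative only.

```python
from sklearn.cluster import KMeans
from sklearn.metrics import adjusted_rand_score

def select_variables(X, k, drop_threshold=0.95, seed=0):
    """Greedy screen for masking variables in K-means (illustrative)."""
    keep = list(range(X.shape[1]))
    base = KMeans(n_clusters=k, n_init=10, random_state=seed).fit_predict(X)
    for j in list(keep):
        if len(keep) == 1:
            break                               # keep at least one variable
        cols = [c for c in keep if c != j]
        labels = KMeans(n_clusters=k, n_init=10,
                        random_state=seed).fit_predict(X[:, cols])
        # if clustering barely changes without j, treat j as a masking variable
        if adjusted_rand_score(base, labels) >= drop_threshold:
            keep.remove(j)
    return keep
```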

  10. Comparing the Precision of Information Retrieval of MeSH-Controlled Vocabulary Search Method and a Visual Method in the Medline Medical Database.

    PubMed

    Hariri, Nadjla; Ravandi, Somayyeh Nadi

    2014-01-01

    Medline is one of the most important databases in the biomedical field. One of the most important hosts for Medline is Elton B. Stephens Co. (EBSCO), which offers several search methods that can be used according to the needs of users; visual search and MeSH-controlled search are among the most common. The goal of this research was to compare the precision of the sources retrieved from the EBSCO Medline database using the MeSH-controlled and visual search methods. This research was a semi-empirical study. In training workshops held in 2012, 70 students of higher education in different educational departments of Kashan University of Medical Sciences were taught the MeSH-controlled and visual search methods. Then, the precision of 300 searches made by these students was calculated using the Best Precision, Useful Precision, and Objective Precision formulas and analyzed in SPSS software using the independent-sample t test, and the three precisions obtained with the three formulas were compared for the two search methods. The mean precision of the visual method was greater than that of the MeSH-controlled search for all three types of precision, and the mean precisions were significantly different (P < 0.001). Sixty-five percent of the researchers indicated that, although the visual method was better than the controlled method, the control of keywords in the controlled method resulted in finding more appropriate keywords for the searches. Fifty-three percent of the participants also mentioned that combining the two methods produced better results. For users, it is more appropriate to use a natural-language-based method, such as the visual method, in the EBSCO Medline host than to use the controlled method, which requires special keywords. The likely reason for this preference is that the visual method allowed them more freedom of action.

  11. Efficient Globally Optimal Consensus Maximisation with Tree Search.

    PubMed

    Chin, Tat-Jun; Purkait, Pulak; Eriksson, Anders; Suter, David

    2017-04-01

    Maximum consensus is one of the most popular criteria for robust estimation in computer vision. Despite its widespread use, optimising the criterion is still customarily done by randomised sample-and-test techniques, which do not guarantee optimality of the result. Several globally optimal algorithms exist, but they are too slow to challenge the dominance of randomised methods. Our work aims to change this state of affairs by proposing an efficient algorithm for global maximisation of consensus. Under the framework of LP-type methods, we show how consensus maximisation for a wide variety of vision tasks can be posed as a tree search problem. This insight leads to a novel algorithm based on A* search. We propose efficient heuristic and support set updating routines that enable A* search to efficiently find globally optimal results. On common estimation problems, our algorithm is much faster than previous exact methods. Our work identifies a promising direction for globally optimal consensus maximisation.
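
    Since the paper poses consensus maximisation as A* tree search, a generic A* skeleton is a useful reference point; the problem-specific heuristic and the support-set updating routines of the paper are abstracted here into the `h` and `neighbours` callbacks.

```python
import heapq
from itertools import count

def a_star(start, is_goal, neighbours, h):
    """Generic A* skeleton.  `neighbours(s)` yields (next_state, step_cost)
    pairs; with an admissible h (never overestimating cost-to-go), the
    first goal popped from the queue is optimal."""
    tie = count()                          # tiebreaker so states never compare
    frontier = [(h(start), next(tie), 0, start, [start])]
    best_g = {}
    while frontier:
        _, _, g, state, path = heapq.heappop(frontier)
        if is_goal(state):
            return path, g
        if best_g.get(state, float("inf")) <= g:
            continue                       # already expanded at lower cost
        best_g[state] = g
        for nxt, cost in neighbours(state):
            heapq.heappush(frontier,
                           (g + cost + h(nxt), next(tie), g + cost,
                            nxt, path + [nxt]))
    return None, float("inf")
```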

  12. A Method for Estimating View Transformations from Image Correspondences Based on the Harmony Search Algorithm.

    PubMed

    Cuevas, Erik; Díaz, Margarita

    2015-01-01

    In this paper, a new method for robustly estimating multiple view relations from point correspondences is presented. The approach combines the popular random sampling consensus (RANSAC) algorithm and the evolutionary method harmony search (HS). With this combination, the proposed method adopts a different sampling strategy than RANSAC to generate putative solutions. Under the new mechanism, at each iteration, new candidate solutions are built taking into account the quality of the models generated by previous candidate solutions, rather than purely at random as is the case with RANSAC. The rules for the generation of candidate solutions (samples) are motivated by the improvisation process that occurs when a musician searches for a better state of harmony. As a result, the proposed approach can substantially reduce the number of iterations while still preserving the robust capabilities of RANSAC. The method is generic, and its use is illustrated by the estimation of homographies, considering synthetic and real images. Additionally, in order to demonstrate the performance of the proposed approach within a real engineering application, it is employed to solve the problem of position estimation in a humanoid robot. Experimental results validate the efficiency of the proposed method in terms of accuracy, speed, and robustness.

  13. Spectrum-based method to generate good decoy libraries for spectral library searching in peptide identifications.

    PubMed

    Cheng, Chia-Ying; Tsai, Chia-Feng; Chen, Yu-Ju; Sung, Ting-Yi; Hsu, Wen-Lian

    2013-05-03

    As spectral library searching receives increasing attention for peptide identification, constructing good decoy spectra from the target spectra is key to correctly estimating the false discovery rate when searching against the concatenated target-decoy spectral library. Several methods have been proposed to construct decoy spectral libraries. Most of them construct decoy peptide sequences and then generate theoretical spectra accordingly. In this paper, we propose a method, called precursor-swap, which constructs decoy spectral libraries directly at the "spectrum level", without generating decoy peptide sequences, by swapping the precursors of two spectra selected according to a very simple rule. Our spectrum-based method does not require additional effort to deal with ion types (e.g., a, b or c ions), fragmentation mechanisms (e.g., CID or ETD), or unannotated peaks, but preserves many spectral properties. The precursor-swap method is evaluated on different spectral libraries, and the resulting decoy ratios show that it is comparable to other methods. Notably, it is efficient in time and memory usage for constructing decoy libraries. A software tool called Precursor-Swap-Decoy-Generation (PSDG) is publicly available for download at http://ms.iis.sinica.edu.tw/PSDG/.

  14. Proportional reasoning as a heuristic-based process: time constraint and dual task considerations.

    PubMed

    Gillard, Ellen; Van Dooren, Wim; Schaeken, Walter; Verschaffel, Lieven

    2009-01-01

    The present study interprets the overuse of proportional solution methods from a dual process framework. Dual process theories claim that analytic operations involve time-consuming executive processing, whereas heuristic operations are fast and automatic. In two experiments to test whether proportional reasoning is heuristic-based, the participants solved "proportional" problems, for which proportional solution methods provide correct answers, and "nonproportional" problems known to elicit incorrect answers based on the assumption of proportionality. In Experiment 1, the available solution time was restricted. In Experiment 2, the executive resources were burdened with a secondary task. Both manipulations induced an increase in proportional answers and a decrease in correct answers to nonproportional problems. These results support the hypothesis that the choice for proportional methods is heuristic-based.

  15. Usability of a Patient Education and Motivation Tool Using Heuristic Evaluation

    PubMed Central

    Arora, Mohit; Dai, Liwei; Price, Kathleen; Vizer, Lisa; Sears, Andrew

    2009-01-01

    Background Computer-mediated educational applications can provide a self-paced, interactive environment to deliver educational content to individuals about their health condition. These programs have been used to deliver health-related information about a variety of topics, including breast cancer screening, asthma management, and injury prevention. We have designed the Patient Education and Motivation Tool (PEMT), an interactive computer-based educational program based on behavioral, cognitive, and humanistic learning theories. The tool is designed to educate users and has three key components: screening, learning, and evaluation. Objective The objective of this tutorial is to illustrate a heuristic evaluation using a computer-based patient education program (PEMT) as a case study. The aims were to improve the usability of PEMT through heuristic evaluation of the interface; to report the results of these usability evaluations; to make changes based on the findings of the usability experts; and to describe the benefits and limitations of applying usability evaluations to PEMT. Methods PEMT was evaluated by three usability experts using Nielsen’s usability heuristics while reviewing the interface to produce a list of heuristic violations with severity ratings. The violations were sorted by heuristic and ordered from most to least severe within each heuristic. Results A total of 127 violations were identified with a median severity of 3 (range 0 to 4 with 0 = no problem to 4 = catastrophic problem). Results showed 13 violations for visibility (median severity = 2), 38 violations for match between system and real world (median severity = 2), 6 violations for user control and freedom (median severity = 3), 34 violations for consistency and standards (median severity = 2), 11 violations for error severity (median severity = 3), 1 violation for recognition and control (median severity = 3), 7 violations for flexibility and efficiency (median severity = 2), 9 violations

  16. Searching for rigour in the reporting of mixed methods population health research: a methodological review.

    PubMed

    Brown, K M; Elliott, S J; Leatherdale, S T; Robertson-Wilson, J

    2015-12-01

    The environments in which population health interventions occur shape both their implementation and outcomes. Hence, when evaluating these interventions, we must explore both intervention content and context. Mixed methods (integrating quantitative and qualitative methods) provide this opportunity. However, although criteria exist for establishing rigour in quantitative and qualitative research, there is poor consensus regarding rigour in mixed methods. Using the empirical example of school-based obesity interventions, this methodological review examined how mixed methods have been used and reported, and how rigour has been addressed. Twenty-three peer-reviewed mixed methods studies were identified through a systematic search of five databases and appraised using the guidelines for Good Reporting of a Mixed Methods Study. In general, more detailed description of data collection and analysis, integration, inferences and justifying the use of mixed methods is needed. Additionally, improved reporting of methodological rigour is required. This review calls for increased discussion of practical techniques for establishing rigour in mixed methods research, beyond those for quantitative and qualitative criteria individually. A guide for reporting mixed methods research in population health should be developed to improve the reporting quality of mixed methods studies. Through improved reporting, mixed methods can provide strong evidence to inform policy and practice.

  1. Determination of Slope Safety Factor with Analytical Solution and Searching Critical Slip Surface with Genetic-Traversal Random Method

    PubMed Central

    2014-01-01

    In current practice, to determine the safety factor of a slope with a two-dimensional circular potential failure surface, one of the searching methods for the critical slip surface is the Genetic Algorithm (GA), while the method to calculate the slope safety factor is Fellenius' slices method. However, GA needs to be validated with more numerical tests, and Fellenius' slices method is only an approximate method, much like the finite element method. This paper proposes a new way to determine the minimum slope safety factor: the safety factor is determined with an analytical solution while the critical slip surface is searched with the Genetic-Traversal Random Method. The analytical solution is more accurate than Fellenius' slices method, and the Genetic-Traversal Random Method uses random picking to implement mutation. A computer program that searches automatically was developed for the Genetic-Traversal Random Method. Comparison with other methods, such as the Slope/W software, indicates that the Genetic-Traversal Random Search Method can give a very low safety factor, about half that of the other methods; however, the obtained minimum safety factor is very close to the lower-bound solutions of the slope safety factor given by the Ansys software. PMID:24782679

  2. [Searching for dwarf nova candidates with automatic methods in massive spectra].

    PubMed

    Wang, Wen-Yu; Wang, Xin-Jun; Pan, Jing-Chang

    2013-12-01

    In the present paper, an automatic and efficient method for searching for dwarf nova candidates is presented. Principal component analysis (PCA) and support vector machines (SVM) are applied to the newly released SDSS-DR9 spectra. The final dimension of the feature space is determined by the identification accuracy that SVM achieves on training samples of different dimensions. The massive set of spectra is first dimension-reduced by PCA and then classified by the best SVM classifier, and the resulting small number of candidates can be identified manually. A total of 276 dwarf nova candidates were selected by the method, and 6 of them are new discoveries, which proves that our approach to finding special celestial bodies in massive spectral data is feasible. The new discoveries are added to the current dwarf nova template library, which can contribute to constructing a more accurate feature space. The method proposed in this paper can also be used to search for special objects in other sky surveys, such as the Guoshoujing (Large Sky Area Multi-Object Fiber Spectroscopic Telescope, LAMOST) telescope.
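
    The screening pipeline maps naturally onto scikit-learn; the component counts, the kernel choice, and the placeholder data names X and y are assumptions of this sketch, not details from the paper or the SDSS-DR9 data themselves.

```python
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

def build_classifier(n_components):
    """PCA dimension reduction followed by an RBF-kernel SVM."""
    return make_pipeline(PCA(n_components=n_components), SVC(kernel="rbf"))

def pick_dimension(X, y, candidates=(5, 10, 20, 50)):
    """Choose the feature-space dimension by cross-validated accuracy,
    mirroring the dimension-selection step described above."""
    scores = {d: cross_val_score(build_classifier(d), X, y).mean()
              for d in candidates}
    return max(scores, key=scores.get)
```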

  3. Heuristical Feature Extraction from LIDAR Data and Their Visualization

    NASA Astrophysics Data System (ADS)

    Ghosh, S.; Lohani, B.

    2011-09-01

    Extraction of landscape features from LiDAR data has been studied widely in the past few years. These feature extraction methodologies have been focussed on certain types of features only, namely the bare earth model, buildings principally containing planar roofs, trees and roads. In this paper, we present a methodology to process LiDAR data through DBSCAN, a density based clustering method, which extracts natural and man-made clusters. We then develop heuristics to process these clusters and simplify them to be sent to a visualization engine.
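
    The clustering stage can be sketched in a few lines with scikit-learn's DBSCAN; the file name and the eps/min_samples values are placeholders to be tuned to the scan's point density, and the subsequent heuristics that classify and simplify the clusters are not shown.

```python
import numpy as np
from sklearn.cluster import DBSCAN

points = np.loadtxt("lidar_points.xyz")           # hypothetical N x 3 file
labels = DBSCAN(eps=1.5, min_samples=10).fit_predict(points)
clusters = [points[labels == k] for k in set(labels) if k != -1]  # -1 = noise
```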

  4. Heuristic algorithm for off-lattice protein folding problem*

    PubMed Central

    Chen, Mao; Huang, Wen-qi

    2006-01-01

    Enlightened by the law of interactions among objects in the physical world, we propose a heuristic algorithm for solving the three-dimensional (3D) off-lattice protein folding problem. Based on a physical model, the problem is converted from a nonlinear constraint-satisfaction problem to an unconstrained optimization problem which can be solved by the well-known gradient method. To improve the efficiency of our algorithm, a strategy was introduced to generate the initial configuration. Computational results showed that this algorithm could find states with lower energy than the previously proposed ground states obtained by the nPERM algorithm for all chains with lengths ranging from 13 to 55. PMID:16365919

  5. Cell Searching and DoA Estimation Methods for Mobile Relay Stations with a Uniform Linear Array

    NASA Astrophysics Data System (ADS)

    Ko, Yo-Han; Park, Chang-Hwan; Kwon, Soon-Jik; Cho, Yong-Soo

    In this paper, cell searching and direction-of-arrival (DoA) estimation methods are proposed for mobile relay stations with a uniform linear array in OFDM-based cellular systems. The proposed methods can improve the performance of cell searching and DoA estimation even when there are symbol timing offsets among the signals received from adjacent base stations and Doppler frequency shifts caused by the movement of the mobile relay station. The performance and computational complexity of the proposed cell searching and DoA estimation methods are evaluated by computer simulation in a mobile WiMAX environment.

  6. Fast orthogonal search method to estimate upper arm Hill-based muscle model parameters.

    PubMed

    Mountjoy, Katherine C; Hashtrudi-Zaad, Keyvan; Morin, Evelyn L

    2008-01-01

    We propose a methodology to estimate subject-specific physiological parameters of Hill-based models of upper arm muscles. The methodology uses Hill-type candidate functions in the Fast Orthogonal Search (FOS) method to predict force at the wrist during elbow flexion and extension. To this end, surface EMG data from three muscles of the upper arm were recorded from 5 subjects as they performed isometric contractions at different elbow joint angles. Estimated muscle activation level and joint angle were used as inputs to the FOS model to obtain subject-specific estimates of the optimal joint angle and the Gaussian shape parameter of the force-length relationship for each muscle.

  7. Local Search Method for a Parallel Machine Scheduling Problemof Minimizing the Number of Machines Operated

    NASA Astrophysics Data System (ADS)

    Yamana, Takashi; Iima, Hitoshi; Sannomiya, Nobuo

    Although there have been many studies on parallel machine scheduling problems, the number of machines operated is fixed in these studies. From the viewpoint of machine operation cost, it is desirable to generate a schedule with fewer machines operated. In this paper, we address the problem of minimizing the number of parallel machines subject to the constraint that the total tardiness is not greater than a value given in advance. For this problem, we introduce a local search method in which the number of machines operated is changed efficiently and appropriately in a short time while the total tardiness is also reduced.

  8. Search Complexities for HTN Planning

    DTIC Science & Technology

    2013-01-01

  9. Social Outcomes in Childhood Brain Disorder: A Heuristic Integration of Social Neuroscience and Developmental Psychology

    PubMed Central

    Yeates, Keith Owen; Bigler, Erin D.; Dennis, Maureen; Gerhardt, Cynthia A.; Rubin, Kenneth H.; Stancin, Terry; Taylor, H. Gerry; Vannatta, Kathryn

    2010-01-01

    The authors propose a heuristic model of the social outcomes of childhood brain disorder that draws on models and methods from both the emerging field of social cognitive neuroscience and the study of social competence in developmental psychology/psychopathology. The heuristic model characterizes the relationships between social adjustment, peer interactions and relationships, social problem solving and communication, social-affective and cognitive-executive processes, and their neural substrates. The model is illustrated by research on a specific form of childhood brain disorder, traumatic brain injury. The heuristic model may promote research regarding the neural and cognitive-affective substrates of children’s social development. It also may engender more precise methods of measuring impairments and disabilities in children with brain disorder and suggest ways to promote their social adaptation. PMID:17469991

  10. Assessing Use of Cognitive Heuristic Representativeness in Clinical Reasoning

    PubMed Central

    Payne, Velma L.; Crowley, Rebecca S.

    2008-01-01

    We performed a pilot study to investigate use of the cognitive heuristic Representativeness in clinical reasoning. We tested a set of tasks and assessments to determine whether subjects used the heuristics in reasoning, to obtain initial frequencies of heuristic use and related cognitive errors, and to collect cognitive process data using think-aloud techniques. The study investigates two aspects of the Representativeness heuristic - judging by perceived frequency and representativeness as causal beliefs. Results show that subjects apply both aspects of the heuristic during reasoning, and make errors related to misapplication of these heuristics. Subjects in this study rarely used base rates, showed significant variability in their recall of base rates, demonstrated limited ability to use provided base rates, and favored causal data in diagnosis. We conclude that the tasks and assessments we have developed provide a suitable test-bed to study the cognitive processes underlying heuristic errors. PMID:18999140

  11. On the heuristic nature of medical decision-support systems.

    PubMed

    Aliferis, C F; Miller, R A

    1995-03-01

    In the realm of medical decision-support systems, the term "heuristic systems" is often considered to be synonymous with "medical artificial intelligence systems" or with "systems employing informal model(s) of problem solving". Such a view may be inaccurate and possibly impede the conceptual development of future systems. This article examines the nature of heuristics and the levels at which heuristic solutions are introduced during system design and implementation. The authors discuss why heuristics are ubiquitous in all medical decision-support systems operating at non-trivial domains, and propose a unifying definition of heuristics that encompasses formal and ad hoc systems. System developers should be aware of the heuristic nature of all problem solving done in complex real world domains, and characterize their own use of heuristics in describing system development and implementation.

  12. Micro-seismic waveform matching inversion based on gravitational search algorithm and parallel computation

    NASA Astrophysics Data System (ADS)

    Jiang, Y.; Xing, H. L.

    2016-12-01

    Micro-seismic events induced by water injection, mining activity, or oil/gas extraction are quite informative; their interpretation can be applied to the reconstruction of underground stress and the monitoring of hydraulic fracturing progress in oil/gas reservoirs. The source characteristics and locations are crucial parameters required for these purposes, and they can be obtained through the waveform matching inversion (WMI) method. It is therefore imperative to develop a WMI algorithm with high accuracy and convergence speed. Heuristic algorithms, as a category of nonlinear methods, possess very high convergence speed and a good capacity to escape local minima, and have been applied successfully in many areas (e.g., image processing, artificial intelligence). However, their effectiveness for micro-seismic WMI is still poorly investigated; very little literature exists that addresses this subject. In this research an advanced heuristic algorithm, the gravitational search algorithm (GSA), is proposed to estimate the focal mechanism (strike, dip, and rake angles) and the source locations in three dimensions. Unlike traditional inversion methods, the heuristic inversion does not require approximating the Green's function. The method interacts directly with a CPU-parallelized finite-difference forward modelling engine and updates the model parameters under GSA criteria. The effectiveness of this method is tested with synthetic data from a multi-layered elastic model; the results indicate that GSA can be applied well to WMI and has unique advantages. Keywords: Micro-seismicity, waveform matching inversion, gravitational search algorithm, parallel computation
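
    A compact sketch of GSA-style updates (masses derived from fitness, random-weighted gravitational pulls, a decaying gravitational constant), in the spirit of the standard Rashedi et al. formulation, is given below; the coupling to a parallel finite-difference forward model is not reproduced, and all constants are illustrative.

```python
import numpy as np

def gsa(objective, bounds, n_agents=30, iters=200, g0=100.0, alpha=20.0, seed=0):
    """Gravitational search algorithm sketch: minimizes `objective` inside
    the box `bounds` (list of (lo, hi) pairs)."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T
    x = rng.uniform(lo, hi, size=(n_agents, lo.size))
    v = np.zeros_like(x)
    for t in range(iters):
        fit = np.array([objective(xi) for xi in x])
        best, worst = fit.min(), fit.max()
        m = (fit - worst) / (best - worst + 1e-12)     # heavier mass = fitter
        m = m / (m.sum() + 1e-12)
        g = g0 * np.exp(-alpha * t / iters)            # decaying grav. constant
        acc = np.zeros_like(x)
        for i in range(n_agents):
            for j in range(n_agents):
                if i != j:
                    r = np.linalg.norm(x[j] - x[i]) + 1e-12
                    # random-weighted pull of agent i towards agent j
                    acc[i] += rng.random() * g * m[j] * (x[j] - x[i]) / r
        v = rng.random(x.shape) * v + acc              # stochastic inertia
        x = np.clip(x + v, lo, hi)
    fit = np.array([objective(xi) for xi in x])
    return x[np.argmin(fit)]
```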

  13. Effect of advanced location methods on search and rescue duration for general aviation aircraft accidents in the contiguous United States

    NASA Astrophysics Data System (ADS)

    Wallace, Ryan J.

    The purpose of this study was to determine the impact of advanced search and rescue devices and techniques on search duration for general aviation aircraft crashes. The study assessed three categories of emergency locator transmitters (ELTs): 121.5 MHz, 406 MHz, and GPS-assisted 406 MHz devices. The impact of the COSPAS-SARSAT organization ceasing satellite monitoring of 121.5 MHz ELTs in 2009 was factored into the study. Additionally, the effects of radar forensic analysis and cellular phone forensic search methods were assessed. The study's data were derived from an Air Force Rescue Coordination Center database and included 365 historical general aviation search and rescue missions conducted between 2006 and 2011. Highly skewed data were transformed to meet normality requirements for parametric testing. The significance of each ELT model was assessed using a combination of Brown-Forsythe means testing and orthogonal contrast testing. ANOVA and Brown-Forsythe means testing were used to evaluate the cellular phone and radar forensic search methods. A Spearman's rho test was used to determine whether the use of multiple search methods produced an additive effect in search efficiency. Aircraft which utilized an emergency locator transmitter had shorter search durations than those which did not use such devices. Aircraft utilizing GPS-aided 406 MHz ELTs appeared to require less time to locate than aircraft equipped with other ELT models; however, this assessment requires further study due to limited data. Aircraft equipped with 406 MHz ELTs required slightly less time to locate than aircraft equipped with older 121.5 MHz ELTs. The study found no substantial difference in the search durations for 121.5 MHz ELTs monitored by COSPAS-SARSAT versus those which were not. Significance testing revealed that the use of cellular phone forensic data and radar forensic data both resulted in substantially higher mission search durations. Some possible explanations for this

  14. Structuralism and Its Heuristic Implications.

    ERIC Educational Resources Information Center

    Greene, Ruth M.

    1984-01-01

    The author defines structuralism (a method for modeling and analyzing event systems in a space-time framework), traces its origins to the work of J. Piaget and M. Foucault, and discusses its implications for learning. (CL)

  15. KnotSeeker: heuristic pseudoknot detection in long RNA sequences.

    PubMed

    Sperschneider, Jana; Datta, Amitava

    2008-04-01

    Pseudoknots are folded structures in RNA molecules that perform essential functions as part of the cellular transcription machinery and regulatory processes. The prediction of these structures in RNA molecules has important implications for antiviral drug design. It has been shown that the prediction of pseudoknots is an NP-complete problem. Practical structure prediction algorithms based on free energy minimization employ a restricted problem class and dynamic programming. However, these algorithms are computationally very expensive, and their accuracy deteriorates if the input sequence containing the pseudoknot is too long. Heuristic methods can be more efficient, but do not guarantee an optimal solution with regard to the minimum free energy model. We present KnotSeeker, a new heuristic algorithm for the detection of pseudoknots in RNA sequences as a preliminary step for structure prediction. Our method uses a hybrid sequence matching and free energy minimization approach to perform a screening of the primary sequence. We select short sequence fragments as possible candidates that may contain pseudoknots and verify them by using an existing dynamic programming algorithm and a minimum weight independent set calculation. KnotSeeker is significantly more accurate in detecting pseudoknots compared to other common methods, as reported in the literature. It is very efficient and therefore a practical tool, especially for long sequences. The algorithm has been implemented in Python and it also uses C/C++ code from several other known techniques. The code is available from http://www.csse.uwa.edu.au/~datta/pseudoknot.

  17. An R-peak detection method that uses an SVD filter and a search back system.

    PubMed

    Jung, Woo-Hyuk; Lee, Sang-Goog

    2012-12-01

    In this paper, we present a method for detecting the R-peak of an ECG signal by using a singular value decomposition (SVD) filter and a search back system. The ECG signal was processed in two phases: the pre-processing phase and the decision phase. The pre-processing phase consisted of the stages for the SVD filter, Butterworth high-pass filter (HPF), moving average (MA), and squaring, whereas the decision phase consisted of a single stage that detected the R-peak. In the pre-processing phase, the SVD filter removed noise while the Butterworth HPF eliminated baseline wander. The MA removed the remaining noise of the signal that had gone through the SVD filter to make the signal smooth, and squaring played a role in strengthening the signal. In the decision phase, the threshold was used to set the interval before detecting the R-peak. In the search back system suggested by Hamilton et al., a missed R-peak is searched for when the latest R-R interval (RRI) is greater than 150% of the previous RRI; here, this criterion was modified to 150% of the smaller of the two most recent RRIs. When the modified search back system was used, the error rate of the peak detection decreased to 0.29%, compared to 1.34% when the modified search back system was not used. Consequently, the sensitivity was 99.47%, the positive predictivity was 99.47%, and the detection error was 1.05%. Furthermore, the quality of the signal in data with a substantial amount of noise was improved, and thus the R-peak was detected effectively.
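
    A minimal sketch may make the modified search back rule concrete. The Python fragment below assumes the pre-processing chain (SVD filter, Butterworth HPF, moving average, squaring) has already produced a conditioned signal and a list of accepted peak indices; the 150% criterion follows the description above, while the function name, the lowered amplitude threshold, and all numeric values are illustrative assumptions.

        # Hypothetical sketch of the modified search back criterion; the
        # lowered threshold fraction (low_frac) is an illustrative choice.
        def search_back(signal, peaks, low_frac=0.5):
            """If the gap since the last accepted R-peak exceeds 150% of the
            smaller of the two most recent R-R intervals, re-scan the gap
            for its largest sample and accept it under a lowered threshold."""
            if len(peaks) < 3:
                return peaks
            rri_latest = peaks[-1] - peaks[-2]
            rri_previous = peaks[-2] - peaks[-3]
            limit = 1.5 * min(rri_latest, rri_previous)  # modified criterion
            if len(signal) - peaks[-1] > limit:
                gap = signal[peaks[-1] + 1:]
                if gap:
                    cand = (peaks[-1] + 1
                            + max(range(len(gap)), key=gap.__getitem__))
                    if signal[cand] >= low_frac * signal[peaks[-1]]:
                        peaks.append(cand)
            return peaks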

  18. Methodologie de communication, methode de communication globale et theories heuristiques dans la perspective de l'acquisition du langage (Communication Methodology, the Global Communication Method, and Heuristic Theories in the Perspective of Language Learning).

    ERIC Educational Resources Information Center

    Vaucher, Marius

    1980-01-01

    This study defines the notion of communication methodology, situates the context in which it operates, and concentrates on the problem of the acquisition of knowledge in general, and of language acquisition, in particular. From the notion of methodology, the study moves to the method of global communication, that is, a method comprising four…

  20. An Approach to Protein Name Extraction Using Heuristics and a Dictionary.

    ERIC Educational Resources Information Center

    Seki, Kazuhiro; Mostafa, Javed

    2003-01-01

    Proposes a method for protein name extraction from biological texts. The method exploits hand-crafted rules based on heuristics and a set of protein names (dictionary). The approach avoids use of natural language processing tools so as to improve processing speed. Evaluation experiments were conducted in terms of: accuracy, generalizability, and…

  1. An Update on Teaching the Employment Search.

    ERIC Educational Resources Information Center

    Andrews, Deborah, Ed.; Dyrud, Marilyn A., Ed.

    1997-01-01

    Presents five articles dealing with teaching job search strategies: (1) "Preparing a Scannable Resume" (Carol Roever); (2) "Preparing an Online Resume" (Tim Krause); (3) "Using the World Wide Web to Teach Employment Communication" (K. Virginia Hemby); (4) "A Visual Heuristic for Promoting a Rhetorically Based Job Search" (Helen Foster); and (5)…

  3. The optimal code searching method with an improved criterion of coded exposure for remote sensing image restoration

    NASA Astrophysics Data System (ADS)

    He, Lirong; Cui, Guangmang; Feng, Huajun; Xu, Zhihai; Li, Qi; Chen, Yueting

    2015-03-01

    Coded exposure photography makes motion deblurring a well-posed problem. The integration pattern of light is modulated by opening and closing the shutter within the exposure time, changing the traditional shutter frequency spectrum into a wider frequency band in order to preserve more image information in the frequency domain. The method used to search for the optimal code is significant for coded exposure. In this paper, an improved criterion for the optimal code search is proposed by analyzing the relationship between the code length and the number of ones in the code, considering the noise effect on code selection with an affine noise model. The optimal code is then obtained using a genetic search algorithm based on the proposed selection criterion. Experimental results show that the time required to search for the optimal code decreases with the presented method. The restored image shows better subjective quality and superior objective evaluation values.
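
    As a rough illustration of genetic search over binary shutter codes, the sketch below evolves codes under a simple invertibility proxy, the minimum DFT magnitude of the code, which is a common criterion in the coded-exposure literature. It is a stand-in under stated assumptions, not the paper's improved criterion, which additionally weighs code length, the number of ones, and an affine noise model.

        # Hypothetical GA sketch; the fitness is a generic invertibility
        # proxy, not the improved selection criterion proposed in the paper.
        import cmath
        import random

        def fitness(code):
            """Minimum DFT magnitude of the 0/1 shutter code (larger is
            better: it avoids near-zeros in the blur kernel's spectrum)."""
            n = len(code)
            return min(abs(sum(c * cmath.exp(-2j * cmath.pi * k * t / n)
                               for t, c in enumerate(code)))
                       for k in range(n))

        def search_code(n=32, pop_size=40, gens=200, p_mut=0.05):
            pop = [[random.randint(0, 1) for _ in range(n)]
                   for _ in range(pop_size)]
            for _ in range(gens):
                pop.sort(key=fitness, reverse=True)
                parents = pop[:pop_size // 2]           # elitist selection
                children = []
                while len(children) < pop_size - len(parents):
                    a, b = random.sample(parents, 2)
                    cut = random.randrange(1, n)        # one-point crossover
                    child = a[:cut] + b[cut:]
                    children.append([1 - g if random.random() < p_mut else g
                                     for g in child])   # bit-flip mutation
                pop = parents + children
            return max(pop, key=fitness)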

  4. Heuristics for the Hodgkin-Huxley system.

    PubMed

    Hoppensteadt, Frank

    2013-09-01

    Hodgkin and Huxley (HH) discovered that voltages control ionic currents in nerve membranes. This led them to describe electrical activity in a neuronal membrane patch in terms of an electronic circuit whose characteristics were determined using empirical data. Due to the complexity of this model, a variety of heuristics, including relaxation oscillator circuits and integrate-and-fire models, have been used to investigate activity in neurons, and these simpler models have been successful in suggesting experiments and explaining observations. Connections between most of the simpler models had not been made clear until recently. Shown here are connections between these heuristics and the full HH model. In particular, we study a new model (Type III circuit): It includes the van der Pol-based models; it can be approximated by a simple integrate-and-fire model; and it creates voltages and currents that correspond, respectively, to the h and V components of the HH system.
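
    For readers unfamiliar with the simpler heuristics mentioned, a leaky integrate-and-fire neuron can be written in a few lines. The sketch below is a generic textbook model with illustrative parameters; it is not the specific Type III circuit studied in the paper.

        # Generic leaky integrate-and-fire sketch (illustrative parameters).
        def lif_spike_times(i_ext=1.5, v_rest=0.0, v_thresh=1.0,
                            v_reset=0.0, tau=10.0, dt=0.1, t_max=100.0):
            """Euler integration of tau * dV/dt = -(V - v_rest) + i_ext,
            with an instantaneous reset whenever V crosses v_thresh."""
            v, t, spikes = v_rest, 0.0, []
            while t < t_max:
                v += (dt / tau) * (-(v - v_rest) + i_ext)
                if v >= v_thresh:
                    spikes.append(t)
                    v = v_reset
                t += dt
            return spikes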

  5. Fairness heuristic theory: valid but not empirical.

    PubMed

    Arnadóttir, Steinvör Pöll

    2002-09-01

    Fairness heuristic theory is concerned with how people react to outcomes of their dealings with authorities, and makes some predictions concerning the relationship between perceived fairness of procedures, perceived fairness of outcomes and acceptance of outcomes. Although considerable effort has been put into establishing empirical evidence for the theory, it is argued that such efforts have no bearing upon the truth of the theory. Central propositions of fairness heuristic theory that have recently been tested empirically are examined and found to be nonempirical and noncontingent. The propositions, it is argued, are necessary truths of commonsense psychology that are not falsifiable by empirical outcomes. Hence, empirical research designed to test them, it is argued, is fruitless and misguided.

  6. Addressing Authorship Issues Prospectively: A Heuristic Approach.

    PubMed

    Roberts, Laura Weiss

    2017-02-01

    Collaborative writing in academic medicine gives rise to more richly informed scholarship, and yet challenging ethical issues surrounding authorship are commonly encountered. International guidelines on authorship help clarify whether individuals who have contributed to a completed scholarly work have been correctly included as authors, but these guidelines do not facilitate intentional and proactive authorship planning or decisions regarding authorship order. In this Commentary, the author presents a heuristic approach to help collaborators clarify, anticipate, and resolve practical and ethically important authorship issues as they engage in the process of developing manuscripts. As this approach illustrates, assignment of authorship should balance work effort and professional responsibility, reflecting the effort, the intellectual contribution, and the public accountability of the individuals who participate in the work. Using a heuristic approach for managing authorship issues prospectively can foster an ethical, collaborative writing process in which individuals are properly recognized for their contributions.

  7. Reporting quality of search methods in systematic reviews of HIV behavioral interventions (2000-2010): are the searches clearly explained, systematic and reproducible?

    PubMed

    Mullins, Mary M; DeLuca, Julia B; Crepaz, Nicole; Lyles, Cynthia M

    2014-06-01

    Systematic reviews are an essential tool for researchers, prevention providers and policy makers who want to remain current with the evidence in the field. Systematic reviews must adhere to strict standards, as the results can provide a more objective appraisal of evidence for making scientific decisions than traditional narrative reviews. An integral component of a systematic review is the development and execution of a comprehensive systematic search to collect available and relevant information. A number of reporting guidelines have been developed to ensure quality publications of systematic reviews. These guidelines provide the essential elements to include in the review process and report in the final publication for complete transparency. We identified the common elements of reporting guidelines and examined the reporting quality of search methods in the HIV behavioral intervention literature. Consistent with the findings from previous evaluations of the reporting of search methods in systematic reviews in other fields, our review shows a lack of full and transparent reporting within systematic reviews even though a plethora of guidelines exists. This review underscores the need to promote complete and transparent reporting of systematic searches within systematic reviews.

  8. Radial-searching contour extraction method based on a modified active contour model for mammographic masses.

    PubMed

    Nakagawa, Toshiaki; Hara, Takeshi; Fujita, Hiroshi; Horita, Katsuhei; Iwase, Takuji; Endo, Tokiko

    2008-07-01

    In this study, we developed an automatic extraction scheme for the precise recognition of the contours of masses on digital mammograms in order to improve a computer-aided diagnosis (CAD) system. We propose a radial-searching contour extraction method based on a modified active contour model (ACM). In this technique, after determining the central point of a mass by searching along the direction of the density gradient, we arranged an initial contour at the central point, and the movement of a control point was limited to directions radiating from the central point. Moreover, it became possible to increase the extraction accuracy by selecting the pixels used for processing and by using two images (an edge-intensity image and a degree-of-separation image defined from the pixel-value histogram) for the calculation of the image forces used as constraints on deformation of the ACM. We investigated the accuracy of the automated extraction method by using 53 masses with "difficult contours" on 53 digitized mammograms. The extraction results were compared quantitatively with the "correct segmentation" represented by an experienced physician's sketches. The numbers of cases in which the extracted region corresponded to the correct region with overlap ratios of more than 81% and 61% were 30 and 45, respectively. The initial results obtained with this technique show that it will be useful for the segmentation of masses in CAD schemes.

  9. [Searching for WDMS Candidates In SDSS-DR10 With Automatic Method].

    PubMed

    Jiang, Bin; Wang, Cheng-you; Wang, Wen-yu; Wang, Wei

    2015-05-01

    The Sloan Digital Sky Survey (SDSS) has released its latest data (DR10), which covers the first APOGEE spectra. These massive spectra can be used for large-sample research, including the structure and evolution of the Galaxy and multi-waveband identification. In addition, the spectra are ideal for searching for rare and special objects such as white dwarf main-sequence (WDMS) binaries. A WDMS binary consists of a white dwarf primary and a low-mass main-sequence (MS) companion, and is of positive significance for the study of the evolution and parameters of close binaries. WDMS binaries are generally discovered by repeatedly imaging the same area of sky and measuring light curves for objects, or through photometric selection with follow-up observations. These methods require significant manual processing time, have low accuracy, and cannot satisfy real-time processing requirements. In this paper, an automatic and efficient method for searching for WDMS candidates is presented. A Genetic Algorithm (GA) is applied to the newly released SDSS-DR10 spectra. A total of 4,140 WDMS candidates were selected by the method, and 24 of them are new discoveries, which proves that our approach of finding special celestial bodies in massive spectral data is feasible. In addition, this method is also applicable to mining other special celestial objects in sky survey telescope data. We report the identification of 24 new WDMS binaries with spectra. A compendium of the positions, mjd, plate and fiberid of these new discoveries is presented, which enriches the spectral library and will be useful for research on binary evolution models.

  10. A hybrid solar panel maximum power point search method that uses light and temperature sensors

    NASA Astrophysics Data System (ADS)

    Ostrowski, Mariusz

    2016-04-01

    Solar cells have low efficiency and non-linear characteristics. To increase the output power, solar cells are connected in more complex structures. Solar panels consist of series-connected solar cells with a few bypass diodes to avoid the negative effects of partial shading conditions. Solar panels are connected to a special device called the maximum power point tracker. This device adapts the output power from the solar panels to the load requirements and also has a built-in algorithm to track the maximum power point of the solar panels. Bypass diodes may cause local maxima to appear on the power-voltage curve when the panel surface is illuminated irregularly. In this case traditional maximum power point tracking algorithms can find only a local maximum power point. In this article a hybrid maximum power point search algorithm is presented. The main idea of the proposed method is a combination of two algorithms: a method that uses temperature sensors to track the maximum power point under partial shading conditions, and a method that uses an illumination sensor to track the maximum power point under uniform illumination. In comparison to other methods, the proposed algorithm uses correlation functions to determine the relationship between the values of the illumination and temperature sensors and the corresponding values of current and voltage at the maximum power point. Under partial shading, the algorithm calculates local maximum power points based on the temperature values and the correlation function, measures the power at each calculated point, chooses the point with the largest value, and from it runs the perturb and observe search algorithm. Under uniform illumination, the algorithm calculates the maximum power point based on the illumination value and the correlation function, and from it runs the perturb and observe algorithm. In addition, the proposed method uses a special coefficient modification of the correlation functions algorithm. This sub…
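
    The branch structure of the proposed method can be outlined briefly. In the hypothetical Python fragment below, the two correlation functions mapping sensor readings to a maximum power point voltage are invented placeholders for the calibrated ones in the article; the control flow (temperature-based candidates under partial shading, an illumination-based estimate under uniform illumination, perturb and observe from the chosen starting point) follows the description above.

        # Hypothetical sketch; both correlation functions are invented
        # placeholders for the calibrated correlations in the article.
        def vmp_from_temperature(temp_c):
            return 0.6 - 0.002 * (temp_c - 25.0)           # illustrative

        def vmp_from_illumination(lux):
            return 0.55 + 0.05 * min(lux / 1000.0, 1.0)    # illustrative

        def mpp_candidates(light_sensors, temp_sensors, shading_tol=0.2):
            """Return candidate MPP voltages: one per panel section from the
            temperature correlation under partial shading, or a single one
            from the illumination correlation under uniform illumination.
            The tracker would measure power at each candidate and run
            perturb and observe from the best-performing one."""
            spread = max(light_sensors) - min(light_sensors)
            if spread > shading_tol * max(light_sensors):  # partial shading
                return [vmp_from_temperature(t) for t in temp_sensors]
            mean_lux = sum(light_sensors) / len(light_sensors)
            return [vmp_from_illumination(mean_lux)]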

  11. Exploring genomic dark matter: A critical assessment of the performance of homology search methods on noncoding RNA

    PubMed Central

    Freyhult, Eva K.; Bollback, Jonathan P.; Gardner, Paul P.

    2007-01-01

    Homology search is one of the most ubiquitous bioinformatic tasks, yet it is unknown how effective the currently available tools are for identifying noncoding RNAs (ncRNAs). In this work, we use reliable ncRNA data sets to assess the effectiveness of methods such as BLAST, FASTA, HMMer, and Infernal. Surprisingly, the most popular homology search methods are often the least accurate. As a result, many studies have used inappropriate tools for their analyses. On the basis of our results, we suggest homology search strategies using the currently available tools and some directions for future development. PMID:17151342

  12. Structuring Decisions. The Role of Structuring Heuristics

    DTIC Science & Technology

    1981-08-01

    ...system. This is always the case, whether decision analysis is viewed as an engineering science or as a clinical art (Buede, 1979). Models... production system representation of heuristics designed to optimize such efficiency. First, though, we will describe revisions and extensions of the... axiomatically within decision theory, designed to buffer these inputs. The general specification of each of these three buffer systems is...

  13. Heuristics and Biases in Military Decision Making

    DTIC Science & Technology

    2010-10-01

    ...to embrace improvisation and reflection. The theory of reflection-in-action requires practitioners to question the structure of assumptions within... how we make decisions shape these heuristics and their accompanying biases. The theory of reflection-in-action and its implications for decision... theory), which sought to describe human behavior as a rational maximization of cost-benefit decisions; Kahneman and Tversky provided a simple...

  14. A Heuristic for Deriving Loop Functions.

    DTIC Science & Technology

    1981-10-01

    ...A(m, n)... = A(m, A(..., n))... the program computes Ackermann's function using a... be a loop function which was "inconsistent" across all values of the loop inputs... could only be inferred from the constraint functions with...

  15. Heuristic Automation for Decluttering Tactical Displays

    DTIC Science & Technology

    2007-01-01

    ...2001b). Heuristic Automation for Decluttering Tactical Displays. Mark St. John, Harvey S. Smallman, and Daniel I. Manes, Pacific Science... an ill-defined and complex function of many aircraft attributes and requires years of experience to train (Kaempf, Wolf, & Miller, 1993; Liebhaber... best judgment. According to this design strategy (e.g., Parasuraman & Riley, 1997, pp. 244, 249; St. John & Manes, 2002; St. John, Oonk, & Osga, 2000...

  16. LITERATURE SEARCH FOR METHODS FOR HAZARD ANALYSES OF AIR CARRIER OPERATIONS.

    SciTech Connect

    MARTINEZ - GURIDI,G.; SAMANTA,P.

    2002-07-01

    Representatives of the Federal Aviation Administration (FAA) and several air carriers under Title 14 of the Code of Federal Regulations (CFR) Part 121 developed a system-engineering model of the functions of air-carrier operations. Their analyses form the foundation, or basic architecture, upon which other task areas are based: hazard analyses, performance measures, and risk indicator design. To carry out these other tasks, models may need to be developed using the basic architecture of the Air Carrier Operations System Model (ACOSM). Since ACOSM encompasses various areas of air-carrier operations and can be used to address different task areas with differing but interrelated objectives, the modeling needs are broad. A literature search was conducted to identify and analyze the existing models that may be applicable for pursuing the task areas in ACOSM. The intent of the literature search was not necessarily to identify a specific model that can be directly used, but rather to identify relevant ones that have similarities with the processes and activities defined within ACOSM. Such models may provide useful inputs and insights for structuring ACOSM models. ACOSM simulates processes and activities in air-carrier operations but, in a general framework, it has similarities with other industries where attention also has been paid to hazard analyses, emphasizing risk management and the design of risk indicators. To ensure that efforts in other industries are adequately considered, the literature search includes publications from other industries, e.g., the chemical, nuclear, and process industries. This report discusses the literature search, identifies the relevant methods, and provides a preliminary assessment of their use in developing the models needed for the ACOSM task areas. A detailed assessment of the models has not been made. Defining those applicable for ACOSM will require further analysis of both the models and the tools identified. The report is organized in four chapters.

  17. The affect heuristic in occupational safety.

    PubMed

    Savadori, Lucia; Caovilla, Jessica; Zaniboni, Sara; Fraccaroli, Franco

    2015-07-08

    The affect heuristic is a rule of thumb according to which, in the process of making a judgment or decision, people use affect as a cue. If a stimulus elicits positive affect, then risks associated with that stimulus are viewed as low and benefits as high; conversely, if the stimulus elicits negative affect, then risks are perceived as high and benefits as low. The basic tenet of this study is that the affect heuristic guides workers' judgment and decision making in risk situations: the more workers like their organization, the lower they will perceive the risks to be. A sample of 115 employers and 65 employees working in small family agricultural businesses completed a questionnaire measuring perceived safety costs, psychological safety climate, affective commitment and safety compliance. A multi-sample structural analysis supported the thesis that safety compliance can be explained through affect-based heuristic reasoning, but only for employers. Positive affective commitment towards their family business reduced employers' compliance with safety procedures by increasing the perceived cost of implementing them.

  18. Partition search

    SciTech Connect

    Ginsberg, M.L.

    1996-12-31

    We introduce a new form of game search called partition search that incorporates dependency analysis, allowing substantial reductions in the portion of the tree that needs to be expanded. Both theoretical results and experimental data are presented. For the game of bridge, partition search provides approximately as much of an improvement over existing methods as α-β pruning provides over minimax.
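
    The core idea, caching values for whole sets of positions rather than single positions, can be hinted at with a toy memoized minimax whose cache key is a user-supplied abstraction function. This is only a crude illustration under strong assumptions; the published algorithm derives sound partitions from dependency analysis instead of relying on a hand-picked abstraction.

        # Toy illustration: one cache entry covers a whole equivalence
        # class of positions, not a single position. States are tuples
        # whose last element says whether the maximizer is to move.
        def minimax(state, moves, result, value, abstract, cache):
            key = abstract(state)
            if key in cache:
                return cache[key]
            ms = moves(state)
            if not ms:
                val = value(state)
            else:
                vals = [minimax(result(state, m), moves, result,
                                value, abstract, cache) for m in ms]
                val = max(vals) if state[1] else min(vals)
            cache[key] = val
            return val

        # Usage: a take-1-or-2 subtraction game, where the side unable to
        # move loses. The game value depends only on (stones mod 3, side
        # to move), so that pair is a sound partition of the state space.
        moves = lambda s: [1, 2] if s[0] > 0 else []
        result = lambda s, m: (s[0] - m, not s[1])
        value = lambda s: -1 if s[1] else 1
        abstract = lambda s: (s[0] % 3, s[1])
        print(minimax((10, True), moves, result, value, abstract, {}))  # 1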

  19. Three Dimensional Defect Reconstruction Using State Space Search and Woodbury's Substructure Method

    NASA Astrophysics Data System (ADS)

    Liu, X.; Deng, Y.; Li, Y.; Udpa, L.; Udpa, S. S.

    2010-02-01

    This paper introduces a model-based approach to reconstruct the three-dimensional defect profiles using eddy-current heat exchanger tube inspection signals. The method uses a Woodbury's substructure finite element forward model to simulate the underlying physics, a state space defect representation, and a tree search algorithm to solve the inverse problem. The advantage of the substructure method is that it divides the whole solution domain into two substructures and only the region of interest (ROI) with dramatic material changes will be updated in each iterative step. Since the number of elements inside the ROI is very small compared with the number of elements in the entire mesh, the computational effort needed in both LU factorization and coefficient matrix assembly is reduced. Therefore, the execution time is reduced significantly making the inversion very efficient. The initial inversion results are presented to confirm the validity of the approach.
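
    The efficiency argument rests on a low-rank update: when only the region of interest changes, the global system matrix changes by a term whose rank is on the order of the ROI's degrees of freedom, so the factorization of the unchanged part can be reused. The standard Woodbury matrix identity behind such updates is, in generic notation (which may differ from the paper's),

        \[ (A + UCV)^{-1} = A^{-1} - A^{-1} U \left( C^{-1} + V A^{-1} U \right)^{-1} V A^{-1}. \]

    Because the inner inverse has only the dimension of the update rather than of the full mesh, the per-iteration cost scales with the ROI size instead of the whole model.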

  20. Heuristic RNA pseudoknot prediction including intramolecular kissing hairpins

    PubMed Central

    Sperschneider, Jana; Datta, Amitava; Wise, Michael J.

    2011-01-01

    Pseudoknots are an essential feature of RNA tertiary structures. Simple H-type pseudoknots have been studied extensively in terms of biological functions, computational prediction, and energy models. Intramolecular kissing hairpins are a more complex and biologically important type of pseudoknot in which two hairpin loops form base pairs. They are hard to predict using free energy minimization due to high computational requirements. Heuristic methods that allow arbitrary pseudoknots strongly depend on the quality of energy parameters, which are not yet available for complex pseudoknots. We present an extension of the heuristic pseudoknot prediction algorithm DotKnot, which covers H-type pseudoknots and intramolecular kissing hairpins. Our framework allows for easy integration of advanced H-type pseudoknot energy models. For a test set of RNA sequences containing kissing hairpins and other types of pseudoknot structures, DotKnot outperforms competing methods from the literature. DotKnot is available as a web server under http://dotknot.csse.uwa.edu.au. PMID:21098139