Science.gov

Sample records for heuristic search method

  1. Search and heuristics

    SciTech Connect

    Pearl, J.

    1983-01-01

    This work comprises articles representative of current research on search and heuristics. The general theme is the quest for understanding the workings of heuristic knowledge: how it is acquired, stored, and used by people; how it can be represented and utilized by machines; and what makes one heuristic succeed where others fail. Topics covered include the following: search and reasoning in problem solving; theory formation by heuristic search; the nature of heuristics II: background and examples; Eurisko: a program that learns new heuristics and domain concepts; the nature of heuristics III: program design and results; searching for an optimal path in a tree with random costs; search rearrangement backtracking and polynomial average time; consistent-labeling problems and their algorithms: expected-complexities and theory-based heuristics; a general branch-and-bound formulation for understanding and synthesizing AND/OR tree search procedures; a minimax algorithm better than alpha-beta? Yes and no; and pathology on game trees revisited, and an alternative to minimaxing.

  2. Heuristic method for searches on large data-sets organised using network models

    NASA Astrophysics Data System (ADS)

    Ruiz-Fernández, D.; Quintana-Pacheco, Y.

    2016-05-01

    Searches on large data-sets have become an important issue in recent years. An alternative, which has achieved good results, is the use of methods relying on data mining techniques, such as cluster-based retrieval. This paper proposes a heuristic search that is based on an organisational model that reflects similarity relationships among data elements. The search is guided by using quality estimators of model nodes, which are obtained by the progressive evaluation of the given target function for the elements associated with each node. The results of the experiments confirm the effectiveness of the proposed algorithm. High-quality solutions are obtained evaluating a relatively small percentage of elements in the data-sets.

  3. Properties of heuristic search strategies

    NASA Technical Reports Server (NTRS)

    Vanderbrug, G. J.

    1973-01-01

    A directed graph is used to model the search space of a state space representation with single input operators, an AND/OR graph is used for problem reduction representations, and a theorem proving graph is used for state space representations with multiple input operators. These three graph models and heuristic strategies for searching them are surveyed. The completeness, admissibility, and optimality properties of search strategies which use the evaluation function f = (1 - omega)g + omega*h are presented and interpreted using a representation of the search process in the plane. The use of multiple output operators to imply dependent successors, and thus obtain a formalism which includes all three types of representations, is discussed.
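
    For illustration, the evaluation function above interpolates between uniform-cost search (omega = 0) and greedy best-first search (omega = 1). The minimal Python sketch below orders a best-first search by f = (1 - omega)g + omega*h; the toy graph, costs, and heuristic values are invented for the example, not taken from the report:

        import heapq

        def weighted_best_first(graph, h, start, goal, omega=0.5):
            """Best-first search ordered by f = (1 - omega) * g + omega * h."""
            frontier = [(omega * h[start], 0, start, [start])]
            best_g = {start: 0}
            while frontier:
                f, g, node, path = heapq.heappop(frontier)
                if node == goal:
                    return g, path
                for nxt, cost in graph[node]:
                    g2 = g + cost
                    if g2 < best_g.get(nxt, float("inf")):
                        best_g[nxt] = g2
                        heapq.heappush(frontier, ((1 - omega) * g2 + omega * h[nxt], g2, nxt, path + [nxt]))
            return None

        # omega = 0 reduces to uniform-cost search; omega = 1 to greedy best-first search.
        graph = {"A": [("B", 1), ("C", 4)], "B": [("C", 1), ("D", 5)], "C": [("D", 1)], "D": []}
        h = {"A": 2, "B": 2, "C": 1, "D": 0}
        print(weighted_best_first(graph, h, "A", "D", omega=0.5))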

  4. Object-oriented implementation of heuristic search methods for graph coloring, maximum clique and satisfiability

    SciTech Connect

    Ferland, J.; Fleurent, C.

    1994-12-31

    Using object-oriented design and the C++ programming language, generic operators are developed for tabu search and genetic algorithms. These operators are used for the graph coloring, maximum clique and satisfiability problems. Having all methods available for each problem makes it possible to consider hybrid schemes.

  5. A Graph Search Heuristic for Shortest Distance Paths

    SciTech Connect

    Chow, E

    2005-03-24

    This paper presents a heuristic for guiding A* search for finding the shortest distance path between two vertices in a connected, undirected, and explicitly stored graph. The heuristic requires a small amount of data to be stored at each vertex. The heuristic has application to quickly detecting relationships between two vertices in a large information or knowledge network. We compare the performance of this heuristic with breadth-first search on graphs with various topological properties. The results show that one or more orders of magnitude improvement in the number of vertices expanded is possible for large graphs, including Poisson random graphs.
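
    The abstract does not spell out the per-vertex data used. One common way to obtain an admissible per-vertex heuristic for A* on an explicitly stored graph is to precompute distances from a landmark vertex and use the triangle-inequality bound h(v) = |d(L, v) - d(L, t)|; the Python sketch below illustrates that assumed variant on a toy undirected graph and reports the number of expansions, the quantity the paper compares against breadth-first search:

        import heapq

        def dijkstra(adj, source):
            """Exact distances from one landmark; this is the small per-vertex data to store."""
            dist = {source: 0}
            pq = [(0, source)]
            while pq:
                d, u = heapq.heappop(pq)
                if d > dist.get(u, float("inf")):
                    continue
                for v, w in adj[u]:
                    if d + w < dist.get(v, float("inf")):
                        dist[v] = d + w
                        heapq.heappush(pq, (d + w, v))
            return dist

        def astar_landmark(adj, lm_dist, s, t):
            """A* where h(v) = |d(L, v) - d(L, t)| is a valid lower bound on d(v, t)."""
            h = lambda v: abs(lm_dist[v] - lm_dist[t])
            pq = [(h(s), 0, s)]
            best = {s: 0}
            expanded = 0
            while pq:
                f, g, u = heapq.heappop(pq)
                if u == t:
                    return g, expanded
                expanded += 1
                for v, w in adj[u]:
                    if g + w < best.get(v, float("inf")):
                        best[v] = g + w
                        heapq.heappush(pq, (g + w + h(v), g + w, v))
            return None, expanded

        # Undirected toy graph; vertex 0 doubles as the landmark.
        adj = {0: [(1, 1), (2, 4)], 1: [(0, 1), (2, 1), (3, 6)],
               2: [(0, 4), (1, 1), (3, 2)], 3: [(1, 6), (2, 2)]}
        print(astar_landmark(adj, dijkstra(adj, 0), 1, 3))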

  6. Automated discovery of local search heuristics for satisfiability testing.

    PubMed

    Fukunaga, Alex S

    2008-01-01

    The development of successful metaheuristic algorithms such as local search for a difficult problem such as satisfiability testing (SAT) is a challenging task. We investigate an evolutionary approach to automating the discovery of new local search heuristics for SAT. We show that several well-known SAT local search algorithms such as Walksat and Novelty are composite heuristics that are derived from novel combinations of a set of building blocks. Based on this observation, we developed CLASS, a genetic programming system that uses a simple composition operator to automatically discover SAT local search heuristics. New heuristics discovered by CLASS are shown to be competitive with the best Walksat variants, including Novelty+. Evolutionary algorithms have previously been applied to directly evolve a solution for a particular SAT instance. We show that the heuristics discovered by CLASS are also competitive with these previous, direct evolutionary approaches for SAT. We also analyze the local search behavior of the learned heuristics using the depth, mobility, and coverage metrics proposed by Schuurmans and Southey. PMID:18386995
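
    The building blocks that CLASS composes are not reproduced here. As background, the Python sketch below is a textbook WalkSAT-style local search (a random-walk flip versus a greedy flip within an unsatisfied clause), the kind of hand-designed heuristic whose discovery the paper automates; the CNF instance is illustrative only:

        import random

        def walksat(clauses, n_vars, p=0.5, max_flips=10000, seed=0):
            """Minimal WalkSAT: literals are +/- 1-based variable indices."""
            rng = random.Random(seed)
            assign = [rng.choice([False, True]) for _ in range(n_vars + 1)]
            sat = lambda lit: assign[abs(lit)] == (lit > 0)
            for _ in range(max_flips):
                unsat = [c for c in clauses if not any(sat(l) for l in c)]
                if not unsat:
                    return assign[1:]                  # satisfying assignment found
                clause = rng.choice(unsat)
                if rng.random() < p:                   # random-walk move
                    var = abs(rng.choice(clause))
                else:                                  # greedy move: leave the fewest clauses unsatisfied
                    def unsat_after_flip(v):
                        assign[v] = not assign[v]
                        count = sum(1 for c in clauses if not any(sat(l) for l in c))
                        assign[v] = not assign[v]
                        return count
                    var = min((abs(l) for l in clause), key=unsat_after_flip)
                assign[var] = not assign[var]
            return None

        # (x1 or x2) and (not x1 or x3) and (not x2 or not x3)
        print(walksat([[1, 2], [-1, 3], [-2, -3]], 3))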

  7. Engineering applications of heuristic multilevel optimization methods

    NASA Technical Reports Server (NTRS)

    Barthelemy, Jean-Francois M.

    1989-01-01

    Some engineering applications of heuristic multilevel optimization methods are presented and the discussion focuses on the dependency matrix that indicates the relationship between problem functions and variables. Coordination of the subproblem optimizations is shown to be typically achieved through the use of exact or approximate sensitivity analysis. Areas for further development are identified.

  8. Engineering applications of heuristic multilevel optimization methods

    NASA Technical Reports Server (NTRS)

    Barthelemy, Jean-Francois M.

    1988-01-01

    Some engineering applications of heuristic multilevel optimization methods are presented and the discussion focuses on the dependency matrix that indicates the relationship between problem functions and variables. Coordination of the subproblem optimizations is shown to be typically achieved through the use of exact or approximate sensitivity analysis. Areas for further development are identified.

  9. A heuristic method for reactive power planning

    SciTech Connect

    Mantovani, J.R.S.

    1996-02-01

    An approach for solving reactive power planning problems is presented, which is based on binary search techniques and the use of a special heuristic to obtain a discrete solution. Two versions were developed, one to run on conventional (sequential) computers and the other to run on a distributed memory (hypercube) machine. This latter parallel processing version employs an asynchronous programming model. Once the set of candidate buses has been defined, the program gives the location and size of the reactive sources needed (if any) in keeping with operating and security constraints.

  10. A comparison of heuristic search algorithms for molecular docking.

    PubMed

    Westhead, D R; Clark, D E; Murray, C W

    1997-05-01

    This paper describes the implementation and comparison of four heuristic search algorithms (genetic algorithm, evolutionary programming, simulated annealing and tabu search) and a random search procedure for flexible molecular docking. To our knowledge, this is the first application of the tabu search algorithm in this area. The algorithms are compared using a recently described fast molecular recognition potential function and a diverse set of five protein-ligand systems. Statistical analysis of the results indicates that overall the genetic algorithm performs best in terms of the median energy of the solutions located. However, tabu search shows a better performance in terms of locating solutions close to the crystallographic ligand conformation. These results suggest that a hybrid search algorithm may give superior results to any of the algorithms alone. PMID:9263849

  11. Symbolic Heuristic Search for Factored Markov Decision Processes

    NASA Technical Reports Server (NTRS)

    Morris, Robert (Technical Monitor); Feng, Zheng-Zhu; Hansen, Eric A.

    2003-01-01

    We describe a planning algorithm that integrates two approaches to solving Markov decision processes with large state spaces. State abstraction is used to avoid evaluating states individually. Forward search from a start state, guided by an admissible heuristic, is used to avoid evaluating all states. We combine these two approaches in a novel way that exploits symbolic model-checking techniques and demonstrates their usefulness for decision-theoretic planning.

  12. Minimizing conflicts: A heuristic repair method for constraint-satisfaction and scheduling problems

    NASA Technical Reports Server (NTRS)

    Minton, Steve; Johnston, Mark; Philips, Andrew; Laird, Phil

    1992-01-01

    This paper describes a simple heuristic approach to solving large-scale constraint satisfaction and scheduling problems. In this approach one starts with an inconsistent assignment for a set of variables and searches through the space of possible repairs. The search can be guided by a value-ordering heuristic, the min-conflicts heuristic, that attempts to minimize the number of constraint violations after each step. The heuristic can be used with a variety of different search strategies. We demonstrate empirically that on the n-queens problem, a technique based on this approach performs orders of magnitude better than traditional backtracking techniques. We also describe a scheduling application where the approach has been used successfully. A theoretical analysis is presented both to explain why this method works well on certain types of problems and to predict when it is likely to be most effective.
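
    A minimal Python sketch of the min-conflicts repair loop on the n-queens problem follows; tie-breaking, restarts, and other details of the paper's implementation are not reproduced:

        import random

        def conflicts(cols, row, col):
            """Number of queens in other rows attacking a queen placed at (row, col)."""
            return sum(1 for r, c in enumerate(cols)
                       if r != row and (c == col or abs(c - col) == abs(r - row)))

        def min_conflicts_queens(n, max_steps=10000, seed=1):
            rng = random.Random(seed)
            cols = [rng.randrange(n) for _ in range(n)]        # one queen per row: start inconsistent
            for _ in range(max_steps):
                conflicted = [r for r in range(n) if conflicts(cols, r, cols[r]) > 0]
                if not conflicted:
                    return cols                                # consistent assignment found
                row = rng.choice(conflicted)
                # repair: move this queen to the column minimizing the number of conflicts
                cols[row] = min(range(n), key=lambda c: (conflicts(cols, row, c), rng.random()))
            return None

        print(min_conflicts_queens(50))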

  13. A Library of Local Search Heuristics for the Vehicle Routing Problem

    SciTech Connect

    Groer, Christopher S; Golden, Bruce; Edward, Wasil

    2010-01-01

    The vehicle routing problem (VRP) is a difficult and well-studied combinatorial optimization problem. Real-world instances of the VRP can contain hundreds and even thousands of customer locations and can involve many complicating constraints, necessitating the use of heuristic methods. We present a software library of local search heuristics that allow one to quickly generate good solutions to VRP instances. The code has a logical, object-oriented design and uses efficient data structures to store and modify solutions. The core of the library is the implementation of seven local search operators that share a similar interface and are designed to be extended to handle additional options with minimal code change. The code is well-documented, is straightforward to compile, and is freely available for download at http://sites.google.com/site/vrphlibrary/ . The distribution of the code contains several applications that can be used to generate solutions to instances of the capacitated VRP.
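
    The library's actual interface is not reproduced here. As a minimal illustration of the kind of operator such a library implements, the Python sketch below applies the classic 2-opt move (segment reversal) to a single route over an invented distance matrix:

        def route_length(route, dist):
            return sum(dist[route[i]][route[i + 1]] for i in range(len(route) - 1))

        def two_opt(route, dist):
            """Keep reversing route segments while some reversal shortens the route."""
            best = route[:]
            improved = True
            while improved:
                improved = False
                for i in range(1, len(best) - 2):
                    for j in range(i + 1, len(best) - 1):
                        cand = best[:i] + best[i:j + 1][::-1] + best[j + 1:]
                        if route_length(cand, dist) < route_length(best, dist):
                            best, improved = cand, True
            return best

        # Symmetric toy distance matrix; the route starts and ends at depot 0.
        dist = [[0, 2, 9, 10], [2, 0, 6, 4], [9, 6, 0, 3], [10, 4, 3, 0]]
        route = two_opt([0, 2, 1, 3, 0], dist)
        print(route, route_length(route, dist))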

  14. Worst-case analysis of self-organizing sequential search heuristics

    SciTech Connect

    Bentley, J.L.; McGeoch, C.C.

    1983-03-01

    The performance of sequential search can be enhanced by the use of heuristics that move elements closer to the front of the list as they are found. Previous analyses have characterized the performance of such heuristics probabilistically. In this paper we show that the heuristics can also be analyzed in the worst-case sense, and that the relative merit of the heuristics under this analysis differs from that in the probabilistic analyses. Simulations show that the relative merit of the heuristics on real data is closer to that predicted by the new worst-case analyses than by the previous probabilistic analyses.
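
    The abstract does not list the heuristics analyzed; the classic self-organizing rules studied in this line of work include move-to-front and transpose, which the Python sketch below implements as an illustrative assumption:

        def sequential_search(items, key, heuristic="move_to_front"):
            """Linear scan; on a hit, reorganize the list according to the chosen heuristic."""
            for i, k in enumerate(items):
                if k == key:
                    if heuristic == "move_to_front" and i > 0:
                        items.insert(0, items.pop(i))                    # move the found key to the front
                    elif heuristic == "transpose" and i > 0:
                        items[i - 1], items[i] = items[i], items[i - 1]  # swap it one position forward
                    return i + 1                                         # comparisons used by this search
            return len(items)

        items = list("edcba")
        cost = sum(sequential_search(items, "a") for _ in range(5))
        print(items, cost)   # "a" migrates to the front, so repeated searches become cheaper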

  15. Geometry optimization of bimetallic clusters using an efficient heuristic method

    NASA Astrophysics Data System (ADS)

    Lai, Xiangjing; Xu, Ruchu; Huang, Wenqi

    2011-10-01

    In this paper, an efficient heuristic algorithm for geometry optimization of bimetallic clusters is proposed. The algorithm is mainly composed of three ingredients: the monotonic basin-hopping method with guided perturbation (MBH-GP), surface optimization method, and iterated local search (ILS) method, where MBH-GP and surface optimization method are used to optimize the geometric structure of a cluster, and the ILS method is used to search the optimal homotop for a fixed geometric structure. The proposed method is applied to Cu38-nAun (0 ≤ n ≤ 38), Ag55-nAun (0 ≤ n ≤ 55), and Cu55-nAun (0 ≤ n ≤ 55) clusters modeled by the many-body Gupta potential. Comparison with the results reported in the literature indicates that the present method is highly efficient and a number of new putative global minima missed in the previous papers are found. The present method should be a promising tool for the theoretical determination of ground-state structure of bimetallic clusters. Additionally, some key elements and properties of the present method are also analyzed.

  16. Performance of heuristic methods driven by chaotic dynamics for ATSP and applications to DNA fragment assembly

    NASA Astrophysics Data System (ADS)

    Kato, Tomohiro; Hasegawa, Mikio

    Chaotic dynamics has been shown to be effective in improving the performance of combinatorial optimization algorithms. In this paper, the performance of chaotic dynamics in the asymmetric traveling salesman problem (ATSP) is investigated by introducing three types of heuristic solution update methods. Numerical simulation has been carried out to compare its performance with simulated annealing and tabu search; thus, the effectiveness of the approach using chaotic dynamics for driving heuristic methods has been shown. The chaotic method is also evaluated in the case of a combinatorial optimization problem in the real world, which can be solved by the same heuristic operation as that for the ATSP. We apply the chaotic method to the DNA fragment assembly problem, which involves building a DNA sequence from several hundred fragments obtained by the genome sequencer. Our simulation results show that the proposed algorithm using chaotic dynamics in a block shift operation exhibits the best performance for the DNA fragment assembly problem.

  17. Divergence of Scientific Heuristic Method and Direct Algebraic Instruction

    ERIC Educational Resources Information Center

    Calucag, Lina S.

    2016-01-01

    This is an experimental study that made use of non-randomized experimental and control groups in a pretest-posttest design. The experimental and control groups were two separate intact classes in Algebra. For a period of twelve sessions, the experimental group was subjected to the scientific heuristic method, but the control group instead was given…

  18. Heuristic-based tabu search algorithm for folding two-dimensional AB off-lattice model proteins.

    PubMed

    Liu, Jingfa; Sun, Yuanyuan; Li, Gang; Song, Beibei; Huang, Weibo

    2013-12-01

    The protein structure prediction problem is a classical NP hard problem in bioinformatics. The lack of an effective global optimization method is the key obstacle in solving this problem. As one of the global optimization algorithms, tabu search (TS) algorithm has been successfully applied in many optimization problems. We define the new neighborhood conformation, tabu object and acceptance criteria of current conformation based on the original TS algorithm and put forward an improved TS algorithm. By integrating the heuristic initialization mechanism, the heuristic conformation updating mechanism, and the gradient method into the improved TS algorithm, a heuristic-based tabu search (HTS) algorithm is presented for predicting the two-dimensional (2D) protein folding structure in AB off-lattice model which consists of hydrophobic (A) and hydrophilic (B) monomers. The tabu search minimization leads to the basins of local minima, near which a local search mechanism is then proposed to further search for lower-energy conformations. To test the performance of the proposed algorithm, experiments are performed on four Fibonacci sequences and two real protein sequences. The experimental results show that the proposed algorithm has found the lowest-energy conformations so far for three shorter Fibonacci sequences and renewed the results for the longest one, as well as two real protein sequences, demonstrating that the HTS algorithm is quite promising in finding the ground states for AB off-lattice model proteins. PMID:24077543

  19. A simple heuristic for Internet-based evidence search in primary care: a randomized controlled trial

    PubMed Central

    Eberbach, Andreas; Becker, Annette; Rochon, Justine; Finkemeler, Holger; Wagner, Achim; Donner-Banzhoff, Norbert

    2016-01-01

    Background: General practitioners (GPs) are confronted with a wide variety of clinical questions, many of which remain unanswered. Methods: In order to assist GPs in finding quick, evidence-based answers, we developed a learning program (LP) with a short interactive workshop based on a simple three-step heuristic to improve their search and appraisal competence (SAC). We evaluated the LP effectiveness with a randomized controlled trial (RCT). Participants (intervention group [IG] n=20; control group [CG] n=31) rated acceptance and satisfaction and also answered 39 knowledge questions to assess their SAC. We controlled for previous knowledge in content areas covered by the test. Results: Main outcome (SAC): within both groups, the pre-post test shows significant (P=0.00) improvements in correctness (IG 15% vs CG 11%) and confidence (32% vs 26%) to find evidence-based answers. However, the SAC difference was not significant in the RCT. Other measures: most workshop participants rated “learning atmosphere” (90%), “skills acquired” (90%), and “relevancy to my practice” (86%) as good or very good. The LP recommendations were implemented by 67% of the IG, whereas 15% of the CG already conformed to LP recommendations spontaneously (odds ratio 9.6, P=0.00). After literature search, the IG showed a (not significantly) higher satisfaction regarding “time spent” (IG 80% vs CG 65%), “quality of information” (65% vs 54%), and “amount of information” (53% vs 47%). Conclusion: Long-standing established GPs have a good SAC. Despite high acceptance, strong learning effects, positive search experience, and significant increase of SAC in the pre-post test, the RCT of our LP showed no significant difference in SAC between IG and CG. However, we suggest that our simple decision heuristic merits further investigation. PMID:27563264

  20. Designing safe job rotation schedules using optimization and heuristic search.

    PubMed

    Carnahan, B J; Redfern, M S; Norman, B

    2000-04-01

    Job rotation is one method that is sometimes used to reduce exposure to strenuous materials handling; however, developing effective rotation schedules can be complex in even moderate sized facilities. The purpose of this research is to develop methods of incorporating safety criteria into scheduling algorithms to produce job rotation schedules that reduce the potential for injury. Integer programming and a genetic algorithm were used to construct job rotation schedules. Schedules were comprised of lifting tasks whose potential for causing injury was assessed with the Job Severity Index. Each method was used to design four job rotation schedules that met specified safety criteria in a working environment where the object weight, horizontal distance and repetition rate varied over time. Each rotation was assigned to a specific gender/lifting capacity group. Five versions of the integer programming search method were applied to this problem. Each version generated one job rotation schedule. The genetic algorithm model was able to create a population of 437 feasible solutions to the rotation problem. Utilizing cluster analysis, a rule set was derived from the genetic algorithm generated solutions. These rules provided guidelines for designing safe job rotation schedules without the use of a computer. The advantages and limitations of these approaches in developing administrative controls for the prevention of back injury are discussed. PMID:10801086

  1. An extended abstract: A heuristic repair method for constraint-satisfaction and scheduling problems

    NASA Technical Reports Server (NTRS)

    Minton, Steven; Johnston, Mark D.; Philips, Andrew B.; Laird, Philip

    1992-01-01

    The work described in this paper was inspired by a surprisingly effective neural network developed for scheduling astronomical observations on the Hubble Space Telescope. Our heuristic constraint satisfaction problem (CSP) method was distilled from an analysis of the network. In the process of carrying out the analysis, we discovered that the effectiveness of the network has little to do with its connectionist implementation. Furthermore, the ideas employed in the network can be implemented very efficiently within a symbolic CSP framework. The symbolic implementation is extremely simple. It also has the advantage that several different search strategies can be employed, although we have found that hill-climbing methods are particularly well-suited for the applications that we have investigated. We begin the paper with a brief review of the neural network. Following this, we describe our symbolic method for heuristic repair.

  2. Tuning Parameters in Heuristics by Using Design of Experiments Methods

    NASA Technical Reports Server (NTRS)

    Arin, Arif; Rabadi, Ghaith; Unal, Resit

    2010-01-01

    With the growing complexity of today's large scale problems, it has become more difficult to find optimal solutions by using exact mathematical methods. The need to find near-optimal solutions in an acceptable time frame requires heuristic approaches. In many cases, however, most heuristics have several parameters that need to be "tuned" before they can reach good results. The problem then turns into "finding the best parameter setting" for the heuristics to solve the problems efficiently and in a timely manner. The One-Factor-At-a-Time (OFAT) approach for parameter tuning neglects the interactions between parameters. Design of Experiments (DOE) tools can instead be employed to tune the parameters more effectively. In this paper, we seek the best parameter setting for a Genetic Algorithm (GA) to solve the single machine total weighted tardiness problem, in which n jobs must be scheduled on a single machine without preemption and the objective is to minimize the total weighted tardiness. Benchmark instances for the problem are available in the literature. To fine-tune the GA parameters in the most efficient way, we compare multiple DOE models including 2-level (2^k) full factorial design, orthogonal array design, central composite design, D-optimal design and signal-to-noise (S/N) ratios. In each DOE method, a mathematical model is created using regression analysis and solved to obtain the best parameter setting. After verification runs using the tuned parameter setting, the preliminary results show that optimal solutions for multiple instances were found efficiently.
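
    As an illustration of the workflow only (the GA below is a toy minimizer of sum(x^2), not the paper's tardiness GA, and the parameter levels are invented), a 2-level full factorial sweep over three assumed GA parameters can be written as:

        import itertools, random

        def run_ga(pop_size, mut_rate, cx_rate, seed=0, gens=60):
            """Stand-in for the heuristic being tuned: a toy GA minimizing sum(x^2)."""
            rng = random.Random(seed)
            fit = lambda x: sum(v * v for v in x)
            pop = [[rng.uniform(-5, 5) for _ in range(5)] for _ in range(pop_size)]
            for _ in range(gens):
                pop.sort(key=fit)
                nxt = pop[:2]                                   # elitism
                while len(nxt) < pop_size:
                    a, b = rng.sample(pop[:pop_size // 2], 2)   # parents from the better half
                    child = [ai if rng.random() > cx_rate else bi for ai, bi in zip(a, b)]
                    child = [v + rng.gauss(0, 0.3) if rng.random() < mut_rate else v for v in child]
                    nxt.append(child)
                pop = nxt
            return min(fit(x) for x in pop)

        # Two levels per factor gives a 2^3 full factorial design over three GA parameters.
        levels = {"pop_size": [20, 60], "mut_rate": [0.05, 0.3], "cx_rate": [0.3, 0.7]}
        results = [(run_ga(**dict(zip(levels, combo))), dict(zip(levels, combo)))
                   for combo in itertools.product(*levels.values())]
        print(min(results, key=lambda r: r[0]))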

  3. Amoeba-Inspired Heuristic Search Dynamics for Exploring Chemical Reaction Paths

    NASA Astrophysics Data System (ADS)

    Aono, Masashi; Wakabayashi, Masamitsu

    2015-09-01

    We propose a nature-inspired model for simulating chemical reactions in a computationally resource-saving manner. The model was developed by extending our previously proposed heuristic search algorithm, called "AmoebaSAT [Aono et al. 2013]," which was inspired by the spatiotemporal dynamics of a single-celled amoeboid organism that exhibits sophisticated computing capabilities in adapting to its environment efficiently [Zhu et al. 2013]. AmoebaSAT is used for solving an NP-complete combinatorial optimization problem [Garey and Johnson 1979], "the satisfiability problem," and finds a constraint-satisfying solution at a speed that is dramatically faster than one of the conventionally known fastest stochastic local search methods [Iwama and Tamaki 2004] for a class of randomly generated problem instances [http://www.cs.ubc.ca/~hoos/5/benchm.html]. In cases where the problem has more than one solution, AmoebaSAT exhibits dynamic transition behavior among a variety of the solutions. Inheriting these features of AmoebaSAT, we formulate "AmoebaChem," which explores a variety of metastable molecules in which several constraints determined by input atoms are satisfied and generates dynamic transition processes among the metastable molecules. AmoebaChem and its developed forms will be applied to the study of the origins of life, to discover reaction paths for which expected or unexpected organic compounds may be formed via unknown unstable intermediates and to estimate the likelihood of each of the discovered paths.

  4. Amoeba-Inspired Heuristic Search Dynamics for Exploring Chemical Reaction Paths.

    PubMed

    Aono, Masashi; Wakabayashi, Masamitsu

    2015-09-01

    We propose a nature-inspired model for simulating chemical reactions in a computationally resource-saving manner. The model was developed by extending our previously proposed heuristic search algorithm, called "AmoebaSAT [Aono et al. 2013]," which was inspired by the spatiotemporal dynamics of a single-celled amoeboid organism that exhibits sophisticated computing capabilities in adapting to its environment efficiently [Zhu et al. 2013]. AmoebaSAT is used for solving an NP-complete combinatorial optimization problem [Garey and Johnson 1979], "the satisfiability problem," and finds a constraint-satisfying solution at a speed that is dramatically faster than one of the conventionally known fastest stochastic local search methods [Iwama and Tamaki 2004] for a class of randomly generated problem instances [ http://www.cs.ubc.ca/~hoos/5/benchm.html ]. In cases where the problem has more than one solution, AmoebaSAT exhibits dynamic transition behavior among a variety of the solutions. Inheriting these features of AmoebaSAT, we formulate "AmoebaChem," which explores a variety of metastable molecules in which several constraints determined by input atoms are satisfied and generates dynamic transition processes among the metastable molecules. AmoebaChem and its developed forms will be applied to the study of the origins of life, to discover reaction paths for which expected or unexpected organic compounds may be formed via unknown unstable intermediates and to estimate the likelihood of each of the discovered paths. PMID:26129639

  5. Fitness landscapes, heuristics and technological paradigms: A critique on random search models in evolutionary economics

    NASA Astrophysics Data System (ADS)

    Frenken, Koen

    2001-06-01

    The biological evolution of complex organisms, in which the functioning of genes is interdependent, has been analyzed as "hill-climbing" on NK fitness landscapes through random mutation and natural selection. In evolutionary economics, NK fitness landscapes have been used to simulate the evolution of complex technological systems containing elements that are interdependent in their functioning. In these models, economic agents randomly search for new technological designs by trial-and-error and run the risk of ending up in sub-optimal solutions due to interdependencies between the elements in a complex system. These models of random search are legitimate for reasons of modeling simplicity, but remain limited because they ignore the fact that agents can apply heuristics. A specific heuristic is one that sequentially optimises functions according to their ranking by users of the system. To model this heuristic, a generalized NK-model is developed. In this model, core elements that influence many functions can be distinguished from peripheral elements that affect few functions. The concept of paradigmatic search can then be analytically defined as search that leaves core elements intact while concentrating on improving functions by mutation of peripheral elements.
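
    For reference, the Python sketch below implements the baseline being critiqued: random hill-climbing on a standard NK fitness landscape. The generalized core/peripheral model proposed in the paper is not implemented here:

        import random

        class NKLandscape:
            """Each of N bits contributes a fitness that depends on itself and K neighboring bits."""
            def __init__(self, n, k, seed=0):
                self.n, self.k = n, k
                self.rng = random.Random(seed)
                self.tables = [{} for _ in range(n)]

            def _contribution(self, i, genome):
                key = tuple(genome[(i + j) % self.n] for j in range(self.k + 1))
                if key not in self.tables[i]:
                    self.tables[i][key] = self.rng.random()   # draw each contribution lazily
                return self.tables[i][key]

            def fitness(self, genome):
                return sum(self._contribution(i, genome) for i in range(self.n)) / self.n

        def hill_climb(landscape, steps=2000, seed=1):
            """Random search: flip one random bit and accept non-worsening moves."""
            rng = random.Random(seed)
            genome = [rng.randint(0, 1) for _ in range(landscape.n)]
            for _ in range(steps):
                neighbor = genome[:]
                neighbor[rng.randrange(landscape.n)] ^= 1
                if landscape.fitness(neighbor) >= landscape.fitness(genome):
                    genome = neighbor
            return landscape.fitness(genome)

        print(hill_climb(NKLandscape(n=20, k=3)))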

  6. Scatter search heuristic for least-cost design of water distribution networks

    NASA Astrophysics Data System (ADS)

    Lin, Min-Der; Liu, Yu-Hsin; Liu, Gee-Fon; Chu, Chien-Wei

    2007-10-01

    The optimization problems of water distribution networks are complex, multi-modal and discrete-variable problems that cannot be easily solved with conventional optimization algorithms. Heuristic algorithms such as genetic algorithms, simulated annealing, tabu search and ant colony optimization have been extensively employed over the last decade. This article proposed an optimization procedure based on the scatter search (SS) framework, which is also a heuristic algorithm, to obtain the least-cost designs of three well-known looped water distribution networks (two-loop, Hanoi and New York networks). The computational results obtained with the three benchmark instances indicate that SS is able to find solutions comparable to those provided by some of the most competitive algorithms published in the literature.

  7. A Hybrid Tabu Search Heuristic for a Bilevel Competitive Facility Location Model

    NASA Astrophysics Data System (ADS)

    Küçükaydın, Hande; Aras, Necati; Altınel, I. Kuban

    We consider a problem in which a firm or franchise enters a market by locating new facilities where there are existing facilities belonging to a competitor. The firm aims at finding the location and attractiveness of each facility to be opened so as to maximize its profit. The competitor, on the other hand, can react by adjusting the attractiveness of its existing facilities, opening new facilities and/or closing existing ones with the objective of maximizing its own profit. The demand is assumed to be aggregated at certain points in the plane and the facilities of the firm can be located at prespecified candidate sites. We employ Huff's gravity-based rule in modeling the behavior of the customers where the fraction of customers at a demand point that visit a certain facility is proportional to the facility attractiveness and inversely proportional to the distance between the facility site and demand point. We formulate a bilevel mixed-integer nonlinear programming model where the firm entering the market is the leader and the competitor is the follower. In order to find a feasible solution of this model, we develop a hybrid tabu search heuristic which makes use of two exact methods as subroutines: a gradient ascent method and a branch-and-bound algorithm with nonlinear programming relaxation.

  8. Parameter Identification of Robot Manipulators: A Heuristic Particle Swarm Search Approach

    PubMed Central

    Yan, Danping; Lu, Yongzhong; Levy, David

    2015-01-01

    Parameter identification of robot manipulators is an indispensable pivotal process of achieving accurate dynamic robot models. Since these kinetic models are highly nonlinear, it is not easy to tackle the matter of identifying their parameters. To solve the difficulty effectively, we herewith present an intelligent approach, namely, a heuristic particle swarm optimization (PSO) algorithm, which we call the elitist learning strategy (ELS) and proportional integral derivative (PID) controller hybridized PSO approach (ELPIDSO). A specified PID controller is designed to improve particles’ local and global positions information together with ELS. Parameter identification of robot manipulators is conducted for performance evaluation of our proposed approach. Experimental results clearly indicate the following findings: Compared with standard PSO (SPSO) algorithm, ELPIDSO has improved a lot. It not only enhances the diversity of the swarm, but also features better search effectiveness and efficiency in solving practical optimization problems. Accordingly, ELPIDSO is superior to least squares (LS) method, genetic algorithm (GA), and SPSO algorithm in estimating the parameters of the kinetic models of robot manipulators. PMID:26039090

  9. Parameter identification of robot manipulators: a heuristic particle swarm search approach.

    PubMed

    Yan, Danping; Lu, Yongzhong; Levy, David

    2015-01-01

    Parameter identification of robot manipulators is an indispensable pivotal process of achieving accurate dynamic robot models. Since these kinetic models are highly nonlinear, it is not easy to tackle the matter of identifying their parameters. To solve the difficulty effectively, we herewith present an intelligent approach, namely, a heuristic particle swarm optimization (PSO) algorithm, which we call the elitist learning strategy (ELS) and proportional integral derivative (PID) controller hybridized PSO approach (ELPIDSO). A specified PID controller is designed to improve particles' local and global positions information together with ELS. Parameter identification of robot manipulators is conducted for performance evaluation of our proposed approach. Experimental results clearly indicate the following findings: Compared with standard PSO (SPSO) algorithm, ELPIDSO has improved a lot. It not only enhances the diversity of the swarm, but also features better search effectiveness and efficiency in solving practical optimization problems. Accordingly, ELPIDSO is superior to least squares (LS) method, genetic algorithm (GA), and SPSO algorithm in estimating the parameters of the kinetic models of robot manipulators. PMID:26039090

  10. Heuristic search in robot configuration space using variable metric

    NASA Technical Reports Server (NTRS)

    Verwer, Ben J. H.

    1987-01-01

    A method to generate obstacle free trajectories for both mobile robots and linked robots is proposed. The approach generates the shortest paths in a configuration space. The metric in the configuration space can be adjusted to obtain a tradeoff between safety and velocity by imposing extra costs on paths near obstacles.

  11. Hierarchical heuristic search using a Gaussian mixture model for UAV coverage planning.

    PubMed

    Lin, Lanny; Goodrich, Michael A

    2014-12-01

    During unmanned aerial vehicle (UAV) search missions, efficient use of UAV flight time requires flight paths that maximize the probability of finding the desired subject. The probability of detecting the desired subject based on UAV sensor information can vary in different search areas due to environment elements like varying vegetation density or lighting conditions, making it likely that the UAV can only partially detect the subject. This adds another dimension of complexity to the already difficult (NP-Hard) problem of finding an optimal search path. We present a new class of algorithms that account for partial detection in the form of a task difficulty map and produce paths that approximate the payoff of optimal solutions. The algorithms use the mode goodness ratio heuristic that uses a Gaussian mixture model to prioritize search subregions. The algorithms search for effective paths through the parameter space at different levels of resolution. We compare the performance of the new algorithms against two published algorithms (Bourgault's algorithm and LHC-GW-CONV algorithm) in simulated searches with three real search and rescue scenarios, and show that the new algorithms outperform existing algorithms significantly and can yield efficient paths that yield payoffs near the optimal. PMID:24691199

  12. Comparing three stochastic search algorithms for computational protein design: Monte Carlo, replica exchange Monte Carlo, and a multistart, steepest-descent heuristic.

    PubMed

    Mignon, David; Simonson, Thomas

    2016-07-15

    Computational protein design depends on an energy function and an algorithm to search the sequence/conformation space. We compare three stochastic search algorithms: a heuristic, Monte Carlo (MC), and a Replica Exchange Monte Carlo method (REMC). The heuristic performs a steepest-descent minimization starting from thousands of random starting points. The methods are applied to nine test proteins from three structural families, with a fixed backbone structure, a molecular mechanics energy function, and with 1, 5, 10, 20, 30, or all amino acids allowed to mutate. Results are compared to an exact, "Cost Function Network" method that identifies the global minimum energy conformation (GMEC) in favorable cases. The designed sequences accurately reproduce experimental sequences in the hydrophobic core. The heuristic and REMC agree closely and reproduce the GMEC when it is known, with a few exceptions. Plain MC performs well for most cases, occasionally departing from the GMEC by 3-4 kcal/mol. With REMC, the diversity of the sequences sampled agrees with exact enumeration where the latter is possible: up to 2 kcal/mol above the GMEC. Beyond, room temperature replicas sample sequences up to 10 kcal/mol above the GMEC, providing thermal averages and a solution to the inverse protein folding problem. © 2016 Wiley Periodicals, Inc. PMID:27197555
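
    The molecular mechanics energy function and rotamer sets are not reproduced here; the multistart steepest-descent idea itself can be sketched in Python on a toy discrete assignment problem with invented pairwise energies:

        import random

        def energy(assign, pair_cost):
            return sum(pair_cost[i][j][assign[i]][assign[j]]
                       for i in range(len(assign)) for j in range(i + 1, len(assign)))

        def steepest_descent(assign, choices, pair_cost):
            """Repeatedly apply the single-position change that lowers the energy the most."""
            current = energy(assign, pair_cost)
            while True:
                best_move, best_e = None, current
                for i in range(len(assign)):
                    for c in range(choices):
                        if c != assign[i]:
                            e = energy(assign[:i] + [c] + assign[i + 1:], pair_cost)
                            if e < best_e:
                                best_move, best_e = (i, c), e
                if best_move is None:
                    return current                       # local minimum reached
                assign[best_move[0]], current = best_move[1], best_e

        def multistart(n_pos=6, choices=4, starts=50, seed=0):
            rng = random.Random(seed)
            pair_cost = [[[[rng.uniform(0, 1) for _ in range(choices)] for _ in range(choices)]
                          for _ in range(n_pos)] for _ in range(n_pos)]
            return min(steepest_descent([rng.randrange(choices) for _ in range(n_pos)],
                                        choices, pair_cost) for _ in range(starts))

        print(multistart())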

  13. Parsing heuristic and forward search in first-graders' game-play behavior.

    PubMed

    Paz, Luciano; Goldin, Andrea P; Diuk, Carlos; Sigman, Mariano

    2015-07-01

    Seventy-three children between 6 and 7 years of age were presented with a problem having ambiguous subgoal ordering. Performance in this task showed reliable fingerprints: (a) a non-monotonic dependence of performance as a function of the distance between the beginning and the end-states of the problem, (b) very high levels of performance when the first move was correct, and (c) states in which accuracy of the first move was significantly below chance. These features are consistent with a non-Markov planning agent, with an inherently inertial decision process, and that uses heuristics and partial problem knowledge to plan its actions. We applied a statistical framework to fit and test the quality of a proposed planning model (Monte Carlo Tree Search). Our framework allows us to parse out independent contributions to problem-solving based on the construction of the value function and on general mechanisms of the search process in the tree of solutions. We show that the latter are correlated with children's performance on an independent measure of planning, while the former is highly domain specific. PMID:25302559

  14. Heuristic Search for Planning with Different Forced Goal-Ordering Constraints

    PubMed Central

    Zhang, Weiming; Cui, Jing; Zhu, Cheng; Huang, Jincai; Liu, Zhong

    2013-01-01

    Planning with forced goal-ordering (FGO) constraints has been proposed many times over the years, but there are still major difficulties in realizing these FGOs in plan generation. In certain planning domains, all the FGOs exist in the initial state. No matter which approach is adopted to achieve a subgoal, all the subgoals should be achieved in a given sequence from the initial state. Otherwise, the planning may arrive at a deadlock. For some other planning domains, there is no FGO in the initial state. However, FGO may occur during the planning process if certain subgoal is achieved by an inappropriate approach. This paper contributes to illustrate that it is the excludable constraints among the goal achievement operations (GAO) of different subgoals that introduce the FGOs into the planning problem, and planning with FGO is still a challenge for the heuristic search based planners. Then, a novel multistep forward search algorithm is proposed which can solve the planning problem with different FGOs efficiently. PMID:23935443

  15. An adaptive large neighborhood search heuristic for Two-Echelon Vehicle Routing Problems arising in city logistics

    PubMed Central

    Hemmelmayr, Vera C.; Cordeau, Jean-François; Crainic, Teodor Gabriel

    2012-01-01

    In this paper, we propose an adaptive large neighborhood search heuristic for the Two-Echelon Vehicle Routing Problem (2E-VRP) and the Location Routing Problem (LRP). The 2E-VRP arises in two-level transportation systems such as those encountered in the context of city logistics. In such systems, freight arrives at a major terminal and is shipped through intermediate satellite facilities to the final customers. The LRP can be seen as a special case of the 2E-VRP in which vehicle routing is performed only at the second level. We have developed new neighborhood search operators by exploiting the structure of the two problem classes considered and have also adapted existing operators from the literature. The operators are used in a hierarchical scheme reflecting the multi-level nature of the problem. Computational experiments conducted on several sets of instances from the literature show that our algorithm outperforms existing solution methods for the 2E-VRP and achieves excellent results on the LRP. PMID:23483764
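
    The 2E-VRP and LRP operators and the hierarchical scheme are not reproduced here; the adaptive large neighborhood search skeleton itself (destroy part of the solution, repair it greedily, and reward the operators that find improvements) can be sketched in Python on a single-route toy problem:

        import math, random

        def tour_len(tour, pts):
            return sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
                       for i in range(len(tour)))

        def greedy_repair(tour, removed, pts):
            for c in removed:                          # reinsert each removed customer at its cheapest position
                pos = min(range(1, len(tour) + 1),
                          key=lambda p: tour_len(tour[:p] + [c] + tour[p:], pts))
                tour.insert(pos, c)
            return tour

        def alns(pts, iters=2000, seed=0):
            rng = random.Random(seed)
            destroy_sizes = [1, 2, 3]                  # three "destroy" operators of increasing strength
            weights = [1.0] * len(destroy_sizes)
            tour = list(range(len(pts)))
            best = tour[:]
            for _ in range(iters):
                op = rng.choices(range(len(destroy_sizes)), weights)[0]
                cand = tour[:]
                removed = [cand.pop(rng.randrange(len(cand))) for _ in range(destroy_sizes[op])]
                cand = greedy_repair(cand, removed, pts)
                if tour_len(cand, pts) < tour_len(tour, pts):
                    tour = cand
                    weights[op] += 1.0                 # reward the operator that found an improvement
                if tour_len(tour, pts) < tour_len(best, pts):
                    best = tour[:]
                weights = [0.99 * w for w in weights]  # slowly forget old successes
            return best, tour_len(best, pts)

        rng = random.Random(42)
        pts = [(rng.random(), rng.random()) for _ in range(15)]
        print(alns(pts))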

  16. Development of a core collection for ramie by heuristic search based on SSR markers

    PubMed Central

    Luan, Ming-Bao; Zou, Zi-Zheng; Zhu, Juan-Juan; Wang, Xiao-Fei; Xu, Ying; Ma, Qing-Hua; Sun, Zhi-Min; Chen, Jian-Hua

    2014-01-01

    There are more than 2000 ramie germplasms in the National Ramie Germplasm Nursery affiliated with the Institute of Bast Fiber Crops, Chinese Academy of Agricultural Science, China. As it is difficult to perform effective conservation, management, evaluation, and utilization of redundant genetic resources, it is necessary to construct a core collection by using molecular markers. In this study, a core collection of ramie consisting of 22 germplasms was constructed from 108 accessions by heuristic search based on 21 Simple Sequence Repeat (SSR) marker combinations. The results showed that there is a poor relationship between the core collection and the geographic distribution. The number of amplification bands for the core collection was the same as that for the entire collection. Shannon's index for three of the SSR primers (14%) and Nei's index for nine of the SSR primers (19%) were lower in the core collection than in the entire collection. The true core collection had wider genetic diversity compared with the random core collection. Collectively, the core collection constructed in this study is reliable and represents the genetic diversity of all the 108 accessions. PMID:26019563

  17. Perturbation method for probabilistic search for the traveling salesperson problem

    NASA Astrophysics Data System (ADS)

    Cohoon, James P.; Karro, John E.; Martin, Worthy N.; Niebel, William D.; Nagel, Klaus

    1998-10-01

    The Traveling Salesperson Problem (TSP) is an NP-complete combinatorial optimization problem of substantial importance in many scheduling applications. Here we show the viability of SPAN, a hybrid approach to solving the TSP that incorporates a perturbation method applied to a classic heuristic in the overall context of a probabilistic search control strategy. In particular, the heuristic for the TSP is based on the minimal spanning tree of the city locations, the perturbation method is a simple modification of the city locations, and the control strategy is a genetic algorithm (GA). The crucial concept here is that the perturbation of the problem allows variant solutions to be generated by the heuristic and applied to the original problem, thus providing the GA with capabilities for both exploration and exploitation in its search process. We demonstrate that SPAN outperforms, with regard to solution quality, one of the best GA systems reported in the literature.

  18. QoE collaborative evaluation method based on fuzzy clustering heuristic algorithm.

    PubMed

    Bao, Ying; Lei, Weimin; Zhang, Wei; Zhan, Yuzhuo

    2016-01-01

    At present, realizing or improving the quality of experience (QoE) is a major goal for network media transmission services, and QoE evaluation is the basis for adjusting the transmission control mechanism. Therefore, a QoE collaborative evaluation method based on a fuzzy clustering heuristic algorithm is proposed in this paper, which concentrates on service score calculation at the server side. The server side collects network transmission quality of service (QoS) parameters, node location data, and user expectation values from client feedback information. It then manages the historical data in a database through a "big data" processing mode, and predicts user scores according to heuristic rules. On this basis, it completes fuzzy clustering analysis and generates a service QoE score and a management message, which are finally fed back to clients. This paper also discusses service evaluation generation rules, heuristic evaluation rules and fuzzy clustering analysis methods, and presents service-based QoE evaluation processes. The simulation experiments have verified the effectiveness of the QoE collaborative evaluation method based on fuzzy clustering heuristic rules. PMID:27398281

  19. PROOF OF CONCEPT FOR A HUMAN RELIABILITY ANALYSIS METHOD FOR HEURISTIC USABILITY EVALUATION OF SOFTWARE

    SciTech Connect

    Ronald L. Boring; David I. Gertman; Jeffrey C. Joe; Julie L. Marble

    2005-09-01

    An ongoing issue within human-computer interaction (HCI) is the need for simplified or “discount” methods. The current economic slowdown has necessitated innovative methods that are results-driven and cost-effective. The myriad methods of design and usability are currently being cost-justified, and new techniques are actively being explored that meet current budgets and needs. Recent efforts in human reliability analysis (HRA) are highlighted by the ten-year development of the Standardized Plant Analysis Risk HRA (SPAR-H) method. The SPAR-H method has been used primarily for determining human-centered risk at nuclear power plants. The SPAR-H method, however, shares task analysis underpinnings with HCI. Despite this methodological overlap, there is currently no HRA approach deployed in heuristic usability evaluation. This paper presents an extension of the existing SPAR-H method to be used as part of heuristic usability evaluation in HCI.

  20. BinAligner: a heuristic method to align biological networks.

    PubMed

    Yang, Jialiang; Li, Jun; Grünewald, Stefan; Wan, Xiu-Feng

    2013-01-01

    The advances in high throughput omics technologies have made it possible to characterize molecular interactions within and across various species. Alignments and comparison of molecular networks across species will help detect orthologs and conserved functional modules and provide insights on the evolutionary relationships of the compared species. However, such analyses are not trivial due to the complexity of network and high computational cost. Here we develop a mixture of global and local algorithm, BinAligner, for network alignments. Based on the hypotheses that the similarity between two vertices across networks would be context dependent and that the information from the edges and the structures of subnetworks can be more informative than vertices alone, two scoring schema, 1-neighborhood subnetwork and graphlet, were introduced to derive the scoring matrices between networks, besides the commonly used scoring scheme from vertices. Then the alignment problem is formulated as an assignment problem, which is solved by the combinatorial optimization algorithm, such as the Hungarian method. The proposed algorithm was applied and validated in aligning the protein-protein interaction network of Kaposi's sarcoma associated herpesvirus (KSHV) and that of varicella zoster virus (VZV). Interestingly, we identified several putative functional orthologous proteins with similar functions but very low sequence similarity between the two viruses. For example, KSHV open reading frame 56 (ORF56) and VZV ORF55 are helicase-primase subunits with sequence identity 14.6%, and KSHV ORF75 and VZV ORF44 are tegument proteins with sequence identity 15.3%. These functional pairs can not be identified if one restricts the alignment into orthologous protein pairs. In addition, BinAligner identified a conserved pathway between two viruses, which consists of 7 orthologous protein pairs and these proteins are connected by conserved links. This pathway might be crucial for virus packing and

  1. Utilization of Tabu search heuristic rules in sampling-based motion planning

    NASA Astrophysics Data System (ADS)

    Khaksar, Weria; Hong, Tang Sai; Sahari, Khairul Salleh Mohamed; Khaksar, Mansoor

    2015-05-01

    Path planning in unknown environments is one of the most challenging research areas in robotics. In this class of path planning, the robot acquires the information from its sensory system. Sampling-based path planning is one of the famous approaches with low memory and computational requirements that has been studied by many researchers during the past few decades. We propose a sampling-based algorithm for path planning in unknown environments using Tabu search. The Tabu search component of the proposed method guides the sampling to find the samples in the most promising areas and makes the sampling procedure more intelligent. The simulation results show the efficient performance of the proposed approach in different types of environments. We also compare the performance of the algorithm with some of the well-known path planning approaches, including Bug1, Bug2, PRM, RRT and the Visibility Graph. The comparison results support the claim of superiority of the proposed algorithm.

  2. Monte Carlo Method with Heuristic Adjustment for Irregularly Shaped Food Product Volume Measurement

    PubMed Central

    Siswantoro, Joko; Idrus, Bahari

    2014-01-01

    Volume measurement plays an important role in the production and processing of food products. Various methods have been proposed to measure the volume of food products with irregular shapes based on 3D reconstruction. However, 3D reconstruction comes with a high-priced computational cost. Furthermore, some of the volume measurement methods based on 3D reconstruction have a low accuracy. Another method for measuring volume of objects uses Monte Carlo method. Monte Carlo method performs volume measurements using random points. Monte Carlo method only requires information regarding whether random points fall inside or outside an object and does not require a 3D reconstruction. This paper proposes volume measurement using a computer vision system for irregularly shaped food products without 3D reconstruction based on Monte Carlo method with heuristic adjustment. Five images of food product were captured using five cameras and processed to produce binary images. Monte Carlo integration with heuristic adjustment was performed to measure the volume based on the information extracted from binary images. The experimental results show that the proposed method provided high accuracy and precision compared to the water displacement method. In addition, the proposed method is more accurate and faster than the space carving method. PMID:24892069
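
    The camera pipeline and the heuristic adjustment are not reproduced here; the underlying Monte Carlo volume estimate can be sketched in Python, with an analytic inside test for a unit sphere standing in for the multi-view binary-image test:

        import random

        def monte_carlo_volume(inside, bbox_min, bbox_max, samples=200000, seed=0):
            """Estimate volume as (fraction of random points inside) * bounding-box volume."""
            rng = random.Random(seed)
            box_volume = 1.0
            for lo, hi in zip(bbox_min, bbox_max):
                box_volume *= hi - lo
            hits = sum(1 for _ in range(samples)
                       if inside([rng.uniform(lo, hi) for lo, hi in zip(bbox_min, bbox_max)]))
            return box_volume * hits / samples

        # Inside test for a unit sphere; the true volume is 4/3 * pi = 4.19.
        in_sphere = lambda p: sum(v * v for v in p) <= 1.0
        print(monte_carlo_volume(in_sphere, (-1, -1, -1), (1, 1, 1)))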

  3. Extraction of microcracks in rock images based on heuristic graph searching and application

    NASA Astrophysics Data System (ADS)

    Luo, Zhihua; Zhu, Zhende; Ruan, Huaining; Shi, Chong

    2015-12-01

    In this paper, we propose a new method, based on a graph searching technique, for microcrack extraction from scanning electron microscopic images of rocks. This method mainly focuses on how to detect the crack, extract it, and then quantify some basic geometrical features. The crack can be detected automatically with the aid of two endpoints of the crack. The algorithm involves the following process: the A* graph searching technique is first used to find a path throughout the crack region, defined by the initial two endpoints; the pixels of the path are used as the seeds for the region growing method to restore the primary crack area; then, an automatic hole-filling operation is used to remove possible holes in the region growing result; the medial axis and distance transformation of the crack area are acquired, and the final crack is rebuilt by painting disks along a medial axis without branches. The crack is thus segmented without further interaction. Finally, the crack features are quantified, such as the length, width, angle and area, and error analysis shows that the error percentage of the proposed approach decreases to a low level as the actual width increases; results for some example images are illustrated. The algorithm is efficient and can also be used for image detection of other linear structural objects.

  4. Molecular systematics and biogeography of the southern South American freshwater "crabs" Aegla (Decapoda: Anomura: Aeglidae) using multiple heuristic tree search approaches.

    PubMed

    Pérez-Losada, Marcos; Bond-Buckup, Georgina; Jara, Carlos G; Crandall, Keith A

    2004-10-01

    Recently new heuristic genetic algorithms such as Treefinder and MetaGA have been developed to search for optimal trees in a maximum likelihood (ML) framework. In this study we combined these methods with other standard heuristic approaches such as ML and maximum parsimony hill-climbing searches and Bayesian inference coupled with Markov chain Monte Carlo techniques under homogeneous and mixed models of evolution to conduct an extensive phylogenetic analysis of the most abundant and widely distributed southern South American freshwater "crab," the Aegla (Anomura: Aeglidae). A total of 167 samples representing 64 Aegla species and subspecies were sequenced for one nuclear (28S rDNA) and four mitochondrial (12S and 16S rDNA, COI, and COII) genes (5352 bp total). Additionally, six other anomuran species from the genera Munida, Pachycheles, and Uroptychus (Galatheoidea), Lithodes (Paguroidea), and Lomis (Lomisoidea) and the nuclear 18S rDNA gene (1964 bp) were included in preliminary analyses for rooting the Aegla tree. Nonsignificantly different phylogenetic hypotheses resulted from all the different heuristic methods used here, although the best scored topologies found under the ML hill-climbing, Bayesian, and MetaGA approaches showed considerably better likelihood scores (Delta > 54) than those found under the MP and Treefinder approaches. Our trees provided strong support for most of the recognized Aegla species except for A. cholchol, A. jarai, A. parana, A. marginata, A. platensis, and A. franciscana, which may actually represent multiple species. Geographically, the Aegla group was divided into a basal western clade (21 species and subspecies) composed of two subclades with overlapping distributions, and a more recent central-eastern clade (43 species) composed of three subclades with fairly well-recognized distributions. This result supports the Pacific-Origin Hypothesis postulated for the group; alternative hypotheses of Atlantic or multiple origins were significantly

  5. Job-shop scheduling with a combination of evolutionary and heuristic methods

    NASA Astrophysics Data System (ADS)

    Patkai, Bela; Torvinen, Seppo

    1999-08-01

    Since almost all scheduling problems are NP-hard (they cannot be solved in polynomial time), companies that need a realistic scheduling system face serious limitations of available methods for finding an optimal schedule, especially if the given environment requires adaptation to dynamic variations. Exact methods do find an optimal schedule, but the size of the problem they can solve is very limited, thus precluding the required scalability. The solution presented in this paper is a simple, multi-pass heuristic method, which aims to avoid the limitations of other well-known formulations. Even though dispatching rules are fast and provide near-optimal solutions in most cases, they are severely limited in efficiency, especially when the schedule builder satisfies a significant number of constraints. That is the main motivation for adding a simplified genetic algorithm to the dispatching rules, which, due to its stochastic nature, is also a heuristic. The scheduling problem comes from a middle-sized Finnish factory, whose up-to-date manufacturing data has been used throughout the investigations for the sake of realistic calculations.

  6. Heuristic-biased stochastic sampling

    SciTech Connect

    Bresina, J.L.

    1996-12-31

    This paper presents a search technique for scheduling problems, called Heuristic-Biased Stochastic Sampling (HBSS). The underlying assumption behind the HBSS approach is that strictly adhering to a search heuristic often does not yield the best solution and, therefore, exploration off the heuristic path can prove fruitful. Within the HBSS approach, the balance between heuristic adherence and exploration can be controlled according to the confidence one has in the heuristic. By varying this balance, encoded as a bias function, the HBSS approach encompasses a family of search algorithms of which greedy search and completely random search are extreme members. We present empirical results from an application of HBSS to the real-world problem of observation scheduling. These results show that with the proper bias function, it can be easy to outperform greedy search.
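
    The bias-function idea is easy to make concrete. The sketch below is a minimal, illustrative rank-biased sampler (the names, the exponential bias function, and the toy task data are assumptions, not the paper's implementation): candidates are ranked by a heuristic, the bias function maps ranks to weights, and one candidate is drawn per decision. A steep bias approaches greedy search; a flat bias approaches random search.

    ```python
    import random

    def hbss_choose(candidates, heuristic, bias=lambda rank: 2.0 ** -rank):
        """Rank candidates by the heuristic (best first), weight the ranks with the
        bias function, then sample one candidate in proportion to those weights."""
        ranked = sorted(candidates, key=heuristic)            # lower heuristic value = better
        weights = [bias(r) for r in range(len(ranked))]
        pick = random.uniform(0.0, sum(weights))
        acc = 0.0
        for cand, w in zip(ranked, weights):
            acc += w
            if pick <= acc:
                return cand
        return ranked[-1]                                     # guard against round-off

    # Toy usage: choose the next observation by an (assumed) earliest-deadline heuristic.
    tasks = [{"id": "a", "deadline": 5}, {"id": "b", "deadline": 2}, {"id": "c", "deadline": 9}]
    print(hbss_choose(tasks, heuristic=lambda t: t["deadline"]))
    ```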

  7. An efficient heuristic method for dynamic portfolio selection problem under transaction costs and uncertain conditions

    NASA Astrophysics Data System (ADS)

    Najafi, Amir Abbas; Pourahmadi, Zahra

    2016-04-01

    Selecting the optimal combination of assets in a portfolio is one of the most important decisions in investment management. Because investment is a long-term activity, treating portfolio optimization as a single-period problem may forgo opportunities that could be exploited over a longer horizon. Hence, we extend the problem from a single-period to a multi-period model and include transaction costs and uncertain conditions, which makes the model more realistic and more complex. We therefore propose an efficient heuristic method to tackle this problem. The efficiency of the method is examined and compared with the results of rolling single-period optimization and of the buy-and-hold method, and the comparison shows the superiority of the proposed method.

  8. Heuristic status polling

    DOEpatents

    Archer, Charles J.; Blocksome, Michael A.; Heidelberger, Philip; Kumar, Sameer; Parker, Jeffrey J.; Ratterman, Joseph D.

    2011-06-07

    Methods, compute nodes, and computer program products are provided for heuristic status polling of a component in a computing system. Embodiments include receiving, by a polling module from a requesting application, a status request requesting status of a component; determining, by the polling module, whether an activity history for the component satisfies heuristic polling criteria; polling, by the polling module, the component for status if the activity history for the component satisfies the heuristic polling criteria; and not polling, by the polling module, the component for status if the activity history for the component does not satisfy the heuristic criteria.
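
    As an illustration only, a minimal sketch of that decision logic might look like the following; the activity-history representation and the polling criterion (a simple idle-time threshold) are assumptions made for this example, not the patented method.

    ```python
    import time

    class HeuristicPoller:
        """Poll a component only when its activity history satisfies a (here, very
        simple) heuristic criterion; otherwise return the cached status."""

        def __init__(self, idle_threshold_s=5.0):
            self.idle_threshold_s = idle_threshold_s
            self.last_activity = {}    # component -> timestamp of last observed activity
            self.cached_status = {}    # component -> last polled status

        def record_activity(self, component):
            self.last_activity[component] = time.time()

        def status(self, component, poll_fn):
            idle = time.time() - self.last_activity.get(component, 0.0)
            if component not in self.cached_status or idle >= self.idle_threshold_s:
                self.cached_status[component] = poll_fn(component)    # criterion met: poll
            return self.cached_status[component]                      # otherwise reuse cache

    poller = HeuristicPoller(idle_threshold_s=1.0)
    print(poller.status("node-7", poll_fn=lambda c: "healthy"))       # no history yet, so it polls
    ```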

  9. Lagrangian heuristic method for the wireless sensor network design problem in railway structural health monitoring

    NASA Astrophysics Data System (ADS)

    Hada, Akio; Soga, Kenichi; Liu, Ruoshui; Wassell, Ian J.

    2012-04-01

    In this paper, we study a design method for minimizing the total cost of a wireless sensor network (WSN) used for health monitoring of railway structures. First, we present the problem, that is to simultaneously determine the number of relays and their deployment locations, the transmission power level for each sensor and relay, and the routes for transferring sensor data to a gateway using multi-hop wireless communication. Second, we formulate this task as a mathematical programming problem, and to solve this problem, we propose a near optimal algorithm based on the Lagrangian heuristic method. Finally, we verify the effectiveness of our algorithm through computational experiments carried out using data acquired from a real WSN used for railway structural health monitoring.

  10. Iterative-deepening heuristic search for optimal and semi-optimal resource allocation

    NASA Technical Reports Server (NTRS)

    Bridges, Susan M.; Johannes, James D.

    1987-01-01

    It is demonstrated that when iterative-deepening A* (IDA*) is applied to one type of resource allocation problem, it uses far less storage than A* but opens far more nodes and thus has unacceptable time complexity. This is shown to be due, at least in part, to the low-valued effective branching factor that is a characteristic of problems with real-valued cost functions. The semi-optimal, epsilon-admissible IDA*ε search algorithm that the authors described was shown to open fewer nodes than both A* and IDA*, with storage complexity proportional to the depth of the search tree.
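
    For readers unfamiliar with the algorithm, a generic IDA* skeleton is sketched below (an illustrative sketch, not the authors' resource-allocation code): each iteration is a depth-first search bounded by a threshold on f = g + h, and the threshold is then raised to the smallest f-value that exceeded it, which is exactly why real-valued costs can force very many iterations.

    ```python
    import math

    def ida_star(start, h, successors, is_goal):
        """Generic IDA*: repeat a cost-bounded depth-first search, raising the
        f = g + h threshold to the smallest value that exceeded it last time."""
        def dfs(node, g, bound, path):
            f = g + h(node)
            if f > bound:
                return f, None                          # report the exceeding f-value
            if is_goal(node):
                return f, path
            minimum = math.inf
            for succ, step_cost in successors(node):
                if succ in path:                        # avoid cycles on the current path
                    continue
                t, found = dfs(succ, g + step_cost, bound, path + [succ])
                if found is not None:
                    return t, found
                minimum = min(minimum, t)
            return minimum, None

        bound = h(start)
        while True:
            bound, solution = dfs(start, 0.0, bound, [start])
            if solution is not None:
                return solution
            if bound == math.inf:                       # search space exhausted
                return None
    ```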

  11. Managing Heuristics as a Method of Inquiry in Autobiographical Graphic Design Theses

    ERIC Educational Resources Information Center

    Ings, Welby

    2011-01-01

    This article draws on case studies undertaken in postgraduate research at AUT University, Auckland. It seeks to address a number of issues related to heuristic inquiries employed by graphic design students who use autobiographical approaches when developing research-based theses. For this type of thesis, heuristics as a system of inquiry may…

  12. Simultaneous determination of aquifer parameters and zone structures with fuzzy c-means clustering and meta-heuristic harmony search algorithm

    NASA Astrophysics Data System (ADS)

    Ayvaz, M. Tamer

    2007-11-01

    This study proposes an inverse solution algorithm through which both the aquifer parameters and the zone structure of these parameters can be determined based on a given set of observations of piezometric heads. In the zone structure identification problem the fuzzy c-means (FCM) clustering method is used. The association of the zone structure with the transmissivity distribution is accomplished through an optimization model. The meta-heuristic harmony search (HS) algorithm, which is conceptualized using the musical process of searching for a perfect state of harmony, is used as the optimization technique. The optimum parameter zone structure is identified based on three criteria: residual error, parameter uncertainty, and structure discrimination. A numerical example given in the literature is solved to demonstrate the performance of the proposed algorithm. Also, a sensitivity analysis is performed to test the performance of the HS algorithm for different sets of solution parameters. Results indicate that the proposed solution algorithm is effective for the simultaneous identification of aquifer parameters and their corresponding zone structures.

  13. Using tree diversity to compare phylogenetic heuristics

    PubMed Central

    Sul, Seung-Jin; Matthews, Suzanne; Williams, Tiffani L

    2009-01-01

    Background Evolutionary trees are family trees that represent the relationships between a group of organisms. Phylogenetic heuristics are used to search stochastically for the best-scoring trees in tree space. Given that better tree scores are believed to be better approximations of the true phylogeny, traditional evaluation techniques have used tree scores to determine the heuristics that find the best scores in the fastest time. We develop new techniques to evaluate phylogenetic heuristics based on both tree scores and topologies to compare Pauprat and Rec-I-DCM3, two popular Maximum Parsimony search algorithms. Results Our results show that although Pauprat and Rec-I-DCM3 find the trees with the same best scores, topologically these trees are quite different. Furthermore, the Rec-I-DCM3 trees cluster distinctly from the Pauprat trees. In addition to our heatmap visualizations, which use parsimony scores and the Robinson-Foulds distance to compare the best-scoring trees found by the two heuristics, we also develop entropy-based methods to show the diversity of the trees found. Overall, Pauprat identifies more diverse trees than Rec-I-DCM3. Conclusion Overall, our work shows that there is value to comparing heuristics beyond the parsimony scores that they find. Pauprat is a slower heuristic than Rec-I-DCM3. However, our work shows that there is tremendous value in using Pauprat to reconstruct trees—especially since it finds identical scoring but topologically distinct trees. Hence, instead of discounting Pauprat, effort should go into improving its implementation. Ultimately, improved performance measures lead to better phylogenetic heuristics and will result in better approximations of the true evolutionary history of the organisms of interest. PMID:19426451

  14. A heuristic method for consumable resource allocation in multi-class dynamic PERT networks

    NASA Astrophysics Data System (ADS)

    Yaghoubi, Saeed; Noori, Siamak; Mazdeh, Mohammad Mahdavi

    2013-06-01

    This investigation presents a heuristic method for the consumable resource allocation problem in multi-class dynamic Project Evaluation and Review Technique (PERT) networks, where new projects from different classes (types) arrive at the system according to independent Poisson processes with different arrival rates. Each activity of a project is performed at a dedicated service station located at a node of the network, with exponentially distributed service times according to its class. Each project arrives at the first service station and continues its routing according to the precedence network of its class. Such a system can be represented as a queueing network in which the queue discipline is first come, first served. In the presented method, the multi-class system is decomposed into several single-class dynamic PERT networks, and each class is considered separately as a minisystem. In modeling the single-class dynamic PERT network, we use a Markov process and a multi-objective model investigated by Azaron and Tavakkoli-Moghaddam in 2007. Then, after obtaining the resources allocated to the service stations in every minisystem, the final resources allocated to activities are calculated by the proposed method.

  15. SP-100 shield design automation process using expert system and heuristic search techniques

    NASA Astrophysics Data System (ADS)

    Marcille, Thomas F.; Protsik, Robert; Deane, Nelson A.; Hoover, Darryl G.

    1993-01-01

    The SP-100 shield subsystem design process has been modified to utilize the GE Corporate Research and Development program, ENGINEOUS (Tong 1990). ENGINEOUS is a software system that automates the use of Computer Aided Engineering (CAE) analysis programs in the engineering design process. The shield subsystem design process incorporates a nuclear subsystems design and performance code, a two-dimensional neutral particle transport code, several input processors and two general purpose neutronic output processors. Coupling these programs within ENGINEOUS provides automatic transition paths between applications, with no source code modifications. ENGINEOUS captures human design knowledge, as well as information about the specific CAE applications, and stores this information in knowledge base files. The knowledge base information is used by the ENGINEOUS expert system to drive knowledge-directed and knowledge-supplemented search modules to find an optimum shield design for a given reactor definition, ensuring that specified constraints are satisfied. Alternate designs, not accommodated in the optimization design rules, can readily be explored through the use of a parametric study capability.

  16. Recursive heuristic classification

    NASA Technical Reports Server (NTRS)

    Wilkins, David C.

    1994-01-01

    The author will describe a new problem-solving approach called recursive heuristic classification, whereby a subproblem of heuristic classification is itself formulated and solved by heuristic classification. This allows the construction of more knowledge-intensive classification programs in a way that yields a clean organization. Further, standard knowledge acquisition and learning techniques for heuristic classification can be used to create, refine, and maintain the knowledge base associated with the recursively called classification expert system. The method of recursive heuristic classification was used in the Minerva blackboard shell for heuristic classification. Minerva recursively calls itself every problem-solving cycle to solve the important blackboard scheduler task, which involves assigning a desirability rating to alternative problem-solving actions. Knowing these ratings is critical to the use of an expert system as a component of a critiquing or apprenticeship tutoring system. One innovation of this research is a method called dynamic heuristic classification, which allows selection among dynamically generated classification categories instead of requiring them to be pre-enumerated.

  17. Effective heuristics and meta-heuristics for the quadratic assignment problem with tuned parameters and analytical comparisons

    NASA Astrophysics Data System (ADS)

    Bashiri, Mahdi; Karimi, Hossein

    2012-07-01

    The quadratic assignment problem (QAP) is a well-known problem in facility location and layout. It belongs to the NP-complete class. There are many heuristic and meta-heuristic methods presented for the QAP in the literature. In this paper, we applied 2-opt, greedy 2-opt, 3-opt, greedy 3-opt, and VNZ as heuristic methods and tabu search (TS), simulated annealing, and particle swarm optimization as meta-heuristic methods for the QAP. This research is dedicated to comparing the relative percentage deviation of these solution qualities from the best known solutions, as reported in QAPLIB. Furthermore, a tuning method is applied for the meta-heuristic parameters. Results indicate that TS is the best in 31% of QAPs, and the IFLS method, which is in the literature, is the best in 58% of QAPs; these two methods are the same in 11% of test problems. Also, TS has a better computational time among heuristic and meta-heuristic methods.
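
    To make the simplest of these moves concrete, the sketch below is a plain first-improvement 2-opt for the QAP (illustrative only; the instance data are made up and the paper's tuned variants are not reproduced): a move swaps the facilities assigned to two locations and is kept only if it lowers the cost.

    ```python
    import itertools

    def qap_cost(perm, flow, dist):
        """Cost of assigning facility perm[i] to location i."""
        n = len(perm)
        return sum(flow[perm[i]][perm[j]] * dist[i][j]
                   for i in range(n) for j in range(n))

    def two_opt(perm, flow, dist):
        """First-improvement 2-opt: keep swapping pairs of assignments while it helps."""
        perm, best = list(perm), qap_cost(perm, flow, dist)
        improved = True
        while improved:
            improved = False
            for i, j in itertools.combinations(range(len(perm)), 2):
                perm[i], perm[j] = perm[j], perm[i]
                cost = qap_cost(perm, flow, dist)
                if cost < best:
                    best, improved = cost, True
                else:
                    perm[i], perm[j] = perm[j], perm[i]    # undo a non-improving swap
        return perm, best

    # Tiny illustrative instance (assumed flow and distance matrices).
    flow = [[0, 3, 1], [3, 0, 2], [1, 2, 0]]
    dist = [[0, 1, 2], [1, 0, 1], [2, 1, 0]]
    print(two_opt([0, 1, 2], flow, dist))
    ```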

  18. A Tabu-Search Heuristic for Deterministic Two-Mode Blockmodeling of Binary Network Matrices

    ERIC Educational Resources Information Center

    Brusco, Michael; Steinley, Douglas

    2011-01-01

    Two-mode binary data matrices arise in a variety of social network contexts, such as the attendance or non-attendance of individuals at events, the participation or lack of participation of groups in projects, and the votes of judges on cases. A popular method for analyzing such data is two-mode blockmodeling based on structural equivalence, where…

  19. Heuristic decision making in medicine

    PubMed Central

    Marewski, Julian N.; Gigerenzer, Gerd

    2012-01-01

    Can less information be more helpful when it comes to making medical decisions? Contrary to the common intuition that more information is always better, the use of heuristics can help both physicians and patients to make sound decisions. Heuristics are simple decision strategies that ignore part of the available information, basing decisions on only a few relevant predictors. We discuss: (i) how doctors and patients use heuristics; and (ii) when heuristics outperform information-greedy methods, such as regressions in medical diagnosis. Furthermore, we outline those features of heuristics that make them useful in health care settings. These features include their surprising accuracy, transparency, and wide accessibility, as well as the low costs and little time required to employ them. We close by explaining one of the statistical reasons why heuristics are accurate, and by pointing to psychiatry as one area for future research on heuristics in health care. PMID:22577307

  20. Development of a cell formation heuristic by considering realistic data using principal component analysis and Taguchi's method

    NASA Astrophysics Data System (ADS)

    Kumar, Shailendra; Sharma, Rajiv Kumar

    2015-12-01

    Over the last four decades, numerous cell formation algorithms have been developed and tested, yet this research remains of interest to this day. Appropriate manufacturing cell formation is the first step in designing a cellular manufacturing system. In cellular manufacturing, consideration of manufacturing flexibility and production-related data is vital for cell formation. Considering this realistic data makes the cell formation problem very complex and tedious, which has led to the invention and implementation of highly advanced and complex cell formation methods. In this paper an effort has been made to develop a simple and easy-to-understand/implement manufacturing cell formation heuristic procedure that considers a number of production- and manufacturing-flexibility-related parameters. The heuristic minimizes inter-cellular movement cost/time. Further, the proposed heuristic is modified for the application of principal component analysis and Taguchi's method. A numerical example is explained to illustrate the approach. A refinement in the results is observed with the adoption of principal component analysis and Taguchi's method.

  1. The Impact of Using the Heuristic Teaching Method on Jordanian Mathematics Students

    ERIC Educational Resources Information Center

    Al-Fayez, Mona Qutefan; Jubran, Sereen Mousa

    2012-01-01

    This study investigates the impact of using the heuristic teaching approach for teaching mathematics to tenth grade students in Jordan. The researchers followed the equivalent pre/post T test two group designs. To achieve the goal of the study, a pre/post- test was constructed to measure student achievement in mathematics. The sample for this…

  2. Heuristics as a Basis for Assessing Creative Potential: Measures, Methods, and Contingencies

    ERIC Educational Resources Information Center

    Vessey, William B.; Mumford, Michael D.

    2012-01-01

    Studies of creative thinking skills have generally measured a single aspect of creativity, divergent thinking. A number of other processes involved in creative thought have been identified. Effective execution of these processes is held to depend on the strategies applied in process execution, or heuristics. In this article, we review prior…

  3. Twilight of the Slogans: A Heuristic Investigation of Linguistic Memes Using Mixed Methods

    ERIC Educational Resources Information Center

    Duffy, Curt Paul

    2013-01-01

    Slogans, or linguistic memes, are short, memorable phrases that are present in commercial, political, and everyday discourse. Slogans propagate similarly to other memes, or cultural units, through an evolutionary mechanism first proposed by Dawkins (1976). Heuristic inquiry, as presented by Moustakas (1990), provided a template from which to…

  4. The min-conflicts heuristic: Experimental and theoretical results

    NASA Technical Reports Server (NTRS)

    Minton, Steven; Philips, Andrew B.; Johnston, Mark D.; Laird, Philip

    1991-01-01

    This paper describes a simple heuristic method for solving large-scale constraint satisfaction and scheduling problems. Given an initial assignment for the variables in a problem, the method operates by searching through the space of possible repairs. The search is guided by an ordering heuristic, the min-conflicts heuristic, that attempts to minimize the number of constraint violations after each step. We demonstrate empirically that the method performs orders of magnitude better than traditional backtracking techniques on certain standard problems. For example, the one million queens problem can be solved rapidly using our approach. We also describe practical scheduling applications where the method has been successfully applied. A theoretical analysis is presented to explain why the method works so well on certain types of problems and to predict when it is likely to be most effective.
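
    The repair loop itself is short. The sketch below applies it to the n-queens problem (one queen per column), in the spirit of the description above rather than as the authors' code: pick a conflicted variable at random and move it to the value that minimizes the number of violated constraints.

    ```python
    import random

    def conflicts(rows, col, row):
        """Queens attacking square (row, col), given one queen per column at rows[c]."""
        return sum(1 for c, r in enumerate(rows)
                   if c != col and (r == row or abs(r - row) == abs(c - col)))

    def min_conflicts_queens(n, max_steps=100000):
        rows = [random.randrange(n) for _ in range(n)]          # random initial assignment
        for _ in range(max_steps):
            conflicted = [c for c in range(n) if conflicts(rows, c, rows[c]) > 0]
            if not conflicted:
                return rows                                     # every constraint satisfied
            col = random.choice(conflicted)                     # repair a conflicted variable
            rows[col] = min(range(n), key=lambda r: conflicts(rows, col, r))
        return None                                             # give up after max_steps

    print(min_conflicts_queens(20))
    ```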

  5. Meta-heuristic algorithms as tools for hydrological science

    NASA Astrophysics Data System (ADS)

    Yoo, Do Guen; Kim, Joong Hoon

    2014-12-01

    In this paper, meta-heuristic optimization techniques and their applications to water resources engineering, particularly in hydrological science, are introduced. In recent years, meta-heuristic optimization techniques have been introduced that can overcome the problems inherent in iterative simulations. These methods are able to find good solutions and require limited computation time and memory use without requiring complex derivatives. Simulation-based meta-heuristic methods such as genetic algorithms (GAs) and Harmony Search (HS) have powerful searching abilities, which can occasionally overcome several drawbacks of traditional mathematical methods. For example, HS algorithms are conceptualized from a musical performance process and used to achieve better harmony; such optimization algorithms seek a near-global optimum determined by the value of an objective function, providing a more robust determination of musical performance than can be achieved through typical aesthetic estimation. In this paper, meta-heuristic algorithms and their applications in hydrological science (with a focus on GAs and HS) are discussed by subject, including a review of the existing literature in the field. Then, recent trends in optimization are presented and a relatively new technique, the Smallest Small World Cellular Harmony Search (SSWCHS), is briefly introduced, with a summary of promising results obtained in previous studies. Previous studies have demonstrated that meta-heuristic algorithms are effective tools for the development of hydrological models and the management of water resources.
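
    To make the musical analogy concrete, the fragment below is a bare-bones continuous HS loop with the three standard ingredients (memory consideration, pitch adjustment, random selection); the parameter names, values, and test function are assumptions for illustration, not taken from any of the reviewed studies.

    ```python
    import random

    def harmony_search(objective, bounds, memory_size=20, hmcr=0.9, par=0.3,
                       bandwidth=0.05, iterations=2000):
        """Minimize `objective` over the box `bounds` with a basic harmony search."""
        dim = len(bounds)
        rand_in = lambda d: random.uniform(*bounds[d])
        memory = [[rand_in(d) for d in range(dim)] for _ in range(memory_size)]
        for _ in range(iterations):
            new = []
            for d in range(dim):
                if random.random() < hmcr:                      # memory consideration
                    value = random.choice(memory)[d]
                    if random.random() < par:                   # pitch adjustment
                        value += random.uniform(-1.0, 1.0) * bandwidth
                else:                                           # random selection
                    value = rand_in(d)
                lo, hi = bounds[d]
                new.append(min(max(value, lo), hi))             # clamp to the search box
            worst = max(range(memory_size), key=lambda i: objective(memory[i]))
            if objective(new) < objective(memory[worst]):       # replace the worst harmony
                memory[worst] = new
        return min(memory, key=objective)

    # Toy usage: minimize a simple quadratic (assumed test function).
    print(harmony_search(lambda x: (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2, [(-5, 5), (-5, 5)]))
    ```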

  6. The double-assignment method for the exponential chaotic tabu search in quadratic assignment problems

    NASA Astrophysics Data System (ADS)

    Shibata, Kazuaki; Horio, Yoshihiko; Aihara, Kazuyuki

    The quadratic assignment problem (QAP) is one of the NP-hard combinatorial optimization problems. An exponential chaotic tabu search using a 2-opt algorithm driven by chaotic neuro-dynamics has been proposed as one heuristic method for solving QAPs. In this paper we first propose a new local search, the double-assignment method, suitable for the exponential chaotic tabu search, which adopts features of the Lin-Kernighan algorithm. We then introduce chaotic neuro-dynamics into the double-assignment method to propose a novel exponential chaotic tabu search. We further improve the proposed exponential chaotic tabu search with the double-assignment method by enhancing the effect of chaotic neuro-dynamics.

  7. Heuristic Learning and Programmed Learning

    ERIC Educational Resources Information Center

    Okon, Wincenty

    1969-01-01

    Flexibility in the application of instructional methods is the best approach to a teaching situation. Insofar as possible, a combination of traditional learning methods with heuristic (problem solving) and programmed ones should be used. (CK)

  8. A parallel graph coloring heuristic

    SciTech Connect

    Jones, M.T.; Plassmann, P.E. )

    1993-05-01

    The problem of computing good graph colorings arises in many diverse applications, such as in the estimation of sparse Jacobians and in the development of efficient, parallel iterative methods for solving sparse linear systems. This paper presents an asynchronous graph coloring heuristic well suited to distributed memory parallel computers. Experimental results obtained on an Intel iPSC/860 are presented, which demonstrate that, for graphs arising from finite element applications, the heuristic exhibits scalable performance and generates colorings usually within three or four colors of the best-known linear time sequential heuristics. For bounded degree graphs, it is shown that the expected running time of the heuristic under the P-RAM computation model is bounded by O(log(n)/log log(n)). This bound is an improvement over the previously known best upper bound for the expected running time of a random heuristic for the graph coloring problem.

  9. A hybrid color space for skin detection using genetic algorithm heuristic search and principal component analysis technique.

    PubMed

    Maktabdar Oghaz, Mahdi; Maarof, Mohd Aizaini; Zainal, Anazida; Rohani, Mohd Foad; Yaghoubyan, S Hadi

    2015-01-01

    Color is one of the most prominent features of an image and used in many skin and face detection applications. Color space transformation is widely used by researchers to improve face and skin detection performance. Despite the substantial research efforts in this area, choosing a proper color space in terms of skin and face classification performance which can address issues like illumination variations, various camera characteristics and diversity in skin color tones has remained an open issue. This research proposes a new three-dimensional hybrid color space termed SKN by employing the Genetic Algorithm heuristic and Principal Component Analysis to find the optimal representation of human skin color in over seventeen existing color spaces. Genetic Algorithm heuristic is used to find the optimal color component combination setup in terms of skin detection accuracy while the Principal Component Analysis projects the optimal Genetic Algorithm solution to a less complex dimension. Pixel wise skin detection was used to evaluate the performance of the proposed color space. We have employed four classifiers including Random Forest, Naïve Bayes, Support Vector Machine and Multilayer Perceptron in order to generate the human skin color predictive model. The proposed color space was compared to some existing color spaces and shows superior results in terms of pixel-wise skin detection accuracy. Experimental results show that by using Random Forest classifier, the proposed SKN color space obtained an average F-score and True Positive Rate of 0.953 and False Positive Rate of 0.0482 which outperformed the existing color spaces in terms of pixel wise skin detection accuracy. The results also indicate that among the classifiers used in this study, Random Forest is the most suitable classifier for pixel wise skin detection applications. PMID:26267377

  10. A Hybrid Color Space for Skin Detection Using Genetic Algorithm Heuristic Search and Principal Component Analysis Technique

    PubMed Central

    2015-01-01

    Color is one of the most prominent features of an image and used in many skin and face detection applications. Color space transformation is widely used by researchers to improve face and skin detection performance. Despite the substantial research efforts in this area, choosing a proper color space in terms of skin and face classification performance which can address issues like illumination variations, various camera characteristics and diversity in skin color tones has remained an open issue. This research proposes a new three-dimensional hybrid color space termed SKN by employing the Genetic Algorithm heuristic and Principal Component Analysis to find the optimal representation of human skin color in over seventeen existing color spaces. Genetic Algorithm heuristic is used to find the optimal color component combination setup in terms of skin detection accuracy while the Principal Component Analysis projects the optimal Genetic Algorithm solution to a less complex dimension. Pixel wise skin detection was used to evaluate the performance of the proposed color space. We have employed four classifiers including Random Forest, Naïve Bayes, Support Vector Machine and Multilayer Perceptron in order to generate the human skin color predictive model. The proposed color space was compared to some existing color spaces and shows superior results in terms of pixel-wise skin detection accuracy. Experimental results show that by using Random Forest classifier, the proposed SKN color space obtained an average F-score and True Positive Rate of 0.953 and False Positive Rate of 0.0482 which outperformed the existing color spaces in terms of pixel wise skin detection accuracy. The results also indicate that among the classifiers used in this study, Random Forest is the most suitable classifier for pixel wise skin detection applications. PMID:26267377

  11. A dynamic multiarmed bandit-gene expression programming hyper-heuristic for combinatorial optimization problems.

    PubMed

    Sabar, Nasser R; Ayob, Masri; Kendall, Graham; Qu, Rong

    2015-02-01

    Hyper-heuristics are search methodologies that aim to provide high-quality solutions across a wide variety of problem domains, rather than developing tailor-made methodologies for each problem instance/domain. A traditional hyper-heuristic framework has two levels, namely, the high-level strategy (heuristic selection mechanism and acceptance criterion) and the low-level heuristics (a set of problem-specific heuristics). Due to the different landscape structures of different problem instances, the high-level strategy plays an important role in the design of a hyper-heuristic framework. In this paper, we propose a new high-level strategy for a hyper-heuristic framework. The proposed high-level strategy utilizes a dynamic multiarmed bandit-extreme value-based reward as an online heuristic selection mechanism to select the appropriate heuristic to be applied at each iteration. In addition, we propose a gene expression programming framework to automatically generate the acceptance criterion for each problem instance, instead of using human-designed criteria. Two well-known, and very different, combinatorial optimization problems, one static (exam timetabling) and one dynamic (dynamic vehicle routing), are used to demonstrate the generality of the proposed framework. Compared with state-of-the-art hyper-heuristics and other bespoke methods, empirical results demonstrate that the proposed framework is able to generalize well across both domains. We obtain competitive, if not better, results when compared with the best known results obtained by other methods that have been presented in the scientific literature. We also compare our approach against the recently released hyper-heuristic competition test suite. We again demonstrate the generality of our approach when we compare against other methods that have utilized the same six benchmark datasets from this test suite. PMID:24951713
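
    A greatly simplified sketch of the selection layer is given below: a UCB1-style multiarmed bandit chooses which low-level heuristic to apply at each iteration, with a plain improvement-based reward and greedy acceptance. The paper's dynamic extreme-value reward and evolved acceptance criteria are not reproduced; the names, reward, and toy usage are illustrative assumptions.

    ```python
    import math
    import random

    def bandit_hyper_heuristic(solution, low_level_heuristics, evaluate,
                               iterations=1000, c=1.0):
        """Pick one low-level heuristic per iteration with a UCB1 rule; accept improvements."""
        k = len(low_level_heuristics)
        counts, rewards = [0] * k, [0.0] * k
        best, best_score = solution, evaluate(solution)
        for t in range(1, iterations + 1):
            if t <= k:
                arm = t - 1                                   # play every arm once first
            else:
                arm = max(range(k), key=lambda i: rewards[i] / counts[i]
                          + c * math.sqrt(math.log(t) / counts[i]))
            candidate = low_level_heuristics[arm](best)
            score = evaluate(candidate)
            counts[arm] += 1
            rewards[arm] += 1.0 if score < best_score else 0.0   # reward improving moves
            if score < best_score:                               # simple greedy acceptance
                best, best_score = candidate, score
        return best, best_score

    # Toy usage: minimize x**2 with two perturbation heuristics (assumed for illustration).
    h1 = lambda x: x + random.uniform(-1.0, 1.0)
    h2 = lambda x: x * 0.9
    print(bandit_hyper_heuristic(10.0, [h1, h2], evaluate=lambda x: x * x))
    ```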

  12. A System for Automatically Generating Scheduling Heuristics

    NASA Technical Reports Server (NTRS)

    Morris, Robert

    1996-01-01

    The goal of this research is to improve the performance of automated schedulers by designing and implementing an algorithm that automatically generates heuristics for selecting a schedule. The particular application to which this method is applied solves the problem of scheduling telescope observations and is called the Associate Principal Astronomer. The input to the APA scheduler is a set of observation requests submitted by one or more astronomers. Each observation request specifies an observation program as well as scheduling constraints and preferences associated with the program. The scheduler employs greedy heuristic search to synthesize a schedule that satisfies all hard constraints of the domain and achieves a good score with respect to soft constraints expressed as an objective function established by an astronomer-user.

  13. Numerical Solution to Generalized Burgers'-Fisher Equation Using Exp-Function Method Hybridized with Heuristic Computation

    PubMed Central

    Malik, Suheel Abdullah; Qureshi, Ijaz Mansoor; Amir, Muhammad; Malik, Aqdas Naveed; Haq, Ihsanul

    2015-01-01

    In this paper, a new heuristic scheme for the approximate solution of the generalized Burgers'-Fisher equation is proposed. The scheme is based on the hybridization of the Exp-function method with a nature-inspired algorithm. The given nonlinear partial differential equation (NPDE) is converted through substitution into a nonlinear ordinary differential equation (NODE). The travelling wave solution is approximated by the Exp-function method with unknown parameters. The unknown parameters are estimated by transforming the NODE into an equivalent global error minimization problem by using a fitness function. The popular genetic algorithm (GA) is used to solve the minimization problem and to obtain the unknown parameters. The proposed scheme is successfully implemented to solve the generalized Burgers'-Fisher equation. The comparison of numerical results with the exact solutions, and with the solutions obtained using some traditional methods, including the Adomian decomposition method (ADM), homotopy perturbation method (HPM), and optimal homotopy asymptotic method (OHAM), shows that the suggested scheme is fairly accurate and viable for solving such problems. PMID:25811858

  14. Applications of Meta-heuristics to Power and Energy Fields

    NASA Astrophysics Data System (ADS)

    Fukuyama, Yoshikazu

    Considering deregulation in power systems and the energy conservation law, power and energy systems require greater cost and energy reductions in system planning, operation, and control. Optimization techniques such as linear and nonlinear programming have been utilized as one means of realizing these reductions. Recently, meta-heuristics such as genetic algorithms, simulated annealing, tabu search, and particle swarm optimization have attracted attention as further options for realizing them. In the power and energy society, there has been one technical committee and one special issue on applications of meta-heuristics to power systems. Moreover, panel sessions and tutorials have been held at IEEE and IFAC. This paper presents applications of meta-heuristics to power and energy fields from the practical application point of view.

  15. The Development of a Highly Interactive Searching Technique for MORPHS--Minicomputer Operated Retrieval (Partially Heuristic) System.

    ERIC Educational Resources Information Center

    Bell, C. L. M.; Jones, K. P.

    1980-01-01

    Explains, with supporting figures and flowcharts of programing logic, two search strategies introduced to the MORPHS System since 1976: one that employs the normal Boolean operators in strings without bracketing or in the form of marked steps, and one that treats a string of keywords as a compound word. (Author/JD)

  16. Pattern Search Methods for Linearly Constrained Minimization

    NASA Technical Reports Server (NTRS)

    Lewis, Robert Michael; Torczon, Virginia

    1998-01-01

    We extend pattern search methods to linearly constrained minimization. We develop a general class of feasible point pattern search algorithms and prove global convergence to a Karush-Kuhn-Tucker point. As in the case of unconstrained minimization, pattern search methods for linearly constrained problems accomplish this without explicit recourse to the gradient or the directional derivative. Key to the analysis of the algorithms is the way in which the local search patterns conform to the geometry of the boundary of the feasible region.

  17. Heuristics for chemical compound matching.

    PubMed

    Hattori, Masahiro; Okuno, Yasushi; Goto, Susumu; Kanehisa, Minoru

    2003-01-01

    We have developed an efficient algorithm for comparing two chemical compounds, where the chemical structure is treated as a 2D graph consisting of atoms as vertices and covalent bonds as edges. Based on the concept of functional groups in chemistry, 68 atom types (vertex types) are defined for carbon, nitrogen, oxygen, and other atomic species with different environments, which has enabled detection of biochemically meaningful features. Maximal common subgraphs of two graphs can be found by searching for maximal cliques in the association graph, and we have introduced heuristics to accelerate the clique finding. Our heuristic procedure is controlled by some adjustable parameters. Here we applied our procedure to the latest KEGG/LIGAND database with different sets of parameters, and demonstrated the correlation of parameters in our algorithm with the distribution of similarity scores and/or the execution time. Finally, we showed the effectiveness of our heuristics for compound pairs along metabolic pathways. PMID:15706529
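
    The clique formulation can be illustrated compactly. The sketch below builds the association graph of two small labelled graphs (pairs of same-label atoms, joined when their adjacency agrees in both molecules) and grows a clique greedily; the 68 atom types, the real molecules, and the paper's acceleration heuristics and parameters are all simplified away, so this is an assumption-laden illustration rather than the published algorithm.

    ```python
    import itertools

    def association_graph(g1, g2):
        """g = (labels, bonds): labels maps atom -> type, bonds is a set of frozenset pairs.
        Vertices pair same-type atoms; edges join pairs whose adjacency agrees in both graphs."""
        (l1, e1), (l2, e2) = g1, g2
        vertices = [(u, v) for u in l1 for v in l2 if l1[u] == l2[v]]
        edges = set()
        for (u1, v1), (u2, v2) in itertools.combinations(vertices, 2):
            if u1 == u2 or v1 == v2:
                continue
            if (frozenset((u1, u2)) in e1) == (frozenset((v1, v2)) in e2):
                edges.add(frozenset(((u1, v1), (u2, v2))))
        return vertices, edges

    def greedy_clique(vertices, edges):
        """Heuristic clique growth from every start vertex; keep the largest clique found."""
        best = []
        for start in vertices:
            clique = [start]
            for v in vertices:
                if v != start and all(frozenset((v, u)) in edges for u in clique):
                    clique.append(v)
            if len(clique) > len(best):
                best = clique
        return best

    # Toy molecules (assumed labels): a C-C-O chain matched against a C-O fragment.
    mol1 = ({1: "C", 2: "C", 3: "O"}, {frozenset((1, 2)), frozenset((2, 3))})
    mol2 = ({1: "C", 2: "O"}, {frozenset((1, 2))})
    print(greedy_clique(*association_graph(mol1, mol2)))      # pairs forming the common C-O
    ```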

  18. Search systems and computer-implemented search methods

    DOEpatents

    Payne, Deborah A.; Burtner, Edwin R.; Bohn, Shawn J.; Hampton, Shawn D.; Gillen, David S.; Henry, Michael J.

    2015-12-22

    Search systems and computer-implemented search methods are described. In one aspect, a search system includes a communications interface configured to access a plurality of data items of a collection, wherein the data items include a plurality of image objects individually comprising image data utilized to generate an image of the respective data item. The search system may include processing circuitry coupled with the communications interface and configured to process the image data of the data items of the collection to identify a plurality of image content facets which are indicative of image content contained within the images and to associate the image objects with the image content facets and a display coupled with the processing circuitry and configured to depict the image objects associated with the image content facets.

  19. Implied alignment: a synapomorphy-based multiple-sequence alignment method and its use in cladogram search

    NASA Technical Reports Server (NTRS)

    Wheeler, Ward C.

    2003-01-01

    A method to align sequence data based on parsimonious synapomorphy schemes generated by direct optimization (DO; earlier termed optimization alignment) is proposed. DO directly diagnoses sequence data on cladograms without an intervening multiple-alignment step, thereby creating topology-specific, dynamic homology statements. Hence, no multiple-alignment is required to generate cladograms. Unlike general and globally optimal multiple-alignment procedures, the method described here, implied alignment (IA), takes these dynamic homologies and traces them back through a single cladogram, linking the unaligned sequence positions in the terminal taxa via DO transformation series. These "lines of correspondence" link ancestor-descendent states and, when displayed as linearly arrayed columns without hypothetical ancestors, are largely indistinguishable from standard multiple alignment. Since this method is based on synapomorphy, the treatment of certain classes of insertion-deletion (indel) events may be different from that of other alignment procedures. As with all alignment methods, results are dependent on parameter assumptions such as indel cost and transversion:transition ratios. Such an IA could be used as a basis for phylogenetic search, but this would be questionable since the homologies derived from the implied alignment depend on its natal cladogram and any variance, between DO and IA + Search, due to heuristic approach. The utility of this procedure in heuristic cladogram searches using DO and the improvement of heuristic cladogram cost calculations are discussed.

  20. Implied alignment: a synapomorphy-based multiple-sequence alignment method and its use in cladogram search.

    PubMed

    Wheeler, Ward C

    2003-06-01

    A method to align sequence data based on parsimonious synapomorphy schemes generated by direct optimization (DO; earlier termed optimization alignment) is proposed. DO directly diagnoses sequence data on cladograms without an intervening multiple-alignment step, thereby creating topology-specific, dynamic homology statements. Hence, no multiple-alignment is required to generate cladograms. Unlike general and globally optimal multiple-alignment procedures, the method described here, implied alignment (IA), takes these dynamic homologies and traces them back through a single cladogram, linking the unaligned sequence positions in the terminal taxa via DO transformation series. These "lines of correspondence" link ancestor-descendent states and, when displayed as linearly arrayed columns without hypothetical ancestors, are largely indistinguishable from standard multiple alignment. Since this method is based on synapomorphy, the treatment of certain classes of insertion-deletion (indel) events may be different from that of other alignment procedures. As with all alignment methods, results are dependent on parameter assumptions such as indel cost and transversion:transition ratios. Such an IA could be used as a basis for phylogenetic search, but this would be questionable since the homologies derived from the implied alignment depend on its natal cladogram and any variance, between DO and IA + Search, due to heuristic approach. The utility of this procedure in heuristic cladogram searches using DO and the improvement of heuristic cladogram cost calculations are discussed. PMID:12901383

  1. A Meta-heuristic Approach for Variants of VRP in Terms of Generalized Saving Method

    NASA Astrophysics Data System (ADS)

    Shimizu, Yoshiaki

    Global logistic design is becoming of keen interest as a way to provide an essential infrastructure associated with modern societal provision. For example, we can point to green and/or robust logistics in transportation systems, smart grids in electricity utilization systems, qualified service in delivery systems, and so on. As a key technology for such deployments, we have engaged in a practical vehicle routing problem on the basis of the conventional saving method. This paper extends this idea and gives a general framework applicable to various real-world applications. It can cover not only delivery problems but also two kinds of pick-up problems, i.e., straight and drop-by routings. Moreover, the multi-depot problem is considered through a hybrid approach with a graph algorithm, and its solution method is realized in a hierarchical manner. Numerical experiments have been carried out to validate the effectiveness of the proposed method.
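
    The "saving" at the heart of the conventional method is the distance avoided by serving two customers on one route instead of two separate depot round trips. The sketch below is a minimal single-depot Clarke-Wright-style savings heuristic with made-up data; the paper's generalized, multi-depot, pick-up-aware framework is not reproduced here.

    ```python
    def savings_routes(depot, customers, dist, demand, capacity):
        """Basic savings heuristic: merge routes in decreasing order of
        s(i, j) = d(depot, i) + d(depot, j) - d(i, j), respecting vehicle capacity."""
        routes = [[c] for c in customers]                     # start with one route per customer
        route_of = {c: idx for idx, c in enumerate(customers)}
        loads = [demand[c] for c in customers]
        savings = sorted(((dist[depot][i] + dist[depot][j] - dist[i][j], i, j)
                          for i in customers for j in customers if i < j), reverse=True)
        for s, i, j in savings:
            ri, rj = route_of[i], route_of[j]
            if ri == rj or s <= 0 or loads[ri] + loads[rj] > capacity:
                continue
            a, b = routes[ri], routes[rj]
            if a[-1] == i and b[0] == j:                      # merge only at route endpoints
                merged = a + b
            elif b[-1] == j and a[0] == i:
                merged = b + a
            else:
                continue
            routes[ri], routes[rj] = merged, []
            loads[ri] += loads[rj]
            for c in merged:
                route_of[c] = ri
        return [r for r in routes if r]

    # Tiny symmetric instance (assumed distances, unit demands, capacity of three stops).
    dist = {0: {0: 0, 1: 4, 2: 5, 3: 6}, 1: {0: 4, 1: 0, 2: 2, 3: 5},
            2: {0: 5, 1: 2, 2: 0, 3: 3}, 3: {0: 6, 1: 5, 2: 3, 3: 0}}
    print(savings_routes(0, [1, 2, 3], dist, demand={1: 1, 2: 1, 3: 1}, capacity=3))
    ```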

  2. Paging Doctor Google! Heuristics vs. technology.

    PubMed

    Jhaveri, Kenar D; Schrier, Peter B; Mattana, Joseph

    2013-01-01

    The most dramatic development in medical decision-making technology has been the advent of the Internet. This has had an impact not only on clinicians, but has also become an important resource for patients who often approach their doctors with medical information they have obtained from the Internet.  Increasingly, medical students, residents and attending physicians have been using the Internet as a tool for diagnosing and treating disease. Internet-based resources that are available take various forms, including informational websites, online journals and textbooks, and social media.  Search engines such as Google have been increasingly used to help in making diagnoses of disease entities. Do these search methods fare better than experienced heuristic methods? In a small study, we examined the comparative role of heuristics versus the 'Google' mode of thinking. Internal medicine residents were asked to "google" key words to come up with a diagnosis. Their results were compared to experienced nephrology faculty and fellows in training using heuristics and no additional help of internet. Overall, with the aid of Google, the novices (internal medicine residents) correctly diagnosed renal diseases less often than the experts (the attendings) but with the same frequency as the intermediates (nephrology fellows).  However, in a subgroup analysis of both common diseases and rare diseases, the novices correctly diagnosed renal diseases less often than the experts but more often than the intermediates in each analysis.  The novices correctly diagnosed renal diseases with the same frequency as nephrology fellows in training. PMID:24627777

  3. Paging Doctor Google! Heuristics vs. technology

    PubMed Central

    Jhaveri, Kenar D

    2013-01-01

    The most dramatic development in medical decision-making technology has been the advent of the Internet. This has had an impact not only on clinicians, but has also become an important resource for patients who often approach their doctors with medical information they have obtained from the Internet.  Increasingly, medical students, residents and attending physicians have been using the Internet as a tool for diagnosing and treating disease. Internet-based resources that are available take various forms, including informational websites, online journals and textbooks, and social media.  Search engines such as Google have been increasingly used to help in making diagnoses of disease entities. Do these search methods fare better than experienced heuristic methods? In a small study, we examined the comparative role of heuristics versus the 'Google' mode of thinking. Internal medicine residents were asked to “google” key words to come up with a diagnosis. Their results were compared to experienced nephrology faculty and fellows in training using heuristics and no additional help of internet. Overall, with the aid of Google, the novices (internal medicine residents) correctly diagnosed renal diseases less often than the experts (the attendings) but with the same frequency as the intermediates (nephrology fellows).  However, in a subgroup analysis of both common diseases and rare diseases, the novices correctly diagnosed renal diseases less often than the experts but more often than the intermediates in each analysis.  The novices correctly diagnosed renal diseases with the same frequency as nephrology fellows in training. PMID:24627777

  4. Paradigms or Toolkits? Philosophical and Methodological Positions as Heuristics for Mixed Methods Research

    ERIC Educational Resources Information Center

    Maxwell, Joseph A.

    2011-01-01

    In this article, the author challenges the validity and usefulness of the concept of "paradigm," as this term has been used in the social sciences generally, and specifically in the debates over research methods. He emphasizes that in criticizing what he sees as the misuse of the paradigm concept, he is not arguing for dismissing or ignoring…

  5. Fixing Dataset Search

    NASA Technical Reports Server (NTRS)

    Lynnes, Chris

    2014-01-01

    Three current search engines are queried for ozone data at the GES DISC. The results range from sub-optimal to counter-intuitive. We propose a method to fix dataset search by implementing a robust relevancy ranking scheme. The relevancy ranking scheme is based on several heuristics culled from more than 20 years of helping users select datasets.

  6. Formal and heuristic system decomposition methods in multidisciplinary synthesis. Ph.D. Thesis, 1991

    NASA Technical Reports Server (NTRS)

    Bloebaum, Christina L.

    1991-01-01

    The multidisciplinary interactions which exist in large scale engineering design problems provide a unique set of difficulties. These difficulties are associated primarily with unwieldy numbers of design variables and constraints, and with the interdependencies of the discipline analysis modules. Such obstacles require design techniques which account for the inherent disciplinary couplings in the analyses and optimizations. The objective of this work was to develop an efficient holistic design synthesis methodology that takes advantage of the synergistic nature of integrated design. A general decomposition approach for optimization of large engineering systems is presented. The method is particularly applicable for multidisciplinary design problems which are characterized by closely coupled interactions among discipline analyses. The advantage of subsystem modularity allows for implementation of specialized methods for analysis and optimization, computational efficiency, and the ability to incorporate human intervention and decision making in the form of an expert systems capability. The resulting approach is not a method applicable to only a specific situation, but rather, a methodology which can be used for a large class of engineering design problems in which the system is non-hierarchic in nature.

  7. Debris flow susceptibility mapping using a qualitative heuristic method and Flow-R along the Yukon Alaska Highway Corridor, Canada

    NASA Astrophysics Data System (ADS)

    Blais-Stevens, A.; Behnia, P.

    2016-02-01

    This research activity aimed at reducing risk to infrastructure, such as a proposed pipeline route roughly parallel to the Yukon Alaska Highway Corridor (YAHC), by filling geoscience knowledge gaps in geohazards. Hence, the Geological Survey of Canada compiled an inventory of landslides including debris flow deposits, which were subsequently used to validate two different debris flow susceptibility models. A qualitative heuristic debris flow susceptibility model was produced for the northern region of the YAHC, from Kluane Lake to the Alaska border, by integrating data layers with assigned weights and class ratings. These were slope angle, slope aspect, surficial geology, plan curvature, and proximity to drainage system. Validation of the model was carried out by calculating a success rate curve which revealed a good correlation with the susceptibility model and the debris flow deposit inventory compiled from air photos, high-resolution satellite imagery, and field verification. In addition, the quantitative Flow-R method was tested in order to define the potential source and debris flow susceptibility for the southern region of Kluane Lake, an area where documented debris flow events have blocked the highway in the past (e.g. 1988). Trial and error calculations were required for this method because there was not detailed information on the debris flows for the YAHC to allow us to define threshold values for some parameters when calculating source areas, spreading, and runout distance. Nevertheless, correlation with known documented events helped define these parameters and produce a map that captures most of the known events and displays debris flow susceptibility in other, usually smaller, steep channels that had not been previously documented.

  8. Automatic Generation of Heuristics for Scheduling

    NASA Technical Reports Server (NTRS)

    Morris, Robert A.; Bresina, John L.; Rodgers, Stuart M.

    1997-01-01

    This paper presents a technique, called GenH, that automatically generates search heuristics for scheduling problems. The impetus for developing this technique is the growing consensus that heuristics encode advice that is, at best, useful in solving most, or typical, problem instances, and, at worst, useful in solving only a narrowly defined set of instances. In either case, heuristic problem solvers, to be broadly applicable, should have a means of automatically adjusting to the idiosyncrasies of each problem instance. GenH generates a search heuristic for a given problem instance by hill-climbing in the space of possible multi-attribute heuristics, where the evaluation of a candidate heuristic is based on the quality of the solution found under its guidance. We present empirical results obtained by applying GenH to the real-world problem of telescope observation scheduling. These results demonstrate that GenH is a simple and effective way of improving the performance of a heuristic scheduler.
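
    Hill-climbing in a space of multi-attribute heuristics can be pictured as searching over attribute weights, scoring each weight vector by the quality of the schedule the resulting dispatch rule produces. The fragment below is an illustrative sketch under that reading; the attribute names, the toy schedule builder, and the quality measure are all assumptions, not GenH itself.

    ```python
    import random

    def hill_climb_heuristic(attributes, build_schedule, schedule_quality,
                             steps=200, step_size=0.1):
        """Hill-climb over weight vectors of a multi-attribute dispatch heuristic;
        each candidate is judged by the quality of the schedule it produces."""
        weights = {a: random.random() for a in attributes}
        best_score = schedule_quality(build_schedule(weights))
        for _ in range(steps):
            candidate = dict(weights)
            attr = random.choice(attributes)
            candidate[attr] += random.uniform(-step_size, step_size)   # perturb one weight
            score = schedule_quality(build_schedule(candidate))
            if score > best_score:                                     # keep only improvements
                weights, best_score = candidate, score
        return weights, best_score

    # Toy usage: "scheduling" is just ordering jobs by weighted attribute score (assumed data).
    jobs = [{"priority": 3, "slack": 1}, {"priority": 1, "slack": 5}, {"priority": 2, "slack": 2}]
    build = lambda w: sorted(jobs, key=lambda j: -sum(w[a] * j[a] for a in w))
    quality = lambda sched: -sum(pos * j["priority"] for pos, j in enumerate(sched))
    print(hill_climb_heuristic(["priority", "slack"], build, quality))
    ```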

  9. Method for Loss Minimum Re-configuration Problem of Distribution System by Tabu Search

    NASA Astrophysics Data System (ADS)

    Mishima, Yuji; Nara, Koichi; Satoh, Taiji; Ito, Takamitsu; Kaneda, Hirotoshi

    This paper proposes a loss-minimum reconfiguration method using Tabu Search for open-loop radial distribution systems with distributed generators. The problem is to find the optimal normally open sectionalizing switch positions that minimize the total distribution line losses subject to line/transformer capacity constraints and a voltage constraint. Generally, the problem is mathematically formulated as a complex combinatorial optimization problem or a mixed integer programming problem, and it is solved using mathematical programming methods, heuristic algorithms, intelligent methods, etc. However, an algorithm satisfactory for power companies has not yet been attained in terms of both computational burden and solution accuracy. Thus, in this paper, the authors propose a method to solve the above problem by using the Tabu Search (TS) method. Reverse power flow caused by distributed generators can be included in the solution algorithm. TS is a meta-heuristic algorithm that is sometimes evaluated as better than Genetic Algorithms (GA) or Simulated Annealing (SA) from the viewpoints of both computational speed and solution accuracy. In order to evaluate the validity and efficiency of the algorithm, several numerical examples are shown in this paper.
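
    The general TS scheme the paper builds on can be sketched as follows; the loss evaluation, the neighbourhood (switch open/close moves), and the feasibility check are left abstract, so this is a generic illustration rather than the authors' algorithm.

    ```python
    def tabu_search(initial, neighbors, loss, feasible, tenure=7, iterations=500):
        """Generic tabu search: move to the best admissible neighbour each step (even if it
        is worse), keep recently visited configurations tabu, and remember the best seen."""
        current = best = initial
        best_loss = loss(best)
        tabu = []                                              # fixed-length tabu list
        for _ in range(iterations):
            candidates = [n for n in neighbors(current) if feasible(n) and n not in tabu]
            if not candidates:
                break
            current = min(candidates, key=loss)
            tabu.append(current)
            if len(tabu) > tenure:
                tabu.pop(0)
            if loss(current) < best_loss:
                best, best_loss = current, loss(current)
        return best, best_loss
    ```

    Practical implementations typically also add aspiration criteria and keep moves rather than whole configurations on the tabu list, but the skeleton above captures the basic mechanism.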

  10. Impacts of Learning Inventive Problem-Solving Principles: Students' Transition from Systematic Searching to Heuristic Problem Solving

    ERIC Educational Resources Information Center

    Barak, Moshe

    2013-01-01

    This paper presents the outcomes of teaching an inventive problem-solving course in junior high schools in an attempt to deal with the current relative neglect of fostering students' creativity and problem-solving capabilities in traditional schooling. The method involves carrying out systematic manipulation with attributes, functions and…

  11. Prediction-based dynamic load-sharing heuristics

    NASA Technical Reports Server (NTRS)

    Goswami, Kumar K.; Devarakonda, Murthy; Iyer, Ravishankar K.

    1993-01-01

    The authors present dynamic load-sharing heuristics that use predicted resource requirements of processes to manage workloads in a distributed system. A previously developed statistical pattern-recognition method is employed for resource prediction. While nonprediction-based heuristics depend on a rapidly changing system status, the new heuristics depend on slowly changing program resource usage patterns. Furthermore, prediction-based heuristics can be more effective since they use future requirements rather than just the current system state. Four prediction-based heuristics, two centralized and two distributed, are presented. Using trace driven simulations, they are compared against random scheduling and two effective nonprediction based heuristics. Results show that the prediction-based centralized heuristics achieve up to 30 percent better response times than the nonprediction centralized heuristic, and that the prediction-based distributed heuristics achieve up to 50 percent improvements relative to their nonprediction counterpart.

  12. Multiobjective Tabu Search method used in chemistry

    NASA Astrophysics Data System (ADS)

    Rusu, T.; Bulacovschi, V.

    The use of a combined artificial intelligence method in macromolecular chemistry design is described. This method combines a Back-Propagation (BP) neural network, modified for two-dimensional input data, with a system composed of a genetic algorithm extended by a Tabu Search operator used to incorporate high-level chemical knowledge: thermodynamic polymer properties.

  13. Job Search Methods: Internet versus Traditional.

    ERIC Educational Resources Information Center

    Kuhn, Peter; Skuterud, Mikal

    2000-01-01

    In 1998, 15 percent of unemployed job seekers used the Internet to seek jobs, as did half of all job seekers with online access from home. Internet search rates exceeded those of traditional methods, but Internet job seekers were more likely to use traditional methods as well. Unemployed blacks and Hispanics used the Internet least in job…

  14. Learning process mapping heuristics under stochastic sampling overheads

    NASA Technical Reports Server (NTRS)

    Ieumwananonthachai, Arthur; Wah, Benjamin W.

    1991-01-01

    A statistical method was developed previously for improving process mapping heuristics. The method systematically explores the space of possible heuristics under a specified time constraint. Its goal is to get the best possible heuristics while trading off between the solution quality of the process mapping heuristics and their execution time. The statistical selection method is extended to take into consideration the variations in the amount of time used to evaluate heuristics on a problem instance. The improvement in performance under this more realistic assumption is presented, along with some methods that alleviate the additional complexity.

  15. SEARCHING FOR RAPID METHODS IN ENVIRONMENTAL BACTERIOLOGY

    EPA Science Inventory

    The search for rapid methods in sanitary bacteriology is more urgent today than ever before because of increased necessity for processing poorer quality source waters and controlling quality of sewage effluent discharges. Selection of criteria for rapid tests involving either mod...

  16. Genetic algorithms as global random search methods

    NASA Technical Reports Server (NTRS)

    Peck, Charles C.; Dhawan, Atam P.

    1995-01-01

    Genetic algorithm behavior is described in terms of the construction and evolution of the sampling distributions over the space of candidate solutions. This novel perspective is motivated by analysis indicating that schema theory is inadequate for completely and properly explaining genetic algorithm behavior. Based on the proposed theory, it is argued that the similarities of candidate solutions should be exploited directly, rather than encoding candidate solutions and then exploiting their similarities. Proportional selection is characterized as a global search operator, and recombination is characterized as the search process that exploits similarities. Sequential algorithms and many deletion methods are also analyzed. It is shown that by properly constraining the search breadth of recombination operators, convergence of genetic algorithms to a global optimum can be ensured.
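
    To make the roles of proportional selection and recombination concrete, a minimal generic genetic algorithm on bit strings is sketched below; it illustrates the standard operators discussed in the record, not the report's sampling-distribution analysis, and the toy objective is an assumption.

    ```python
    # Minimal genetic algorithm with proportional (roulette-wheel) selection and
    # one-point recombination on bit strings; a toy illustration only.
    import random

    def fitness(individual):                 # toy objective: count of ones
        return sum(individual) + 1e-9        # small offset keeps all selection weights positive

    def select_parents(population):          # fitness-proportionate (proportional) selection
        weights = [fitness(ind) for ind in population]
        return random.choices(population, weights=weights, k=2)

    def recombine(a, b):                     # one-point crossover exploits shared substrings
        point = random.randrange(1, len(a))
        return a[:point] + b[point:]

    def mutate(individual, rate=0.01):
        return [1 - g if random.random() < rate else g for g in individual]

    def genetic_algorithm(n_bits=20, pop_size=30, generations=50):
        population = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
        for _ in range(generations):
            population = [mutate(recombine(*select_parents(population))) for _ in range(pop_size)]
        return max(population, key=fitness)

    if __name__ == "__main__":
        best = genetic_algorithm()
        print("best individual:", best, "ones:", sum(best))
    ```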

  18. Heuristic Evaluation of E-Learning Courses: A Comparative Analysis of Two E-Learning Heuristic Sets

    ERIC Educational Resources Information Center

    Zaharias, Panagiotis; Koutsabasis, Panayiotis

    2012-01-01

    Purpose: The purpose of this paper is to discuss heuristic evaluation as a method for evaluating e-learning courses and applications and more specifically to investigate the applicability and empirical use of two customized e-learning heuristic protocols. Design/methodology/approach: Two representative e-learning heuristic protocols were chosen…

  19. Solving systems of inequalities and equalities by a nonmonotone hybrid tabu search method

    NASA Astrophysics Data System (ADS)

    Ramadas, Gisela C. V.; Fernandes, Edite M. G. P.

    2012-09-01

    This paper presents a derivative-free nonmonotone hybrid tabu search to compute a solution of overdetermined systems of inequalities and equalities through the global optimization of an appropriate merit function. The proposed algorithm combines global and local searches aiming to reduce computational effort. Preliminary numerical results show the effectiveness of the combined heuristic.

  20. Trends in Job Search Methods, 1970-92.

    ERIC Educational Resources Information Center

    Ports, Michelle Harrison

    1993-01-01

    Updates the existing research on job search behaviors and analyzes trends in job search behavior since the 1970s. Tables show data on search methods by age, sex, race, Hispanic origin, type of work sought, and reason for unemployment. (JOW)

  1. Heuristic Inquiry: A Personal Journey of Acculturation and Identity Reconstruction

    ERIC Educational Resources Information Center

    Djuraskovic, Ivana; Arthur, Nancy

    2010-01-01

    Heuristic methodology attempts to discover the nature and meaning of phenomenon through internal self-search, exploration, and discovery. Heuristic methodology encourages the researcher to explore and pursue the creative journey that begins inside one's being and ultimately uncovers its direction and meaning through internal discovery (Douglass &…

  2. Establishing usability heuristics for heuristics evaluation in a specific domain: Is there a consensus?

    PubMed

    Hermawati, Setia; Lawson, Glyn

    2016-09-01

    Heuristics evaluation is frequently employed to evaluate usability. While general heuristics are suitable for evaluating most user interfaces, there is still a need to establish heuristics for specific domains to ensure that their specific usability issues are identified. This paper presents a comprehensive review of 70 studies related to usability heuristics for specific domains. The aim is to review the processes that were applied to establish heuristics in specific domains and to identify gaps in order to provide recommendations for future research and areas of improvement. The most urgent issue found is the deficiency of validation effort following the proposal of heuristics and the lack of robustness and rigour in the validation methods adopted. Whether domain-specific heuristics perform better or worse than general ones remains inconclusive, owing to the lack of validation quality and of clarity on how to assess the effectiveness of heuristics for specific domains. The lack of validation quality also hampers efforts to improve existing heuristics for specific domains, as their weaknesses are not addressed. PMID:27184309

  3. Method of searching for neutron clusters

    NASA Astrophysics Data System (ADS)

    Dudkin, G. N.; Garapatskii, A. A.; Padalko, V. N.

    2014-10-01

    A new method of searching for neutron clusters (multineutrons) composed of neutrons bound by nuclear forces has been introduced and implemented. The method is based on the search for daughter nuclei that emerge in the cluster decay of 252Cf into neutron clusters. The effect of the long-term build-up of daughter nuclei with a high atomic number and a long half-life was utilized. The results are interpreted as evidence of the cluster decay of 252Cf to the daughter nucleus 232U (half-life T1/2 = 68.9 years). The emergence of 232U is attributed to the emission of neutron clusters consisting of eight neutrons, i.e., octaneutrons. The emission probability of octaneutrons relative to the α-decay probability of 252Cf is determined to be λC/λα = 1.74×10^-6.

  4. Harmony Search Method: Theory and Applications

    PubMed Central

    Gao, X. Z.; Govindasamy, V.; Xu, H.; Wang, X.; Zenger, K.

    2015-01-01

    The Harmony Search (HS) method is an emerging metaheuristic optimization algorithm, which has been employed to cope with numerous challenging tasks during the past decade. In this paper, the essential theory and applications of the HS algorithm are first described and reviewed. Several typical variants of the original HS are next briefly explained. As an example of case study, a modified HS method inspired by the idea of Pareto-dominance-based ranking is also presented. It is further applied to handle a practical wind generator optimal design problem. PMID:25945083
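
    For readers unfamiliar with the algorithm, a minimal harmony search for continuous minimization is sketched below, assuming the usual parameter names (harmony memory considering rate HMCR, pitch adjusting rate PAR); the Pareto-ranking variant and the wind-generator design problem from the paper are not reproduced.

    ```python
    # Minimal harmony search for minimizing a continuous function (illustrative only).
    import random

    def harmony_search(objective, bounds, hm_size=10, hmcr=0.9, par=0.3,
                       bandwidth=0.05, iterations=2000):
        memory = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(hm_size)]
        scores = [objective(h) for h in memory]
        for _ in range(iterations):
            new = []
            for d, (lo, hi) in enumerate(bounds):
                if random.random() < hmcr:                     # consider the harmony memory
                    value = memory[random.randrange(hm_size)][d]
                    if random.random() < par:                  # pitch adjustment
                        value += random.uniform(-1, 1) * bandwidth * (hi - lo)
                else:                                          # random improvisation
                    value = random.uniform(lo, hi)
                new.append(min(max(value, lo), hi))
            new_score = objective(new)
            worst = max(range(hm_size), key=lambda i: scores[i])
            if new_score < scores[worst]:                      # replace the worst harmony
                memory[worst], scores[worst] = new, new_score
        best = min(range(hm_size), key=lambda i: scores[i])
        return memory[best], scores[best]

    if __name__ == "__main__":
        print(harmony_search(lambda x: sum(v * v for v in x), bounds=[(-5, 5)] * 3))
    ```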

  5. Finite difference methods for reducing numerical diffusion in TEACH-type calculations. [Teaching Elliptic Axisymmetric Characteristics Heuristically

    NASA Technical Reports Server (NTRS)

    Syed, S. A.; Chiappetta, L. M.

    1985-01-01

    A methodological evaluation of two finite-differencing schemes for computer-aided gas turbine design is presented. The two computational schemes are a Bounded Skewed Upwind Differencing Scheme (BSUDS) and a Quadratic Upwind Differencing Scheme (QUDS). In the evaluation, the derivations of the schemes were incorporated into two-dimensional and three-dimensional versions of the Teaching Elliptic Axisymmetric Characteristics Heuristically (TEACH) computer code. Assessments were made according to performance criteria for the solution of problems of turbulent, laminar, and coannular turbulent flow. The specific performance criteria used in the evaluation were simplicity, accuracy, and computational economy. It was found that the BSUDS scheme performed better with respect to these criteria than the QUDS. Some of the reasons for the more successful performance of BSUDS are discussed.

  6. Calibration of Biokinetic and Biological Parameters for a Groundwater Bioremediation Model using Heuristics and Function Approximation Optimization

    NASA Astrophysics Data System (ADS)

    Mugunthan, P.; Shoemaker, C. A.; Regis, R. G.

    2003-12-01

    Heuristics and function approximation optimization methods were applied to calibrate biological and biokinetic parameters of a computationally expensive groundwater bioremediation model for engineered reductive dechlorination of chlorinated ethenes. Multi-species groundwater bioremediation models that use Monod-type kinetics are often not amenable to traditional derivative-based optimization due to stiff biokinetic equations. The performance of three heuristic methods, Stochastic Greedy Search (GS), Real Genetic Algorithm (RGA), and Derandomized Evolution Strategy (DES), and of Function Approximation Optimization based on Radial Basis Functions (FA-RBF) was compared on three-dimensional hypothetical and field problems. GS was implemented so as to perform a more global search. Optimization results on the hypothetical problem indicated that FA-RBF performed statistically significantly better than the heuristic evolutionary algorithms at a 10% significance level. Further, this particular implementation of GS performed well and proved superior to RGA. These heuristic methods and FA-RBF, with the exception of RGA, were applied to calibrate biological and biokinetic parameters using treatability test data for enhanced bioremediation at a Naval Air Station in Alameda Point, CA. All three methods performed well and identified similar solutions. The approximate simulation times for the hypothetical and real problems were 7 min and 2.5 hours, respectively. Calibration of such computationally expensive models by heuristic and function approximation methods appears promising.

  7. Fluency Heuristic: A Model of How the Mind Exploits a By-Product of Information Retrieval

    ERIC Educational Resources Information Center

    Hertwig, Ralph; Herzog, Stefan M.; Schooler, Lael J.; Reimer, Torsten

    2008-01-01

    Boundedly rational heuristics for inference can be surprisingly accurate and frugal for several reasons. They can exploit environmental structures, co-opt complex capacities, and elude effortful search by exploiting information that automatically arrives on the mental stage. The fluency heuristic is a prime example of a heuristic that makes the…

  8. Computational Experiments with the RAVE Heuristic

    NASA Astrophysics Data System (ADS)

    Tom, David; Müller, Martin

    The Monte-Carlo tree search algorithm Upper Confidence bounds applied to Trees (UCT) has become extremely popular in computer games research. The Rapid Action Value Estimation (RAVE) heuristic is a strong estimator that often improves the performance of UCT-based algorithms. However, there are situations where RAVE misleads the search whereas pure UCT search can find the correct solution. Two games, the simple abstract game Sum of Switches (SOS) and the game of Go, are used to study the behavior of the RAVE heuristic. In SOS, RAVE updates are manipulated to mimic game situations where RAVE misleads the search. Such false RAVE updates are used to create RAVE overestimates and underestimates. A study of the distributions of mean and RAVE values reveals great differences between Go and SOS. While the RAVE-max update rule is able to correct extreme cases of RAVE underestimation, it is not effective in settings closer to practice or in Go.
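
    As background, RAVE statistics are usually blended with a node's Monte-Carlo mean through a schedule that fades the RAVE term out as visit counts grow. The sketch below shows one commonly used schedule; the weighting constant is an assumption and the code is not taken from the paper.

    ```python
    # Hedged sketch of blending a RAVE estimate with a Monte-Carlo mean in UCT-style
    # search; the schedule and the constant k are illustrative, not the paper's settings.
    import math

    def blended_value(mean, visits, rave_mean, rave_visits, k=1000.0):
        """Weight the RAVE estimate heavily while a node has few visits, then fade it out."""
        if visits == 0:
            return rave_mean
        if rave_visits == 0:
            return mean
        beta = math.sqrt(k / (3.0 * visits + k))
        return (1.0 - beta) * mean + beta * rave_mean

    if __name__ == "__main__":
        # Early in the search the RAVE estimate dominates; later the true mean takes over.
        for visits in (1, 10, 100, 10000):
            print(visits, round(blended_value(0.4, visits, 0.7, 50), 3))
    ```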

  9. Pitfalls in Teaching Judgment Heuristics

    ERIC Educational Resources Information Center

    Shepperd, James A.; Koch, Erika J.

    2005-01-01

    Demonstrations of judgment heuristics typically focus on how heuristics can lead to poor judgments. However, exclusive focus on the negative consequences of heuristics can prove problematic. We illustrate the problem with the representativeness heuristic and present a study (N = 45) that examined how examples influence understanding of the…

  10. ConfGen: a conformational search method for efficient generation of bioactive conformers.

    PubMed

    Watts, K Shawn; Dalal, Pranav; Murphy, Robert B; Sherman, Woody; Friesner, Rich A; Shelley, John C

    2010-04-26

    We describe the methodology, parametrization, and application of a conformational search method, called ConfGen, designed to efficiently generate bioactive conformers. We define efficiency as the ability to generate a bioactive conformation within a small total number of conformations using a reasonable amount of computer time. The method combines physics-based force field calculations with empirically derived heuristics designed to achieve efficient searching and prioritization of the ligand's conformational space. While many parameter settings are supported, four modes spanning a range of speed/quality trade-offs are defined and characterized. The validation set used to test the method is composed of ligands from 667 crystal structures covering a broad array of target and ligand classes. With the fastest mode, ConfGen uses an average of 0.5 s per ligand and generates only 14.3 conformers per ligand, at least one of which lies within 2.0 Å root-mean-square deviation of the crystal structure for 96% of the ligands. The most computationally intensive mode raises this recovery rate to 99%, while taking 8 s per ligand. Combining multiple search modes to "fill in" holes in the conformation space, or energy minimizing using an all-atom force field, each leads to improvements in the recovery rates at higher resolutions. Overall, ConfGen is at least as good as competing programs at high resolution and demonstrates higher efficiency at resolutions sufficient for many downstream applications, such as pharmacophore modeling. PMID:20373803

  11. Innovative Meta-Heuristic Approach Application for Parameter Estimation of Probability Distribution Model

    NASA Astrophysics Data System (ADS)

    Lee, T. S.; Yoon, S.; Jeong, C.

    2012-12-01

    The primary purpose of frequency analysis in hydrology is to estimate the magnitude of an event with a given frequency of occurrence. The precision of frequency analysis depends on the selection of an appropriate probability distribution model (PDM) and parameter estimation techniques. A number of PDMs have been developed to describe the probability distribution of the hydrological variables. For each of the developed PDMs, estimated parameters are provided based on alternative estimation techniques, such as the method of moments (MOM), probability weighted moments (PWM), linear function of ranked observations (L-moments), and maximum likelihood (ML). Generally, the results using ML are more reliable than the other methods. However, the ML technique is more laborious than the other methods because an iterative numerical solution, such as the Newton-Raphson method, must be used for the parameter estimation of PDMs. In the meantime, meta-heuristic approaches have been developed to solve various engineering optimization problems (e.g., linear and stochastic, dynamic, nonlinear). These approaches include genetic algorithms, ant colony optimization, simulated annealing, tabu searches, and evolutionary computation methods. Meta-heuristic approaches use a stochastic random search instead of a gradient search so that intricate derivative information is unnecessary. Therefore, the meta-heuristic approaches have been shown to be a useful strategy to solve optimization problems in hydrology. A number of studies focus on using meta-heuristic approaches for estimation of hydrological variables with parameter estimation of PDMs. Applied meta-heuristic approaches offer reliable solutions but use more computation time than derivative-based methods. Therefore, the purpose of this study is to enhance the meta-heuristic approach for the parameter estimation of PDMs by using a recently developed algorithm known as a harmony search (HS). The performance of the HS is compared to the

  12. Intelligent process mapping through systematic improvement of heuristics

    NASA Technical Reports Server (NTRS)

    Ieumwananonthachai, Arthur; Aizawa, Akiko N.; Schwartz, Steven R.; Wah, Benjamin W.; Yan, Jerry C.

    1992-01-01

    The present system for automatic learning/evaluation of novel heuristic methods applicable to the mapping of communication-process sets on a computer network is based on testing a population of competing heuristic methods within a fixed time constraint. The TEACHER 4.1 prototype learning system, implemented for learning new post-game analysis heuristic methods, iteratively generates and refines the mappings of a set of communicating processes on a computer network. A systematic exploration of the space of possible heuristic methods is shown to promise significant improvement.

  13. Multiobjective hyper heuristic scheme for system design and optimization

    NASA Astrophysics Data System (ADS)

    Rafique, Amer Farhan

    2012-11-01

    As system design becomes more and more multifaceted, integrated, and complex, the traditional single-objective approach to optimal design is becoming less and less efficient and effective. Single-objective optimization methods present a unique optimal solution, whereas multiobjective methods present a Pareto front. The foremost intent is to predict a reasonably distributed Pareto-optimal solution set, independent of the problem instance, through a multiobjective scheme. A further objective of the intended approach is to improve the worth of the outputs of the complex engineering system design process at the conceptual design phase. The process is automated in order to give the system designer the ability to study and analyze a large number of possible solutions in a short time. This article presents a Multiobjective Hyper-Heuristic Optimization Scheme based on low-level meta-heuristics developed for application in engineering system design. Herein, we present a stochastic function to manage the low-level meta-heuristics and increase the certainty of reaching a global optimum solution. A Genetic Algorithm, Simulated Annealing, and Swarm Intelligence are used as low-level meta-heuristics in this study. The performance of the proposed scheme is investigated through a comprehensive empirical analysis yielding acceptable results. One of the primary motives for performing multiobjective optimization is that current engineering systems require the simultaneous optimization of multiple, conflicting objectives. Random decision making makes the implementation of this scheme attractive and easy. Injecting feasible solutions significantly alters the search direction and also adds population diversity, helping accomplish the pre-defined goals set in the proposed scheme.

  14. [Quantitative structure activity relationship models based on heuristic method and gene expression programming for the prediction of the pK(a) values of sulfa drugs].

    PubMed

    Li, Yu-qin; Si, Hong-zong; Xiao, Yu-liang; Liu, Cai-hong; Xia, Cheng-cai; Li, Ke; Qi, Yong-xiu

    2009-05-01

    Quantitative structure-property relationships (QSPR) were developed to predict the pK(a) values of sulfa drugs via the heuristic method (HM) and gene expression programming (GEP). The descriptors of 31 sulfa drugs were calculated with the software CODESSA, which can compute constitutional, topological, geometrical, electrostatic, and quantum chemical descriptors. HM was also used for the preselection of 4 appropriate molecular descriptors. Linear and nonlinear QSPR models were developed based on HM and GEP separately, and the two prediction models gave good correlation coefficients (R) of 0.90 and 0.95, respectively. The two QSPR models are useful in predicting pK(a) during the discovery of new drugs and provide theoretical information for studying the new drugs. PMID:19618723

  15. Visual tracking method based on cuckoo search algorithm

    NASA Astrophysics Data System (ADS)

    Gao, Ming-Liang; Yin, Li-Ju; Zou, Guo-Feng; Li, Hai-Tao; Liu, Wei

    2015-07-01

    Cuckoo search (CS) is a new meta-heuristic optimization algorithm that is based on the obligate brood parasitic behavior of some cuckoo species in combination with the Lévy flight behavior of some birds and fruit flies. It has been found to be efficient in solving global optimization problems. An application of CS is presented to solve the visual tracking problem. The relationship between optimization and visual tracking is studied comparatively, and the sensitivity and adjustment of the CS parameters in the tracking system are studied experimentally. To demonstrate the tracking ability of a CS-based tracker, a comparative study of the tracking accuracy and speed of the CS-based tracker against six state-of-the-art trackers, namely the particle filter, meanshift, PSO, ensemble tracker, fragments tracker, and compressive tracker, is presented. Comparative results show that the CS-based tracker outperforms the other trackers.
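
    For orientation, a minimal cuckoo search for continuous minimization is sketched below, assuming the standard Lévy-flight step (Mantegna's algorithm) and an abandonment fraction pa; the visual-tracking formulation itself is not reproduced.

    ```python
    # Minimal cuckoo search with Levy flights for continuous minimization (illustrative).
    import math, random

    def levy_step(beta=1.5):
        # Mantegna's algorithm for a Levy-stable step length.
        sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2) /
                 (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
        u, v = random.gauss(0, sigma), random.gauss(0, 1)
        return u / abs(v) ** (1 / beta)

    def cuckoo_search(objective, bounds, n_nests=15, pa=0.25, iterations=1000):
        clip = lambda x: [min(max(v, lo), hi) for v, (lo, hi) in zip(x, bounds)]
        nests = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_nests)]
        scores = [objective(n) for n in nests]
        for _ in range(iterations):
            best = nests[min(range(n_nests), key=lambda i: scores[i])]
            for i in range(n_nests):
                # New solution via a Levy flight biased toward the current best nest.
                cand = clip([x + 0.01 * levy_step() * (x - b) * random.gauss(0, 1)
                             for x, b in zip(nests[i], best)])
                cand_score = objective(cand)
                if cand_score < scores[i]:
                    nests[i], scores[i] = cand, cand_score
                # Abandon a fraction pa of nests and rebuild them from random differences.
                if random.random() < pa:
                    j, k = random.randrange(n_nests), random.randrange(n_nests)
                    cand = clip([x + random.random() * (a - b)
                                 for x, a, b in zip(nests[i], nests[j], nests[k])])
                    cand_score = objective(cand)
                    if cand_score < scores[i]:
                        nests[i], scores[i] = cand, cand_score
        i_best = min(range(n_nests), key=lambda i: scores[i])
        return nests[i_best], scores[i_best]

    if __name__ == "__main__":
        print(cuckoo_search(lambda x: sum(v * v for v in x), [(-5, 5)] * 2))
    ```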

  16. A framework for structured quantum search

    NASA Astrophysics Data System (ADS)

    Hogg, Tad

    1998-09-01

    A quantum algorithm for general combinatorial search that uses the underlying structure of the search space to increase the probability of finding a solution is presented. This algorithm shows how coherent quantum systems can be matched to the underlying structure of abstract search spaces. The algorithm is evaluated empirically with a variety of search problems, and shown to be particularly effective for searches with many constraints. Furthermore, the algorithm provides a simple framework for utilizing search heuristics. It also exhibits the same phase transition in search difficulty as found for sophisticated classical search methods, indicating that it is effectively using the problem structure.

  17. Parallel Heuristics for Scalable Community Detection

    SciTech Connect

    Lu, Howard; Kalyanaraman, Anantharaman; Halappanavar, Mahantesh; Choudhury, Sutanay

    2014-05-17

    Community detection has become a fundamental operation in numerous graph-theoretic applications. It is used to reveal natural divisions that exist within real world networks without imposing prior size or cardinality constraints on the set of communities. Despite its potential for application, there is only limited support for community detection on large-scale parallel computers, largely owing to the irregular and inherently sequential nature of the underlying heuristics. In this paper, we present parallelization heuristics for fast community detection using the Louvain method as the serial template. The Louvain method is an iterative heuristic for modularity optimization. Originally developed by Blondel et al. in 2008, the method has become increasingly popular owing to its ability to detect high modularity community partitions in a fast and memory-efficient manner. However, the method is also inherently sequential, thereby limiting its scalability to problems that can be solved on desktops. Here, we observe certain key properties of this method that present challenges for its parallelization, and consequently propose multiple heuristics that are designed to break the sequential barrier. Our heuristics are agnostic to the underlying parallel architecture. For evaluation purposes, we implemented our heuristics on shared memory (OpenMP) and distributed memory (MapReduce-MPI) machines, and tested them over real world graphs derived from multiple application domains (internet, biological, natural language processing). Experimental results demonstrate the ability of our heuristics to converge to high modularity solutions comparable to those output by the serial algorithm in nearly the same number of iterations, while also drastically reducing time to solution.

  18. The use of geoscience methods for terrestrial forensic searches

    NASA Astrophysics Data System (ADS)

    Pringle, J. K.; Ruffell, A.; Jervis, J. R.; Donnelly, L.; McKinley, J.; Hansen, J.; Morgan, R.; Pirrie, D.; Harrison, M.

    2012-08-01

    Geoscience methods are increasingly being utilised in criminal, environmental and humanitarian forensic investigations, and the use of such methods is supported by a growing body of experimental and theoretical research. Geoscience search techniques can complement traditional methodologies in the search for buried objects, including clandestine graves, weapons, explosives, drugs, illegal weapons, hazardous waste and vehicles. This paper details recent advances in search and detection methods, with case studies and reviews. Relevant examples are given, together with a generalised workflow for search and suggested detection technique(s) table. Forensic geoscience techniques are continuing to rapidly evolve to assist search investigators to detect hitherto difficult to locate forensic targets.

  19. Cuckoo search epistasis: a new method for exploring significant genetic interactions.

    PubMed

    Aflakparast, M; Salimi, H; Gerami, A; Dubé, M-P; Visweswaran, S; Masoudi-Nejad, A

    2014-06-01

    The advent of high-throughput sequencing technology has resulted in the ability to measure millions of single-nucleotide polymorphisms (SNPs) from thousands of individuals. Although these high-dimensional data have paved the way for better understanding of the genetic architecture of common diseases, they have also given rise to challenges in developing computational methods for learning epistatic relationships among genetic markers. We propose a new method, named cuckoo search epistasis (CSE) for identifying significant epistatic interactions in population-based association studies with a case-control design. This method combines a computationally efficient Bayesian scoring function with an evolutionary-based heuristic search algorithm, and can be efficiently applied to high-dimensional genome-wide SNP data. The experimental results from synthetic data sets show that CSE outperforms existing methods including multifactorial dimensionality reduction and Bayesian epistasis association mapping. In addition, on a real genome-wide data set related to Alzheimer's disease, CSE identified SNPs that are consistent with previously reported results, and show the utility of CSE for application to genome-wide data. PMID:24549111

  20. An intelligent method for geographic Web search

    NASA Astrophysics Data System (ADS)

    Mei, Kun; Yuan, Ying

    2008-10-01

    While the electronically available information on the World-Wide Web is growing explosively, the difficulty of finding relevant information is also increasing for search engine users. In this paper we discuss how to constrain web queries geographically. A number of search queries are associated with geographical locations, either explicitly or implicitly. Accurately and effectively detecting the locations that search queries are truly about has a huge potential impact on increasing search relevance, bringing better targeted search results, and improving search user satisfaction. Our approach is distinctive both in the way geographic information is extracted from the web and, as far as we can tell, in the way it is integrated into query processing. This paper gives an overview of a spatially aware search engine for semantic querying of web documents. It also illustrates algorithms for extracting locations from web documents and query requests, using location ontologies to encode and reason about the formal semantics of geographic web search. Based on a real-world scenario of tourism guide search, the application of our approach shows that geographic information retrieval can be efficiently supported.

  1. Exhaustive search system and method using space-filling curves

    DOEpatents

    Spires, Shannon V.

    2003-10-21

    A search system and method for one agent or for multiple agents using a space-filling curve provides a way to control one or more agents to cover an area of any space of any dimensionality using an exhaustive search pattern. An example of the space-filling curve is a Hilbert curve. The search area can be a physical geography, a cyberspace search area, or an area searchable by computing resources. The search agent can be one or more physical agents, such as a robot, and can be software agents for searching cyberspace.
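
    The patent text above gives no implementation detail; as a generic illustration of why a Hilbert curve yields an exhaustive, locality-preserving coverage pattern, the sketch below uses the standard distance-to-coordinate conversion on a power-of-two grid.

    ```python
    # Generic Hilbert-curve coverage sketch (not the patented system): visiting the
    # curve indices in order touches every cell of an n x n grid exactly once.

    def d2xy(n, d):
        """Map distance d along the Hilbert curve to (x, y) on an n x n grid (n a power of two)."""
        x = y = 0
        t = d
        s = 1
        while s < n:
            rx = 1 & (t // 2)
            ry = 1 & (t ^ rx)
            if ry == 0:                          # rotate the quadrant when needed
                if rx == 1:
                    x, y = s - 1 - x, s - 1 - y
                x, y = y, x
            x += s * rx
            y += s * ry
            t //= 4
            s *= 2
        return x, y

    def coverage_path(n):
        """Exhaustive search pattern: every cell visited once, with good spatial locality."""
        return [d2xy(n, d) for d in range(n * n)]

    if __name__ == "__main__":
        print(coverage_path(4))                  # 16 cells, each visited exactly once
    ```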

  2. ALMA Pipeline Heuristics

    NASA Astrophysics Data System (ADS)

    Lightfoot, J.; Wyrowski, F.; Muders, D.; Boone, F.; Davis, L.; Shepherd, D.; Wilson, C.

    2006-07-01

    The ALMA (Atacama Large Millimeter Array) Pipeline Heuristics system is being developed to automatically reduce data taken with the standard observing modes. The goal is to make ALMA user-friendly to astronomers who are not experts in radio interferometry. The Pipeline Heuristics system must capture the expert knowledge required to provide data products that can be used without further processing. Observing modes to be processed by the system include single field interferometry, mosaics and single dish `on-the-fly' maps, and combinations of these modes. The data will be produced by the main ALMA array, the ALMA Compact Array (ACA) and single dish antennas. The Pipeline Heuristics system is being developed as a set of Python scripts. For interferometry these use as data processing engines the CASA/AIPS++ libraries and their bindings as CORBA objects within the ALMA Common Software (ACS). Initial development has used VLA and Plateau de Bure data sets to build and test a heuristic script capable of reducing single field data. In this paper we describe the reduction datapath and the algorithms used at each stage. Test results are presented. The path for future development is outlined.

  3. Experimental Matching of Instances to Heuristics for Constraint Satisfaction Problems.

    PubMed

    Moreno-Scott, Jorge Humberto; Ortiz-Bayliss, José Carlos; Terashima-Marín, Hugo; Conant-Pablos, Santiago Enrique

    2016-01-01

    Constraint satisfaction problems are of special interest for the artificial intelligence and operations research community due to their many applications. Although heuristics involved in solving these problems have largely been studied in the past, little is known about the relation between instances and the respective performance of the heuristics used to solve them. This paper focuses on both the exploration of the instance space to identify relations between instances and good performing heuristics and how to use such relations to improve the search. Firstly, the document describes a methodology to explore the instance space of constraint satisfaction problems and evaluate the corresponding performance of six variable ordering heuristics for such instances in order to find regions on the instance space where some heuristics outperform the others. Analyzing such regions favors the understanding of how these heuristics work and contribute to their improvement. Secondly, we use the information gathered from the first stage to predict the most suitable heuristic to use according to the features of the instance currently being solved. This approach proved to be competitive when compared against the heuristics applied in isolation on both randomly generated and structured instances of constraint satisfaction problems. PMID:26949383

  4. Experimental Matching of Instances to Heuristics for Constraint Satisfaction Problems

    PubMed Central

    Moreno-Scott, Jorge Humberto; Ortiz-Bayliss, José Carlos; Terashima-Marín, Hugo; Conant-Pablos, Santiago Enrique

    2016-01-01

    Constraint satisfaction problems are of special interest for the artificial intelligence and operations research community due to their many applications. Although heuristics involved in solving these problems have largely been studied in the past, little is known about the relation between instances and the respective performance of the heuristics used to solve them. This paper focuses on both the exploration of the instance space to identify relations between instances and good performing heuristics and how to use such relations to improve the search. Firstly, the document describes a methodology to explore the instance space of constraint satisfaction problems and evaluate the corresponding performance of six variable ordering heuristics for such instances in order to find regions on the instance space where some heuristics outperform the others. Analyzing such regions favors the understanding of how these heuristics work and contribute to their improvement. Secondly, we use the information gathered from the first stage to predict the most suitable heuristic to use according to the features of the instance currently being solved. This approach proved to be competitive when compared against the heuristics applied in isolation on both randomly generated and structured instances of constraint satisfaction problems. PMID:26949383

  5. A Heuristic Approach for International Crude Oil Transportation Scheduling Problems

    NASA Astrophysics Data System (ADS)

    Yin, Sisi; Nishi, Tatsushi; Izuno, Tsukasa

    In this paper, we propose a heuristic algorithm to solve a practical ship scheduling problem for international crude oil transportation. The problem is treated as a vehicle routing problem with split deliveries. The objective is to find an optimal assignment of tankers, a visiting sequence, and loading volumes simultaneously, in order to minimize the total distance while satisfying the tanker capacities. A savings-based meta-heuristic algorithm with lot-sizing parameters and a volume assignment heuristic is developed. The proposed method is applied to a case study with real data. Computational results demonstrate the effectiveness of the heuristic algorithm compared with that of human operators.

  6. A quantum heuristic algorithm for the traveling salesman problem

    NASA Astrophysics Data System (ADS)

    Bang, Jeongho; Ryu, Junghee; Lee, Changhyoup; Yoo, Seokwon; Lim, James; Lee, Jinhyoung

    2012-12-01

    We propose a quantum heuristic algorithm to solve the traveling salesman problem by generalizing the Grover search. Sufficient conditions are derived to greatly enhance the probability of finding the tours with the cheapest costs reaching almost to unity. These conditions are characterized by the statistical properties of tour costs and are shown to be automatically satisfied in the large-number limit of cities. In particular for a continuous distribution of the tours along the cost, we show that the quantum heuristic algorithm exhibits a quadratic speedup compared to its classical heuristic algorithm.

  7. Heuristics for multiextremal optimization on a hyperrectangle: Some experiments

    SciTech Connect

    Hendrix, E.; Roosma, J.

    1994-12-31

    The problem of finding the best point of a multiextremal oracle function on a hyperrectangle, given a budget of (relatively expensive) function evaluations, is considered. Global optimization can be seen as a combination of covering actions and local searches. The combination can be adapted during the search depending on the roughness and the number of optima found. Various heuristics are formulated and the outcomes of experiments are shown.

  8. Searching for Data: Swarming Agent Method

    NASA Astrophysics Data System (ADS)

    Caputo, D. P.; Dolan, R.

    2012-07-01

    As our ability to produce data grows, our ability to examine and find the useful portions of large data sets must grow as well. We present an efficient, agent-based search algorithm, based on the behavior of schooling fish in the presence of predators, designed to search and/or map very large data sets. Our algorithm, which belongs to the artificial life family of algorithms, attempts to leverage swarm intelligence against the difficulty of finding valuable data within a sea of data. The agents search the data space based on a small set of simple rules which produce emergent behavior, resulting in an efficient and flexible algorithm that resists many of the shortcomings of other artificial life algorithms.

  9. A novel method of wide searching scope and fast searching speed for image block matching

    NASA Astrophysics Data System (ADS)

    Yu, Fei; Li, Chao; Mei, Qiang; Lin, Zhe

    2015-10-01

    When an image matching method is used for motion estimation, performance parameters such as searching scope, searching speed, accuracy, and robustness are critical and need enhancement. In this paper, a novel block matching method is presented, comprising a wide-range image block matching strategy and a strategy of multiple start points with parallel searching. In the wide-range matching strategy, the template block and the searching block are the same size, and the average of the cumulative per-pixel results is taken so that the matching parameters accurately represent the matching degree. In the multi-start-point parallel searching strategy, a way of choosing starting points evenly is presented based on the characteristics of block matching search, and adaptive conditions and an adaptive schedule are established based on the searching region. During the iteration, the new strategy can adapt not only to solutions that lead toward the objective in the correct direction, but also to solutions that are slightly offset from the objective. The multi-start-point parallel searching algorithm can therefore effectively avoid the trap of local minima. An image processing system based on the TMS320C6415 DSP chip is used to conduct a video image stabilization experiment. The experimental results show that applying the two methods can improve the range of motion estimation and reduce the searching computation.
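
    The wide-range and multi-start strategies are described only qualitatively in the record. For orientation, the sketch below is a plain full-search block matcher using the mean absolute difference criterion (the per-pixel averaging mentioned above); the multi-start parallel schedule is not reproduced and the frame contents are synthetic.

    ```python
    # Simplified block-matching sketch with a mean-absolute-difference criterion;
    # an illustration of the matching step only, not the paper's search strategy.
    import numpy as np

    def mad(block_a, block_b):
        """Mean absolute difference: averaging over pixels keeps scores comparable."""
        return float(np.mean(np.abs(block_a.astype(np.int32) - block_b.astype(np.int32))))

    def match_block(ref_frame, cur_block, top, left, search_radius):
        """Find the displacement of cur_block (anchored at (top, left)) within ref_frame."""
        h, w = cur_block.shape
        best, best_score = (0, 0), float("inf")
        for dy in range(-search_radius, search_radius + 1):
            for dx in range(-search_radius, search_radius + 1):
                y, x = top + dy, left + dx
                if 0 <= y and 0 <= x and y + h <= ref_frame.shape[0] and x + w <= ref_frame.shape[1]:
                    score = mad(ref_frame[y:y + h, x:x + w], cur_block)
                    if score < best_score:
                        best, best_score = (dy, dx), score
        return best, best_score

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        ref = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
        block = ref[20:36, 12:28].copy()         # a block cut from the reference itself
        print(match_block(ref, block, top=22, left=10, search_radius=8))   # expect (-2, 2)
    ```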

  10. Inferring heuristic classification hierarchies from natural language input

    NASA Technical Reports Server (NTRS)

    Hull, Richard; Gomez, Fernando

    1993-01-01

    A methodology for inferring hierarchies representing heuristic knowledge about the check out, control, and monitoring sub-system (CCMS) of the space shuttle launch processing system from natural language input is explained. Our method identifies failures explicitly and implicitly described in natural language by domain experts and uses those descriptions to recommend classifications for inclusion in the experts' heuristic hierarchies.

  11. A screened automated structural search with semiempirical methods

    NASA Astrophysics Data System (ADS)

    Ota, Yukihiro; Ruiz-Barragan, Sergi; Machida, Masahiko; Shiga, Motoyuki

    2016-03-01

    We developed an interface program between a program suite for an automated search of chemical reaction pathways, GRRM, and a program package of semiempirical methods, MOPAC. A two-step structural search is proposed as an application of this interface program. A screening test is first performed by semiempirical calculations. Subsequently, a reoptimization procedure is done by ab initio or density functional calculations. We apply this approach to ion adsorption on cellulose. The computational efficiency is also shown for a GRRM search. The interface program is suitable for the structural search of large molecular systems for which semiempirical methods are applicable.

  12. Hybridisations of Variable Neighbourhood Search and Modified Simplex Elements to Harmony Search and Shuffled Frog Leaping Algorithms for Process Optimisations

    NASA Astrophysics Data System (ADS)

    Aungkulanon, P.; Luangpaiboon, P.

    2010-10-01

    Nowadays, engineering problem systems are large and complicated. Effective finite sequences of instructions for solving these problems can be categorised into optimisation and meta-heuristic algorithms. Although the best decision variable levels cannot always be determined exactly from the available alternatives, meta-heuristics offer an alternative to experience-based techniques that rapidly help in problem solving, learning, and discovery, in the hope of obtaining a more efficient or more robust procedure. All meta-heuristics provide auxiliary procedures in terms of their own toolbox functions, and it has been shown that the effectiveness of a meta-heuristic depends almost exclusively on these auxiliary functions; in fact, an auxiliary procedure from one meta-heuristic can be implemented in others. The well-known meta-heuristics of harmony search (HSA) and shuffled frog-leaping algorithms (SFLA) are compared with their hybridisations. HSA is used to produce a near-optimal solution based on the perfect state of harmony in the improvisation process of musicians. The SFLA, a population-based meta-heuristic, is a cooperative search metaphor inspired by natural memetics; it includes elements of local search and global information exchange. This study presents solution procedures for constrained and unconstrained problems with different natures of single- and multi-peak surfaces, including a curved ridge surface. Both meta-heuristics are modified via the variable neighbourhood search method (VNSM) philosophy, including a modified simplex method (MSM). The basic idea is the change of neighbourhoods while searching for a better solution. The hybridisations proceed by a descent method to a local minimum and then explore, systematically or at random, increasingly distant neighbourhoods of this local solution. The results show that the variant of HSA with VNSM and MSM performs better in terms of the mean and variance of design points and yields.

  13. A Heuristic Fast Method to Solve the Nonlinear Schroedinger Equation in Fiber Bragg Gratings with Arbitrary Shape Input Pulse

    SciTech Connect

    Emami, F.; Hatami, M.; Keshavarz, A. R.; Jafari, A. H.

    2009-08-13

    Using a combination of the Runge-Kutta and Jacobi iterative methods, we solve the nonlinear Schroedinger equation describing pulse propagation in FBGs. By decomposing the electric field into forward and backward components in the fiber Bragg grating and utilizing the Fourier series analysis technique, the boundary value problem for the set of coupled equations governing pulse propagation in the FBG is converted into an initial-value problem for coupled equations, which can be solved by a simple Runge-Kutta method.

  14. Job Search Methods. ERIC Digest No. 121.

    ERIC Educational Resources Information Center

    Wagner, Judith O.

    Steps in preparing and conducting a job search include the following: (1) developing a resume; (2) locating prospective employers; (3) applying for the job; (4) interviewing; and (5) following through. The two types of resumes are the chronological and the functional. Most application forms require some basic information: name, address, and…

  15. Heuristics for scheduling Earth observing satellites

    NASA Astrophysics Data System (ADS)

    Wolfe, William J.; Sorensen, Stephen E.

    1999-09-01

    This paper describes several methods for assigning tasks to Earth Observing System (EOS) satellites. We present empirical results for three heuristics, called Priority Dispatch (PD), Look Ahead (LA), and Genetic Algorithm (GA). These heuristics progress from simple to complex, from less accurate to more accurate, and from fast to slow. We present empirical results for the Window-Constrained Packing (WCP) problem, a simplified version of the EOS scheduling problem. We discuss the problem of having more than one optimization criterion, and we also discuss the relationship between the WCP and the more traditional Knapsack and Weighted Early/Tardy problems.
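
    The abstract does not specify the PD, LA, or GA heuristics in detail; as a generic illustration of priority dispatch on window-constrained tasks, the sketch below greedily assigns tasks, in priority order, to the earliest feasible slot inside their time windows on a single resource. All task fields are hypothetical.

    ```python
    # Generic priority-dispatch sketch for window-constrained tasks on one resource
    # (an illustration of the idea only, not the paper's PD heuristic).
    from dataclasses import dataclass
    from typing import List, Optional, Tuple

    @dataclass
    class Task:
        name: str
        priority: float
        window: Tuple[float, float]    # (earliest start, latest end)
        duration: float

    def earliest_fit(schedule, task) -> Optional[float]:
        """Earliest start inside the task's window that avoids the booked intervals."""
        start = task.window[0]
        for _, booked_start, booked_end in schedule:       # schedule is sorted by start time
            if start + task.duration <= booked_start:
                break
            start = max(start, booked_end)
        return start if start + task.duration <= task.window[1] else None

    def priority_dispatch(tasks: List[Task]) -> List[Tuple[str, float, float]]:
        schedule: List[Tuple[str, float, float]] = []      # (name, start, end)
        for task in sorted(tasks, key=lambda t: -t.priority):
            start = earliest_fit(schedule, task)
            if start is not None:
                schedule.append((task.name, start, start + task.duration))
                schedule.sort(key=lambda s: s[1])
        return schedule

    if __name__ == "__main__":
        tasks = [Task("obs-A", 5, (0, 10), 4), Task("obs-B", 9, (2, 8), 3), Task("obs-C", 1, (0, 12), 5)]
        print(priority_dispatch(tasks))    # obs-B is placed first; obs-C does not fit
    ```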

  16. Application of heuristic optimization in aircraft design

    NASA Astrophysics Data System (ADS)

    Hu, Zhenning

    Genetic algorithms and related heuristic optimization strategies are introduced and their applications in aircraft design are developed. Generally speaking, genetic algorithms belong to the non-deterministic direct search methods, which are most powerful in finding optimum or near-optimum solutions of very complex systems for which little a priori knowledge is available; they therefore have wide application in aerospace systems. Two major aircraft optimal design projects are illustrated in this dissertation. The first is the application of material optimization of aligned fiber laminate composites in the presence of stress concentrations. After a large number of tests on laminates with different layers, genetic algorithms find an alignment pattern in a certain range for the Boeing Co. specified material. The second project is the application of piezoelectric actuator placement on a generic tail skin to reduce the second-mode vibration caused by buffet, as part of a Boeing project to control the buffet effect on aircraft. In this project, genetic algorithms are closely involved with vibration analysis and finite element analysis. Actuator optimization strategies are first tested on theoretical beam models to gain experience and then applied to the generic tail model. Genetic algorithms achieve great success in optimizing up to 888 actuator parameters on the tail skins.

  17. Parameter estimation with bio-inspired meta-heuristic optimization: modeling the dynamics of endocytosis

    PubMed Central

    2011-01-01

    Background We address the task of parameter estimation in models of the dynamics of biological systems based on ordinary differential equations (ODEs) from measured data, where the models are typically non-linear and have many parameters, the measurements are imperfect due to noise, and the studied system can often be only partially observed. A representative task is to estimate the parameters in a model of the dynamics of endocytosis, i.e., endosome maturation, reflected in a cut-out switch transition between the Rab5 and Rab7 domain protein concentrations, from experimental measurements of these concentrations. The general parameter estimation task and the specific instance considered here are challenging optimization problems, calling for the use of advanced meta-heuristic optimization methods, such as evolutionary or swarm-based methods. Results We apply three global-search meta-heuristic algorithms for numerical optimization, i.e., differential ant-stigmergy algorithm (DASA), particle-swarm optimization (PSO), and differential evolution (DE), as well as a local-search derivative-based algorithm 717 (A717) to the task of estimating parameters in ODEs. We evaluate their performance on the considered representative task along a number of metrics, including the quality of reconstructing the system output and the complete dynamics, as well as the speed of convergence, both on real-experimental data and on artificial pseudo-experimental data with varying amounts of noise. We compare the four optimization methods under a range of observation scenarios, where data of different completeness and accuracy of interpretation are given as input. Conclusions Overall, the global meta-heuristic methods (DASA, PSO, and DE) clearly and significantly outperform the local derivative-based method (A717). Among the three meta-heuristics, differential evolution (DE) performs best in terms of the objective function, i.e., reconstructing the output, and in terms of convergence. These

  18. An Evaluation of a Method for Improving Search Strategies in a Coordinate Searching System.

    ERIC Educational Resources Information Center

    Hewer, David J.

    Search strategies which can be continuously modified were developed for coordinate searching systems. Using the files of the NASA Technology Utilization Program at the Knowledge Availability Systems Center, University of Pittsburgh, a study was conducted of the retrieval of relevant documents by both manual and machine methods for five questions…

  19. Automating the packing heuristic design process with genetic programming.

    PubMed

    Burke, Edmund K; Hyde, Matthew R; Kendall, Graham; Woodward, John

    2012-01-01

    The literature shows that one-, two-, and three-dimensional bin packing and knapsack packing are difficult problems in operational research. Many techniques, including exact, heuristic, and metaheuristic approaches, have been investigated to solve these problems and it is often not clear which method to use when presented with a new instance. This paper presents an approach which is motivated by the goal of building computer systems which can design heuristic methods. The overall aim is to explore the possibilities for automating the heuristic design process. We present a genetic programming system to automatically generate a good quality heuristic for each instance. It is not necessary to change the methodology depending on the problem type (one-, two-, or three-dimensional knapsack and bin packing problems), and it therefore has a level of generality unmatched by other systems in the literature. We carry out an extensive suite of experiments and compare with the best human designed heuristics in the literature. Note that our heuristic design methodology uses the same parameters for all the experiments. The contribution of this paper is to present a more general packing methodology than those currently available, and to show that, by using this methodology, it is possible for a computer system to design heuristics which are competitive with the human designed heuristics from the literature. This represents the first packing algorithm in the literature able to claim human competitive results in such a wide variety of packing domains. PMID:21609273

  20. Real-time earthquake monitoring using a search engine method

    NASA Astrophysics Data System (ADS)

    Zhang, Jie; Zhang, Haijiang; Chen, Enhong; Zheng, Yi; Kuang, Wenhuan; Zhang, Xiong

    2014-12-01

    When an earthquake occurs, seismologists want to use recorded seismograms to infer its location, magnitude and source-focal mechanism as quickly as possible. If such information could be determined immediately, timely evacuations and emergency actions could be undertaken to mitigate earthquake damage. Current advanced methods can report the initial location and magnitude of an earthquake within a few seconds, but estimating the source-focal mechanism may require minutes to hours. Here we present an earthquake search engine, similar to a web search engine, that we developed by applying a computer fast search method to a large seismogram database to find waveforms that best fit the input data. Our method is several thousand times faster than an exact search. For an Mw 5.9 earthquake on 8 March 2012 in Xinjiang, China, the search engine can infer the earthquake’s parameters in <1 s after receiving the long-period surface wave data.

  1. Searching method through biased random walks on complex networks.

    PubMed

    Lee, Sungmin; Yook, Soon-Hyung; Kim, Yup

    2009-07-01

    Information search is closely related to the first-passage properties of a diffusing particle. The physical properties of a diffusing particle are affected by the topological structure of the underlying network. Thus, the interplay between the dynamical process and the network topology is important for studying information search on complex networks. Designing an efficient method has been one of the main interests in information search; reducing the network traffic and decreasing the searching time are two essential factors in such a design. Here we propose an efficient method based on biased random walks. Numerical simulations show that the average searching time of the suggested model is shorter than that of other well-known models. As a matter of practical interest, we demonstrate how the suggested model can be applied to a peer-to-peer system. PMID:19658839
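
    One plausible reading of the bias is a degree-biased walk, which is common in this line of work; the sketch below implements that reading with an assumed preference exponent and stopping rule, and is not the paper's exact model.

    ```python
    # Sketch of a degree-biased random-walk search on a network (assumed form of the bias).
    import random

    def biased_walk_search(adjacency, start, target, alpha=1.0, max_steps=10000):
        """Walk from start, picking neighbors with probability proportional to degree**alpha,
        until target is reached or the step budget runs out. Returns the step count or None."""
        current = start
        for step in range(max_steps + 1):
            if current == target:
                return step
            neighbors = adjacency[current]
            weights = [len(adjacency[n]) ** alpha for n in neighbors]
            current = random.choices(neighbors, weights=weights, k=1)[0]
        return None

    if __name__ == "__main__":
        # Toy graph: with alpha > 0 the walk favors hubs, which shortens searches on average.
        graph = {0: [1, 2, 3, 4], 1: [0, 2], 2: [0, 1, 5], 3: [0], 4: [0, 5], 5: [2, 4]}
        print(biased_walk_search(graph, start=3, target=5, alpha=1.0))
    ```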

  2. Real-time earthquake monitoring using a search engine method

    PubMed Central

    Zhang, Jie; Zhang, Haijiang; Chen, Enhong; Zheng, Yi; Kuang, Wenhuan; Zhang, Xiong

    2014-01-01

    When an earthquake occurs, seismologists want to use recorded seismograms to infer its location, magnitude and source-focal mechanism as quickly as possible. If such information could be determined immediately, timely evacuations and emergency actions could be undertaken to mitigate earthquake damage. Current advanced methods can report the initial location and magnitude of an earthquake within a few seconds, but estimating the source-focal mechanism may require minutes to hours. Here we present an earthquake search engine, similar to a web search engine, that we developed by applying a computer fast search method to a large seismogram database to find waveforms that best fit the input data. Our method is several thousand times faster than an exact search. For an Mw 5.9 earthquake on 8 March 2012 in Xinjiang, China, the search engine can infer the earthquake’s parameters in <1 s after receiving the long-period surface wave data. PMID:25472861

  3. Real-time earthquake monitoring using a search engine method.

    PubMed

    Zhang, Jie; Zhang, Haijiang; Chen, Enhong; Zheng, Yi; Kuang, Wenhuan; Zhang, Xiong

    2014-01-01

    When an earthquake occurs, seismologists want to use recorded seismograms to infer its location, magnitude and source-focal mechanism as quickly as possible. If such information could be determined immediately, timely evacuations and emergency actions could be undertaken to mitigate earthquake damage. Current advanced methods can report the initial location and magnitude of an earthquake within a few seconds, but estimating the source-focal mechanism may require minutes to hours. Here we present an earthquake search engine, similar to a web search engine, that we developed by applying a computer fast search method to a large seismogram database to find waveforms that best fit the input data. Our method is several thousand times faster than an exact search. For an Mw 5.9 earthquake on 8 March 2012 in Xinjiang, China, the search engine can infer the earthquake's parameters in <1 s after receiving the long-period surface wave data. PMID:25472861

  4. Heuristic Classification. Technical Report Number 12.

    ERIC Educational Resources Information Center

    Clancey, William J.

    A broad range of well-structured problems--embracing forms of diagnosis, catalog selection, and skeletal planning--are solved in expert computer systems by the method of heuristic classification. These programs have a characteristic inference structure that systematically relates data to a pre-enumerated set of solutions by abstraction, heuristic…

  5. Heuristic Presentations: The Role of Structuring.

    ERIC Educational Resources Information Center

    Leron, Uri

    1985-01-01

    Discusses insufficiency of the linear method and some informal practices (or heuristics) used by expositors in trying to alleviate it. Uses the Cantor-Bernstein theorem to illustrate the linear proof, structuring, and the structure proof. Argues that the informal practices considered be consistently applied to the presentation of pivots and…

  6. Structural Functionalism as a Heuristic Device.

    ERIC Educational Resources Information Center

    Chilcott, John H.

    1998-01-01

    Argues that structural functionalism as a method for conducting fieldwork and as a format for the analysis of ethnographic data remains a powerful model, one that is easily understood by professional educators. As a heuristic device, functionalist theory can help in the solution of a problem that is otherwise incapable of theoretical…

  7. Heuristic reusable dynamic programming: efficient updates of local sequence alignment.

    PubMed

    Hong, Changjin; Tewfik, Ahmed H

    2009-01-01

    Recomputation of previously evaluated similarity results between biological sequences becomes inevitable when researchers realize errors in their sequenced data or when they have to compare nearly similar sequences, e.g., in a family of proteins. We present an efficient scheme for updating local sequence alignments with an affine gap model. In principle, using the previous matching result between two amino acid sequences, we perform a forward-backward alignment to generate heuristic searching bands which are bounded by a set of suboptimal paths. Given a correctly updated sequence, we initially predict a new score of the alignment path for each contour in order to select the best candidates among them. Then, we run the Smith-Waterman algorithm in this confined space. Furthermore, we show that our heuristic alignment for an updated sequence can be further accelerated by using reusable dynamic programming (rDP), our prior work. In this study, we successfully validate the "relative node tolerance bound" (RNTB) in the pruned searching space. We further improve the computational performance by quantifying the probability that the RNTB holds and switching to rDP on perturbation-resilient columns only. In our searching space, derived with a threshold of 90 percent of the optimal alignment score, we find that 98.3 percent of contours contain correctly updated paths. We also find that our method consumes only 25.36 percent of the runtime cost of the sparse dynamic programming (sDP) method, and only 2.55 percent of that of normal dynamic programming with the Smith-Waterman algorithm. PMID:19875856
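
    For context only, the sketch below computes a plain Smith-Waterman local-alignment score with a linear gap penalty; the paper's affine gap model, suboptimal-path bands, RNTB test, and reusable dynamic programming are not reproduced.

    ```python
    # Plain Smith-Waterman local alignment score with a linear gap penalty (background
    # illustration only; scoring parameters are arbitrary).

    def smith_waterman_score(a, b, match=2, mismatch=-1, gap=-2):
        rows, cols = len(a) + 1, len(b) + 1
        score = [[0] * cols for _ in range(rows)]
        best = 0
        for i in range(1, rows):
            for j in range(1, cols):
                diagonal = score[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
                score[i][j] = max(0,                       # local alignment: never go negative
                                  diagonal,
                                  score[i - 1][j] + gap,   # gap in sequence b
                                  score[i][j - 1] + gap)   # gap in sequence a
                best = max(best, score[i][j])
        return best

    if __name__ == "__main__":
        print(smith_waterman_score("HEAGAWGHEE", "PAWHEAE"))
    ```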

  8. Method and system for efficiently searching an encoded vector index

    DOEpatents

    Bui, Thuan Quang; Egan, Randy Lynn; Kathmann, Kevin James

    2001-09-04

    Method and system aspects for efficiently searching an encoded vector index are provided. The aspects include the translation of a search query into a candidate bitmap, and the mapping of data from the candidate bitmap into a search result bitmap according to entry values in the encoded vector index. Further, the translation includes the setting of a bit in the candidate bitmap for each entry in a symbol table that corresponds to a candidate of the search query. Also included in the mapping is the identification of a bit value in the candidate bitmap pointed to by an entry in the encoded vector.
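
    The two-stage bitmap flow described in the abstract can be illustrated with a toy sketch: a symbol table assigns each distinct value a code, the encoded vector stores one code per row, the candidate bitmap marks which codes satisfy the query, and the result bitmap is produced by mapping each row's code through the candidate bitmap. All names and data here are hypothetical and only mirror the described data flow, not the patented implementation.

    ```python
    # Toy encoded-vector-index lookup, mirroring the described candidate/result bitmaps.
    # All structures and names are illustrative, not the patented implementation.
    symbol_table = {"red": 0, "green": 1, "blue": 2}        # value -> code
    encoded_vector = [0, 2, 1, 2, 0, 1, 2]                  # one code per table row

    def search(predicate):
        # 1. Translate the query into a candidate bitmap indexed by code.
        candidate = [1 if predicate(value) else 0
                     for value, _ in sorted(symbol_table.items(), key=lambda kv: kv[1])]
        # 2. Map each row's code through the candidate bitmap into a result bitmap.
        return [candidate[code] for code in encoded_vector]

    print(search(lambda v: v in ("red", "blue")))  # -> [1, 1, 0, 1, 1, 0, 1]
    ```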

  9. An Efficient Substring Search Method by Using Delayed Keyword Extraction.

    ERIC Educational Resources Information Center

    Okada, Makoto; Ando, Kazuaki; Lee, Samuel Sangkon; Hayashi, Yoshitaka; Aoe, Jun-ichi

    2001-01-01

    Discusses information retrieval systems and extracting appropriate keywords from documents and proposes an effective substring search method by extending a pattern matching machine for multi-keywords called delayed keyword extraction (DKE). Also proposes a construction algorithm of the retrieval structure for speeding up the substring search.…

  10. Job Search as Goal-Directed Behavior: Objectives and Methods

    ERIC Educational Resources Information Center

    Van Hoye, Greet; Saks, Alan M.

    2008-01-01

    This study investigated the relationship between job search objectives (finding a new job/turnover, staying aware of job alternatives, developing a professional network, and obtaining leverage against an employer) and job search methods (looking at job ads, visiting job sites, networking, contacting employment agencies, contacting employers, and…

  11. Methods for Measuring Search Engine Performance over Time.

    ERIC Educational Resources Information Center

    Bar-Ilan, Judit

    2002-01-01

    Introduces methods for evaluating Web search engine performance over a time period. Describes the necessary setup for such studies, illustrates the use of these measures through a specific example, and recommends the use of the measures as a guideline for testing and improving search engine functionality. (Author/LRW)

  12. Bidirectional citation searching to completion: an exploration of literature searching methods.

    PubMed

    Hinde, Sebastian; Spackman, Eldon

    2015-01-01

    Literature reviews underpin the majority of research projects in the health sciences, and yet relatively little analysis has been published as to the most appropriate method to identify relevant literature, outside of specialist information journals. The method of applying keyword search queries to bibliographic databases using Boolean logic dominates literature reviews due to its easy application to the major online databases. However, it is recognised increasingly as being problematic where the research question cannot be clearly defined or requires an element of exploration, due to its reliance on authors' use of titling and keywords and its inability to identify topics other than those defined in the search query. This paper discusses the relative merits of a systematic citation searching approach as both an alternative and a concurrent method to keyword searching. A method of citation searching, both forwards and backwards, which is iterated to form a closed loop solution, is discussed. An illustrative example is presented of both methods, applying them to the topic of the UK National Institute for Health and Care Excellence (NICE) cost-effectiveness threshold. The case study finds the citation searching approach dominates the traditional keyword searching approach, finding 76 papers of relevance, including all 15 found by the alternative approach. Conceptually, and in the example presented, it is demonstrated that the proposed method can represent a dominant strategy to the more traditional approach in some situations, highlighting that, wherever possible, it is preferential to employ multiple methods of searching. However, it is clear that a better understanding is required as to how we can most efficiently search the ever-growing sea of literature. PMID:25145803

  13. Obtaining Maxwell's equations heuristically

    NASA Astrophysics Data System (ADS)

    Diener, Gerhard; Weissbarth, Jürgen; Grossmann, Frank; Schmidt, Rüdiger

    2013-02-01

    Starting from the experimental fact that a moving charge experiences the Lorentz force and applying the fundamental principles of simplicity (first order derivatives only) and linearity (superposition principle), we show that the structure of the microscopic Maxwell equations for the electromagnetic fields can be deduced heuristically by using the transformation properties of the fields under space inversion and time reversal. Using the experimental facts of charge conservation and that electromagnetic waves propagate with the speed of light, together with Galilean invariance of the Lorentz force, allows us to finalize Maxwell's equations and to introduce arbitrary electrodynamics units naturally.
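
    For reference, the structure that the heuristic argument arrives at is the familiar set of microscopic Maxwell equations, written here in SI units (the abstract notes that the choice of units is arbitrary):

    ```latex
    \begin{align}
      \nabla \cdot \mathbf{E} &= \frac{\rho}{\varepsilon_0}, &
      \nabla \times \mathbf{E} &= -\frac{\partial \mathbf{B}}{\partial t},\\
      \nabla \cdot \mathbf{B} &= 0, &
      \nabla \times \mathbf{B} &= \mu_0 \mathbf{j}
          + \mu_0 \varepsilon_0 \frac{\partial \mathbf{E}}{\partial t}.
    \end{align}
    ```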

  14. Automated Detection of Heuristics and Biases among Pathologists in a Computer-Based System

    ERIC Educational Resources Information Center

    Crowley, Rebecca S.; Legowski, Elizabeth; Medvedeva, Olga; Reitmeyer, Kayse; Tseytlin, Eugene; Castine, Melissa; Jukic, Drazen; Mello-Thoms, Claudia

    2013-01-01

    The purpose of this study is threefold: (1) to develop an automated, computer-based method to detect heuristics and biases as pathologists examine virtual slide cases, (2) to measure the frequency and distribution of heuristics and errors across three levels of training, and (3) to examine relationships of heuristics to biases, and biases to…

  15. Accelerated Profile HMM Searches

    PubMed Central

    Eddy, Sean R.

    2011-01-01

    Profile hidden Markov models (profile HMMs) and probabilistic inference methods have made important contributions to the theory of sequence database homology search. However, practical use of profile HMM methods has been hindered by the computational expense of existing software implementations. Here I describe an acceleration heuristic for profile HMMs, the “multiple segment Viterbi” (MSV) algorithm. The MSV algorithm computes an optimal sum of multiple ungapped local alignment segments using a striped vector-parallel approach previously described for fast Smith/Waterman alignment. MSV scores follow the same statistical distribution as gapped optimal local alignment scores, allowing rapid evaluation of significance of an MSV score and thus facilitating its use as a heuristic filter. I also describe a 20-fold acceleration of the standard profile HMM Forward/Backward algorithms using a method I call “sparse rescaling”. These methods are assembled in a pipeline in which high-scoring MSV hits are passed on for reanalysis with the full HMM Forward/Backward algorithm. This accelerated pipeline is implemented in the freely available HMMER3 software package. Performance benchmarks show that the use of the heuristic MSV filter sacrifices negligible sensitivity compared to unaccelerated profile HMM searches. HMMER3 is substantially more sensitive and 100- to 1000-fold faster than HMMER2. HMMER3 is now about as fast as BLAST for protein searches. PMID:22039361

  16. Tabu search method with random moves for globally optimal design

    NASA Astrophysics Data System (ADS)

    Hu, Nanfang

    1992-09-01

    Optimum engineering design problems are usually formulated as non-convex optimization problems of continuous variables. Because of the absence of convexity structure, they can have multiple minima, and global optimization becomes difficult. Traditional methods of optimization, such as penalty methods, can often be trapped at a local optimum. A tabu search method with random moves is introduced to solve these problems approximately. Its reliability and efficiency are examined with the help of standard test functions. Analysis of the implementations shows that this method is easy to use and that no derivative information is necessary. It outperforms the random search method and a composite genetic algorithm. In particular, it is applied to minimum weight design examples of a three-bar truss, coil springs, a Z-section and a channel section. For the channel section, the optimal design using the tabu search method with random moves saved 26.14 percent over the weight of the SUMT method.
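
    A minimal sketch of the idea, assuming a simple one-dimensional multimodal test function and illustrative values for the step size, tabu radius and iteration budget (none of which are taken from the thesis):

    ```python
    import math
    import random

    # Toy tabu search with random moves on a 1-D multimodal test function.
    # Step size, tabu radius, candidate count and iteration budget are illustrative.
    def f(x):
        return 0.1 * x * x + math.sin(3 * x)

    def tabu_random_search(x0, iterations=500, n_moves=10, step=0.5,
                           tabu_radius=0.05, tabu_len=25):
        x_cur = x_best = x0
        tabu = []                                    # recently visited points
        for _ in range(iterations):
            candidates = [x_cur + random.uniform(-step, step) for _ in range(n_moves)]
            allowed = [c for c in candidates
                       if all(abs(c - t) > tabu_radius for t in tabu)]
            if not allowed:
                continue
            x_cur = min(allowed, key=f)   # best non-tabu random move, even if worse
            tabu = (tabu + [x_cur])[-tabu_len:]
            if f(x_cur) < f(x_best):
                x_best = x_cur
        return x_best, f(x_best)

    random.seed(0)
    print(tabu_random_search(x0=4.0))
    ```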

  17. Reexamining Our Bias against Heuristics

    ERIC Educational Resources Information Center

    McLaughlin, Kevin; Eva, Kevin W.; Norman, Geoff R.

    2014-01-01

    Using heuristics offers several cognitive advantages, such as increased speed and reduced effort when making decisions, in addition to allowing us to make decision in situations where missing data do not allow for formal reasoning. But the traditional view of heuristics is that they trade accuracy for efficiency. Here the authors discuss sources…

  18. A comparison of fully automated methods of data analysis and computer assisted heuristic methods in an electrode kinetic study of the pathologically variable [Fe(CN)6](3-/4-) process by AC voltammetry.

    PubMed

    Morris, Graham P; Simonov, Alexandr N; Mashkina, Elena A; Bordas, Rafel; Gillow, Kathryn; Baker, Ruth E; Gavaghan, David J; Bond, Alan M

    2013-12-17

    Fully automated and computer assisted heuristic data analysis approaches have been applied to a series of AC voltammetric experiments undertaken on the [Fe(CN)6](3-/4-) process at a glassy carbon electrode in 3 M KCl aqueous electrolyte. The recovered parameters in all forms of data analysis encompass E(0) (reversible potential), k(0) (heterogeneous charge transfer rate constant at E(0)), α (charge transfer coefficient), Ru (uncompensated resistance), and Cdl (double layer capacitance). The automated method of analysis employed time domain optimization and Bayesian statistics. This and all other methods assumed the Butler-Volmer model applies for electron transfer kinetics, planar diffusion for mass transport, Ohm's Law for Ru, and a potential-independent Cdl model. Heuristic approaches utilize combinations of Fourier Transform filtering, sensitivity analysis, and simplex-based forms of optimization applied to resolved AC harmonics and rely on experimenter experience to assist in experiment-theory comparisons. Remarkable consistency of parameter evaluation was achieved, although the fully automated time domain method provided consistently higher α values than those based on frequency domain data analysis. The origin of this difference is that the implemented fully automated method requires a perfect model for the double layer capacitance. In contrast, the importance of imperfections in the double layer model is minimized when analysis is performed in the frequency domain. Substantial variation in k(0) values was found by analysis of the 10 data sets for this highly surface-sensitive pathologically variable [Fe(CN)6](3-/4-) process, but remarkably, all fit the quasi-reversible model satisfactorily. PMID:24160752
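
    All of the analysis approaches assume Butler-Volmer electron-transfer kinetics; for reference, in that model the forward (reduction) and backward (oxidation) rate constants depend on the recovered parameters E(0), k(0) and α as

    ```latex
    k_{\mathrm{f}} = k^{0} \exp\!\left[-\frac{\alpha F}{RT}\left(E - E^{0}\right)\right],
    \qquad
    k_{\mathrm{b}} = k^{0} \exp\!\left[\frac{(1-\alpha)\,F}{RT}\left(E - E^{0}\right)\right],
    ```

    where F is the Faraday constant, R the gas constant and T the temperature.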

  19. A constrained optimization algorithm based on the simplex search method

    NASA Astrophysics Data System (ADS)

    Mehta, Vivek Kumar; Dasgupta, Bhaskar

    2012-05-01

    In this article, a robust method is presented for handling constraints with the Nelder and Mead simplex search method, which is a direct search algorithm for multidimensional unconstrained optimization. The proposed method is free from the limitations of previous attempts that demand the initial simplex to be feasible or a projection of infeasible points to the nonlinear constraint boundaries. The method is tested on several benchmark problems and the results are compared with various evolutionary algorithms available in the literature. The proposed method is found to be competitive with respect to the existing algorithms in terms of effectiveness and efficiency.
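
    For comparison only, a common baseline that the proposed method is designed to improve upon is to fold the constraints into a penalty term and run the standard Nelder-Mead search, for example via SciPy. The test problem and penalty weight below are illustrative assumptions and do not reproduce the article's constraint-handling scheme.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # Baseline: penalty-function constraint handling with Nelder-Mead.
    # (The article proposes a different, penalty-free treatment of constraints.)
    def objective(x):
        return (x[0] - 1.0) ** 2 + (x[1] - 2.5) ** 2

    def constraint_violation(x):                 # constraints in g(x) <= 0 form
        g = np.array([x[0] + x[1] - 3.0, -x[0], -x[1]])
        return np.sum(np.maximum(g, 0.0) ** 2)

    def penalized(x, weight=1e4):                # penalty weight is an assumption
        return objective(x) + weight * constraint_violation(x)

    result = minimize(penalized, x0=[0.0, 0.0], method="Nelder-Mead")
    print(result.x, result.fun)                  # near the constrained optimum (0.75, 2.25)
    ```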

  20. Memory-Based Decision-Making with Heuristics: Evidence for a Controlled Activation of Memory Representations

    ERIC Educational Resources Information Center

    Khader, Patrick H.; Pachur, Thorsten; Meier, Stefanie; Bien, Siegfried; Jost, Kerstin; Rosler, Frank

    2011-01-01

    Many of our daily decisions are memory based, that is, the attribute information about the decision alternatives has to be recalled. Behavioral studies suggest that for such decisions we often use simple strategies (heuristics) that rely on controlled and limited information search. It is assumed that these heuristics simplify decision-making by…

  1. Algorithm of transferring the load of the faulted substation transformer using the best-first search method

    SciTech Connect

    Kim, H.; Ko, Y.S.; Jung, K.H. )

    1992-07-01

    In this paper, an expert system is developed to provide a quick and best strategy of load transfer for the power system operator. The load-transfer problem is constrained by the firm and normal capacities of a bank, the fault history of a bank, and the feeder priorities. Heuristic rules, obtained from a substation operator and from both DDC (Distribution Dispatch Center) and DCC (Distribution Control Center) engineers, are incorporated in an expert system to improve the solution procedure. Furthermore, structural rules based on the bus topology are also generated to reduce the number of switching operations required to reallocate the load from the busbar connected to the faulted bank to the other sections. This expert system is implemented in Prolog, and the best-first search method is adopted. To solve the combinatorial problem, list processing and recursive programming techniques are used. We also employ a pattern matching mechanism to trace the feeder connectivity.

  2. Remarks on search methods for stable, massive, elementary particles

    NASA Astrophysics Data System (ADS)

    Perl, Martin L.

    2001-11-01

    This paper was presented at the 69th birthday celebration of Professor Eugene Commins, honoring his research achievements. These remarks are about the experimental techniques used in the search for new stable, massive particles, particles at least as massive as the electron. A variety of experimental methods such as accelerator experiments, cosmic ray studies, searches for halo particles in the galaxy and searches for exotic particles in bulk matter are described. A summary is presented of the measured limits on the existence of new stable, massive particles.

  3. Method and System for Object Recognition Search

    NASA Technical Reports Server (NTRS)

    Duong, Tuan A. (Inventor); Duong, Vu A. (Inventor); Stubberud, Allen R. (Inventor)

    2012-01-01

    A method for object recognition using shape and color features of the object to be recognized. An adaptive architecture is used to recognize and adapt the shape and color features for moving objects to enable object recognition.

  4. A Tabu Search WSN Deployment Method for Monitoring Geographically Irregular Distributed Events.

    PubMed

    Aitsaadi, Nadjib; Achir, Nadjib; Boussetta, Khaled; Pujolle, Guy

    2009-01-01

    In this paper, we address the Wireless Sensor Network (WSN) deployment issue. We assume that the observed area is characterized by the geographical irregularity of the sensed events. Formally, we consider that each point in the deployment area is associated with a differentiated detection probability threshold, which must be satisfied by our deployment method. Our resulting WSN deployment problem is formulated as a multi-objective optimization problem, which seeks to reduce the gap between the generated event detection probabilities and the required thresholds while minimizing the number of deployed sensors. To overcome the computational complexity of an exact resolution, we propose an original pseudo-random approach based on the Tabu Search heuristic. Simulations show that our proposal achieves better performance than several other approaches proposed in the literature. In the last part of this paper, we generalize the deployment problem by including the wireless communication network connectivity constraint. Thus, we extend our proposal to ensure that the resulting WSN topology is connected even if the sensor communication range takes small values. PMID:22573977

  5. The Convolution Method in Neutrino Physics Searches

    SciTech Connect

    Tsakstara, V.; Kosmas, T. S.; Chasioti, V. C.; Divari, P. C.; Sinatkas, J.

    2007-12-26

    We concentrate on the convolution method used in nuclear and astro-nuclear physics studies and, in particular, in the investigation of the nuclear response of various neutrino detection targets to the energy-spectra of specific neutrino sources. Since the reaction cross sections of the neutrinos with nuclear detectors employed in experiments are extremely small, very fine and fast convolution techniques are required. Furthermore, sophisticated de-convolution methods are also needed whenever a comparison between calculated unfolded cross sections and existing convoluted results is necessary.

  6. Minimization search method for data inversion

    NASA Technical Reports Server (NTRS)

    Fymat, A. L.

    1975-01-01

    A technique has been developed for determining the values of selected subsets of independent variables in mathematical formulations. The required computation time increases with the first power of the number of variables. This is in contrast with classical minimization methods, for which the computational time increases with the third power of the number of variables.

  7. A flexible transition state searching method for atmospheric reaction systems

    NASA Astrophysics Data System (ADS)

    Lin, Xiao-Xiao; Liu, Yi-Rong; Huang, Teng; Chen, Jiao; Jiang, Shuai; Huang, Wei

    2015-04-01

    The precise and rapid exploration of transition states (TSs) is a major challenge when studying atmospheric reactions due to their complexity. In this work, a Monte Carlo Transition State Search Method (MCTSSM), which integrates a Monte Carlo sampling technique with transition state optimization methods using an efficient computer script, has been developed for transition state searches. The efficiency and the potential application in atmospheric reactions of this method have been demonstrated by three types of test suites related to the reactions of atmospheric volatile organic compounds (VOCs): (1) OH addition, (2) OH hydrogen-abstraction, and (3) other reactive groups (e.g. Cl, O3, NO3), especially for the reaction of β-pinene-sCI (stabilized Criegee Intermediates) with water. It was shown that the application of this method with effective restricted parameters greatly simplifies the time-consuming and tedious manual search procedure for transition states (TSs) of bimolecular reaction systems.

  8. A Flexible Transition State Searching Method for Atmospheric Reaction Systems

    SciTech Connect

    Lin, Xiao-Xiao; Liu, Yi-Rong; Huang, Teng; Chen, Jiao; Jiang, Shuai; Huang, Wei

    2015-04-01

    The precise and rapid exploration of transition states (TSs) is a major challenge when studying atmospheric reactions due to their complexity. In this work, a Monte Carlo Transition State Search Method (MCTSSM), which integrates a Monte Carlo sampling technique with transition state optimization methods using an efficient computer script, has been developed for transition state searches. The efficiency and the potential application in atmospheric reactions of this method have been demonstrated by three types of test suites related to the reactions of atmospheric volatile organic compounds (VOCs): (1) OH addition, (2) OH hydrogen-abstraction, and (3) other reactive groups (e.g. Cl, O3, NO3), especially for the reaction of β-pinene-sCI (stabilized Criegee Intermediates) with water. It was shown that the application of this method with effective restricted parameters greatly simplifies the time-consuming and tedious manual search procedure for transition states (TSs) of bimolecular reaction systems.

  9. Heuristic dynamic complexity coding

    NASA Astrophysics Data System (ADS)

    Škorupa, Jozef; Slowack, Jürgen; Mys, Stefaan; Lambert, Peter; Van de Walle, Rik

    2008-04-01

    Distributed video coding is a new video coding paradigm that shifts the computationally intensive motion estimation from the encoder to the decoder. This results in a lightweight encoder and a complex decoder, as opposed to the predictive video coding schemes (e.g., MPEG-X and H.26X), which have a complex encoder and a lightweight decoder. Neither scheme, however, has the ability to adapt to varying complexity constraints imposed by the encoder and decoder, which is an essential ability for applications targeting a wide range of devices with different complexity constraints or applications with temporarily variable complexity constraints. Moreover, the effect of complexity adaptation on the overall compression performance is of great importance and has not yet been investigated. To address this need, we have developed a video coding system with the ability to adapt itself to complexity constraints by dynamically sharing the motion estimation computations between both components. On this system we have studied the effect of the complexity distribution on the compression performance. This paper describes how motion estimation can be shared using heuristic dynamic complexity and how the distribution of complexity affects the overall compression performance of the system. The results show that the complexity can indeed be shared between encoder and decoder in an efficient way at acceptable rate-distortion performance.

  10. System, Method and Apparatus for Conducting a Keyterm Search

    NASA Technical Reports Server (NTRS)

    McGreevy, Michael W. (Inventor)

    2004-01-01

    A keyterm search is a method of searching a database for subsets of the database that are relevant to an input query. First, a number of relational models of subsets of a database are provided. A query is then input. The query can include one or more keyterms. Next, a gleaning model of the query is created. The gleaning model of the query is then compared to each one of the relational models of subsets of the database. The identifiers of the relevant subsets are then output.

  11. System, method and apparatus for conducting a keyterm search

    NASA Technical Reports Server (NTRS)

    McGreevy, Michael W. (Inventor)

    2004-01-01

    A keyterm search is a method of searching a database for subsets of the database that are relevant to an input query. First, a number of relational models of subsets of a database are provided. A query is then input. The query can include one or more keyterms. Next, a gleaning model of the query is created. The gleaning model of the query is then compared to each one of the relational models of subsets of the database. The identifiers of the relevant subsets are then output.

  12. System, method and apparatus for conducting a phrase search

    NASA Technical Reports Server (NTRS)

    McGreevy, Michael W. (Inventor)

    2004-01-01

    A phrase search is a method of searching a database for subsets of the database that are relevant to an input query. First, a number of relational models of subsets of a database are provided. A query is then input. The query can include one or more sequences of terms. Next, a relational model of the query is created. The relational model of the query is then compared to each one of the relational models of subsets of the database. The identifiers of the relevant subsets are then output.

  13. Reliable Transition State Searches Integrated with the Growing String Method.

    PubMed

    Zimmerman, Paul

    2013-07-01

    The growing string method (GSM) is highly useful for locating reaction paths connecting two molecular intermediates. GSM has often been used in a two-step procedure to locate exact transition states (TS), where GSM creates a quality initial structure for a local TS search. This procedure and others like it, however, do not always converge to the desired transition state because the local search is sensitive to the quality of the initial guess. This article describes an integrated technique for simultaneous reaction path and exact transition state search. This is achieved by implementing an eigenvector following optimization algorithm in internal coordinates with Hessian update techniques. After partial convergence of the string, an exact saddle point search begins under the constraint that the maximized eigenmode of the TS node Hessian has significant overlap with the string tangent near the TS. Subsequent optimization maintains connectivity of the string to the TS as well as locks in the TS direction, all but eliminating the possibility that the local search leads to the wrong TS. To verify the robustness of this approach, reaction paths and TSs are found for a benchmark set of more than 100 elementary reactions. PMID:26583985

  14. Heuristic Programming of Educational - Research Activity

    NASA Astrophysics Data System (ADS)

    Stoev, Alexey

    Seeking optimal conditions for developing students' investigation skills is an exceptionally relevant task in astronomy schools at public astronomical observatories. The didactic plan for addressing it is connected with realizing the concept of the problematic approach in astronomical education. In addition, different means of organizing astronomical educational activity are used depending on the didactic task: in some cases they are algorithmic, in others mainly heuristic. Educational-research skills are defined as skills in applying the scientific method while seeking a solution to an educational-research task in astronomy. The influence of a system of heuristic programming didactic means on the teaching process, and the use of such didactic means in out-of-school astronomy education aimed mainly at this activity, are analyzed. In conclusion, the optimization of didactic conditions for students' self-organization during individual or collective completion of educational-research astronomical tasks at the transition from secondary to higher education is discussed.

  15. The Use of Resistivity Methods in Terrestrial Forensic Searches

    NASA Astrophysics Data System (ADS)

    Wolf, R. C.; Raisuddin, I.; Bank, C.

    2013-12-01

    The increasing use of near-surface geophysical methods in forensic searches has demonstrated the need for further studies to identify the ideal physical, environmental and temporal settings for each geophysical method. Previous studies using resistivity methods have shown promising results, but additional work is required to more accurately interpret and analyze survey findings. The Ontario Provincial Police's UCRT (Urban Search and Rescue; Chemical, Biological, Radiological, Nuclear and Explosives; Response Team) is collaborating with the University of Toronto and two additional universities in a multi-year study investigating the applications of near-surface geophysical methods to terrestrial forensic searches. In the summer of 2012, on a test site near Bolton, Ontario, the OPP buried weapons, drums and pigs (naked, tarped, and clothed) to simulate clandestine graves and caches. Our study aims to conduct repeat surveys using an IRIS Syscal Junior resistivity meter with a 48-electrode switching system. These surveys will monitor changes in resistivity reflecting decomposition of the object since burial, and identify the strengths and weaknesses of resistivity when used in a rural, clandestine burial setting. Our initial findings indicate the usefulness of this method, as prominent resistivity changes have been observed. We anticipate our results will help to assist law enforcement agencies in determining the type of resistivity results to expect based on time since burial, depth of burial and state of dress of the body.

  16. Climate adaptation heuristics and the science/policy divide

    SciTech Connect

    Preston, Benjamin L.; Mustelin, Johanna; Maloney, Megan C.

    2013-09-05

    The adaptation science enterprise has expanded rapidly in recent years, presumably in response to growth in demand for knowledge that can facilitate adaptation policy and practice. However, evidence suggests such investments in adaptation science have not necessarily translated into adaptation implementation. One potential constraint on adaptation may be the underlying heuristics that are used as the foundation for both adaptation research and practice. In this paper, we explore the adaptation academic literature with the objective of identifying adaptation heuristics, assessing the extent to which they have become entrenched within the adaptation discourse, and discussing potential weaknesses in their framing that could undermine adaptation efforts. This investigation is supported by a multi-method analysis that includes both a quantitative content analysis of the adaptation literature that evidences the use of adaptation heuristics and a qualitative analysis of the implications of such heuristics for enhancing or hindering the implementation of adaptation. Results demonstrate that a number of heuristic devices are commonly used in both the peer-reviewed adaptation literature as well as within grey literature designed to inform adaptation practitioners. Furthermore, the apparent lack of critical reflection upon the robustness of these heuristics for diverse contexts may contribute to potential cognitive bias with respect to the framing of adaptation by both researchers and practitioners. Finally, we discuss this phenomenon by drawing upon heuristic-analytic theory, which has explanatory utility in understanding both the origins of such heuristics as well as the measures that can be pursued toward the co-generation of more robust approaches to adaptation problem-solving.

  17. Climate adaptation heuristics and the science/policy divide

    DOE PAGESBeta

    Preston, Benjamin L.; Mustelin, Johanna; Maloney, Megan C.

    2013-09-05

    The adaptation science enterprise has expanded rapidly in recent years, presumably in response to growth in demand for knowledge that can facilitate adaptation policy and practice. However, evidence suggests such investments in adaptation science have not necessarily translated into adaptation implementation. One potential constraint on adaptation may be the underlying heuristics that are used as the foundation for both adaptation research and practice. In this paper, we explore the adaptation academic literature with the objective of identifying adaptation heuristics, assessing the extent to which they have become entrenched within the adaptation discourse, and discussing potential weaknesses in their framing that could undermine adaptation efforts. This investigation is supported by a multi-method analysis that includes both a quantitative content analysis of the adaptation literature that evidences the use of adaptation heuristics and a qualitative analysis of the implications of such heuristics for enhancing or hindering the implementation of adaptation. Results demonstrate that a number of heuristic devices are commonly used in both the peer-reviewed adaptation literature as well as within grey literature designed to inform adaptation practitioners. Furthermore, the apparent lack of critical reflection upon the robustness of these heuristics for diverse contexts may contribute to potential cognitive bias with respect to the framing of adaptation by both researchers and practitioners. Finally, we discuss this phenomenon by drawing upon heuristic-analytic theory, which has explanatory utility in understanding both the origins of such heuristics as well as the measures that can be pursued toward the co-generation of more robust approaches to adaptation problem-solving.

  18. Generating effective project scheduling heuristics by abstraction and reconstitution

    NASA Technical Reports Server (NTRS)

    Janakiraman, Bhaskar; Prieditis, Armand

    1992-01-01

    A project scheduling problem consists of a finite set of jobs, each with fixed integer duration, requiring one or more resources such as personnel or equipment, and each subject to a set of precedence relations, which specify allowable job orderings, and a set of mutual exclusion relations, which specify jobs that cannot overlap. No job can be interrupted once started. The objective is to minimize project duration. This objective arises in nearly every large construction project--from software to hardware to buildings. Because such project scheduling problems are NP-hard, they are typically solved by branch-and-bound algorithms. In these algorithms, lower-bound duration estimates (admissible heuristics) are used to improve efficiency. One way to obtain an admissible heuristic is to remove (abstract) all resources and mutual exclusion constraints and then obtain the minimal project duration for the abstracted problem; this minimal duration is the admissible heuristic. Although such abstracted problems can be solved efficiently, they yield inaccurate admissible heuristics precisely because those constraints that are central to solving the original problem are abstracted. This paper describes a method to reconstitute the abstracted constraints back into the solution to the abstracted problem while maintaining efficiency, thereby generating better admissible heuristics. Our results suggest that reconstitution can make good admissible heuristics even better.
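
    A minimal sketch of the abstraction step described above: dropping the resource and mutual-exclusion constraints leaves only the precedence relations, so the minimal duration of the relaxed project (its critical-path length) is an admissible lower bound on the true project duration. The job data below are illustrative.

    ```python
    # Admissible heuristic by abstraction: ignore resources and mutual exclusion,
    # keep only precedences, and compute the critical-path length (a lower bound).
    from functools import lru_cache

    durations = {"a": 3, "b": 2, "c": 4, "d": 1}             # illustrative jobs
    preds = {"a": [], "b": ["a"], "c": ["a"], "d": ["b", "c"]}

    @lru_cache(maxsize=None)
    def earliest_finish(job):
        start = max((earliest_finish(p) for p in preds[job]), default=0)
        return start + durations[job]

    lower_bound = max(earliest_finish(j) for j in durations)
    print(lower_bound)   # 8 = a(3) -> c(4) -> d(1)
    ```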

  19. Photovoltaic maximum power point search method using a light sensor

    NASA Astrophysics Data System (ADS)

    Ostrowski, Mariusz

    2015-05-01

    The main disadvantage of PV panels is their low efficiency and non-linear current-voltage characteristic. Both of the above depend on the insolation and the temperature. That is why it is necessary to use maximum power point search systems. Commonly used solutions vary not only in complexity and accuracy but also in the speed of searching for the maximum power point. Usually, the measurement of current and voltage is used to determine the maximum power point. The most common in the literature are the perturb and observe and incremental conductance methods. The disadvantage of these solutions is the need to search across the whole current-voltage curve, which results in a significant power loss. In order to prevent it, the techniques mentioned above are combined with other methods. This procedure determines the starting point of one of the above methods and results in shortening the search time. Modern solutions use the temperature measurement to determine the open circuit voltage. The simulations show that the voltage at the maximum power point depends mainly on the temperature of the photovoltaic panel, and the current depends mainly on the lighting conditions. The proposed method uses the measurement of illuminance and calculates the current at the maximum power point, which is used as a reference signal in the power conversion system. Due to the non-linearity of the light sensor and of the photovoltaic panel, the relation between them cannot be determined directly. Therefore, the proposed method uses a modified correlation function to calculate the current corresponding to the light.
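
    For comparison, the conventional perturb-and-observe loop mentioned above works roughly as sketched below; the proposed method instead derives a reference current from an illuminance measurement. The hardware hooks, step size and starting voltage are hypothetical placeholders.

    ```python
    # Conventional perturb-and-observe MPPT loop (the baseline the paper contrasts with).
    # read_voltage/read_current/set_voltage_reference are hypothetical hardware hooks.
    def perturb_and_observe(read_voltage, read_current, set_voltage_reference,
                            v_ref=17.0, step=0.1, iterations=1000):
        last_power = read_voltage() * read_current()
        direction = +1
        for _ in range(iterations):
            v_ref += direction * step                 # perturb the operating voltage
            set_voltage_reference(v_ref)
            power = read_voltage() * read_current()   # observe the resulting power
            if power < last_power:                    # got worse: reverse direction
                direction = -direction
            last_power = power
        return v_ref
    ```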

  20. An explicit-solvent conformation search method using open software

    PubMed Central

    Gaalswyk, Kari

    2016-01-01

    Computer modeling is a popular tool to identify the most-probable conformers of a molecule. Although the solvent can have a large effect on the stability of a conformation, many popular conformational search methods are only capable of describing molecules in the gas phase or with an implicit solvent model. We have developed a work-flow for performing a conformation search on explicitly-solvated molecules using open source software. This method uses replica exchange molecular dynamics (REMD) to sample the conformational states of the molecule efficiently. Cluster analysis is used to identify the most probable conformations from the simulated trajectory. This work-flow was tested on drug molecules α-amanitin and cabergoline to illustrate its capabilities and effectiveness. The preferred conformations of these molecules in gas phase, implicit solvent, and explicit solvent are significantly different. PMID:27280078

  1. An explicit-solvent conformation search method using open software.

    PubMed

    Gaalswyk, Kari; Rowley, Christopher N

    2016-01-01

    Computer modeling is a popular tool to identify the most-probable conformers of a molecule. Although the solvent can have a large effect on the stability of a conformation, many popular conformational search methods are only capable of describing molecules in the gas phase or with an implicit solvent model. We have developed a work-flow for performing a conformation search on explicitly-solvated molecules using open source software. This method uses replica exchange molecular dynamics (REMD) to sample the conformational states of the molecule efficiently. Cluster analysis is used to identify the most probable conformations from the simulated trajectory. This work-flow was tested on drug molecules α-amanitin and cabergoline to illustrate its capabilities and effectiveness. The preferred conformations of these molecules in gas phase, implicit solvent, and explicit solvent are significantly different. PMID:27280078

  2. Regarding Chilcott's "Structural Functionalism as a Heuristic Device" Heuristically.

    ERIC Educational Resources Information Center

    Blot, Richard K.

    1998-01-01

    The heuristic value of Chilcott's essay lies less in its support for structural functionalism and more in its concern to reexamine theory in the work of earlier educational anthropologists for what earlier theories and practices can add to current research. (SLD)

  3. Cumulative Query Method for Influenza Surveillance Using Search Engine Data

    PubMed Central

    Seo, Dong-Woo; Sohn, Chang Hwan; Shin, Soo-Yong; Lee, JaeHo; Yu, Maengsoo; Kim, Won Young; Lim, Kyoung Soo; Lee, Sang-Il

    2014-01-01

    Background Internet search queries have become an important data source in syndromic surveillance systems. However, there is currently no syndromic surveillance system using Internet search query data in South Korea. Objectives The objective of this study was to examine correlations between our cumulative query method and national influenza surveillance data. Methods Our study was based on the local search engine, Daum (approximately 25% market share), and influenza-like illness (ILI) data from the Korea Centers for Disease Control and Prevention. A quota sampling survey was conducted with 200 participants to obtain popular queries. We divided the study period into two sets: Set 1 (the 2009/10 epidemiological year for development set 1 and 2010/11 for validation set 1) and Set 2 (2010/11 for development Set 2 and 2011/12 for validation Set 2). Pearson’s correlation coefficients were calculated between the Daum data and the ILI data for the development set. We selected the combined queries for which the correlation coefficients were .7 or higher and listed them in descending order. Then, we created a cumulative query method, with n representing the number of cumulative combined queries taken in descending order of the correlation coefficient. Results In validation set 1, 13 cumulative query methods were applied, and 8 had higher correlation coefficients (min=.916, max=.943) than that of the highest single combined query. Further, 11 of 13 cumulative query methods had an r value of ≥.7, but 4 of 13 combined queries had an r value of ≥.7. In validation set 2, 8 of 15 cumulative query methods showed higher correlation coefficients (min=.975, max=.987) than that of the highest single combined query. All 15 cumulative query methods had an r value of ≥.7, but 6 of 15 combined queries had an r value of ≥.7. Conclusions The cumulative query method showed relatively higher correlation with national influenza surveillance data than single combined queries in both the development and validation sets.
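
    The cumulative-query construction can be sketched as follows: rank the combined queries by their correlation with ILI in the development data, then correlate the running sum of the top-n query volumes with ILI. The data below are synthetic placeholders, not the study's Daum or KCDC data.

    ```python
    import numpy as np

    # Sketch of the cumulative query method with synthetic data (not the study's data).
    rng = np.random.default_rng(0)
    ili = rng.random(52)                              # weekly ILI rates (placeholder)
    queries = rng.random((20, 52)) + ili * 0.5        # 20 combined query volumes

    def pearson(a, b):
        return np.corrcoef(a, b)[0, 1]

    # Rank combined queries by correlation with ILI in the development data.
    order = sorted(range(len(queries)), key=lambda i: pearson(queries[i], ili),
                   reverse=True)

    # Cumulative query method n: running sum of the top-n ranked query volumes.
    for n in range(1, len(order) + 1):
        cumulative = queries[order[:n]].sum(axis=0)
        print(n, round(pearson(cumulative, ili), 3))
    ```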

  4. Alpha-beta coordination method for collective search

    DOEpatents

    Goldsmith, Steven Y.

    2002-01-01

    The present invention comprises a decentralized coordination strategy called alpha-beta coordination. The alpha-beta coordination strategy is a family of collective search methods that allow teams of communicating agents to implicitly coordinate their search activities through a division of labor based on self-selected roles and self-determined status. An agent can play one of two complementary roles. An agent in the alpha role is motivated to improve its status by exploring new regions of the search space. An agent in the beta role is also motivated to improve its status, but is conservative and tends to remain aggregated with other agents until alpha agents have clearly identified and communicated better regions of the search space. An agent can select its role dynamically based on its current status value relative to the status values of neighboring team members. Status can be determined by a function of the agent's sensor readings, and can generally be a measurement of source intensity at the agent's current location. An agent's decision cycle can comprise three sequential decision rules: (1) selection of a current role based on the evaluation of the current status data, (2) selection of a specific subset of the current data, and (3) determination of the next heading using the selected data. Variations of the decision rules produce different versions of alpha and beta behaviors that lead to different collective behavior properties.
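
    A minimal sketch of the three-rule decision cycle described above, with the role rule reduced to a simple status comparison. The status function, the data-selection rules and the assumption about which comparison yields the alpha role are simplifications for illustration, not the patented rules.

    ```python
    # Toy alpha-beta coordination decision cycle for one agent (illustrative only).
    def decide(my_status, neighbor_data, my_position):
        """neighbor_data: list of (status, position) tuples reported by teammates."""
        statuses = [s for s, _ in neighbor_data]
        # Rule 1: select a role from current status relative to the neighbourhood
        # (assumption: the best-status agent explores as alpha, others act as beta).
        role = "alpha" if my_status >= max(statuses, default=my_status) else "beta"
        # Rule 2: select the subset of the current data to use for this role.
        if role == "alpha":
            data = neighbor_data                     # explore using all reports
        else:
            best = max(neighbor_data, key=lambda sp: sp[0], default=None)
            data = [best] if best else []            # conservative: follow the best teammate
        # Rule 3: determine the next heading from the selected data.
        if not data:
            return role, (0.0, 0.0)                  # no information: hold position
        target = max(data, key=lambda sp: sp[0])[1]
        heading = (target[0] - my_position[0], target[1] - my_position[1])
        return role, heading

    print(decide(0.8, [(0.5, (1.0, 2.0)), (0.9, (3.0, 1.0))], (0.0, 0.0)))
    ```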

  5. A method of searching LDAP directories using XQuery

    NASA Astrophysics Data System (ADS)

    Hesselroth, Ted

    2011-12-01

    A method by which an LDAP directory can be searched using XQuery is described. The strategy behind the tool consists of four steps. First the XQuery script is examined and relevant XPath expressions are extracted, determined to be sufficient to define all information needed to perform the query. Then the XPath expressions are converted into their equivalent LDAP search filters by use of the published LDAP schema of the service, and search requests are made to the LDAP host. The search results are then merged and converted to an XML document that conforms to the hierarchy of the LDAP schema. Finally, the XQuery script is executed on the working XML document by conventional means. Examples are given of application of the tool in the Open Science Grid, which for discovery purposes operates an LDAP server that contains Glue schema-based information on site configuration and authorization policies. The XQuery scripts compactly replace hundreds of lines of custom python code that relied on the unix ldapsearch utility. Installation of the tool is available through the Virtual Data Toolkit.

  6. Doppler methods of search and monitoring of exoplanets

    NASA Astrophysics Data System (ADS)

    Panchuk, V. E.; Klochkova, V. G.; Sachkov, M. E.; Yushkin, M. V.

    2015-12-01

    The main stages in the development of Doppler methods for the search and study of extrasolar planetary systems (exoplanets) are described. The main instrumental and methodological effects that influence the measurement accuracy of spectral line positions in the study of exoplanets are considered. The development of a domestic spectrograph for spectroscopic monitoring with high-precision determination of radial velocities is reported. Directions for the further development of high-resolution spectroscopy are discussed.

  7. SMMH - A Parallel Heuristic for Combinatorial Optimization Problems

    SciTech Connect

    Domingues, Guilherme; Morie, Yoshiyuki; Gu, Feng Long; Nanri, Takeshi; Murakami, Kazuaki

    2007-12-26

    The process of finding one or more optimal solutions to combinatorial optimization problems is based on the use of algorithm instances. Those instances usually have to explore very large search spaces. Heuristic search focusing on the use of High-Order Hopfield neural networks is a widely deployed technique for very large search spaces. A very powerful analogy can be established with the dynamics of a physical spin-glass system that evolves while minimizing its own energy, which corresponds to minimizing the energy function of the network. This paper presents a new approach for solving combinatorial optimization problems through parallel simulations, based on a High-Order Hopfield neural network using the MPI specification.

  8. SMMH--A Parallel Heuristic for Combinatorial Optimization Problems

    SciTech Connect

    Domingues, Guilherme; Morie, Yoshiyuki; Gu, Feng Long; Nanri, Takeshi; Murakami, Kazuaki

    2007-12-26

    The process of finding one or more optimal solutions to combinatorial optimization problems is based on the use of algorithm instances. Those instances usually have to explore very large search spaces. Heuristic search focusing on the use of High-Order Hopfield neural networks is a widely deployed technique for very large search spaces. A very powerful analogy can be established with the dynamics of a physical spin-glass system that evolves while minimizing its own energy, which corresponds to minimizing the energy function of the network. This paper presents a new approach for solving combinatorial optimization problems through parallel simulations, based on a High-Order Hopfield neural network using the MPI specification.

  9. The Gaussian CLs method for searches of new physics

    NASA Astrophysics Data System (ADS)

    Qian, X.; Tan, A.; Ling, J. J.; Nakajima, Y.; Zhang, C.

    2016-08-01

    We describe a method based on the CLs approach to present results in searches of new physics, under the condition that the relevant parameter space is continuous. Our method relies on a class of test statistics developed for non-nested hypotheses testing problems, denoted by ΔT, which has a Gaussian approximation to its parent distribution when the sample size is large. This leads to a simple procedure of forming exclusion sets for the parameters of interest, which we call the Gaussian CLs method. Our work provides a self-contained mathematical proof for the Gaussian CLs method that explicitly outlines the required conditions. These conditions are milder than that required by Wilks' theorem to set confidence intervals (CIs). We illustrate the Gaussian CLs method in an example of searching for a sterile neutrino, where the CLs approach was rarely used before. We also compare data analysis results produced by the Gaussian CLs method and various CI methods to showcase their differences.
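
    As a rough numerical illustration only (the exact definitions, sign conventions and exclusion rule should be taken from the article), the Gaussian approximation lets both tail probabilities of the observed ΔT, and a CLs-style ratio built from them, be evaluated directly from normal CDFs:

    ```python
    from scipy.stats import norm

    # Rough sketch of a CLs-style ratio for a Gaussian-distributed test statistic.
    # Means, sigma, the tail convention and the exclusion rule are illustrative
    # assumptions, not the paper's exact prescription.
    def gaussian_cls(delta_t_obs, mean_tested, mean_null, sigma):
        p_tested = norm.sf(delta_t_obs, loc=mean_tested, scale=sigma)  # tail under tested hypothesis
        p_null = norm.sf(delta_t_obs, loc=mean_null, scale=sigma)      # same-direction tail under null
        return p_tested / p_null

    # Example: exclude the tested parameter point at 95% CL if CLs < 0.05 (illustrative rule).
    print(gaussian_cls(delta_t_obs=4.0, mean_tested=0.0, mean_null=3.0, sigma=1.5))
    ```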

  10. An automated efficient conformation search of L-serine by the scaled hypersphere search method

    NASA Astrophysics Data System (ADS)

    Kishimoto, Naoki; Harayama, Manami; Ohno, Koichi

    2016-05-01

    Stable conformers of L-serine were automatically explored by applications of the scaled hypersphere search (SHS) method to equilibrium structures maintaining the chemical bond skeletons of serine. Energy barriers for conformational changes of L-serine were estimated from the heights of obtained transition structures. Zero-point-corrected electronic energies and Gibbs free energies of the 24 lowest energy conformers and 21 transition structures were calculated at 100, 298, and 400 K by a composite quantum chemistry method (Gaussian-4). Relative populations of 24 conformers including nine new conformers were calculated from the Gibbs energies assuming thermal equilibrium.
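
    The final step described above, relative populations from Gibbs energies under thermal equilibrium, amounts to Boltzmann weighting. A short sketch with placeholder energies (not the paper's values):

    ```python
    import math

    # Boltzmann populations of conformers from relative Gibbs free energies.
    # Energies below are placeholders in kJ/mol, not values from the paper.
    R = 8.314462618e-3          # gas constant, kJ/(mol*K)

    def populations(delta_g, temperature):
        weights = [math.exp(-g / (R * temperature)) for g in delta_g]
        total = sum(weights)
        return [w / total for w in weights]

    delta_g = [0.0, 1.2, 2.5, 4.0]                 # relative Gibbs energies of conformers
    for T in (100.0, 298.0, 400.0):
        print(T, [round(p, 3) for p in populations(delta_g, T)])
    ```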

  11. Comparison of double-ended transition state search methods

    NASA Astrophysics Data System (ADS)

    Koslover, Elena F.; Wales, David J.

    2007-10-01

    While a variety of double-ended transition state search methods have been developed, their relative performance in characterizing complex multistep pathways between structurally disparate molecular conformations remains unclear. Three such methods (doubly-nudged elastic band, a string method, and a growing string method) are compared for a series of benchmarks ranging from permutational isomerizations of the seven-atom Lennard-Jones cluster (LJ7) to highly cooperative LJ38 and LJ75 rearrangements, and the folding pathways of two peptides. A database of short paths between LJ13 local minima is used to explore the effects of parameters and suggest reasonable default values. Each double-ended method was employed within the framework of a missing connection network flow algorithm to construct more complicated multistep pathways. We find that in our implementation none of the three methods definitively outperforms the others, and that their relative effectiveness is strongly system and parameter dependent.

  12. Comparison of double-ended transition state search methods.

    PubMed

    Koslover, Elena F; Wales, David J

    2007-10-01

    While a variety of double-ended transition state search methods have been developed, their relative performance in characterizing complex multistep pathways between structurally disparate molecular conformations remains unclear. Three such methods (doubly-nudged elastic band, a string method, and a growing string method) are compared for a series of benchmarks ranging from permutational isomerizations of the seven-atom Lennard-Jones cluster (LJ(7)) to highly cooperative LJ(38) and LJ(75) rearrangements, and the folding pathways of two peptides. A database of short paths between LJ(13) local minima is used to explore the effects of parameters and suggest reasonable default values. Each double-ended method was employed within the framework of a missing connection network flow algorithm to construct more complicated multistep pathways. We find that in our implementation none of the three methods definitively outperforms the others, and that their relative effectiveness is strongly system and parameter dependent. PMID:17919006

  13. A heuristic approach to incremental and reactive scheduling

    NASA Technical Reports Server (NTRS)

    Odubiyi, Jide B.; Zoch, David R.

    1989-01-01

    A heuristic approach to incremental and reactive scheduling is described. Incremental scheduling is the process of modifying an existing schedule if the initial schedule does not meet its stated initial goals. Reactive scheduling occurs in near real-time in response to changes in available resources or the occurrence of targets of opportunity. Only minor changes are made during both incremental and reactive scheduling because a goal of re-scheduling procedures is to minimally impact the schedule. The described heuristic search techniques, which are employed by the Request Oriented Scheduling Engine (ROSE), a prototype generic scheduler, provide efficient approximations of the cost of reaching a goal from a given state and effective mechanisms for controlling the search.

  14. A groundwater-based, objective-heuristic parameter optimisation method for a precipitation-runoff model and its application to a semi-arid basin

    NASA Astrophysics Data System (ADS)

    Mazi, K.; Koussis, A. D.; Restrepo, P. J.; Koutsoyiannis, D.

    2004-05-01

    A hydrologic model calibration methodology that is based on groundwater data is developed and implemented using the US Geological Survey's precipitation-runoff modelling system (PRMS) and the modular modelling system (MMS), which performs automatic calibration of parameters. The developed methodology was tested in the Akrotiri basin, Cyprus. The necessity for the groundwater-based model calibration, rather than a typical runoff-based one, arose from the very intermittent character of the runoff in the Akrotiri basin, a case often met in semi-arid regions. Introducing a datum and converting groundwater storage to head made the observable groundwater level the calibration indicator. The modelling of the Akrotiri basin leads us to conclude that groundwater level is a useful indicator for hydrological model calibration that can be potentially used in other similar situations in the absence of river flow measurements. However, the option of an automatic calibration of the complex hydrologic model PRMS by MMS did not ensure a good outcome. On the other hand, automatic optimisation, combined with heuristic expert intervention, enabled achievement of good calibration and constitutes a valuable means for saving effort and improving modelling performance. To this end, results must be scrutinised, melding the viewpoint of physical sense with mathematical efficiency criteria. Thus optimised, PRMS achieved a low simulation error, good reproduction of the historic trend of the aquifer water level evolution and reasonable physical behaviour (good hydrologic balance, reasonable match of the aquifer level evolution, good estimation of the mean natural recharge rate).

  15. On the asymptotic optimality and improved strategies of SPTB heuristic for open-shop scheduling problem

    NASA Astrophysics Data System (ADS)

    Bai, Danyu; Zhang, Zhihai

    2014-08-01

    This article investigates the open-shop scheduling problem with the criterion of minimising the sum of quadratic completion times. For this NP-hard problem, the asymptotic optimality of the shortest processing time block (SPTB) heuristic is proven in the limiting sense. Moreover, three different improvements, namely the job-insert scheme, tabu search and a genetic algorithm, are introduced to enhance the quality of the original solution generated by the SPTB heuristic. At the end of the article, a series of numerical experiments demonstrates the convergence of the heuristic, the performance of the improvements and the effectiveness of the quadratic objective.

  16. Hegemony, hermeneutics, and the heuristic of hope.

    PubMed

    Dorcy, Kathleen Shannon

    2010-01-01

    Hope has become a commodity, one that society expects those who suffer to invest in and one that healthcare providers are expected to promote as an outcome. In nursing research, a single hegemonic epistemology/ontology has been implemented through an exclusive hermeneutic (interpretation of data) and has resulted in hope being designated as an external objective heuristic for those who suffer. Evidence is articulated in this article for adopting a broader method of analysis and interpretation (genealogy) that can facilitate fuller apprehension of hope in the human experience of suffering and despair. PMID:20154528

  17. Non-uniform cosine modulated filter banks using meta-heuristic algorithms in CSD space.

    PubMed

    Kalathil, Shaeen; Elias, Elizabeth

    2015-11-01

    This paper presents an efficient design of non-uniform cosine modulated filter banks (CMFB) using canonic signed digit (CSD) coefficients. CMFBs have an easy and efficient design approach. A non-uniform decomposition can be easily obtained by merging the appropriate filters of a uniform filter bank. Only the prototype filter needs to be designed and optimized. In this paper, the prototype filter is designed using the window method, weighted Chebyshev approximation and weighted constrained least-squares approximation. The coefficients are quantized into CSD using a look-up table. The finite-precision CSD rounding deteriorates the filter bank performance. The performance of the filter bank is improved using suitably modified meta-heuristic algorithms. The meta-heuristic algorithms which are modified and used in this paper are the Artificial Bee Colony algorithm, Gravitational Search algorithm, Harmony Search algorithm and Genetic algorithm, and they result in filter banks with lower implementation complexity, power consumption and area requirements when compared with those of the conventional continuous-coefficient non-uniform CMFB. PMID:26644921
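
    The CSD quantization step can be illustrated with the standard signed-digit (non-adjacent form) recoding of an integer-scaled coefficient; the word length and coefficient below are arbitrary examples, and the filter-bank optimization itself is not reproduced.

    ```python
    # Canonic signed digit (non-adjacent form) recoding of an integer coefficient.
    # Scaling a fractional filter coefficient by 2**wordlength first is assumed.
    def to_csd(n):
        digits = []                      # least-significant digit first, each in {-1, 0, +1}
        while n != 0:
            if n % 2:
                d = 2 - (n % 4)          # +1 if n = 1 (mod 4), -1 if n = 3 (mod 4)
                n -= d
            else:
                d = 0
            digits.append(d)
            n //= 2
        return digits

    coeff = round(0.718 * 2 ** 8)        # example: quantize 0.718 with an 8-bit word
    print(coeff, to_csd(coeff))          # 184 -> [0, 0, 0, -1, 0, 0, -1, 0, 1]
    ```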

  18. Heuristic Identification of Biological Architectures for Simulating Complex Hierarchical Genetic Interactions

    PubMed Central

    Moore, Jason H; Amos, Ryan; Kiralis, Jeff; Andrews, Peter C

    2015-01-01

    Simulation plays an essential role in the development of new computational and statistical methods for the genetic analysis of complex traits. Most simulations start with a statistical model using methods such as linear or logistic regression that specify the relationship between genotype and phenotype. This is appealing due to its simplicity and because these statistical methods are commonly used in genetic analysis. It is our working hypothesis that simulations need to move beyond simple statistical models to more realistically represent the biological complexity of genetic architecture. The goal of the present study was to develop a prototype genotype–phenotype simulation method and software that are capable of simulating complex genetic effects within the context of a hierarchical biology-based framework. Specifically, our goal is to simulate multilocus epistasis or gene–gene interaction where the genetic variants are organized within the framework of one or more genes, their regulatory regions and other regulatory loci. We introduce here the Heuristic Identification of Biological Architectures for simulating Complex Hierarchical Interactions (HIBACHI) method and prototype software for simulating data in this manner. This approach combines a biological hierarchy, a flexible mathematical framework, a liability threshold model for defining disease endpoints, and a heuristic search strategy for identifying high-order epistatic models of disease susceptibility. We provide several simulation examples using genetic models exhibiting independent main effects and three-way epistatic effects. PMID:25395175
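
    The liability-threshold step can be sketched as follows: a latent liability is computed from genotypes, here with a toy two-locus interaction term, noise is added, and individuals above a threshold chosen to match a target prevalence are labelled cases. The effect sizes and prevalence are illustrative assumptions, not HIBACHI's generated models.

    ```python
    import numpy as np

    # Toy liability-threshold simulation with a two-locus interaction (illustrative).
    rng = np.random.default_rng(42)
    n = 5000
    g1 = rng.binomial(2, 0.3, n)          # genotypes coded 0/1/2
    g2 = rng.binomial(2, 0.4, n)

    # Latent liability: main effects, an epistatic (interaction) term, and noise.
    liability = 0.2 * g1 + 0.1 * g2 + 0.5 * (g1 * g2) + rng.normal(0.0, 1.0, n)

    prevalence = 0.1                       # target disease prevalence (assumption)
    threshold = np.quantile(liability, 1.0 - prevalence)
    case = (liability > threshold).astype(int)
    print(case.mean())                     # ~0.1 of individuals are labelled cases
    ```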

  19. Using Heuristic Evaluation to Foster Visualization Analysis and Design Skills.

    PubMed

    Sousa Santos, Beatriz; Quintino Ferreira, Beatriz; Dias, Paulo

    2016-01-01

    In an effort to develop visualization analysis and design skills in master's level information visualization students, the authors use a well-known analytical usability evaluation method, heuristic evaluation, with different sets of heuristics to teach students to analyze visualization applications. The proposed approach, used for three consecutive years, has successfully stimulated critical analysis and discussion sessions as well as helped raise students' awareness concerning the benefit of using systematic analysis methods and the strategies and guidelines that should be used to design visualization applications. PMID:26780763

  20. A Case Study of Controlling Crossover in a Selection Hyper-heuristic Framework Using the Multidimensional Knapsack Problem.

    PubMed

    Drake, John H; Özcan, Ender; Burke, Edmund K

    2016-01-01

    Hyper-heuristics are high-level methodologies for solving complex problems that operate on a search space of heuristics. In a selection hyper-heuristic framework, a heuristic is chosen from an existing set of low-level heuristics and applied to the current solution to produce a new solution at each point in the search. The use of crossover low-level heuristics is possible in an increasing number of general-purpose hyper-heuristic tools such as HyFlex and Hyperion. However, little work has been undertaken to assess how best to utilise it. Since a single-point search hyper-heuristic operates on a single candidate solution, and two candidate solutions are required for crossover, a mechanism is required to control the choice of the other solution. The frameworks we propose maintain a list of potential solutions for use in crossover. We investigate the use of such lists at two conceptual levels. First, crossover is controlled at the hyper-heuristic level where no problem-specific information is required. Second, it is controlled at the problem domain level where problem-specific information is used to produce good-quality solutions to use in crossover. A number of selection hyper-heuristics are compared using these frameworks over three benchmark libraries with varying properties for an NP-hard optimisation problem: the multidimensional 0-1 knapsack problem. It is shown that allowing crossover to be managed at the domain level outperforms managing crossover at the hyper-heuristic level in this problem domain. PMID:25635698
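
    The single-point selection framework with a maintained list of crossover partners can be sketched as follows (a minimal Python toy on a one-dimensional 0-1 knapsack instance, not the HyFlex/Hyperion frameworks; the instance data, pool size and acceptance rule are assumptions made for illustration):

      import random

      values, weights, capacity = [6, 5, 8, 9, 6, 7, 3], [2, 3, 6, 7, 5, 9, 4], 15

      def fitness(sol):
          w = sum(wi for wi, s in zip(weights, sol) if s)
          return sum(vi for vi, s in zip(values, sol) if s) if w <= capacity else 0

      def mutate(sol, _pool):                    # low-level heuristic 1: bit flip
          i = random.randrange(len(sol))
          return sol[:i] + [1 - sol[i]] + sol[i + 1:]

      def crossover(sol, pool):                  # low-level heuristic 2: uniform crossover
          partner = random.choice(pool)          # second parent taken from the maintained list
          return [a if random.random() < 0.5 else b for a, b in zip(sol, partner)]

      heuristics = [mutate, crossover]
      current = [random.randint(0, 1) for _ in values]
      pool = [current[:]]                        # list of potential crossover partners

      for _ in range(2000):
          h = random.choice(heuristics)          # simple random heuristic selection
          candidate = h(current, pool)
          if fitness(candidate) >= fitness(current):
              current = candidate
              pool = (pool + [current[:]])[-5:]  # keep the five most recent accepted solutions

      print(fitness(current), current)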

  1. A hybrid approach using chaotic dynamics and global search algorithms for combinatorial optimization problems

    NASA Astrophysics Data System (ADS)

    Igeta, Hideki; Hasegawa, Mikio

    Chaotic dynamics have been effectively applied to improve various heuristic algorithms for combinatorial optimization problems in many studies. Currently, the most widely used chaotic optimization scheme is to drive heuristic solution search algorithms applicable to large-scale problems by chaotic neurodynamics that include the tabu effect of the tabu search. Alternatively, meta-heuristic algorithms are used for combinatorial optimization by combining a neighboring solution search algorithm, such as tabu, gradient, or another search method, with a global search algorithm, such as genetic algorithms (GA), ant colony optimization (ACO), or others. In these hybrid approaches, the ACO has effectively optimized the solution of many benchmark problems in the quadratic assignment problem library. In this paper, we propose a novel hybrid method that combines a chaotic search algorithm, which performs better than the tabu search, with global search algorithms such as ACO and GA. Our results show that the proposed chaotic hybrid algorithm has better performance than both the conventional chaotic search and conventional hybrid algorithms. In addition, we show that the chaotic search algorithm combined with ACO performs better than when combined with GA.

  2. Social Outcomes in Childhood Brain Disorder: A Heuristic Integration of Social Neuroscience and Developmental Psychology

    ERIC Educational Resources Information Center

    Yeates, Keith Owen; Bigler, Erin D.; Dennis, Maureen; Gerhardt, Cynthia A.; Rubin, Kenneth H.; Stancin, Terry; Taylor, H. Gerry; Vannatta, Kathryn

    2007-01-01

    The authors propose a heuristic model of the social outcomes of childhood brain disorder that draws on models and methods from both the emerging field of social cognitive neuroscience and the study of social competence in developmental psychology/psychopathology. The heuristic model characterizes the relationships between social adjustment, peer…

  3. Automated unit-level testing with heuristic rules

    NASA Technical Reports Server (NTRS)

    Carlisle, W. Homer; Chang, Kai-Hsiung; Cross, James H.; Keleher, William; Shackelford, Keith

    1990-01-01

    Software testing plays a significant role in the development of complex software systems. Current testing methods generally require significant effort to generate meaningful test cases. The QUEST/Ada system is a prototype system designed using CLIPS to experiment with expert system based test case generation. The prototype is designed to test for condition coverage, and attempts to generate test cases to cover all feasible branches contained in an Ada program. This paper reports on heuristics used by the system. These heuristics vary according to the amount of knowledge obtained by preprocessing and execution of the Boolean conditions in the program.

  4. Social biases determine spatiotemporal sparseness of ciliate mating heuristics.

    PubMed

    Clark, Kevin B

    2012-01-01

    Ciliates become highly social, even displaying animal-like qualities, in the joint presence of aroused conspecifics and nonself mating pheromones. Pheromone detection putatively helps trigger instinctual and learned courtship and dominance displays from which social judgments are made about the availability, compatibility, and fitness representativeness or likelihood of prospective mates and rivals. In earlier studies, I demonstrated the heterotrich Spirostomum ambiguum improves mating competence by effecting preconjugal strategies and inferences in mock social trials via behavioral heuristics built from Hebbian-like associative learning. Heuristics embody serial patterns of socially relevant action that evolve into ordered, topologically invariant computational networks supporting intra- and intermate selection. S. ambiguum employs heuristics to acquire, store, plan, compare, modify, select, and execute sets of mating propaganda. One major adaptive constraint over formation and use of heuristics involves a ciliate's initial subjective bias, responsiveness, or preparedness, as defined by Stevens' Law of subjective stimulus intensity, for perceiving the meaningfulness of mechanical pressures accompanying cell-cell contacts and additional perimating events. This bias controls durations and valences of nonassociative learning, search rates for appropriate mating strategies, potential net reproductive payoffs, levels of social honesty and deception, successful error diagnosis and correction of mating signals, use of insight or analysis to solve mating dilemmas, bioenergetics expenditures, and governance of mating decisions by classical or quantum statistical mechanics. I now report this same social bias also differentially affects the spatiotemporal sparseness, as measured with metric entropy, of ciliate heuristics. Sparseness plays an important role in neural systems through optimizing the specificity, efficiency, and capacity of memory representations. The present

  5. Meta-Heuristic Combining Prior Online and Offline Information for the Quadratic Assignment Problem.

    PubMed

    Sun, Jianyong; Zhang, Qingfu; Yao, Xin

    2014-03-01

    The construction of promising solutions for NP-hard combinatorial optimization problems (COPs) in meta-heuristics is usually based on three types of information, namely a priori information, a posteriori information learned from visited solutions during the search procedure, and online information collected in the solution construction process. Prior information reflects our domain knowledge about the COPs. Extensive domain knowledge can surely make the search effective, yet it is not always available. Posterior information could guide the meta-heuristics to globally explore promising search areas, but it lacks local guidance capability. On the contrary, online information can capture local structures, and its application can help exploit the search space. In this paper, we studied the effects of using this information on the algorithmic performance of meta-heuristics for COPs. The study was illustrated by a set of heuristic algorithms developed for the quadratic assignment problem. We first proposed an improved scheme to extract online local information, then developed a unified framework under which all types of information can be combined readily. Finally, we studied the benefits of the three types of information to meta-heuristics. Conclusions were drawn from the comprehensive study, which can be used as principles to guide the design of effective meta-heuristics in the future. PMID:23757559

  6. New results in searching for axions by astronomical methods

    NASA Astrophysics Data System (ADS)

    Gnedin, Yu. N.; Piotrovich, M. Yu.

    2016-01-01

    We discuss the astronomical methods of searching for light Goldstone bosons (axions and arions). The basic idea is to use processes of coupling between axions and photons: a) the axion decay into two photons; b) the transformation process of photons into axions (arions) in the magnetic fields of stars and also of interstellar and intergalactic media; c) the inverse process, in which axions (arions) generated in the cores of stars are transformed into X-ray photons. Decaying axions affect the diffuse extragalactic background radiation, the brightness of the night sky and especially the intergalactic light of clusters of galaxies through generation of the axion radiative decay emission line. Processes (b) and (c) depend strongly on the polarization state of the photon and may produce a noticeable amount of linear polarization.

  7. An iterative search method for strain measurement in EFPI sensors

    NASA Astrophysics Data System (ADS)

    Ebel, W. J.; Mitchell, K. K.

    2012-04-01

    In this paper, a new method is given for estimating strain in extrinsic, Fabry-Perot, interferometric (EFPI) fiber-optic sensors under sinusoidal excitation at the sensor. The algorithm has a low complexity and is appropriate for low-cost applications. It is an iterative search algorithm based upon a known, sinusoidal excitation and a mean-square-error objective function. The algorithm provides an estimate of the maximum time-varying strain due to the excitation. It is shown that, for a broad range of parameters, the algorithm converges to the global minimum with a high degree of probability. Empirical test results for two fiber-optic sensors with different gauge lengths along with corresponding measurements from a resistive strain gauge are given and shown to compare very well.
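
    The abstract does not give the algorithm itself, but the general idea of an iterative, coarse-to-fine search over a mean-square-error objective for a known sinusoidal excitation can be sketched as follows (the sensor response is replaced by a plain sinusoid and all parameter values are hypothetical):

      import numpy as np

      def mse(signal, t, freq, amplitude):
          """Mean-square error between the measurement and a sinusoid of given amplitude."""
          return np.mean((signal - amplitude * np.sin(2 * np.pi * freq * t)) ** 2)

      def estimate_amplitude(signal, t, freq, lo=0.0, hi=10.0, iters=20, grid=11):
          """Repeated grid search over a shrinking interval (coarse-to-fine refinement)."""
          for _ in range(iters):
              candidates = np.linspace(lo, hi, grid)
              best = min(candidates, key=lambda a: mse(signal, t, freq, a))
              span = (hi - lo) / (grid - 1)
              lo, hi = best - span, best + span        # zoom in around the best candidate
          return best

      t = np.linspace(0, 1, 2000)
      measured = 3.2 * np.sin(2 * np.pi * 50 * t) + 0.1 * np.random.randn(t.size)
      print(estimate_amplitude(measured, t, freq=50))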

  8. Heuristical Strategies on the Study Theme "The Unsaturated Hydrocarbons -- Alkenes"

    ERIC Educational Resources Information Center

    Naumescu, Adrienne Kozan; Pasca, Roxana-Diana

    2011-01-01

    The influence of heuristical strategies upon the level of two experimental classes is studied in this paper. The didactic experiment took place at secondary school in Cluj-Napoca, in 2008-2009 school year. The study theme "The Unsaturated Hydrocarbons--Alkenes" has been efficiently learned by using the most active methods: laboratory…

  9. A New Method to Web Knowledge Searching and Organizating

    NASA Astrophysics Data System (ADS)

    Li, Shengqi

    One of the fundamental supports of the web knowledge vision is an agent system that enables knowledge to be published to a searchable knowledge base and later retrieved by potential users. This is the basic motivation for the UDDI standard, one of the three standards underpinning current web knowledge technology. However, this aspect of the technology has been the least successful, and the few web sites that today attempt to provide a web knowledge agent facility do so using a simple cataloguing method rather than UDDI. In this paper we analyze why the agent aspect of the web knowledge vision has proven so difficult to realize in practice and outline the technical difficulties involved in setting up and maintaining a useful knowledge base of web knowledge. We then describe a practical approach to web knowledge agents based on automated indexing and discuss the required technological foundations. We also suggest some ideas for improving the existing standards to better support this approach and web knowledge searching in general.

  10. Heuristics Applied in the Development of Advanced Space Mission Concepts

    NASA Technical Reports Server (NTRS)

    Nilsen, Erik N.

    1998-01-01

    Advanced mission studies are the first step in determining the feasibility of a given space exploration concept. A space scientist develops a science goal in the exploration of space. This may be a new observation method, a new instrument or a mission concept to explore a solar system body. In order to determine the feasibility of a deep space mission, a concept study is convened to determine the technology needs and estimated cost of performing that mission. Heuristics are one method of defining viable mission and systems architectures that can be assessed for technology readiness and cost. Developing a viable architecture depends to a large extent upon extending the existing body of knowledge, and applying it in new and novel ways. These heuristics have evolved over time to include methods for estimating technical complexity, technology development, cost modeling and mission risk in the unique context of deep space missions. This paper examines the processes involved in performing these advanced concepts studies, and analyzes the application of heuristics in the development of an advanced in-situ planetary mission. The Venus Surface Sample Return mission study provides a context for the examination of the heuristics applied in the development of the mission and systems architecture. This study is illustrative of the effort involved in the initial assessment of an advanced mission concept, and the knowledge and tools that are applied.

  11. Extragradient methods for searching for equilibrium points in the parametric problem of equilibrium programming

    NASA Astrophysics Data System (ADS)

    Artem'eva, L. A.

    2014-12-01

    The parametric problem of equilibrium programming is examined. The mathematical programming problem, the search for a saddle-point, the multicriteria search for a Pareto point, etc. are particular cases of this parametric problem. The primal and dual variants of the extragradient method are proposed as a tool for searching for equilibrium points. The convergence of both variants is analyzed.
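
    For the saddle-point special case mentioned above, the extragradient idea can be illustrated on the bilinear problem min_x max_y xy, where plain gradient descent-ascent spirals outward but the extragradient update converges to the equilibrium (a minimal sketch; the step size and starting point are arbitrary):

      alpha = 0.1
      x, y = 1.0, 1.0
      for _ in range(2000):
          # prediction (half) step using the current gradients of f(x, y) = x * y
          x_half = x - alpha * y
          y_half = y + alpha * x
          # correction step using the gradients evaluated at the predicted point
          x = x - alpha * y_half
          y = y + alpha * x_half
      print(x, y)   # both approach the equilibrium (0, 0)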

  12. Heuristic segmentation of a nonstationary time series

    NASA Astrophysics Data System (ADS)

    Fukuda, Kensuke; Eugene Stanley, H.; Nunes Amaral, Luís A.

    2004-02-01

    Many phenomena, both natural and human influenced, give rise to signals whose statistical properties change under time translation, i.e., are nonstationary. For some practical purposes, a nonstationary time series can be seen as a concatenation of stationary segments. However, the exact segmentation of a nonstationary time series is a hard computational problem which cannot be solved exactly by existing methods. For this reason, heuristic methods have been proposed. Using one such method, it has been reported that for several cases of interest—e.g., heart beat data and Internet traffic fluctuations—the distribution of durations of these stationary segments decays with a power-law tail. A potential technical difficulty that has not been thoroughly investigated is that a nonstationary time series with a (scalefree) power-law distribution of stationary segments is harder to segment than other nonstationary time series because of the wider range of possible segment lengths. Here, we investigate the validity of a heuristic segmentation algorithm recently proposed by Bernaola-Galván et al. [Phys. Rev. Lett. 87, 168105 (2001)] by systematically analyzing surrogate time series with different statistical properties. We find that if a given nonstationary time series has stationary periods whose length is distributed as a power law, the algorithm can split the time series into a set of stationary segments with the correct statistical properties. We also find that the estimated power-law exponent of the distribution of stationary-segment lengths is affected by (i) the minimum segment length and (ii) the ratio R≡σɛ/σx¯, where σx¯ is the standard deviation of the mean values of the segments and σɛ is the standard deviation of the fluctuations within a segment. Furthermore, we determine that the performance of the algorithm is generally not affected by uncorrelated noise spikes or by weak long-range temporal correlations of the fluctuations within segments.
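
    A much-simplified sketch of this style of recursive segmentation is shown below (the split criterion here is a fixed t-statistic threshold rather than the significance test of the original algorithm, and the minimum segment length and threshold are assumed parameters):

      import numpy as np

      def t_statistic(left, right):
          """Student-like t statistic for the difference between the two segment means."""
          n1, n2 = len(left), len(right)
          sp = np.sqrt(((n1 - 1) * left.std(ddof=1) ** 2 +
                        (n2 - 1) * right.std(ddof=1) ** 2) / (n1 + n2 - 2))
          return abs(left.mean() - right.mean()) / (sp * np.sqrt(1 / n1 + 1 / n2))

      def segment(x, t_min=5.0, min_len=20):
          """Recursively split x at the point of maximum t while t exceeds t_min."""
          if len(x) <= 2 * min_len:
              return [x]
          ts = [t_statistic(x[:i], x[i:]) for i in range(min_len, len(x) - min_len)]
          i_best = int(np.argmax(ts)) + min_len
          if ts[i_best - min_len] < t_min:
              return [x]
          return segment(x[:i_best], t_min, min_len) + segment(x[i_best:], t_min, min_len)

      x = np.concatenate([np.random.normal(0, 1, 300), np.random.normal(3, 1, 500)])
      print([len(s) for s in segment(x)])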

  13. Coming Alive From Nine to Five: The Career Search Method.

    ERIC Educational Resources Information Center

    Michelozzi, Betty Neville; Michelozzi, Peter J.

    The concept of career search is envisioned as a means to personal growth rather than merely a choice of occupations. A career search program is described which sets up a basic foundation of self-assessment, exploring personal values, interests, and skills. Four inventories are described which can help individuals develop self-descriptions in terms…

  14. Efficient GRASP based heuristics for the capacitated continuous location-allocation problem

    NASA Astrophysics Data System (ADS)

    Luis, Martino; Ramli, Mohammad Fadzli; Saputra, Ruswiati Surya

    2015-05-01

    This paper explores the NP-hard capacitated continuous location-allocation problem, where the number of facilities to be located is specified and each facility has a constant capacity. Efficient greedy randomised adaptive search procedure (GRASP) based heuristics are proposed to tackle the problem. A scheme that applies the furthest distance rule (FDR) and self-adjusted threshold parameters to generate initial facility locations that are situated sparsely within the GRASP framework is also put forward. The construction of the restricted candidate list (RCL) within GRASP is also guided by applying a concept of restricted regions that prevents new facility locations from being sited too close to the previously selected facility locations. The performance of the proposed GRASP heuristics is evaluated by conducting experiments using data sets taken from the literature typically used for the uncapacitated continuous location-allocation problem. The preliminary computational experiments show that the proposed methods provide encouraging solutions when compared to recently published papers. Some future research avenues on the subject are also briefly highlighted.
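
    The overall GRASP structure behind such heuristics can be sketched generically (a small Python toy for a p-facility location problem in the plane; for brevity the candidate sites are restricted to the demand points themselves, capacities are ignored, and the furthest-distance rule and restricted regions of the paper are not reproduced):

      import random
      import numpy as np

      def total_distance(points, facilities):
          d = np.linalg.norm(points[:, None, :] - facilities[None, :, :], axis=2)
          return d.min(axis=1).sum()               # each point served by its nearest facility

      def grasp_location(points, p=3, alpha=0.3, iterations=50, seed=0):
          rng = random.Random(seed)
          best, best_cost = None, float("inf")
          for _ in range(iterations):
              chosen = []                          # greedy randomised construction
              while len(chosen) < p:
                  scores = sorted((total_distance(points, points[chosen + [i]]), i)
                                  for i in range(len(points)) if i not in chosen)
                  cutoff = scores[0][0] + alpha * (scores[-1][0] - scores[0][0])
                  rcl = [i for c, i in scores if c <= cutoff]   # restricted candidate list
                  chosen.append(rng.choice(rcl))
              improved = True                      # local search: swap open/closed sites
              while improved:
                  improved = False
                  for j in range(p):
                      for i in range(len(points)):
                          if i in chosen:
                              continue
                          trial = chosen[:j] + [i] + chosen[j + 1:]
                          if total_distance(points, points[trial]) < total_distance(points, points[chosen]):
                              chosen, improved = trial, True
              cost = total_distance(points, points[chosen])
              if cost < best_cost:
                  best, best_cost = chosen[:], cost
          return best, best_cost

      pts = np.random.default_rng(1).random((40, 2))
      print(grasp_location(pts, p=3))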

  15. A Heuristic Approach to Scheduling University Timetables.

    ERIC Educational Resources Information Center

    Loo, E. H.; And Others

    1986-01-01

    Categories of facilities utilization and scheduling requirements to be considered when using a heuristic approach to timetabling are described together with a nine-step algorithm and the computerized timetabling system, Timetable Schedules System (TTS); TTS utilizes a heuristic approach. An example demonstrating the use of TTS and a program flowchart…

  16. Heuristics in an Educational System Design.

    ERIC Educational Resources Information Center

    Imbrogno, Salvatore

    Using a heuristic system design is the most cost effective means for confronting unstructured policy problems that require action in cases in which there is a limited empirical data base or a diversity of opinion concerning preferred ends and feasible means. This paper discusses heuristic principles that can serve as guides to problem solving and…

  17. "A Heuristic for Visual Thinking in History"

    ERIC Educational Resources Information Center

    Staley, David J.

    2007-01-01

    This article details a heuristic history teachers can use in assigning and evaluating multimedia projects in history. Using this heuristic successfully requires more than simply following the steps in a list or the stages in a recipe: in many ways, it requires a reorientation in what it means to think like an historian. This article, as much as…

  18. A Multi-Start Evolutionary Local Search for the Two-Echelon Location Routing Problem

    NASA Astrophysics Data System (ADS)

    Nguyen, Viet-Phuong; Prins, Christian; Prodhon, Caroline

    This paper presents a new hybrid metaheuristic between a greedy randomized adaptive search procedure (GRASP) and an evolutionary/iterated local search (ELS/ILS), using a tabu list to solve the two-echelon location routing problem (LRP-2E). The GRASP uses, in turn, three constructive heuristics followed by local search to generate the initial solutions. Starting from a GRASP solution, an intensification strategy is carried out by a dynamic alternation between ELS and ILS. In this phase, each child is obtained by mutation and evaluated through a splitting procedure of the giant tour followed by a local search. The tabu list, defined by two characteristics of a solution (total cost and number of trips), is used to avoid searching a space already explored. The results show that our metaheuristic clearly outperforms all previously published methods on LRP-2E benchmark instances. Furthermore, it is competitive with the best meta-heuristic published for the single-echelon LRP.

  19. Heuristic Evaluation on Mobile Interfaces: A New Checklist

    PubMed Central

    Yáñez Gómez, Rosa; Cascado Caballero, Daniel; Sevillano, José-Luis

    2014-01-01

    The rapid evolution and adoption of mobile devices raise new usability challenges, given their limitations (in screen size, battery life, etc.) as well as the specific requirements of this new interaction. Traditional evaluation techniques need to be adapted in order for these requirements to be met. Heuristic evaluation (HE), an Inspection Method based on evaluation conducted by experts over a real system or prototype, is based on checklists which are desktop-centred and do not adequately detect mobile-specific usability issues. In this paper, we propose a compilation of heuristic evaluation checklists taken from the existing bibliography but readapted to new mobile interfaces. Selecting and rearranging these heuristic guidelines offer a tool which works well not just for evaluation but also as a best-practices checklist. The result is a comprehensive checklist which is experimentally evaluated as a design tool. This experimental evaluation involved two software engineers without any specific knowledge about usability, a group of ten users who compared the usability of a first prototype designed without our heuristics, and a second one after applying the proposed checklist. The results of this experiment show the usefulness of the proposed checklist for avoiding usability gaps even with nontrained developers. PMID:25295300

  20. Heuristic Techniques Application In A 3-D Space

    NASA Astrophysics Data System (ADS)

    Mazouz, A. Kader

    1989-02-01

    This paper discusses the application of a heuristic technique to stack regularly and irregularly shaped objects in the same container or on the same pallet. The computer representation of any object is based on the recursive octree method, where each unit volume element is a voxel. Then, the choice of the space occupied by an object of any shape within the volume is made through the heuristic approach. The heuristic technique developed is an evaluation function that compares all the available spaces based on weighting factors and threshold levels. The parameters used are shape, space available, contents of the object, and dimensions. The goal is to choose the most feasible available space every time an object is ready to be stacked. The heuristic algorithm is implemented within a knowledge based system to control a flexible material handling cell. Generally, the cell comprises a material handling robot, a conveyance system that brings the objects, distributed randomly, to the cell, a vision system to identify the objects and verify the stacking procedure, and a computer to control and initiate the decision-making process to stack objects of all shapes in the same volume.

  1. Searching for Truth: Internet Search Patterns as a Method of Investigating Online Responses to a Russian Illicit Drug Policy Debate

    PubMed Central

    Gillespie, James A; Quinn, Casey

    2012-01-01

    Background This is a methodological study investigating the online responses to a national debate over an important health and social problem in Russia. Russia is the largest Internet market in Europe, exceeding Germany in the absolute number of users. However, Russia is unusual in that the main search provider is not Google, but Yandex. Objective This study had two main objectives. First, to validate Yandex search patterns against those provided by Google, and second, to test this method's adequacy for investigating online interest in a 2010 national debate over Russian illicit drug policy. We hoped to learn what search patterns and specific search terms could reveal about the relative importance and geographic distribution of interest in this debate. Methods A national drug debate, centering on the anti-drug campaigner Egor Bychkov, was one of the main Russian domestic news events of 2010. Public interest in this episode was accompanied by increased Internet search. First, we measured the search patterns for 13 search terms related to the Bychkov episode and concurrent domestic events by extracting data from Google Insights for Search (GIFS) and Yandex WordStat (YaW). We conducted Spearman Rank Correlation of GIFS and YaW search data series. Second, we coded all 420 primary posts from Bychkov's personal blog between March 2010 and March 2012 to identify the main themes. Third, we compared GIFS and Yandex policies concerning the public release of search volume data. Finally, we established the relationship between salient drug issues and the Bychkov episode. Results We found a consistent pattern of strong to moderate positive correlations between Google and Yandex for the terms "Egor Bychkov" (r s = 0.88, P < .001), “Bychkov” (r s = .78, P < .001) and “Khimki”(r s = 0.92, P < .001). Peak search volumes for the Bychkov episode were comparable to other prominent domestic political events during 2010. Monthly search counts were 146,689 for “Bychkov” and
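
    The rank-correlation step used to validate Yandex against Google can be reproduced in a few lines (a sketch with made-up weekly search-volume series, not the study's data; scipy.stats.spearmanr returns the coefficient and its p-value):

      from scipy.stats import spearmanr

      # hypothetical weekly relative search volumes for the same query
      google = [12, 30, 95, 60, 22, 18, 15, 11, 9, 8]
      yandex = [10, 33, 88, 71, 25, 14, 17, 10, 11, 7]

      rho, p_value = spearmanr(google, yandex)
      print(f"r_s = {rho:.2f}, P = {p_value:.4f}")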

  2. General heuristics algorithms for solving capacitated arc routing problem

    NASA Astrophysics Data System (ADS)

    Fadzli, Mohammad; Najwa, Nurul; Masran, Hafiz

    2015-05-01

    In this paper, we try to determine a near-optimum solution for the capacitated arc routing problem (CARP). The NP-hard CARP is a special graph-theoretic problem that arises from street services such as residential waste collection and road maintenance. The purpose of the CARP model and its solution techniques is to find an optimum (or near-optimum) routing cost for a fleet of vehicles involved in the operation. In other words, finding minimum-cost routing is essential in order to reduce the overall operation cost associated with the vehicles. In this article, we provide a combination of various heuristic algorithms to solve a real case of CARP in waste collection as well as benchmark instances. These heuristics work as a central engine in finding initial or near-optimum solutions in the search space without violating the preset constraints. The results clearly show that these heuristic algorithms can provide good initial solutions in both real-life and benchmark instances.

  3. Differential Search Algorithm Based Edge Detection

    NASA Astrophysics Data System (ADS)

    Gunen, M. A.; Civicioglu, P.; Beşdok, E.

    2016-06-01

    In this paper, a new method has been presented for the extraction of edge information by using the Differential Search Optimization Algorithm. The proposed method is based on using a new heuristic image thresholding method for edge detection. The success of the proposed method has been examined on the fusion of two remotely sensed images. The applicability of the proposed method to edge detection and image fusion problems has been analysed in detail, and the empirical results showed that the proposed method is useful for solving the mentioned problems.

  4. Heuristic Approach to the Schwarzschild Geometry

    NASA Astrophysics Data System (ADS)

    Visser, Matt

    In this article I present a simple Newtonian heuristic for motivating a weak-field approximation for the spacetime geometry of a point particle. The heuristic is based on Newtonian gravity, the notion of local inertial frames (the Einstein equivalence principle), plus the use of Galilean coordinate transformations to connect the freely falling local inertial frames back to the "fixed stars." Because of the heuristic and quasi-Newtonian manner in which the specific choice of spacetime geometry is motivated, we are at best justified in expecting it to be a weak-field approximation to the true spacetime geometry. However, in the case of a spherically symmetric point mass the result is coincidentally an exact solution of the full vacuum Einstein field equations — it is the Schwarzschild geometry in Painlevé-Gullstrand coordinates. This result is much stronger than the well-known result of Michell and Laplace whereby a Newtonian argument correctly estimates the value of the Schwarzschild radius — using the heuristic presented in this article one obtains the entire Schwarzschild geometry. The heuristic also gives sensible results — a Riemann flat geometry — when applied to a constant gravitational field. Furthermore, a subtle extension of the heuristic correctly reproduces the Reissner-Nordström geometry and even the de Sitter geometry. Unfortunately the heuristic construction is not truly generic. For instance, it is incapable of generating the Kerr geometry or anti-de Sitter space. Despite this limitation, the heuristic does have useful pedagogical value in that it provides a simple and direct plausibility argument (not a derivation) for the Schwarzschild geometry — suitable for classroom use in situations where the full power and technical machinery of general relativity might be inappropriate. The extended heuristic provides more challenging problems — suitable for use at the graduate level.
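
    For reference, the line element that the heuristic leads to (flat spatial slices falling radially at the Newtonian escape speed) is the Schwarzschild geometry in Painlevé-Gullstrand coordinates; in units with G = c = 1,

      ds^{2} = -dt^{2} + \left( dr + \sqrt{\frac{2M}{r}}\, dt \right)^{2}
               + r^{2} \left( d\theta^{2} + \sin^{2}\theta \, d\phi^{2} \right).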

  5. Deterministic algorithm with agglomerative heuristic for location problems

    NASA Astrophysics Data System (ADS)

    Kazakovtsev, L.; Stupina, A.

    2015-10-01

    The authors consider the clustering problem solved with the k-means method and the p-median problem with various distance metrics. The p-median problem, and the k-means problem as its special case, are among the most popular models of location theory. They are used for solving clustering problems and many practically important logistic problems such as optimal factory or warehouse location, siting of oil or gas wells, optimal offshore drilling for oil, and placement of steam generators in heavy oil fields. The authors propose a new deterministic heuristic algorithm based on ideas of Information Bottleneck Clustering and genetic algorithms with a greedy heuristic. In this paper, results of running the new algorithm on various data sets are given in comparison with known deterministic and stochastic methods. The new algorithm is shown to be significantly faster than the Information Bottleneck Clustering method while achieving comparable precision.
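
    As background for the comparison above, the basic k-means (Lloyd) iteration that such heuristics build on can be written in a few lines (a plain NumPy sketch with random initial centres, not the authors' Information-Bottleneck-seeded deterministic algorithm):

      import numpy as np

      def k_means(X, k, iters=100, seed=0):
          rng = np.random.default_rng(seed)
          centres = X[rng.choice(len(X), size=k, replace=False)]
          for _ in range(iters):
              # assignment step: label each point with its nearest centre
              labels = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2).argmin(axis=1)
              # update step: move each centre to the mean of its assigned points
              new_centres = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                                      else centres[j] for j in range(k)])
              if np.allclose(new_centres, centres):
                  break
              centres = new_centres
          return centres, labels

      X = np.random.default_rng(1).random((200, 2))
      centres, labels = k_means(X, k=3)
      print(centres)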

  6. Analysis Methods for the DRIFT Dark Matter Search

    NASA Astrophysics Data System (ADS)

    Ayad, R.; Hyatt, M.; Hanson-Hart, Z.; Katz-Hyman, M.; Maher, P.; Posner, A.; Martoff, C. J.; Kirkpatrick, J.; Snowden-Ifft, D. P.; Lawson, T. B.; Lightfoot, P. K.; Morgan, B.; Paling, S. M.; Roberts, J. W.; Robinson, M.; Spooner, N. J. C.

    2003-04-01

    The DRIFT Experiment [1] is an underground search for WIMP Dark Matter using a novel detector invented for this purpose: the Negative Ion TPC (NITPC). Data is collected in the form of digitized time-records of signals received on each active anode wire of the NITPC endcap. Analysis procedures developed to characterize this data and discriminate backgrounds (x-rays, gamma rays, alpha particles) from potential Dark Matter signals (simulated with neutron elastic scattering) will be discussed. [1] Low Pressure Negative Ion TPC for Dark Matter Search. D. P. Snowden-Ifft, C. J. Martoff, J. M. Burwell, Phys Rev. D. Rapid Comm. 61, 101301 (2000)

  7. Convergence of the standard RLS method and UDUT factorisation of covariance matrix for solving the algebraic Riccati equation of the DLQR via heuristic approximate dynamic programming

    NASA Astrophysics Data System (ADS)

    Moraes Rêgo, Patrícia Helena; Viana da Fonseca Neto, João; Ferreira, Ernesto M.

    2015-08-01

    The main focus of this article is to present a proposal to solve, via UDUT factorisation, the convergence and numerical stability problems that are related to the covariance matrix ill-conditioning of the recursive least squares (RLS) approach for online approximations of the algebraic Riccati equation (ARE) solution associated with the discrete linear quadratic regulator (DLQR) problem formulated in the actor-critic reinforcement learning and approximate dynamic programming context. The parameterisations of the Bellman equation, utility function and dynamic system as well as the algebra of Kronecker product assemble a framework for the solution of the DLQR problem. The condition number and the positivity parameter of the covariance matrix are associated with statistical metrics for evaluating the approximation performance of the ARE solution via RLS-based estimators. The performance of RLS approximators is also evaluated in terms of consistence and polarisation when associated with reinforcement learning methods. The used methodology contemplates realisations of online designs for DLQR controllers that is evaluated in a multivariable dynamic system model.
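
    For context, the discrete-time algebraic Riccati equation of the DLQR problem that the RLS/UDU^T scheme approximates online is, in standard notation (A and B are the system matrices, Q and R the state and control weights),

      P = A^{\top} P A - A^{\top} P B \left( R + B^{\top} P B \right)^{-1} B^{\top} P A + Q,
      \qquad
      K = \left( R + B^{\top} P B \right)^{-1} B^{\top} P A,

    with the optimal feedback law u_k = -K x_k.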

  8. A set-covering based heuristic algorithm for the periodic vehicle routing problem.

    PubMed

    Cacchiani, V; Hemmelmayr, V C; Tricoire, F

    2014-01-30

    We present a hybrid optimization algorithm for mixed-integer linear programming, embedding both heuristic and exact components. In order to validate it we use the periodic vehicle routing problem (PVRP) as a case study. This problem consists of determining a set of minimum cost routes for each day of a given planning horizon, with the constraints that each customer must be visited a required number of times (chosen among a set of valid day combinations), must receive every time the required quantity of product, and that the number of routes per day (each respecting the capacity of the vehicle) does not exceed the total number of available vehicles. This is a generalization of the well-known vehicle routing problem (VRP). Our algorithm is based on the linear programming (LP) relaxation of a set-covering-like integer linear programming formulation of the problem, with additional constraints. The LP-relaxation is solved by column generation, where columns are generated heuristically by an iterated local search algorithm. The whole solution method takes advantage of the LP-solution and applies techniques of fixing and releasing of the columns as a local search, making use of a tabu list to avoid cycling. We show the results of the proposed algorithm on benchmark instances from the literature and compare them to the state-of-the-art algorithms, showing the effectiveness of our approach in producing good quality solutions. In addition, we report the results on realistic instances of the PVRP introduced in Pacheco et al. (2011)  [24] and on benchmark instances of the periodic traveling salesman problem (PTSP), showing the efficacy of the proposed algorithm on these as well. Finally, we report the new best known solutions found for all the tested problems. PMID:24748696

  9. Hybridization of decomposition and local search for multiobjective optimization.

    PubMed

    Ke, Liangjun; Zhang, Qingfu; Battiti, Roberto

    2014-10-01

    Combining ideas from evolutionary algorithms, decomposition approaches, and Pareto local search, this paper suggests a simple yet efficient memetic algorithm for combinatorial multiobjective optimization problems: memetic algorithm based on decomposition (MOMAD). It decomposes a combinatorial multiobjective problem into a number of single objective optimization problems using an aggregation method. MOMAD evolves three populations: 1) population P(L) for recording the current solution to each subproblem; 2) population P(P) for storing starting solutions for Pareto local search; and 3) an external population P(E) for maintaining all the nondominated solutions found so far during the search. A problem-specific single objective heuristic can be applied to these subproblems to initialize the three populations. At each generation, a Pareto local search method is first applied to search a neighborhood of each solution in P(P) to update P(L) and P(E). Then a single objective local search is applied to each perturbed solution in P(L) for improving P(L) and P(E), and reinitializing P(P). The procedure is repeated until a stopping condition is met. MOMAD provides a generic hybrid multiobjective algorithmic framework in which problem specific knowledge, well developed single objective local search and heuristics and Pareto local search methods can be hybridized. It is a population based iterative method and thus an anytime algorithm. Extensive experiments have been conducted in this paper to study MOMAD and compare it with some other state-of-the-art algorithms on the multiobjective traveling salesman problem and the multiobjective knapsack problem. The experimental results show that our proposed algorithm outperforms or performs similarly to the best so far heuristics on these two problems. PMID:25222724
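
    The decomposition step can be made concrete with a standard aggregation function. The weighted Tchebycheff form shown below is a common choice in decomposition-based algorithms and is given here only as an illustration (the abstract does not specify which aggregation MOMAD uses):

      \min_{x} \; g^{\mathrm{te}}(x \mid \lambda, z^{*}) \;=\; \max_{1 \le i \le m} \; \lambda_i \, \bigl| f_i(x) - z_i^{*} \bigr|,

    where λ is a weight vector, z* is the reference (ideal) point, and one such single-objective subproblem is created per weight vector.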

  10. Heuristic Modeling for TRMM Lifetime Predictions

    NASA Technical Reports Server (NTRS)

    Jordan, P. S.; Sharer, P. J.; DeFazio, R. L.

    1996-01-01

    Analysis time for computing the expected mission lifetimes of proposed frequently maneuvering, tightly altitude constrained, Earth orbiting spacecraft has been significantly reduced by means of a heuristic modeling method implemented in a commercial-off-the-shelf spreadsheet product (QuattroPro) running on a personal computer (PC). The method uses a look-up table to estimate the maneuver frequency per month as a function of the spacecraft ballistic coefficient and the solar flux index, then computes the associated fuel use by a simple engine model. Maneuver frequency data points are produced by means of a single 1-month run of traditional mission analysis software for each of the 12 to 25 data points required for the table. As the data point computations are required only at mission design start-up and on the occasion of significant mission redesigns, the dependence on time-consuming traditional modeling methods is dramatically reduced. Results to date have agreed with traditional methods to within 1 to 1.5 percent. The spreadsheet approach is applicable to a wide variety of Earth orbiting spacecraft with tight altitude constraints. It will be particularly useful to such missions as the Tropical Rainfall Measurement Mission scheduled for launch in 1997, whose mission lifetime calculations are heavily dependent on frequently revised solar flux predictions.
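
    The structure of such a spreadsheet model is easy to sketch in code (all table values, the nearest-entry lookup, and the fuel-per-maneuver figure below are hypothetical placeholders, not TRMM numbers):

      # maneuver frequency (maneuvers/month) indexed by ballistic coefficient and
      # solar flux index -- all values are hypothetical placeholders
      table = {
          (50, 100): 1.0, (50, 150): 2.5, (50, 200): 5.0,
          (100, 100): 0.6, (100, 150): 1.5, (100, 200): 3.0,
      }
      FUEL_PER_MANEUVER_KG = 0.8          # simple engine model (hypothetical)

      def monthly_fuel(ballistic_coeff, flux):
          """Look up the nearest table entry and convert it to fuel use per month."""
          key = min(table, key=lambda k: (k[0] - ballistic_coeff) ** 2 + (k[1] - flux) ** 2)
          return table[key] * FUEL_PER_MANEUVER_KG

      def lifetime_months(fuel_budget_kg, ballistic_coeff, flux_forecast):
          """Walk through a monthly solar-flux forecast until the fuel budget runs out."""
          months = 0
          for flux in flux_forecast:
              fuel_budget_kg -= monthly_fuel(ballistic_coeff, flux)
              if fuel_budget_kg < 0:
                  break
              months += 1
          return months

      print(lifetime_months(60.0, 50, [120, 150, 180, 200, 190, 160] * 20))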

  11. Mixed Integer Programming and Heuristic Scheduling for Space Communication

    NASA Technical Reports Server (NTRS)

    Lee, Charles H.; Cheung, Kar-Ming

    2013-01-01

    Optimal planning and scheduling for a communication network was created where the nodes within the network are communicating at the highest possible rates while meeting the mission requirements and operational constraints. The planning and scheduling problem was formulated in the framework of Mixed Integer Programming (MIP) to introduce a special penalty function to convert the MIP problem into a continuous optimization problem, and to solve the constrained optimization problem using heuristic optimization. The communication network consists of space and ground assets with the link dynamics between any two assets varying with respect to time, distance, and telecom configurations. One asset could be communicating with another at very high data rates at one time, and at other times, communication is impossible, as the asset could be inaccessible from the network due to planetary occultation. Based on the network's geometric dynamics and link capabilities, the start time, end time, and link configuration of each view period are selected to maximize the communication efficiency within the network. Mathematical formulations for the constrained mixed integer optimization problem were derived, and efficient analytical and numerical techniques were developed to find the optimal solution. By setting up the problem using MIP, the search space for the optimization problem is reduced significantly, thereby speeding up the solution process. The ratio of the dimension of the traditional method over the proposed formulation is approximately an order N (single) to 2*N (arraying), where N is the number of receiving antennas of a node. By introducing a special penalty function, the MIP problem with non-differentiable cost function and nonlinear constraints can be converted into a continuous variable problem, whose solution is possible.

  12. A hybrid cuckoo search algorithm with Nelder Mead method for solving global optimization problems.

    PubMed

    Ali, Ahmed F; Tawhid, Mohamed A

    2016-01-01

    The cuckoo search algorithm is a promising population-based metaheuristic. It has been applied to solve many real-life problems. In this paper, we propose a new cuckoo search algorithm by combining the cuckoo search algorithm with the Nelder-Mead method in order to solve integer and minimax optimization problems. We call the proposed algorithm the hybrid cuckoo search and Nelder-Mead method (HCSNM). HCSNM starts the search by applying the standard cuckoo search for a number of iterations; the best obtained solution is then passed to the Nelder-Mead algorithm as an intensification process in order to accelerate the search and overcome the slow convergence of the standard cuckoo search algorithm. The proposed algorithm balances the global exploration of the cuckoo search algorithm with the deep exploitation of the Nelder-Mead method. We test the HCSNM algorithm on seven integer programming problems and ten minimax problems and compare it against eight algorithms for solving integer programming problems and seven algorithms for solving minimax problems. The experimental results show the efficiency of the proposed algorithm and its ability to solve integer and minimax optimization problems in reasonable time. PMID:27217988
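
    The two-phase structure of such a hybrid can be sketched with SciPy (a simplified stand-in: the global phase below uses heavy-tailed random perturbations in place of a full cuckoo/Lévy-flight search, and the test function, bounds and counts are arbitrary):

      import numpy as np
      from scipy.optimize import minimize

      def objective(x):                      # Rastrigin-style multimodal test function
          return np.sum(x ** 2) + 10 * np.sum(1 - np.cos(2 * np.pi * x))

      rng = np.random.default_rng(0)
      dim, n_nests, n_iter = 5, 15, 200

      # phase 1: crude global exploration (stand-in for the cuckoo search phase)
      nests = rng.uniform(-5, 5, size=(n_nests, dim))
      for _ in range(n_iter):
          i = rng.integers(n_nests)
          candidate = nests[i] + 0.1 * rng.standard_cauchy(dim)   # heavy-tailed step
          j = rng.integers(n_nests)
          if objective(candidate) < objective(nests[j]):
              nests[j] = candidate
      best = min(nests, key=objective)

      # phase 2: local intensification with Nelder-Mead from the best solution found
      result = minimize(objective, best, method="Nelder-Mead")
      print(result.x, result.fun)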

  13. Heuristic approach to deriving models for gene finding.

    PubMed

    Besemer, J; Borodovsky, M

    1999-10-01

    Computer methods of accurate gene finding in DNA sequences require models of protein coding and non-coding regions derived either from experimentally validated training sets or from large amounts of anonymous DNA sequence. Here we propose a new, heuristic method producing fairly accurate inhomogeneous Markov models of protein coding regions. The new method needs such a small amount of DNA sequence data that the model can be built 'on the fly' by a web server for any DNA sequence >400 nt. Tests on 10 complete bacterial genomes performed with the GeneMark.hmm program demonstrated the ability of the new models to detect 93.1% of annotated genes on average, while models built by traditional training predict an average of 93.9% of genes. Models built by the heuristic approach could be used to find genes in small fragments of anonymous prokaryotic genomes and in genomes of organelles, viruses, phages and plasmids, as well as in highly inhomogeneous genomes where adjustment of models to local DNA composition is needed. The heuristic method also gives an insight into the mechanism of codon usage pattern evolution. PMID:10481031

  14. On the predictability of protein database search complexity and its relevance to optimization of distributed searches.

    PubMed

    Deciu, Cosmin; Sun, Jun; Wall, Mark A

    2007-09-01

    We discuss several aspects related to load balancing of database search jobs in a distributed computing environment, such as Linux cluster. Load balancing is a technique for making the most of multiple computational resources, which is particularly relevant in environments in which the usage of such resources is very high. The particular case of the Sequest program is considered here, but the general methodology should apply to any similar database search program. We show how the runtimes for Sequest searches of tandem mass spectral data can be predicted from profiles of previous representative searches, and how this information can be used for better load balancing of novel data. A well-known heuristic load balancing method is shown to be applicable to this problem, and its performance is analyzed for a variety of search parameters. PMID:17663575
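
    The abstract does not name the heuristic, but longest-processing-time-first (LPT) greedy list scheduling is a classic example of this kind of load balancing and captures the idea (the predicted runtimes below are hypothetical, not profiled Sequest data):

      import heapq

      def lpt_schedule(predicted_runtimes, n_nodes):
          """Assign jobs longest-first, always to the currently least-loaded node."""
          heap = [(0.0, node, []) for node in range(n_nodes)]     # (load, node id, job list)
          heapq.heapify(heap)
          for job, runtime in sorted(enumerate(predicted_runtimes),
                                     key=lambda jr: jr[1], reverse=True):
              load, node, jobs = heapq.heappop(heap)
              heapq.heappush(heap, (load + runtime, node, jobs + [job]))
          return sorted(heap, key=lambda entry: entry[1])

      runtimes = [42, 7, 19, 88, 5, 33, 61, 12, 25, 9]   # hypothetical predicted minutes
      for load, node, jobs in lpt_schedule(runtimes, n_nodes=3):
          print(f"node {node}: load {load} min, jobs {jobs}")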

  15. Optimization of pressurized water reactor shuffling by simulated annealing with heuristics

    SciTech Connect

    Stevens, J.G.; Smith, K.S.; Rempe, K.R.; Downar, T.J.

    1995-09-01

    Simulated-annealing optimization of reactor core loading patterns is implemented with support for design heuristics during candidate pattern generation. The SIMAN optimization module uses the advanced nodal method of SIMULATE-3 and the full cross-section detail of CASMO-3 to evaluate accurately the neutronic performance of each candidate, resulting in high-quality patterns. The use of heuristics within simulated annealing is explored. Heuristics improve the consistency of optimization results for both fast- and slow-annealing runs with no penalty from the exclusion of unusual candidates. Thus, the heuristic application of designer judgment during automated pattern generation is shown to be effective. The capability of the SIMAN module to find and evaluate families of loading patterns that satisfy design constraints and have good objective performance within practical run times is demonstrated. The use of automated evaluations of successive cycles to explore multicycle effects of design decisions is discussed.
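
    The way design heuristics plug into an annealing loop can be shown generically (a minimal sketch: the candidate generator, the heuristic screen and the toy objective are placeholders, not the SIMAN/SIMULATE-3/CASMO-3 machinery used for loading-pattern evaluation):

      import math
      import random

      def simulated_annealing(initial, objective, neighbour, passes_heuristics,
                              t0=1.0, cooling=0.995, steps=5000, seed=0):
          rng = random.Random(seed)
          current = best = initial
          t = t0
          for _ in range(steps):
              candidate = neighbour(current, rng)
              if not passes_heuristics(candidate):
                  continue                        # designer heuristics veto unusual candidates
              delta = objective(candidate) - objective(current)
              if delta <= 0 or rng.random() < math.exp(-delta / t):
                  current = candidate
                  if objective(current) < objective(best):
                      best = current
              t *= cooling
          return best

      # toy usage: minimise a 1-D function, with a "heuristic" restricting the range
      result = simulated_annealing(
          initial=5.0,
          objective=lambda x: (x - 2) ** 2 + math.sin(5 * x),
          neighbour=lambda x, rng: x + rng.uniform(-0.5, 0.5),
          passes_heuristics=lambda x: -10 <= x <= 10,
      )
      print(result)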

  16. Social welfare as small-scale help: evolutionary psychology and the deservingness heuristic.

    PubMed

    Petersen, Michael Bang

    2012-01-01

    Public opinion concerning social welfare is largely driven by perceptions of recipient deservingness. Extant research has argued that this heuristic is learned from a variety of cultural, institutional, and ideological sources. The present article provides evidence supporting a different view: that the deservingness heuristic is rooted in psychological categories that evolved over the course of human evolution to regulate small-scale exchanges of help. To test predictions made on the basis of this view, a method designed to measure social categorization is embedded in nationally representative surveys conducted in different countries. Across the national- and individual-level differences that extant research has used to explain the heuristic, people categorize welfare recipients on the basis of whether they are lazy or unlucky. This mode of categorization furthermore induces people to think about large-scale welfare politics as its presumed ancestral equivalent: small-scale help giving. The general implications for research on heuristics are discussed. PMID:22375300

  17. A spectral KRMI conjugate gradient method under the strong-Wolfe line search

    NASA Astrophysics Data System (ADS)

    Khadijah, Wan; Rivaie, Mohd.; Mamat, Mustafa; Jusoh, Ibrahim

    2016-06-01

    In this paper, a modification of the spectral conjugate gradient (CG) method is proposed which combines the advantages of the spectral CG method and the RMIL method, named spectral Khadijah-Rivaie-Mustafa-Ibrahim (SKRMI), to solve unconstrained optimization problems. Under inexact line searches, the proposed method generates a sufficient descent direction, and its global convergence property has been proved. Moreover, the method reduces to the standard RMIL method if an exact line search is applied. Numerical results are also presented to examine the efficiency of the proposed method.
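
    For orientation, the generic spectral conjugate gradient iteration has the form below; the RMIL coefficient is shown as an example of the beta term being built upon, while the exact SKRMI choices of the spectral parameter and beta are those defined in the paper:

      x_{k+1} = x_k + \alpha_k d_k, \qquad
      d_k = -\theta_k g_k + \beta_k d_{k-1}, \qquad
      \beta_k^{\mathrm{RMIL}} = \frac{g_k^{\top} \left( g_k - g_{k-1} \right)}{\lVert d_{k-1} \rVert^{2}},

    where g_k is the gradient at x_k and the step length alpha_k satisfies the strong Wolfe conditions.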

  18. Portfolios in Stochastic Local Search: Efficiently Computing Most Probable Explanations in Bayesian Networks

    NASA Technical Reports Server (NTRS)

    Mengshoel, Ole J.; Roth, Dan; Wilkins, David C.

    2001-01-01

    Portfolio methods support the combination of different algorithms and heuristics, including stochastic local search (SLS) heuristics, and have been identified as a promising approach to solve computationally hard problems. While successful in experiments, theoretical foundations and analytical results for portfolio-based SLS heuristics are less developed. This article aims to improve the understanding of the role of portfolios of heuristics in SLS. We emphasize the problem of computing most probable explanations (MPEs) in Bayesian networks (BNs). Algorithmically, we discuss a portfolio-based SLS algorithm for MPE computation, Stochastic Greedy Search (SGS). SGS supports the integration of different initialization operators (or initialization heuristics) and different search operators (greedy and noisy heuristics), thereby enabling new analytical and experimental results. Analytically, we introduce a novel Markov chain model tailored to portfolio-based SLS algorithms including SGS, thereby enabling us to analytically form expected hitting time results that explain empirical run time results. For a specific BN, we show the benefit of using a homogenous initialization portfolio. To further illustrate the portfolio approach, we consider novel additive search heuristics for handling determinism in the form of zero entries in conditional probability tables in BNs. Our additive approach adds rather than multiplies probabilities when computing the utility of an explanation. We motivate the additive measure by studying the dramatic impact of zero entries in conditional probability tables on the number of zero-probability explanations, which again complicates the search process. We consider the relationship between MAXSAT and MPE, and show that additive utility (or gain) is a generalization, to the probabilistic setting, of MAXSAT utility (or gain) used in the celebrated GSAT and WalkSAT algorithms and their descendants. Utilizing our Markov chain framework, we show that

  19. The Saccharomyces Genome Database: Advanced Searching Methods and Data Mining.

    PubMed

    Cherry, J Michael

    2015-12-01

    At the core of the Saccharomyces Genome Database (SGD) are chromosomal features that encode a product. These include protein-coding genes and major noncoding RNA genes, such as tRNA and rRNA genes. The basic entry point into SGD is a gene or open-reading frame name that leads directly to the locus summary information page. A keyword describing function, phenotype, selective condition, or text from abstracts will also provide a door into the SGD. A DNA or protein sequence can be used to identify a gene or a chromosomal region using BLAST. Protein and DNA sequence identifiers, PubMed and NCBI IDs, author names, and function terms are also valid entry points. The information in SGD has been gathered and is maintained by a group of scientific biocurators and software developers who are devoted to providing researchers with up-to-date information from the published literature, connections to all the major research resources, and tools that allow the data to be explored. All the collected information cannot be represented or summarized for every possible question; therefore, it is necessary to be able to search the structured data in the database. This protocol describes the YeastMine tool, which provides an advanced search capability via an interactive tool. The SGD also archives results from microarray expression experiments, and a strategy designed to explore these data using the SPELL (Serial Pattern of Expression Levels Locator) tool is provided. PMID:26631124

  20. Scheduling constrained tools using heuristic techniques

    NASA Astrophysics Data System (ADS)

    Maram, Venkataramana; Rahman, Syariza Abdul; Maram, Sandhya Rani

    2015-12-01

    One of the main challenges in current manufacturing production planning is to provide schedules of operations that maximize resource utilization and yield the highest overall productivity. This is achieved by scheduling available resources to activities. There can be many different real-time scenarios with different combinations of input resources to produce parts. In this paper, the problem is simplified to a single machine with individual process times and due dates to represent the real-world scheduling problem. The main objective is to minimize the total tardiness, or late jobs. The nearest-greedy method of the assignment problem algorithm is used to find the initial solution, followed by the Simulated Annealing (SA) algorithm for the improvement part. Simulated Annealing is one of the meta-heuristic techniques for solving combinatorial optimization problems. The general-purpose Microsoft Visual C++ is used to develop the algorithm for finding the best solution. The proposed hybrid approach is able to generate the best schedule at the 7th iteration and the optimal schedule at the 170th iteration, with tardiness of 8 and 7 hours, respectively.

  1. Bflinks: Reliable Bugfix Links via Bidirectional References and Tuned Heuristics

    PubMed Central

    2014-01-01

    Background. Data from software version archives and defect databases can be used for defect insertion circumstance analysis and defect prediction. The first step in such analyses is identifying defect-correcting changes in the version archive (bugfix commits) and enriching them with additional metadata by establishing bugfix links to corresponding entries in the defect database. Candidate bugfix commits are typically identified via heuristic string matching on the commit message. Research Questions. Which filters could be used to obtain a set of bugfix links? How to tune their parameters? What accuracy is achieved? Method. We analyze a modular set of seven independent filters, including new ones that make use of reverse links, and evaluate visual heuristics for setting cutoff parameters. For a commercial repository, a product expert manually verifies over 2500 links to validate the results with unprecedented accuracy. Results. The heuristics pick a very good parameter value for five filters and a reasonably good one for the sixth. The combined filtering, called bflinks, provides 93% precision and only 7% results loss. Conclusion. Bflinks can provide high-quality results and adapts to repositories with different properties. PMID:27433506
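
    The candidate-identification step (heuristic string matching on commit messages) can be illustrated with a short regular-expression filter; the patterns and commit messages below are hypothetical examples, not the seven filters evaluated in the paper:

      import re

      commits = [                                   # hypothetical commit messages
          (101, "Fix NPE in parser, closes #4711"),
          (102, "Refactor build scripts"),
          (103, "bugfix: wrong rounding in invoice totals (bug 4720)"),
          (104, "Merge branch 'feature/export'"),
      ]

      ISSUE_REF = re.compile(r"(?:#|bug\s*)(\d+)", re.IGNORECASE)
      FIX_WORDS = re.compile(r"\b(fix(es|ed)?|bugfix|closes?)\b", re.IGNORECASE)

      def candidate_bugfix_links(commits):
          """Return (commit id, defect id or None) for commits that look like bugfixes."""
          links = []
          for commit_id, message in commits:
              if FIX_WORDS.search(message):
                  m = ISSUE_REF.search(message)
                  links.append((commit_id, int(m.group(1)) if m else None))
          return links

      print(candidate_bugfix_links(commits))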

  2. Earthdata Search: Methods for Improving Data Discovery, Visualization, and Access

    NASA Astrophysics Data System (ADS)

    Quinn, P.; Pilone, D.; Crouch, M.; Siarto, J.; Sun, B.

    2015-12-01

    In a landscape of heterogeneous data from diverse sources and disciplines, producing useful tools poses a significant challenge. NASA's Earthdata Search application tackles this challenge, enabling discovery and inter-comparison of data across the wide array of scientific disciplines that use NASA Earth observation data. During this talk, we will give a brief overview of the application, and then share our approach for understanding and satisfying the needs of users from several disparate scientific communities. Our approach involves:
    - Gathering fine-grained metrics to understand user behavior
    - Using metrics to quantify user success
    - Combining metrics, feedback, and user research to understand user needs
    - Applying professional design toward addressing user needs
    - Using metrics and A/B testing to evaluate the viability of changes
    - Providing enhanced features for services to promote adoption
    - Encouraging good metadata quality and soliciting feedback for metadata issues
    - Open sourcing the application and its components to allow it to serve more users

  3. Planning collision free paths for two cooperating robots using a divide-and-conquer C-space traversal heuristic

    NASA Technical Reports Server (NTRS)

    Weaver, Johnathan M.

    1993-01-01

    A method was developed to plan feasible and obstacle-avoiding paths for two spatial robots working cooperatively in a known static environment. Cooperating spatial robots, as referred to herein, are robots which work in 6D task space while simultaneously grasping and manipulating a common, rigid payload. The approach is configuration space (c-space) based and performs selective rather than exhaustive c-space mapping. No expensive precomputations are required. A novel, divide-and-conquer type of heuristic is used to guide the selective mapping process. The heuristic does not involve any robot, environment, or task specific assumptions. A technique was also developed which enables solution of the cooperating redundant robot path planning problem without requiring the use of inverse kinematics for a redundant robot. The path planning strategy involves first attempting to traverse along the configuration space vector from the start point towards the goal point. If an unsafe region is encountered, an intermediate via point is identified by conducting a systematic search in the hyperplane orthogonal to and bisecting the unsafe region of the vector. This process is repeatedly applied until a solution to the global path planning problem is obtained. The basic concept behind this strategy is that better local decisions at the beginning of the trouble region may be made if a possible way around the 'center' of the trouble region is known. Thus, rather than attempting paths which look promising locally (at the beginning of a trouble region) but which may not yield good overall results, the heuristic attempts local strategies that appear promising for circumventing the unsafe region.
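
    As a hedged illustration of the divide-and-conquer traversal described above, the Python sketch below plans in a simplified 2-D space: it tries the straight segment, and if that segment crosses an unsafe region it searches the perpendicular bisector of the blocked span for a via point and recurses on both halves. The collision test, sampling resolution, and search radius are placeholder assumptions, not the paper's c-space machinery.

        import numpy as np

        def segment_safe(a, b, is_free, steps=50):
            """Sample the straight segment from a to b and test each sample."""
            return all(is_free(a + t * (b - a)) for t in np.linspace(0.0, 1.0, steps))

        def plan(a, b, is_free, depth=0, max_depth=12, search_radius=2.0):
            """Recursive divide-and-conquer planner (2-D illustration)."""
            if segment_safe(a, b, is_free):
                return [a, b]
            if depth >= max_depth:
                return None
            mid = 0.5 * (a + b)                     # "center" of the trouble region
            d = b - a
            normal = np.array([-d[1], d[0]]) / np.linalg.norm(d)  # bisecting direction
            # Systematic search along the bisector, alternating sides, nearest first.
            for r in np.linspace(0.1, search_radius, 20):
                for side in (+1, -1):
                    via = mid + side * r * normal
                    if is_free(via):
                        left = plan(a, via, is_free, depth + 1, max_depth, search_radius)
                        right = plan(via, b, is_free, depth + 1, max_depth, search_radius)
                        if left and right:
                            return left[:-1] + right
            return None

        # Toy obstacle: a disc of radius 1 at the origin.
        is_free = lambda p: np.linalg.norm(p) > 1.0
        print(plan(np.array([-3.0, 0.0]), np.array([3.0, 0.0]), is_free))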

  4. A comparison of field-based similarity searching methods: CatShape, FBSS, and ROCS.

    PubMed

    Moffat, Kirstin; Gillet, Valerie J; Whittle, Martin; Bravi, Gianpaolo; Leach, Andrew R

    2008-04-01

    Three field-based similarity methods are compared in retrospective virtual screening experiments. The methods are the CatShape module of CATALYST, ROCS, and an in-house program developed at the University of Sheffield called FBSS. The programs are used in both rigid and flexible searches carried out in the MDL Drug Data Report. UNITY 2D fingerprints are also used to provide a comparison with a more traditional approach to similarity searching, and similarity based on simple whole-molecule properties is used to provide a baseline for the more sophisticated searches. Overall, UNITY 2D fingerprints and ROCS with the chemical force field option gave comparable performance and were superior to the shape-only 3D methods. When the flexible methods were compared with the rigid methods, it was generally found that the flexible methods gave slightly better results than their respective rigid methods; however, the increased performance did not justify the additional computational cost required. PMID:18351728
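
    For the fingerprint baseline mentioned above, similarity searching typically reduces to ranking database molecules by Tanimoto similarity to the query's bit string. The Python sketch below uses generic sets of 'on' bits with invented bit positions; it illustrates the general idea only and is not tied to UNITY, CatShape, FBSS, or ROCS.

        def tanimoto(query_bits, mol_bits):
            """Tanimoto coefficient between two sets of 'on' fingerprint bits."""
            inter = len(query_bits & mol_bits)
            union = len(query_bits | mol_bits)
            return inter / union if union else 0.0

        def rank_by_similarity(query_bits, database):
            """Rank database entries (name -> bit set) by similarity to the query."""
            return sorted(database.items(),
                          key=lambda kv: tanimoto(query_bits, kv[1]),
                          reverse=True)

        # Tiny hypothetical fingerprints (bit positions are invented).
        query = {1, 4, 7, 9}
        db = {"mol_A": {1, 4, 7, 8}, "mol_B": {2, 3, 5}, "mol_C": {1, 4, 9, 10, 12}}
        for name, bits in rank_by_similarity(query, db):
            print(name, round(tanimoto(query, bits), 2))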

  5. A new method to improve network topological similarity search: applied to fold recognition

    PubMed Central

    Lhota, John; Hauptman, Ruth; Hart, Thomas; Ng, Clara; Xie, Lei

    2015-01-01

    Motivation: Similarity search is the foundation of bioinformatics. It plays a key role in establishing structural, functional and evolutionary relationships between biological sequences. Although the power of the similarity search has increased steadily in recent years, a high percentage of sequences remain uncharacterized in the protein universe. Thus, new similarity search strategies are needed to efficiently and reliably infer the structure and function of new sequences. The existing paradigm for studying protein sequence, structure, function and evolution has been established based on the assumption that the protein universe is discrete and hierarchical. Cumulative evidence suggests that the protein universe is continuous. As a result, conventional sequence homology search methods may not be able to detect novel structural, functional and evolutionary relationships between proteins from weak and noisy sequence signals. To overcome the limitations in existing similarity search methods, we propose a new algorithmic framework, Enrichment of Network Topological Similarity (ENTS), to improve the performance of large scale similarity searches in bioinformatics. Results: We apply ENTS to a challenging unsolved problem: protein fold recognition. Our rigorous benchmark studies demonstrate that ENTS considerably outperforms state-of-the-art methods. As the concept of ENTS can be applied to any similarity metric, it may provide a general framework for similarity search on any set of biological entities, given their representation as a network. Availability and implementation: Source code freely available upon request. Contact: lxie@iscb.org PMID:25717198
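
    The abstract gives the ENTS framework only at a conceptual level. As a loosely related, hedged illustration of the general idea of enriching raw similarity scores with network topology, the Python sketch below blends each node's raw score with its neighbors' scores by simple diffusion; it is not the ENTS algorithm, and the network, scores, and blending parameter are invented.

        import numpy as np

        def enrich_scores(raw, adjacency, alpha=0.5, iters=20):
            """Generic network enrichment: iteratively blend each node's raw
            similarity score with the (normalized) scores of its neighbors.
            A simple diffusion illustration, not the ENTS algorithm."""
            A = np.asarray(adjacency, dtype=float)
            # Row-normalize so each node averages over its neighbors.
            row_sums = A.sum(axis=1, keepdims=True)
            P = np.divide(A, row_sums, out=np.zeros_like(A), where=row_sums > 0)
            raw = np.asarray(raw, dtype=float)
            s = raw.copy()
            for _ in range(iters):
                s = alpha * raw + (1 - alpha) * (P @ s)
            return s

        # Toy 4-node similarity network; node 0 is the query's best raw hit.
        adjacency = [[0, 1, 1, 0],
                     [1, 0, 1, 0],
                     [1, 1, 0, 1],
                     [0, 0, 1, 0]]
        raw = [0.9, 0.2, 0.1, 0.05]
        print(enrich_scores(raw, adjacency).round(3))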

  6. Beam angle optimization for intensity-modulated radiation therapy using a guided pattern search method

    NASA Astrophysics Data System (ADS)

    Rocha, Humberto; Dias, Joana M.; Ferreira, Brígida C.; Lopes, Maria C.

    2013-05-01

    Generally, the inverse planning of radiation therapy consists mainly of fluence optimization. The beam angle optimization (BAO) in intensity-modulated radiation therapy (IMRT) consists of selecting appropriate radiation incidence directions and may influence the quality of IMRT plans, both by enhancing organ sparing and by improving tumor coverage. However, in clinical practice, most of the time, beam directions continue to be selected manually by the treatment planner without objective and rigorous criteria. The goal of this paper is to introduce a novel approach that uses beam’s-eye-view dose ray tracing metrics within a pattern search method framework in the optimization of the highly non-convex BAO problem. Pattern search methods are derivative-free optimization methods that require few function evaluations to progress and converge and have the ability to better avoid local entrapment. The pattern search method framework is composed of a search step and a poll step at each iteration. The poll step performs a local search in a mesh neighborhood and ensures convergence to a local minimizer or stationary point. The search step provides the flexibility for a global search since it allows searches away from the neighborhood of the current iterate. Beam’s-eye-view dose metrics assign a score to each radiation beam direction and can be used within the pattern search framework, furnishing a priori knowledge of the problem so that directions with larger dosimetric scores are tested first. A set of clinical cases of head-and-neck tumors treated at the Portuguese Institute of Oncology of Coimbra is used to discuss the potential of this approach in the optimization of the BAO problem.
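
    The poll step described above evaluates the objective at mesh neighbors of the current iterate and shrinks the mesh when no neighbor improves. The Python sketch below shows that generic mechanism on a toy 2-D function; the BAO objective, beam's-eye-view dose metrics, and the paper's search step are not modeled.

        import numpy as np

        def pattern_search(f, x0, mesh=1.0, tol=1e-4, max_iter=500):
            """Generic coordinate-direction poll step of a pattern search method."""
            x = np.asarray(x0, dtype=float)
            fx = f(x)
            n = len(x)
            directions = np.vstack([np.eye(n), -np.eye(n)])   # +/- coordinate axes
            for _ in range(max_iter):
                improved = False
                for d in directions:                          # poll the mesh neighbors
                    cand = x + mesh * d
                    fc = f(cand)
                    if fc < fx:
                        x, fx, improved = cand, fc, True
                        break
                if not improved:
                    mesh *= 0.5                               # refine the mesh
                    if mesh < tol:
                        break
            return x, fx

        # Toy non-convex objective standing in for the BAO score.
        f = lambda v: (v[0] - 2) ** 2 + (v[1] + 1) ** 2 + 0.3 * np.sin(3 * v[0])
        print(pattern_search(f, [0.0, 0.0]))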

  7. The Stanford Cluster Search: Scope, Method, and Preliminary Results

    NASA Astrophysics Data System (ADS)

    Willick, Jeffrey A.; Thompson, Keith L.; Mathiesen, Benjamin F.; Perlmutter, Saul; Knop, Robert A.; Hill, Gary J.

    2001-06-01

    We describe the scientific motivation behind, and the methodology of, the Stanford Cluster Search (StaCS), a program to compile a catalog of optically selected galaxy clusters at intermediate and high (0.3 ≲ z ≲ 1) redshifts. The clusters are identified using a matched filter algorithm applied to deep CCD images covering ~60 deg^2 of sky. These images are obtained from several data archives, principally that of the Berkeley Supernova Cosmology Project of Perlmutter et al. Potential clusters are confirmed with spectroscopic observations at the 9.2 m Hobby-Eberly Telescope. Follow-up observations at optical, submillimeter, and X-ray wavelengths are planned in order to estimate cluster masses. Our long-term scientific goal is to measure the cluster number density as a function of mass and redshift, n(M, z), which is sensitive to the cosmological density parameter Ωm and the amplitude of density fluctuations σ8. The combined data set will contain clusters ranging over an order of magnitude in mass and allow constraints on these parameters accurate to ~10%. We present our first spectroscopically confirmed cluster candidates and describe how to access them electronically. The Hobby-Eberly Telescope (HET) is a joint project of the University of Texas at Austin, the Pennsylvania State University, Stanford University, Ludwig-Maximillians-Universität München, and Georg-August-Universität Göttingen. The HET is named in honor of its principal benefactors, William P. Hobby and Robert E. Eberly.

  8. Nonlinear multi-agent path search method based on OFDM communication

    NASA Astrophysics Data System (ADS)

    Sato, Masatoshi; Igarashi, Yusuke; Tanaka, Mamoru

    This paper presents a novel shortest-path searching system based on analog circuit analysis, called the sequential local current comparison method on an alternating-current (AC) circuit (AC-SLCC). The local current comparison (LCC) method is a path-searching method in which the path is selected in the direction of the maximum current in a direct-current (DC) resistive circuit. Since searching for a plurality of shortest paths with the LCC method reduces to solving the current distribution of the resistive circuit, the shortest-path problem can be solved extremely quickly. The AC-SLCC method is a novel LCC method combined with orthogonal frequency division multiplexing (OFDM) communication on an AC circuit. It is able to send data along the shortest path without major data loss, which suggests the possibility of applications to a variety of systems (especially OFDM communication techniques).
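
    The DC version of the LCC idea can be illustrated by modelling the graph as a unit-resistance network, solving for the node potentials, and then walking from source to sink along the branch carrying the largest current at each node. The graph, the unit resistances, and the use of a dense linear solver below are illustrative assumptions; the paper's circuit realization and its OFDM extension are not modeled.

        import numpy as np

        def lcc_shortest_path(edges, n, source, sink):
            """Follow the maximum branch current from source to sink in a
            unit-resistance DC network (local current comparison illustration)."""
            # Build the graph Laplacian with unit conductance per edge.
            L = np.zeros((n, n))
            for u, v in edges:
                L[u, u] += 1; L[v, v] += 1
                L[u, v] -= 1; L[v, u] -= 1
            # Inject 1 A at the source, extract 1 A at the sink, ground the sink.
            b = np.zeros(n); b[source], b[sink] = 1.0, -1.0
            keep = [i for i in range(n) if i != sink]
            phi = np.zeros(n)
            phi[keep] = np.linalg.solve(L[np.ix_(keep, keep)], b[keep])
            # Walk downhill in potential, always taking the largest-current branch.
            nbrs = {i: [] for i in range(n)}
            for u, v in edges:
                nbrs[u].append(v); nbrs[v].append(u)
            path, node = [source], source
            while node != sink:
                # Branch current to neighbor m is (phi[node] - phi[m]) / R with R = 1.
                node = max(nbrs[node], key=lambda m: phi[node] - phi[m])
                path.append(node)
            return path

        # Small test graph: route 0-1-3 is shorter than 0-2-4-3.
        edges = [(0, 1), (1, 3), (0, 2), (2, 4), (4, 3)]
        print(lcc_shortest_path(edges, 5, 0, 3))   # [0, 1, 3]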

  9. Hybrid Evolutionary-Heuristic Algorithm for Capacitor Banks Allocation

    NASA Astrophysics Data System (ADS)

    Barukčić, Marinko; Nikolovski, Srete; Jović, Franjo

    2010-11-01

    The issue of optimal allocation of capacitor banks for power loss minimization in distribution networks is considered in this paper. This optimization problem has recently been tackled by applying contemporary soft computing methods such as genetic algorithms, neural networks, fuzzy logic, simulated annealing, ant colony methods, and hybrid methods. An evolutionary-heuristic method is proposed for optimal capacitor allocation in radial distribution networks. An evolutionary method based on a genetic algorithm is developed. The proposed method has a reduced number of parameters compared to the usual genetic algorithm. A heuristic stage is used for improving the optimal solution given by the evolutionary stage. A new cost-voltage node index is used in the heuristic stage in order to improve the quality of the solution. The efficiency of the proposed two-stage method has been tested on different test networks. The quality of the solution has been verified by comparison tests with other methods on the same test networks. The proposed method gives significantly better solutions for time-dependent load in the 69-bus network than those found in the references.
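
    As a hedged sketch of the two-stage idea described above (an evolutionary stage followed by a heuristic improvement stage), the Python example below evolves capacitor placements with a small genetic algorithm and then greedily refines the best individual. The cost function, reactive loads, and capacitor size are toy assumptions; the paper's power-flow losses and cost-voltage node index are not modeled.

        import random

        NODES = 8
        CAP_KVAR = 300                                  # fixed capacitor size (assumption)
        LOAD = [120, 450, 80, 600, 200, 350, 90, 500]   # toy reactive loads (kvar)

        def cost(placement):
            """Toy objective: residual reactive-load penalty plus capacitor cost.
            Stands in for a real power-flow loss computation."""
            loss = sum(max(0, q - CAP_KVAR * p) ** 2 for q, p in zip(LOAD, placement))
            return loss * 1e-4 + 5.0 * sum(placement)

        def evolve(pop_size=30, gens=60, pm=0.1, seed=1):
            """Evolutionary stage: simple genetic algorithm over placement bit strings."""
            rng = random.Random(seed)
            pop = [[rng.randint(0, 1) for _ in range(NODES)] for _ in range(pop_size)]
            for _ in range(gens):
                pop.sort(key=cost)
                parents = pop[:pop_size // 2]
                children = []
                while len(children) < pop_size - len(parents):
                    a, b = rng.sample(parents, 2)
                    cut = rng.randrange(1, NODES)
                    child = a[:cut] + b[cut:]           # one-point crossover
                    if rng.random() < pm:               # bit-flip mutation
                        i = rng.randrange(NODES)
                        child[i] ^= 1
                    children.append(child)
                pop = parents + children
            return min(pop, key=cost)

        def heuristic_refine(placement):
            """Heuristic stage: greedily toggle single placements while cost improves."""
            best = placement[:]
            improved = True
            while improved:
                improved = False
                for i in range(NODES):
                    cand = best[:]
                    cand[i] ^= 1
                    if cost(cand) < cost(best):
                        best, improved = cand, True
            return best

        best = heuristic_refine(evolve())
        print(best, round(cost(best), 2))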

  10. A new type of descent conjugate gradient method with exact line search

    NASA Astrophysics Data System (ADS)

    Hajar, Nurul; Mamat, Mustafa; Rivaie, Mohd.; Jusoh, Ibrahim

    2016-06-01

    Nowadays, conjugate gradient (CG) methods are an effective tool for solving nonlinear unconstrained optimization problems. In this paper, a new CG method is proposed and analyzed. This new CG method satisfies the descent condition, and its global convergence is established using an exact line search. Numerical results show that this new CG method substantially outperforms previous CG methods. The new CG method is considered robust and efficient, and provides faster and more stable convergence.
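
    The record does not give the new CG search direction, but the role of an exact line search is easiest to see on a convex quadratic, where the exact step length has a closed form. The Python sketch below is a classical Fletcher-Reeves-style CG iteration on a quadratic, not the method proposed in the paper.

        import numpy as np

        def cg_exact_line_search(A, b, x0, tol=1e-10, max_iter=50):
            """Conjugate gradient on the quadratic f(x) = 0.5 x^T A x - b^T x,
            where the exact line search step has the closed form
            alpha = (r^T r) / (d^T A d)."""
            x = np.asarray(x0, dtype=float)
            r = b - A @ x                          # negative gradient of f at x
            d = r.copy()
            for _ in range(max_iter):
                if np.linalg.norm(r) < tol:
                    break
                Ad = A @ d
                alpha = (r @ r) / (d @ Ad)         # exact minimizer along d
                x = x + alpha * d
                r_new = r - alpha * Ad
                beta = (r_new @ r_new) / (r @ r)   # Fletcher-Reeves-style update
                d = r_new + beta * d
                r = r_new
            return x

        # Small symmetric positive-definite test problem.
        A = np.array([[4.0, 1.0], [1.0, 3.0]])
        b = np.array([1.0, 2.0])
        print(cg_exact_line_search(A, b, np.zeros(2)))   # approx [0.0909, 0.6364]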