Sample records for meta-heuristic optimization tool

  1. Meta-heuristic algorithms as tools for hydrological science

    NASA Astrophysics Data System (ADS)

    Yoo, Do Guen; Kim, Joong Hoon

    2014-12-01

    In this paper, meta-heuristic optimization techniques and their applications to water resources engineering, particularly hydrological science, are introduced. In recent years, meta-heuristic optimization techniques have been introduced that can overcome the problems inherent in iterative simulations. These methods are able to find good solutions with limited computation time and memory use, and without requiring complex derivatives. Simulation-based meta-heuristic methods such as Genetic Algorithms (GAs) and Harmony Search (HS) have powerful searching abilities, which can occasionally overcome several drawbacks of traditional mathematical methods. For example, HS can be conceptualized from a musical performance process aimed at achieving better harmony; such optimization algorithms seek a near global optimum determined by the value of an objective function, providing a more robust determination than typical aesthetic estimation. In this paper, meta-heuristic algorithms and their applications in hydrological science (with a focus on GAs and HS) are discussed by subject, including a review of the existing literature in the field. Then, recent trends in optimization are presented, and a relatively new technique, the Smallest Small World Cellular Harmony Search (SSWCHS), is briefly introduced, with a summary of promising results obtained in previous studies. Overall, previous studies have demonstrated that meta-heuristic algorithms are effective tools for the development of hydrological models and the management of water resources.
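
    To make the harmony-search metaphor concrete, here is a minimal sketch of the basic HS loop (memory consideration, pitch adjustment, random selection) for box-constrained minimization; the parameter values and the sphere test function are illustrative assumptions, not details from the record.

    ```python
    import random

    def harmony_search(objective, bounds, hms=20, hmcr=0.9, par=0.3,
                       bw=0.05, iterations=5000):
        """Minimal Harmony Search: minimize `objective` over box `bounds`."""
        # Initialize the harmony memory with random solutions.
        memory = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(hms)]
        scores = [objective(h) for h in memory]
        for _ in range(iterations):
            new = []
            for d, (lo, hi) in enumerate(bounds):
                if random.random() < hmcr:              # memory consideration
                    value = random.choice(memory)[d]
                    if random.random() < par:           # pitch adjustment
                        value += random.uniform(-bw, bw) * (hi - lo)
                else:                                   # random selection
                    value = random.uniform(lo, hi)
                new.append(min(max(value, lo), hi))
            score = objective(new)
            worst = max(range(hms), key=lambda i: scores[i])
            if score < scores[worst]:                   # replace the worst harmony
                memory[worst], scores[worst] = new, score
        best = min(range(hms), key=lambda i: scores[i])
        return memory[best], scores[best]

    # Example: minimize a 3-D sphere function.
    best, value = harmony_search(lambda x: sum(v * v for v in x), [(-5, 5)] * 3)
    ```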

  2. Application of dragonfly algorithm for optimal performance analysis of process parameters in turn-mill operations- A case study

    NASA Astrophysics Data System (ADS)

    Vikram, K. Arun; Ratnam, Ch; Lakshmi, VVK; Kumar, A. Sunny; Ramakanth, RT

    2018-02-01

    Meta-heuristic multi-response optimization methods are widely used to solve multi-objective problems and obtain Pareto optimal solutions. This work focuses on the optimal multi-response evaluation of process parameters for responses such as surface roughness (Ra), surface hardness (H) and tool vibration displacement amplitude (Vib) in tangential and orthogonal turn-mill processes on an A-axis Computer Numerical Control vertical milling center. Tool speed, feed rate and depth of cut are considered as process parameters; brass specimens are machined under dry conditions with high speed steel end milling cutters using a Taguchi design of experiments (DOE). A meta-heuristic, the dragonfly algorithm, is used to optimize the objectives 'Ra', 'H' and 'Vib' and identify the optimal multi-response combination of process parameters. The results obtained from the multi-objective dragonfly algorithm (MODA) are then compared with those of another multi-response optimization technique, viz. grey relational analysis (GRA).

  3. Parameter estimation using meta-heuristics in systems biology: a comprehensive review.

    PubMed

    Sun, Jianyong; Garibaldi, Jonathan M; Hodgman, Charlie

    2012-01-01

    This paper gives a comprehensive review of the application of meta-heuristics to optimization problems in systems biology, mainly focusing on the parameter estimation problem (also called the inverse problem or model calibration). It is intended both for the systems biologist who wishes to learn more about the various optimization techniques available and for the meta-heuristic optimizer who is interested in applying such techniques to problems in systems biology. First, the parameter estimation problems emerging from different areas of systems biology are described from the point of view of machine learning. Brief descriptions of various meta-heuristics developed for these problems follow, along with outlines of their advantages and disadvantages. Several important issues in applying meta-heuristics to the systems biology modelling problem are then addressed, including the reliability and identifiability of model parameters and the optimal design of experiments. Finally, we highlight some possible future research directions in this field.

  4. Modern meta-heuristics based on nonlinear physics processes: A review of models and design procedures

    NASA Astrophysics Data System (ADS)

    Salcedo-Sanz, S.

    2016-10-01

    Meta-heuristic algorithms are problem-solving methods which try to find good-enough solutions to very hard optimization problems, at a reasonable computation time, where classical approaches fail or cannot even be applied. Many existing meta-heuristic approaches are nature-inspired techniques, which work by simulating or modeling different natural processes in a computer. Historically, many of the most successful meta-heuristic approaches have had a biological inspiration, such as evolutionary computation or swarm intelligence paradigms, but in the last few years new approaches based on the modeling of nonlinear physics processes have been proposed and applied with success. Nonlinear physics processes, modeled as optimization algorithms, are able to produce completely new search procedures, with extremely effective exploration capabilities in many cases, which are able to outperform existing optimization approaches. In this paper we review the most important optimization algorithms based on nonlinear physics, how they have been constructed from the specific modeling of a real phenomenon, and their novelty in comparison with alternative existing algorithms for optimization. We first review important concepts on optimization problems, search spaces and problem difficulty. Then, the usefulness of heuristic and meta-heuristic approaches for tackling hard optimization problems is introduced, and some of the main existing classical versions of these algorithms are reviewed. The mathematical framework of different nonlinear physics processes is then introduced as a preparatory step to reviewing in detail the most important meta-heuristics based on them. A discussion of the novelty of these approaches, their main computational implementation and design issues, and the evaluation of a novel meta-heuristic based on Strange Attractors mutation completes the review of these techniques. We also describe some of the most important application areas, in a broad sense, of meta-heuristics, and describe freely accessible software frameworks which can be used to ease the implementation of these algorithms.
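
    Simulated annealing is the classical example of a physics-derived meta-heuristic of the kind this review surveys. A minimal sketch for continuous minimization, assuming Metropolis acceptance with a geometric cooling schedule and illustrative parameter values, follows.

    ```python
    import math
    import random

    def simulated_annealing(objective, x0, step=0.1, t0=1.0, alpha=0.995,
                            iterations=10000):
        """Minimal SA: Metropolis acceptance with geometric cooling."""
        x, fx = list(x0), objective(x0)
        best, fbest = list(x), fx
        t = t0
        for _ in range(iterations):
            # Propose a random neighbour by perturbing one coordinate.
            cand = list(x)
            cand[random.randrange(len(cand))] += random.gauss(0.0, step)
            fc = objective(cand)
            # Always accept improvements; accept worse moves with
            # Boltzmann probability exp(-delta/T), as in the physical analogy.
            if fc < fx or random.random() < math.exp((fx - fc) / t):
                x, fx = cand, fc
                if fx < fbest:
                    best, fbest = list(x), fx
            t *= alpha  # geometric cooling schedule
        return best, fbest

    best, value = simulated_annealing(lambda x: sum(v * v for v in x), [2.0, -3.0])
    ```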

  5. Hybridisations of Variable Neighbourhood Search and Modified Simplex Elements to Harmony Search and Shuffled Frog Leaping Algorithms for Process Optimisations

    NASA Astrophysics Data System (ADS)

    Aungkulanon, P.; Luangpaiboon, P.

    2010-10-01

    Nowadays, engineering problem systems are large and complicated. Effective finite sequences of instructions for solving these problems can be categorised into optimisation and meta-heuristic algorithms. Though finding the best decision variable levels from some sets of available alternatives cannot always be achieved, meta-heuristics offer an alternative: experience-based techniques that rapidly help in problem solving, learning and discovery, in the hope of obtaining a more efficient or more robust procedure. All meta-heuristics provide auxiliary procedures in terms of their own toolbox functions, and it has been shown that the effectiveness of all meta-heuristics depends almost exclusively on these auxiliary functions. In fact, the auxiliary procedure from one can be implemented in other meta-heuristics. The well-known meta-heuristics of harmony search (HSA) and shuffled frog-leaping algorithms (SFLA) are compared with their hybridisations. HSA is used to produce a near optimal solution, inspired by the way musicians seek a perfect state of harmony during improvisation. The SFLA, a population-based meta-heuristic, is a cooperative search metaphor inspired by natural memetics; it includes elements of local search and global information exchange. This study presents solution procedures for constrained and unconstrained problems with single- and multi-peak surfaces, including a curved ridge surface. Both meta-heuristics are modified via the variable neighbourhood search method (VNSM) philosophy, including a modified simplex method (MSM). The basic idea is the change of neighbourhoods during the search for a better solution: the hybridisations proceed by a descent method to a local minimum, then explore, systematically or at random, increasingly distant neighbourhoods of this local solution. The results show that the variant of HSA with VNSM and MSM performs better in terms of the mean and variance of design points and yields.
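
    The variable-neighbourhood idea described above (descend to a local minimum, then shake in increasingly distant neighbourhoods) can be sketched as follows; the Gaussian descent step and the neighbourhood radii are illustrative assumptions, not the paper's exact operators.

    ```python
    import random

    def local_descent(objective, x, step=0.05, tries=50):
        """Crude stochastic descent used as the local search inside VNS."""
        fx = objective(x)
        for _ in range(tries):
            cand = [v + random.gauss(0.0, step) for v in x]
            fc = objective(cand)
            if fc < fx:
                x, fx = cand, fc
        return x, fx

    def vns(objective, x0, radii=(0.1, 0.5, 1.0), iterations=200):
        """Minimal VNS: shake in increasingly distant neighbourhoods,
        descend, and recentre whenever the descent finds an improvement."""
        best, fbest = local_descent(objective, list(x0))
        for _ in range(iterations):
            k = 0
            while k < len(radii):
                shaken = [v + random.uniform(-radii[k], radii[k]) for v in best]
                cand, fc = local_descent(objective, shaken)
                if fc < fbest:      # success: restart from the first neighbourhood
                    best, fbest = cand, fc
                    k = 0
                else:               # failure: try a more distant neighbourhood
                    k += 1
        return best, fbest

    best, value = vns(lambda x: sum(v * v for v in x), [3.0, -2.0])
    ```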

  6. Parameter estimation with bio-inspired meta-heuristic optimization: modeling the dynamics of endocytosis.

    PubMed

    Tashkova, Katerina; Korošec, Peter; Silc, Jurij; Todorovski, Ljupčo; Džeroski, Sašo

    2011-10-11

    We address the task of parameter estimation in models of the dynamics of biological systems based on ordinary differential equations (ODEs) from measured data, where the models are typically non-linear and have many parameters, the measurements are imperfect due to noise, and the studied system can often be only partially observed. A representative task is to estimate the parameters in a model of the dynamics of endocytosis, i.e., endosome maturation, reflected in a cut-out switch transition between the Rab5 and Rab7 domain protein concentrations, from experimental measurements of these concentrations. The general parameter estimation task and the specific instance considered here are challenging optimization problems, calling for the use of advanced meta-heuristic optimization methods, such as evolutionary or swarm-based methods. We apply three global-search meta-heuristic algorithms for numerical optimization, i.e., the differential ant-stigmergy algorithm (DASA), particle-swarm optimization (PSO), and differential evolution (DE), as well as a local-search derivative-based algorithm 717 (A717) to the task of estimating parameters in ODEs. We evaluate their performance on the considered representative task along a number of metrics, including the quality of reconstructing the system output and the complete dynamics, as well as the speed of convergence, both on real-experimental data and on artificial pseudo-experimental data with varying amounts of noise. We compare the four optimization methods under a range of observation scenarios, where data of different completeness and accuracy of interpretation are given as input. Overall, the global meta-heuristic methods (DASA, PSO, and DE) clearly and significantly outperform the local derivative-based method (A717). Among the three meta-heuristics, differential evolution (DE) performs best in terms of the objective function, i.e., reconstructing the output, and in terms of convergence. These results hold for both real and artificial data, for all observability scenarios considered, and for all amounts of noise added to the artificial data. In sum, the meta-heuristic methods considered are suitable for estimating the parameters in the ODE model of the dynamics of endocytosis under a range of conditions: With the model and conditions being representative of parameter estimation tasks in ODE models of biochemical systems, our results clearly highlight the promise of bio-inspired meta-heuristic methods for parameter estimation in dynamic system models within systems biology.
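
    Since differential evolution is the best performer here, a minimal sketch of DE/rand/1/bin applied to ODE parameter estimation may help; the toy logistic model, forward-Euler integrator and parameter settings are illustrative assumptions standing in for the endocytosis model and the authors' actual setup.

    ```python
    import random

    def integrate(params, x0, times, dt=0.01):
        """Forward-Euler simulation of a toy logistic ODE dx/dt = a*x*(1 - x/b)."""
        a, b = params
        x, t, out, idx = x0, times[0], [], 0
        while idx < len(times):
            if t >= times[idx] - 1e-9:      # record the state at each sample time
                out.append(x)
                idx += 1
            x += dt * a * x * (1.0 - x / b)
            t += dt
        return out

    def sse(params, x0, times, data):
        """Sum-of-squared-errors objective between simulation and data."""
        return sum((s - d) ** 2 for s, d in zip(integrate(params, x0, times), data))

    def differential_evolution(cost, bounds, pop_size=30, f=0.8, cr=0.9, gens=200):
        """Classic DE/rand/1/bin over box-constrained parameters."""
        dim = len(bounds)
        pop = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
        fit = [cost(ind) for ind in pop]
        for _ in range(gens):
            for i in range(pop_size):
                r1, r2, r3 = random.sample([j for j in range(pop_size) if j != i], 3)
                jrand = random.randrange(dim)
                trial = []
                for j, (lo, hi) in enumerate(bounds):
                    if random.random() < cr or j == jrand:  # binomial crossover
                        v = pop[r1][j] + f * (pop[r2][j] - pop[r3][j])  # mutation
                    else:
                        v = pop[i][j]
                    trial.append(min(max(v, lo), hi))
                ft = cost(trial)
                if ft <= fit[i]:                            # greedy selection
                    pop[i], fit[i] = trial, ft
        best = min(range(pop_size), key=lambda i: fit[i])
        return pop[best], fit[best]

    # Recover (a, b) from noiseless synthetic observations of the toy model.
    times = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
    data = integrate((0.9, 4.0), 0.5, times)
    params, err = differential_evolution(
        lambda p: sse(p, 0.5, times, data), [(0.1, 2.0), (1.0, 10.0)])
    ```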

  7. Parameter estimation with bio-inspired meta-heuristic optimization: modeling the dynamics of endocytosis

    PubMed Central

    2011-01-01

    Background We address the task of parameter estimation in models of the dynamics of biological systems based on ordinary differential equations (ODEs) from measured data, where the models are typically non-linear and have many parameters, the measurements are imperfect due to noise, and the studied system can often be only partially observed. A representative task is to estimate the parameters in a model of the dynamics of endocytosis, i.e., endosome maturation, reflected in a cut-out switch transition between the Rab5 and Rab7 domain protein concentrations, from experimental measurements of these concentrations. The general parameter estimation task and the specific instance considered here are challenging optimization problems, calling for the use of advanced meta-heuristic optimization methods, such as evolutionary or swarm-based methods. Results We apply three global-search meta-heuristic algorithms for numerical optimization, i.e., the differential ant-stigmergy algorithm (DASA), particle-swarm optimization (PSO), and differential evolution (DE), as well as a local-search derivative-based algorithm 717 (A717) to the task of estimating parameters in ODEs. We evaluate their performance on the considered representative task along a number of metrics, including the quality of reconstructing the system output and the complete dynamics, as well as the speed of convergence, both on real-experimental data and on artificial pseudo-experimental data with varying amounts of noise. We compare the four optimization methods under a range of observation scenarios, where data of different completeness and accuracy of interpretation are given as input. Conclusions Overall, the global meta-heuristic methods (DASA, PSO, and DE) clearly and significantly outperform the local derivative-based method (A717). Among the three meta-heuristics, differential evolution (DE) performs best in terms of the objective function, i.e., reconstructing the output, and in terms of convergence. These results hold for both real and artificial data, for all observability scenarios considered, and for all amounts of noise added to the artificial data. In sum, the meta-heuristic methods considered are suitable for estimating the parameters in the ODE model of the dynamics of endocytosis under a range of conditions: With the model and conditions being representative of parameter estimation tasks in ODE models of biochemical systems, our results clearly highlight the promise of bio-inspired meta-heuristic methods for parameter estimation in dynamic system models within systems biology. PMID:21989196

  8. Solving Energy-Aware Real-Time Tasks Scheduling Problem with Shuffled Frog Leaping Algorithm on Heterogeneous Platforms

    PubMed Central

    Zhang, Weizhe; Bai, Enci; He, Hui; Cheng, Albert M.K.

    2015-01-01

    Reducing energy consumption is becoming very important for extending battery life and lowering overall operational costs in heterogeneous real-time multiprocessor systems. In this paper, we first formulate this as a combinatorial optimization problem. Then, a successful meta-heuristic, the Shuffled Frog Leaping Algorithm (SFLA), is proposed to reduce the energy consumption. Premature-convergence remission and local-optimum avoidance techniques are proposed to avoid premature convergence and improve solution quality, and convergence acceleration significantly reduces the search time. Experimental results show that the SFLA-based energy-aware meta-heuristic uses 30% less energy than the Ant Colony Optimization (ACO) algorithm, and 60% less energy than the Genetic Algorithm (GA). Remarkably, the running time of the SFLA-based meta-heuristic is 20 and 200 times shorter than that of ACO and GA, respectively, for finding the optimal solution. PMID:26110406
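
    For background, a minimal sketch of the canonical SFLA loop (sort the population, deal it into memeplexes, leap each memeplex's worst frog toward the local and then the global best) is given below for continuous minimization; the paper itself applies SFLA to a combinatorial scheduling problem, so the continuous encoding and all parameter values are illustrative assumptions.

    ```python
    import random

    def sfla(objective, bounds, frogs=30, memeplexes=5, meme_iters=10, shuffles=50):
        """Minimal SFLA: sort frogs, deal them into memeplexes, and let the
        worst frog in each memeplex leap toward the local, then global, best."""
        def clamp(v, lo, hi):
            return min(max(v, lo), hi)

        def leap(worst, leader):
            return [clamp(w + random.random() * (l - w), lo, hi)
                    for w, l, (lo, hi) in zip(worst, leader, bounds)]

        pop = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(frogs)]
        for _ in range(shuffles):
            pop.sort(key=objective)
            global_best = pop[0]
            # Deal frogs round-robin: 1st frog to memeplex 1, 2nd to 2, ...
            plexes = [pop[i::memeplexes] for i in range(memeplexes)]
            for plex in plexes:
                for _ in range(meme_iters):
                    plex.sort(key=objective)
                    cand = leap(plex[-1], plex[0])          # toward memeplex best
                    if objective(cand) >= objective(plex[-1]):
                        cand = leap(plex[-1], global_best)  # toward global best
                    if objective(cand) >= objective(plex[-1]):
                        cand = [random.uniform(lo, hi) for lo, hi in bounds]
                    plex[-1] = cand                         # replace the worst frog
            pop = [frog for plex in plexes for frog in plex]  # shuffle back together
        return min(pop, key=objective)

    best = sfla(lambda x: sum(v * v for v in x), [(-5, 5)] * 4)
    ```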

  9. Determining the optimal number of Kanban in multi-products supply chain system

    NASA Astrophysics Data System (ADS)

    Widyadana, G. A.; Wee, H. M.; Chang, Jer-Yuan

    2010-02-01

    Kanban, a key element of the just-in-time system, is a re-order card or signboard giving an instruction or triggering the pull system to manufacture or supply a component based on actual material usage. There are two types of Kanban: production Kanban and withdrawal Kanban. This study uses optimal and meta-heuristic methods to determine the Kanban quantity and withdrawal lot sizes in a supply chain system. Although the mixed integer programming (MIP) method gives an optimal solution, it is not time efficient; for this reason, meta-heuristic methods are suggested. In this study, a genetic algorithm (GA) and a hybrid of genetic algorithm and simulated annealing (GASA) are used, and their performance is compared with that of the optimal MIP method. The given problems show that both GA and GASA result in near optimal solutions and outdo the optimal method in terms of run time. In addition, the GASA hybrid gives better performance than the GA heuristic.
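
    A minimal integer-encoded GA of the kind used for setting Kanban quantities might look like the following; the toy holding-plus-shortage cost function is a hypothetical stand-in for the paper's supply chain model.

    ```python
    import random

    def genetic_algorithm(cost, n_items, max_kanban, pop_size=40, gens=200,
                          cx_rate=0.8, mut_rate=0.1):
        """Integer-encoded GA: a chromosome holds one Kanban quantity per item."""
        def random_individual():
            return [random.randint(1, max_kanban) for _ in range(n_items)]

        pop = [random_individual() for _ in range(pop_size)]
        for _ in range(gens):
            pop.sort(key=cost)
            next_pop = pop[:2]                       # elitism: keep the two best
            while len(next_pop) < pop_size:
                # Binary tournament selection for each parent.
                p1 = min(random.sample(pop, 2), key=cost)
                p2 = min(random.sample(pop, 2), key=cost)
                if random.random() < cx_rate:        # one-point crossover
                    cut = random.randrange(1, n_items)
                    child = p1[:cut] + p2[cut:]
                else:
                    child = list(p1)
                for i in range(n_items):             # random-reset mutation
                    if random.random() < mut_rate:
                        child[i] = random.randint(1, max_kanban)
                next_pop.append(child)
            pop = next_pop
        return min(pop, key=cost)

    # Hypothetical stand-in cost: card holding cost plus a shortage penalty.
    demand = [7, 3, 5]
    best = genetic_algorithm(
        lambda k: sum(k) + 10 * sum(max(d - q, 0) for d, q in zip(demand, k)),
        n_items=3, max_kanban=10)
    ```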

  10. Multiobjective hyper heuristic scheme for system design and optimization

    NASA Astrophysics Data System (ADS)

    Rafique, Amer Farhan

    2012-11-01

    As system design becomes more multifaceted, integrated, and complex, the traditional single objective optimization approach to optimal design is becoming less efficient and effective. Single objective optimization methods present a unique optimal solution, whereas multiobjective methods present a Pareto front. The foremost intent is to predict a reasonably distributed Pareto-optimal solution set, independent of the problem instance, through a multiobjective scheme. Another objective of the intended approach is to improve the worth of the outputs of the complex engineering system design process at the conceptual design phase. The process is automated in order to give the system designer the leverage of studying and analyzing a large number of possible solutions in a short time. This article presents a Multiobjective Hyper Heuristic Optimization Scheme based on low-level meta-heuristics developed for application in engineering system design. Herein, we present a stochastic function to manage the low-level meta-heuristics and increase the assurance of reaching a global optimum solution. A Genetic Algorithm, Simulated Annealing and Swarm Intelligence are used as low-level meta-heuristics in this study. Performance of the proposed scheme is investigated through a comprehensive empirical analysis yielding acceptable results. One of the primary motives for performing multiobjective optimization is that current engineering systems require simultaneous optimization of multiple conflicting objectives. Random decision making makes the implementation of this scheme attractive and easy, and injecting feasible solutions significantly alters the search direction and adds population diversity, helping to accomplish the pre-defined goals of the proposed scheme.

  11. A Bio-inspired Heuristic for the 8-Piece Puzzle

    NASA Astrophysics Data System (ADS)

    Machado, M. O.; Fabres, P. A.; Melo, J. C. L.

    2017-10-01

    This paper investigates a mathematical model inspired by nature, and presents a meta-heuristic that is efficient in improving the performance of an informed search when using the A* strategy with a general search tree as the data structure. The working hypothesis is that the investigated meta-heuristic is optimal in nature and may be promising in minimizing the computational resources required by an objective-based agent in solving problems of high computational complexity (the n-piece puzzle), as well as in the optimization of objective functions for local search agents. The objective of this work is to describe qualitatively the characteristics and properties of the mathematical model investigated, correlating the main concepts of the A* evaluation function with the significant variables of the meta-heuristic used. The article shows that the amount of memory required to perform this search when using the meta-heuristic is less than when using the A* function alone to evaluate the nodes of a general search tree for the eight-piece puzzle. It is concluded that the meta-heuristic must be parameterized according to the chosen heuristic and the level of the tree that contains the possible solutions to the chosen problem.
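
    For reference, a minimal A* for the eight-piece puzzle with the admissible Manhattan-distance heuristic is sketched below; the record's bio-inspired meta-heuristic itself is not specified in the abstract, so this covers only the A* baseline it builds on.

    ```python
    import heapq

    GOAL = (1, 2, 3, 4, 5, 6, 7, 8, 0)  # 0 marks the blank tile

    def manhattan(state):
        """Admissible heuristic: sum of tile distances from their goal cells."""
        return sum(abs(i // 3 - (v - 1) // 3) + abs(i % 3 - (v - 1) % 3)
                   for i, v in enumerate(state) if v)

    def neighbours(state):
        i = state.index(0)
        r, c = divmod(i, 3)
        for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < 3 and 0 <= nc < 3:
                j = nr * 3 + nc
                s = list(state)
                s[i], s[j] = s[j], s[i]   # slide a tile into the blank
                yield tuple(s)

    def a_star(start):
        """A* over the 8-puzzle state graph; returns the number of moves."""
        open_heap = [(manhattan(start), 0, start)]
        g_cost = {start: 0}
        while open_heap:
            f, g, state = heapq.heappop(open_heap)
            if state == GOAL:
                return g
            for nxt in neighbours(state):
                if nxt not in g_cost or g + 1 < g_cost[nxt]:
                    g_cost[nxt] = g + 1
                    heapq.heappush(open_heap, (g + 1 + manhattan(nxt), g + 1, nxt))
        return None

    print(a_star((1, 2, 3, 4, 5, 6, 0, 7, 8)))  # -> 2
    ```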

  12. Performance Review of Harmony Search, Differential Evolution and Particle Swarm Optimization

    NASA Astrophysics Data System (ADS)

    Mohan Pandey, Hari

    2017-08-01

    Meta-heuristic algorithms are effective in the design of intelligent systems. These algorithms are widely applied to solve complex optimization problems in areas including image processing, big data analytics, language processing and pattern recognition. This paper presents a performance comparison of three meta-heuristic algorithms, namely Harmony Search, Differential Evolution, and Particle Swarm Optimization. These algorithms originated from altogether different fields of meta-heuristics, yet share a common objective. Standard benchmark functions are used for the simulation, and statistical tests are conducted to draw conclusions on performance. The key motivation for this research is to categorize the computational capabilities of these algorithms, which may be useful to researchers.
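
    The abstract does not name the benchmark suite; the functions below (sphere, Rastrigin, Rosenbrock) are assumed here as typical examples of the standard benchmarks on which such comparisons are usually run.

    ```python
    import math

    def sphere(x):
        """Unimodal baseline: global minimum 0 at the origin."""
        return sum(v * v for v in x)

    def rastrigin(x):
        """Highly multimodal: global minimum 0 at the origin."""
        return 10 * len(x) + sum(v * v - 10 * math.cos(2 * math.pi * v) for v in x)

    def rosenbrock(x):
        """Narrow curved valley: global minimum 0 at (1, ..., 1)."""
        return sum(100 * (x[i + 1] - x[i] ** 2) ** 2 + (1 - x[i]) ** 2
                   for i in range(len(x) - 1))

    # Each algorithm under test minimizes the same functions from random starts,
    # and the resulting best-value samples feed the statistical tests.
    print(rastrigin([0.0, 0.0]), rosenbrock([1.0, 1.0]))  # -> 0.0 0.0
    ```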

  13. A Swarm Optimization approach for clinical knowledge mining.

    PubMed

    Christopher, J Jabez; Nehemiah, H Khanna; Kannan, A

    2015-10-01

    Rule-based classification is a typical data mining task that is used in several medical diagnosis and decision support systems. The rules stored in the rule base have an impact on classification efficiency. Rule sets that are extracted with data mining tools and techniques are optimized using heuristic or meta-heuristic approaches in order to improve the quality of the rule base. In this work, a meta-heuristic approach called Wind-driven Swarm Optimization (WSO) is used. The uniqueness of this work lies in the biological inspiration that underlies the algorithm. WSO uses Jval, a new metric, to evaluate the efficiency of a rule-based classifier. Rules are extracted from decision trees. WSO is used to obtain different permutations and combinations of rules, whereby the optimal ruleset that satisfies the requirements of the developer is used for predicting the test data. The performance of various extensions of decision trees, namely RIPPER, PART, FURIA and Decision Tables, is analyzed. The efficiency of WSO is also compared with traditional Particle Swarm Optimization. Experiments were carried out with six benchmark medical datasets. The traditional C4.5 algorithm yields 62.89% accuracy with 43 rules for the liver disorders dataset, whereas WSO yields 64.60% with 19 rules. For the heart disease dataset, C4.5 is 68.64% accurate with 98 rules, whereas WSO is 77.8% accurate with 34 rules. The normalized standard deviations of accuracy for PSO and WSO are 0.5921 and 0.5846, respectively. WSO provides accurate and concise rulesets. PSO yields results similar to those of WSO, but the novelty of WSO lies in its biological motivation and its customization for rule base optimization. The trade-off between prediction accuracy and the size of the rule base is optimized during the design and development of a rule-based clinical decision support system, whose efficiency relies on the content of the rule base and the classification accuracy.

  14. LTI system order reduction approach based on asymptotical equivalence and the Co-operation of biology-related algorithms

    NASA Astrophysics Data System (ADS)

    Ryzhikov, I. S.; Semenkin, E. S.; Akhmedova, Sh A.

    2017-02-01

    A novel order reduction method for linear time invariant systems is described. The method is based on reducing the initial problem to an optimization one, using the proposed model representation, and solving that problem with an efficient optimization algorithm. The proposed way of determining the model allows all parameters of the lower-order model to be identified and, by definition, provides the model with the required steady state. As a powerful optimization tool, the meta-heuristic Co-Operation of Biology-Related Algorithms was used. Experimental results show that the proposed approach outperforms other approaches and that the reduced order model achieves a high level of accuracy.

  15. Nonlinear predictive control of a LEGO mobile robot

    NASA Astrophysics Data System (ADS)

    Merabti, H.; Bouchemal, B.; Belarbi, K.; Boucherma, D.; Amouri, A.

    2014-10-01

    Meta-heuristics are general-purpose heuristics which have shown great potential for the solution of difficult optimization problems. In this work, we apply a meta-heuristic, namely particle swarm optimization (PSO), to the optimization problem arising in nonlinear model predictive control (NLMPC). This algorithm is easy to code and may be considered an alternative to more classical solution procedures. The PSO-NLMPC scheme is applied to control a mobile robot for trajectory tracking and obstacle avoidance. Experimental results show the strength of this approach.
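
    A minimal global-best PSO of the kind applied here is sketched below; in the NLMPC setting, the objective would be the receding-horizon cost evaluated over a candidate control sequence. All parameter values are illustrative assumptions.

    ```python
    import random

    def pso(objective, bounds, particles=30, iterations=200,
            w=0.7, c1=1.5, c2=1.5):
        """Minimal global-best PSO for box-constrained minimization."""
        dim = len(bounds)
        pos = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(particles)]
        vel = [[0.0] * dim for _ in range(particles)]
        pbest = [list(p) for p in pos]
        pbest_val = [objective(p) for p in pos]
        g = min(range(particles), key=lambda i: pbest_val[i])
        gbest, gbest_val = list(pbest[g]), pbest_val[g]
        for _ in range(iterations):
            for i in range(particles):
                for d, (lo, hi) in enumerate(bounds):
                    # Inertia + cognitive pull + social pull.
                    vel[i][d] = (w * vel[i][d]
                                 + c1 * random.random() * (pbest[i][d] - pos[i][d])
                                 + c2 * random.random() * (gbest[d] - pos[i][d]))
                    pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
                val = objective(pos[i])
                if val < pbest_val[i]:
                    pbest[i], pbest_val[i] = list(pos[i]), val
                    if val < gbest_val:
                        gbest, gbest_val = list(pos[i]), val
        return gbest, gbest_val

    best, value = pso(lambda x: sum(v * v for v in x), [(-10, 10)] * 2)
    ```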

  16. Double-Group Particle Swarm Optimization and Its Application in Remote Sensing Image Segmentation

    PubMed Central

    Shen, Liang; Huang, Xiaotao; Fan, Chongyi

    2018-01-01

    Particle Swarm Optimization (PSO) is a well-known meta-heuristic that has been widely used in both research and engineering fields. However, the original PSO generally suffers from premature convergence, especially on multimodal problems. In this paper, we propose a double-group PSO (DG-PSO) algorithm to improve the performance. DG-PSO uses a double-group based evolution framework: the individuals are divided into an advantaged group and a disadvantaged group. The advantaged group works according to the original PSO, while two new strategies are developed for the disadvantaged group. The proposed algorithm is first evaluated by comparing it with five other popular PSO variants and two state-of-the-art meta-heuristics on various benchmark functions. The results demonstrate that DG-PSO shows remarkable performance in terms of accuracy and stability. Then, we apply DG-PSO to multilevel thresholding for remote sensing image segmentation. The results show that the proposed algorithm outperforms five other popular algorithms in meta-heuristic-based multilevel thresholding, which verifies its effectiveness. PMID:29724013

  17. Investigation into the efficiency of different bionic algorithm combinations for a COBRA meta-heuristic

    NASA Astrophysics Data System (ADS)

    Akhmedova, Sh; Semenkin, E.

    2017-02-01

    Previously, a meta-heuristic approach called Co-Operation of Biology-Related Algorithms (COBRA) for solving real-parameter optimization problems was introduced and described. COBRA's basic idea consists of the cooperative work of five well-known bionic algorithms: Particle Swarm Optimization, the Wolf Pack Search, the Firefly Algorithm, the Cuckoo Search Algorithm and the Bat Algorithm, chosen due to the similarity of their schemes. The performance of this meta-heuristic was evaluated on a set of test functions and its workability was demonstrated, establishing that the idea of cooperative work among the algorithms is useful. However, it was unclear which bionic algorithms should be included in this cooperation and how many of them. Therefore, the five above-listed algorithms, together with the Fish School Search algorithm, were used to develop five different modifications of COBRA by varying the number of component algorithms. These modifications were tested on the same set of functions and the best of them was identified. Ways of further improving the COBRA algorithm are then discussed.

  18. Double-Group Particle Swarm Optimization and Its Application in Remote Sensing Image Segmentation.

    PubMed

    Shen, Liang; Huang, Xiaotao; Fan, Chongyi

    2018-05-01

    Particle Swarm Optimization (PSO) is a well-known meta-heuristic that has been widely used in both research and engineering fields. However, the original PSO generally suffers from premature convergence, especially on multimodal problems. In this paper, we propose a double-group PSO (DG-PSO) algorithm to improve the performance. DG-PSO uses a double-group based evolution framework: the individuals are divided into an advantaged group and a disadvantaged group. The advantaged group works according to the original PSO, while two new strategies are developed for the disadvantaged group. The proposed algorithm is first evaluated by comparing it with five other popular PSO variants and two state-of-the-art meta-heuristics on various benchmark functions. The results demonstrate that DG-PSO shows remarkable performance in terms of accuracy and stability. Then, we apply DG-PSO to multilevel thresholding for remote sensing image segmentation. The results show that the proposed algorithm outperforms five other popular algorithms in meta-heuristic-based multilevel thresholding, which verifies its effectiveness.

  19. A new hybrid meta-heuristic algorithm for optimal design of large-scale dome structures

    NASA Astrophysics Data System (ADS)

    Kaveh, A.; Ilchi Ghazaan, M.

    2018-02-01

    In this article, a hybrid algorithm based on the vibrating particles system (VPS) algorithm, multi-design variable configuration (Multi-DVC) cascade optimization, and an upper bound strategy (UBS) is presented for the global optimization of large-scale dome truss structures. The new algorithm, called MDVC-UVPS, uses the VPS algorithm as its main engine. The VPS algorithm is one of the most recent multi-agent meta-heuristic algorithms, mimicking the mechanisms of damped free vibration of single degree of freedom systems. In order to handle a large number of variables, cascade sizing optimization utilizing a series of DVCs is used. Moreover, the UBS is utilized to reduce the computational time. Various dome truss examples are studied to demonstrate the effectiveness and robustness of the proposed method as compared to some existing structural optimization techniques. The results indicate that the MDVC-UVPS technique is a powerful search and optimization method for structural engineering problems.

  20. Stable and accurate methods for identification of water bodies from Landsat series imagery using meta-heuristic algorithms

    NASA Astrophysics Data System (ADS)

    Gamshadzaei, Mohammad Hossein; Rahimzadegan, Majid

    2017-10-01

    Identification of water extents in Landsat images is challenging due to surfaces whose reflectance is similar to that of water. The objective of this study is to provide stable and accurate methods for identifying water extents in Landsat images based on meta-heuristic algorithms. To this end, seven Landsat images were selected from various environmental regions of Iran. Training of the algorithms was performed using 40 water pixels and 40 non-water pixels in Operational Land Imager images of Chitgar Lake (one of the study regions), and high-resolution images from Google Earth were digitized to evaluate the results. Two approaches were considered: index-based methods and artificial intelligence (AI) algorithms. In the first approach, nine common water spectral indices were investigated. In the second, AI algorithms were utilized to acquire the coefficients of optimal band combinations for extracting water extents; the artificial neural network algorithm as well as the ant colony optimization, genetic algorithm, and particle swarm optimization (PSO) meta-heuristic algorithms were implemented. The index-based methods showed different performances in different regions. Among the AI methods, PSO had the best performance, with an average overall accuracy and kappa coefficient of 93% and 98%, respectively. The results indicate the applicability of the acquired band combinations for accurately and stably extracting water extents in Landsat imagery.

  1. MetaPIGA v2.0: maximum likelihood large phylogeny estimation using the metapopulation genetic algorithm and other stochastic heuristics.

    PubMed

    Helaers, Raphaël; Milinkovitch, Michel C

    2010-07-15

    The development, in the last decade, of stochastic heuristics implemented in robust application software has made large phylogeny inference a key step in most comparative studies involving molecular sequences. Still, the choice of phylogeny inference software is often dictated by a combination of parameters not related to the raw performance of the implemented algorithm(s) but rather by practical issues such as ergonomics and/or the availability of specific functionalities. Here, we present MetaPIGA v2.0, a robust implementation of several stochastic heuristics for large phylogeny inference (under maximum likelihood), including a Simulated Annealing algorithm, a classical Genetic Algorithm, and the Metapopulation Genetic Algorithm (metaGA), together with complex substitution models, discrete Gamma rate heterogeneity, and the possibility to partition data. MetaPIGA v2.0 also implements the Likelihood Ratio Test, the Akaike Information Criterion, and the Bayesian Information Criterion for automated selection of the substitution models that best fit the data. Heuristics and substitution models are highly customizable through manual batch files and command line processing. However, MetaPIGA v2.0 also offers an extensive graphical user interface for setting parameters, generating and running batch files, following run progress, and manipulating result trees. MetaPIGA v2.0 uses standard formats for data sets and trees, is platform independent, runs on 32- and 64-bit systems, and takes advantage of multiprocessor and multicore computers. The metaGA resolves the major problem inherent to classical Genetic Algorithms by maintaining high inter-population variation even under strong intra-population selection. Implementation of the metaGA together with additional stochastic heuristics in a single software package allows rigorous optimization of each heuristic as well as a meaningful comparison of performance among these algorithms. MetaPIGA v2.0 gives access both to high customization for the phylogeneticist and to an ergonomic interface and functionalities assisting the non-specialist in sound inference of large phylogenetic trees using nucleotide sequences. MetaPIGA v2.0 and its extensive user manual are freely available to academics at http://www.metapiga.org.

  2. MetaPIGA v2.0: maximum likelihood large phylogeny estimation using the metapopulation genetic algorithm and other stochastic heuristics

    PubMed Central

    2010-01-01

    Background The development, in the last decade, of stochastic heuristics implemented in robust application software has made large phylogeny inference a key step in most comparative studies involving molecular sequences. Still, the choice of phylogeny inference software is often dictated by a combination of parameters not related to the raw performance of the implemented algorithm(s) but rather by practical issues such as ergonomics and/or the availability of specific functionalities. Results Here, we present MetaPIGA v2.0, a robust implementation of several stochastic heuristics for large phylogeny inference (under maximum likelihood), including a Simulated Annealing algorithm, a classical Genetic Algorithm, and the Metapopulation Genetic Algorithm (metaGA), together with complex substitution models, discrete Gamma rate heterogeneity, and the possibility to partition data. MetaPIGA v2.0 also implements the Likelihood Ratio Test, the Akaike Information Criterion, and the Bayesian Information Criterion for automated selection of the substitution models that best fit the data. Heuristics and substitution models are highly customizable through manual batch files and command line processing. However, MetaPIGA v2.0 also offers an extensive graphical user interface for setting parameters, generating and running batch files, following run progress, and manipulating result trees. MetaPIGA v2.0 uses standard formats for data sets and trees, is platform independent, runs on 32- and 64-bit systems, and takes advantage of multiprocessor and multicore computers. Conclusions The metaGA resolves the major problem inherent to classical Genetic Algorithms by maintaining high inter-population variation even under strong intra-population selection. Implementation of the metaGA together with additional stochastic heuristics in a single software package allows rigorous optimization of each heuristic as well as a meaningful comparison of performance among these algorithms. MetaPIGA v2.0 gives access both to high customization for the phylogeneticist and to an ergonomic interface and functionalities assisting the non-specialist in sound inference of large phylogenetic trees using nucleotide sequences. MetaPIGA v2.0 and its extensive user manual are freely available to academics at http://www.metapiga.org. PMID:20633263

  3. A novel hybrid meta-heuristic technique applied to the well-known benchmark optimization problems

    NASA Astrophysics Data System (ADS)

    Abtahi, Amir-Reza; Bijari, Afsane

    2017-03-01

    In this paper, a hybrid meta-heuristic algorithm based on the imperialistic competition algorithm (ICA), harmony search (HS), and simulated annealing (SA) is presented. The body of the proposed hybrid algorithm is based on ICA. The hybrid algorithm inherits the advantages of the harmony creation process of the HS algorithm to improve the exploitation phase of ICA, and uses SA to strike a balance between the exploration and exploitation phases. The proposed hybrid algorithm is compared with several meta-heuristic methods, including the genetic algorithm (GA), HS, and ICA, on several well-known benchmark instances. Comprehensive experiments and statistical analysis on standard benchmark functions confirm the superiority of the proposed method over the other algorithms. The efficacy of the proposed hybrid algorithm is promising, and it can be applied to several real-life engineering and management problems.

  4. Optimal path planning for a mobile robot using cuckoo search algorithm

    NASA Astrophysics Data System (ADS)

    Mohanty, Prases K.; Parhi, Dayal R.

    2016-03-01

    The shortest/optimal path planning is essential for the efficient operation of autonomous vehicles. In this article, a new nature-inspired meta-heuristic algorithm is applied to mobile robot path planning in an unknown or partially known environment populated by a variety of static obstacles. This meta-heuristic algorithm is based on the Lévy flight behaviour and brood parasitic behaviour of cuckoos. A new objective function has been formulated between the robot, the target and the obstacles, which satisfies the conditions of obstacle avoidance and the target-seeking behaviour of robots present in the terrain. Depending upon the objective function value of each nest (cuckoo) in the swarm, the robot avoids obstacles and proceeds towards the target. A smooth optimal trajectory is framed with this algorithm when the robot reaches its goal. Simulation and experimental results are presented at the end of the paper to show the effectiveness of the proposed navigational controller.
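
    A minimal cuckoo search sketch, assuming Mantegna's algorithm for the Lévy-flight step and standard nest abandonment, is shown below; in the path planning setting, the objective would score a candidate waypoint against the obstacle avoidance and target-seeking terms the abstract describes.

    ```python
    import math
    import random

    def levy_step(beta=1.5):
        """Mantegna's algorithm for a heavy-tailed Lévy-distributed step length."""
        sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
                 / (math.gamma((1 + beta) / 2) * beta
                    * 2 ** ((beta - 1) / 2))) ** (1 / beta)
        u = random.gauss(0.0, sigma)
        v = random.gauss(0.0, 1.0)
        return u / abs(v) ** (1 / beta)

    def cuckoo_search(objective, bounds, nests=25, pa=0.25, iterations=500):
        """Minimal cuckoo search: Lévy-flight moves plus nest abandonment."""
        def clamp(v, lo, hi):
            return min(max(v, lo), hi)

        pop = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(nests)]
        fit = [objective(p) for p in pop]
        for _ in range(iterations):
            best = pop[min(range(nests), key=lambda i: fit[i])]
            for i in range(nests):
                # Lévy flight biased toward the current best nest.
                cand = [clamp(x + 0.01 * levy_step() * (x - b), lo, hi)
                        for x, b, (lo, hi) in zip(pop[i], best, bounds)]
                fc = objective(cand)
                j = random.randrange(nests)       # compare against a random nest
                if fc < fit[j]:
                    pop[j], fit[j] = cand, fc
            for i in range(nests):                # abandon a fraction pa of nests
                if random.random() < pa:
                    pop[i] = [random.uniform(lo, hi) for lo, hi in bounds]
                    fit[i] = objective(pop[i])
        k = min(range(nests), key=lambda i: fit[i])
        return pop[k], fit[k]

    best, value = cuckoo_search(lambda x: sum(v * v for v in x), [(-5, 5)] * 2)
    ```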

  5. Non-uniform cosine modulated filter banks using meta-heuristic algorithms in CSD space.

    PubMed

    Kalathil, Shaeen; Elias, Elizabeth

    2015-11-01

    This paper presents an efficient design of non-uniform cosine modulated filter banks (CMFB) using canonic signed digit (CSD) coefficients. CMFBs have a simple and efficient design approach: non-uniform decomposition can be obtained by merging the appropriate filters of a uniform filter bank, so only the prototype filter needs to be designed and optimized. In this paper, the prototype filter is designed using the window method, weighted Chebyshev approximation and weighted constrained least squares approximation. The coefficients are quantized into CSD using a look-up table. The finite-precision CSD rounding deteriorates the filter bank performance, which is then improved using suitably modified meta-heuristic algorithms. The meta-heuristic algorithms modified and used in this paper are the Artificial Bee Colony algorithm, the Gravitational Search algorithm, the Harmony Search algorithm and the Genetic Algorithm; they result in filter banks with lower implementation complexity, power consumption and area requirements when compared with those of the conventional continuous-coefficient non-uniform CMFB.

  6. Non-uniform cosine modulated filter banks using meta-heuristic algorithms in CSD space

    PubMed Central

    Kalathil, Shaeen; Elias, Elizabeth

    2014-01-01

    This paper presents an efficient design of non-uniform cosine modulated filter banks (CMFB) using canonic signed digit (CSD) coefficients. CMFBs have a simple and efficient design approach: non-uniform decomposition can be obtained by merging the appropriate filters of a uniform filter bank, so only the prototype filter needs to be designed and optimized. In this paper, the prototype filter is designed using the window method, weighted Chebyshev approximation and weighted constrained least squares approximation. The coefficients are quantized into CSD using a look-up table. The finite-precision CSD rounding deteriorates the filter bank performance, which is then improved using suitably modified meta-heuristic algorithms. The meta-heuristic algorithms modified and used in this paper are the Artificial Bee Colony algorithm, the Gravitational Search algorithm, the Harmony Search algorithm and the Genetic Algorithm; they result in filter banks with lower implementation complexity, power consumption and area requirements when compared with those of the conventional continuous-coefficient non-uniform CMFB. PMID:26644921

  7. A new chaotic multi-verse optimization algorithm for solving engineering optimization problems

    NASA Astrophysics Data System (ADS)

    Sayed, Gehad Ismail; Darwish, Ashraf; Hassanien, Aboul Ella

    2018-03-01

    The multi-verse optimization algorithm (MVO) is one of the recent meta-heuristic optimization algorithms, whose main inspiration is the multi-verse theory in physics. However, MVO, like most optimization algorithms, suffers from a low convergence rate and entrapment in local optima. In this paper, a new chaotic multi-verse optimization algorithm (CMVO) is proposed to overcome these problems. The proposed CMVO is applied to 13 benchmark functions and 7 well-known design problems in the engineering and mechanical field, namely the three-bar truss, speed reducer design, pressure vessel, spring design, welded beam, rolling element bearing and multiple disc clutch brake problems. In the current study, a modified feasibility-based mechanism is employed to handle constraints. In this mechanism, four rules are used to handle the specific constraint problem by maintaining a balance between feasible and infeasible solutions. Moreover, 10 well-known chaotic maps are used to improve the performance of MVO. The experimental results showed that CMVO outperforms other meta-heuristic optimization algorithms on most of the optimization problems, and that the sine chaotic map is the most appropriate map to significantly boost MVO's performance.
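
    To illustrate the chaotic-map idea, here is the sine map in the common normalized form x_{k+1} = (a/4)*sin(pi*x_k); the exact map definition used by the authors may differ, so this form is an assumption.

    ```python
    import math

    def sine_map_sequence(x0=0.7, a=4.0, n=10):
        """Sine chaotic map x_{k+1} = (a/4)*sin(pi*x_k); with a = 4 it generates
        ergodic values in (0, 1) that can stand in for uniform random draws."""
        seq, x = [], x0
        for _ in range(n):
            x = (a / 4.0) * math.sin(math.pi * x)
            seq.append(x)
        return seq

    # E.g., chaotic values of this kind could drive an algorithm's random
    # decisions instead of random.random(), the substitution CMVO-style
    # methods make to improve exploration.
    print(sine_map_sequence())
    ```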

  8. Ant Colony Optimization for Markowitz Mean-Variance Portfolio Model

    NASA Astrophysics Data System (ADS)

    Deng, Guang-Feng; Lin, Woo-Tsong

    This work presents Ant Colony Optimization (ACO), which was initially developed as a meta-heuristic for combinatorial optimization, for solving the cardinality-constrained Markowitz mean-variance portfolio model (a nonlinear mixed quadratic programming problem). To our knowledge, an efficient algorithmic solution for this problem had not been proposed previously, making the use of heuristic algorithms imperative. Numerical solutions are obtained for five analyses of weekly price data for the period March 1992 to September 1997 for the following indices: Hang Seng 31 in Hong Kong, DAX 100 in Germany, FTSE 100 in the UK, S&P 100 in the USA and Nikkei 225 in Japan. The test results indicate that ACO is much more robust and effective than Particle Swarm Optimization (PSO), especially for low-risk investment portfolios.

  9. IMHOTEP—a composite score integrating popular tools for predicting the functional consequences of non-synonymous sequence variants

    PubMed Central

    Knecht, Carolin; Mort, Matthew; Junge, Olaf; Cooper, David N.; Krawczak, Michael

    2017-01-01

    The in silico prediction of the functional consequences of mutations is an important goal of human pathogenetics. However, bioinformatic tools that classify mutations according to their functionality employ different algorithms, so that predictions may vary markedly between tools. We therefore integrated nine popular prediction tools (PolyPhen-2, SNPs&GO, MutPred, SIFT, MutationTaster2, Mutation Assessor and FATHMM, as well as the conservation-based Grantham Score and PhyloP) into a single predictor. The optimal combination of these tools was selected by means of a wide range of statistical modeling techniques, drawing upon 10 029 disease-causing single nucleotide variants (SNVs) from the Human Gene Mutation Database and 10 002 putatively ‘benign’ non-synonymous SNVs from UCSC. Predictive performance was found to be markedly improved by model-based integration, whilst maximum predictive capability was obtained with either random forest, decision tree or logistic regression analysis. A combination of PolyPhen-2, SNPs&GO, MutPred, MutationTaster2 and FATHMM was found to perform as well as all tools combined. Comparison of our approach with other integrative approaches such as Condel, CoVEC, CAROL, CADD, MetaSVM and MetaLR using an independent validation dataset revealed the superiority of our newly proposed integrative approach. An online implementation of this approach, IMHOTEP (‘Integrating Molecular Heuristics and Other Tools for Effect Prediction’), is provided at http://www.uni-kiel.de/medinfo/cgi-bin/predictor/. PMID:28180317

  10. Optimizing a multi-product closed-loop supply chain using NSGA-II, MOSA, and MOPSO meta-heuristic algorithms

    NASA Astrophysics Data System (ADS)

    Babaveisi, Vahid; Paydar, Mohammad Mahdi; Safaei, Abdul Sattar

    2018-07-01

    This study discusses solution methodologies for a closed-loop supply chain (CLSC) network that includes the collection of used products as well as the distribution of new products. This supply chain is presented as representative of the problems that can be solved by the proposed meta-heuristic algorithms. A mathematical model is designed for a CLSC involving three objective functions: maximizing profit, and minimizing total risk and product shortages. Since three objective functions are considered, a multi-objective solution methodology is advantageous. Therefore, several approaches have been studied: an NSGA-II algorithm is first utilized, and the results are then validated using MOSA and MOPSO algorithms. Priority-based encoding, which is used in all the algorithms, is the core of the solution computations. To compare the performance of the meta-heuristics, random numerical instances are evaluated by four criteria: mean ideal distance, spread of non-dominated solutions, number of Pareto solutions, and CPU time. In order to enhance the performance of the algorithms, the Taguchi method is used for parameter tuning. Finally, sensitivity analyses are performed and the computational results are presented based on the sensitivity analyses in parameter tuning.
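
    All three meta-heuristics here rank candidate solutions by Pareto dominance; a minimal dominance check and non-dominated filter in that spirit is shown below, with the sign convention (profit negated so all objectives are minimized) as an illustrative assumption.

    ```python
    def dominates(a, b):
        """True if objective vector `a` Pareto-dominates `b` (all minimized)."""
        return (all(x <= y for x, y in zip(a, b))
                and any(x < y for x, y in zip(a, b)))

    def non_dominated(points):
        """Return the Pareto front of a list of objective vectors."""
        return [p for p in points
                if not any(dominates(q, p) for q in points if q is not p)]

    # Three objectives per the CLSC model: (-profit, risk, shortage), minimized.
    solutions = [(-120.0, 3.0, 5.0), (-100.0, 2.0, 4.0), (-90.0, 4.0, 6.0)]
    print(non_dominated(solutions))  # the third vector is dominated by the first
    ```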

  11. Optimizing a multi-product closed-loop supply chain using NSGA-II, MOSA, and MOPSO meta-heuristic algorithms

    NASA Astrophysics Data System (ADS)

    Babaveisi, Vahid; Paydar, Mohammad Mahdi; Safaei, Abdul Sattar

    2017-07-01

    This study discusses solution methodologies for a closed-loop supply chain (CLSC) network that includes the collection of used products as well as the distribution of new products. This supply chain is presented as representative of the problems that can be solved by the proposed meta-heuristic algorithms. A mathematical model is designed for a CLSC involving three objective functions: maximizing profit, and minimizing total risk and product shortages. Since three objective functions are considered, a multi-objective solution methodology is advantageous. Therefore, several approaches have been studied: an NSGA-II algorithm is first utilized, and the results are then validated using MOSA and MOPSO algorithms. Priority-based encoding, which is used in all the algorithms, is the core of the solution computations. To compare the performance of the meta-heuristics, random numerical instances are evaluated by four criteria: mean ideal distance, spread of non-dominated solutions, number of Pareto solutions, and CPU time. In order to enhance the performance of the algorithms, the Taguchi method is used for parameter tuning. Finally, sensitivity analyses are performed and the computational results are presented based on the sensitivity analyses in parameter tuning.

  12. Identification of Disease Critical Genes Using Collective Meta-heuristic Approaches: An Application to Preeclampsia.

    PubMed

    Biswas, Surama; Dutta, Subarna; Acharyya, Sriyankar

    2017-12-01

    Identifying a small subset of disease-critical genes in a large microarray gene expression dataset is a challenge in computational life sciences. This paper applies four meta-heuristic algorithms, namely honey bee mating optimization (HBMO), harmony search (HS), differential evolution (DE) and a genetic algorithm (basic version, GA), to find disease-critical genes of preeclampsia, which affects women during gestation. Two hybrid algorithms, HBMO-kNN and HS-kNN, are newly proposed here, where kNN (k-nearest-neighbor classifier) is used for sample classification. The performance of these new approaches has been compared with that of two other hybrid algorithms, DE-kNN and SGA-kNN. Three datasets of different sizes have been used. In each dataset, the set of genes found common to the output of all the algorithms is considered the set of disease-critical genes. Across the datasets, the classification accuracy of the meta-heuristic algorithms varied between 92.46 and 100%. HBMO-kNN had the best performance (99.64-100%) in almost all datasets, with DE-kNN second (99.42-100%). The disease-critical genes obtained here match clinically revealed preeclampsia genes to a large extent.

  13. Optimizing the warranty period by cuckoo meta-heuristic algorithm in heterogeneous customers' population

    NASA Astrophysics Data System (ADS)

    Roozitalab, Ali; Asgharizadeh, Ezzatollah

    2013-12-01

    Warranty is now an integral part of each product. Since its length is directly related to the cost of production, it should be set so as to maximize revenue generation and customer satisfaction. Furthermore, based on customer behavior, it is assumed that increasing the warranty period to earn the trust of more customers leads to more sales until the market is saturated. Different groups of consumers have different consumption behaviors, and the performance of the product has a direct impact on the failure rate over its life, so the optimum duration for every group is different. Since we cannot offer different warranty periods to different customer groups, we use the cuckoo meta-heuristic optimization algorithm to find a common period for the entire population. Results with high convergence offer a term length that maximizes the aforementioned goals simultaneously. The approach was tested using real data from an appliance company. The results indicate a significant increase in sales when the optimization approach was applied: the longer warranty increased sales revenue, increasing profit margins rather than reducing them.

  14. Development a heuristic method to locate and allocate the medical centers to minimize the earthquake relief operation time.

    PubMed

    Aghamohammadi, Hossein; Saadi Mesgari, Mohammad; Molaei, Damoon; Aghamohammadi, Hasan

    2013-01-01

    Location-allocation is a combinatorial optimization problem, classified as non-deterministic polynomial-time hard (NP-hard). Therefore, the solution of such a problem should be shifted from exact to heuristic or meta-heuristic methods due to its complexity. Locating medical centers and allocating the injured of an earthquake to them is highly important in earthquake disaster management, so that developing a proper method will reduce relief operation time and consequently decrease the number of fatalities. This paper presents the development of a heuristic method based on two nested genetic algorithms to optimize this location-allocation problem using the capabilities of a Geographic Information System (GIS). In the proposed method, the outer genetic algorithm is applied to the location part of the problem and the inner genetic algorithm optimizes the resource allocation. The final outcome of the implemented method includes the spatial locations of the new required medical centers, and the method also calculates how many of the injured at each demand point should be taken to each of the existing and new medical centers. The results showed the high performance of the designed structure in solving a capacitated location-allocation problem that may arise in a disaster situation when injured people have to be taken to medical centers in a reasonable time.

  15. A Hybrid Cellular Genetic Algorithm for Multi-objective Crew Scheduling Problem

    NASA Astrophysics Data System (ADS)

    Jolai, Fariborz; Assadipour, Ghazal

    Crew scheduling is one of the important problems of the airline industry. The problem is to assign crew members to a set of flights such that all flights are covered while, for a robust schedule, the total cost, delays, and unbalanced utilization are minimized. As the problem is NP-hard and the objectives are in conflict with each other, a multi-objective meta-heuristic called CellDE, a hybrid cellular genetic algorithm, is implemented as the optimization method. The proposed algorithm provides the decision maker with a set of non-dominated or Pareto-optimal solutions, and enables them to choose the best one according to their preferences. A set of problems of different sizes is generated and solved using the proposed algorithm. To evaluate its performance, three metrics are suggested, and the diversity and convergence of the achieved Pareto front are appraised. Finally, a comparison is made between CellDE and PAES, another meta-heuristic algorithm; the results show the superiority of CellDE.

  16. A Simulation of Readiness-Based Sparing Policies

    DTIC Science & Technology

    2017-06-01

    ...variant of a greedy heuristic algorithm to set stock levels and estimate overall WS availability. Our discrete event simulation is then used to test the...available in the optimization tools. Subject terms: readiness-based sparing, discrete event simulation, optimization, multi-indenture.

  17. A hybrid algorithm optimization approach for machine loading problem in flexible manufacturing system

    NASA Astrophysics Data System (ADS)

    Kumar, Vijay M.; Murthy, ANN; Chandrashekara, K.

    2012-05-01

    The production planning problem of a flexible manufacturing system (FMS) concerns decisions that have to be made before an FMS begins to produce parts according to a given production plan during an upcoming planning horizon. The main aspect of production planning is the machine loading problem, in which a subset of jobs to be manufactured is selected and their operations are assigned to the relevant machines. Such problems are not only combinatorial optimization problems, but also NP-hard, making it difficult to obtain satisfactory solutions using traditional optimization techniques. In this paper, an attempt has been made to address the machine loading problem with the simultaneous objectives of minimizing system unbalance and maximizing throughput, while satisfying the system constraints on available machining time and tool slots, using a hybrid meta-heuristic technique based on genetic algorithm and particle swarm optimization. The results reported in this paper demonstrate the model's efficiency and examine the performance of the system with respect to measures such as throughput and system utilization.

  18. Meta-RaPS Algorithm for the Aerial Refueling Scheduling Problem

    NASA Technical Reports Server (NTRS)

    Kaplan, Sezgin; Arin, Arif; Rabadi, Ghaith

    2011-01-01

    The Aerial Refueling Scheduling Problem (ARSP) can be defined as determining the refueling completion times for each fighter aircraft (job) on multiple tankers (machines). ARSP assumes that jobs have different release times and due dates, and the total weighted tardiness is used to evaluate a schedule's quality. ARSP can therefore be modeled as a parallel machine scheduling problem with release times and due dates, with the objective of minimizing the total weighted tardiness. Since ARSP is NP-hard, it is more appropriate to develop approximate or heuristic algorithms to obtain solutions in reasonable computation times. In this paper, the Meta-RaPS-ATC algorithm is implemented to create high-quality solutions. Meta-RaPS (Meta-heuristic for Randomized Priority Search) is a recent and promising meta-heuristic that is applied by introducing randomness into a construction heuristic. The Apparent Tardiness Cost (ATC) rule, which performs well for scheduling problems with tardiness objectives, is used to construct initial solutions, which are then improved by an exchange operation. Results are presented for generated instances.
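
    The construction step that Meta-RaPS randomizes can be illustrated with the standard ATC priority index. The sketch below is a simplification (single machine, release times ignored, and a hypothetical `restriction` parameter governing the randomness), not the authors' exact procedure:

    ```python
    import math, random

    def atc_index(job, t, k, p_bar):
        """ATC priority: weight-to-processing-time ratio discounted by slack."""
        slack = max(job["d"] - job["p"] - t, 0.0)
        return (job["w"] / job["p"]) * math.exp(-slack / (k * p_bar))

    def randomized_atc_schedule(jobs, k=2.0, restriction=0.8):
        """Meta-RaPS-style construction: with probability `restriction` take the
        best ATC job, otherwise pick randomly among the remaining candidates."""
        t, schedule, remaining = 0.0, [], jobs[:]
        p_bar = sum(j["p"] for j in jobs) / len(jobs)
        while remaining:
            remaining.sort(key=lambda j: atc_index(j, t, k, p_bar), reverse=True)
            i = 0 if random.random() < restriction else random.randrange(len(remaining))
            pick = remaining.pop(i)
            schedule.append(pick)
            t += pick["p"]
        return schedule

    jobs = [{"w": 2, "p": 3.0, "d": 9.0}, {"w": 1, "p": 5.0, "d": 6.0},
            {"w": 3, "p": 2.0, "d": 4.0}]
    print([j["d"] for j in randomized_atc_schedule(jobs)])
    ```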

  19. A Modified Particle Swarm Optimization Technique for Finding Optimal Designs for Mixture Models

    PubMed Central

    Wong, Weng Kee; Chen, Ray-Bing; Huang, Chien-Chih; Wang, Weichung

    2015-01-01

    Particle Swarm Optimization (PSO) is a meta-heuristic algorithm that has been shown to be successful in solving a wide variety of real and complicated optimization problems in engineering and computer science. This paper introduces a projection-based PSO technique, named ProjPSO, to efficiently find different types of optimal designs, or nearly optimal designs, for mixture models with and without constraints on the components, and also for related models, like the log contrast models. We also compare the modified PSO performance with Fedorov's algorithm, a popular algorithm used to generate optimal designs, the Cocktail algorithm, and the recent algorithm proposed by [1]. PMID:26091237
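
    The projection idea is the key ingredient: mixture proportions must be nonnegative and sum to one, so after each PSO move a particle can be pushed back onto the probability simplex. A sketch using the standard sort-based Euclidean projection (the projection used in ProjPSO itself may differ in detail):

    ```python
    import numpy as np

    def project_to_simplex(x):
        """Euclidean projection of x onto {q : q >= 0, sum(q) = 1}, the feasible
        region for mixture-model component proportions."""
        u = np.sort(x)[::-1]
        css = np.cumsum(u)
        rho = np.nonzero(u + (1.0 - css) / (np.arange(len(x)) + 1) > 0)[0][-1]
        theta = (1.0 - css[rho]) / (rho + 1.0)
        return np.maximum(x + theta, 0.0)

    # After a standard PSO velocity/position update, each particle (a candidate
    # design point) is mapped back to a valid mixture:
    print(project_to_simplex(np.array([0.7, 0.5, -0.1])))  # [0.6, 0.4, 0.0]
    ```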

  20. Identification of Shearer Cutting Patterns Using Vibration Signals Based on a Least Squares Support Vector Machine with an Improved Fruit Fly Optimization Algorithm

    PubMed Central

    Si, Lei; Wang, Zhongbin; Liu, Xinhua; Tan, Chao; Liu, Ze; Xu, Jing

    2016-01-01

    Shearers play an important role at the fully mechanized coal mining face, and accurately identifying their cutting pattern is very helpful for improving the automation level of shearers and ensuring the safety of coal mining. The least squares support vector machine (LSSVM) has been proven to offer strong potential in prediction and classification issues, particularly when an appropriate meta-heuristic algorithm is employed to determine the values of its two parameters. However, these meta-heuristic algorithms have the drawbacks of being hard to understand and reaching the global optimal solution slowly. In this paper, an improved fruit fly optimization algorithm (IFOA) was presented to optimize the parameters of LSSVM, and the LSSVM coupled with IFOA (IFOA-LSSVM) was used to identify the shearer cutting pattern. The vibration acceleration signals of five cutting patterns were collected and the special state features were extracted based on the ensemble empirical mode decomposition (EEMD) and the kernel function. Several examples of the IFOA-LSSVM model were then presented and the results were compared in detail with LSSVM, PSO-LSSVM, GA-LSSVM and FOA-LSSVM models. The comparison results indicate that the proposed approach is feasible, efficient and outperforms the others. Finally, an industrial application example at the coal mining face was demonstrated to specify the effect of the proposed system. PMID:26771615

  1. Hybrid real-code ant colony optimisation for constrained mechanical design

    NASA Astrophysics Data System (ADS)

    Pholdee, Nantiwat; Bureerat, Sujin

    2016-01-01

    This paper proposes a hybrid meta-heuristic based on integrating a local search simplex downhill (SDH) method into the search procedure of real-code ant colony optimisation (ACOR). This hybridisation leads to five hybrid algorithms where a Monte Carlo technique, a Latin hypercube sampling technique (LHS) and a translational propagation Latin hypercube design (TPLHD) algorithm are used to generate an initial population. Also, two numerical schemes for selecting an initial simplex are investigated. The original ACOR and its hybrid versions along with a variety of established meta-heuristics are implemented to solve 17 constrained test problems where a fuzzy set theory penalty function technique is used to handle design constraints. The comparative results show that the hybrid algorithms are the top performers. Using the TPLHD technique gives better results than the other sampling techniques. The hybrid optimisers are a powerful design tool for constrained mechanical design problems.

  2. Load Frequency Control of AC Microgrid Interconnected Thermal Power System

    NASA Astrophysics Data System (ADS)

    Lal, Deepak Kumar; Barisal, Ajit Kumar

    2017-08-01

    In this paper, a microgrid (MG) power generation system is interconnected with a single-area reheat thermal power system for a load frequency control study. A new meta-heuristic optimization algorithm, the Moth-Flame Optimization (MFO) algorithm, is applied to evaluate the optimal gains of fuzzy-based proportional, integral and derivative (PID) controllers. The system dynamic performance is studied by comparing the results with MFO-optimized classical PI/PID controllers. The system performance is also investigated with a fuzzy PID controller optimized by the recently developed grey wolf optimizer (GWO) algorithm, which has proven its superiority over other previously developed algorithms in many interconnected power systems.

  3. Mitigating energy loss on distribution lines through the allocation of reactors

    NASA Astrophysics Data System (ADS)

    Miranda, T. M.; Romero, F.; Meffe, A.; Castilho Neto, J.; Abe, L. F. T.; Corradi, F. E.

    2018-03-01

    This paper presents a methodology for the automatic allocation of reactors on medium voltage distribution lines to reduce energy loss. In Brazil, some feeders are distinguished by their long lengths and very low load, which results in a strong influence of the line capacitance on the circuit's performance, requiring compensation through the installation of reactors. The automatic allocation is accomplished using an optimization meta-heuristic called the Global Neighbourhood Algorithm. Given a set of reactor models and a circuit, it outputs an optimal solution in terms of energy loss reduction. The algorithm also verifies that the voltage limits specified by the user are not violated, besides checking for energy quality. The methodology was implemented in a software tool, which can also display the allocation graphically. A simulation with four real feeders is presented in the paper. The obtained results reduced the energy loss significantly, ranging from 50.56% in the worst case to 93.10% in the best case.

  4. Lexicographic goal programming and assessment tools for a combinatorial production problem.

    DOT National Transportation Integrated Search

    2008-01-01

    NP-complete combinatorial problems often necessitate the use of near-optimal solution techniques including heuristics and metaheuristics. The addition of multiple optimization criteria can further complicate comparison of these solution technique...

  5. Neural model of gene regulatory network: a survey on supportive meta-heuristics.

    PubMed

    Biswas, Surama; Acharyya, Sriyankar

    2016-06-01

    A gene regulatory network (GRN) is produced as a result of regulatory interactions between different genes through their coded proteins in a cellular context. Having immense importance in disease detection and drug discovery, GRNs have been modelled through various mathematical and computational schemes and reported in survey articles. Neural and neuro-fuzzy models have been the focus of attraction in bioinformatics. The predominant use of meta-heuristic algorithms in training neural models has proved their excellence. Considering these facts, this paper is organized to survey neural modelling schemes of GRNs and the efficacy of meta-heuristic algorithms towards parameter learning (i.e. weighting connections) within the model. This survey renders two different structure-related approaches to infer GRNs: the global structure approach and the substructure approach. It also describes two neural modelling schemes, namely artificial neural network/recurrent neural network based modelling and neuro-fuzzy modelling. The meta-heuristic algorithms applied so far to learn the structure and parameters of neurally modelled GRNs are reviewed here.

  6. Meta-heuristic CRPS minimization for the calibration of short-range probabilistic forecasts

    NASA Astrophysics Data System (ADS)

    Mohammadi, Seyedeh Atefeh; Rahmani, Morteza; Azadi, Majid

    2016-08-01

    This paper deals with probabilistic short-range temperature forecasts over synoptic meteorological stations across Iran using non-homogeneous Gaussian regression (NGR). NGR creates a Gaussian forecast probability density function (PDF) from the ensemble output. The mean of the normal predictive PDF is a bias-corrected weighted average of the ensemble members, and its variance is a linear function of the raw ensemble variance. The coefficients for the mean and variance are estimated by minimizing the continuous ranked probability score (CRPS) during a training period. The CRPS is a scoring rule for distributional forecasts. Gneiting et al. (Mon Weather Rev 133:1098-1118, 2005) used the Broyden-Fletcher-Goldfarb-Shanno (BFGS) method to minimize the CRPS. Since BFGS is a conventional optimization method with its own limitations, we suggest using particle swarm optimization (PSO), a robust meta-heuristic method, to minimize the CRPS. The ensemble prediction system used in this study consists of nine different configurations of the weather research and forecasting model for 48-h forecasts of temperature during autumn and winter 2011 and 2012. The probabilistic forecasts were evaluated using several common verification scores, including the Brier score, attribute diagram and rank histogram. Results show that both BFGS and PSO find the optimal solution and yield the same evaluation scores, but PSO can do so with a feasible random first guess and much less computational complexity.
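
    For a Gaussian predictive PDF the CRPS has a closed form, which is what makes NGR training tractable for any optimizer, PSO included. A sketch, with the NGR mean simplified to a single bias-corrected ensemble-mean predictor (the paper uses a weighted average over all members):

    ```python
    import math

    def crps_gaussian(mu, sigma, y):
        """Closed-form CRPS of N(mu, sigma^2) at observation y (Gneiting et al. 2005)."""
        z = (y - mu) / sigma
        pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
        cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
        return sigma * (z * (2.0 * cdf - 1.0) + 2.0 * pdf - 1.0 / math.sqrt(math.pi))

    def mean_crps(params, ens_means, ens_vars, obs):
        """Training objective minimized over the NGR coefficients (a, b, c, d):
        mu = a + b * ensemble mean, sigma^2 = c + d * ensemble variance."""
        a, b, c, d = params
        return sum(crps_gaussian(a + b * m, math.sqrt(max(c + d * v, 1e-9)), y)
                   for m, v, y in zip(ens_means, ens_vars, obs)) / len(obs)
    ```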

  7. Damage identification of a TLP floating wind turbine by meta-heuristic algorithms

    NASA Astrophysics Data System (ADS)

    Ettefagh, M. M.

    2015-12-01

    Damage identification of offshore floating wind turbines from vibration/dynamic signals is an important new research field in Structural Health Monitoring (SHM). In this paper, a new damage identification method is proposed based on meta-heuristic algorithms using the dynamic response of a TLP (Tension-Leg Platform) floating wind turbine structure. Genetic Algorithms (GA), the Artificial Immune System (AIS), Particle Swarm Optimization (PSO), and the Artificial Bee Colony (ABC) are chosen for minimizing the objective function, defined appropriately for the damage identification purpose. In addition to studying the capability of the mentioned algorithms in correctly identifying the damage, the effect of the response type on the identification results is studied. The results of the proposed damage identification are also investigated under possible uncertainties of the structure. Finally, to evaluate the proposed method under real conditions, a 1/100-scale experimental setup of a TLP Floating Wind Turbine (TLPFWT) is built in the laboratory and the proposed damage identification method is applied to the scaled turbine.

  8. Hybrid General Pattern Search and Simulated Annealing for Industrial Production Planning Problems

    NASA Astrophysics Data System (ADS)

    Vasant, P.; Barsoum, N.

    2010-06-01

    In this paper, the hybridization of the GPS (General Pattern Search) method and SA (Simulated Annealing) is incorporated into the optimization process in order to find the global optimal solution for the fitness function and decision variables with minimum computational CPU time. The real strength of the SA approach has been tested in this case study of an industrial production planning problem. This is due to the great advantage of SA in easily escaping local minima by accepting uphill moves through a probabilistic procedure in the final stages of the optimization process. Vasant [1] in his Ph.D. thesis provided 16 different heuristic and meta-heuristic techniques for solving industrial production problems with non-linear cubic objective functions, eight decision variables and 29 constraints. In this paper, fuzzy technological problems have been solved using hybrid techniques of general pattern search and simulated annealing. The simulated and computational results are compared to various other evolutionary techniques.
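
    The uphill-acceptance mechanism credited above is the Metropolis criterion. A self-contained toy sketch, with a one-dimensional multimodal function standing in for the production-planning fitness function:

    ```python
    import math, random

    def f(x):
        """Toy multimodal objective with many local minima."""
        return x * x + 10.0 * math.sin(3.0 * x)

    def simulated_annealing(x=5.0, T=100.0, alpha=0.95, T_min=1e-3):
        fx = f(x)
        while T > T_min:
            y = x + random.gauss(0.0, 1.0)      # neighbour move
            dy = f(y) - fx
            # Metropolis rule: uphill moves accepted with probability
            # exp(-delta/T), which is how SA escapes local minima.
            if dy <= 0 or random.random() < math.exp(-dy / T):
                x, fx = y, fx + dy
            T *= alpha                           # geometric cooling
        return x, fx

    print(simulated_annealing())
    ```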

  9. Image Edge Tracking via Ant Colony Optimization

    NASA Astrophysics Data System (ADS)

    Li, Ruowei; Wu, Hongkun; Liu, Shilong; Rahman, M. A.; Liu, Sanchi; Kwok, Ngai Ming

    2018-04-01

    A good edge plot should use continuous thin lines to describe the complete contour of the captured object. However, the detection of weak edges is a challenging task because of the associated low pixel intensities. Ant Colony Optimization (ACO) has been employed by many researchers to address this problem. The algorithm is a meta-heuristic method developed by mimicking the natural behaviour of ants. It uses iterative searches to find optimal solutions that cannot be found via traditional optimization approaches. In this work, ACO is employed to track and repair broken edges obtained via a conventional Sobel edge detector to produce a result with more connected edges.
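
    A generic ACO transition rule of the kind used for edge tracking can be sketched as follows; the pheromone table, the visibility measure (an arbitrary per-pixel score standing in for a local intensity-gradient measure) and the parameter values are illustrative assumptions, not the paper's exact design:

    ```python
    import random

    def next_pixel(current, candidates, pheromone, visibility, alpha=1.0, beta=2.0):
        """Move to neighbour j with probability proportional to
        pheromone^alpha * visibility^beta (roulette-wheel selection)."""
        weights = [(pheromone[(current, j)] ** alpha) * (visibility[j] ** beta)
                   for j in candidates]
        r, acc = random.random() * sum(weights), 0.0
        for j, w in zip(candidates, weights):
            acc += w
            if acc >= r:
                return j
        return candidates[-1]

    def evaporate_and_deposit(pheromone, paths, rho=0.1, q=1.0):
        """Global update: evaporate all trails, then reinforce edges of good paths."""
        for edge in pheromone:
            pheromone[edge] *= (1.0 - rho)
        for path, quality in paths:
            for edge in zip(path, path[1:]):
                pheromone[edge] = pheromone.get(edge, 0.0) + q * quality

    pher = {("p0", n): 1.0 for n in ("a", "b", "c")}
    vis = {"a": 0.2, "b": 0.9, "c": 0.4}
    print(next_pixel("p0", ["a", "b", "c"], pher, vis))
    ```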

  10. OPTIMIZING USABILITY OF AN ECONOMIC DECISION SUPPORT TOOL: PROTOTYPE OF THE EQUIPT TOOL.

    PubMed

    Cheung, Kei Long; Hiligsmann, Mickaël; Präger, Maximilian; Jones, Teresa; Józwiak-Hagymásy, Judit; Muñoz, Celia; Lester-George, Adam; Pokhrel, Subhash; López-Nicolás, Ángel; Trapero-Bertran, Marta; Evers, Silvia M A A; de Vries, Hein

    2018-01-01

    Economic decision-support tools can provide valuable information for tobacco control stakeholders, but their usability may impact the adoption of such tools. This study aims to illustrate a mixed-method usability evaluation of an economic decision-support tool for tobacco control, using the EQUIPT ROI tool prototype as a case study. A cross-sectional mixed-methods design was used, including a heuristic evaluation, a thinking-aloud approach, and a questionnaire testing and exploring the usability of the Return on Investment tool. A total of sixty-six users evaluated the tool (thinking aloud) and completed the questionnaire. For the heuristic evaluation, four experts evaluated the interface. In total, twenty-one percent of the respondents perceived good usability. A total of 118 usability problems were identified, of which twenty-six were categorized as most severe, indicating a high priority to fix them before implementation. Combining user-based and expert-based evaluation methods is recommended, as these were shown to identify unique usability problems. The evaluation provides input to optimize the usability of a decision-support tool, and may serve as a vantage point for other developers conducting usability evaluations to refine similar tools before wide-scale implementation. Such studies could reduce implementation gaps by optimizing usability, enhancing in turn the research impact of such interventions.

  11. The importance of meta-ethics in engineering education.

    PubMed

    Haws, David R

    2004-04-01

    Our shared moral framework is negotiated as part of the social contract. Some elements of that framework are established (tell the truth under oath), but other elements lack an overlapping consensus (just when can an individual lie to protect his or her privacy?). The tidy bits of our accepted moral framework have been codified, becoming the subject of legal rather than ethical consideration. Those elements remaining in the realm of ethics seem fragmented and inconsistent. Yet, our engineering students will need to navigate the broken ground of this complex moral landscape. A minimalist approach would leave our students with formulated dogma--principles of right and wrong such as the National Society of Professional Engineers (NSPE) Code of Ethics for Engineers--but without any insight into the genesis of these principles. A slightly deeper, micro-ethics approach would teach our students to solve ethical problems by applying heuristics--giving our students a rational process to manipulate ethical dilemmas using the same principles simply referenced a priori by dogma. A macro-ethics approach--helping students to inductively construct a posteriori principles from case studies--goes beyond the simple statement or manipulation of principles, but falls short of linking personal moral principles to the larger, social context. Ultimately, it is this social context that requires both the application of ethical principles, and the negotiation of moral values--from an understanding of meta-ethics. The approaches to engineering ethics instruction (dogma, heuristics, case studies, and meta-ethics) can be associated with stages of moral development. If we leave our students with only a dogmatic reaction to ethical dilemmas, they will be dependent on the ethical decisions of others (a denial of their fundamental potential for moral autonomy). Heuristics offers a tool to deal independently with moral questions, but a tool that too frequently reduces to casuistry when rigidly applied to "simplified" dilemmas. Case studies, while providing a context for engineering ethics, can encourage the premature analysis of specific moral conduct rather than the development of broad moral principles--stifling our students' facility with meta-ethics. Clearly, if a moral sense is developmental, ethics instruction should lead our students from lower to higher stages of moral development.

  12. Adaptive Swarm Balancing Algorithms for rare-event prediction in imbalanced healthcare data

    PubMed Central

    Wong, Raymond K.; Mohammed, Sabah; Fiaidhi, Jinan; Sung, Yunsick

    2017-01-01

    Clinical data analysis and forecasting have made substantial contributions to disease control, prevention and detection. However, such data usually suffer from highly imbalanced samples in class distributions. In this paper, we aim to formulate effective methods to rebalance binary imbalanced datasets, where the positive samples make up only a minority. We investigate two different meta-heuristic algorithms, particle swarm optimization and the bat algorithm, and apply them to empower the effects of the synthetic minority over-sampling technique (SMOTE) for pre-processing the datasets. One approach is to process the full dataset as a whole. The other is to split up the dataset and adaptively process it one segment at a time. The experimental results reported in this paper reveal that the performance improvements obtained by the former method do not scale to larger data sizes. The latter methods, which we call Adaptive Swarm Balancing Algorithms, lead to significant efficiency and effectiveness improvements on large datasets where the first method fails. We also find them more consistent with the characteristics of typical large imbalanced medical datasets. We further use the meta-heuristic algorithms to optimize two key parameters of SMOTE. The proposed methods lead to more credible performance of the classifier and shorter run times compared to the brute-force method. PMID:28753613
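
    The SMOTE step that the swarm algorithms tune can be sketched in a few lines; this is the textbook interpolation rule, with k and the oversampling amount as the tunable parameters mentioned above:

    ```python
    import random
    import numpy as np

    def smote(minority, n_synthetic, k=5):
        """Each synthetic sample is a random interpolation between a minority
        sample and one of its k nearest minority-class neighbours."""
        X = np.asarray(minority, dtype=float)
        out = []
        for _ in range(n_synthetic):
            i = random.randrange(len(X))
            d = np.linalg.norm(X - X[i], axis=1)
            neighbours = np.argsort(d)[1:k + 1]      # skip the point itself
            j = random.choice(list(neighbours))
            out.append(X[i] + random.random() * (X[j] - X[i]))
        return np.array(out)

    print(smote([[0, 0], [1, 1], [0, 1], [1, 0]], n_synthetic=3, k=2))
    ```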

  13. A Discrete Fruit Fly Optimization Algorithm for the Traveling Salesman Problem.

    PubMed

    Jiang, Zi-Bin; Yang, Qiong

    2016-01-01

    The fruit fly optimization algorithm (FOA) is a newly developed bio-inspired algorithm. The continuous variant of FOA has been proven to be a powerful evolutionary approach to determining the optima of a numerical function on a continuous definition domain. In this study, a discrete FOA (DFOA) is developed and applied to the traveling salesman problem (TSP), a common combinatorial problem. In the DFOA, the TSP tour is represented by an ordering of city indices, and the bio-inspired meta-heuristic search processes are executed with two elaborately designed main procedures: the smelling and tasting processes. In the smelling process, an effective crossover operator is used by the fruit fly group to search for the neighbors of the best-known swarm location. During the tasting process, an edge intersection elimination (EXE) operator is designed to improve the neighbors of the non-optimum food location in order to enhance the exploration performance of the DFOA. In addition, benchmark instances from the TSPLIB are classified in order to test the searching ability of the proposed algorithm. Furthermore, the effectiveness of the proposed DFOA is compared to that of other meta-heuristic algorithms. The results indicate that the proposed DFOA can be effectively used to solve TSPs, especially large-scale problems.
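
    The effect of the EXE operator, removing crossing tour edges, is what a classic 2-opt reversal achieves in the Euclidean plane. A generic sketch (not the paper's operator):

    ```python
    import math

    def tour_length(tour, pts):
        return sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
                   for i in range(len(tour)))

    def two_opt(tour, pts):
        """Reverse a tour segment whenever that shortens the tour; in the plane
        this uncrosses intersecting edge pairs."""
        improved = True
        while improved:
            improved = False
            for i in range(1, len(tour) - 1):
                for j in range(i + 1, len(tour)):
                    cand = tour[:i] + tour[i:j][::-1] + tour[j:]
                    if tour_length(cand, pts) < tour_length(tour, pts):
                        tour, improved = cand, True
        return tour

    pts = [(0, 0), (2, 0), (2, 2), (0, 2), (1, 1)]
    print(two_opt([0, 2, 1, 4, 3], pts))
    ```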

  14. A Discrete Fruit Fly Optimization Algorithm for the Traveling Salesman Problem

    PubMed Central

    Jiang, Zi-bin; Yang, Qiong

    2016-01-01

    The fruit fly optimization algorithm (FOA) is a newly developed bio-inspired algorithm. The continuous variant of FOA has been proven to be a powerful evolutionary approach to determining the optima of a numerical function on a continuous definition domain. In this study, a discrete FOA (DFOA) is developed and applied to the traveling salesman problem (TSP), a common combinatorial problem. In the DFOA, the TSP tour is represented by an ordering of city indices, and the bio-inspired meta-heuristic search processes are executed with two elaborately designed main procedures: the smelling and tasting processes. In the smelling process, an effective crossover operator is used by the fruit fly group to search for the neighbors of the best-known swarm location. During the tasting process, an edge intersection elimination (EXE) operator is designed to improve the neighbors of the non-optimum food location in order to enhance the exploration performance of the DFOA. In addition, benchmark instances from the TSPLIB are classified in order to test the searching ability of the proposed algorithm. Furthermore, the effectiveness of the proposed DFOA is compared to that of other meta-heuristic algorithms. The results indicate that the proposed DFOA can be effectively used to solve TSPs, especially large-scale problems. PMID:27812175

  15. antaRNA: ant colony-based RNA sequence design.

    PubMed

    Kleinkauf, Robert; Mann, Martin; Backofen, Rolf

    2015-10-01

    RNA sequence design has been studied for at least as long as the classical folding problem. Whereas the latter seeks the functional fold of a given RNA molecule, inverse folding tries to identify RNA sequences that fold into a function-specific target structure. In combination with RNA-based biotechnology and synthetic biology, reliable RNA sequence design becomes a crucial step in generating novel biochemical components. In this article, the computational tool antaRNA is presented. It is capable of compiling RNA sequences for a given structure that additionally comply with an adjustable full-range objective GC-content distribution, specific sequence constraints and additional fuzzy structure constraints. antaRNA applies ant colony optimization meta-heuristics and its superior performance is shown on biological datasets. http://www.bioinf.uni-freiburg.de/Software/antaRNA CONTACT: backofen@informatik.uni-freiburg.de Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press.

  16. Underwater Robot Task Planning Using Multi-Objective Meta-Heuristics

    PubMed Central

    Landa-Torres, Itziar; Manjarres, Diana; Bilbao, Sonia; Del Ser, Javier

    2017-01-01

    Robotic systems deployed in the underwater medium are subject to stringent operational conditions that impose a high degree of criticality on the allocation of resources and the scheduling of operations in mission planning. In this context, the so-called cost of a mission must be considered as an additional criterion when designing optimal task schedules within the mission at hand. Such a cost can be conceived as the impact of the mission on the robotic resources themselves, ranging from battery consumption to other negative effects such as mechanical erosion. This manuscript focuses on this issue by devising three heuristic solvers aimed at efficiently scheduling tasks in robotic swarms, which collaborate to accomplish a mission, and by presenting experimental results obtained over realistic scenarios in the underwater environment. The heuristic techniques resort to a Random-Keys encoding strategy to represent the allocation of robots to tasks and the relative execution order of those tasks within each robot's schedule. The obtained results reveal interesting differences in terms of Pareto optimality and spread between the algorithms considered in the benchmark, which are insightful for the selection of a proper task scheduler in real underwater campaigns.
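
    The Random-Keys encoding mentioned above maps any real vector to a feasible schedule. One common decoding, shown here as an assumption rather than the authors' exact scheme, uses the integer part of a key for the robot assignment and the fractional part for the task order:

    ```python
    def decode_random_keys(keys, n_robots):
        """Integer part of each key assigns its task to a robot; sorting the
        fractional parts orders the tasks within each robot's schedule."""
        schedules = {r: [] for r in range(n_robots)}
        for task, key in enumerate(keys):
            robot = min(int(key), n_robots - 1)
            schedules[robot].append((key - int(key), task))
        return {r: [t for _, t in sorted(items)] for r, items in schedules.items()}

    # keys in [0, n_robots): e.g. 1.2 -> robot 1, order weight 0.2
    print(decode_random_keys([0.9, 1.2, 0.1, 1.7, 0.5], n_robots=2))
    # {0: [2, 4, 0], 1: [1, 3]}
    ```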

  17. Underwater Robot Task Planning Using Multi-Objective Meta-Heuristics.

    PubMed

    Landa-Torres, Itziar; Manjarres, Diana; Bilbao, Sonia; Del Ser, Javier

    2017-04-04

    Robotic systems deployed in the underwater medium are subject to stringent operational conditions that impose a high degree of criticality on the allocation of resources and the scheduling of operations in mission planning. In this context, the so-called cost of a mission must be considered as an additional criterion when designing optimal task schedules within the mission at hand. Such a cost can be conceived as the impact of the mission on the robotic resources themselves, ranging from battery consumption to other negative effects such as mechanical erosion. This manuscript focuses on this issue by devising three heuristic solvers aimed at efficiently scheduling tasks in robotic swarms, which collaborate to accomplish a mission, and by presenting experimental results obtained over realistic scenarios in the underwater environment. The heuristic techniques resort to a Random-Keys encoding strategy to represent the allocation of robots to tasks and the relative execution order of those tasks within each robot's schedule. The obtained results reveal interesting differences in terms of Pareto optimality and spread between the algorithms considered in the benchmark, which are insightful for the selection of a proper task scheduler in real underwater campaigns.

  18. Proposal of Evolutionary Simplex Method for Global Optimization Problem

    NASA Astrophysics Data System (ADS)

    Shimizu, Yoshiaki

    To make agile decisions in a rational manner, the role of optimization engineering has attracted increasing attention under diversified customer demand. With this point of view, in this paper we propose a new evolutionary method serving as an optimization technique in the paradigm of optimization engineering. The developed method has prospects for globally solving various complicated problems appearing in real-world applications. It evolves from the conventional Nelder and Mead Simplex method by virtue of ideas borrowed from recent meta-heuristic methods such as PSO. After presenting an algorithm to handle linear inequality constraints effectively, we validate the effectiveness of the proposed method through comparison with other methods on several benchmark problems.
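
    The Nelder-Mead building block the proposal evolves from can be sketched as a single reflect/expand/contract step (the shrink step is omitted for brevity; coefficients are the textbook defaults):

    ```python
    import numpy as np

    def nelder_mead_step(simplex, f):
        """One iteration: reflect the worst vertex through the centroid of the
        rest, expanding on success and contracting on failure."""
        simplex = sorted(simplex, key=f)                    # best vertex first
        worst = simplex[-1]
        centroid = np.mean(simplex[:-1], axis=0)
        refl = centroid + (centroid - worst)                # reflection
        if f(refl) < f(simplex[0]):
            exp = centroid + 2.0 * (centroid - worst)       # expansion
            simplex[-1] = exp if f(exp) < f(refl) else refl
        elif f(refl) < f(simplex[-2]):
            simplex[-1] = refl
        else:
            simplex[-1] = centroid + 0.5 * (worst - centroid)  # contraction
        return simplex

    f = lambda p: (p[0] - 1.0) ** 2 + (p[1] + 2.0) ** 2
    simplex = [np.array(v, dtype=float) for v in [(0, 0), (1, 0), (0, 1)]]
    for _ in range(60):
        simplex = nelder_mead_step(simplex, f)
    print(simplex[0])   # approaches the minimum at (1, -2)
    ```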

  19. Integrating Predictive Modeling with Control System Design for Managed Aquifer Recharge and Recovery Applications

    NASA Astrophysics Data System (ADS)

    Drumheller, Z. W.; Regnery, J.; Lee, J. H.; Illangasekare, T. H.; Kitanidis, P. K.; Smits, K. M.

    2014-12-01

    Aquifers around the world show troubling signs of irreversible depletion and seawater intrusion as climate change, population growth, and urbanization lead to reduced natural recharge rates and overuse. Scientists and engineers have begun to re-investigate the technology of managed aquifer recharge and recovery (MAR) as a means to increase the reliability of the diminishing and increasingly variable groundwater supply. MAR systems offer the possibility of naturally increasing groundwater storage while improving the quality of impaired water used for recharge. Unfortunately, MAR systems remain fraught with operational challenges related to the quality and quantity of recharged and recovered water, stemming from a lack of data-driven, real-time control. Our project seeks to ease the operational challenges of MAR facilities through the implementation of active sensor networks, adaptively calibrated flow and transport models, and simulation-based meta-heuristic control optimization methods. The developed system works by continually collecting hydraulic and water quality data from a sensor network embedded within the aquifer. The data are fed into an inversion algorithm, which calibrates the parameters and initial conditions of a predictive flow and transport model. The calibrated model is passed to a meta-heuristic control optimization algorithm (e.g. a genetic algorithm) to execute the simulations and determine the best course of action, i.e., the optimal pumping policy for current aquifer conditions. The optimal pumping policy is applied manually or autonomously. During operation, sensor data are used to assess the accuracy of the optimal prediction and augment the pumping strategy as needed. At laboratory scale, a small (18"H x 46"L) and an intermediate (6'H x 16'L) two-dimensional synthetic aquifer were constructed and outfitted with sensor networks. Data collection and model inversion components were developed and sensor data were validated by analytical measurements.

  20. Modeling an enhanced ridesharing system with meet points and time windows

    PubMed Central

    Li, Xin; Hu, Sangen; Deng, Kai

    2018-01-01

    With the rise of e-hailing services in urban areas, ride sharing is becoming a common mode of transportation. This paper presents a mathematical model to design an enhanced ridesharing system with meet points and users' preferred time windows. The introduction of meet points allows ridesharing operators to trade off the benefit of saving en-route delays against the cost of additional walking for some passengers, who are collectively picked up or dropped off. This extension of the traditional door-to-door ridesharing problem brings more operational flexibility in urban areas (where potential requests may be densely distributed in a neighborhood), and thus can achieve better system performance in terms of reducing total travel time and increasing the number of served passengers. We design and implement a Tabu-based meta-heuristic algorithm to solve the proposed mixed integer linear program (MILP). To validate the proposed model and solution algorithm, several scenarios are designed and also solved to optimality by CPLEX. Results demonstrate that (i) detailed route plans with passenger assignments to meet points can be obtained with en-route delay savings; (ii) compared to CPLEX, the meta-heuristic algorithm has the advantage of higher computational efficiency and produces good-quality solutions within 8%-15% of the global optima; and (iii) introducing meet points to the ridesharing system saves total travel time by 2.7%-3.8% for small-scale ridesharing systems, with more benefits expected for systems with large fleets. This study provides a new tool to efficiently operate ridesharing systems, particularly when ridesharing vehicles are in short supply during peak hours. Traffic congestion mitigation can also be expected. PMID:29715302

  1. A computational approach to animal breeding.

    PubMed

    Berger-Wolf, Tanya Y; Moore, Cristopher; Saia, Jared

    2007-02-07

    We propose a computational model of mating strategies for controlled animal breeding programs. A mating strategy in a controlled breeding program is a heuristic with some optimization criteria as a goal. Thus, it is appropriate to use the computational tools available for analysis of optimization heuristics. In this paper, we propose the first discrete model of the controlled animal breeding problem and analyse heuristics for two possible objectives: (1) breeding for maximum diversity and (2) breeding a target individual. These two goals are representative of conservation biology and agricultural livestock management, respectively. We evaluate several mating strategies and provide upper and lower bounds for the expected number of matings. While the population parameters may vary and can change the actual number of matings for a particular strategy, the order of magnitude of the number of expected matings and the relative competitiveness of the mating heuristics remains the same. Thus, our simple discrete model of the animal breeding problem provides a novel viable and robust approach to designing and comparing breeding strategies in captive populations.

  2. The Probability Heuristics Model of Syllogistic Reasoning.

    ERIC Educational Resources Information Center

    Chater, Nick; Oaksford, Mike

    1999-01-01

    Proposes a probability heuristic model for syllogistic reasoning and confirms the rationality of this heuristic by an analysis of the probabilistic validity of syllogistic reasoning that treats logical inference as a limiting case of probabilistic inference. Meta-analysis and two experiments involving 40 adult participants and using generalized…

  3. Optimizing a realistic large-scale frequency assignment problem using a new parallel evolutionary approach

    NASA Astrophysics Data System (ADS)

    Chaves-González, José M.; Vega-Rodríguez, Miguel A.; Gómez-Pulido, Juan A.; Sánchez-Pérez, Juan M.

    2011-08-01

    This article analyses the use of a novel parallel evolutionary strategy to solve complex optimization problems. The work developed here has been focused on a relevant real-world problem from the telecommunication domain to verify the effectiveness of the approach. The problem, known as the frequency assignment problem (FAP), basically consists of assigning a very small number of frequencies to a very large set of transceivers used in a cellular phone network. Real-data FAP instances are very difficult to solve due to the NP-hard nature of the problem; therefore, an efficient parallel approach that makes the most of different evolutionary strategies is a good way to obtain high-quality solutions in short periods of time. Specifically, a parallel hyper-heuristic based on several meta-heuristics has been developed. After a complete experimental evaluation, results prove that the proposed approach obtains very high-quality solutions for the FAP and beats any other published result.

  4. Solving Inverse Kinematics of Robot Manipulators by Means of Meta-Heuristic Optimisation

    NASA Astrophysics Data System (ADS)

    Wichapong, Kritsada; Bureerat, Sujin; Pholdee, Nantiwat

    2018-05-01

    This paper presents the use of meta-heuristic algorithms (MHs) for solving the inverse kinematics of robot manipulators based on forward kinematics. The design variables are the joint angular displacements used to move the robot end-effector to a target in Cartesian space, while the design problem is posed as minimizing the error between the target points and the positions of the robot end-effector. The problem is dynamic, as the target points are constantly changed by the robot user. Several well-established MHs are used to solve the problem, and the results obtained from the different meta-heuristics are compared based on the end-effector error and the searching speed of the algorithms. From this study, the best performer is identified and set as the baseline for future development of MH-based inverse kinematics solvers.
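
    The forward-kinematics-based objective is easy to make concrete. A toy sketch for a planar 2-link arm, with a (1+1)-style stochastic hill climber standing in for the paper's meta-heuristics:

    ```python
    import math, random

    def fk(theta1, theta2, l1=1.0, l2=1.0):
        """Forward kinematics of a planar 2-link arm: joint angles -> end-effector."""
        x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
        y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
        return (x, y)

    def ik_error(angles, target):
        """Objective minimized by the meta-heuristic: distance to the target."""
        return math.dist(fk(*angles), target)

    def solve_ik(target, iters=5000, step=0.3):
        best = [random.uniform(-math.pi, math.pi) for _ in range(2)]
        for _ in range(iters):
            cand = [a + random.gauss(0.0, step) for a in best]
            if ik_error(cand, target) < ik_error(best, target):
                best = cand
        return best, ik_error(best, target)

    print(solve_ik((1.2, 0.8)))   # reachable: |target| <= l1 + l2
    ```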

  5. A modified generalized extremal optimization algorithm for the quay crane scheduling problem with interference constraints

    NASA Astrophysics Data System (ADS)

    Guo, Peng; Cheng, Wenming; Wang, Yi

    2014-10-01

    The quay crane scheduling problem (QCSP) determines the handling sequence of tasks at ship bays by a set of cranes assigned to a container vessel such that the vessel's service time is minimized. A number of heuristics or meta-heuristics have been proposed to obtain near-optimal solutions to overcome the NP-hardness of the problem. In this article, the idea of generalized extremal optimization (GEO) is adapted to solve the QCSP with respect to various interference constraints. The resulting GEO is termed the modified GEO. A randomized searching method for generating task-to-QC assignments neighbouring an incumbent task-to-QC assignment is developed for executing the modified GEO. In addition, a unidirectional search decoding scheme is employed to transform a task-to-QC assignment into an active quay crane schedule. The effectiveness of the developed GEO is tested on a suite of benchmark problems introduced by K.H. Kim and Y.M. Park in 2004 (European Journal of Operational Research, Vol. 156, No. 3). Compared with other well-known existing approaches, the experimental results show that the proposed modified GEO is capable of obtaining the optimal or near-optimal solution in a reasonable time, especially for large-sized problems.

  6. A Modified Mean Gray Wolf Optimization Approach for Benchmark and Biomedical Problems.

    PubMed

    Singh, Narinder; Singh, S B

    2017-01-01

    A modified variant of the gray wolf optimization algorithm, namely the mean gray wolf optimization algorithm, has been developed by modifying the position update (encircling behavior) equations of the gray wolf optimization algorithm. The proposed variant has been tested on 23 standard well-known benchmark test functions (unimodal, multimodal, and fixed-dimension multimodal), and its performance has been compared with particle swarm optimization and gray wolf optimization. The proposed algorithm has also been applied to the classification of 5 data sets to check the feasibility of the modified variant. The results obtained are compared with many other meta-heuristic approaches, i.e., gray wolf optimization, particle swarm optimization, population-based incremental learning, ant colony optimization, etc. The results show that the modified variant is able to find the best solutions in terms of a high level of classification accuracy and improved local optima avoidance.

  7. Heuristic and optimal policy computations in the human brain during sequential decision-making.

    PubMed

    Korn, Christoph W; Bach, Dominik R

    2018-01-23

    Optimal decisions across extended time horizons require value calculations over multiple probabilistic future states. Humans may circumvent such complex computations by resorting to easy-to-compute heuristics that approximate optimal solutions. To probe the potential interplay between heuristic and optimal computations, we develop a novel sequential decision-making task, framed as virtual foraging in which participants have to avoid virtual starvation. Rewards depend only on final outcomes over five-trial blocks, necessitating planning over five sequential decisions and probabilistic outcomes. Here, we report model comparisons demonstrating that participants primarily rely on the best available heuristic but also use the normatively optimal policy. FMRI signals in medial prefrontal cortex (MPFC) relate to heuristic and optimal policies and associated choice uncertainties. Crucially, reaction times and dorsal MPFC activity scale with discrepancies between heuristic and optimal policies. Thus, sequential decision-making in humans may emerge from integration between heuristic and optimal policies, implemented by controllers in MPFC.

  8. Generation of structural topologies using efficient technique based on sorted compliances

    NASA Astrophysics Data System (ADS)

    Mazur, Monika; Tajs-Zielińska, Katarzyna; Bochenek, Bogdan

    2018-01-01

    Topology optimization, although well recognized, is still being widely developed. It has recently gained more attention as large computational capability has become available to designers. This process is simultaneously stimulated by a variety of emerging, innovative optimization methods. It is observed that traditional gradient-based mathematical programming algorithms are, in many cases, replaced by novel and efficient heuristic methods inspired by biological, chemical or physical phenomena. These methods have become useful tools for structural optimization because of their versatility and easy numerical implementation. In this paper, the engineering implementation of a novel heuristic algorithm for minimum compliance topology optimization is discussed. The performance of the topology generator is based on the implementation of a special function utilizing information on the compliance distribution within the design space. With a view to coping with engineering problems, the algorithm has been combined with the structural analysis system Ansys.

  9. Maximizing the nurses' preferences in nurse scheduling problem: mathematical modeling and a meta-heuristic algorithm

    NASA Astrophysics Data System (ADS)

    Jafari, Hamed; Salmasi, Nasser

    2015-09-01

    The nurse scheduling problem (NSP) has received a great amount of attention in recent years. In the NSP, the goal is to assign shifts to nurses in order to satisfy the hospital's demand during the planning horizon while considering different objective functions. In this research, we focus on maximizing the nurses' preferences for working shifts and weekends off, considering several important factors such as hospital policies, labor laws, governmental regulations, and the status of nurses at the end of the previous planning horizon, in one of the largest hospitals in Iran, i.e., Milad Hospital. Due to the shortage of available nurses, the minimum total number of required nurses is determined first. Then, a mathematical programming model is proposed to solve the problem optimally. Since the proposed research problem is NP-hard, a meta-heuristic algorithm based on simulated annealing (SA) is applied to heuristically solve the problem in a reasonable time. An initial feasible solution generator and several novel neighborhood structures are applied to enhance the performance of the SA algorithm. Inspired by our observations in Milad Hospital, random test problems are generated to evaluate the performance of the SA algorithm. The results of computational experiments indicate that the applied SA algorithm provides solutions with an average percentage gap of 5.49% compared to the upper bounds obtained from the mathematical model. Moreover, the applied SA algorithm provides significantly better solutions in a reasonable time than the schedules provided by the head nurses.

  10. Product Mix Selection Using AN Evolutionary Technique

    NASA Astrophysics Data System (ADS)

    Tsoulos, Ioannis G.; Vasant, Pandian

    2009-08-01

    This paper proposes an evolutionary technique for the solution of a real-life industrial problem, in particular the product mix selection problem. The evolutionary technique is a combination of a genetic algorithm that preserves the feasibility of the trial solutions with penalties and a local optimization method. The goal of this paper has been achieved in finding the best near-optimal solution for the profit fitness function with respect to the vagueness factor and level of satisfaction. The profit values found will be very useful for decision makers in the industrial engineering sector for implementation purposes. It is possible to improve the solutions obtained in this study by employing other meta-heuristic methods such as simulated annealing, tabu search, ant colony optimization, particle swarm optimization and artificial immune systems.

  11. Exploring the quantum speed limit with computer games

    NASA Astrophysics Data System (ADS)

    Sørensen, Jens Jakob W. H.; Pedersen, Mads Kock; Munch, Michael; Haikka, Pinja; Jensen, Jesper Halkjær; Planke, Tilo; Andreasen, Morten Ginnerup; Gajdacz, Miroslav; Mølmer, Klaus; Lieberoth, Andreas; Sherson, Jacob F.

    2016-04-01

    Humans routinely solve problems of immense computational complexity by intuitively forming simple, low-dimensional heuristic strategies. Citizen science (or crowd sourcing) is a way of exploiting this ability by presenting scientific research problems to non-experts. ‘Gamification’—the application of game elements in a non-game context—is an effective tool with which to enable citizen scientists to provide solutions to research problems. The citizen science games Foldit, EteRNA and EyeWire have been used successfully to study protein and RNA folding and neuron mapping, but so far gamification has not been applied to problems in quantum physics. Here we report on Quantum Moves, an online platform gamifying optimization problems in quantum physics. We show that human players are able to find solutions to difficult problems associated with the task of quantum computing. Players succeed where purely numerical optimization fails, and analyses of their solutions provide insights into the problem of optimization of a more profound and general nature. Using player strategies, we have thus developed a few-parameter heuristic optimization method that efficiently outperforms the most prominent established numerical methods. The numerical complexity associated with time-optimal solutions increases for shorter process durations. To understand this better, we produced a low-dimensional rendering of the optimization landscape. This rendering reveals why traditional optimization methods fail near the quantum speed limit (that is, the shortest process duration with perfect fidelity). Combined analyses of optimization landscapes and heuristic solution strategies may benefit wider classes of optimization problems in quantum physics and beyond.

  12. Exploring the quantum speed limit with computer games.

    PubMed

    Sørensen, Jens Jakob W H; Pedersen, Mads Kock; Munch, Michael; Haikka, Pinja; Jensen, Jesper Halkjær; Planke, Tilo; Andreasen, Morten Ginnerup; Gajdacz, Miroslav; Mølmer, Klaus; Lieberoth, Andreas; Sherson, Jacob F

    2016-04-14

    Humans routinely solve problems of immense computational complexity by intuitively forming simple, low-dimensional heuristic strategies. Citizen science (or crowd sourcing) is a way of exploiting this ability by presenting scientific research problems to non-experts. 'Gamification'--the application of game elements in a non-game context--is an effective tool with which to enable citizen scientists to provide solutions to research problems. The citizen science games Foldit, EteRNA and EyeWire have been used successfully to study protein and RNA folding and neuron mapping, but so far gamification has not been applied to problems in quantum physics. Here we report on Quantum Moves, an online platform gamifying optimization problems in quantum physics. We show that human players are able to find solutions to difficult problems associated with the task of quantum computing. Players succeed where purely numerical optimization fails, and analyses of their solutions provide insights into the problem of optimization of a more profound and general nature. Using player strategies, we have thus developed a few-parameter heuristic optimization method that efficiently outperforms the most prominent established numerical methods. The numerical complexity associated with time-optimal solutions increases for shorter process durations. To understand this better, we produced a low-dimensional rendering of the optimization landscape. This rendering reveals why traditional optimization methods fail near the quantum speed limit (that is, the shortest process duration with perfect fidelity). Combined analyses of optimization landscapes and heuristic solution strategies may benefit wider classes of optimization problems in quantum physics and beyond.

  13. Structures vibration control via Tuned Mass Dampers using a co-evolution Coral Reefs Optimization algorithm

    NASA Astrophysics Data System (ADS)

    Salcedo-Sanz, S.; Camacho-Gómez, C.; Magdaleno, A.; Pereira, E.; Lorenzana, A.

    2017-04-01

    In this paper we tackle the problem of optimal design and location of Tuned Mass Dampers (TMDs) for structures subjected to earthquake ground motions, using a novel meta-heuristic algorithm. Specifically, the Coral Reefs Optimization (CRO) with Substrate Layer (CRO-SL) is proposed as a competitive co-evolution algorithm with different exploration procedures within a single population of solutions. The proposed approach is able to solve the TMD design and location problem by exploiting the combination of different types of searching mechanisms. This yields a powerful evolutionary-like algorithm for optimization problems, which is shown to be very effective in this particular problem of TMD tuning. The proposed algorithm's performance has been evaluated and compared with several reference algorithms in two building models with two and four floors, respectively.

  14. Tuning Parameters in Heuristics by Using Design of Experiments Methods

    NASA Technical Reports Server (NTRS)

    Arin, Arif; Rabadi, Ghaith; Unal, Resit

    2010-01-01

    With the growing complexity of today's large-scale problems, it has become more difficult to find optimal solutions using exact mathematical methods. The need to find near-optimal solutions in an acceptable time frame requires heuristic approaches. In many cases, however, heuristics have several parameters that need to be "tuned" before they can reach good results. The problem then turns into "finding the best parameter setting" for the heuristics to solve the problems efficiently and in a timely manner. The One-Factor-At-a-Time (OFAT) approach to parameter tuning neglects the interactions between parameters. Design of Experiments (DOE) tools can instead be employed to tune the parameters more effectively. In this paper, we seek the best parameter setting for a Genetic Algorithm (GA) to solve the single machine total weighted tardiness problem, in which n jobs must be scheduled on a single machine without preemption and the objective is to minimize the total weighted tardiness. Benchmark instances for the problem are available in the literature. To fine-tune the GA parameters in the most efficient way, we compare multiple DOE models including 2-level (2^k) full factorial design, orthogonal array design, central composite design, D-optimal design and signal-to-noise (S/N) ratios. In each DOE method, a mathematical model is created using regression analysis and solved to obtain the best parameter setting. After verification runs using the tuned parameter settings, the preliminary results show that optimal solutions for multiple instances were found efficiently.
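
    A 2^k full factorial design, the first of the compared DOE models, is simple to enumerate. A sketch with three hypothetical GA factors at two levels each (hence 2^3 = 8 runs feeding the regression model):

    ```python
    from itertools import product

    # Hypothetical GA parameters with low/high levels (coded -1/+1 in DOE terms).
    levels = {
        "population_size": (50, 200),
        "crossover_rate": (0.6, 0.9),
        "mutation_rate": (0.01, 0.1),
    }

    # Every combination of the two levels of each factor:
    runs = [dict(zip(levels, combo)) for combo in product(*levels.values())]
    for i, run in enumerate(runs, 1):
        print(i, run)   # each run is one GA configuration to benchmark
    ```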

  15. A Multi-Verse Optimizer with Levy Flights for Numerical Optimization and Its Application in Test Scheduling for Network-on-Chip.

    PubMed

    Hu, Cong; Li, Zhi; Zhou, Tian; Zhu, Aijun; Xu, Chuanpei

    2016-01-01

    We propose a new meta-heuristic algorithm named the Levy flights multi-verse optimizer (LFMVO), which incorporates Levy flights into the multi-verse optimizer (MVO) algorithm to solve numerical and engineering optimization problems. The original MVO easily falls into stagnation when wormholes stochastically re-span a number of universes (solutions) around the best universe achieved over the course of iterations. Since Levy flights are superior in exploring unknown, large-scale search spaces, they are integrated into the previous best universe to force MVO out of stagnation. We test this method on three sets of 23 well-known benchmark test functions and an NP-complete problem of test scheduling for Network-on-Chip (NoC). Experimental results prove that the proposed LFMVO is more competitive than its peers in both the quality of the resulting solutions and convergence speed.
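
    The Levy-flight ingredient can be sketched with Mantegna's algorithm, a standard way to draw heavy-tailed steps; the perturbation of the best universe shown at the end is an illustrative assumption, not the paper's exact update:

    ```python
    import math, random

    def levy_step(beta=1.5):
        """One Levy-flight step via Mantegna's algorithm: mostly small moves
        with occasional long jumps, which helps escape stagnation."""
        num = math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
        den = math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2)
        sigma_u = (num / den) ** (1 / beta)
        u = random.gauss(0.0, sigma_u)
        v = random.gauss(0.0, 1.0)
        return u / abs(v) ** (1 / beta)

    best = [0.3, -1.2, 0.8]
    candidate = [x + 0.01 * levy_step() for x in best]
    print(candidate)
    ```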

  16. A Multi-Verse Optimizer with Levy Flights for Numerical Optimization and Its Application in Test Scheduling for Network-on-Chip

    PubMed Central

    Hu, Cong; Li, Zhi; Zhou, Tian; Zhu, Aijun; Xu, Chuanpei

    2016-01-01

    We propose a new meta-heuristic algorithm named the Levy flights multi-verse optimizer (LFMVO), which incorporates Levy flights into the multi-verse optimizer (MVO) algorithm to solve numerical and engineering optimization problems. The original MVO easily falls into stagnation when wormholes stochastically re-span a number of universes (solutions) around the best universe achieved over the course of iterations. Since Levy flights are superior in exploring unknown, large-scale search spaces, they are integrated into the previous best universe to force MVO out of stagnation. We test this method on three sets of 23 well-known benchmark test functions and an NP-complete problem of test scheduling for Network-on-Chip (NoC). Experimental results prove that the proposed LFMVO is more competitive than its peers in both the quality of the resulting solutions and convergence speed. PMID:27926946

  17. A new graph-based method for pairwise global network alignment

    PubMed Central

    Klau, Gunnar W

    2009-01-01

    Background In addition to component-based comparative approaches, network alignments provide the means to study conserved network topology such as common pathways and more complex network motifs. Yet, unlike in classical sequence alignment, the comparison of networks becomes computationally more challenging, as most meaningful assumptions instantly lead to NP-hard problems. Most previous algorithmic work on network alignments is heuristic in nature. Results We introduce the graph-based maximum structural matching formulation for pairwise global network alignment. We relate the formulation to previous work and prove NP-hardness of the problem. Based on the new formulation we build upon recent results in computational structural biology and present a novel Lagrangian relaxation approach that, in combination with a branch-and-bound method, computes provably optimal network alignments. The Lagrangian algorithm alone is a powerful heuristic method, which produces solutions that are often near-optimal and – unlike those computed by pure heuristics – come with a quality guarantee. Conclusion Computational experiments on the alignment of protein-protein interaction networks and on the classification of metabolic subnetworks demonstrate that the new method is reasonably fast and has advantages over pure heuristics. Our software tool is freely available as part of the LISA library. PMID:19208162

  18. Algorithm for parametric community detection in networks.

    PubMed

    Bettinelli, Andrea; Hansen, Pierre; Liberti, Leo

    2012-07-01

    Modularity maximization is extensively used to detect communities in complex networks. It has been shown, however, that this method suffers from a resolution limit: Small communities may be undetectable in the presence of larger ones even if they are very dense. To alleviate this defect, various modifications of the modularity function have been proposed as well as multiresolution methods. In this paper we systematically study a simple model (proposed by Pons and Latapy [Theor. Comput. Sci. 412, 892 (2011)] and similar to the parametric model of Reichardt and Bornholdt [Phys. Rev. E 74, 016110 (2006)]) with a single parameter α that balances the fraction of within community edges and the expected fraction of edges according to the configuration model. An exact algorithm is proposed to find optimal solutions for all values of α as well as the corresponding successive intervals of α values for which they are optimal. This algorithm relies upon a routine for exact modularity maximization and is limited to moderate size instances. An agglomerative hierarchical heuristic is therefore proposed to address parametric modularity detection in large networks. At each iteration the smallest value of α for which it is worthwhile to merge two communities of the current partition is found. Then merging is performed and the data are updated accordingly. An implementation is proposed with the same time and space complexity as the well-known Clauset-Newman-Moore (CNM) heuristic [Phys. Rev. E 70, 066111 (2004)]. Experimental results on artificial and real world problems show that (i) communities are detected by both exact and heuristic methods for all values of the parameter α; (ii) the dendrogram summarizing the results of the heuristic method provides a useful tool for substantive analysis, as illustrated particularly on a Les Misérables data set; (iii) the difference between the parametric modularity values given by the exact method and those given by the heuristic is moderate; (iv) the heuristic version of the proposed parametric method, viewed as a modularity maximization tool, gives better results than the CNM heuristic for large instances.

  19. Improved teaching-learning-based and JAYA optimization algorithms for solving flexible flow shop scheduling problems

    NASA Astrophysics Data System (ADS)

    Buddala, Raviteja; Mahapatra, Siba Sankar

    2017-11-01

    The flexible flow shop (or hybrid flow shop) scheduling problem is an extension of the classical flow shop scheduling problem. In a simple flow shop configuration, a job having `g' operations is performed on `g' operation centres (stages), with each stage having only one machine. If any stage contains more than one machine to provide an alternate processing facility, the problem becomes a flexible flow shop problem (FFSP). The FFSP, which contains all the complexities involved in simple flow shop and parallel machine scheduling problems, is a well-known NP-hard (non-deterministic polynomial time) problem. Owing to the high computational complexity involved in solving these problems, it is not always possible to obtain an optimal solution in a reasonable computation time. To obtain near-optimal solutions in a reasonable computation time, a large variety of meta-heuristics have been proposed in the past. However, tuning algorithm-specific parameters for solving the FFSP is rather tricky and time consuming. To address this limitation, teaching-learning-based optimization (TLBO) and the JAYA algorithm are chosen for the study, because they are not only recent meta-heuristics but also require no tuning of algorithm-specific parameters. Although these algorithms seem to be elegant, they lose solution diversity after a few iterations and get trapped in local optima. To alleviate this drawback, a new local search procedure is proposed in this paper to improve the solution quality. Further, a mutation strategy (inspired by genetic algorithms) is incorporated in the basic algorithm to maintain solution diversity in the population. Computational experiments have been conducted on standard benchmark problems to calculate makespan and computational time. It is found that the rate of convergence of TLBO is superior to that of JAYA. From the results, it is found that TLBO and JAYA outperform many algorithms reported in the literature and can be treated as efficient methods for solving the FFSP.
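
    Of the two parameter-free methods studied, JAYA has the simpler move; the sketch below shows its continuous update (after Rao's JAYA rule) on a toy minimisation. For the FFSP, such real vectors still have to be decoded into job sequences (for instance by a ranked-order-value rule), which is assumed away here.

      import numpy as np

      # Hedged sketch of the parameter-free JAYA move: drift toward the best
      # member of the population and away from the worst one.
      def jaya_step(pop, fitness, lo, hi, rng):
          best = pop[np.argmin(fitness)]              # move toward the best ...
          worst = pop[np.argmax(fitness)]             # ... and away from the worst
          r1 = rng.random(pop.shape)
          r2 = rng.random(pop.shape)
          new = pop + r1 * (best - np.abs(pop)) - r2 * (worst - np.abs(pop))
          return np.clip(new, lo, hi)

      rng = np.random.default_rng(0)
      sphere = lambda x: np.sum(x ** 2, axis=1)
      pop = rng.uniform(-5, 5, (20, 4))
      for _ in range(100):
          cand = jaya_step(pop, sphere(pop), -5, 5, rng)
          better = sphere(cand) < sphere(pop)         # greedy acceptance, as in JAYA
          pop[better] = cand[better]
      print(sphere(pop).min())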

  20. Heuristic algorithms for solving of the tool routing problem for CNC cutting machines

    NASA Astrophysics Data System (ADS)

    Chentsov, P. A.; Petunin, A. A.; Sesekin, A. N.; Shipacheva, E. N.; Sholohov, A. E.

    2015-11-01

    The article is devoted to the problem of minimizing the path of the cutting tool for CNC shape-cutting machines. This problem can be interpreted as a generalized traveling salesman problem. An earlier version of a dynamic programming method was developed to solve this problem. Unfortunately, that method can handle instances with no more than thirty contours. This makes the task of constructing quasi-optimal routes relevant. In this paper we propose several quasi-optimal greedy algorithms and compare the results of the exact and approximate algorithms.
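
    A minimal version of the greedy idea reads as follows: from its current position the tool always jumps to the nearest piercing point of a contour that has not been cut yet, a nearest-neighbour rule for the underlying generalized TSP. The candidate piercing points per contour and the helper name are illustrative assumptions, not the paper's exact variants.

      import math

      # Hedged greedy sketch for tool routing: repeatedly cut the uncut
      # contour whose candidate piercing point is nearest to the tool.
      def greedy_route(start, contours):
          pos, route = start, []
          remaining = dict(enumerate(contours))       # contour id -> candidate points
          while remaining:
              cid, point = min(
                  ((c, p) for c, pts in remaining.items() for p in pts),
                  key=lambda cp: math.dist(pos, cp[1]),
              )
              route.append((cid, point))
              pos = point      # simplification: tool returns to the pierce point
              del remaining[cid]
          return route

      contours = [[(0, 2), (1, 3)], [(4, 0), (5, 1)], [(2, 5)]]
      print(greedy_route((0, 0), contours))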

  1. Associating optical measurements of MEO and GEO objects using Population-Based Meta-Heuristic methods

    NASA Astrophysics Data System (ADS)

    Zittersteijn, M.; Vananti, A.; Schildknecht, T.; Dolado Perez, J. C.; Martinot, V.

    2016-11-01

    Currently several thousands of objects are being tracked in the MEO and GEO regions through optical means. The problem faced in this framework is that of Multiple Target Tracking (MTT). The MTT problem quickly becomes an NP-hard combinatorial optimization problem. This means that the effort required to solve the MTT problem increases exponentially with the number of tracked objects. In an attempt to find an approximate solution of sufficient quality, several Population-Based Meta-Heuristic (PBMH) algorithms are implemented and tested on simulated optical measurements. These first results show that one of the tested algorithms, namely the Elitist Genetic Algorithm (EGA), consistently displays the desired behavior of finding good approximate solutions before reaching the optimum. The results further suggest that the algorithm possesses a polynomial time complexity, as the computation times are consistent with a polynomial model. With the advent of improved sensors and a heightened interest in the problem of space debris, it is expected that the number of tracked objects will grow by an order of magnitude in the near future. This research aims to provide a method that can treat the association and orbit determination problems simultaneously, and is able to efficiently process large data sets with minimal manual intervention.

  2. "Understanding" medical school curriculum content using KnowledgeMap.

    PubMed

    Denny, Joshua C; Smithers, Jeffrey D; Miller, Randolph A; Spickard, Anderson

    2003-01-01

    The authors describe the development and evaluation of computational tools to identify concepts within medical curricular documents, using information derived from the National Library of Medicine's Unified Medical Language System (UMLS). The long-term goal of the KnowledgeMap (KM) project is to provide faculty and students with an improved ability to develop, review, and integrate components of the medical school curriculum. The KM concept identifier uses lexical resources partially derived from the UMLS (SPECIALIST lexicon and Metathesaurus), heuristic language processing techniques, and an empirical scoring algorithm. KM differentiates among potentially matching Metathesaurus concepts within a source document. The authors manually identified important "gold standard" biomedical concepts within selected medical school full-content lecture documents and used these documents to compare KM concept recognition with that of a known state-of-the-art "standard", the National Library of Medicine's MetaMap program. The outcome measures were the number of "gold standard" concepts in each lecture document identified by either KM or MetaMap, and the cause of each failure or relative success in a random subset of documents. For 4,281 "gold standard" concepts, MetaMap matched 78% and KM 82%. Precision for "gold standard" concepts was 85% for MetaMap and 89% for KM. The heuristics of KM accurately matched acronyms, concepts underspecified in the document, and ambiguous matches. The most frequent cause of matching failures was absence of target concepts from the UMLS Metathesaurus. The prototypic KM system provided an encouraging rate of concept extraction for representative medical curricular texts. Future versions of KM should be evaluated for their ability to allow administrators, lecturers, and students to navigate through the medical curriculum to locate redundancies, find interrelated information, and identify omissions. In addition, the ability of KM to meet specific, personal information needs should be assessed.

  3. Meta-heuristic ant colony optimization technique to forecast the amount of summer monsoon rainfall: skill comparison with Markov chain model

    NASA Astrophysics Data System (ADS)

    Chaudhuri, Sutapa; Goswami, Sayantika; Das, Debanjana; Middey, Anirban

    2014-05-01

    Forecasting summer monsoon rainfall with precision becomes crucial for farmers to plan for harvesting in a country like India, where the national economy is mostly based on regional agriculture. The forecast of monsoon rainfall based on artificial neural networks is a well-researched problem. In the present study, the meta-heuristic ant colony optimization (ACO) technique is implemented to forecast the amount of summer monsoon rainfall for the next day over Kolkata (22.6°N, 88.4°E), India. The ACO technique belongs to swarm intelligence and simulates the decision-making processes of an ant colony, similar to other adaptive learning techniques. The ACO technique takes inspiration from the foraging behaviour of some ant species. The ants deposit pheromone on the ground in order to mark a favourable path that should be followed by other members of the colony. A range of rainfall amounts replicating the pheromone concentration is evaluated during the summer monsoon season. The maximum amount of rainfall during the summer monsoon season (June-September) is observed to be within the range of 7.5-35 mm during the period from 1998 to 2007, which is the range 4 category set by the India Meteorological Department (IMD). The result reveals that the accuracy in forecasting the amount of rainfall for the next day during the summer monsoon season using the ACO technique is 95%, whereas the forecast accuracy is 83% with a Markov chain model (MCM). The forecasts through ACO and MCM are compared with other existing models and validated with IMD observations from 2008 to 2012.
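
    The core ACO mechanics the abstract appeals to can be sketched as roulette-wheel selection over pheromone-weighted alternatives, followed by deposit and evaporation. Here the alternatives are rainfall categories and reward() is a hypothetical stand-in for verification against observations; the paper's actual encoding is richer.

      import random

      # Hedged ACO sketch: categories act as paths, pheromone marks favourable
      # ones, deposits reinforce choices that verified well, and evaporation
      # keeps the search adaptive.  reward() is a hypothetical stand-in.
      def roulette(pheromone):
          total = sum(pheromone.values())
          r, acc = random.uniform(0, total), 0.0
          for k, tau in pheromone.items():
              acc += tau
              if acc >= r:
                  return k
          return k  # guard against floating-point rounding

      def aco_round(pheromone, reward, rho=0.1, ants=50):
          for _ in range(ants):
              choice = roulette(pheromone)
              pheromone[choice] += reward(choice)     # deposit on the chosen path
          for k in pheromone:
              pheromone[k] *= (1.0 - rho)             # evaporation
          return pheromone

      ranges = {"0-7.5 mm": 1.0, "7.5-35 mm": 1.0, ">35 mm": 1.0}
      print(aco_round(ranges, lambda k: 0.5 if k == "7.5-35 mm" else 0.0))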

  4. Smooth Constrained Heuristic Optimization of a Combinatorial Chemical Space

    DTIC Science & Technology

    2015-05-01

    ARL-TR-7294, May 2015, US Army Research Laboratory: Smooth Constrained Heuristic Optimization of a Combinatorial Chemical Space, by Berend Christopher...

  5. A fast inverse treatment planning strategy facilitating optimized catheter selection in image-guided high-dose-rate interstitial gynecologic brachytherapy.

    PubMed

    Guthier, Christian V; Damato, Antonio L; Hesser, Juergen W; Viswanathan, Akila N; Cormack, Robert A

    2017-12-01

    Interstitial high-dose-rate (HDR) brachytherapy is an important therapeutic strategy for the treatment of locally advanced gynecologic (GYN) cancers. The outcome of this therapy is determined by the quality of dose distribution achieved. This paper focuses on a novel yet simple heuristic for catheter selection in GYN HDR brachytherapy and its comparison against state-of-the-art optimization strategies. The proposed technique is intended to act as a decision-supporting tool to select a favorable needle configuration. The presented heuristic for catheter optimization is based on a shrinkage-type algorithm (SACO). It is compared against state-of-the-art planning in a retrospective study of 20 patients who previously received image-guided interstitial HDR brachytherapy using a Syed Neblett template. From those plans, template orientation and position are estimated via a rigid registration of the template with the actual catheter trajectories. All potential straight trajectories intersecting the contoured clinical target volume (CTV) are considered for catheter optimization. Retrospectively generated plans and clinical plans are compared with respect to dosimetric performance and optimization time. All plans were generated with one single run of the optimizer lasting 0.6-97.4 s. Compared to manual optimization, SACO yields a statistically significant (P ≤ 0.05) improved target coverage while at the same time fulfilling all dosimetric constraints for organs at risk (OARs). Comparing inverse planning strategies, dosimetric evaluation for SACO and "hybrid inverse planning and optimization" (HIPO), as the gold standard, shows no statistically significant difference (P > 0.05). However, SACO provides the potential to reduce the number of used catheters without compromising plan quality. The proposed heuristic for needle selection provides fast catheter selection with optimization times suited for intraoperative treatment planning. Compared to manual optimization, the proposed methodology results in fewer catheters without a clinically significant loss in plan quality. The proposed approach can be used as a decision support tool that guides the user to find the ideal number and configuration of catheters. © 2017 American Association of Physicists in Medicine.

  6. Optimisation of flight dynamic control based on many-objectives meta-heuristic: a comparative study

    NASA Astrophysics Data System (ADS)

    Bureerat, Sujin; Pholdee, Nantiwat; Radpukdee, Thana

    2018-05-01

    The development of many-objective meta-heuristics (MnMHs) is currently a topic of interest, as they are suited to real optimisation applications, which usually involve many objectives. However, most MnMHs have been developed and tested on standard test functions, while applications of MnMHs to real problems remain rare. Therefore, in this work, MnMHs are applied to the optimisation design of flight dynamic control. The design problem is posed to find control gains minimising the control effort, the spiral root, the damping-in-roll root and the sideslip angle deviation, and maximising the damping ratio of the dutch-roll complex pair, the dutch-roll frequency and the bank angle at pre-specified times of 1 second and 2.8 seconds, subject to several constraints based on Military Specifications (1969) requirements. Several established many-objective meta-heuristics (MnMHs) are used to solve the problem and their performances are compared. With this research work, the performance of several MnMHs for flight control is investigated. The results obtained will be the baseline for future development of flight dynamics and control.

  7. Combining local search with co-evolution in a remarkably simple way

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boettcher, S.; Percus, A.

    2000-05-01

    The authors explore a new general-purpose heuristic for finding high-quality solutions to hard optimization problems. The method, called extremal optimization, is inspired by self-organized criticality, a concept introduced to describe emergent complexity in physical systems. In contrast to genetic algorithms, which operate on an entire gene-pool of possible solutions, extremal optimization successively replaces extremely undesirable elements of a single sub-optimal solution with new, random ones. Large fluctuations, or avalanches, ensue that efficiently explore many local optima. Drawing upon models used to simulate far-from-equilibrium dynamics, extremal optimization complements heuristics inspired by equilibrium statistical physics, such as simulated annealing. With only one adjustable parameter, its performance has proved competitive with more elaborate methods, especially near phase transitions. Phase transitions are found in many combinatorial optimization problems, and have been conjectured to occur in the region of parameter space containing the hardest instances. We demonstrate how extremal optimization can be implemented for a variety of hard optimization problems. We believe that this will be a useful tool in the investigation of phase transitions in combinatorial optimization, thereby helping to elucidate the origin of computational complexity.
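
    The single adjustable parameter referred to is the exponent tau of the power-law rank selection in tau-EO. Below is a minimal sketch for balanced graph bipartitioning, assuming the usual per-vertex fitness (the fraction of a vertex's edges kept inside its own part); it illustrates the technique rather than reproducing the authors' implementations.

      import random

      # Hedged sketch of tau-EO for balanced graph bipartitioning: rank the
      # vertices by local fitness and preferentially swap the worst, choosing
      # rank k with probability proportional to k**(-tau).
      def tau_eo_bipartition(edges, n, tau=1.4, steps=2000, seed=1):
          rng = random.Random(seed)
          side = [i % 2 for i in range(n)]            # balanced initial partition
          adj = [[] for _ in range(n)]
          for u, v in edges:
              adj[u].append(v)
              adj[v].append(u)
          def cut():
              return sum(1 for u, v in edges if side[u] != side[v])
          def pick(ranked):                           # power-law rank selection
              weights = [(k + 1) ** (-tau) for k in range(len(ranked))]
              return rng.choices(ranked, weights=weights)[0]
          best = cut()
          for _ in range(steps):
              fit = [sum(side[w] == side[v] for w in adj[v]) / max(1, len(adj[v]))
                     for v in range(n)]
              u = pick(sorted(range(n), key=lambda v: fit[v]))  # likely a bad vertex
              v = rng.choice([w for w in range(n) if side[w] != side[u]])
              side[u], side[v] = side[v], side[u]     # swap keeps the sides balanced
              best = min(best, cut())                 # EO accepts every move
          return best

      ring = [(i, (i + 1) % 10) for i in range(10)] + [(0, 5), (2, 7)]
      print(tau_eo_bipartition(ring, n=10))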

  8. Restart Operator Meta-heuristics for a Problem-Oriented Evolutionary Strategies Algorithm in Inverse Mathematical MISO Modelling Problem Solving

    NASA Astrophysics Data System (ADS)

    Ryzhikov, I. S.; Semenkin, E. S.

    2017-02-01

    This study is focused on solving an inverse mathematical modelling problem for dynamical systems based on observation data and control inputs. The mathematical model is sought in the form of a linear differential equation, which determines a system with multiple inputs and a single output, together with a vector of initial point coordinates. The described problem is complex and multimodal, and for this reason an evolutionary-based optimization technique oriented toward dynamical system identification problems was applied. To improve its performance, an algorithm restart operator was implemented.

  9. Investigations of quantum heuristics for optimization

    NASA Astrophysics Data System (ADS)

    Rieffel, Eleanor; Hadfield, Stuart; Jiang, Zhang; Mandra, Salvatore; Venturelli, Davide; Wang, Zhihui

    We explore the design of quantum heuristics for optimization, focusing on the quantum approximate optimization algorithm, a metaheuristic developed by Farhi, Goldstone, and Gutmann. We develop specific instantiations of the quantum approximate optimization algorithm for a variety of challenging combinatorial optimization problems. Through theoretical analyses and numerical investigations of select problems, we provide insight into parameter setting and Hamiltonian design for quantum approximate optimization algorithms and related quantum heuristics, and into their implementation on hardware realizable in the near term.
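
    For reference, the standard p-level QAOA ansatz of Farhi, Goldstone, and Gutmann alternates cost and mixer unitaries and then tunes the 2p angles classically; the parameter-setting question the abstract mentions is precisely the outer maximisation:

      \[
        |\psi(\vec{\gamma},\vec{\beta})\rangle
          = e^{-i\beta_p H_B}\, e^{-i\gamma_p H_C} \cdots
            e^{-i\beta_1 H_B}\, e^{-i\gamma_1 H_C}\, |+\rangle^{\otimes n},
        \qquad
        (\vec{\gamma}^{*},\vec{\beta}^{*})
          = \arg\max_{\vec{\gamma},\vec{\beta}}\,
            \langle \psi(\vec{\gamma},\vec{\beta})\,|\,H_C\,|\,\psi(\vec{\gamma},\vec{\beta})\rangle,
      \]

    where H_C encodes the combinatorial cost function and H_B is typically the transverse-field mixer \sum_i X_i.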

  10. Better Decomposition Heuristics for the Maximum-Weight Connected Graph Problem Using Betweenness Centrality

    NASA Astrophysics Data System (ADS)

    Yamamoto, Takanori; Bannai, Hideo; Nagasaki, Masao; Miyano, Satoru

    We present new decomposition heuristics for finding the optimal solution for the maximum-weight connected graph problem, which is known to be NP-hard. Previous optimal algorithms for solving the problem decompose the input graph into subgraphs using heuristics based on node degree. We propose new heuristics based on betweenness centrality measures, and show through computational experiments that our new heuristics tend to reduce the number of subgraphs in the decomposition, and therefore could lead to the reduction in computational time for finding the optimal solution. The method is further applied to analysis of biological pathway data.

  11. It looks easy! Heuristics for combinatorial optimization problems.

    PubMed

    Chronicle, Edward P; MacGregor, James N; Ormerod, Thomas C; Burr, Alistair

    2006-04-01

    Human performance on instances of computationally intractable optimization problems, such as the travelling salesperson problem (TSP), can be excellent. We have proposed a boundary-following heuristic to account for this finding. We report three experiments with TSPs where the capacity to employ this heuristic was varied. In Experiment 1, participants free to use the heuristic produced solutions significantly closer to optimal than did those prevented from doing so. Experiments 2 and 3 together replicated this finding in larger problems and demonstrated that a potential confound had no effect. In all three experiments, performance was closely matched by a boundary-following model. The results implicate global rather than purely local processes. Humans may have access to simple, perceptually based, heuristics that are suited to some combinatorial optimization tasks.

  12. A random-key encoded harmony search approach for energy-efficient production scheduling with shared resources

    NASA Astrophysics Data System (ADS)

    Garcia-Santiago, C. A.; Del Ser, J.; Upton, C.; Quilligan, F.; Gil-Lopez, S.; Salcedo-Sanz, S.

    2015-11-01

    When seeking near-optimal solutions for complex scheduling problems, meta-heuristics demonstrate good performance with affordable computational effort. This has resulted in a gravitation towards these approaches when researching industrial use-cases such as energy-efficient production planning. However, much of the previous research makes assumptions about softer constraints that affect planning strategies and about how human planners interact with the algorithm in a live production environment. This article describes a job-shop problem that focuses on minimizing energy consumption across a production facility of shared resources. The application scenario is based on real facilities made available by the Irish Center for Manufacturing Research. The formulated problem is tackled via harmony search heuristics with random keys encoding. Simulation results are compared to a genetic algorithm, a simulated annealing approach and a first-come-first-served scheduling. The superior performance obtained by the proposed scheduler paves the way towards its practical implementation over industrial production chains.
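
    The random-keys device that makes a continuous meta-heuristic such as harmony search applicable to scheduling is a one-liner: rank the real-valued keys to obtain a permutation. A minimal sketch follows; the decode rule is the standard one for random keys, and its use as this paper's exact mapping is an assumption.

      import numpy as np

      # Hedged sketch: decode a real-valued harmony into a job permutation by
      # ranking its keys, so no repair operators are needed for feasibility.
      def decode_random_keys(harmony):
          # argsort turns [0.62, 0.11, 0.90, 0.37] into the job order [1, 3, 0, 2]
          return list(np.argsort(harmony))

      harmony = np.array([0.62, 0.11, 0.90, 0.37])
      print(decode_random_keys(harmony))   # -> [1, 3, 0, 2]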

  13. Object tracking based on harmony search: comparative study

    NASA Astrophysics Data System (ADS)

    Gao, Ming-Liang; He, Xiao-Hai; Luo, Dai-Sheng; Yu, Yan-Mei

    2012-10-01

    Visual tracking can be treated as an optimization problem. A new meta-heuristic optimization algorithm, Harmony Search (HS), was first applied to visual tracking by Fourie et al. As those authors point out, many questions remain open for ongoing research. Our work is a continuation of Fourie's study, with four prominent improved variants of HS, namely Improved Harmony Search (IHS), Global-best Harmony Search (GHS), Self-adaptive Harmony Search (SHS) and Differential Harmony Search (DHS), adopted into the tracking system. Their performances are tested and analyzed on multiple challenging video sequences. Experimental results show that IHS performs best, with DHS ranking second among the four improved trackers when the iteration number is small. However, the differences between all four trackers diminish gradually as the number of iterations increases.
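
    All four variants modify pieces of the same basic HS improvisation step, sketched below: each variable comes from harmony memory with probability HMCR, is pitch-adjusted with probability PAR and bandwidth bw, or is drawn at random (IHS varies PAR and bw over iterations, GHS borrows from the best harmony, and so on).

      import random

      # Hedged sketch of the basic Harmony Search improvisation step.
      def improvise(memory, lo, hi, hmcr=0.9, par=0.3, bw=0.05):
          dims = len(memory[0])
          new = []
          for j in range(dims):
              if random.random() < hmcr:              # memory consideration
                  x = random.choice(memory)[j]
                  if random.random() < par:           # pitch adjustment
                      x += random.uniform(-bw, bw)
              else:                                   # random playing
                  x = random.uniform(lo, hi)
              new.append(min(hi, max(lo, x)))
          return new

      memory = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(5)]
      print(improvise(memory, lo=-1.0, hi=1.0))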

  14. HEURISTIC OPTIMIZATION AND ALGORITHM TUNING APPLIED TO SORPTIVE BARRIER DESIGN

    EPA Science Inventory

    While heuristic optimization is applied in environmental applications, ad-hoc algorithm configuration is typical. We use a multi-layer sorptive barrier design problem as a benchmark for an algorithm-tuning procedure, as applied to three heuristics (genetic algorithms, simulated ...

  15. A meta-heuristic method for solving scheduling problem: crow search algorithm

    NASA Astrophysics Data System (ADS)

    Adhi, Antono; Santosa, Budi; Siswanto, Nurhadi

    2018-04-01

    Scheduling is one of the most important processes in industry, both in manufacturing and in services. The scheduling process is the process of selecting resources to perform operations on tasks. Resources can be machines, people, tasks, jobs or operations. The selection of an optimum sequence of jobs from a permutation is an essential issue in every piece of research on scheduling problems. The optimum sequence becomes the optimum solution of the scheduling problem. The scheduling problem becomes NP-hard when the number of jobs in the sequence exceeds what an exact algorithm can process in reasonable time. In order to obtain optimum results, a method is needed that is capable of solving complex scheduling problems in an acceptable time. Meta-heuristics are the methods usually used to solve scheduling problems. The recently published method called the Crow Search Algorithm (CSA) is adopted in this research to solve a scheduling problem. CSA is an evolutionary meta-heuristic method based on the behavior of flocks of crows. The calculation results of CSA for solving the scheduling problem are compared with those of other algorithms. From the comparison, it is found that CSA has better performance in terms of optimum solution and calculation time than the other algorithms.
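
    The CSA position update being adopted can be sketched as follows (after Askarzadeh's 2016 formulation): crow i shadows a random crow j toward j's memorised cache unless j notices with awareness probability AP, in which case i is relocated randomly. Decoding positions into job sequences (for instance by ranking, as with random keys) is an assumption here.

      import numpy as np

      # Hedged sketch of the Crow Search Algorithm position update.
      def csa_step(pos, memory, lo, hi, fl=2.0, ap=0.1, rng=None):
          rng = rng or np.random.default_rng()
          n, d = pos.shape
          new = np.empty_like(pos)
          for i in range(n):
              j = rng.integers(n)                     # crow to follow
              if rng.random() >= ap:                  # j unaware: follow its memory
                  new[i] = pos[i] + rng.random() * fl * (memory[j] - pos[i])
              else:                                   # j noticed: random relocation
                  new[i] = rng.uniform(lo, hi, d)
          return np.clip(new, lo, hi)

      rng = np.random.default_rng(3)
      pos = rng.uniform(-5, 5, (10, 4))
      memory = pos.copy()                 # memories start at the initial positions
      print(csa_step(pos, memory, -5, 5, rng=rng).shape)   # -> (10, 4)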

  16. Heuristic Diagrams as a Tool to Teach History of Science

    ERIC Educational Resources Information Center

    Chamizo, Jose A.

    2012-01-01

    The graphic organizer called here the heuristic diagram, an improvement of Gowin's Vee heuristic, is proposed as a tool to teach the history of science. Heuristic diagrams have the purpose of helping students (or teachers, or researchers) to understand their own research, considering that asking questions and problem-solving are central to scientific activity. The…

  17. Rapid Deployment of Optimal Control for Building HVAC Systems using Innovative Software Tools and a Hybrid Heuristic/Model Based Control Approach

    DTIC Science & Technology

    2017-03-21

    March 21, 2017 (report front matter). Recoverable abstract fragments: "...included reduced system energy use and cost as well as improved performance driven by autonomous commissioning and optimized system control. In the end...improve system performance and reduce energy use and cost. However, implementing these solutions into the extremely heterogeneous and often..."

  18. Computational Tools and Algorithms for Designing Customized Synthetic Genes

    PubMed Central

    Gould, Nathan; Hendy, Oliver; Papamichail, Dimitris

    2014-01-01

    Advances in DNA synthesis have enabled the construction of artificial genes, gene circuits, and genomes of bacterial scale. Freedom in de novo design of synthetic constructs provides significant power in studying the impact of mutations in sequence features, and verifying hypotheses on the functional information that is encoded in nucleic and amino acids. To aid this goal, a large number of software tools of variable sophistication have been implemented, enabling the design of synthetic genes for sequence optimization based on rationally defined properties. The first generation of tools dealt predominantly with singular objectives such as codon usage optimization and unique restriction site incorporation. Recent years have seen the emergence of sequence design tools that aim to evolve sequences toward combinations of objectives. The design of optimal protein-coding sequences adhering to multiple objectives is computationally hard, and most tools rely on heuristics to sample the vast sequence design space. In this review, we study some of the algorithmic issues behind gene optimization and the approaches that different tools have adopted to redesign genes and optimize desired coding features. We utilize test cases to demonstrate the efficiency of each approach, as well as identify their strengths and limitations. PMID:25340050

  19. A Geographical Heuristic Routing Protocol for VANETs

    PubMed Central

    Urquiza-Aguiar, Luis; Tripp-Barba, Carolina; Aguilar Igartua, Mónica

    2016-01-01

    Vehicular ad hoc networks (VANETs) leverage the communication system of Intelligent Transportation Systems (ITS). Recently, Delay-Tolerant Network (DTN) routing protocols have increased their popularity among the research community for being used in non-safety VANET applications and services like traffic reporting. Vehicular DTN protocols use geographical and local information to make forwarding decisions. However, current proposals only consider the selection of the best candidate based on a local search. In this paper, we propose a generic Geographical Heuristic Routing (GHR) protocol that can be applied to any DTN geographical routing protocol that makes forwarding decisions hop by hop. GHR includes in its operation adaptations of the simulated annealing and Tabu-search meta-heuristics, which have largely been used to improve local-search results in discrete optimization. We include a complete performance evaluation of GHR in a multi-hop VANET simulation scenario for a reporting service. Our study analyzes all of the meaningful configurations of GHR and offers a statistical analysis of our findings by means of MANOVA tests. Our results indicate that the use of a Tabu list contributes to improving the packet delivery ratio by around 5% to 10%. Moreover, if Tabu is used, then the simulated annealing routing strategy performs better than selecting the best node with carry and forwarding (the default operation). PMID:27669254

  20. A Geographical Heuristic Routing Protocol for VANETs.

    PubMed

    Urquiza-Aguiar, Luis; Tripp-Barba, Carolina; Aguilar Igartua, Mónica

    2016-09-23

    Vehicular ad hoc networks (VANETs) leverage the communication system of Intelligent Transportation Systems (ITS). Recently, Delay-Tolerant Network (DTN) routing protocols have increased their popularity among the research community for being used in non-safety VANET applications and services like traffic reporting. Vehicular DTN protocols use geographical and local information to make forwarding decisions. However, current proposals only consider the selection of the best candidate based on a local search. In this paper, we propose a generic Geographical Heuristic Routing (GHR) protocol that can be applied to any DTN geographical routing protocol that makes forwarding decisions hop by hop. GHR includes in its operation adaptations of the simulated annealing and Tabu-search meta-heuristics, which have largely been used to improve local-search results in discrete optimization. We include a complete performance evaluation of GHR in a multi-hop VANET simulation scenario for a reporting service. Our study analyzes all of the meaningful configurations of GHR and offers a statistical analysis of our findings by means of MANOVA tests. Our results indicate that the use of a Tabu list contributes to improving the packet delivery ratio by around 5% to 10%. Moreover, if Tabu is used, then the simulated annealing routing strategy performs better than selecting the best node with carry and forwarding (the default operation).

  1. Procedures for Separations within Batches of Values, 1. The Orderly Tool Kit and Some Heuristics

    DTIC Science & Technology

    1989-03-01

    Procedures for separations within batches of values, I. The orderly tool kit and some heuristics, by Thu Hoang (Universite Rene Descartes, Laboratoire de ...) and John W. Tukey.

  2. Parameter optimization of differential evolution algorithm for automatic playlist generation problem

    NASA Astrophysics Data System (ADS)

    Alamag, Kaye Melina Natividad B.; Addawe, Joel M.

    2017-11-01

    With the digitalization of music, collections of music have grown considerably, and there is a need to create lists of music that filter a collection according to user preferences, giving rise to the Automatic Playlist Generation Problem (APGP). Previous attempts to solve this problem include the use of search and optimization algorithms. If a music database is very large, the algorithm used must be able to search the lists thoroughly, taking into account the quality of the playlist given a set of user constraints. In this paper we apply an evolutionary meta-heuristic optimization algorithm, Differential Evolution (DE), using different combinations of parameter values, and select the best-performing set when used to solve four standard test functions. The performance of the proposed algorithm is then compared with a standard Genetic Algorithm (GA) and a hybrid GA with Tabu Search. Numerical simulations are carried out to show better results from the Differential Evolution approach with the optimized parameter values.
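
    The parameters being tuned are the classic DE/rand/1/bin control knobs F (scale factor) and CR (crossover rate); the sketch below shows the operators they govern on a toy minimisation. A real-vector-to-playlist decoding would be needed for the APGP itself and is assumed away here.

      import numpy as np

      # Hedged sketch of DE/rand/1/bin: differential mutation, binomial
      # crossover, and greedy one-to-one selection.
      def de_step(pop, fitness, func, f=0.8, cr=0.9, rng=None):
          rng = rng or np.random.default_rng()
          n, d = pop.shape
          for i in range(n):
              idx = rng.choice([k for k in range(n) if k != i], 3, replace=False)
              a, b, c = pop[idx]
              mutant = a + f * (b - c)                # differential mutation
              cross = rng.random(d) < cr
              cross[rng.integers(d)] = True           # guarantee one mutant gene
              trial = np.where(cross, mutant, pop[i]) # binomial crossover
              ft = func(trial)
              if ft < fitness[i]:                     # greedy selection
                  pop[i], fitness[i] = trial, ft
          return pop, fitness

      rng = np.random.default_rng(7)
      sphere = lambda x: float(np.sum(x ** 2))
      pop = rng.uniform(-5, 5, (15, 4))
      fit = np.array([sphere(x) for x in pop])
      for _ in range(50):
          pop, fit = de_step(pop, fit, sphere, rng=rng)
      print(fit.min())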

  3. BMP analysis system for watershed-based stormwater management.

    PubMed

    Zhen, Jenny; Shoemaker, Leslie; Riverson, John; Alvi, Khalid; Cheng, Mow-Soung

    2006-01-01

    Best Management Practices (BMPs) are measures for mitigating nonpoint source (NPS) pollution caused mainly by stormwater runoff. Established urban and newly developing areas must develop cost-effective means for restoring or minimizing impacts, and planning future growth. Prince George's County in Maryland, USA, a fast-growing region in the Washington, DC metropolitan area, has developed a number of tools to support analysis and decision making for stormwater management planning and design at the watershed level. These tools support watershed analysis, innovative BMPs, and optimization. Application of these tools can help achieve environmental goals and lead to significant cost savings. This project includes software development that utilizes GIS information and technology, integrates BMP process simulation models, and applies system optimization techniques for BMP planning and selection. The system employs ESRI ArcGIS as its platform, and provides GIS-based visualization and support for developing networks including sequences of land uses, BMPs, and stream reaches. The system also provides interfaces for BMP placement, BMP attribute data input, and decision optimization management. The system includes a stand-alone BMP simulation and evaluation module, which complements both research and regulatory nonpoint source control assessment efforts, and allows flexibility in examining various BMP design alternatives. Process-based simulation of BMPs provides a technique that is sensitive to local climate and rainfall patterns. The system incorporates a meta-heuristic optimization technique to find the most cost-effective BMP placement and implementation plan given a control target, or a fixed cost. A case study is presented to demonstrate the application of the Prince George's County system. The case study involves a highly urbanized area in the Anacostia River (a tributary of the Potomac River) watershed southeast of Washington, DC. An innovative system of management practices is proposed to minimize runoff, improve water quality, and provide water reuse opportunities. Proposed management techniques include bioretention, green roof, and rooftop runoff collection (rain barrel) systems. The modeling system was used to identify the most cost-effective combinations of management practices to help minimize frequency and size of runoff events and resulting combined sewer overflows to the Anacostia River.

  4. Swarm intelligence-based approach for optimal design of CMOS differential amplifier and comparator circuit using a hybrid salp swarm algorithm

    NASA Astrophysics Data System (ADS)

    Asaithambi, Sasikumar; Rajappa, Muthaiah

    2018-05-01

    In this paper, an automatic design method based on a swarm intelligence approach for CMOS analog integrated circuit (IC) design is presented. The hybrid meta-heuristic optimization technique, namely the salp swarm algorithm (SSA), is applied to the optimal sizing of a CMOS differential amplifier and the comparator circuit. SSA is a nature-inspired optimization algorithm which mimics the navigating and hunting behavior of salps. The hybrid SSA is applied to optimize the circuit design parameters and to minimize the MOS transistor sizes. The proposed swarm intelligence approach was successfully implemented for an automatic design and optimization of CMOS analog ICs using Generic Process Design Kit (GPDK) 180 nm technology. The circuit design parameters and design specifications are validated through a Simulation Program with Integrated Circuit Emphasis (SPICE) simulator. To investigate the efficiency of the proposed approach, comparisons have been carried out with other simulation-based circuit design methods. The performances of hybrid-SSA-based CMOS analog IC designs are better than those of previously reported studies.

  5. Swarm intelligence-based approach for optimal design of CMOS differential amplifier and comparator circuit using a hybrid salp swarm algorithm.

    PubMed

    Asaithambi, Sasikumar; Rajappa, Muthaiah

    2018-05-01

    In this paper, an automatic design method based on a swarm intelligence approach for CMOS analog integrated circuit (IC) design is presented. The hybrid meta-heuristic optimization technique, namely the salp swarm algorithm (SSA), is applied to the optimal sizing of a CMOS differential amplifier and the comparator circuit. SSA is a nature-inspired optimization algorithm which mimics the navigating and hunting behavior of salps. The hybrid SSA is applied to optimize the circuit design parameters and to minimize the MOS transistor sizes. The proposed swarm intelligence approach was successfully implemented for an automatic design and optimization of CMOS analog ICs using Generic Process Design Kit (GPDK) 180 nm technology. The circuit design parameters and design specifications are validated through a Simulation Program with Integrated Circuit Emphasis (SPICE) simulator. To investigate the efficiency of the proposed approach, comparisons have been carried out with other simulation-based circuit design methods. The performances of hybrid-SSA-based CMOS analog IC designs are better than those of previously reported studies.

  6. Cooperation, Fast and Slow: Meta-Analytic Evidence for a Theory of Social Heuristics and Self-Interested Deliberation.

    PubMed

    Rand, David G

    2016-09-01

    Does cooperating require the inhibition of selfish urges? Or does "rational" self-interest constrain cooperative impulses? I investigated the role of intuition and deliberation in cooperation by meta-analyzing 67 studies in which cognitive-processing manipulations were applied to economic cooperation games (total N = 17,647; no indication of publication bias using Egger's test, Begg's test, or p-curve). My meta-analysis was guided by the social heuristics hypothesis, which proposes that intuition favors behavior that typically maximizes payoffs, whereas deliberation favors behavior that maximizes one's payoff in the current situation. Therefore, this theory predicts that deliberation will undermine pure cooperation (i.e., cooperation in settings where there are few future consequences for one's actions, such that cooperating is not in one's self-interest) but not strategic cooperation (i.e., cooperation in settings where cooperating can maximize one's payoff). As predicted, the meta-analysis revealed 17.3% more pure cooperation when intuition was promoted over deliberation, but no significant difference in strategic cooperation between more intuitive and more deliberative conditions. © The Author(s) 2016.

  7. On the asymptotic optimality and improved strategies of SPTB heuristic for open-shop scheduling problem

    NASA Astrophysics Data System (ADS)

    Bai, Danyu; Zhang, Zhihai

    2014-08-01

    This article investigates the open-shop scheduling problem with the optimality criterion of minimising the sum of quadratic completion times. For this NP-hard problem, the asymptotic optimality of the shortest processing time block (SPTB) heuristic is proven in the limit. Moreover, three different improvements, namely the job-insert scheme, tabu search and a genetic algorithm, are introduced to enhance the quality of the original solution generated by the SPTB heuristic. At the end of the article, a series of numerical experiments demonstrate the convergence of the heuristic, the performance of the improvements and the effectiveness of the quadratic objective.

  8. Solving NP-Hard Problems with Physarum-Based Ant Colony System.

    PubMed

    Liu, Yuxin; Gao, Chao; Zhang, Zili; Lu, Yuxiao; Chen, Shi; Liang, Mingxin; Tao, Li

    2017-01-01

    NP-hard problems exist in many real-world applications. Ant colony optimization (ACO) algorithms can provide approximate solutions for those NP-hard problems, but the performance of ACO algorithms is significantly reduced due to premature convergence, weak robustness, etc. With these observations in mind, this paper proposes a Physarum-based pheromone matrix optimization strategy in the ant colony system (ACS) for solving NP-hard problems such as the traveling salesman problem (TSP) and the 0/1 knapsack problem (0/1 KP). In the Physarum-inspired mathematical model, one of the unique characteristics is that critical tubes can be reserved in the process of network evolution. The optimized updating strategy employs this unique feature and accelerates the positive feedback process in ACS, which contributes to quick convergence to the optimal solution. Some experiments were conducted using both benchmark and real datasets. The experimental results show that the optimized ACS outperforms other meta-heuristic algorithms in accuracy and robustness for solving TSPs. Meanwhile, the convergence rate and robustness for solving 0/1 KPs are better than those of classical ACS.

  9. Developing a Shuffled Complex-Self Adaptive Hybrid Evolution (SC-SAHEL) Framework for Water Resources Management and Water-Energy System Optimization

    NASA Astrophysics Data System (ADS)

    Rahnamay Naeini, M.; Sadegh, M.; AghaKouchak, A.; Hsu, K. L.; Sorooshian, S.; Yang, T.

    2017-12-01

    Meta-heuristic optimization algorithms have gained a great deal of attention in a wide variety of fields. Simplicity and flexibility of these algorithms, along with their robustness, make them attractive tools for solving optimization problems. Different optimization methods, however, hold algorithm-specific strengths and limitations. The performance of each individual algorithm obeys the "No-Free-Lunch" theorem, which means that no single algorithm can consistently outperform all others across all possible optimization problems. From the users' perspective, it is a tedious process to compare, validate, and select the best-performing algorithm for a specific problem or a set of test cases. In this study, we introduce a new hybrid optimization framework, entitled Shuffled Complex-Self Adaptive Hybrid EvoLution (SC-SAHEL), which combines the strengths of different evolutionary algorithms (EAs) in a parallel computing scheme, and allows users to select the most suitable algorithm tailored to the problem at hand. The concept of SC-SAHEL is to execute different EAs as separate parallel search cores, and let all participating EAs compete during the course of the search. The newly developed SC-SAHEL algorithm is designed to automatically select the best-performing algorithm for the given optimization problem. It is effective in finding the global optimum for several strenuous benchmark test functions, and computationally efficient compared to individual EAs. We benchmark the proposed SC-SAHEL algorithm over 29 conceptual test functions, and two real-world case studies - one hydropower reservoir model and one hydrological model (SAC-SMA). Results show that the proposed framework outperforms individual EAs in an absolute majority of the test problems, and can provide competitive results to the fittest EA algorithm with more comprehensive information during the search. The proposed framework is also flexible for merging additional EAs, boundary-handling techniques, and sampling schemes, and has good potential to be used in Water-Energy system optimal operation and management.

  10. An Empirical Comparison of Seven Iterative and Evolutionary Function Optimization Heuristics

    NASA Technical Reports Server (NTRS)

    Baluja, Shumeet

    1995-01-01

    This report is a repository of the results obtained from a large-scale empirical comparison of seven iterative and evolution-based optimization heuristics. Twenty-seven static optimization problems, spanning six sets of problem classes which are commonly explored in genetic algorithm literature, are examined. The problem sets include job-shop scheduling, traveling salesman, knapsack, bin-packing, neural network weight optimization, and standard numerical optimization. The search spaces in these problems range from 2^368 to 2^2040. The results indicate that using genetic algorithms for the optimization of static functions does not yield a benefit, in terms of the final answer obtained, over simpler optimization heuristics. Descriptions of the algorithms tested and the encodings of the problems are described in detail for reproducibility.

  11. Balancing exploration, uncertainty and computational demands in many objective reservoir optimization

    NASA Astrophysics Data System (ADS)

    Zatarain Salazar, Jazmin; Reed, Patrick M.; Quinn, Julianne D.; Giuliani, Matteo; Castelletti, Andrea

    2017-11-01

    Reservoir operations are central to our ability to manage river basin systems serving conflicting multi-sectoral demands under increasingly uncertain futures. These challenges motivate the need for new solution strategies capable of effectively and efficiently discovering the multi-sectoral tradeoffs that are inherent to alternative reservoir operation policies. Evolutionary many-objective direct policy search (EMODPS) is gaining importance in this context due to its capability of addressing multiple objectives and its flexibility in incorporating multiple sources of uncertainties. This simulation-optimization framework has high potential for addressing the complexities of water resources management, and it can benefit from current advances in parallel computing and meta-heuristics. This study contributes a diagnostic assessment of state-of-the-art parallel strategies for the auto-adaptive Borg Multi Objective Evolutionary Algorithm (MOEA) to support EMODPS. Our analysis focuses on the Lower Susquehanna River Basin (LSRB) system where multiple sectoral demands from hydropower production, urban water supply, recreation and environmental flows need to be balanced. Using EMODPS with different parallel configurations of the Borg MOEA, we optimize operating policies over different size ensembles of synthetic streamflows and evaporation rates. As we increase the ensemble size, we increase the statistical fidelity of our objective function evaluations at the cost of higher computational demands. This study demonstrates how to overcome the mathematical and computational barriers associated with capturing uncertainties in stochastic multiobjective reservoir control optimization, where parallel algorithmic search serves to reduce the wall-clock time in discovering high quality representations of key operational tradeoffs. Our results show that emerging self-adaptive parallelization schemes exploiting cooperative search populations are crucial. Such strategies provide a promising new set of tools for effectively balancing exploration, uncertainty, and computational demands when using EMODPS.

  12. Optimizing and evaluating the reconstruction of Metagenome-assembled microbial genomes.

    PubMed

    Papudeshi, Bhavya; Haggerty, J Matthew; Doane, Michael; Morris, Megan M; Walsh, Kevin; Beattie, Douglas T; Pande, Dnyanada; Zaeri, Parisa; Silva, Genivaldo G Z; Thompson, Fabiano; Edwards, Robert A; Dinsdale, Elizabeth A

    2017-11-28

    Microbiome/host interactions describe characteristics that affect the host's health. Shotgun metagenomics includes sequencing a random subset of the microbiome to analyze its taxonomic and metabolic potential. Reconstruction of DNA fragments into genomes from metagenomes (called metagenome-assembled genomes) assigns unknown fragments to taxa/function and facilitates discovery of novel organisms. Genome reconstruction incorporates sequence assembly and sorting of assembled sequences into bins, characteristic of a genome. However, the microbial community composition, including taxonomic and phylogenetic diversity, may influence genome reconstruction. We determine the optimal reconstruction method for four microbiome projects that had variable sequencing platforms (IonTorrent and Illumina), diversity (high or low), and environment (coral reefs and kelp forests), using a set of parameters to select for optimal assembly and binning tools. We tested the effects of the assembly and binning processes on population genome reconstruction using 105 marine metagenomes from 4 projects. Reconstructed genomes were obtained from each project using 3 assemblers (IDBA, MetaVelvet, and SPAdes) and 2 binning tools (GroopM and MetaBat). We assessed the efficiency of assemblers using statistics including contig continuity and contig chimerism, and the effectiveness of binning tools using genome completeness and taxonomic identification. We concluded that SPAdes assembled more contigs (143,718 ± 124 contigs) of longer length (N50 = 1632 ± 108 bp), and incorporated the most sequences (sequences assembled = 19.65%). The microbial richness and evenness were maintained across the assembly, suggesting low contig chimeras. The SPAdes assembly was responsive to the biological and technological variations within each project, compared with the other assemblers. Among binning tools, we conclude that MetaBat produced bins with less variation in GC content (average standard deviation: 1.49), low species richness (4.91 ± 0.66), and higher genome completeness (40.92 ± 1.75) across all projects. MetaBat extracted 115 bins from the 4 projects, of which 66 bins were identified as reconstructed metagenome-assembled genomes with sequences belonging to a specific genus. We identified 13 novel genomes, some of which were 100% complete but showed low similarity to genomes within databases. In conclusion, we present a set of biologically relevant parameters for evaluation to select for optimal assembly and binning tools. For the tools we tested, the SPAdes assembler and the MetaBat binning tool reconstructed quality metagenome-assembled genomes for the four projects. We also conclude that metagenomes from microbial communities that have high coverage of phylogenetically distinct taxa and low taxonomic diversity result in the highest quality metagenome-assembled genomes.

  13. Augmented design and analysis of computer experiments: a novel tolerance embedded global optimization approach applied to SWIR hyperspectral illumination design.

    PubMed

    Keresztes, Janos C; John Koshel, R; D'huys, Karlien; De Ketelaere, Bart; Audenaert, Jan; Goos, Peter; Saeys, Wouter

    2016-12-26

    A novel meta-heuristic approach for minimizing nonlinear constrained problems is proposed, which offers tolerance information during the search for the global optimum. The method is based on the concept of design and analysis of computer experiments combined with a novel two-phase design augmentation (DACEDA), which models the entire merit space using a Gaussian process, with iteratively increased resolution around the optimum. The algorithm is introduced through a series of case studies with increasing complexity for optimizing uniformity of a short-wave infrared (SWIR) hyperspectral imaging (HSI) illumination system (IS). The method is first demonstrated for a two-dimensional problem consisting of the positioning of analytical isotropic point sources. The method is further applied to two-dimensional (2D) and five-dimensional (5D) SWIR HSI IS versions using close- and far-field measured source models applied within the non-sequential ray-tracing software FRED, including inherent stochastic noise. The proposed method is compared to other heuristic approaches such as simplex and simulated annealing (SA). It is shown that DACEDA converges towards a minimum with 1% improvement compared to simplex and SA and, more importantly, requires only half the number of simulations. Finally, a concurrent tolerance analysis is done within DACEDA for the five-dimensional case such that further simulations are not required.

  14. A Hybrid Ant Colony Optimization Algorithm for the Extended Capacitated Arc Routing Problem.

    PubMed

    Li-Ning Xing; Rohlfshagen, P; Ying-Wu Chen; Xin Yao

    2011-08-01

    The capacitated arc routing problem (CARP) is representative of numerous practical applications, and in order to widen its scope, we consider an extended version of this problem that entails both total service time and fixed investment costs. We subsequently propose a hybrid ant colony optimization (ACO) algorithm (HACOA) to solve instances of the extended CARP. This approach is characterized by the exploitation of heuristic information, adaptive parameters, and local optimization techniques: Two kinds of heuristic information, arc cluster information and arc priority information, are obtained continuously from the solutions sampled to guide the subsequent optimization process. The adaptive parameters ease the burden of choosing initial values and facilitate improved and more robust results. Finally, local optimization, based on the two-opt heuristic, is employed to improve the overall performance of the proposed algorithm. The resulting HACOA is tested on four sets of benchmark problems containing a total of 87 instances with up to 140 nodes and 380 arcs. In order to evaluate the effectiveness of the proposed method, some existing capacitated arc routing heuristics are extended to cope with the extended version of this problem; the experimental results indicate that the proposed ACO method outperforms these heuristics.
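
    The way ACO variants such as HACOA exploit heuristic information is through the standard pheromone/visibility transition rule, sketched here: arc j is chosen with probability proportional to tau[j]**alpha * eta[j]**beta, where eta would carry problem-specific scores such as the paper's arc-priority information. The concrete numbers below are illustrative only.

      import random

      # Hedged sketch of the standard ACO transition rule.
      def choose_next(candidates, tau, eta, alpha=1.0, beta=2.0):
          weights = [tau[j] ** alpha * eta[j] ** beta for j in candidates]
          r, acc = random.uniform(0, sum(weights)), 0.0
          for j, w in zip(candidates, weights):
              acc += w
              if acc >= r:
                  return j
          return candidates[-1]   # guard against floating-point rounding

      tau = {"a": 1.0, "b": 1.5, "c": 0.5}   # pheromone
      eta = {"a": 0.8, "b": 0.4, "c": 1.0}   # heuristic desirability
      print(choose_next(["a", "b", "c"], tau, eta))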

  15. Parallel Harmony Search Based Distributed Energy Resource Optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ceylan, Oguzhan; Liu, Guodong; Tomsovic, Kevin

    2015-01-01

    This paper presents a harmony search based parallel optimization algorithm to minimize voltage deviations in three-phase unbalanced electrical distribution systems and to maximize the active power outputs of distributed energy resources (DR). The main contribution is to reduce the adverse impacts on the voltage profile as photovoltaic (PV) output or electric vehicle (EV) charging changes throughout a day. The IEEE 123-bus distribution test system is modified by adding DRs and EVs under different load profiles. The simulation results show that, by using parallel computing techniques, heuristic methods may be used as an alternative optimization tool in electrical power distribution systems operation.

  16. A novel optimization algorithm for MIMO Hammerstein model identification under heavy-tailed noise.

    PubMed

    Jin, Qibing; Wang, Hehe; Su, Qixin; Jiang, Beiyan; Liu, Qie

    2018-01-01

    In this paper, we study the system identification of multi-input multi-output (MIMO) Hammerstein processes under typical heavy-tailed noise. To the best of our knowledge, there is no general analytical method to solve this identification problem. Motivated by this, we propose a general identification method to solve this problem based on a Gaussian-Mixture Distribution intelligent optimization algorithm (GMDA). The nonlinear part of the Hammerstein process is modeled by a Radial Basis Function (RBF) neural network, and the identification problem is converted to an optimization problem. To overcome the drawbacks of analytical identification methods in the presence of heavy-tailed noise, a meta-heuristic optimization algorithm, the Cuckoo search (CS) algorithm, is used. To improve its performance for this identification problem, the Gaussian-mixture distribution (GMD) and GMD sequences are introduced into the standard CS algorithm. Numerical simulations for different MIMO Hammerstein models are carried out, and the simulation results verify the effectiveness of the proposed GMDA. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
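
    The baseline move that GMDA modifies is the Lévy-flight step of standard cuckoo search; a minimal sketch using Mantegna's algorithm for the heavy-tailed step follows. The Gaussian-mixture replacement itself is the paper's contribution and is not reproduced here.

      import math
      import numpy as np

      # Hedged sketch of the Levy-flight step in standard Cuckoo Search,
      # with step lengths drawn via Mantegna's algorithm.
      def levy_step(dim, beta=1.5, rng=None):
          rng = rng or np.random.default_rng()
          sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
                   / (math.gamma((1 + beta) / 2) * beta
                      * 2 ** ((beta - 1) / 2))) ** (1 / beta)
          u = rng.normal(0, sigma, dim)
          v = rng.normal(0, 1, dim)
          return u / np.abs(v) ** (1 / beta)          # heavy-tailed step lengths

      def cuckoo_move(x, best, alpha=0.01, rng=None):
          # New nest biased by the distance to the best solution so far.
          return x + alpha * levy_step(len(x), rng=rng) * (x - best)

      rng = np.random.default_rng(11)
      x = rng.uniform(-5, 5, 4)
      print(cuckoo_move(x, best=np.zeros(4), rng=rng))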

  17. A hybrid artificial bee colony algorithm for numerical function optimization

    NASA Astrophysics Data System (ADS)

    Alqattan, Zakaria N.; Abdullah, Rosni

    2015-02-01

    The Artificial Bee Colony (ABC) algorithm is one of the swarm intelligence algorithms; it was introduced by Karaboga in 2005. It is a meta-heuristic optimization search algorithm inspired by the intelligent foraging behavior of honey bees in nature. Its unique search process has made it competitive with other search algorithms in the area of optimization, such as the Genetic Algorithm (GA) and Particle Swarm Optimization (PSO). However, ABC's local search process and its bee-movement (solution improvement) equation still have some weaknesses. ABC is good at avoiding entrapment in local optima, but it spends much of its time searching around unpromising, randomly selected solutions. Inspired by PSO, we propose a Hybrid Particle-movement ABC algorithm, called HPABC, which adapts the particle movement process to improve the exploration of the original ABC algorithm. Numerical benchmark functions were used to experimentally test the HPABC algorithm. The results illustrate that the HPABC algorithm can outperform the ABC algorithm in most of the experiments (75% better in accuracy and over 3 times faster).
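
    The solution improvement equation criticised above is the one-dimensional ABC move v_j = x_j + phi * (x_j - peer_j); the PSO-flavoured variant shown next to it illustrates the hybrid idea of also pulling toward the global best. The exact HPABC equation may differ from this; the constant c and the extra term are assumptions for illustration.

      import random

      # Hedged sketch: the standard ABC neighbourhood move, plus a PSO-style
      # hybrid of the kind HPABC proposes (extra attraction toward gbest).
      def abc_move(x, peer, j):
          phi = random.uniform(-1, 1)
          v = list(x)
          v[j] = x[j] + phi * (x[j] - peer[j])   # classic ABC: one dimension changes
          return v

      def hpabc_move(x, peer, gbest, j, c=1.5):
          phi = random.uniform(-1, 1)
          v = list(x)
          v[j] = (x[j] + phi * (x[j] - peer[j])
                  + c * random.random() * (gbest[j] - x[j]))
          return v

      x, peer, gbest = [0.5, -1.2, 3.0], [0.1, 0.4, -2.0], [0.0, 0.0, 0.0]
      print(abc_move(x, peer, 1))
      print(hpabc_move(x, peer, gbest, 1))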

  18. Training Scalable Restricted Boltzmann Machines Using a Quantum Annealer

    NASA Astrophysics Data System (ADS)

    Kumar, V.; Bass, G.; Dulny, J., III

    2016-12-01

    Machine learning and the optimization involved therein is of critical importance for commercial and military applications. Due to the computational complexity of many-variable optimization, the conventional approach is to employ meta-heuristic techniques to find suboptimal solutions. Quantum Annealing (QA) hardware offers a completely novel approach with the potential to obtain significantly better solutions with large speed-ups compared to traditional computing. In this presentation, we describe our development of new machine learning algorithms tailored for QA hardware. We are training restricted Boltzmann machines (RBMs) using QA hardware on large, high-dimensional commercial datasets. Traditional optimization heuristics such as contrastive divergence and other closely related techniques are slow to converge, especially on large datasets. Recent studies have indicated that QA hardware, when used as a sampler, provides better training performance compared to conventional approaches. Most of these studies have been limited to moderately sized datasets due to the hardware restrictions imposed by existing QA devices, which make it difficult to solve real-world problems at scale. In this work we develop novel strategies to circumvent this issue. We discuss scale-up techniques such as enhanced embedding and partitioned RBMs which allow large commercial datasets to be learned using QA hardware. We present our initial results obtained by training an RBM as an autoencoder on an image dataset. The results obtained so far indicate that the convergence rates can be improved significantly by increasing RBM network connectivity. These ideas can be readily applied to generalized Boltzmann machines and we are currently investigating this in an ongoing project.

  19. Engineering applications of heuristic multilevel optimization methods

    NASA Technical Reports Server (NTRS)

    Barthelemy, Jean-Francois M.

    1988-01-01

    Some engineering applications of heuristic multilevel optimization methods are presented and the discussion focuses on the dependency matrix that indicates the relationship between problem functions and variables. Coordination of the subproblem optimizations is shown to be typically achieved through the use of exact or approximate sensitivity analysis. Areas for further development are identified.

  20. Engineering applications of heuristic multilevel optimization methods

    NASA Technical Reports Server (NTRS)

    Barthelemy, Jean-Francois M.

    1989-01-01

    Some engineering applications of heuristic multilevel optimization methods are presented and the discussion focuses on the dependency matrix that indicates the relationship between problem functions and variables. Coordination of the subproblem optimizations is shown to be typically achieved through the use of exact or approximate sensitivity analysis. Areas for further development are identified.

  1. Rapid Deployment of Optimal Control for Building HVAC Systems Using Innovative Software Tools and a Hybrid Heuristic/Model Based Control Approach

    DTIC Science & Technology

    2017-03-21

    ESTCP project EW-201409 aimed at demonstrating the benefits of innovative software technology for building HVAC systems. These benefits included reduced system energy use and cost as well as improved... This document has been cleared for public release (Distribution Statement A; distribution is unlimited).

  2. Managing search complexity in linguistic geometry.

    PubMed

    Stilman, B

    1997-01-01

    This paper is a new step in the development of linguistic geometry. This formal theory is intended to discover and generalize the inner properties of human expert heuristics, which have been successful in a certain class of complex control systems, and apply them to different systems. In this paper, we investigate heuristics extracted in the form of hierarchical networks of planning paths of autonomous agents. Employing linguistic geometry tools, the dynamic hierarchy of networks is represented as a hierarchy of formal attribute languages. The main ideas of this methodology are shown in the paper on two pilot examples of the solution of complex optimization problems. The first example is a problem of strategic planning for air combat, in which the concurrent actions of four vehicles are simulated as serial interleaving moves. The second example is a problem of strategic planning for the space combat of eight autonomous vehicles (with interleaving moves) that requires generation of a search tree of depth 25 with a branching factor of 30. This is beyond the capabilities of modern and conceivable future computers employing conventional approaches. In both examples, the linguistic geometry tools showed deep and highly selective searches in comparison with conventional search algorithms. For the first example, a sketch of the proof of optimality of the solution is considered.

  3. Sequential and Mixed Genetic Algorithm and Learning Automata (SGALA, MGALA) for Feature Selection in QSAR

    PubMed Central

    MotieGhader, Habib; Gharaghani, Sajjad; Masoudi-Sobhanzadeh, Yosef; Masoudi-Nejad, Ali

    2017-01-01

    Feature selection is of great importance in Quantitative Structure-Activity Relationship (QSAR) analysis. This problem has been solved using meta-heuristic algorithms such as GA, PSO, ACO and so on. In this work, two novel hybrid meta-heuristic algorithms, Sequential GA and LA (SGALA) and Mixed GA and LA (MGALA), based on the Genetic Algorithm and Learning Automata, are proposed for QSAR feature selection. The SGALA algorithm uses the advantages of the Genetic Algorithm and Learning Automata sequentially, while the MGALA algorithm uses them simultaneously. We applied our proposed algorithms to select the minimum possible number of features from three different datasets and observed that the MGALA and SGALA algorithms had the best outcome individually and on average compared to other feature selection algorithms. Through comparison of our proposed algorithms, we deduced that the rate of convergence to the optimal result of the MGALA and SGALA algorithms was better than that of the GA, ACO, PSO and LA algorithms. Finally, the results of the GA, ACO, PSO, LA, SGALA, and MGALA algorithms were used as the input of an LS-SVR model, and the results showed that the LS-SVR model had more predictive ability with input from the SGALA and MGALA algorithms than with input from all the other mentioned algorithms. Therefore, the results corroborate that not only is the predictive efficiency of the proposed algorithms better, but their rate of convergence is also superior to that of all the other mentioned algorithms. PMID:28979308

  4. Sequential and Mixed Genetic Algorithm and Learning Automata (SGALA, MGALA) for Feature Selection in QSAR.

    PubMed

    MotieGhader, Habib; Gharaghani, Sajjad; Masoudi-Sobhanzadeh, Yosef; Masoudi-Nejad, Ali

    2017-01-01

    Feature selection is of great importance in Quantitative Structure-Activity Relationship (QSAR) analysis. This problem has been solved using meta-heuristic algorithms such as GA, PSO, ACO and so on. In this work, two novel hybrid meta-heuristic algorithms, Sequential GA and LA (SGALA) and Mixed GA and LA (MGALA), based on the Genetic Algorithm and Learning Automata, are proposed for QSAR feature selection. The SGALA algorithm uses the advantages of the Genetic Algorithm and Learning Automata sequentially, while the MGALA algorithm uses them simultaneously. We applied our proposed algorithms to select the minimum possible number of features from three different datasets and observed that the MGALA and SGALA algorithms had the best outcome individually and on average compared to other feature selection algorithms. Through comparison of our proposed algorithms, we deduced that the rate of convergence to the optimal result of the MGALA and SGALA algorithms was better than that of the GA, ACO, PSO and LA algorithms. Finally, the results of the GA, ACO, PSO, LA, SGALA, and MGALA algorithms were used as the input of an LS-SVR model, and the results showed that the LS-SVR model had more predictive ability with input from the SGALA and MGALA algorithms than with input from all the other mentioned algorithms. Therefore, the results corroborate that not only is the predictive efficiency of the proposed algorithms better, but their rate of convergence is also superior to that of all the other mentioned algorithms.

  5. Rhodobase, a meta-analytical tool for reconstructing gene regulatory networks in a model photosynthetic bacterium.

    PubMed

    Moskvin, Oleg V; Bolotin, Dmitry; Wang, Andrew; Ivanov, Pavel S; Gomelsky, Mark

    2011-02-01

    We present Rhodobase, a web-based meta-analytical tool for the analysis of transcriptional regulation in a model anoxygenic photosynthetic bacterium, Rhodobacter sphaeroides. The gene association meta-analysis is based on pooled data from 100 R. sphaeroides whole-genome DNA microarrays. Gene-centric regulatory networks were visualized using the StarNet approach (Jupiter, D.C., VanBuren, V., 2008. A visual data mining tool that facilitates reconstruction of transcription regulatory networks. PLoS ONE 3, e1717) with several modifications. We developed a means to identify and visualize operons and superoperons. We designed a framework for the cross-genome search for transcription factor binding sites that takes into account the high GC-content and oligonucleotide usage profile characteristic of the R. sphaeroides genome. To facilitate reconstruction of directional relationships between co-regulated genes, we screened the upstream sequences (-400 to +20 bp from start codons) of all genes for putative binding sites of bacterial transcription factors, using a self-optimizing search method developed here. To test the performance of the meta-analysis tools and transcription factor site predictions, we reconstructed selected nodes of the R. sphaeroides transcription factor-centric regulatory matrix. The test revealed regulatory relationships that correlate well with the experimentally derived data. The database of transcriptional profile correlations, the network visualization engine and the optimized search engine for transcription factor binding site analysis are available at http://rhodobase.org. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  6. Heuristic Diagrams as a Tool to Teach History of Science

    NASA Astrophysics Data System (ADS)

    Chamizo, José A.

    2012-05-01

    The graphic organizer called here the heuristic diagram, an improvement of Gowin's Vee heuristic, is proposed as a tool to teach the history of science. Heuristic diagrams have the purpose of helping students (or teachers, or researchers) to understand their own research, considering that questions and problem-solving are central to scientific activity. The left side, originally related in Gowin's Vee to philosophies, theories, models, laws or regularities, now agrees with Toulmin's concepts (language, models as representation techniques and application procedures). Mexican science teachers without experience in science education research used the heuristic diagram to learn about the history of chemistry, considering also on the left side two different historical times: past and present. Teachers' attitudes toward the heuristic diagram were evaluated through a semantic differential scale, and its usefulness was demonstrated.

  7. Heuristic query optimization for query multiple table and multiple clausa on mobile finance application

    NASA Astrophysics Data System (ADS)

    Indrayana, I. N. E.; P, N. M. Wirasyanti D.; Sudiartha, I. KG

    2018-01-01

    Mobile applications allow many users to access data without being limited by space and time. Over time, the data population of such an application will increase. Data access time becomes a problem once tables reach tens of thousands to millions of records. The objective of this research is to maintain query execution performance for large numbers of records. One way to maintain data access time performance is to apply query optimization; the optimization used in this research is the heuristic query optimization method. The application considered is a mobile-based financial application using a MySQL database with stored procedures, used by more than one business entity in one database, thus enabling rapid data growth. In these stored procedures, queries are optimized using the heuristic method. Query optimization is performed on "Select" queries that involve more than one table with multiple clauses. Evaluation is done by calculating the average access time of optimized and unoptimized queries, and is repeated as the data population in the database grows. The evaluation results show that execution time with heuristic query optimization is faster than execution time without query optimization.

  8. The probability heuristics model of syllogistic reasoning.

    PubMed

    Chater, N; Oaksford, M

    1999-03-01

    A probability heuristics model (PHM) for syllogistic reasoning is proposed. An informational ordering over quantified statements suggests simple probability-based heuristics for syllogistic reasoning. The most important is the "min-heuristic": choose the type of the least informative premise as the type of the conclusion. The rationality of this heuristic is confirmed by an analysis of the probabilistic validity of syllogistic reasoning, which treats logical inference as a limiting case of probabilistic inference. A meta-analysis of past experiments reveals close fits with PHM. PHM also compares favorably with alternative accounts, including mental logics, mental models, and deduction as verbal reasoning. Crucially, PHM extends naturally to generalized quantifiers, such as Most and Few, which have not been characterized logically and are, consequently, beyond the scope of current mental logic and mental model theories. Two experiments confirm the novel predictions of PHM when generalized quantifiers are used in syllogistic arguments. PHM suggests that syllogistic reasoning performance may be determined by simple but rational informational strategies justified by probability theory rather than by logic. Copyright 1999 Academic Press.
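
    The min-heuristic itself is almost a one-liner; the sketch below assumes the informativeness ordering described by PHM (most to least informative) and is only a toy rendering of the rule.

      # Informativeness ordering assumed from PHM, most to least informative.
      ORDER = ["All", "Most", "Few", "Some", "None", "Some...not"]

      def min_heuristic(premise1, premise2):
          # The conclusion takes the type of the least informative premise.
          return max(premise1, premise2, key=ORDER.index)

      print(min_heuristic("All", "Some"))         # -> Some
      print(min_heuristic("Most", "Some...not"))  # -> Some...not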

  9. Implementation of Chaotic Gaussian Particle Swarm Optimization for Optimize Learning-to-Rank Software Defect Prediction Model Construction

    NASA Astrophysics Data System (ADS)

    Buchari, M. A.; Mardiyanto, S.; Hendradjaya, B.

    2018-03-01

    Finding software defects as early as possible is the purpose of research on software defect prediction. Software defect prediction is required not only to state the existence of defects, but also to give a prioritized list of which modules require more intensive testing, so that the allocation of test resources can be managed efficiently. Learning to rank is one of the approaches that can provide defect module ranking data for the purposes of software testing. In this study, we propose a meta-heuristic chaotic Gaussian particle swarm optimization to improve the accuracy of the learning-to-rank software defect prediction approach. We have used 11 public benchmark data sets as experimental data. Our overall results demonstrate that the prediction models constructed using chaotic Gaussian particle swarm optimization get better accuracy on 5 data sets, tie on 5 data sets and get worse on 1 data set. Thus, we conclude that the application of chaotic Gaussian particle swarm optimization in the learning-to-rank approach can improve the accuracy of the defect module ranking in data sets that have high-dimensional features.
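
    The abstract does not give the update equations, so the following sketch makes two common assumptions about how "chaotic Gaussian" PSO variants work: a logistic map drives a chaotic inertia weight, and a small Gaussian perturbation is added to each position update. Names and constants are illustrative, not the paper's.

      import random

      def chaotic_gaussian_pso(f, dim, bounds, n=20, iters=200, c1=1.5, c2=1.5):
          lo, hi = bounds
          clip = lambda v: min(hi, max(lo, v))
          pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
          vel = [[0.0] * dim for _ in range(n)]
          pbest = [list(p) for p in pos]
          pfit = [f(p) for p in pos]
          g = min(range(n), key=lambda i: pfit[i])
          gbest, gfit = list(pbest[g]), pfit[g]
          z = 0.7                              # logistic-map state
          for _ in range(iters):
              z = 4.0 * z * (1.0 - z)          # logistic map, chaotic for r = 4
              w = 0.4 + 0.5 * z                # inertia weight wanders in [0.4, 0.9]
              for i in range(n):
                  for d in range(dim):
                      vel[i][d] = (w * vel[i][d]
                                   + c1 * random.random() * (pbest[i][d] - pos[i][d])
                                   + c2 * random.random() * (gbest[d] - pos[i][d]))
                      # Gaussian perturbation helps escape local optima.
                      pos[i][d] = clip(pos[i][d] + vel[i][d] + random.gauss(0, 0.01))
                  fi = f(pos[i])
                  if fi < pfit[i]:
                      pbest[i], pfit[i] = list(pos[i]), fi
                      if fi < gfit:
                          gbest, gfit = list(pos[i]), fi
          return gbest, gfit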

  10. A linguistic geometry for 3D strategic planning

    NASA Technical Reports Server (NTRS)

    Stilman, Boris

    1995-01-01

    This paper is a new step in the development and application of the Linguistic Geometry. This formal theory is intended to discover the inner properties of human expert heuristics, which have been successful in a certain class of complex control systems, and apply them to different systems. In this paper we investigate heuristics extracted in the form of hierarchical networks of planning paths of autonomous agents. Employing Linguistic Geometry tools the dynamic hierarchy of networks is represented as a hierarchy of formal attribute languages. The main ideas of this methodology are shown in this paper on the new pilot example of the solution of the extremely complex 3D optimization problem of strategic planning for the space combat of autonomous vehicles. This example demonstrates deep and highly selective search in comparison with conventional search algorithms.

  11. Cost versus life cycle assessment-based environmental impact optimization of drinking water production plants.

    PubMed

    Capitanescu, F; Rege, S; Marvuglia, A; Benetto, E; Ahmadi, A; Gutiérrez, T Navarrete; Tiruta-Barna, L

    2016-07-15

    Empowering decision makers with cost-effective solutions for reducing the environmental burden of industrial processes, at both design and operation stages, is nowadays a major worldwide concern. The paper addresses this issue for the sector of drinking water production plants (DWPPs), seeking optimal solutions that trade off operation cost against life cycle assessment (LCA)-based environmental impact while satisfying outlet water quality criteria. This leads to a challenging bi-objective constrained optimization problem, which relies on a computationally expensive, intricate process-modelling simulator of the DWPP and has to be solved within a limited computational budget. Since mathematical programming methods are unusable in this case, the paper examines the performance in tackling these challenges of six off-the-shelf state-of-the-art global meta-heuristic optimization algorithms suitable for such simulation-based optimization, namely the Strength Pareto Evolutionary Algorithm (SPEA2), Non-dominated Sorting Genetic Algorithm (NSGA-II), Indicator-Based Evolutionary Algorithm (IBEA), Multi-Objective Evolutionary Algorithm based on Decomposition (MOEA/D), Differential Evolution (DE), and Particle Swarm Optimization (PSO). The results of optimization reveal that good reductions in both the operating cost and the environmental impact of the DWPP can be obtained. Furthermore, NSGA-II outperforms the other competing algorithms, while MOEA/D and DE perform unexpectedly poorly. Copyright © 2016 Elsevier Ltd. All rights reserved.

  12. Expected Fitness Gains of Randomized Search Heuristics for the Traveling Salesperson Problem.

    PubMed

    Nallaperuma, Samadhi; Neumann, Frank; Sudholt, Dirk

    2017-01-01

    Randomized search heuristics are frequently applied to NP-hard combinatorial optimization problems. The runtime analysis of randomized search heuristics has contributed tremendously to our theoretical understanding. Recently, randomized search heuristics have been examined regarding their achievable progress within a fixed-time budget. We follow this approach and present a fixed-budget analysis for an NP-hard combinatorial optimization problem. We consider the well-known Traveling Salesperson Problem (TSP) and analyze the fitness increase that randomized search heuristics are able to achieve within a given fixed-time budget. In particular, we analyze Manhattan and Euclidean TSP instances and Randomized Local Search (RLS), (1+1) EA and (1+λ) EA algorithms for the TSP in a smoothed complexity setting, and derive the lower bounds of the expected fitness gain for a specified number of generations.
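
    A fixed-budget view of RLS on the TSP can be made concrete in a few lines: run a set number of random 2-opt moves and report the fitness gained. This generic sketch is not the paper's analysis, just the kind of algorithm it studies; all names are illustrative.

      import math, random

      def tour_length(tour, pts):
          return sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
                     for i in range(len(tour)))

      def rls_2opt(pts, budget=5000, seed=0):
          # Randomized Local Search: one random 2-opt move per step, keep if no worse.
          rnd = random.Random(seed)
          n = len(pts)
          tour = list(range(n))
          rnd.shuffle(tour)
          best = tour_length(tour, pts)
          start = best
          for _ in range(budget):
              i, j = sorted(rnd.sample(range(n), 2))
              cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]  # reverse a segment
              length = tour_length(cand, pts)
              if length <= best:
                  tour, best = cand, length
          return tour, start - best          # fitness gain within the fixed budget

      pts = [(random.random(), random.random()) for _ in range(50)]
      tour, gain = rls_2opt(pts)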

  13. Fast optimization of multipump Raman amplifiers based on a simplified wavelength and power budget heuristic

    NASA Astrophysics Data System (ADS)

    de O. Rocha, Helder R.; Castellani, Carlos E. S.; Silva, Jair A. L.; Pontes, Maria J.; Segatto, Marcelo E. V.

    2015-01-01

    We report a simple budget heuristic for a fast optimization of multipump Raman amplifiers based on the reallocation of the pump wavelengths and the optical powers. A set of different optical fibers are analyzed as the Raman gain medium, and a four-pump amplifier setup is optimized for each of them in order to achieve ripples close to 1 dB and gains up to 20 dB in the C band. Later, a comparison between our proposed heuristic and a multiobjective optimization based on a nondominated sorting genetic algorithm is made, highlighting the fact that our new approach can give similar solutions after at least an order of magnitude fewer iterations. The results shown in this paper can potentially pave the way for real-time optimization of multipump Raman amplifier systems.

  14. A new three-dimensional manufacturing service composition method under various structures using improved Flower Pollination Algorithm

    NASA Astrophysics Data System (ADS)

    Zhang, Wenyu; Yang, Yushu; Zhang, Shuai; Yu, Dejian; Chen, Yong

    2018-05-01

    With the growing complexity of customer requirements and the increasing scale of manufacturing services, how to select and combine single services to meet complex customer demand has become a growing concern. This paper presents a new manufacturing service composition method to solve the multi-objective optimization problem based on quality of service (QoS). The proposed model not only presents different methods for calculating the transportation time and transportation cost under various structures but also solves the three-dimensional composition optimization problem, covering service aggregation, service selection, and service scheduling simultaneously. Further, an improved Flower Pollination Algorithm (IFPA) is proposed to solve the three-dimensional composition optimization problem using a matrix-based representation scheme. The mutation and crossover operators of the Differential Evolution (DE) algorithm are used to extend the basic Flower Pollination Algorithm (FPA) and improve its performance. The experimental results confirm that, compared with the Genetic Algorithm, DE, and the basic FPA, the proposed method demonstrates superior performance and obtains better manufacturing service composition solutions.
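
    A sketch of the hybrid idea under stated assumptions: a basic FPA whose local-pollination step is replaced by DE/rand/1 mutation with binomial crossover. The paper's matrix-based representation for service composition is not reproduced; a continuous test objective and hypothetical names are used instead.

      import math, random

      def levy(beta=1.5):
          sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2) /
                   (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
          return random.gauss(0, sigma) / abs(random.gauss(0, 1)) ** (1 / beta)

      def ifpa_like(f, dim, bounds, n=20, p_switch=0.8, F=0.5, CR=0.9, iters=300):
          lo, hi = bounds
          clip = lambda v: min(hi, max(lo, v))
          pop = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
          fit = [f(x) for x in pop]
          b = min(range(n), key=lambda i: fit[i])
          best, bfit = list(pop[b]), fit[b]
          for _ in range(iters):
              for i in range(n):
                  if random.random() < p_switch:
                      # Global pollination: Levy flight toward the current best.
                      cand = [clip(x + levy() * (best[d] - x)) for d, x in enumerate(pop[i])]
                  else:
                      # DE/rand/1 mutation + binomial crossover as local pollination.
                      a, m, c = random.sample([k for k in range(n) if k != i], 3)
                      jr = random.randrange(dim)
                      cand = [clip(pop[a][d] + F * (pop[m][d] - pop[c][d]))
                              if (random.random() < CR or d == jr) else pop[i][d]
                              for d in range(dim)]
                  fc = f(cand)
                  if fc < fit[i]:
                      pop[i], fit[i] = cand, fc
                      if fc < bfit:
                          best, bfit = list(cand), fc
          return best, bfit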

  15. Hybrid flower pollination algorithm strategies for t-way test suite generation.

    PubMed

    Nasser, Abdullah B; Zamli, Kamal Z; Alsewari, AbdulRahman A; Ahmed, Bestoun S

    2018-01-01

    The application of meta-heuristic algorithms to t-way testing has recently become prevalent. Consequently, many useful meta-heuristic algorithms have been developed on the basis of the implementation of t-way strategies (where t indicates the interaction strength). Mixed results have been reported in the literature, highlighting the fact that no single strategy appears to be superior to other configurations. The hybridization of two or more algorithms can enhance the overall search capabilities, that is, by compensating for the limitations of one algorithm with the strengths of others. Thus, hybrid variants of the flower pollination algorithm (FPA) are proposed in the current work. Four hybrid variants of FPA are considered by combining FPA with other algorithmic components. The experimental results demonstrate that the FPA hybrids overcome the problem of slow convergence in the original FPA and offer statistically superior performance compared with existing t-way strategies in terms of test suite size.

  16. Hybrid flower pollination algorithm strategies for t-way test suite generation

    PubMed Central

    Zamli, Kamal Z.; Alsewari, AbdulRahman A.

    2018-01-01

    The application of meta-heuristic algorithms to t-way testing has recently become prevalent. Consequently, many useful meta-heuristic algorithms have been developed on the basis of the implementation of t-way strategies (where t indicates the interaction strength). Mixed results have been reported in the literature, highlighting the fact that no single strategy appears to be superior to other configurations. The hybridization of two or more algorithms can enhance the overall search capabilities, that is, by compensating for the limitations of one algorithm with the strengths of others. Thus, hybrid variants of the flower pollination algorithm (FPA) are proposed in the current work. Four hybrid variants of FPA are considered by combining FPA with other algorithmic components. The experimental results demonstrate that the FPA hybrids overcome the problem of slow convergence in the original FPA and offer statistically superior performance compared with existing t-way strategies in terms of test suite size. PMID:29718918

  17. Mathematical simulation and optimization of cutting mode in turning of workpieces made of nickel-based heat-resistant alloy

    NASA Astrophysics Data System (ADS)

    Bogoljubova, M. N.; Afonasov, A. I.; Kozlov, B. N.; Shavdurov, D. E.

    2018-05-01

    A predictive simulation technique for optimal cutting modes in the turning of workpieces made of nickel-based heat-resistant alloys, differing from well-known techniques, is proposed. The impact of various factors on the cutting process is analyzed with the purpose of determining optimal machining parameters according to certain effectiveness criteria. A mathematical optimization model, algorithms and computer programmes, and visual graphical forms reflecting the dependence of the effectiveness criteria (productivity, net cost, and tool life) on the parameters of the technological process have been worked out. A nonlinear model for multidimensional functions, "solution of the equation with multiple unknowns", a coordinate descent method and heuristic algorithms are adopted to solve the problem of optimizing the cutting mode parameters. The research shows that, in machining workpieces made from the heat-resistant alloy AISI N07263, the highest possible productivity is achieved with the following parameters: cutting speed v = 22.1 m/min, feed rate s = 0.26 mm/rev, tool life T = 18 min, and net cost 2.45 per hour.

  18. An ant colony optimization based algorithm for identifying gene regulatory elements.

    PubMed

    Liu, Wei; Chen, Hanwu; Chen, Ling

    2013-08-01

    It is one of the most important tasks in bioinformatics to identify the regulatory elements in gene sequences. Most of the existing algorithms for identifying regulatory elements are inclined to converge to a local optimum and have high time complexity. Ant Colony Optimization (ACO) is a meta-heuristic method based on swarm intelligence, derived from a model inspired by the collective foraging behavior of real ants. Taking advantage of the ACO's traits of self-organization and robustness, this paper designs and implements an ACO-based algorithm named ACRI (ant-colony-regulatory-identification) for identifying all possible binding sites of a transcription factor in the upstream regions of co-expressed genes. To accelerate the ants' search, a local optimization strategy is presented that adjusts the ants' start positions on the searched sequences. By exploiting the powerful optimization ability of the ACO, the ACRI algorithm can not only improve the precision of the results but also achieve very high speed. Experimental results on real-world datasets show that ACRI can outperform other traditional algorithms in both speed and quality of solutions. Copyright © 2013 Elsevier Ltd. All rights reserved.
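
    ACRI's exact pheromone model and local-optimization strategy are not given in the abstract; the sketch below is a generic ACO for the underlying task of picking one candidate motif start per sequence, scored by column consensus. All names, constants and the scoring choice are hypothetical.

      import random

      def consensus_score(seqs, starts, w):
          # Sum, over motif columns, of the count of the most common base.
          total = 0
          for c in range(w):
              col = [s[st + c] for s, st in zip(seqs, starts)]
              total += max(col.count(b) for b in "ACGT")
          return total

      def aco_motif(seqs, w=4, n_ants=20, iters=100, rho=0.1, seed=0):
          rnd = random.Random(seed)
          # One pheromone value per candidate start position in each sequence.
          tau = [[1.0] * (len(s) - w + 1) for s in seqs]
          best, best_score = None, -1
          for _ in range(iters):
              for _ in range(n_ants):
                  # Each ant picks one start per sequence, biased by pheromone.
                  starts = [rnd.choices(range(len(t)), weights=t)[0] for t in tau]
                  score = consensus_score(seqs, starts, w)
                  if score > best_score:
                      best, best_score = starts, score
              for t in tau:                       # evaporation
                  for i in range(len(t)):
                      t[i] *= (1 - rho)
              for t, st in zip(tau, best):        # reinforce the best-so-far choice
                  t[st] += best_score / (w * len(seqs))
          return best, best_score

      seqs = ["ACGTACGTGG", "TTACGTAGCA", "GGACGTTTAC"]
      starts, score = aco_motif(seqs, w=4)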

  19. A novel hybrid decomposition-and-ensemble model based on CEEMD and GWO for short-term PM2.5 concentration forecasting

    NASA Astrophysics Data System (ADS)

    Niu, Mingfei; Wang, Yufang; Sun, Shaolong; Li, Yongwu

    2016-06-01

    To enhance prediction reliability and accuracy, a hybrid model based on the promising principle of "decomposition and ensemble" and a recently proposed meta-heuristic called grey wolf optimizer (GWO) is introduced for daily PM2.5 concentration forecasting. Compared with existing PM2.5 forecasting methods, this proposed model has improved the prediction accuracy and hit rates of directional prediction. The proposed model involves three main steps, i.e., decomposing the original PM2.5 series into several intrinsic mode functions (IMFs) via complementary ensemble empirical mode decomposition (CEEMD) for simplifying the complex data; individually predicting each IMF with support vector regression (SVR) optimized by GWO; integrating all predicted IMFs for the ensemble result as the final prediction by another SVR optimized by GWO. Seven benchmark models, including single artificial intelligence (AI) models, other decomposition-ensemble models with different decomposition methods and models with the same decomposition-ensemble method but optimized by different algorithms, are considered to verify the superiority of the proposed hybrid model. The empirical study indicates that the proposed hybrid decomposition-ensemble model is remarkably superior to all considered benchmark models for its higher prediction accuracy and hit rates of directional prediction.
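
    The GWO stage tunes the SVR by the standard grey wolf "encircling" equations; below is a minimal generic GWO on a placeholder objective. The CEEMD and SVR stages are not reproduced, and all names are illustrative.

      import random

      def gwo(f, dim, bounds, n_wolves=12, iters=200):
          lo, hi = bounds
          clip = lambda v: min(hi, max(lo, v))
          wolves = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_wolves)]
          for it in range(iters):
              leaders = [list(w) for w in sorted(wolves, key=f)[:3]]  # alpha, beta, delta
              a = 2.0 * (1 - it / iters)          # control parameter decays from 2 to 0
              for w in wolves:
                  for d in range(dim):
                      x = 0.0
                      for leader in leaders:
                          A = a * (2 * random.random() - 1)
                          C = 2 * random.random()
                          x += leader[d] - A * abs(C * leader[d] - w[d])
                      w[d] = clip(x / 3)          # average of the leader-guided moves
          return min(wolves, key=f)

      # Placeholder objective; in the paper this would wrap the SVR training error.
      best = gwo(lambda x: sum(v * v for v in x), dim=5, bounds=(-10, 10))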

  20. An improved robust buffer allocation method for the project scheduling problem

    NASA Astrophysics Data System (ADS)

    Ghoddousi, Parviz; Ansari, Ramin; Makui, Ahmad

    2017-04-01

    Unpredictable uncertainties cause delays and additional costs for projects. Often, when using traditional approaches, the optimizing procedure of the baseline project plan fails and leads to delays. In this study, a two-stage multi-objective buffer allocation approach is applied for robust project scheduling. In the first stage, some decisions are made on buffer sizes and allocation to the project activities. A set of Pareto-optimal robust schedules is designed using the meta-heuristic non-dominated sorting genetic algorithm (NSGA-II) based on the decisions made in the buffer allocation step. In the second stage, the Pareto solutions are evaluated in terms of the deviation from the initial start time and due dates. The proposed approach was implemented on a real dam construction project. The outcomes indicated that the obtained buffered schedule reduces the cost of disruptions by 17.7% compared with the baseline plan, with an increase of about 0.3% in the project completion time.

  1. Short-term scheduling of an open-pit mine with multiple objectives

    NASA Astrophysics Data System (ADS)

    Blom, Michelle; Pearce, Adrian R.; Stuckey, Peter J.

    2017-05-01

    This article presents a novel algorithm for the generation of multiple short-term production schedules for an open-pit mine, in which several objectives, of varying priority, characterize the quality of each solution. A short-term schedule selects regions of a mine site, known as 'blocks', to be extracted in each week of a planning horizon (typically spanning 13 weeks). Existing tools for constructing these schedules use greedy heuristics, with little optimization. To construct a single schedule in which infrastructure is sufficiently utilized, with production grades consistently close to a desired target, a planner must often run these heuristics many times, adjusting parameters after each iteration. A planner's intuition and experience can evaluate the relative quality and mineability of different schedules in a way that is difficult to automate. Of interest to a short-term planner is the generation of multiple schedules, extracting available ore and waste in varying sequences, which can then be manually compared. This article presents a tool in which multiple, diverse, short-term schedules are constructed, meeting a range of common objectives without the need for iterative parameter adjustment.

  2. A derived heuristics based multi-objective optimization procedure for micro-grid scheduling

    NASA Astrophysics Data System (ADS)

    Li, Xin; Deb, Kalyanmoy; Fang, Yanjun

    2017-06-01

    With the availability of different types of power generators to be used in an electric micro-grid system, their operation scheduling as the load demand changes with time becomes an important task. Besides satisfying load balance constraints and the generator's rated power, several other practicalities, such as limited availability of grid power and restricted ramping of power output from generators, must all be considered during the operation scheduling process, which makes it difficult to decide whether the optimization results are accurate and satisfactory. In solving such complex practical problems, heuristics-based customized optimization algorithms are suggested. However, due to nonlinear and complex interactions of variables, it is difficult to come up with heuristics in such problems off-hand. In this article, a two-step strategy is proposed in which the first task deciphers important heuristics about the problem and the second task utilizes the derived heuristics to solve the original problem in a computationally fast manner. Specifically, the specific operation scheduling is considered from a two-objective (cost and emission) point of view. The first task develops basic and advanced level knowledge bases offline from a series of prior demand-wise optimization runs and then the second task utilizes them to modify optimized solutions in an application scenario. Results on island and grid connected modes and several pragmatic formulations of the micro-grid operation scheduling problem clearly indicate the merit of the proposed two-step procedure.

  3. Exact and heuristic algorithms for Space Information Flow.

    PubMed

    Uwitonze, Alfred; Huang, Jiaqing; Ye, Yuanqing; Cheng, Wenqing; Li, Zongpeng

    2018-01-01

    Space Information Flow (SIF) is a new promising research area that studies network coding in geometric space, such as Euclidean space. The design of algorithms that compute the optimal SIF solutions remains one of the key open problems in SIF. This work proposes the first exact SIF algorithm and a heuristic SIF algorithm that compute min-cost multicast network coding for N (N ≥ 3) given terminal nodes in 2-D Euclidean space. Furthermore, we find that the Butterfly network in Euclidean space is the second example besides the Pentagram network where SIF is strictly better than Euclidean Steiner minimal tree. The exact algorithm design is based on two key techniques: Delaunay triangulation and linear programming. Delaunay triangulation technique helps to find practically good candidate relay nodes, after which a min-cost multicast linear programming model is solved over the terminal nodes and the candidate relay nodes, to compute the optimal multicast network topology, including the optimal relay nodes selected by linear programming from all the candidate relay nodes and the flow rates on the connection links. The heuristic algorithm design is also based on Delaunay triangulation and linear programming techniques. The exact algorithm can achieve the optimal SIF solution with an exponential computational complexity, while the heuristic algorithm can achieve the sub-optimal SIF solution with a polynomial computational complexity. We prove the correctness of the exact SIF algorithm. The simulation results show the effectiveness of the heuristic SIF algorithm.

  4. An OER Framework, Heuristic and Lens: Tools for Understanding Lecturers' Adoption of OER

    ERIC Educational Resources Information Center

    Cox, Glenda; Trotter, Henry

    2017-01-01

    This paper examines three new tools--a framework, a heuristic and a lens--for analysing lecturers' adoption of OER in higher educational settings. Emerging from research conducted at the universities of Cape Town (UCT), Fort Hare (UFH) and South Africa (UNISA) on why lecturers adopt--or do not adopt--OER, these tools enable greater analytical…

  5. Integrated Strategy Improves the Prediction Accuracy of miRNA in Large Dataset

    PubMed Central

    Lipps, David; Devineni, Sree

    2016-01-01

    MiRNAs are short non-coding RNAs of about 22 nucleotides, which play critical roles in gene expression regulation. The biogenesis of miRNAs is largely determined by the sequence and structural features of their parental RNA molecules. Based on these features, multiple computational tools have been developed to predict whether RNA transcripts contain miRNAs. Although very successful, these predictors have started to face multiple challenges in recent years. Many predictors were optimized using datasets of hundreds of miRNA samples. The sizes of these datasets are much smaller than the number of known miRNAs; consequently, the prediction accuracy of these predictors on large datasets is unknown and needs to be re-tested. In addition, many predictors were optimized for either high sensitivity or high specificity, and these optimization strategies may bring serious limitations in applications. Moreover, to meet continuously rising expectations of these computational tools, improving the prediction accuracy is extremely important. In this study, a meta-predictor, mirMeta, was developed by integrating a set of non-linear transformations with a meta-strategy. More specifically, the outputs of five individual predictors were first preprocessed using non-linear transformations and then fed into an artificial neural network to make the meta-prediction. The prediction accuracy of the meta-predictor was validated using both multi-fold cross-validation and an independent dataset. The final accuracy of the meta-predictor on the newly designed large dataset is improved by 7%, to 93%. The meta-predictor is also shown to be less dependent on the datasets and to have a more refined balance between sensitivity and specificity. This study is important in two ways: first, it shows that the combination of non-linear transformations and artificial neural networks improves the prediction accuracy of individual predictors; second, a new miRNA predictor with significantly improved prediction accuracy is developed for the community for identifying novel miRNAs and the complete set of miRNAs. Source code is available at: https://github.com/xueLab/mirMeta PMID:28002428

  6. A dynamic multiarmed bandit-gene expression programming hyper-heuristic for combinatorial optimization problems.

    PubMed

    Sabar, Nasser R; Ayob, Masri; Kendall, Graham; Qu, Rong

    2015-02-01

    Hyper-heuristics are search methodologies that aim to provide high-quality solutions across a wide variety of problem domains, rather than developing tailor-made methodologies for each problem instance/domain. A traditional hyper-heuristic framework has two levels, namely, the high-level strategy (heuristic selection mechanism and acceptance criterion) and the low-level heuristics (a set of problem-specific heuristics). Due to the different landscape structures of different problem instances, the high-level strategy plays an important role in the design of a hyper-heuristic framework. In this paper, we propose a new high-level strategy for a hyper-heuristic framework. The proposed high-level strategy utilizes a dynamic multiarmed bandit-extreme value-based reward as an online heuristic selection mechanism to select the appropriate heuristic to apply at each iteration. In addition, we propose a gene expression programming framework to automatically generate the acceptance criterion for each problem instance, instead of using human-designed criteria. Two well-known, and very different, combinatorial optimization problems, one static (exam timetabling) and one dynamic (dynamic vehicle routing), are used to demonstrate the generality of the proposed framework. Compared with state-of-the-art hyper-heuristics and other bespoke methods, empirical results demonstrate that the proposed framework generalizes well across both domains. We obtain competitive, if not better, results when compared to the best known results obtained by other methods presented in the scientific literature. We also compare our approach against the recently released hyper-heuristic competition test suite, again demonstrating the generality of our approach relative to other methods that have utilized the same six benchmark datasets from this test suite.
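
    A minimal sketch of the selection layer only, with plain UCB1 substituted for the paper's dynamic multi-armed bandit with extreme-value rewards, and a simple accept-if-no-worse rule instead of the evolved gene-expression acceptance criterion; all names are hypothetical.

      import math, random

      def bandit_hyper_heuristic(initial, low_level_heuristics, evaluate, iters=1000):
          # Pick a low-level heuristic each iteration via UCB1 on observed rewards.
          k = len(low_level_heuristics)
          counts, rewards = [0] * k, [0.0] * k
          current, cur_fit = initial, evaluate(initial)
          best, best_fit = current, cur_fit
          for t in range(1, iters + 1):
              def ucb(i):
                  # Exploit high average reward; explore rarely tried heuristics.
                  if counts[i] == 0:
                      return float("inf")
                  return rewards[i] / counts[i] + math.sqrt(2 * math.log(t) / counts[i])
              i = max(range(k), key=ucb)
              cand = low_level_heuristics[i](current)
              cand_fit = evaluate(cand)
              counts[i] += 1
              rewards[i] += max(0.0, cur_fit - cand_fit)   # reward = improvement
              if cand_fit <= cur_fit:                      # simple acceptance criterion
                  current, cur_fit = cand, cand_fit
                  if cand_fit < best_fit:
                      best, best_fit = cand, cand_fit
          return best, best_fit

      # Toy usage: minimize a quadratic with two perturbation heuristics.
      h = [lambda x: [v + random.gauss(0, 0.1) for v in x],
           lambda x: [v * 0.99 for v in x]]
      best, fit = bandit_hyper_heuristic([5.0, -3.0], h, lambda x: sum(v * v for v in x))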

  7. PSOLA: A Heuristic Land-Use Allocation Model Using Patch-Level Operations and Knowledge-Informed Rules.

    PubMed

    Liu, Yaolin; Peng, Jinjin; Jiao, Limin; Liu, Yanfang

    2016-01-01

    Optimizing land-use allocation is important to regional sustainable development, as it promotes the social equality of public services, increases the economic benefits of land-use activities, and reduces the ecological risk of land-use planning. Most land-use optimization models allocate land-use using cell-level operations that fragment land-use patches. These models do not cooperate well with land-use planning knowledge, leading to irrational land-use patterns. This study focuses on building a heuristic land-use allocation model (PSOLA) using particle swarm optimization. The model allocates land-use with patch-level operations to avoid fragmentation. The patch-level operations include a patch-edge operator, a patch-size operator, and a patch-compactness operator that constrain the size and shape of land-use patches. The model is also integrated with knowledge-informed rules to provide auxiliary knowledge of land-use planning during optimization. The knowledge-informed rules consist of suitability, accessibility, land use policy, and stakeholders' preference. To validate the PSOLA model, a case study was performed in Gaoqiao Town in Zhejiang Province, China. The results demonstrate that the PSOLA model outperforms a basic PSO (Particle Swarm Optimization) in terms of the social, economic, ecological, and overall benefits by 3.60%, 7.10%, 1.53% and 4.06%, respectively, which confirms the effectiveness of our improvements. Furthermore, the model has an open architecture, enabling its extension as a generic tool to support decision making in land-use planning.

  8. PSOLA: A Heuristic Land-Use Allocation Model Using Patch-Level Operations and Knowledge-Informed Rules

    PubMed Central

    Liu, Yaolin; Peng, Jinjin; Jiao, Limin; Liu, Yanfang

    2016-01-01

    Optimizing land-use allocation is important to regional sustainable development, as it promotes the social equality of public services, increases the economic benefits of land-use activities, and reduces the ecological risk of land-use planning. Most land-use optimization models allocate land-use using cell-level operations that fragment land-use patches. These models do not cooperate well with land-use planning knowledge, leading to irrational land-use patterns. This study focuses on building a heuristic land-use allocation model (PSOLA) using particle swarm optimization. The model allocates land-use with patch-level operations to avoid fragmentation. The patch-level operations include a patch-edge operator, a patch-size operator, and a patch-compactness operator that constrain the size and shape of land-use patches. The model is also integrated with knowledge-informed rules to provide auxiliary knowledge of land-use planning during optimization. The knowledge-informed rules consist of suitability, accessibility, land use policy, and stakeholders’ preference. To validate the PSOLA model, a case study was performed in Gaoqiao Town in Zhejiang Province, China. The results demonstrate that the PSOLA model outperforms a basic PSO (Particle Swarm Optimization) in terms of the social, economic, ecological, and overall benefits by 3.60%, 7.10%, 1.53% and 4.06%, respectively, which confirms the effectiveness of our improvements. Furthermore, the model has an open architecture, enabling its extension as a generic tool to support decision making in land-use planning. PMID:27322619

  9. Optimal design of a smart post-buckled beam actuator using bat algorithm: simulations and experiments

    NASA Astrophysics Data System (ADS)

    Mallick, Rajnish; Ganguli, Ranjan; Kumar, Ravi

    2017-05-01

    The optimized design of a smart post-buckled beam actuator (PBA) is performed in this study. A smart-material-based piezoceramic stack actuator is used as a prime mover to drive the buckled beam actuator. Piezoceramic actuators are high-force, small-displacement devices; they possess high energy density and high bandwidth. In this study, bench-top experiments are conducted to investigate the angular tip deflections due to the PBA. A new design of a linear-to-linear motion amplification device (LX-4) is developed to circumvent the small-displacement handicap of piezoceramic stack actuators. The LX-4 enhances the piezoceramic actuator's mechanical leverage by a factor of four. The PBA model is based on dynamic elastic stability and is analyzed using the Mathieu-Hill equation. A formal optimization is carried out using a newly developed meta-heuristic nature-inspired algorithm, named the bat algorithm (BA). The BA utilizes the echolocation capability of bats. An optimized PBA in conjunction with the LX-4 generates end rotations of the order of 15° at the output end. The optimized PBA design incurs less weight and induces large end rotations, which will be useful in the development of various mechanical and aerospace devices, such as helicopter trailing-edge flaps, micro and nano aerial vehicles and other robotic systems.
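
    For reference, the core BA loop is short; this generic sketch on a placeholder objective follows the usual frequency, loudness and pulse-rate rules, and is not the PBA-specific formulation. Names and constants are illustrative.

      import math, random

      def bat_algorithm(f, dim, bounds, n=20, iters=300, fmin=0.0, fmax=2.0,
                        alpha=0.9, gamma=0.9):
          lo, hi = bounds
          clip = lambda v: min(hi, max(lo, v))
          x = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
          vel = [[0.0] * dim for _ in range(n)]
          A = [1.0] * n                       # loudness per bat
          r = [0.5] * n                       # pulse emission rate per bat
          fit = [f(p) for p in x]
          b = min(range(n), key=lambda i: fit[i])
          best, bfit = list(x[b]), fit[b]
          for t in range(iters):
              for i in range(n):
                  freq = fmin + (fmax - fmin) * random.random()
                  cand = []
                  for d in range(dim):
                      vel[i][d] += (x[i][d] - best[d]) * freq
                      cand.append(clip(x[i][d] + vel[i][d]))
                  if random.random() > r[i]:
                      # Local random walk around the current best solution.
                      cand = [clip(best[d] + 0.01 * random.gauss(0, 1) * sum(A) / n)
                              for d in range(dim)]
                  fc = f(cand)
                  if fc <= fit[i] and random.random() < A[i]:
                      x[i], fit[i] = cand, fc
                      A[i] *= alpha                            # loudness decreases
                      r[i] = 0.5 * (1 - math.exp(-gamma * t))  # emission rate rises
                      if fc < bfit:
                          best, bfit = list(cand), fc
          return best, bfit

      best, err = bat_algorithm(lambda x: sum(v * v for v in x), dim=4, bounds=(-5, 5))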

  10. A combined geostatistical-optimization model for the optimal design of a groundwater quality monitoring network

    NASA Astrophysics Data System (ADS)

    Kolosionis, Konstantinos; Papadopoulou, Maria P.

    2017-04-01

    Monitoring networks provide essential information for water resources management, especially in areas with significant groundwater exploitation due to extensive agricultural activities. In this work, a simulation-optimization framework is developed, based on heuristic optimization methodologies and geostatistical modeling approaches, to obtain an optimal design for a groundwater quality monitoring network. Groundwater quantity and quality data obtained from 43 existing observation locations at 3 different hydrological periods in the Mires basin in Crete, Greece are used in the proposed framework, in terms of Regression Kriging, to develop the spatial distribution of nitrate concentration in the aquifer of interest. Based on the existing groundwater quality mapping, the proposed optimization tool determines a cost-effective observation well network that contributes significant information to water managers and authorities. The elimination of observation wells that add little or no beneficial information to the groundwater level and quality mapping of the area can be obtained using estimation uncertainty and statistical error metrics without affecting the assessment of groundwater quality. Given the high maintenance cost of groundwater monitoring networks, the proposed tool could be used by water regulators in the decision-making process to obtain an efficient network design.

  11. Optimizing Controlling-Value-Based Power Gating with Gate Count and Switching Activity

    NASA Astrophysics Data System (ADS)

    Chen, Lei; Kimura, Shinji

    In this paper, a new heuristic algorithm is proposed to optimize power domain clustering in controlling-value-based (CV-based) power gating technology. In this algorithm, both the switching activity of the sleep signals (p) and the overall number of sleep gates (gate count, N) are considered, and the sum of the products of p and N is optimized. The algorithm effectively realizes the total power reduction obtainable from CV-based power gating. Even when the maximum depth is kept the same, the proposed algorithm can still achieve approximately 10% more power reduction than prior algorithms. Furthermore, detailed comparisons between the proposed heuristic algorithm and other possible heuristic algorithms are also presented. HSPICE simulation results show that over 26% total power reduction can be obtained by using the new heuristic algorithm. In addition, the effect of dynamic power reduction through the CV-based power gating method and the delay overhead caused by the switching of sleep transistors are also shown in this paper.

  12. Heuristic algorithms for the minmax regret flow-shop problem with interval processing times.

    PubMed

    Ćwik, Michał; Józefczyk, Jerzy

    2018-01-01

    An uncertain version of the permutation flow-shop problem with unlimited buffers and the makespan as a criterion is considered. The investigated parametric uncertainty is represented by given interval-valued processing times. The maximum regret is used for the evaluation of uncertainty; consequently, a minmax regret discrete optimization problem is solved. Due to its high complexity, two relaxations are applied to simplify the optimization procedure. First of all, a greedy procedure is used for calculating the criterion's value, as such a calculation is an NP-hard problem itself. Moreover, the lower bound is used instead of solving the internal deterministic flow-shop problem. A constructive heuristic algorithm is applied to the relaxed optimization problem. The algorithm is compared with previously elaborated heuristic algorithms based on the evolutionary and middle-interval approaches. The conducted computational experiments showed the advantage of the constructive heuristic algorithm with regard to both the criterion and the computation time. The Wilcoxon paired-rank statistical test confirmed this conclusion.

  13. Age Effects and Heuristics in Decision Making*

    PubMed Central

    Besedeš, Tibor; Deck, Cary; Sarangi, Sudipta; Shor, Mikhael

    2011-01-01

    Using controlled experiments, we examine how individuals make choices when faced with multiple options. Choice tasks are designed to mimic the selection of health insurance, prescription drug, or retirement savings plans. In our experiment, available options can be objectively ranked allowing us to examine optimal decision making. First, the probability of a person selecting the optimal option declines as the number of options increases, with the decline being more pronounced for older subjects. Second, heuristics differ by age with older subjects relying more on suboptimal decision rules. In a heuristics validation experiment, older subjects make worse decisions than younger subjects. PMID:22544977

  14. Age Effects and Heuristics in Decision Making.

    PubMed

    Besedeš, Tibor; Deck, Cary; Sarangi, Sudipta; Shor, Mikhael

    2012-05-01

    Using controlled experiments, we examine how individuals make choices when faced with multiple options. Choice tasks are designed to mimic the selection of health insurance, prescription drug, or retirement savings plans. In our experiment, available options can be objectively ranked allowing us to examine optimal decision making. First, the probability of a person selecting the optimal option declines as the number of options increases, with the decline being more pronounced for older subjects. Second, heuristics differ by age with older subjects relying more on suboptimal decision rules. In a heuristics validation experiment, older subjects make worse decisions than younger subjects.

  15. Mixed Integer Programming and Heuristic Scheduling for Space Communication Networks

    NASA Technical Reports Server (NTRS)

    Lee, Charles H.; Cheung, Kar-Ming

    2012-01-01

    In this paper, we propose to solve the constrained optimization problem in two phases. The first phase uses heuristic methods such as the ant colony method, particle swarm optimization, and the genetic algorithm to seek a near-optimal solution among a list of feasible initial populations. The final optimal solution is then found by using the solution of the first phase as the initial condition for the SQP algorithm. We demonstrate the above problem formulation and optimization schemes with a large-scale network that includes the DSN ground stations and a number of spacecraft of deep space missions.

  16. A two-stage stochastic rule-based model to determine pre-assembly buffer content

    NASA Astrophysics Data System (ADS)

    Gunay, Elif Elcin; Kula, Ufuk

    2018-01-01

    This study considers the instant decision-making needs of automobile manufacturers for resequencing vehicles before final assembly (FA). We propose a rule-based two-stage stochastic model to determine the number of spare vehicles that should be kept in the pre-assembly buffer to restore the sequence altered by paint defects and upstream department constraints. The first stage of the model decides the spare vehicle quantities, while the second stage recovers the scrambled sequence with respect to pre-defined rules. The problem is solved by the sample average approximation (SAA) algorithm. We conduct a numerical study to compare the solutions of the heuristic model with optimal ones and provide the following insights: (i) as the mismatch between the paint entrance and scheduled sequences decreases, the rule-based heuristic model recovers the scrambled sequence as well as the optimal resequencing model; (ii) the rule-based model is more sensitive to the mismatch between the paint entrance and scheduled sequences when recovering the scrambled sequence; (iii) as the defect rate increases, the difference in recovery effectiveness between the rule-based heuristic and optimal solutions increases; (iv) as buffer capacity increases, the recovery effectiveness of the optimization model outperforms the heuristic model; (v) as expected, the rule-based model holds more inventory than the optimization model.

  17. Global Load Balancing with Parallel Mesh Adaption on Distributed-Memory Systems

    NASA Technical Reports Server (NTRS)

    Biswas, Rupak; Oliker, Leonid; Sohn, Andrew

    1996-01-01

    Dynamic mesh adaption on unstructured grids is a powerful tool for efficiently computing unsteady problems to resolve solution features of interest. Unfortunately, this causes load imbalance among processors on a parallel machine. This paper describes the parallel implementation of a tetrahedral mesh adaption scheme and a new global load balancing method. A heuristic remapping algorithm is presented that assigns partitions to processors such that the redistribution cost is minimized. Results indicate that the parallel performance of the mesh adaption code depends on the nature of the adaption region and show a 35.5X speedup on 64 processors of an SP2 when 35% of the mesh is randomly adapted. For large-scale scientific computations, our load balancing strategy gives almost a sixfold reduction in solver execution times over non-balanced loads. Furthermore, our heuristic remapper yields processor assignments that are less than 3% off the optimal solutions but requires only 1% of the computational time.

  18. Analysis of the type II robotic mixed-model assembly line balancing problem

    NASA Astrophysics Data System (ADS)

    Çil, Zeynel Abidin; Mete, Süleyman; Ağpak, Kürşad

    2017-06-01

    In recent years, there has been an increasing trend towards using robots in production systems. Robots are used in different areas such as packaging, transportation, loading/unloading and especially assembly lines. One important step in taking advantage of robots on the assembly line is considering them while balancing the line. On the other hand, market conditions have increased the importance of mixed-model assembly lines. Therefore, in this article, the robotic mixed-model assembly line balancing problem is studied. The aim of this study is to develop a new efficient heuristic algorithm based on beam search in order to minimize the sum of cycle times over all models. In addition, mathematical models of the problem are presented for comparison. The proposed heuristic is tested on benchmark problems and compared with the optimal solutions. The results show that the algorithm is very competitive and is a promising tool for further research.

  19. Computers Simulate Human Experts.

    ERIC Educational Resources Information Center

    Roberts, Steven K.

    1983-01-01

    Discusses recent progress in artificial intelligence in such narrowly defined areas as medical and electronic diagnosis. Also discusses use of expert systems, man-machine communication problems, novel programing environments (including comments on LISP and LISP machines), and types of knowledge used (factual, heuristic, and meta-knowledge). (JN)

  20. Judgment of riskiness: impact of personality, naive theories and heuristic thinking among female students.

    PubMed

    Gana, Kamel; Lourel, Marcel; Trouillet, Raphaël; Fort, Isabelle; Mezred, Djamila; Blaison, Christophe; Boudjemadi, Valerian; K'Delant, Pascaline; Ledrich, Julie

    2010-02-01

    Three different studies were conducted to examine the impact of heuristic reasoning on the perception of health-related events: lifetime risk of breast cancer (Study 1, n = 468), subjective life expectancy (Study 2, n = 449), and subjective age of onset of menopause (Study 3, n = 448). In each study, three experimental conditions were set up: control, anchoring heuristic and availability heuristic. Analyses of covariance controlling for optimism, depressive mood, locus of control, hypochondriac tendencies and subjective health indicated a significant effect of the experimental conditions on perceived breast-cancer risk (p = 0.000), subjective life expectancy (p = 0.000) and subjective onset of menopause (p = 0.000). Indeed, all findings revealed that the availability and anchoring heuristics were used to estimate personal health-related events. The results also revealed that some covariates, hypochondriac tendencies in Study 1, optimism, depressive mood and subjective health in Study 2, and internal locus of control in Study 3, had a significant impact on judgments of riskiness.

  1. Approximation algorithms for a genetic diagnostics problem.

    PubMed

    Kosaraju, S R; Schäffer, A A; Biesecker, L G

    1998-01-01

    We define and study a combinatorial problem called WEIGHTED DIAGNOSTIC COVER (WDC) that models the use of a laboratory technique called genotyping in the diagnosis of an important class of chromosomal aberrations. An optimal solution to WDC would enable us to define a genetic assay that maximizes the diagnostic power for a specified cost of laboratory work. We develop approximation algorithms for WDC by making use of the well-known problem SET COVER for which the greedy heuristic has been extensively studied. We prove worst-case performance bounds on the greedy heuristic for WDC and for another heuristic we call directional greedy. We implemented both heuristics. We also implemented a local search heuristic that takes the solutions obtained by greedy and dir-greedy and applies swaps until they are locally optimal. We report their performance on a real data set that is representative of the options that a clinical geneticist faces for the real diagnostic problem. Many open problems related to WDC remain, both of theoretical interest and practical importance.
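
    The greedy heuristic the authors build on is the classical one for weighted SET COVER: repeatedly take the set with the lowest cost per newly covered element. A minimal generic sketch follows, with WDC's diagnostic weighting abstracted away and all names illustrative.

      def greedy_weighted_set_cover(universe, sets, cost):
          # Pick the set with the best cost / newly-covered-elements ratio each round.
          uncovered = set(universe)
          chosen = []
          while uncovered:
              best_name, best_ratio = None, float("inf")
              for name, elems in sets.items():
                  gain = len(elems & uncovered)
                  if gain > 0 and cost[name] / gain < best_ratio:
                      best_name, best_ratio = name, cost[name] / gain
              if best_name is None:          # remaining elements are uncoverable
                  break
              chosen.append(best_name)
              uncovered -= sets[best_name]
          return chosen

      sets = {"s1": {1, 2, 3}, "s2": {2, 4}, "s3": {3, 4, 5}}
      cost = {"s1": 2.0, "s2": 1.0, "s3": 2.5}
      print(greedy_weighted_set_cover({1, 2, 3, 4, 5}, sets, cost))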

  2. Heuristic for Critical Machine Based Lot Streaming for a Two-Stage Hybrid Production Environment

    NASA Astrophysics Data System (ADS)

    Vivek, P.; Saravanan, R.; Chandrasekaran, M.; Pugazhenthi, R.

    2017-03-01

    Lot streaming in a hybrid flowshop [HFS] is encountered in many real-world problems. This paper deals with a heuristic approach for lot streaming based on critical machine consideration for a two-stage hybrid flowshop. The first stage has two identical parallel machines and the second stage has only one machine, which is considered critical for valid reasons; such problems are known to be NP-hard. A mathematical model was developed for the selected problem. The simulation modelling and analysis were carried out in Extend V6 software. A heuristic was developed for obtaining an optimal lot streaming schedule, and eleven cases of lot streaming were considered. The proposed heuristic was verified and validated by real-time simulation experiments. All possible lot streaming strategies, and all possible sequences under each strategy, were simulated and examined. The heuristic consistently yielded the optimal schedule in all eleven cases. A procedure for identifying the best lot streaming strategy was also suggested.

  3. A similarity score-based two-phase heuristic approach to solve the dynamic cellular facility layout for manufacturing systems

    NASA Astrophysics Data System (ADS)

    Kumar, Ravi; Singh, Surya Prakash

    2017-11-01

    The dynamic cellular facility layout problem (DCFLP) is a well-known NP-hard problem. It has been estimated that the efficient design of a DCFLP reduces the manufacturing cost of products by maintaining the minimum material flow among all machines in all cells, as the material flow contributes around 10-30% of the total product cost. However, being NP-hard, solving the DCFLP optimally is very difficult in reasonable time. Therefore, this article proposes a novel similarity score-based two-phase heuristic approach to solve the DCFLP optimally, considering multiple products to be manufactured over multiple time periods in the manufacturing layout. In the first phase of the proposed heuristic, machine-cell clusters are created based on similarity scores between machines. This is provided as an input to the second phase, which minimizes inter/intracell material handling costs and rearrangement costs over the entire planning period. The solution methodology of the proposed approach is demonstrated. To show the efficiency of the two-phase heuristic approach, 21 instances are generated and solved using the optimization software package LINGO. The results show that the proposed approach can optimally solve the DCFLP in reasonable time.
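
    For the first phase, a standard machine-similarity measure is the Jaccard-type coefficient over the sets of parts each machine processes; the paper's exact scoring may differ, so the Python sketch below is only a stand-in illustration with invented toy data.

    ```python
    def similarity(parts_a, parts_b):
        # McAuley-style similarity coefficient: fraction of parts that
        # visit both machines out of those visiting either machine.
        both = len(parts_a & parts_b)
        either = len(parts_a | parts_b)
        return both / either if either else 0.0

    # parts processed by each machine (hypothetical data)
    routes = {"M1": {"P1", "P2"}, "M2": {"P1", "P2", "P3"}, "M3": {"P4"}}
    for a, b in [("M1", "M2"), ("M1", "M3"), ("M2", "M3")]:
        print(a, b, round(similarity(routes[a], routes[b]), 2))
    ```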

  4. Solving the vehicle routing problem by a hybrid meta-heuristic algorithm

    NASA Astrophysics Data System (ADS)

    Yousefikhoshbakht, Majid; Khorram, Esmaile

    2012-08-01

    The vehicle routing problem (VRP) is one of the most important combinatorial optimization problems and has received much attention because of its real applications in industrial and service problems. The VRP involves routing a fleet of vehicles, each visiting a set of nodes, such that every node is visited exactly once by exactly one vehicle. The objective is to minimize the total distance traveled by all the vehicles. This paper presents a hybrid two-phase algorithm called sweep algorithm (SW) + ant colony system (ACS) for the classical VRP. In the first phase, the VRP is solved by the SW, and in the second phase, the ACS and 3-opt local search are used to improve the solutions. Extensive computational tests on standard instances from the literature confirm the effectiveness of the presented approach.
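
    The first-phase sweep algorithm is simple enough to sketch: customers are sorted by polar angle around the depot and grouped into routes until vehicle capacity is exhausted. A minimal Python illustration under those assumptions (the ACS and 3-opt improvement phase is omitted):

    ```python
    import math

    def sweep_routes(depot, customers, demands, capacity):
        # Sort customers by polar angle around the depot.
        def angle(c):
            x, y = customers[c]
            return math.atan2(y - depot[1], x - depot[0])
        order = sorted(customers, key=angle)
        routes, current, load = [], [], 0
        for c in order:
            # Open a new route when capacity would be exceeded.
            if current and load + demands[c] > capacity:
                routes.append(current)
                current, load = [], 0
            current.append(c)
            load += demands[c]
        if current:
            routes.append(current)
        return routes

    customers = {"A": (1, 1), "B": (-1, 2), "C": (-2, -1), "D": (2, -1)}
    demands = {"A": 3, "B": 4, "C": 2, "D": 3}
    print(sweep_routes((0, 0), customers, demands, capacity=6))
    ```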

  5. Behavioral economics: "nudging" underserved populations to be screened for cancer.

    PubMed

    Purnell, Jason Q; Thompson, Tess; Kreuter, Matthew W; McBride, Timothy D

    2015-01-15

    Persistent disparities in cancer screening by race/ethnicity and socioeconomic status require innovative prevention tools and techniques. Behavioral economics provides tools to potentially reduce disparities by informing strategies and systems to increase prevention of breast, cervical, and colorectal cancers. With an emphasis on the predictable, but sometimes flawed, mental shortcuts (heuristics) people use to make decisions, behavioral economics offers insights that practitioners can use to enhance evidence-based cancer screening interventions that rely on judgments about the probability of developing and detecting cancer, decisions about competing screening options, and the optimal presentation of complex choices (choice architecture). In the area of judgment, we describe ways practitioners can use the availability and representativeness of heuristics and the tendency toward unrealistic optimism to increase perceptions of risk and highlight benefits of screening. We describe how several behavioral economic principles involved in decision-making can influence screening attitudes, including how framing and context effects can be manipulated to highlight personally salient features of cancer screening tests. Finally, we offer suggestions about ways practitioners can apply principles related to choice architecture to health care systems in which cancer screening takes place. These recommendations include the use of incentives to increase screening, introduction of default options, appropriate feedback throughout the decision-making and behavior completion process, and clear presentation of complex choices, particularly in the context of colorectal cancer screening. We conclude by noting gaps in knowledge and propose future research questions to guide this promising area of research and practice.

  6. Behavioral Economics: “Nudging” Underserved Populations to Be Screened for Cancer

    PubMed Central

    Thompson, Tess; Kreuter, Matthew W.; McBride, Timothy D.

    2015-01-01

    Persistent disparities in cancer screening by race/ethnicity and socioeconomic status require innovative prevention tools and techniques. Behavioral economics provides tools to potentially reduce disparities by informing strategies and systems to increase prevention of breast, cervical, and colorectal cancers. With an emphasis on the predictable, but sometimes flawed, mental shortcuts (heuristics) people use to make decisions, behavioral economics offers insights that practitioners can use to enhance evidence-based cancer screening interventions that rely on judgments about the probability of developing and detecting cancer, decisions about competing screening options, and the optimal presentation of complex choices (choice architecture). In the area of judgment, we describe ways practitioners can use the availability and representativeness of heuristics and the tendency toward unrealistic optimism to increase perceptions of risk and highlight benefits of screening. We describe how several behavioral economic principles involved in decision-making can influence screening attitudes, including how framing and context effects can be manipulated to highlight personally salient features of cancer screening tests. Finally, we offer suggestions about ways practitioners can apply principles related to choice architecture to health care systems in which cancer screening takes place. These recommendations include the use of incentives to increase screening, introduction of default options, appropriate feedback throughout the decision-making and behavior completion process, and clear presentation of complex choices, particularly in the context of colorectal cancer screening. We conclude by noting gaps in knowledge and propose future research questions to guide this promising area of research and practice. PMID:25590600

  7. A Comparison of Genetic Programming Variants for Hyper-Heuristics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harris, Sean

    Modern society is faced with ever more complex problems, many of which can be formulated as generate-and-test optimization problems. General-purpose optimization algorithms are not well suited for real-world scenarios where many instances of the same problem class need to be repeatedly and efficiently solved, such as routing vehicles over highways with constantly changing traffic flows, because they are not targeted to a particular scenario. Hyper-heuristics automate the design of algorithms to create a custom algorithm for a particular scenario. Hyper-heuristics typically employ Genetic Programming (GP), and this project has investigated the relationship between the choice of GP and performance in hyper-heuristics. Results are presented demonstrating the existence of problems for which there is a statistically significant performance differential between the use of different types of GP.

  8. Use of Statistical Heuristics in Everyday Inductive Reasoning.

    ERIC Educational Resources Information Center

    Nisbett, Richard E.; And Others

    1983-01-01

    In everyday reasoning, people use statistical heuristics (judgmental tools that are rough intuitive equivalents of statistical principles). Use of statistical heuristics is more likely when (1) sampling is clear, (2) the role of chance is clear, (3) statistical reasoning is normative for the event, or (4) the subject has had training in…

  9. The Development, Implementation, and Evaluation of a Problem Solving Heuristic

    ERIC Educational Resources Information Center

    Lorenzo, Mercedes

    2005-01-01

    Problem-solving is one of the main goals in science teaching and is something many students find difficult. This research reports on the development, implementation and evaluation of a problem-solving heuristic. This heuristic intends to help students to understand the steps involved in problem solving (metacognitive tool), and to provide them…

  10. Multi-objective Decision Based Available Transfer Capability in Deregulated Power System Using Heuristic Approaches

    NASA Astrophysics Data System (ADS)

    Pasam, Gopi Krishna; Manohar, T. Gowri

    2016-09-01

    Determination of available transfer capability (ATC) requires the use of experience, intuition and exact judgment in order to meet several significant aspects of the deregulated environment. Based on these points, this paper proposes two heuristic approaches to compute ATC. The first proposed heuristic algorithm integrates five methods, namely continuation repeated power flow, repeated optimal power flow, radial basis function neural network, back propagation neural network and adaptive neuro-fuzzy inference system, to obtain ATC. The second proposed heuristic model is used to obtain multiple ATC values. Out of these, a specific ATC value is selected based on a number of social, economic and deregulated environmental constraints and on specific applications like optimization, on-line monitoring and ATC forecasting; this is known as multi-objective decision based optimal ATC. The validity of the results obtained through these proposed methods is scrupulously verified on various buses of the IEEE 24-bus reliable test system. The results presented and the conclusions derived in this paper are very useful for planning, operation and maintenance of reliable power in any power system, and for its monitoring in an on-line environment of a deregulated power system. In this way, the proposed heuristic methods contribute the best possible approach to assessing multi-objective ATC using integrated methods.

  11. Evaluation of Effective Factors on Travel Time in Optimization of Bus Stops Placement Using Genetic Algorithm

    NASA Astrophysics Data System (ADS)

    Bargegol, Iraj; Ghorbanzadeh, Mahyar; Ghasedi, Meisam; Rastbod, Mohammad

    2017-10-01

    In congested cities, locating and properly designing bus stops according to the unequal distribution of passengers is a crucial issue, both economically and functionally, since it plays an important role in passengers' use of the bus system. The location of bus stops is a complicated subject; reducing the distances between stops decreases walking time, but the total travel time may increase. In this paper, a specified corridor in the city of Rasht in northern Iran is studied. First, a new formula is presented to calculate the travel time, by which the number of stops and, consequently, the travel time can be optimized. A corridor with a specified number of stops and distances between them is addressed, the related travel-time formulas are derived, and its travel time is calculated. The corridor is then modelled using a meta-heuristic method so that the optimal placement of and distances between bus stops are determined. It was found that alighting and boarding time, along with bus capacity, are the factors that most affect travel time. Consequently, it is better to concentrate on these factors to improve the efficiency of the bus system.

  12. Guideline validation in multiple trauma care through business process modeling.

    PubMed

    Stausberg, Jürgen; Bilir, Hüseyin; Waydhas, Christian; Ruchholtz, Steffen

    2003-07-01

    Clinical guidelines can improve the quality of care in multiple trauma. In our Department of Trauma Surgery, a specific guideline is available in paper form as a set of flowcharts. This format is appropriate for use by experienced physicians but insufficient for electronic support of learning, workflow and process optimization. A formal and logically consistent version, represented with a standardized meta-model, is necessary for automatic processing. In our project we transferred the paper-based guideline into an electronic format and analyzed its structure with respect to formal errors. Several errors were detected in seven error categories. The errors were corrected to reach a formally and logically consistent process model. In a second step, the clinical content of the guideline was revised interactively using a process-modeling tool. Our study reveals that guideline development should be assisted by process-modeling tools which check the content against a meta-model. The meta-model itself could support the domain experts in formulating their knowledge systematically. To assure the sustainability of guideline development, a representation independent of specific applications or specific providers is necessary. Clinical guidelines could then additionally be used for eLearning, process optimization and workflow management.

  13. Optimization Techniques for Clustering,Connectivity, and Flow Problems in Complex Networks

    DTIC Science & Technology

    2012-10-01

    …discrete optimization and for analysis of performance of algorithm portfolios; introducing a metaheuristic framework of variable objective search that… The results of empirical evaluation of the proposed algorithm are also included. … Theoretical analysis of heuristics and designing new metaheuristic… analysis of heuristics for inapproximable problems and designing new metaheuristic approaches for the problems of interest; (IV) Developing new models…

  14. Motion generation of peristaltic mobile robot with particle swarm optimization algorithm

    NASA Astrophysics Data System (ADS)

    Homma, Takahiro; Kamamichi, Norihiro

    2015-03-01

    In robot development, bio-mimetics, the design of structures and functions inspired by biological systems, is attracting attention. There are many examples of bio-mimetics in robotics, such as legged robots, flapping robots, insect-type robots and fish-type robots. In this study, we focus on the motion of the earthworm and aim to develop a peristaltic mobile robot. The earthworm is a slender animal moving in soil. It has a segmented body, and each segment can be shortened and lengthened by muscular action. It moves forward by propagating the expanding motions of its segments backward along the body. By mimicking the structure and motion of the earthworm, we can construct a robot with high locomotive performance on irregular ground or in narrow spaces. In this paper, to investigate the motion analytically, a dynamical model is introduced, which consists of a series-connected multi-mass model. Simple periodic patterns which mimic the motions of earthworms are applied in an open-loop fashion, and the moving patterns are verified through numerical simulations. Furthermore, to generate efficient motion of the robot, a particle swarm optimization algorithm, one of the meta-heuristic optimization methods, is applied. The optimized results are investigated by comparing them to the simple periodic patterns.
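
    A minimal global-best PSO sketch is shown below for reference; the robot-gait objective from the paper is replaced by a generic test function, so this illustrates the meta-heuristic itself rather than the authors' simulation.

    ```python
    import random

    def pso(f, dim, bounds, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
        # Standard global-best particle swarm optimizer (minimization).
        lo, hi = bounds
        pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
        vel = [[0.0] * dim for _ in range(n_particles)]
        pbest = [p[:] for p in pos]
        pbest_val = [f(p) for p in pos]
        g = min(range(n_particles), key=lambda i: pbest_val[i])
        gbest, gbest_val = pbest[g][:], pbest_val[g]
        for _ in range(iters):
            for i in range(n_particles):
                for d in range(dim):
                    r1, r2 = random.random(), random.random()
                    vel[i][d] = (w * vel[i][d]
                                 + c1 * r1 * (pbest[i][d] - pos[i][d])
                                 + c2 * r2 * (gbest[d] - pos[i][d]))
                    pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
                val = f(pos[i])
                if val < pbest_val[i]:
                    pbest[i], pbest_val[i] = pos[i][:], val
                    if val < gbest_val:
                        gbest, gbest_val = pos[i][:], val
        return gbest, gbest_val

    sphere = lambda x: sum(v * v for v in x)
    print(pso(sphere, dim=3, bounds=(-5.0, 5.0)))
    ```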

  15. Optimal deployment of thermal energy storage under diverse economic and climate conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DeForest, Nicholas; Mendes, Gonçalo; Stadler, Michael

    2014-04-01

    This paper presents an investigation of the economic benefit of thermal energy storage (TES) for cooling across a range of economic and climate conditions. Chilled water TES systems are simulated for a large office building in four distinct locations: Miami in the U.S.; Lisbon, Portugal; Shanghai, China; and Mumbai, India. Optimal system size and operating schedules are determined using the optimization model DER-CAM, such that total cost, including electricity and amortized capital costs, is minimized. The economic impact of each optimized TES system is then compared to systems sized using a simple heuristic method, which bases system size on a fraction (50% and 100%) of total on-peak summer cooling loads. Results indicate that TES systems of all sizes can be effective in reducing annual electricity costs (5%-15%) and peak electricity consumption (13%-33%). The investigation also identifies a number of criteria which drive TES investment, including low capital costs, electricity tariffs with high power demand charges, and prolonged cooling seasons. In locations where these drivers clearly exist, the heuristically sized systems capture much of the value of optimally sized systems, between 60% and 100% in terms of net present value. However, in instances where these drivers are less pronounced, the heuristic tends to oversize systems, and optimization becomes crucial to ensure economically beneficial deployment of TES, increasing the net present value of heuristically sized systems by as much as 10 times in some instances.

  16. Reasoned versus reactive prediction of behaviour: a meta-analysis of the prototype willingness model.

    PubMed

    Todd, Jemma; Kothe, Emily; Mullan, Barbara; Monds, Lauren

    2016-01-01

    The prototype willingness model (PWM) was designed to extend expectancy-value models of health behaviour by also including a heuristic, or social reactive, pathway to better explain health-risk behaviours in adolescents and young adults. The pathway includes prototypes, i.e., images of a typical person who engages in a behaviour, and willingness to engage in the behaviour. The current study describes a meta-analysis of predictive research using the PWM and explores the role of the heuristic pathway and intentions in predicting behaviour. Eighty-one studies met the inclusion criteria. Overall, the PWM was supported and explained 20.5% of the variance in behaviour. Willingness explained 4.9% of the variance in behaviour over and above intention, although intention tended to be more strongly related to behaviour than was willingness. The strength of the PWM relationships tended to vary according to the behaviour being tested, with alcohol consumption being the behaviour best explained. Age was also an important moderator, and, as expected, behaviour was best accounted for by the PWM within adolescent samples. Results were heterogeneous even after moderators were taken into consideration. This meta-analysis provides support for the PWM and may be used to inform future interventions that can be tailored for at-risk populations.

  17. Ant colony optimisation-direct cover: a hybrid ant colony direct cover technique for multi-level synthesis of multiple-valued logic functions

    NASA Astrophysics Data System (ADS)

    Abd-El-Barr, Mostafa

    2010-12-01

    The use of non-binary (multiple-valued) logic in the synthesis of digital systems can lead to savings in chip area. Advances in very large scale integration (VLSI) technology have enabled the successful implementation of multiple-valued logic (MVL) circuits. A number of heuristic algorithms for the synthesis of (near) minimal sum-of-products (two-level) realisations of MVL functions have been reported in the literature. The direct cover (DC) technique is one such algorithm. The ant colony optimisation (ACO) algorithm is a meta-heuristic that uses constructive greediness to explore a large solution space in finding (near) optimal solutions. The ACO algorithm mimics the real-world behaviour of ants in using the shortest path to reach food sources. We have previously introduced an ACO-based heuristic for the synthesis of two-level MVL functions. In this article, we introduce the ACO-DC hybrid technique for the synthesis of multi-level MVL functions. The basic idea is to use an ant to decompose a given MVL function into a number of levels and then synthesise each sub-function using a DC-based technique. The results obtained using the proposed approach are compared to those obtained using existing techniques reported in the literature. A benchmark set consisting of 50,000 randomly generated 2-variable 4-valued functions is used in the comparison. The proposed ACO-DC technique is shown to produce efficient realisations in terms of the average number of gates (as a measure of chip area) needed for the synthesis of a given MVL function.

  18. Combined computational-experimental design of high temperature, high-intensity permanent magnetic alloys with minimal addition of rare-earth elements

    NASA Astrophysics Data System (ADS)

    Jha, Rajesh

    AlNiCo magnets are known for high-temperature stability and superior corrosion resistance and have been widely used for various applications. The reported magnetic energy density (BH)max for these magnets is around 10 MGOe. Theoretical calculations show that a (BH)max of 20 MGOe is achievable, which will help bridge the gap between AlNiCo and rare-earth-element (REE) based magnets. An extended family of AlNiCo alloys consisting of eight elements was studied in this dissertation, and hence it is important to determine the composition-property relationships between each of the alloying elements and their influence on the bulk properties. In the present research, we proposed a novel approach to efficiently use a set of computational tools based on several concepts of artificial intelligence to address the complex problem of design and optimization of high-temperature REE-free magnetic alloys. A multi-dimensional random number generation algorithm was used to generate the initial set of chemical concentrations. These alloys were then examined for phase equilibria and associated magnetic properties as a screening tool to form the initial set of alloys. These alloys were manufactured and tested for the desired properties. The measured properties were fitted with a set of multi-dimensional response surfaces, and the most accurate meta-models were chosen for prediction. The properties were simultaneously extremized by a set of multi-objective optimization algorithms, which provided the concentrations of each of the alloying elements for optimized properties. A few of the best predicted Pareto-optimal alloy compositions were then manufactured and tested to evaluate the predicted properties. These alloys were added to the existing data set and used to improve the accuracy of the meta-models. The multi-objective optimizer then used the new meta-models to find a new set of improved Pareto-optimized chemical concentrations. This design cycle was repeated twelve times in this work. Several of these Pareto-optimized alloys outperformed most of the candidate alloys on most of the objectives. Unsupervised learning methods such as Principal Component Analysis (PCA) and Hierarchical Cluster Analysis (HCA) were used to discover various patterns within the dataset. This demonstrates the efficacy of the combined meta-modeling and experimental approach in the design optimization of magnetic alloys.

  19. Design and usability of heuristic-based deliberation tools for women facing amniocentesis.

    PubMed

    Durand, Marie-Anne; Wegwarth, Odette; Boivin, Jacky; Elwyn, Glyn

    2012-03-01

    Evidence suggests that in decision contexts characterized by uncertainty and time constraints (e.g. health-care decisions), fast and frugal decision-making strategies (heuristics) may perform better than complex rules of reasoning. To examine whether it is possible to design deliberation components in decision support interventions using simple models (fast and frugal heuristics). The 'Take The Best' heuristic (i.e. selection of a 'most important reason') and 'The Tallying' integration algorithm (i.e. unitary weighing of pros and cons) were used to develop two deliberation components embedded in a Web-based decision support intervention for women facing amniocentesis testing. Ten researchers (recruited from 15), nine health-care providers (recruited from 28) and ten pregnant women (recruited from 14) who had recently been offered amniocentesis testing appraised evolving versions of 'your most important reason' (Take The Best) and 'weighing it up' (Tallying). Most researchers found the tools useful in facilitating decision making although emphasized the need for simple instructions and clear layouts. Health-care providers however expressed concerns regarding the usability and clarity of the tools. By contrast, 7 out of 10 pregnant women found the tools useful in weighing up the pros and cons of each option, helpful in structuring and clarifying their thoughts and visualizing their decision efforts. Several pregnant women felt that 'weighing it up' and 'your most important reason' were not appropriate when facing such a difficult and emotional decision. Theoretical approaches based on fast and frugal heuristics can be used to develop deliberation tools that provide helpful support to patients facing real-world decisions about amniocentesis. © 2011 Blackwell Publishing Ltd.
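
    Both deliberation components rest on very simple decision rules that can be made concrete in a few lines. The sketch below shows the generic 'Take The Best' and tallying rules, not the authors' implementation; the cue names are invented purely for illustration.

    ```python
    def take_the_best(option_a, option_b, cues):
        # Examine cues in order of validity (cues assumed pre-sorted);
        # decide on the first cue that discriminates between the options.
        for cue in cues:
            if option_a[cue] != option_b[cue]:
                return "a" if option_a[cue] > option_b[cue] else "b"
        return "tie"

    def tallying(option, pros, cons):
        # Unit-weight integration: count of pros minus count of cons.
        return sum(option[c] for c in pros) - sum(option[c] for c in cons)

    a = {"risk_known": 1, "doctor_advice": 1, "cost_ok": 0}  # hypothetical cues
    b = {"risk_known": 1, "doctor_advice": 0, "cost_ok": 1}
    print(take_the_best(a, b, ["risk_known", "doctor_advice", "cost_ok"]))
    print(tallying(a, pros=["risk_known", "doctor_advice"], cons=["cost_ok"]))
    ```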

  20. Dynamic cellular manufacturing system considering machine failure and workload balance

    NASA Astrophysics Data System (ADS)

    Rabbani, Masoud; Farrokhi-Asl, Hamed; Ravanbakhsh, Mohammad

    2018-02-01

    Machines are a key element in production systems and their failure causes irreparable effects in terms of cost and time. In this paper, a new multi-objective mathematical model for a dynamic cellular manufacturing system (DCMS) is provided with consideration of machine reliability and alternative process routes. In this dynamic model, we attempt to resolve the problem of integrated family (part/machine cell) formation as well as the assignment of operators to the cells. The first objective minimizes the costs associated with the DCMS. The second objective optimizes labor utilization and, finally, a minimum value of the variance of workload between different cells is obtained by the third objective function. Due to the NP-hard nature of the cellular manufacturing problem, the model is initially validated by the GAMS software on small-sized problems, and then solved by two well-known meta-heuristic methods, the non-dominated sorting genetic algorithm and multi-objective particle swarm optimization, on large-scale problems. Finally, the results of the two algorithms are compared with respect to five different comparison metrics.

  1. To connect or not to connect? Modelling the optimal degree of centralisation for wastewater infrastructures.

    PubMed

    Eggimann, Sven; Truffer, Bernhard; Maurer, Max

    2015-11-01

    The strong reliance of most utility services on centralised network infrastructures is becoming increasingly challenged by new technological advances in decentralised alternatives. However, not enough effort has been made to develop planning tools designed to address the implications of these new opportunities and to determine the optimal degree of centralisation of these infrastructures. We introduce a planning tool for sustainable network infrastructure planning (SNIP), a two-step techno-economic heuristic modelling approach based on shortest path-finding and hierarchical-agglomerative clustering algorithms to determine the optimal degree of centralisation in the field of wastewater management. This SNIP model optimises the distribution of wastewater treatment plants and the sewer network outlay relative to several cost and sewer-design parameters. Moreover, it allows us to construct alternative optimal wastewater system designs taking into account topography, economies of scale as well as the full size range of wastewater treatment plants. We quantify and confirm that the optimal degree of centralisation decreases with increasing terrain complexity and settlement dispersion while showing that the effect of the latter exceeds that of topography. Case study results for a Swiss community indicate that the calculated optimal degree of centralisation is substantially lower than the current level. Copyright © 2015 Elsevier Ltd. All rights reserved.

  2. Efficient heuristics for maximum common substructure search.

    PubMed

    Englert, Péter; Kovács, Péter

    2015-05-26

    Maximum common substructure search is a computationally hard optimization problem with diverse applications in the field of cheminformatics, including similarity search, lead optimization, molecule alignment, and clustering. Most of these applications have strict constraints on running time, so heuristic methods are often preferred. However, the development of an algorithm that is both fast enough and accurate enough for most practical purposes is still a challenge. Moreover, in some applications, the quality of a common substructure depends not only on its size but also on various topological features of the one-to-one atom correspondence it defines. Two state-of-the-art heuristic algorithms for finding maximum common substructures have been implemented at ChemAxon Ltd., and effective heuristics have been developed to improve both their efficiency and the relevance of the atom mappings they provide. The implementations have been thoroughly evaluated and compared with existing solutions (KCOMBU and Indigo). The heuristics have been found to greatly improve the performance and applicability of the algorithms. The purpose of this paper is to introduce the applied methods and present the experimental results.

  3. Optimal rail container shipment planning problem in multimodal transportation

    NASA Astrophysics Data System (ADS)

    Cao, Chengxuan; Gao, Ziyou; Li, Keping

    2012-09-01

    The optimal rail container shipment planning problem in multimodal transportation is studied in this article. The characteristics of the multi-period planning problem are presented and the problem is formulated as a large-scale 0-1 integer programming model, which maximizes the total profit generated by all freight bookings accepted in a multi-period planning horizon subject to the limited capacities. Two heuristic algorithms are proposed to obtain an approximate optimal solution of the problem. Finally, numerical experiments are conducted to demonstrate the proposed formulation and heuristic algorithms.

  4. Automated sequence-specific protein NMR assignment using the memetic algorithm MATCH.

    PubMed

    Volk, Jochen; Herrmann, Torsten; Wüthrich, Kurt

    2008-07-01

    MATCH (Memetic Algorithm and Combinatorial Optimization Heuristics) is a new memetic algorithm for automated sequence-specific polypeptide backbone NMR assignment of proteins. MATCH employs local optimization for tracing partial sequence-specific assignments within a global, population-based search environment, where the simultaneous application of local and global optimization heuristics guarantees high efficiency and robustness. MATCH thus makes combined use of the two predominant concepts in use for automated NMR assignment of proteins. Dynamic transition and inherent mutation are new techniques that enable automatic adaptation to variable quality of the experimental input data. The concept of dynamic transition is incorporated in all major building blocks of the algorithm, where it enables switching between local and global optimization heuristics at any time during the assignment process. Inherent mutation restricts the intrinsically required randomness of the evolutionary algorithm to those regions of the conformation space that are compatible with the experimental input data. Using intact and artificially deteriorated APSY-NMR input data of proteins, MATCH performed sequence-specific resonance assignment with high efficiency and robustness.
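
    The combination of population-based (global) search with local refinement that defines a memetic algorithm can be sketched on a toy bit-string problem. The Python sketch below illustrates only the generic scheme, not MATCH's NMR-specific machinery; the ones-max objective and all parameters are illustrative assumptions.

    ```python
    import random

    def hill_climb(bits, fitness, tries=20):
        # Local refinement: greedy single-bit flips.
        best, best_fit = bits[:], fitness(bits)
        for _ in range(tries):
            cand = best[:]
            cand[random.randrange(len(bits))] ^= 1
            f = fitness(cand)
            if f > best_fit:
                best, best_fit = cand, f
        return best

    def memetic(fitness, n_bits=20, pop_size=20, gens=50):
        # Genetic algorithm whose offspring are refined by local search.
        pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
        for _ in range(gens):
            pop.sort(key=fitness, reverse=True)
            parents = pop[: pop_size // 2]
            children = []
            while len(children) < pop_size - len(parents):
                p1, p2 = random.sample(parents, 2)
                cut = random.randrange(1, n_bits)
                child = p1[:cut] + p2[cut:]               # one-point crossover
                if random.random() < 0.2:                 # mutation
                    child[random.randrange(n_bits)] ^= 1
                children.append(hill_climb(child, fitness))  # the "memetic" step
            pop = parents + children
        return max(pop, key=fitness)

    print(sum(memetic(sum)))  # ones-max: result should be near 20
    ```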

  5. QuickVina: accelerating AutoDock Vina using gradient-based heuristics for global optimization.

    PubMed

    Handoko, Stephanus Daniel; Ouyang, Xuchang; Su, Chinh Tran To; Kwoh, Chee Keong; Ong, Yew Soon

    2012-01-01

    Predicting binding between a macromolecule and a small molecule is a crucial phase in the field of rational drug design. AutoDock Vina, one of the most widely used docking software packages, released in 2009, uses an empirical scoring function to evaluate the binding affinity between the molecules and employs the iterated local search global optimizer for global optimization, achieving significantly improved speed and better accuracy of binding mode prediction compared to its predecessor, AutoDock 4. In this paper, we propose a further improvement in the local search algorithm of Vina by heuristically preventing some intermediate points from undergoing local search. Our improved version of Vina, dubbed QVina, achieved a maximum acceleration of about 25 times, with an average speed-up of 8.34 times compared to the original Vina when tested on a set of 231 protein-ligand complexes, while keeping the optimal scores mostly identical. Using our heuristics, a larger number of different ligands can be quickly screened against a given receptor within the same time frame.

  6. Metaheuristic simulation optimisation for the stochastic multi-retailer supply chain

    NASA Astrophysics Data System (ADS)

    Omar, Marina; Mustaffa, Noorfa Haszlinna H.; Othman, Siti Norsyahida

    2013-04-01

    Supply Chain Management (SCM) is an important activity in all producing facilities and in many organizations, enabling vendors, manufacturers and suppliers to interact gainfully and plan the flow of goods and services optimally. Simulation optimization approaches are now widely used in research on finding the best solutions for decision-making processes in SCM, which generally face complexity, large sources of uncertainty and various decision factors. The meta-heuristic method is the most popular simulation optimization approach; however, very few studies have applied it to optimizing simulation models for supply chains. Thus, this paper evaluates the performance of a meta-heuristic method for stochastic supply chains in determining the flexible inventory replenishment parameters that minimize the total operating cost. The simulation optimization model is based on the Bees Algorithm (BA), which has been widely applied in engineering applications such as training neural networks for pattern recognition. BA is a recent member of the meta-heuristics family that models the natural food-foraging behaviour of honey bees. Honey bees use several mechanisms, like the waggle dance, to locate food sources optimally and to search for new ones, which makes them a good candidate for developing new optimization algorithms. The model considers an outbound centralised distribution system consisting of one supplier and three identical retailers, where demand is assumed to be independent and identically distributed and supply capacity at the supplier is unlimited.
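
    For reference, a minimal Bees Algorithm sketch is given below, with the supply-chain simulation replaced by a plain test function; the parameter names and values (number of scouts, selected sites, recruits, patch size) follow common BA descriptions and are assumptions here, not the paper's settings.

    ```python
    import random

    def bees_algorithm(f, dim, bounds, n_scouts=20, n_best=5, n_recruits=10,
                       patch=0.5, iters=100):
        # Scouts sample randomly; the best sites recruit bees for local
        # (neighborhood) search; the remaining scouts keep exploring.
        lo, hi = bounds
        rand_point = lambda: [random.uniform(lo, hi) for _ in range(dim)]
        sites = [rand_point() for _ in range(n_scouts)]
        for _ in range(iters):
            sites.sort(key=f)
            new_sites = []
            for site in sites[:n_best]:  # neighborhood search around best sites
                best_local = site
                for _ in range(n_recruits):
                    cand = [min(hi, max(lo, x + random.uniform(-patch, patch)))
                            for x in site]
                    if f(cand) < f(best_local):
                        best_local = cand
                new_sites.append(best_local)
            # remaining scouts explore randomly
            new_sites += [rand_point() for _ in range(n_scouts - n_best)]
            sites = new_sites
        return min(sites, key=f)

    sphere = lambda x: sum(v * v for v in x)
    print(sphere(bees_algorithm(sphere, dim=2, bounds=(-5.0, 5.0))))
    ```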

  7. MetaNET--a web-accessible interactive platform for biological metabolic network analysis.

    PubMed

    Narang, Pankaj; Khan, Shawez; Hemrom, Anmol Jaywant; Lynn, Andrew Michael

    2014-01-01

    Metabolic reactions have been extensively studied and compiled over the last century. These have provided a theoretical base to implement models, simulations of which are used to identify drug targets and optimize metabolic throughput at a systemic level. While tools for the perturbation of metabolic networks are available, their applications are limited and restricted, as they require varied dependencies and often a commercial platform for full functionality. We have developed MetaNET, an open-source, user-friendly, platform-independent and web-accessible resource consisting of several pre-defined workflows for metabolic network analysis. MetaNET is a web-accessible platform that incorporates a range of functions which can be combined to produce different simulations related to metabolic networks. These include: (i) optimization of an objective function for the wild-type strain and gene/catalyst/reaction knock-out/knock-down analysis using flux balance analysis; (ii) flux variability analysis; (iii) chemical species participation; (iv) identification of cycles and extreme paths; and (v) choke-point reaction analysis to facilitate identification of potential drug targets. The platform is built using custom scripts along with the open-source Galaxy workflow and the Systems Biology Research Tool as components. Pre-defined workflows are available for common processes, and an exhaustive list of over 50 functions is provided for user-defined workflows. MetaNET, available at http://metanet.osdd.net , provides a user-friendly, rich interface allowing the analysis of genome-scale metabolic networks under various genetic and environmental conditions. The framework permits the storage of previous results, the ability to repeat analyses and share results with other users over the internet, as well as running different tools simultaneously using pre-defined and user-created custom workflows.

  8. Joint optimization of maintenance, buffers and machines in manufacturing lines

    NASA Astrophysics Data System (ADS)

    Nahas, Nabil; Nourelfath, Mustapha

    2018-01-01

    This article considers a series manufacturing line composed of several machines separated by intermediate buffers of finite capacity. The goal is to find the optimal number of preventive maintenance actions performed on each machine, the optimal selection of machines and the optimal buffer allocation plan that minimize the total system cost, while providing the desired system throughput level. The mean times between failures of all machines are assumed to increase when applying periodic preventive maintenance. To estimate the production line throughput, a decomposition method is used. The decision variables in the formulated optimal design problem are buffer levels, types of machines and times between preventive maintenance actions. Three heuristic approaches are developed to solve the formulated combinatorial optimization problem. The first heuristic consists of a genetic algorithm, the second is based on the nonlinear threshold accepting metaheuristic and the third is an ant colony system. The proposed heuristics are compared and their efficiency is shown through several numerical examples. It is found that the nonlinear threshold accepting algorithm outperforms the genetic algorithm and ant colony system, while the genetic algorithm provides better results than the ant colony system for longer manufacturing lines.
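
    The nonlinear threshold accepting metaheuristic mentioned above differs from simulated annealing mainly in its deterministic acceptance rule. A generic Python sketch follows, with a toy one-dimensional objective standing in for the line-design problem; all parameter values are illustrative.

    ```python
    import random

    def threshold_accepting(f, x0, neighbor, t0=1.0, decay=0.95, iters=2000):
        # Accept a move deterministically whenever the cost increase
        # stays below the current threshold, which shrinks over time.
        x, fx, t = x0, f(x0), t0
        best, best_f = x, fx
        for _ in range(iters):
            y = neighbor(x)
            fy = f(y)
            if fy - fx < t:          # small deteriorations are accepted too
                x, fx = y, fy
                if fx < best_f:
                    best, best_f = x, fx
            t *= decay               # tighten the threshold
        return best, best_f

    f = lambda x: (x - 3.0) ** 2 + 2.0
    step = lambda x: x + random.uniform(-0.5, 0.5)
    print(threshold_accepting(f, x0=10.0, neighbor=step))
    ```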

  9. Complex Chemical Reaction Networks from Heuristics-Aided Quantum Chemistry.

    PubMed

    Rappoport, Dmitrij; Galvin, Cooper J; Zubarev, Dmitry Yu; Aspuru-Guzik, Alán

    2014-03-11

    While structures and reactivities of many small molecules can be computed efficiently and accurately using quantum chemical methods, heuristic approaches remain essential for modeling complex structures and large-scale chemical systems. Here, we present a heuristics-aided quantum chemical methodology applicable to complex chemical reaction networks such as those arising in cell metabolism and prebiotic chemistry. Chemical heuristics offer an expedient way of traversing high-dimensional reactive potential energy surfaces and are combined here with quantum chemical structure optimizations, which yield the structures and energies of the reaction intermediates and products. Application of heuristics-aided quantum chemical methodology to the formose reaction reproduces the experimentally observed reaction products, major reaction pathways, and autocatalytic cycles.

  10. Heuristics: foundations for a novel approach to medical decision making.

    PubMed

    Bodemer, Nicolai; Hanoch, Yaniv; Katsikopoulos, Konstantinos V

    2015-03-01

    Medical decision-making is a complex process that often takes place under uncertainty, that is, when knowledge, time, and resources are limited. How can we ensure good decisions? We present research on heuristics, simple rules of thumb, and discuss how medical decision-making can benefit from these tools. We challenge the common view that heuristics are only second-best solutions by showing that they can be more accurate, faster, and easier to apply in comparison to more complex strategies. Using the example of fast-and-frugal decision trees, we illustrate how heuristics can be studied and implemented in the medical context. Finally, we suggest how a heuristic-friendly culture supports the study and application of heuristics as complementary strategies to existing decision rules.

  11. The quasi-optimality criterion in the linear functional strategy

    NASA Astrophysics Data System (ADS)

    Kindermann, Stefan; Pereverzyev, Sergiy, Jr.; Pilipenko, Andrey

    2018-07-01

    The linear functional strategy for the regularization of inverse problems is considered. For selecting the regularization parameter therein, we propose the heuristic quasi-optimality principle and some modifications that take into account the smoothness of the linear functionals. We prove convergence rates for the linear functional strategy with these heuristic rules, taking into account the smoothness of the solution and the functionals and imposing a structural condition on the noise. Furthermore, we study these noise conditions in both a deterministic and a stochastic setup and verify that for mildly ill-posed problems and Gaussian noise these conditions are satisfied almost surely, whereas in the severely ill-posed case, in a similar setup, the corresponding noise condition fails to hold. Moreover, we propose an aggregation method for adaptively optimizing the parameter choice rule by making use of improved rates for linear functionals. Numerical results indicate that this method yields better results than the standard heuristic rule.
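
    For orientation, the quasi-optimality principle in its standard form selects, on a geometric grid of regularization parameters, the parameter minimizing the difference between consecutive regularized solutions; the paper adapts this rule to linear functionals. In the usual notation (an assumption here, not the paper's exact formulation):

    ```latex
    % Quasi-optimality rule on a geometric grid of regularization
    % parameters: choose the index minimizing the distance between
    % consecutive regularized solutions x_{\alpha_k}.
    \[
      k_{*} \;=\; \arg\min_{k}\,
      \bigl\| x_{\alpha_{k+1}} - x_{\alpha_{k}} \bigr\|,
      \qquad \alpha_{k} = \alpha_{0}\, q^{k}, \quad 0 < q < 1 .
    \]
    ```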

  12. Meta-tools for software development and knowledge acquisition

    NASA Technical Reports Server (NTRS)

    Eriksson, Henrik; Musen, Mark A.

    1992-01-01

    The effectiveness of tools that provide support for software development is highly dependent on the match between the tools and their task. Knowledge-acquisition (KA) tools constitute a class of development tools targeted at knowledge-based systems. Generally, KA tools that are custom-tailored for particular application domains are more effective than are general KA tools that cover a large class of domains. The high cost of custom-tailoring KA tools manually has encouraged researchers to develop meta-tools for KA tools. Current research issues in meta-tools for knowledge acquisition are the specification styles, or meta-views, for target KA tools used, and the relationships between the specification entered in the meta-tool and other specifications for the target program under development. We examine different types of meta-views and meta-tools. Our current project is to provide meta-tools that produce KA tools from multiple specification sources--for instance, from a task analysis of the target application.

  13. Meta-control of combustion performance with a data mining approach

    NASA Astrophysics Data System (ADS)

    Song, Zhe

    Large-scale combustion processes are complex and pose challenges for optimizing their performance. Traditional approaches based on thermal dynamics have limitations in finding optimal operating regions due to the time-shifting nature of the process. Recent advances in information technology enable people to collect large volumes of process data easily and continuously. The collected process data contain rich information about the process and, to some extent, represent a digital copy of the process over time. Although large volumes of data exist in industrial combustion processes, they are not fully utilized to the level where the process can be optimized. Data mining is an emerging science which finds patterns or models in large data sets. It has found many successful applications in business marketing, medical and manufacturing domains. The focus of this dissertation is on applying data mining to industrial combustion processes, and ultimately optimizing combustion performance. However, the philosophy, methods and frameworks discussed in this research can also be applied to other industrial processes. Optimizing an industrial combustion process poses two major challenges. One is that the underlying process model changes over time, and obtaining an accurate process model is nontrivial. The other is that a process model with high fidelity is usually highly nonlinear, so solving the optimization problem needs efficient heuristics. This dissertation sets out to solve these two major challenges. The major contribution of this four-year research is a data-driven solution for optimizing the combustion process, in which a process model or knowledge is identified from the process data, and optimization is then executed by evolutionary algorithms to search for optimal operating regions.

  14. Writing-to-Learn in High-School Chemistry: The Effects of Using The Science Writing Heuristic to Increase Scientific Literacy

    ERIC Educational Resources Information Center

    Nurnberg, Denae

    2017-01-01

    The purpose of this study was to investigate the effectiveness of using the Science Writing Heuristic (SWH) as an instructional tool to improve academic achievement and writing in the context of scientific literacy. This…

  15. Global Load Balancing with Parallel Mesh Adaption on Distributed-Memory Systems

    NASA Technical Reports Server (NTRS)

    Biswas, Rupak; Oliker, Leonid; Sohn, Andrew

    1996-01-01

    Dynamic mesh adaptation on unstructured grids is a powerful tool for efficiently computing unsteady problems to resolve solution features of interest. Unfortunately, this causes load imbalances among processors on a parallel machine. This paper describes the parallel implementation of a tetrahedral mesh adaption scheme and a new global load balancing method. A heuristic remapping algorithm is presented that assigns partitions to processors such that the redistribution cost is minimized. Results indicate that the parallel performance of the mesh adaption code depends on the nature of the adaption region and show a 35.5X speedup on 64 processors of an SP2 when 35 percent of the mesh is randomly adapted. For large-scale scientific computations, our load balancing strategy gives an almost sixfold reduction in solver execution times over non-balanced loads. Furthermore, our heuristic remapper yields processor assignments that are within 3 percent of the optimal solutions, but requires only 1 percent of the computational time.

  16. Cat Swarm Optimization algorithm for optimal linear phase FIR filter design.

    PubMed

    Saha, Suman Kumar; Ghoshal, Sakti Prasad; Kar, Rajib; Mandal, Durbadal

    2013-11-01

    In this paper a new meta-heuristic search method, called Cat Swarm Optimization (CSO), is applied to determine the best optimal impulse response coefficients of FIR low-pass, high-pass, band-pass and band-stop filters, trying to meet the respective ideal frequency response characteristics. CSO was devised by observing the behaviour of cats and is composed of two sub-models. In CSO, one can decide how many cats are used in each iteration. Every cat has its own position composed of M dimensions, velocities for each dimension, a fitness value which represents the accommodation of the cat to the fitness function, and a flag to identify whether the cat is in seeking mode or tracing mode. The final solution is the best position of one of the cats, and CSO keeps the best solution found until the end of the iterations. The results of the proposed CSO-based approach have been compared to those of other well-known optimization methods such as the Real Coded Genetic Algorithm (RGA), standard Particle Swarm Optimization (PSO) and Differential Evolution (DE). The CSO-based results confirm the superiority of the proposed CSO for solving FIR filter design problems. The performance of the CSO-designed FIR filters has proven to be superior to that of filters obtained by RGA, conventional PSO and DE. The simulation results also demonstrate that CSO is the best optimizer among the compared techniques, not only in convergence speed but also in the optimal performance of the designed filters. Copyright © 2013 ISA. Published by Elsevier Ltd. All rights reserved.
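
    The two sub-models can be sketched generically as follows; the FIR-design fitness is replaced by a test function, and all parameter values are illustrative assumptions rather than the paper's settings.

    ```python
    import random

    def cso(f, dim, bounds, n_cats=20, mr=0.3, smp=5, srd=0.2, c1=2.0, w=0.7,
            iters=100):
        # Each cat is flagged into seeking mode (local perturbed copies,
        # best copy kept) or tracing mode (PSO-like move toward the
        # global best). Minimizes f.
        lo, hi = bounds
        clip = lambda v: min(hi, max(lo, v))
        cats = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_cats)]
        vel = [[0.0] * dim for _ in range(n_cats)]
        best = min(cats, key=f)[:]
        for _ in range(iters):
            for i in range(n_cats):
                if random.random() > mr:
                    # Seeking mode: make smp perturbed copies, keep the best.
                    copies = [[clip(x * (1 + random.uniform(-srd, srd)))
                               for x in cats[i]] for _ in range(smp)]
                    cats[i] = min(copies + [cats[i]], key=f)
                else:
                    # Tracing mode: velocity update toward the global best.
                    for d in range(dim):
                        vel[i][d] = (w * vel[i][d]
                                     + c1 * random.random() * (best[d] - cats[i][d]))
                        cats[i][d] = clip(cats[i][d] + vel[i][d])
                if f(cats[i]) < f(best):
                    best = cats[i][:]
        return best, f(best)

    sphere = lambda x: sum(v * v for v in x)
    print(cso(sphere, dim=4, bounds=(-1.0, 1.0)))
    ```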

  17. Approaches to eliminate waste and reduce cost for recycling glass.

    PubMed

    Chao, Chien-Wen; Liao, Ching-Jong

    2011-12-01

    In recent years, the issue of environmental protection has received considerable attention. This paper adds to the literature by investigating a scheduling problem in a glass recycling factory in Taiwan. The objective is to minimize the sum of the total holding cost and loss cost. We first represent the problem as an integer programming (IP) model, and then develop two heuristics based on the IP model to find near-optimal solutions for the problem. To validate the proposed heuristics, comparisons between optimal solutions from the IP model and solutions from the current method are conducted. The comparisons involve two problem sizes, small and large, where the small problems range from 15 to 45 jobs, and the large problems from 50 to 100 jobs. Finally, a genetic algorithm is applied to evaluate the proposed heuristics. Computational experiments show that the proposed heuristics can find good solutions in a reasonable time for the considered problem. Copyright © 2011 Elsevier Ltd. All rights reserved.

  18. Mathematical programming formulations for satellite synthesis

    NASA Technical Reports Server (NTRS)

    Bhasin, Puneet; Reilly, Charles H.

    1987-01-01

    The problem of satellite synthesis can be described as optimally allotting locations, and sometimes frequencies and polarizations, to communication satellites so that interference from unwanted satellite signals does not exceed a specified threshold. In this report, mathematical programming models and optimization methods are used to solve satellite synthesis problems. A nonlinear programming formulation, which is solved using Zoutendijk's method and a gradient search method, is described. Nine mixed integer programming models are considered. Results of computer runs with these nine models and five geographically compatible scenarios are presented and evaluated. A heuristic solution procedure is also used to solve two of the models studied. Heuristic solutions to three large synthesis problems are presented. The results of our analysis show that the heuristic performs very well, both in terms of solution quality and solution time, on the two models to which it was applied. It is concluded that the heuristic procedure is the best of the methods considered for solving satellite synthesis problems.

  19. Visualization for Hyper-Heuristics: Back-End Processing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simon, Luke

    Modern society is faced with increasingly complex problems, many of which can be formulated as generate-and-test optimization problems. Yet, general-purpose optimization algorithms may sometimes require too much computational time. In these instances, hyper-heuristics may be used. Hyper-heuristics automate the design of algorithms to create a custom algorithm for a particular scenario, finding the solution significantly faster than its predecessor. However, it may be difficult to understand exactly how a design was derived and why it should be trusted. This project aims to address these issues by creating an easy-to-use graphical user interface (GUI) for hyper-heuristics and an easy-to-understand scientific visualization for the produced solutions. To support the development of this GUI, my portion of the research involved developing algorithms that would allow for parsing of the data produced by the hyper-heuristics. This data would then be sent to the front-end, where it would be displayed to the end user.

  20. Optimal tracking control for a class of nonlinear discrete-time systems with time delays based on heuristic dynamic programming.

    PubMed

    Zhang, Huaguang; Song, Ruizhuo; Wei, Qinglai; Zhang, Tieyan

    2011-12-01

    In this paper, a novel heuristic dynamic programming (HDP) iteration algorithm is proposed to solve the optimal tracking control problem for a class of nonlinear discrete-time systems with time delays. The novel algorithm contains state updating, control policy iteration, and performance index iteration; to obtain the optimal states, the states are also updated, with a "backward iteration" applied to the state updating. Two neural networks are used to approximate the performance index function and compute the optimal control policy, facilitating the implementation of the HDP iteration algorithm. Finally, we present two examples to demonstrate the effectiveness of the proposed HDP iteration algorithm.

  1. Derived heuristics-based consistent optimization of material flow in a gold processing plant

    NASA Astrophysics Data System (ADS)

    Myburgh, Christie; Deb, Kalyanmoy

    2018-01-01

    Material flow in a chemical processing plant often follows complicated control laws and involves plant capacity constraints. Importantly, the process involves discrete scenarios which, when modelled in a programming format, involve if-then-else statements. A formulation of an optimization problem for such processes therefore becomes complicated, with nonlinear and non-differentiable objective and constraint functions. In handling such problems using classical point-based approaches, users often have to resort to modifications and indirect ways of representing the problem to suit the restrictions associated with classical methods. In a particular gold processing plant optimization problem, these facts are demonstrated by showing results from MATLAB®'s well-known fmincon routine. Thereafter, a customized evolutionary optimization procedure capable of handling all the complexities of the problem is developed. While the evolutionary approach already produced results with comparatively low variance over multiple runs, its performance has been further enhanced by introducing derived heuristics associated with the problem. In this article, the development and usage of derived heuristics in a practical problem are presented and their importance for quick convergence of the overall algorithm is demonstrated.

  2. New optimization model for routing and spectrum assignment with nodes insecurity

    NASA Astrophysics Data System (ADS)

    Xuan, Hejun; Wang, Yuping; Xu, Zhanqi; Hao, Shanshan; Wang, Xiaoli

    2017-04-01

    By adopting orthogonal frequency division multiplexing technology, elastic optical networks can provide flexible and variable bandwidth allocation to each connection request and achieve higher spectrum utilization. The routing and spectrum assignment problem in elastic optical networks is a well-known NP-hard problem. In addition, information security has received worldwide attention. We combine these two problems to investigate the routing and spectrum assignment problem with guaranteed security in elastic optical networks, and establish a new optimization model that minimizes the maximum index of the used frequency slots and is used to determine an optimal routing and spectrum assignment scheme. To solve the model effectively, a hybrid genetic algorithm framework integrating a heuristic algorithm into a genetic algorithm is proposed. The heuristic algorithm is first used to sort the connection requests, and then the genetic algorithm is designed to look for an optimal routing and spectrum assignment scheme. In the genetic algorithm, tailor-made crossover, mutation and local search operators are designed. Moreover, simulation experiments are conducted with three heuristic strategies, and the experimental results indicate the effectiveness of the proposed model and algorithm framework.

  3. Heuristics for Multiobjective Optimization of Two-Sided Assembly Line Systems

    PubMed Central

    Jawahar, N.; Ponnambalam, S. G.; Sivakumar, K.; Thangadurai, V.

    2014-01-01

    Products such as cars, trucks, and heavy machinery are assembled on two-sided assembly lines. Assembly line balancing has significant impacts on the performance and productivity of flow line manufacturing systems and has been an active research area for several decades. This paper addresses the line balancing problem of a two-sided assembly line (TALBP) in which the tasks are to be assigned at the L side, the R side, or either side (addressed as E). Two objectives, minimum number of workstations and minimum unbalance time among workstations, have been considered for balancing the assembly line. There are two approaches to solve a multiobjective optimization problem: the first combines all the objectives into a single composite function or moves all but one objective to the constraint set; the second determines the Pareto optimal solution set. This paper proposes two heuristics to evolve an optimal Pareto front for the TALBP under consideration: an Enumerative Heuristic Algorithm (EHA) to handle problems of small and medium size and a Simulated Annealing Algorithm (SAA) for large-sized problems. The proposed approaches are illustrated with example problems and their performances are compared with a set of test problems. PMID:24790568
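
    Whatever heuristic generates candidate line balances, the second approach mentioned above needs Pareto-front bookkeeping: keeping only non-dominated (workstations, unbalance time) pairs, both minimized. Below is a minimal sketch of that filter; the candidate list is illustrative.

        # Sketch: non-dominated filtering for two minimized objectives.
        def dominates(a, b):
            """a dominates b if no worse in both objectives and not equal."""
            return a[0] <= b[0] and a[1] <= b[1] and a != b

        def pareto_front(candidates):
            return [c for c in candidates
                    if not any(dominates(o, c) for o in candidates)]

        # (number of workstations, unbalance time) for candidate balances
        solutions = [(5, 12.0), (5, 9.5), (6, 7.0), (6, 8.0), (7, 6.5)]
        print(pareto_front(solutions))   # -> [(5, 9.5), (6, 7.0), (7, 6.5)]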

  4. OPTIMIZING THROUGH CO-EVOLUTIONARY AVALANCHES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    S. BOETTCHER; A. PERCUS

    2000-08-01

    We explore a new general-purpose heuristic for finding high-quality solutions to hard optimization problems. The method, called extremal optimization, is inspired by "self-organized criticality," a concept introduced to describe emergent complexity in many physical systems. In contrast to Genetic Algorithms, which operate on an entire "gene pool" of possible solutions, extremal optimization successively replaces extremely undesirable elements of a sub-optimal solution with new, random ones. Large fluctuations, called "avalanches," ensue that efficiently explore many local optima. Drawing upon models used to simulate far-from-equilibrium dynamics, extremal optimization complements approximation methods inspired by equilibrium statistical physics, such as simulated annealing. With only one adjustable parameter, its performance has proved competitive with more elaborate methods, especially near phase transitions. Those phase transitions are found in the parameter space of most optimization problems, and have recently been conjectured to be the origin of some of the hardest instances in computational complexity. We will demonstrate how extremal optimization can be implemented for a variety of combinatorial optimization problems. We believe that extremal optimization will be a useful tool in the investigation of phase transitions in combinatorial optimization problems, hence valuable in elucidating the origin of computational complexity.
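
    A minimal sketch of the tau-EO variant on a toy MAX-CUT instance follows: rank vertices by a local fitness (fraction of incident edges cut), pick one to flip with power-law probability over ranks, and remember the best cut seen. The graph and the value of tau, extremal optimization's single adjustable parameter, are illustrative choices, not from the paper.

        # Sketch: tau-EO on a 6-vertex MAX-CUT instance.
        import random

        edges = [(0, 1), (0, 2), (1, 2), (1, 3), (2, 4), (3, 4), (3, 5), (4, 5)]
        n, tau = 6, 1.4

        def cut_size(s):
            return sum(s[u] != s[v] for u, v in edges)

        def local_fitness(s, v):
            inc = [e for e in edges if v in e]
            return sum(s[a] != s[b] for a, b in inc) / len(inc)

        s = [random.randint(0, 1) for _ in range(n)]
        best, best_cut = s[:], cut_size(s)
        for _ in range(2000):
            ranks = sorted(range(n), key=lambda v: local_fitness(s, v))  # worst first
            # power-law rank selection: P(rank k) ~ k^(-tau), k = 1..n
            weights = [(k + 1) ** -tau for k in range(n)]
            v = random.choices(ranks, weights=weights)[0]
            s[v] ^= 1                                # unconditionally flip it
            if cut_size(s) > best_cut:
                best, best_cut = s[:], cut_size(s)
        print("best cut:", best_cut, best)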

  5. A novel harmony search-K means hybrid algorithm for clustering gene expression data

    PubMed Central

    Nazeer, KA Abdul; Sebastian, MP; Kumar, SD Madhu

    2013-01-01

    Recent progress in bioinformatics research has led to the accumulation of huge quantities of biological data at various data sources. The DNA microarray technology makes it possible to simultaneously analyze a large number of genes across different samples. Clustering of microarray data can reveal the hidden gene expression patterns in large quantities of expression data, which in turn offers tremendous possibilities in functional genomics, comparative genomics, disease diagnosis and drug development. The k-means clustering algorithm is widely used for many practical applications, but the original k-means algorithm has several drawbacks: it is computationally expensive and generates locally optimal solutions based on the random choice of the initial centroids. Several methods have been proposed in the literature for improving the performance of the k-means algorithm. A meta-heuristic optimization algorithm named harmony search helps find near-global optimal solutions by searching the entire solution space. Low clustering accuracy of the existing algorithms limits their use in many crucial applications of life sciences. In this paper we propose a novel Harmony Search-K means Hybrid (HSKH) algorithm for clustering the gene expression data. Experimental results show that the proposed algorithm produces clusters with better accuracy in comparison with the existing algorithms. PMID:23390351
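
    In the spirit of the hybrid described above, the sketch below uses harmony search to choose initial centroids that minimize the within-cluster sum of squares, then refines the best harmony with standard k-means. The HS parameters (HMS, HMCR, PAR) and the toy 2-D data are illustrative assumptions, not the paper's settings.

        # Sketch: harmony-search-seeded k-means (HSKH-style).
        import numpy as np

        rng = np.random.default_rng(0)
        X = np.vstack([rng.normal(0, .5, (30, 2)), rng.normal(4, .5, (30, 2))])
        k, HMS, HMCR, PAR, iters = 2, 10, 0.9, 0.3, 300

        def wcss(C):                      # within-cluster sum of squares
            d = ((X[:, None, :] - C[None, :, :]) ** 2).sum(-1)
            return d.min(1).sum()

        lo, hi = X.min(0), X.max(0)
        memory = [rng.uniform(lo, hi, (k, 2)) for _ in range(HMS)]
        for _ in range(iters):
            new = np.empty((k, 2))
            for i in range(k):
                if rng.random() < HMCR:                  # draw from memory
                    new[i] = memory[rng.integers(HMS)][i]
                    if rng.random() < PAR:               # pitch adjustment
                        new[i] += rng.normal(0, 0.1, 2)
                else:                                    # random consideration
                    new[i] = rng.uniform(lo, hi)
            worst = max(range(HMS), key=lambda j: wcss(memory[j]))
            if wcss(new) < wcss(memory[worst]):
                memory[worst] = new

        C = min(memory, key=wcss)                        # best harmony
        for _ in range(20):                              # k-means refinement
            labels = ((X[:, None, :] - C[None, :, :]) ** 2).sum(-1).argmin(1)
            C = np.array([X[labels == i].mean(0) if np.any(labels == i) else C[i]
                          for i in range(k)])
        print("final WCSS:", wcss(C))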

  6. A novel harmony search-K means hybrid algorithm for clustering gene expression data.

    PubMed

    Nazeer, Ka Abdul; Sebastian, Mp; Kumar, Sd Madhu

    2013-01-01

    Recent progress in bioinformatics research has led to the accumulation of huge quantities of biological data at various data sources. The DNA microarray technology makes it possible to simultaneously analyze a large number of genes across different samples. Clustering of microarray data can reveal the hidden gene expression patterns in large quantities of expression data, which in turn offers tremendous possibilities in functional genomics, comparative genomics, disease diagnosis and drug development. The k-means clustering algorithm is widely used for many practical applications, but the original k-means algorithm has several drawbacks: it is computationally expensive and generates locally optimal solutions based on the random choice of the initial centroids. Several methods have been proposed in the literature for improving the performance of the k-means algorithm. A meta-heuristic optimization algorithm named harmony search helps find near-global optimal solutions by searching the entire solution space. Low clustering accuracy of the existing algorithms limits their use in many crucial applications of life sciences. In this paper we propose a novel Harmony Search-K means Hybrid (HSKH) algorithm for clustering the gene expression data. Experimental results show that the proposed algorithm produces clusters with better accuracy in comparison with the existing algorithms.

  7. Theoretical Analysis of Local Search and Simple Evolutionary Algorithms for the Generalized Travelling Salesperson Problem.

    PubMed

    Pourhassan, Mojgan; Neumann, Frank

    2018-06-22

    The generalized travelling salesperson problem is an important NP-hard combinatorial optimization problem for which meta-heuristics, such as local search and evolutionary algorithms, have been used very successfully. Two hierarchical approaches with different neighbourhood structures, namely a Cluster-Based approach and a Node-Based approach, have been proposed by Hu and Raidl (2008) for solving this problem. In this paper, local search algorithms and simple evolutionary algorithms based on these approaches are investigated from a theoretical perspective. For local search algorithms, we point out the complementary abilities of the two approaches by presenting instances where they mutually outperform each other. Afterwards, we introduce an instance which is hard for both approaches when initialized on a particular point of the search space, but where a variable neighbourhood search combining them finds the optimal solution in polynomial time. Then we turn our attention to analysing the behaviour of simple evolutionary algorithms that use these approaches. We show that the Node-Based approach solves the hard instance of the Cluster-Based approach presented in Corus et al. (2016) in polynomial time. Furthermore, we prove an exponential lower bound on the optimization time of the Node-Based approach for a class of Euclidean instances.

  8. Meta-Heuristics in Short Scale Construction: Ant Colony Optimization and Genetic Algorithm.

    PubMed

    Schroeders, Ulrich; Wilhelm, Oliver; Olaru, Gabriel

    2016-01-01

    The advent of large-scale assessment, but also the more frequent use of longitudinal and multivariate approaches to measurement in psychological, educational, and sociological research, caused an increased demand for psychometrically sound short scales. Shortening scales economizes on valuable administration time, but might result in inadequate measures because reducing an item set could: a) change the internal structure of the measure, b) result in poorer reliability and measurement precision, c) deliver measures that cannot effectively discriminate between persons on the intended ability spectrum, and d) reduce test-criterion relations. Different approaches to abbreviate measures fare differently with respect to the above-mentioned problems. Therefore, we compare the quality and efficiency of three item selection strategies to derive short scales from an existing long version: a Stepwise COnfirmatory Factor Analytical approach (SCOFA) that maximizes factor loadings and two metaheuristics, specifically an Ant Colony Optimization (ACO) with a tailored user-defined optimization function and a Genetic Algorithm (GA) with an unspecific cost-reduction function. SCOFA compiled short versions were highly reliable, but had poor validity. In contrast, both metaheuristics outperformed SCOFA and produced efficient and psychometrically sound short versions (unidimensional, reliable, sensitive, and valid). We discuss under which circumstances ACO and GA produce equivalent results and provide recommendations for conditions in which it is advisable to use a metaheuristic with an unspecific out-of-the-box optimization function.

  9. Meta-Heuristics in Short Scale Construction: Ant Colony Optimization and Genetic Algorithm

    PubMed Central

    Schroeders, Ulrich; Wilhelm, Oliver; Olaru, Gabriel

    2016-01-01

    The advent of large-scale assessment, but also the more frequent use of longitudinal and multivariate approaches to measurement in psychological, educational, and sociological research, caused an increased demand for psychometrically sound short scales. Shortening scales economizes on valuable administration time, but might result in inadequate measures because reducing an item set could: a) change the internal structure of the measure, b) result in poorer reliability and measurement precision, c) deliver measures that cannot effectively discriminate between persons on the intended ability spectrum, and d) reduce test-criterion relations. Different approaches to abbreviate measures fare differently with respect to the above-mentioned problems. Therefore, we compare the quality and efficiency of three item selection strategies to derive short scales from an existing long version: a Stepwise COnfirmatory Factor Analytical approach (SCOFA) that maximizes factor loadings and two metaheuristics, specifically an Ant Colony Optimization (ACO) with a tailored user-defined optimization function and a Genetic Algorithm (GA) with an unspecific cost-reduction function. SCOFA compiled short versions were highly reliable, but had poor validity. In contrast, both metaheuristics outperformed SCOFA and produced efficient and psychometrically sound short versions (unidimensional, reliable, sensitive, and valid). We discuss under which circumstances ACO and GA produce equivalent results and provide recommendations for conditions in which it is advisable to use a metaheuristic with an unspecific out-of-the-box optimization function. PMID:27893845

  10. Influence maximization based on partial network structure information: A comparative analysis on seed selection heuristics

    NASA Astrophysics Data System (ADS)

    Erkol, Şirag; Yücel, Gönenç

    In this study, the problem of seed selection is investigated. This problem is mainly treated as an optimization problem, which is proved to be NP-hard. There are several heuristic approaches in the literature, most of which use algorithmic heuristics. These approaches mainly focus on the trade-off between computational complexity and accuracy. Although the accuracy of algorithmic heuristics is high, they also have high computational complexity. Furthermore, the literature generally assumes that complete information on the structure and features of a network is available, which is frequently not the case. For the study, a simulation model is constructed, which is capable of creating networks, performing seed selection heuristics, and simulating diffusion models. Novel metric-based seed selection heuristics that rely only on partial information are proposed and tested using the simulation model. These heuristics use local information available from nodes in the synthetically created networks. The performances of the heuristics are comparatively analyzed on three different network types. The results clearly show that the performance of a heuristic depends on the structure of a network; a heuristic to be used should be selected after investigating the properties of the network at hand. More importantly, the partial-information approach provided promising results: in certain cases, selection heuristics that rely only on partial network information perform very close to similar heuristics that require complete network data.
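
    A minimal sketch of the partial-information idea: rank nodes by their locally observable degree and estimate spread by Monte Carlo simulation of an independent-cascade diffusion model. The toy graph, propagation probability, and seed count are illustrative; the paper's metrics and networks differ.

        # Sketch: degree-based seed selection + independent-cascade simulation.
        import random

        adj = {0: [1, 2, 3], 1: [0, 2], 2: [0, 1, 4], 3: [0, 4, 5],
               4: [2, 3, 5], 5: [3, 4]}
        p, n_seeds, trials = 0.2, 2, 2000

        def cascade(seeds):
            active, frontier = set(seeds), list(seeds)
            while frontier:
                nxt = []
                for u in frontier:
                    for v in adj[u]:
                        if v not in active and random.random() < p:
                            active.add(v)
                            nxt.append(v)
                frontier = nxt
            return len(active)

        # The degree heuristic needs only each node's local neighbourhood size.
        seeds = sorted(adj, key=lambda v: len(adj[v]), reverse=True)[:n_seeds]
        spread = sum(cascade(seeds) for _ in range(trials)) / trials
        print("degree-based seeds:", seeds, "expected spread:", round(spread, 2))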

  11. Near-Optimal Tracking Control of Mobile Robots Via Receding-Horizon Dual Heuristic Programming.

    PubMed

    Lian, Chuanqiang; Xu, Xin; Chen, Hong; He, Haibo

    2016-11-01

    Trajectory tracking control of wheeled mobile robots (WMRs) has been an important research topic in control theory and robotics. Although various tracking control methods with stability have been developed for WMRs, it is still difficult to design optimal or near-optimal tracking controller under uncertainties and disturbances. In this paper, a near-optimal tracking control method is presented for WMRs based on receding-horizon dual heuristic programming (RHDHP). In the proposed method, a backstepping kinematic controller is designed to generate desired velocity profiles and the receding horizon strategy is used to decompose the infinite-horizon optimal control problem into a series of finite-horizon optimal control problems. In each horizon, a closed-loop tracking control policy is successively updated using a class of approximate dynamic programming algorithms called finite-horizon dual heuristic programming (DHP). The convergence property of the proposed method is analyzed and it is shown that the tracking control system based on RHDHP is asymptotically stable by using the Lyapunov approach. Simulation results on three tracking control problems demonstrate that the proposed method has improved control performance when compared with conventional model predictive control (MPC) and DHP. It is also illustrated that the proposed method has lower computational burden than conventional MPC, which is very beneficial for real-time tracking control.

  12. Heuristics for the inversion median problem

    PubMed Central

    2010-01-01

    Background The study of genome rearrangements has become a mainstay of phylogenetics and comparative genomics. Fundamental in such a study is the median problem: given three genomes find a fourth that minimizes the sum of the evolutionary distances between itself and the given three. Many exact algorithms and heuristics have been developed for the inversion median problem, of which the best known is MGR. Results We present a unifying framework for median heuristics, which enables us to clarify existing strategies and to place them in a partial ordering. Analysis of this framework leads to a new insight: the best strategies continue to refer to the input data rather than reducing the problem to smaller instances. Using this insight, we develop a new heuristic for inversion medians that uses input data to the end of its computation and leverages our previous work with DCJ medians. Finally, we present the results of extensive experimentation showing that our new heuristic outperforms all others in accuracy and, especially, in running time: the heuristic typically returns solutions within 1% of optimal and runs in seconds to minutes even on genomes with 25'000 genes--in contrast, MGR can take days on instances of 200 genes and cannot be used beyond 1'000 genes. Conclusion Finding good rearrangement medians, in particular inversion medians, had long been regarded as the computational bottleneck in whole-genome studies. Our new heuristic for inversion medians, ASM, which dominates all others in our framework, puts that issue to rest by providing near-optimal solutions within seconds to minutes on even the largest genomes. PMID:20122203

  13. A Heuristics Approach for Classroom Scheduling Using Genetic Algorithm Technique

    NASA Astrophysics Data System (ADS)

    Ahmad, Izah R.; Sufahani, Suliadi; Ali, Maselan; Razali, Siti N. A. M.

    2018-04-01

    Reshuffling and arranging classrooms based on audience capacity, available facilities, lecturing time and many other factors can make classroom scheduling highly complex. To enhance productivity in classroom planning, this paper proposes a heuristic approach for timetabling optimization. A new algorithm was produced to handle the timetabling problem in a university. The proposed heuristic approach leads to better utilization of the available classroom space for a given timetable of courses at the university. A Genetic Algorithm implemented in the Java programming language was used in this study, with the aim of reducing conflicts and optimizing fitness. The algorithm considers the number of students in each class, class time, class size, time availability of each class, and the lecturer in charge of the classes.
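
    A hedged sketch of the GA encoding such a timetabler might use: each gene assigns a class to a (room, slot) pair, and the fitness penalizes room clashes and capacity violations. Class sizes, room capacities, and GA settings below are invented for illustration.

        # Sketch: GA over class -> (room, slot) assignments for timetabling.
        import random

        class_sizes = [30, 45, 25, 60, 35]
        room_caps = [40, 50, 70]
        n_slots = 4

        def fitness(chrom):
            penalty, used = 0, set()
            for size, (room, slot) in zip(class_sizes, chrom):
                if (room, slot) in used:        # two classes clash in room+slot
                    penalty += 10
                used.add((room, slot))
                if size > room_caps[room]:      # audience exceeds capacity
                    penalty += 5
            return -penalty                     # 0 means conflict-free

        def random_chrom():
            return [(random.randrange(len(room_caps)), random.randrange(n_slots))
                    for _ in class_sizes]

        pop = [random_chrom() for _ in range(50)]
        for _ in range(300):
            pop.sort(key=fitness, reverse=True)
            pop = pop[:25]                      # elitist survivor selection
            while len(pop) < 50:
                a, b = random.sample(pop[:25], 2)
                cut = random.randrange(1, len(class_sizes))
                child = a[:cut] + b[cut:]       # one-point crossover
                if random.random() < 0.2:       # mutation: reassign one class
                    child[random.randrange(len(child))] = (
                        random.randrange(len(room_caps)),
                        random.randrange(n_slots))
                pop.append(child)
        print("best penalty:", -fitness(max(pop, key=fitness)))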

  14. Multiple quay cranes scheduling for double cycling in container terminals

    PubMed Central

    Chu, Yanling; Zhang, Xiaoju; Yang, Zhongzhen

    2017-01-01

    Double cycling is an efficient tool to increase the efficiency of quay crane (QC) in container terminals. In this paper, an optimization model for double cycling is developed to optimize the operation sequence of multiple QCs. The objective is to minimize the makespan of the ship handling operation considering the ship balance constraint. To solve the model, an algorithm based on Lagrangian relaxation is designed. Finally, we compare the efficiency of the Lagrangian relaxation based heuristic with the branch-and-bound method and a genetic algorithm using instances of different sizes. The results of numerical experiments indicate that the proposed model can effectively reduce the unloading and loading times of QCs. The effects of the ship balance constraint are more notable when the number of QCs is high. PMID:28692699

  15. Multiple quay cranes scheduling for double cycling in container terminals.

    PubMed

    Chu, Yanling; Zhang, Xiaoju; Yang, Zhongzhen

    2017-01-01

    Double cycling is an efficient tool to increase the efficiency of quay crane (QC) in container terminals. In this paper, an optimization model for double cycling is developed to optimize the operation sequence of multiple QCs. The objective is to minimize the makespan of the ship handling operation considering the ship balance constraint. To solve the model, an algorithm based on Lagrangian relaxation is designed. Finally, we compare the efficiency of the Lagrangian relaxation based heuristic with the branch-and-bound method and a genetic algorithm using instances of different sizes. The results of numerical experiments indicate that the proposed model can effectively reduce the unloading and loading times of QCs. The effects of the ship balance constraint are more notable when the number of QCs is high.
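
    To illustrate the Lagrangian-relaxation pattern the two records above rely on, here is a hedged, generic sketch: dualize a coupling constraint so the problem splits into per-crane subproblems, then update the multiplier by subgradient ascent. The per-crane options and demand D are invented toy data, not the paper's crane-scheduling model.

        # Sketch: Lagrangian relaxation with a subgradient multiplier update.
        options = {1: [(2, 3.0), (4, 7.0), (6, 12.0)],   # (containers, cost)
                   2: [(2, 2.0), (4, 6.0), (6, 13.0)]}
        D = 8                                   # coupling: x1 + x2 >= D
        lam, step = 0.0, 0.5

        for it in range(100):
            # Decomposed subproblems: each crane minimizes cost - lam * load.
            choice = {i: min(opts, key=lambda o: o[1] - lam * o[0])
                      for i, opts in options.items()}
            total = sum(c[0] for c in choice.values())
            # Subgradient step on the dualized constraint, diminishing stepsize.
            lam = max(0.0, lam + step / (1 + it) * (D - total))

        choice = {i: min(opts, key=lambda o: o[1] - lam * o[0])
                  for i, opts in options.items()}
        print("multiplier:", round(lam, 3), "chosen loads:", choice)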

  16. Beyond Decision Making: Cultural Ideology as Heuristic Paradigmatic Models.

    ERIC Educational Resources Information Center

    Whitley, L. Darrell

    A paradigmatic model of cultural ideology provides a context for understanding the relationship between decision-making and personal and cultural rationality. Cultural rules or heuristics exist which indicate that many decisions can be made on the basis of established strategy rather than continual analytical calculations. When an optimal solution…

  17. Learning Biology by Designing

    ERIC Educational Resources Information Center

    Janssen, Fred; Waarlo, Arend Jan

    2010-01-01

    According to a century-old tradition in biological thinking, organisms can be considered as being optimally designed. In modern biology this idea still has great heuristic value. In evolutionary biology a so-called design heuristic has been formulated which provides guidance to researchers in the generation of knowledge about biological systems.…

  18. Multi-Criteria Optimization of the Deployment of a Grid for Rural Electrification Based on a Heuristic Method

    NASA Astrophysics Data System (ADS)

    Ortiz-Matos, L.; Aguila-Tellez, A.; Hincapié-Reyes, R. C.; González-Sanchez, J. W.

    2017-07-01

    To design electrification systems, recent mathematical models solve the problem of locating and selecting electrification components and designing possible distribution microgrids. However, as the number of points to be electrified increases, solving these models requires high computational times, making them impractical. This study proposes a new heuristic method for the electrification of rural areas to address this problem. The heuristic algorithm handles the deployment of rural electrification microgrids by finding routes for the optimal placement of lines and transformers in transmission and distribution microgrids. The challenge is to obtain a deployment with equitable losses, considering the capacity constraints of the devices and the topology of the terrain, at minimal economic cost. An optimal scenario ensures the electrification of all neighbourhoods at a minimum investment cost in terms of the distance between electric conductors and the number of transformation devices.

  19. Program Model Checking: A Practitioner's Guide

    NASA Technical Reports Server (NTRS)

    Pressburger, Thomas T.; Mansouri-Samani, Masoud; Mehlitz, Peter C.; Pasareanu, Corina S.; Markosian, Lawrence Z.; Penix, John J.; Brat, Guillaume P.; Visser, Willem C.

    2008-01-01

    Program model checking is a verification technology that uses state-space exploration to evaluate large numbers of potential program executions. Program model checking provides improved coverage over testing by systematically evaluating all possible test inputs and all possible interleavings of threads in a multithreaded system. Model-checking algorithms use several classes of optimizations to reduce the time and memory requirements for analysis, as well as heuristics for meaningful analysis of partial areas of the state space. Our goal in this guidebook is to assemble, distill, and demonstrate emerging best practices for applying program model checking. We offer it as a starting point and introduction for those who want to apply model checking to software verification and validation. The guidebook will not discuss any specific tool in great detail, but we provide references for specific tools.

  20. Hybrid glowworm swarm optimization for task scheduling in the cloud environment

    NASA Astrophysics Data System (ADS)

    Zhou, Jing; Dong, Shoubin

    2018-06-01

    In recent years, many heuristic algorithms have been proposed to solve task scheduling problems in the cloud environment owing to their optimization capability. This article proposes a hybrid glowworm swarm optimization (HGSO) based on glowworm swarm optimization (GSO), which combines evolutionary computation, a quantum-behaviour strategy based on the principle of neighbourhood, offspring production and random walk to achieve more efficient scheduling at reasonable scheduling costs. The proposed HGSO reduces the redundant computation and the dependence on the initialization of GSO, accelerates the convergence and escapes more easily from local optima. The conducted experiments and statistical analysis showed that in most cases the proposed HGSO algorithm outperformed previous heuristic algorithms in dealing with independent tasks.
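
    For orientation, the sketch below shows basic GSO (luciferin update, probabilistic movement toward a brighter neighbour, adaptive decision range), the base algorithm that HGSO extends; the constants follow commonly cited GSO defaults and the 2-D peak objective is illustrative, not a cloud-scheduling objective.

        # Sketch: basic glowworm swarm optimization on a 2-D maximization task.
        import math, random

        def J(x, y):                            # objective to maximize
            return math.exp(-(x - 1) ** 2 - (y - 2) ** 2)

        N, rho, gamma, beta, s = 30, 0.4, 0.6, 0.08, 0.03
        r0, rmax, nt = 1.0, 3.0, 5
        pos = [[random.uniform(-3, 5), random.uniform(-3, 5)] for _ in range(N)]
        luc = [5.0] * N                          # initial luciferin
        rng_i = [r0] * N                         # per-worm decision range

        for _ in range(200):
            luc = [(1 - rho) * l + gamma * J(*p) for l, p in zip(luc, pos)]
            for i in range(N):
                nbrs = [j for j in range(N) if j != i and luc[j] > luc[i]
                        and math.dist(pos[i], pos[j]) < rng_i[i]]
                if nbrs:
                    w = [luc[j] - luc[i] for j in nbrs]
                    j = random.choices(nbrs, weights=w)[0]
                    d = math.dist(pos[i], pos[j])
                    if d > 0:                    # step toward the brighter worm
                        pos[i] = [pi + s * (pj - pi) / d
                                  for pi, pj in zip(pos[i], pos[j])]
                rng_i[i] = min(rmax, max(0.0, rng_i[i] + beta * (nt - len(nbrs))))

        best = max(pos, key=lambda p: J(*p))
        print("best position:", [round(v, 2) for v in best])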

  1. Smart strategies for doctors and doctors-in-training: heuristics in medicine.

    PubMed

    Wegwarth, Odette; Gaissmaier, Wolfgang; Gigerenzer, Gerd

    2009-08-01

    How do doctors make sound decisions when confronted with probabilistic data, time pressures and a heavy workload? One theory that has been embraced by many researchers is based on optimisation, which emphasises the need to integrate all information in order to arrive at sound decisions. This notion makes heuristics, which use less than complete information, appear as second-best strategies. In this article, we challenge this pessimistic view of heuristics. We introduce two medical problems that involve decision making to the reader: one concerns coronary care issues and the other macrolide prescriptions. In both settings, decision-making tools grounded in the principles of optimisation and heuristics, respectively, have been developed to assist doctors in making decisions. We explain the structure of each of these tools and compare their performance in terms of their facilitation of correct predictions. For decisions concerning both the coronary care unit and the prescribing of macrolides, we demonstrate that sacrificing information does not necessarily imply a forfeiting of predictive accuracy, but can sometimes even lead to better decisions. Subsequently, we discuss common misconceptions about heuristics and explain when and why ignoring parts of the available information can lead to the making of more robust predictions. Heuristics are neither good nor bad per se, but, if applied in situations to which they have been adapted, can be helpful companions for doctors and doctors-in-training. This, however, requires that heuristics in medicine be openly discussed, criticised, refined and then taught to doctors-in-training rather than being simply dismissed as harmful or irrelevant. A more uniform use of explicit and accepted heuristics has the potential to reduce variations in diagnoses and to improve medical care for patients.

  2. Heuristic Evaluation on Mobile Interfaces: A New Checklist

    PubMed Central

    Yáñez Gómez, Rosa; Cascado Caballero, Daniel; Sevillano, José-Luis

    2014-01-01

    The rapid evolution and adoption of mobile devices raise new usability challenges, given their limitations (in screen size, battery life, etc.) as well as the specific requirements of this new interaction. Traditional evaluation techniques need to be adapted in order for these requirements to be met. Heuristic evaluation (HE), an Inspection Method based on evaluation conducted by experts over a real system or prototype, is based on checklists which are desktop-centred and do not adequately detect mobile-specific usability issues. In this paper, we propose a compilation of heuristic evaluation checklists taken from the existing bibliography but readapted to new mobile interfaces. Selecting and rearranging these heuristic guidelines offer a tool which works well not just for evaluation but also as a best-practices checklist. The result is a comprehensive checklist which is experimentally evaluated as a design tool. This experimental evaluation involved two software engineers without any specific knowledge about usability, a group of ten users who compared the usability of a first prototype designed without our heuristics, and a second one after applying the proposed checklist. The results of this experiment show the usefulness of the proposed checklist for avoiding usability gaps even with nontrained developers. PMID:25295300

  3. Meta-Reasoning: Monitoring and Control of Thinking and Reasoning.

    PubMed

    Ackerman, Rakefet; Thompson, Valerie A

    2017-08-01

    Meta-Reasoning refers to the processes that monitor the progress of our reasoning and problem-solving activities and regulate the time and effort devoted to them. Monitoring processes are usually experienced as feelings of certainty or uncertainty about how well a process has, or will, unfold. These feelings are based on heuristic cues, which are not necessarily reliable. Nevertheless, we rely on these feelings of (un)certainty to regulate our mental effort. Most metacognitive research has focused on memorization and knowledge retrieval, with little attention paid to more complex processes, such as reasoning and problem solving. In that context, we recently developed a Meta-Reasoning framework, used here to review existing findings, consider their consequences, and frame questions for future research. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. Optimizing Multiple-Choice Tests as Learning Events

    ERIC Educational Resources Information Center

    Little, Jeri Lynn

    2011-01-01

    Although generally used for assessment, tests can also serve as tools for learning--but different test formats may not be equally beneficial. Specifically, research has shown multiple-choice tests to be less effective than cued-recall tests in improving the later retention of the tested information (e.g., see meta-analysis by Hamaker, 1986),…

  5. Sequence-based heuristics for faster annotation of non-coding RNA families.

    PubMed

    Weinberg, Zasha; Ruzzo, Walter L

    2006-01-01

    Non-coding RNAs (ncRNAs) are functional RNA molecules that do not code for proteins. Covariance Models (CMs) are a useful statistical tool to find new members of an ncRNA gene family in a large genome database, using both sequence and, importantly, RNA secondary structure information. Unfortunately, CM searches are extremely slow. Previously, we created rigorous filters, which provably sacrifice none of a CM's accuracy, while making searches significantly faster for virtually all ncRNA families. However, these rigorous filters make searches slower than heuristics could be. In this paper we introduce profile HMM-based heuristic filters. We show that their accuracy is usually superior to heuristics based on BLAST. Moreover, we compared our heuristics with those used in tRNAscan-SE, whose heuristics incorporate a significant amount of work specific to tRNAs, whereas our heuristics are generic to any ncRNA. Performance was roughly comparable, so we expect that our heuristics provide a high-quality solution that--unlike family-specific solutions--can scale to hundreds of ncRNA families. The source code is available under the GNU Public License at the supplementary web site.

  6. Teaching learning based optimization-functional link artificial neural network filter for mixed noise reduction from magnetic resonance image.

    PubMed

    Kumar, M; Mishra, S K

    2017-01-01

    Clinical magnetic resonance imaging (MRI) images may get corrupted by a mixture of different types of noise, such as Rician, Gaussian, and impulse noise. Most of the available filtering algorithms are noise specific, linear, and non-adaptive. There is a need to develop a nonlinear adaptive filter that adapts itself to the requirement and can be applied effectively to suppress mixed noise in different MRI images. In view of this, a novel nonlinear neural-network-based adaptive filter, i.e. a functional link artificial neural network (FLANN) whose weights are trained by a recently developed derivative-free meta-heuristic technique, i.e. teaching learning based optimization (TLBO), is proposed and implemented. The performance of the proposed filter is compared with five other adaptive filters and analyzed by considering quantitative metrics and evaluating a nonparametric statistical test. The convergence curve and computational time are also included for investigating the efficiency of the proposed as well as competitive filters. The simulation outcomes show that the proposed filter outperforms the other adaptive filters. The proposed filter can be hybridized with other evolutionary techniques and utilized for removing different noise and artifacts from other medical images more competently.
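
    The two TLBO update rules (teacher phase, learner phase) are simple enough to sketch in full; below they minimize a generic weight-fitting mean square error on synthetic data rather than training the paper's FLANN filter, so the data, dimensions, and iteration counts are illustrative.

        # Sketch: teaching-learning-based optimization (TLBO) minimizing an MSE.
        import numpy as np

        rng = np.random.default_rng(1)
        A = rng.normal(size=(50, 4))             # synthetic inputs
        w_true = np.array([0.5, -1.2, 2.0, 0.3])
        y = A @ w_true                           # targets

        def mse(w):
            return np.mean((A @ w - y) ** 2)

        pop = rng.uniform(-3, 3, (20, 4))
        for _ in range(100):
            # Teacher phase: move learners toward the best, away from the mean.
            best = pop[np.argmin([mse(w) for w in pop])]
            mean = pop.mean(0)
            TF = rng.integers(1, 3)              # teaching factor in {1, 2}
            for i in range(len(pop)):
                cand = pop[i] + rng.random(4) * (best - TF * mean)
                if mse(cand) < mse(pop[i]):      # greedy acceptance
                    pop[i] = cand
            # Learner phase: learn pairwise from a random classmate.
            for i in range(len(pop)):
                j = rng.integers(len(pop))
                if j == i:
                    continue
                d = pop[j] - pop[i] if mse(pop[j]) < mse(pop[i]) else pop[i] - pop[j]
                cand = pop[i] + rng.random(4) * d
                if mse(cand) < mse(pop[i]):
                    pop[i] = cand

        best = pop[np.argmin([mse(w) for w in pop])]
        print("recovered weights:", np.round(best, 2), "MSE:", round(mse(best), 6))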

  7. Usability of a patient education and motivation tool using heuristic evaluation.

    PubMed

    Joshi, Ashish; Arora, Mohit; Dai, Liwei; Price, Kathleen; Vizer, Lisa; Sears, Andrew

    2009-11-06

    Computer-mediated educational applications can provide a self-paced, interactive environment to deliver educational content to individuals about their health condition. These programs have been used to deliver health-related information about a variety of topics, including breast cancer screening, asthma management, and injury prevention. We have designed the Patient Education and Motivation Tool (PEMT), an interactive computer-based educational program based on behavioral, cognitive, and humanistic learning theories. The tool is designed to educate users and has three key components: screening, learning, and evaluation. The objective of this tutorial is to illustrate a heuristic evaluation using a computer-based patient education program (PEMT) as a case study. The aims were to improve the usability of PEMT through heuristic evaluation of the interface; to report the results of these usability evaluations; to make changes based on the findings of the usability experts; and to describe the benefits and limitations of applying usability evaluations to PEMT. PEMT was evaluated by three usability experts using Nielsen's usability heuristics while reviewing the interface to produce a list of heuristic violations with severity ratings. The violations were sorted by heuristic and ordered from most to least severe within each heuristic. A total of 127 violations were identified with a median severity of 3 (range 0 to 4 with 0 = no problem to 4 = catastrophic problem). Results showed 13 violations for visibility (median severity = 2), 38 violations for match between system and real world (median severity = 2), 6 violations for user control and freedom (median severity = 3), 34 violations for consistency and standards (median severity = 2), 11 violations for error severity (median severity = 3), 1 violation for recognition and control (median severity = 3), 7 violations for flexibility and efficiency (median severity = 2), 9 violations for aesthetic and minimalist design (median severity = 2), 4 violations for help users recognize, diagnose, and recover from errors (median severity = 3), and 4 violations for help and documentation (median severity = 4). We describe the heuristic evaluation method employed to assess the usability of PEMT, a method which uncovers heuristic violations in the interface design in a quick and efficient manner. Bringing together usability experts and health professionals to evaluate a computer-mediated patient education program can help to identify problems in a timely manner. This makes this method particularly well suited to the iterative design process when developing other computer-mediated health education programs. Heuristic evaluations provided a means to assess the user interface of PEMT.

  8. Usability of a Patient Education and Motivation Tool Using Heuristic Evaluation

    PubMed Central

    Arora, Mohit; Dai, Liwei; Price, Kathleen; Vizer, Lisa; Sears, Andrew

    2009-01-01

    Background Computer-mediated educational applications can provide a self-paced, interactive environment to deliver educational content to individuals about their health condition. These programs have been used to deliver health-related information about a variety of topics, including breast cancer screening, asthma management, and injury prevention. We have designed the Patient Education and Motivation Tool (PEMT), an interactive computer-based educational program based on behavioral, cognitive, and humanistic learning theories. The tool is designed to educate users and has three key components: screening, learning, and evaluation. Objective The objective of this tutorial is to illustrate a heuristic evaluation using a computer-based patient education program (PEMT) as a case study. The aims were to improve the usability of PEMT through heuristic evaluation of the interface; to report the results of these usability evaluations; to make changes based on the findings of the usability experts; and to describe the benefits and limitations of applying usability evaluations to PEMT. Methods PEMT was evaluated by three usability experts using Nielsen’s usability heuristics while reviewing the interface to produce a list of heuristic violations with severity ratings. The violations were sorted by heuristic and ordered from most to least severe within each heuristic. Results A total of 127 violations were identified with a median severity of 3 (range 0 to 4 with 0 = no problem to 4 = catastrophic problem). Results showed 13 violations for visibility (median severity = 2), 38 violations for match between system and real world (median severity = 2), 6 violations for user control and freedom (median severity = 3), 34 violations for consistency and standards (median severity = 2), 11 violations for error severity (median severity = 3), 1 violation for recognition and control (median severity = 3), 7 violations for flexibility and efficiency (median severity = 2), 9 violations for aesthetic and minimalist design (median severity = 2), 4 violations for help users recognize, diagnose, and recover from errors (median severity = 3), and 4 violations for help and documentation (median severity = 4). Conclusion We describe the heuristic evaluation method employed to assess the usability of PEMT, a method which uncovers heuristic violations in the interface design in a quick and efficient manner. Bringing together usability experts and health professionals to evaluate a computer-mediated patient education program can help to identify problems in a timely manner. This makes this method particularly well suited to the iterative design process when developing other computer-mediated health education programs. Heuristic evaluations provided a means to assess the user interface of PEMT. PMID:19897458

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karp, Peter D.

    Pathway Tools is a systems-biology software package written by SRI International (SRI) that produces Pathway/Genome Databases (PGDBs) for organisms with a sequenced genome. Pathway Tools also provides a wide range of capabilities for analyzing predicted metabolic networks and user-generated omics data. More than 5,000 academic, industrial, and government groups have licensed Pathway Tools. This user community includes researchers at all three DOE bioenergy centers, as well as academic and industrial metabolic engineering (ME) groups. An integral part of the Pathway Tools software is MetaCyc, a large, multiorganism database of metabolic pathways and enzymes that SRI and its academic collaborators manually curate. This project included two main goals: I. Enhance the MetaCyc content of bioenergy-related enzymes and pathways. II. Develop computational tools for engineering metabolic pathways that satisfy specified design goals, in particular for bioenergy-related pathways. In part I, SRI proposed to significantly expand the coverage of bioenergy-related metabolic information in MetaCyc, followed by the generation of organism-specific PGDBs for all energy-relevant organisms sequenced at the DOE Joint Genome Institute (JGI). Part I objectives included: 1: Expand the content of MetaCyc to include bioenergy-related enzymes and pathways. 2: Enhance the Pathway Tools software to enable display of complex polymer degradation processes. 3: Create new PGDBs for the energy-related organisms sequenced by JGI, update existing PGDBs with new MetaCyc content, and make these data available to JBEI via the BioCyc website. In part II, SRI proposed to develop an efficient computational tool for the engineering of metabolic pathways. Part II objectives included: 4: Develop computational tools for generating metabolic pathways that satisfy specified design goals, enabling users to specify parameters such as starting and ending compounds, and preferred or disallowed intermediate compounds. The pathways were to be generated using metabolic reactions from a reference database (DB). 5: Develop computational tools for ranking the pathways generated in objective (4) according to their optimality. The ranking criteria include stoichiometric yield, the number and cost of additional inputs and the cofactor compounds required by the pathway, pathway length, and pathway energetics. 6: Develop tools for visualizing generated pathways to facilitate the evaluation of a large space of generated pathways.

  10. Fluency heuristic: a model of how the mind exploits a by-product of information retrieval.

    PubMed

    Hertwig, Ralph; Herzog, Stefan M; Schooler, Lael J; Reimer, Torsten

    2008-09-01

    Boundedly rational heuristics for inference can be surprisingly accurate and frugal for several reasons. They can exploit environmental structures, co-opt complex capacities, and elude effortful search by exploiting information that automatically arrives on the mental stage. The fluency heuristic is a prime example of a heuristic that makes the most of an automatic by-product of retrieval from memory, namely, retrieval fluency. In 4 experiments, the authors show that retrieval fluency can be a proxy for real-world quantities, that people can discriminate between two objects' retrieval fluencies, and that people's inferences are in line with the fluency heuristic (in particular fast inferences) and with experimentally manipulated fluency. The authors conclude that the fluency heuristic may be one tool in the mind's repertoire of strategies that artfully probes memory for encapsulated frequency information that can veridically reflect statistical regularities in the world. (c) 2008 APA, all rights reserved.

  11. A heuristic re-mapping algorithm reducing inter-level communication in SAMR applications.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steensland, Johan; Ray, Jaideep

    2003-07-01

    This paper aims at decreasing execution time for large-scale structured adaptive mesh refinement (SAMR) applications by proposing a new heuristic re-mapping algorithm and experimentally showing its effectiveness in reducing inter-level communication. Tests were done for five different SAMR applications. The overall goal is to engineer a dynamically adaptive meta-partitioner capable of selecting and configuring the most appropriate partitioning strategy at run-time based on current system and application state. Such a meta-partitioner can significantly reduce execution times for general SAMR applications. Computer simulations of physical phenomena are becoming increasingly popular as they constitute an important complement to real-life testing. In many cases, such simulations are based on solving partial differential equations by numerical methods. Adaptive methods are crucial to efficiently utilize computer resources such as memory and CPU. But even with adaption, the simulations are computationally demanding and yield huge data sets. Thus parallelization and the efficient partitioning of data become issues of utmost importance. Adaption causes the workload to change dynamically, calling for dynamic (re-)partitioning to maintain efficient resource utilization. The proposed heuristic algorithm reduced inter-level communication substantially. Since the complexity of the proposed algorithm is low, this decrease comes at a relatively low cost. As a consequence, we draw the conclusion that the proposed re-mapping algorithm would be useful to lower overall execution times for many large SAMR applications. Due to its usefulness and its parameterization, the proposed algorithm would constitute a natural and important component of the meta-partitioner.

  12. Graph-based optimization of epitope coverage for vaccine antigen design

    DOE PAGES

    Theiler, James Patrick; Korber, Bette Tina Marie

    2017-01-29

    Epigraph is a recently developed algorithm that enables the computationally efficient design of single or multi-antigen vaccines to maximize the potential epitope coverage for a diverse pathogen population. Potential epitopes are defined as short contiguous stretches of proteins, comparable in length to T-cell epitopes. This optimal coverage problem can be formulated in terms of a directed graph, with candidate antigens represented as paths that traverse this graph. Epigraph protein sequences can also be used as the basis for designing peptides for experimental evaluation of immune responses in natural infections to highly variable proteins. The epigraph tool suite also enables rapid characterization of populations of diverse sequences from an immunological perspective. Fundamental distance measures are based on immunologically relevant shared potential epitope frequencies, rather than simple Hamming or phylogenetic distances. Here, we provide a mathematical description of the epigraph algorithm, include a comparison of different heuristics that can be used when graphs are not acyclic, and describe an additional tool we have added to the web-based epigraph tool suite that provides frequency summaries of all distinct potential epitopes in a population. Lastly, we show examples of the graphical output and summary tables that can be generated using the epigraph tool suite and explain their content and applications.

  13. Graph-based optimization of epitope coverage for vaccine antigen design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Theiler, James Patrick; Korber, Bette Tina Marie

    Epigraph is a recently developed algorithm that enables the computationally efficient design of single or multi-antigen vaccines to maximize the potential epitope coverage for a diverse pathogen population. Potential epitopes are defined as short contiguous stretches of proteins, comparable in length to T-cell epitopes. This optimal coverage problem can be formulated in terms of a directed graph, with candidate antigens represented as paths that traverse this graph. Epigraph protein sequences can also be used as the basis for designing peptides for experimental evaluation of immune responses in natural infections to highly variable proteins. The epigraph tool suite also enables rapid characterization of populations of diverse sequences from an immunological perspective. Fundamental distance measures are based on immunologically relevant shared potential epitope frequencies, rather than simple Hamming or phylogenetic distances. Here, we provide a mathematical description of the epigraph algorithm, include a comparison of different heuristics that can be used when graphs are not acyclic, and describe an additional tool we have added to the web-based epigraph tool suite that provides frequency summaries of all distinct potential epitopes in a population. Lastly, we show examples of the graphical output and summary tables that can be generated using the epigraph tool suite and explain their content and applications.

  14. Inhibitory Control in a Notorious Brain Teaser: The Monty Hall Dilemma

    ERIC Educational Resources Information Center

    Saenen, Lore; Heyvaert, Mieke; Van Dooren, Wim; Onghena, Patrick

    2015-01-01

    The Monty Hall dilemma (MHD) is a counterintuitive probability problem in which participants often use misleading heuristics, such as the equiprobability bias. Finding the optimal solution to the MHD requires inhibition of these heuristics. In the current study, we investigated the relation between participants' equiprobability bias and their MHD…

  15. A Comparison of Heuristic Procedures for Minimum within-Cluster Sums of Squares Partitioning

    ERIC Educational Resources Information Center

    Brusco, Michael J.; Steinley, Douglas

    2007-01-01

    Perhaps the most common criterion for partitioning a data set is the minimization of the within-cluster sums of squared deviation from cluster centroids. Although optimal solution procedures for within-cluster sums of squares (WCSS) partitioning are computationally feasible for small data sets, heuristic procedures are required for most practical…

  16. A problem solving and decision making toolbox for approaching clinical problems and decisions.

    PubMed

    Margolis, C; Jotkowitz, A; Sitter, H

    2004-08-01

    In this paper, we begin by presenting three real patients and then review all the practical conceptual tools that have been suggested for systematically analyzing clinical problems. Each of these conceptual tools (e.g. Evidence-Based Medicine, Clinical Practice Guidelines, Decision Analysis) deals mainly with a different type or aspect of clinical problems. We suggest that all of these conceptual tools can be thought of as belonging in the clinician's toolbox for solving clinical problems and making clinical decisions. A heuristic for guiding the clinician in using the tools is proposed. The heuristic is then used to analyze management of the three patients presented at the outset. Copyright 2004 Birkhäuser Verlag, Basel

  17. Local search heuristic for the discrete leader-follower problem with multiple follower objectives

    NASA Astrophysics Data System (ADS)

    Kochetov, Yury; Alekseeva, Ekaterina; Mezmaz, Mohand

    2016-10-01

    We study a discrete bilevel problem, also known as the leader-follower problem, with multiple objectives at the lower level. It is assumed that constraints at the upper level can include variables of both levels. For such an ill-posed problem we define feasible and optimal solutions for the pessimistic case. A central point of this work is a two-stage method to obtain a feasible solution in the pessimistic case, given a leader decision. The target of the first stage is a follower solution that violates the leader constraints; the target of the second stage is a pessimistic feasible solution. Each stage calls a heuristic and a solver for a series of particular mixed integer programs. The method is integrated into a local search based heuristic that is designed to find near-optimal leader solutions.

  18. Outbreak Column 16: Cognitive errors in outbreak decision making.

    PubMed

    Curran, Evonne T

    2015-01-01

    During outbreaks, decisions must be made without all the required information. People, including infection prevention and control teams (IPCTs), who have to make decisions during uncertainty use heuristics to fill the missing data gaps. Heuristics are mental model short cuts that by-and-large enable us to make good decisions quickly. However, these heuristics contain biases and effects that at times lead to cognitive (thinking) errors. These cognitive errors are not made to deliberately misrepresent any given situation; we are subject to heuristic biases when we are trying to perform optimally. The science of decision making is large; there are over 100 different biases recognised and described. Outbreak Column 16 discusses and relates these heuristics and biases to decision making during outbreak prevention, preparedness and management. Insights as to how we might recognise and avoid them are offered.

  19. Adaptive infinite impulse response system identification using modified-interior search algorithm with Lévy flight.

    PubMed

    Kumar, Manjeet; Rawat, Tarun Kumar; Aggarwal, Apoorva

    2017-03-01

    In this paper, a new meta-heuristic optimization technique, called the interior search algorithm (ISA) with Lévy flight, is proposed and applied to determine the optimal parameters of an unknown infinite impulse response (IIR) system for the system identification problem. ISA is based on aesthetics, which is commonly used in interior design and decoration processes. In ISA, a composition phase and a mirror phase are applied for addressing nonlinear and multimodal system identification problems. System identification using the modified-ISA (M-ISA) based method involves faster convergence and single parameter tuning, and does not require derivative information because it uses a stochastic random search based on the concepts of Lévy flight. A proper tuning of the control parameter has been performed in order to achieve a balance between the intensification and diversification phases. In order to evaluate the performance of the proposed method, mean square error (MSE), computation time and percentage improvement are considered as the performance measures. To validate the performance of the M-ISA based method, simulations have been carried out for three benchmarked IIR systems using same-order and reduced-order models. Genetic algorithm (GA), particle swarm optimization (PSO), cat swarm optimization (CSO), cuckoo search algorithm (CSA), differential evolution using wavelet mutation (DEWM), firefly algorithm (FFA), craziness based particle swarm optimization (CRPSO), harmony search (HS) algorithm, opposition based harmony search (OHS) algorithm, hybrid particle swarm optimization-gravitational search algorithm (HPSO-GSA) and ISA are also used to model the same examples, and the simulation results are compared. The obtained results confirm the efficiency of the proposed method. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
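
    The Lévy-flight ingredient is easy to isolate: the sketch below draws heavy-tailed steps with Mantegna's algorithm (a standard way to sample approximate Lévy-stable step lengths) inside a simple accept-if-better random search. The exponent beta, step scale, and quadratic test objective are common illustrative choices, not the paper's M-ISA settings.

        # Sketch: Lévy-flight random search using Mantegna's algorithm.
        import math
        import numpy as np

        rng = np.random.default_rng(2)
        beta = 1.5
        sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
                 / (math.gamma((1 + beta) / 2) * beta
                    * 2 ** ((beta - 1) / 2))) ** (1 / beta)

        def levy_step(dim):
            u = rng.normal(0, sigma, dim)
            v = rng.normal(0, 1, dim)
            return u / np.abs(v) ** (1 / beta)   # heavy-tailed step lengths

        def f(x):                                # stand-in objective (minimize)
            return np.sum((x - 3.0) ** 2)

        x = rng.uniform(-10, 10, 2)
        for _ in range(5000):
            cand = x + 0.1 * levy_step(2)        # occasional long jumps
            if f(cand) < f(x):                   # greedy acceptance
                x = cand
        print("found minimum near:", np.round(x, 3))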

  20. Runway Scheduling Using Generalized Dynamic Programming

    NASA Technical Reports Server (NTRS)

    Montoya, Justin; Wood, Zachary; Rathinam, Sivakumar

    2011-01-01

    A generalized dynamic programming method for finding a set of Pareto optimal solutions for a runway scheduling problem is introduced. The algorithm generates a set of runway flight sequences that are optimal for both runway throughput and delay. Realistic time-based operational constraints are considered, including miles-in-trail separation, runway crossings, and wake vortex separation. The authors also model divergent runway takeoff operations to allow for reduced wake vortex separation. A modeled Dallas/Fort Worth International airport and three baseline heuristics are used to illustrate preliminary benefits of using the generalized dynamic programming method. Simulated traffic levels ranged from 10 aircraft to 30 aircraft, with each test case spanning 15 minutes. The optimal solution shows a 40-70 percent decrease in the expected delay per aircraft over the baseline schedulers. Computational results suggest that the algorithm is promising for real-time application, with an average computation time of 4.5 seconds. For even faster computation times, two heuristics are developed. As compared to the optimal, the heuristics are within 5% of the expected delay per aircraft and 1% of the expected number of runway operations per hour, and can be 100x faster.
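
    The dynamic-programming core can be sketched for a single runway: the state is (subset of aircraft already sequenced, last aircraft), and the value is the earliest completion time under ready times and a uniform separation gap. The ready times and separation below are invented; the paper's generalized DP additionally handles multiple runways, constraint classes, and a Pareto trade-off with delay.

        # Sketch: subset DP for single-runway takeoff sequencing (min makespan).
        from itertools import combinations

        ready = [0, 2, 3, 5, 9, 11]     # earliest takeoff times (minutes)
        sep = 2                          # uniform wake-vortex separation (minutes)
        n = len(ready)

        # dp[(sequenced_set, last)] = earliest time `last` can take off
        dp = {(frozenset([i]), i): ready[i] for i in range(n)}
        for size in range(1, n):
            for subset in combinations(range(n), size):
                done = frozenset(subset)
                for last in subset:
                    if (done, last) not in dp:
                        continue
                    t = dp[(done, last)]
                    for j in range(n):
                        if j in done:
                            continue
                        t_j = max(ready[j], t + sep)   # respect ready + separation
                        key = (done | {j}, j)
                        if key not in dp or t_j < dp[key]:
                            dp[key] = t_j

        makespan = min(v for (d, _), v in dp.items() if len(d) == n)
        print("optimal last-takeoff time:", makespan, "minutes")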

  1. Social interaction as a heuristic for combinatorial optimization problems

    NASA Astrophysics Data System (ADS)

    Fontanari, José F.

    2010-11-01

    We investigate the performance of a variant of Axelrod's model for dissemination of culture, the Adaptive Culture Heuristic (ACH), on solving an NP-Complete optimization problem, namely, the classification of binary input patterns of size F by a Boolean Binary Perceptron. In this heuristic, N agents, characterized by binary strings of length F which represent possible solutions to the optimization problem, are fixed at the sites of a square lattice and interact with their nearest neighbors only. The interactions are such that the agents' strings (or cultures) become more similar to the low-cost strings of their neighbors, resulting in the dissemination of these strings across the lattice. Eventually the dynamics freezes into a homogeneous absorbing configuration in which all agents exhibit identical solutions to the optimization problem. We find through extensive simulations that the probability of finding the optimal solution is a function of the reduced variable F/N^(1/4), so that the number of agents must increase with the fourth power of the problem size, N ∝ F^4, to guarantee a fixed probability of success. In this case, we find that the relaxation time to reach an absorbing configuration scales with F^6, which can be interpreted as the overall computational cost of the ACH to find an optimal set of weights for a Boolean binary perceptron, given a fixed probability of success.
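
    A minimal sketch of the ACH dynamics follows: agents on a periodic square lattice copy one disagreeing bit from a neighbour whose string has lower cost. For simplicity the cost below counts bits that differ from a hidden target string (onemax-style), standing in for the Boolean-perceptron cost of the paper; lattice size and sweep count are illustrative.

        # Sketch: Adaptive Culture Heuristic on a periodic square lattice.
        import random

        L, F = 6, 16                             # lattice side, string length
        target = [random.randint(0, 1) for _ in range(F)]

        def cost(s):                             # illustrative: bits wrong
            return sum(a != b for a, b in zip(s, target))

        grid = {(i, j): [random.randint(0, 1) for _ in range(F)]
                for i in range(L) for j in range(L)}

        def neighbours(i, j):                    # 4-neighbourhood, wrap-around
            return [((i + di) % L, (j + dj) % L)
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))]

        for _ in range(20000):
            site = (random.randrange(L), random.randrange(L))
            nb = random.choice(neighbours(*site))
            s, t = grid[site], grid[nb]
            if cost(t) < cost(s):                # imitate the fitter neighbour
                diffs = [k for k in range(F) if s[k] != t[k]]
                if diffs:
                    k = random.choice(diffs)     # copy one disagreeing bit
                    s[k] = t[k]

        best = min(grid.values(), key=cost)
        print("best cost on lattice:", cost(best))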

  2. Discrete bacteria foraging optimization algorithm for graph based problems - a transition from continuous to discrete

    NASA Astrophysics Data System (ADS)

    Sur, Chiranjib; Shukla, Anupam

    2018-03-01

    The Bacteria Foraging Optimisation Algorithm is a collective-behaviour-based meta-heuristic search that depends on the social influence of the bacteria co-agents in the search space of the problem. The algorithm faces tremendous hindrance in its application to discrete and graph-based problems due to its biased mathematical modelling and dynamic structure. This has been the key motivation to revive and introduce a discrete form, called the Discrete Bacteria Foraging Optimisation (DBFO) Algorithm, for discrete problems, which outnumber the continuous-domain problems represented by mathematical and numerical equations in real life. In this work, we mainly simulate a graph-based road multi-objective optimisation problem and discuss the prospect of its utilisation in other similar optimisation and graph-based problems. The various solution representations that can be handled by this DBFO are also discussed. The implications and dynamics of the various parameters used in the DBFO are illustrated from the point of view of the problems, combining both exploration and exploitation. The results of DBFO are compared with the Ant Colony Optimisation and Intelligent Water Drops algorithms. An important feature of DBFO is that the bacteria agents do not depend on local heuristic information but estimate new exploration schemes depending upon previous experience and covered-path analysis. This makes the algorithm better at combination generation for graph-based problems and NP-hard problems.

  3. A note on resource allocation scheduling with group technology and learning effects on a single machine

    NASA Astrophysics Data System (ADS)

    Lu, Yuan-Yuan; Wang, Ji-Bo; Ji, Ping; He, Hongyu

    2017-09-01

    In this article, single-machine group scheduling with learning effects and convex resource allocation is studied. The goal is to find the optimal job schedule, the optimal group schedule, and the resource allocations of jobs and groups. For the problem of minimizing the makespan subject to limited resource availability, it is proved that the problem can be solved in polynomial time under the condition that the setup times of groups are independent. For general group setup times, a heuristic algorithm and a branch-and-bound algorithm are proposed. Computational experiments show that the heuristic algorithm is fairly accurate in obtaining near-optimal solutions.

  4. Introduction, comparison, and validation of Meta-Essentials: A free and simple tool for meta-analysis.

    PubMed

    Suurmond, Robert; van Rhee, Henk; Hak, Tony

    2017-12-01

    We present a new tool for meta-analysis, Meta-Essentials, which is free of charge and easy to use. In this paper, we introduce the tool and compare its features to other tools for meta-analysis. We also provide detailed information on the validation of the tool. Although free of charge and simple, Meta-Essentials automatically calculates effect sizes from a wide range of statistics and can be used for a wide range of meta-analysis applications, including subgroup analysis, moderator analysis, and publication bias analyses. The confidence interval of the overall effect is automatically based on the Knapp-Hartung adjustment of the DerSimonian-Laird estimator. However, more advanced meta-analysis methods such as meta-analytical structural equation modelling and meta-regression with multiple covariates are not available. In summary, Meta-Essentials may prove a valuable resource for meta-analysts, including researchers, teachers, and students. © 2017 The Authors. Research Synthesis Methods published by John Wiley & Sons Ltd.

  5. A Case Study of Controlling Crossover in a Selection Hyper-heuristic Framework Using the Multidimensional Knapsack Problem.

    PubMed

    Drake, John H; Özcan, Ender; Burke, Edmund K

    2016-01-01

    Hyper-heuristics are high-level methodologies for solving complex problems that operate on a search space of heuristics. In a selection hyper-heuristic framework, a heuristic is chosen from an existing set of low-level heuristics and applied to the current solution to produce a new solution at each point in the search. The use of crossover low-level heuristics is possible in an increasing number of general-purpose hyper-heuristic tools such as HyFlex and Hyperion. However, little work has been undertaken to assess how best to utilise it. Since a single-point search hyper-heuristic operates on a single candidate solution, and two candidate solutions are required for crossover, a mechanism is required to control the choice of the other solution. The frameworks we propose maintain a list of potential solutions for use in crossover. We investigate the use of such lists at two conceptual levels. First, crossover is controlled at the hyper-heuristic level where no problem-specific information is required. Second, it is controlled at the problem domain level where problem-specific information is used to produce good-quality solutions to use in crossover. A number of selection hyper-heuristics are compared using these frameworks over three benchmark libraries with varying properties for an NP-hard optimisation problem: the multidimensional 0-1 knapsack problem. It is shown that allowing crossover to be managed at the domain level outperforms managing crossover at the hyper-heuristic level in this problem domain.
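
    The mechanism is straightforward to picture in code. Below is a minimal sketch (not the paper's framework): a single-point search choosing uniformly among low-level heuristics, keeping a small list of partner solutions for the crossover heuristics and refreshing it with improving solutions; the paper studies far richer selection and list-management schemes at both the hyper-heuristic and domain levels.

    ```python
    import random

    # Minimal selection hyper-heuristic sketch with a crossover partner list.
    # init() returns a random solution; llh_unary / llh_xover are lists of
    # low-level heuristic callables; fitness is to be maximized (all assumed).
    def hyper_heuristic(init, llh_unary, llh_xover, fitness,
                        iters=10_000, list_size=5):
        current = init()
        partners = [init() for _ in range(list_size)]  # candidates for crossover
        for _ in range(iters):
            if llh_xover and random.random() < 0.5:
                other = random.choice(partners)        # controlled partner choice
                candidate = random.choice(llh_xover)(current, other)
            else:
                candidate = random.choice(llh_unary)(current)
            if fitness(candidate) >= fitness(current):  # naive acceptance rule
                current = candidate
                # replace the weakest list member with the improved solution
                worst = min(range(list_size), key=lambda i: fitness(partners[i]))
                if fitness(current) > fitness(partners[worst]):
                    partners[worst] = current
        return current
    ```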

  6. The Insight ToolKit image registration framework

    PubMed Central

    Avants, Brian B.; Tustison, Nicholas J.; Stauffer, Michael; Song, Gang; Wu, Baohua; Gee, James C.

    2014-01-01

    Publicly available scientific resources help establish evaluation standards, provide a platform for teaching, and improve reproducibility. Version 4 of the Insight ToolKit (ITK4) seeks to establish new standards in publicly available image registration methodology. ITK4 makes several advances in comparison to previous versions of ITK. It supports both multivariate images and objective functions; it also unifies high-dimensional (deformation field) and low-dimensional (affine) transformations with metrics that are reusable across transform types and with composite transforms that allow arbitrary series of geometric mappings to be chained together seamlessly. Metrics and optimizers take advantage of multi-core resources, when available. Furthermore, ITK4 reduces the parameter optimization burden via principled heuristics that automatically set scaling across disparate parameter types (rotations vs. translations). A related approach also constrains step sizes for gradient-based optimizers. The result is that tuning for different metrics and/or image pairs is rarely necessary, allowing the researcher to focus more easily on the design and comparison of registration strategies. In total, the ITK4 contribution is intended as a structure to support reproducible research practices; it will provide a more extensive foundation against which to evaluate new work in image registration and give application-level programmers a broad suite of tools on which to build. Finally, we contextualize this work with a reference registration evaluation study with application to pediatric brain labeling. PMID:24817849

  7. Automated bond order assignment as an optimization problem.

    PubMed

    Dehof, Anna Katharina; Rurainski, Alexander; Bui, Quang Bao Anh; Böcker, Sebastian; Lenhof, Hans-Peter; Hildebrandt, Andreas

    2011-03-01

    Numerous applications in Computational Biology process molecular structures and hence strongly rely not only on correct atomic coordinates but also on correct bond order information. For proteins and nucleic acids, bond orders can be easily deduced, but this does not hold for other types of molecules such as ligands. For ligands, bond order information is not always provided in molecular databases, and thus a variety of approaches tackling this problem have been developed. In this work, we extend an ansatz proposed by Wang et al. that assigns connectivity-based penalty scores and tries to heuristically approximate its optimum. We present three efficient and exact solvers for the problem, replacing the heuristic approximation scheme of the original approach: an A*, an ILP, and a fixed-parameter (FPT) approach. We implemented and evaluated the original implementation and our A*, ILP, and FPT formulations on the MMFF94 validation suite and the KEGG Drug database. We show the benefit of computing exact solutions of the penalty minimization problem and the additional gain when computing all optimal (or even suboptimal) solutions. We close with a detailed comparison of our methods. The A* and ILP solutions are integrated into the open-source C++ LGPL library BALL and the molecular visualization and modelling tool BALLView, and can be downloaded from our homepage www.ball-project.org. The FPT implementation can be downloaded from http://bio.informatik.uni-jena.de/software/.

  8. The development and validation of a meta-tool for quality appraisal of public health evidence: Meta Quality Appraisal Tool (MetaQAT).

    PubMed

    Rosella, L; Bowman, C; Pach, B; Morgan, S; Fitzpatrick, T; Goel, V

    2016-07-01

    Most quality appraisal tools were developed for clinical medicine and tend to be study-design-specific, with a strong emphasis on risk of bias. In order to be more relevant to public health, an appropriate quality appraisal tool needs to be less reliant on the evidence hierarchy and consider practice applicability. Given the broad range of study designs used in public health, the objective of this study was to develop and validate a meta-tool that combines public-health-focused principles of appraisal with a set of design-specific companion tools. Several methods were used to develop and validate the tool, including literature review, synthesis, and validation against a reference standard. A search of critical appraisal tools relevant to public health was conducted; core concepts were collated. The resulting framework was piloted during three feedback sessions with public health practitioners. Following subsequent revisions, the final meta-tool, the Meta Quality Appraisal Tool (MetaQAT), was validated through a content analysis of appraisals conducted by two groups of experienced public health researchers (MetaQAT vs. generic appraisal form). The MetaQAT framework consists of four domains: relevancy, reliability, validity, and applicability. In addition, a companion tool was assembled from existing critical appraisal tools to provide study-design-specific guidance on validity appraisal. Content analysis showed that similar methodological and generalizability concerns were raised by both groups; however, the MetaQAT appraisers commented more extensively on applicability to public health practice. Critical appraisal tools designed for clinical medicine have limitations in the context of public health. The meta-tool structure of the MetaQAT allows for rigorous appraisal while letting users simultaneously appraise the multitude of study designs relevant to public health research and assess non-standard domains, such as applicability. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  9. Perspectives of young Chinese Singaporean women on seeking and processing information to decide about vaccinating against human papillomavirus.

    PubMed

    Basnyat, Iccha; Lim, Cheryl

    2017-07-06

    Human papillomavirus (HPV) vaccination uptake in Singapore is low among young women. Low uptake has been found to be linked to low awareness. Thus, this study aimed to understand active and passive vaccine information-seeking behavior. Furthermore, guided by the Elaboration Likelihood Model (ELM), this study examined young women's (aged 21-26 years) processing of the information they acquired in their decision to get vaccinated. ELM postulates that information processing can occur through the central (i.e., logic-based) or peripheral (i.e., heuristic-based) route. Twenty-six in-depth interviews were conducted from January to March 2016. Data were analyzed using thematic analysis. Two meta-themes, information acquisition and vaccination decision, revealed that heuristic-based information processing was employed. These young women acquired information passively within their social networks and actively in healthcare settings. However, they used heuristic cues, such as closeness and trust, to process the information. Similarly, vaccination decisions revealed that women relied on heuristic cues, such as a sense of belonging and validation among peers and source credibility and likability in medical settings, in their decision to get vaccinated. The findings of this study highlight that intervention efforts should focus on strengthening social support within personal networks to increase uptake of the vaccine.

  10. Meta-heuristic algorithm to solve two-sided assembly line balancing problems

    NASA Astrophysics Data System (ADS)

    Wirawan, A. D.; Maruf, A.

    2016-02-01

    Two-sided assembly line is a set of sequential workstations where task operations can be performed at two sides of the line. This type of line is commonly used for the assembly of large-sized products: cars, buses, and trucks. This paper proposes a decoding algorithm with Teaching-Learning Based Optimization (TLBO), a recently developed nature-inspired search method, to solve the two-sided assembly line balancing problem (TALBP). The algorithm aims to minimize the number of mated workstations for a given cycle time without violating the synchronization constraints. The correlation between the input parameters and the emergence point of the objective function value is tested using scenarios generated by design of experiments. A two-sided assembly line operated by a multinational manufacturing company in Indonesia is considered as the object of this paper. The results of the proposed algorithm show a reduction in the number of workstations and indicate a negative correlation between the emergence point of the objective function value and the population size used.
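
    For orientation, here is a minimal sketch of the TLBO core on a continuous encoding with minimization assumed; the paper couples TLBO with a decoding algorithm that maps each candidate vector to a feasible two-sided line balance, which is omitted here.

    ```python
    import random

    # Minimal TLBO sketch: teacher phase then learner phase, greedy acceptance.
    def tlbo(f, dim, pop_size=20, iters=200, lo=0.0, hi=1.0):
        pop = [[random.uniform(lo, hi) for _ in range(dim)]
               for _ in range(pop_size)]
        for _ in range(iters):
            # teacher phase: move learners toward the best, away from the mean
            teacher = min(pop, key=f)
            mean = [sum(x[d] for x in pop) / pop_size for d in range(dim)]
            for i, x in enumerate(pop):
                tf = random.choice((1, 2))  # teaching factor
                cand = [x[d] + random.random() * (teacher[d] - tf * mean[d])
                        for d in range(dim)]
                if f(cand) < f(x):
                    pop[i] = cand
            # learner phase: move toward a better classmate, away from a worse one
            for i, x in enumerate(pop):
                j = random.randrange(pop_size)
                if j == i:
                    continue
                sign = 1.0 if f(pop[j]) < f(x) else -1.0
                cand = [x[d] + sign * random.random() * (pop[j][d] - x[d])
                        for d in range(dim)]
                if f(cand) < f(x):
                    pop[i] = cand
        return min(pop, key=f)
    ```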

  11. How Can Bee Colony Algorithm Serve Medicine?

    PubMed Central

    Salehahmadi, Zeinab; Manafi, Amir

    2014-01-01

    Healthcare professionals must often make complex decisions with far-reaching consequences and associated risks in healthcare fields. As has been demonstrated in other industries, the ability to drill down into pertinent data to explore the knowledge behind the data can greatly facilitate superior, informed decisions. Nature has always inspired researchers to develop models for solving problems. The bee colony algorithm (BCA), based on the self-organized behavior of social insects, is one of the most popular members of the family of population-oriented, nature-inspired meta-heuristic swarm intelligence methods, and has proved its superiority over some other nature-inspired algorithms. The objective of this model is to identify valid, novel, potentially useful, and understandable correlations and patterns in existing data. This review employs a thematic analysis of online academic papers to outline BCA in the medical hive, reducing response and computational time and optimizing problems. To illustrate the benefits of this model, cases of disease diagnosis systems are presented. PMID:25489530

  12. How can bee colony algorithm serve medicine?

    PubMed

    Salehahmadi, Zeinab; Manafi, Amir

    2014-07-01

    Healthcare professionals must often make complex decisions with far-reaching consequences and associated risks in healthcare fields. As has been demonstrated in other industries, the ability to drill down into pertinent data to explore the knowledge behind the data can greatly facilitate superior, informed decisions. Nature has always inspired researchers to develop models for solving problems. The bee colony algorithm (BCA), based on the self-organized behavior of social insects, is one of the most popular members of the family of population-oriented, nature-inspired meta-heuristic swarm intelligence methods, and has proved its superiority over some other nature-inspired algorithms. The objective of this model is to identify valid, novel, potentially useful, and understandable correlations and patterns in existing data. This review employs a thematic analysis of online academic papers to outline BCA in the medical hive, reducing response and computational time and optimizing problems. To illustrate the benefits of this model, cases of disease diagnosis systems are presented.
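
    For readers unfamiliar with the algorithm's moving parts, a minimal sketch of the artificial bee colony loop on a generic minimization objective follows; a medical application would wrap a domain objective (e.g., a diagnostic model's error) in `f`, and all constants here are illustrative.

    ```python
    import random

    # Minimal bee colony sketch: employed bees refine food sources, onlookers
    # bias effort toward good sources, scouts replace exhausted ones.
    def bee_colony(f, dim, n_sources=10, iters=200, limit=20, lo=-1.0, hi=1.0):
        src = [[random.uniform(lo, hi) for _ in range(dim)]
               for _ in range(n_sources)]
        trials = [0] * n_sources

        def try_neighbor(i):
            k = random.randrange(n_sources)
            d = random.randrange(dim)
            cand = src[i][:]
            cand[d] += random.uniform(-1, 1) * (src[i][d] - src[k][d])
            if f(cand) < f(src[i]):
                src[i], trials[i] = cand, 0
            else:
                trials[i] += 1

        for _ in range(iters):
            for i in range(n_sources):              # employed bee phase
                try_neighbor(i)
            costs = [f(s) for s in src]
            top = max(costs)
            weights = [top - c + 1e-12 for c in costs]
            for _ in range(n_sources):              # onlooker bee phase
                try_neighbor(random.choices(range(n_sources), weights)[0])
            for i in range(n_sources):              # scout bee phase
                if trials[i] > limit:
                    src[i] = [random.uniform(lo, hi) for _ in range(dim)]
                    trials[i] = 0
        return min(src, key=f)
    ```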

  13. Reporting completeness and transparency of meta-analyses of depression screening tool accuracy: A comparison of meta-analyses published before and after the PRISMA statement.

    PubMed

    Rice, Danielle B; Kloda, Lorie A; Shrier, Ian; Thombs, Brett D

    2016-08-01

    Meta-analyses that are conducted rigorously and reported completely and transparently can provide accurate evidence to inform the best possible healthcare decisions. Guideline makers have raised concerns about the utility of existing evidence on the diagnostic accuracy of depression screening tools. The objective of our study was to evaluate the transparency and completeness of reporting in meta-analyses of the diagnostic accuracy of depression screening tools using the PRISMA tool adapted for diagnostic test accuracy meta-analyses. We searched MEDLINE and PsycINFO from January 1, 2005 through March 13, 2016 for recent meta-analyses in any language on the diagnostic accuracy of depression screening tools. Two reviewers independently assessed the transparency of reporting using the PRISMA tool with appropriate adaptations made for studies of diagnostic test accuracy. We identified 21 eligible meta-analyses. Twelve of 21 meta-analyses complied with at least 50% of adapted PRISMA items. Of 30 adapted PRISMA items, 11 were fulfilled by ≥80% of included meta-analyses, 3 by 50-79% of meta-analyses, 7 by 25-45% of meta-analyses, and 9 by <25%. On average, post-PRISMA meta-analyses complied with 17 of 30 items compared to 13 of 30 items pre-PRISMA. Deficiencies in the transparency of reporting were identified in meta-analyses of the diagnostic test accuracy of depression screening tools. Authors, reviewers, and editors should adhere to the PRISMA statement to improve the reporting of meta-analyses of the diagnostic accuracy of depression screening tools. Copyright © 2016 Elsevier Inc. All rights reserved.

  14. Improving delivery routes using combined heuristic and optimization in a consumer goods distribution company

    NASA Astrophysics Data System (ADS)

    Wibisono, E.; Santoso, A.; Sunaryo, M. A.

    2017-11-01

    XYZ is a distributor of various consumer goods products. The company plans its delivery routes daily and, in order to obtain route constructions in a short amount of time, simplifies the process by assigning drivers based on geographic regions. This approach results in inefficient use of vehicles, leading to imbalanced workloads. In this paper, we propose a combined method involving heuristics and optimization to obtain better solutions in acceptable computation time. The heuristic, based on time-oriented nearest neighbor (TONN), forms clusters if the number of locations is higher than a certain value. The optimization part uses a mathematical programming formulation based on the vehicle routing problem with heterogeneous vehicles, time windows, and fixed costs (HVRPTWF), and is used to solve the routing problem within clusters. A case study using data from one month of the company's operations is analyzed, and data from one day of operations are detailed in this paper. The analysis shows that the proposed method results in 24% cost savings in that month, and the savings can be as high as 54% on a single day.

  15. Performance comparison of heuristic algorithms for task scheduling in IaaS cloud computing environment.

    PubMed

    Madni, Syed Hamid Hussain; Abd Latiff, Muhammad Shafie; Abdullahi, Mohammed; Abdulhamid, Shafi'i Muhammad; Usman, Mohammed Joda

    2017-01-01

    Cloud computing infrastructure is suitable for meeting the computational needs of large task sizes. Optimal scheduling of tasks in a cloud computing environment has been proved to be an NP-complete problem, hence the need for heuristic methods. Several heuristic algorithms have been developed and used to address this problem, but choosing the appropriate algorithm for a task assignment problem of a particular nature is difficult, since the methods were developed under different assumptions. Therefore, six rule-based heuristic algorithms are implemented and used to schedule autonomous tasks in homogeneous and heterogeneous environments, with the aim of comparing their performance in terms of cost, degree of imbalance, makespan, and throughput. First Come First Serve (FCFS), Minimum Completion Time (MCT), Minimum Execution Time (MET), Max-min, Min-min, and Sufferage are the heuristic algorithms considered for the performance comparison and analysis of task scheduling in cloud computing.

  16. Performance comparison of heuristic algorithms for task scheduling in IaaS cloud computing environment

    PubMed Central

    Madni, Syed Hamid Hussain; Abd Latiff, Muhammad Shafie; Abdullahi, Mohammed; Usman, Mohammed Joda

    2017-01-01

    Cloud computing infrastructure is suitable for meeting the computational needs of large task sizes. Optimal scheduling of tasks in a cloud computing environment has been proved to be an NP-complete problem, hence the need for heuristic methods. Several heuristic algorithms have been developed and used to address this problem, but choosing the appropriate algorithm for a task assignment problem of a particular nature is difficult, since the methods were developed under different assumptions. Therefore, six rule-based heuristic algorithms are implemented and used to schedule autonomous tasks in homogeneous and heterogeneous environments, with the aim of comparing their performance in terms of cost, degree of imbalance, makespan, and throughput. First Come First Serve (FCFS), Minimum Completion Time (MCT), Minimum Execution Time (MET), Max-min, Min-min, and Sufferage are the heuristic algorithms considered for the performance comparison and analysis of task scheduling in cloud computing. PMID:28467505
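
    As a concrete example of the rule-based heuristics compared, a minimal sketch of Min-min follows, assuming a static matrix of estimated task execution times per virtual machine; Max-min differs only in picking the task with the largest best-case completion time instead.

    ```python
    # Minimal Min-min sketch. etc[t][v]: estimated time of task t on VM v
    # (assumed given, as in the usual expected-time-to-compute model).
    def min_min(etc, n_vms):
        ready = [0.0] * n_vms                  # per-VM ready times
        unscheduled = set(range(len(etc)))
        schedule = {}
        while unscheduled:
            # for every pending task, its best completion time over all VMs
            best = {t: min((ready[v] + etc[t][v], v) for v in range(n_vms))
                    for t in unscheduled}
            t = min(unscheduled, key=lambda u: best[u][0])  # the Min-min choice
            finish, v = best[t]
            ready[v] = finish
            schedule[t] = v
            unscheduled.remove(t)
        return schedule, max(ready)            # assignment and makespan
    ```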

  17. Evaluating a Web-Based Interface for Internet Telemedicine

    NASA Technical Reports Server (NTRS)

    Lathan, Corinna E.; Newman, Dava J.; Sebrechts, Marc M.; Doarn, Charles R.

    1997-01-01

    The objective is to introduce a usability engineering methodology, heuristic evaluation, to the design and development of a web-based telemedicine system. Using a set of usability criteria, or heuristics, one evaluator examined the Spacebridge to Russia web site for usability problems. Thirty-four usability problems were found in this preliminary study, and all were assigned a severity rating. The value of heuristic analysis in the iterative design of a system is shown because problems can be fixed before deployment and are of a different nature than those found by actual users of the system. There is therefore potential value in pairing heuristic evaluation with user testing as a strategy for designing for optimal system performance.

  18. A novel heuristic algorithm for capacitated vehicle routing problem

    NASA Astrophysics Data System (ADS)

    Kır, Sena; Yazgan, Harun Reşit; Tüncel, Emre

    2017-09-01

    The vehicle routing problem with capacity constraints was considered in this paper. It is quite difficult to achieve an optimal solution with traditional optimization methods because of the high computational complexity of large-scale problems. Consequently, new heuristic and metaheuristic approaches have been developed to solve this problem. In this paper, we constructed a new heuristic algorithm based on tabu search and adaptive large neighborhood search (ALNS), with several specifically designed operators and features, to solve the capacitated vehicle routing problem (CVRP). The effectiveness of the proposed algorithm was illustrated on benchmark problems. The algorithm provides better performance on large-scale instances and gains an advantage in terms of CPU time. In addition, we solved a real-life CVRP using the proposed algorithm and, by comparison with the company's current practice, found encouraging results.

  19. Solving large-scale fixed cost integer linear programming models for grid-based location problems with heuristic techniques

    NASA Astrophysics Data System (ADS)

    Noor-E-Alam, Md.; Doucette, John

    2015-08-01

    Grid-based location problems (GBLPs) can be used to solve location problems in business, engineering, resource exploitation, and even in the field of medical sciences. To solve these decision problems, an integer linear programming (ILP) model is designed and developed to provide the optimal solution for GBLPs considering fixed cost criteria. Preliminary results show that the ILP model is efficient in solving small to moderate-sized problems. However, the ILP model becomes intractable on large-scale instances. Therefore, a decomposition heuristic is proposed to solve these large-scale GBLPs, demonstrating a significant reduction in solution runtimes. To benchmark the proposed heuristic, results are compared with the exact solution via ILP. The experimental results show that the proposed method significantly outperforms the exact method in runtime with minimal (and in most cases, no) loss of optimality.

  20. Probability or Reasoning: Current Thinking and Realistic Strategies for Improved Medical Decisions

    PubMed Central

    2017-01-01

    A prescriptive model approach in decision making could help achieve better diagnostic accuracy in clinical practice through methods that are less reliant on probabilistic assessments. Various prescriptive measures aimed at regulating factors that influence heuristics and clinical reasoning could support clinical decision-making process. Clinicians could avoid time-consuming decision-making methods that require probabilistic calculations. Intuitively, they could rely on heuristics to obtain an accurate diagnosis in a given clinical setting. An extensive literature review of cognitive psychology and medical decision-making theory was performed to illustrate how heuristics could be effectively utilized in daily practice. Since physicians often rely on heuristics in realistic situations, probabilistic estimation might not be a useful tool in everyday clinical practice. Improvements in the descriptive model of decision making (heuristics) may allow for greater diagnostic accuracy. PMID:29209469

  1. Probability or Reasoning: Current Thinking and Realistic Strategies for Improved Medical Decisions.

    PubMed

    Nantha, Yogarabindranath Swarna

    2017-11-01

    A prescriptive model approach in decision making could help achieve better diagnostic accuracy in clinical practice through methods that are less reliant on probabilistic assessments. Various prescriptive measures aimed at regulating factors that influence heuristics and clinical reasoning could support clinical decision-making process. Clinicians could avoid time-consuming decision-making methods that require probabilistic calculations. Intuitively, they could rely on heuristics to obtain an accurate diagnosis in a given clinical setting. An extensive literature review of cognitive psychology and medical decision-making theory was performed to illustrate how heuristics could be effectively utilized in daily practice. Since physicians often rely on heuristics in realistic situations, probabilistic estimation might not be a useful tool in everyday clinical practice. Improvements in the descriptive model of decision making (heuristics) may allow for greater diagnostic accuracy.

  2. Gaussian mass optimization for kernel PCA parameters

    NASA Astrophysics Data System (ADS)

    Liu, Yong; Wang, Zulin

    2011-10-01

    This paper proposes a novel kernel parameter optimization method based on Gaussian mass, which aims to overcome the current brute-force parameter optimization methods in a heuristic way. Generally speaking, the choice of kernel parameter should be tightly related to the target objects, while the variance between samples, the most commonly used kernel parameter, does not capture many features of the target; this motivates the Gaussian mass. The Gaussian mass defined in this paper is invariant to rotation and translation and is capable of depicting edge, topology, and shape information. Simulation results show that Gaussian mass provides a promising heuristic optimization boost for kernel methods. On the MNIST handwriting database, the recognition rate improves by 1.6% compared with the common kernel method without Gaussian mass optimization. Several other promising directions in which Gaussian mass might help are also proposed at the end of the paper.

  3. 1-D DC Resistivity Modeling and Interpretation in Anisotropic Media Using Particle Swarm Optimization

    NASA Astrophysics Data System (ADS)

    Pekşen, Ertan; Yas, Türker; Kıyak, Alper

    2014-09-01

    We examine the one-dimensional direct current method in anisotropic earth formations. We derive an analytic expression for a simple, two-layered anisotropic earth model. We also compute the response of a horizontally layered anisotropic earth with the digital filter method, which yields a quasi-analytic solution over anisotropic media. These analytic and quasi-analytic solutions are useful tests for numerical codes. A two-dimensional finite difference earth model in anisotropic media is presented in order to generate a synthetic data set for a simple one-dimensional earth. Further, we propose a particle swarm optimization method for estimating the parameters of a layered anisotropic earth model, such as horizontal and vertical resistivities and thickness. Particle swarm optimization is a nature-inspired meta-heuristic algorithm. The proposed method finds the model parameters quite successfully on synthetic and field data. However, adding 5% Gaussian noise to the synthetic data increases the ambiguity of the model parameter values. For this reason, the results should be checked with a number of statistical tests. In this study, we use the probability density function within a 95% confidence interval, the parameter variation over iterations, and the frequency distribution of the model parameters to reduce the ambiguity. The results are promising, and the proposed method can be used for evaluating one-dimensional direct current data in anisotropic media.
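
    A minimal sketch of the particle swarm update is given below, with `f` standing in for the misfit between observed and modelled resistivity data over parameters such as horizontal and vertical resistivities and thickness; the inertia and acceleration constants are common textbook values, not the paper's settings.

    ```python
    import random

    # Minimal PSO sketch: particles track personal bests and the global best.
    def pso(f, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
        x = [[random.uniform(-1, 1) for _ in range(dim)]
             for _ in range(n_particles)]
        v = [[0.0] * dim for _ in range(n_particles)]
        pbest = [p[:] for p in x]
        gbest = min(pbest, key=f)
        for _ in range(iters):
            for i in range(n_particles):
                for d in range(dim):
                    v[i][d] = (w * v[i][d]
                               + c1 * random.random() * (pbest[i][d] - x[i][d])
                               + c2 * random.random() * (gbest[d] - x[i][d]))
                    x[i][d] += v[i][d]
                if f(x[i]) < f(pbest[i]):
                    pbest[i] = x[i][:]
            gbest = min(pbest, key=f)
        return gbest
    ```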

  4. Introducing Heuristics of Cultural Dimensions into the Service-Level Technical Communication Classroom

    ERIC Educational Resources Information Center

    Schafer, Robert

    2009-01-01

    A significant problem for practitioners of technical communication is to gain the skills to compete in a global, multicultural work environment. Instructors of technical communication can provide future practitioners with the tools to compete and excel in this global environment by introducing heuristics of cultural dimensions into the…

  5. The Heuristic Sandbox: Developing Teacher Know-How through Play in simSchool

    ERIC Educational Resources Information Center

    Hopper, Susan B.

    2018-01-01

    simSchool is a game-based, virtual, and interactive tool that allows pre-service teachers to acquire new skills while constructing knowledge through experimentation with learning situations. Pre-service teachers develop know-how--or heuristic knowledge--through repeated practice in the "Personality Plus Higher-Order Thinking" module to…

  6. Lagrange Multipliers, Adjoint Equations, the Pontryagin Maximum Principle and Heuristic Proofs

    ERIC Educational Resources Information Center

    Ollerton, Richard L.

    2013-01-01

    Deeper understanding of important mathematical concepts by students may be promoted through the (initial) use of heuristic proofs, especially when the concepts are also related back to previously encountered mathematical ideas or tools. The approach is illustrated by use of the Pontryagin maximum principle which is then illuminated by reference to…

  7. A Heuristic Tool for Teaching Business Writing: Self-Assessment, Knowledge Transfer, and Writing Exercises

    ERIC Educational Resources Information Center

    Ortiz, Lorelei A.

    2013-01-01

    To teach effective business communication, instructors must target students’ current weaknesses in writing. One method for doing so is by assigning writing exercises. When used heuristically, writing exercises encourage students to practice self-assessment, self-evaluation, active learning, and knowledge transfer, all while reinforcing the basics…

  8. Developing Critical Thinking Skills Using the Science Writing Heuristic in the Chemistry Laboratory

    ERIC Educational Resources Information Center

    Stephenson, N. S.; Sadler-McKnight, N. P.

    2016-01-01

    The Science Writing Heuristic (SWH) laboratory approach is a teaching and learning tool which combines writing, inquiry, collaboration and reflection, and provides scaffolding for the development of critical thinking skills. In this study, the California Critical Thinking Skills Test (CCTST) was used to measure the critical thinking skills of…

  9. MetaMeta: integrating metagenome analysis tools to improve taxonomic profiling.

    PubMed

    Piro, Vitor C; Matschkowski, Marcel; Renard, Bernhard Y

    2017-08-14

    Many metagenome analysis tools are presently available to classify sequences and profile environmental samples. In particular, taxonomic profiling and binning methods are commonly used for such tasks. Tools in these two categories make use of several techniques, e.g., read mapping, k-mer alignment, and composition analysis. Variations in the construction of the corresponding reference sequence databases are also common. In addition, different tools provide good results on different datasets and configurations. All this variation creates a complicated scenario for researchers deciding which methods to use. Installation, configuration, and execution can also be difficult, especially when dealing with multiple datasets and tools. We propose MetaMeta: a pipeline to execute and integrate results from metagenome analysis tools. MetaMeta provides an easy workflow to run multiple tools with multiple samples, producing a single enhanced output profile for each sample. MetaMeta includes database generation, pre-processing, execution, and integration steps, allowing easy execution and parallelization. The integration relies on the co-occurrence of organisms across different methods as the main feature to improve community profiling while accounting for differences in their databases. In a controlled case with simulated and real data, we show that the integrated profiles of MetaMeta outperform the best single profile. Using the same input data, MetaMeta provides more sensitive and reliable results, with the presence of each organism being supported by several methods. MetaMeta uses Snakemake and has six pre-configured tools, all available at the BioConda channel for easy installation (conda install -c bioconda metameta). The MetaMeta pipeline is open-source and can be downloaded at: https://gitlab.com/rki_bioinformatics.

  10. Heuristic Optimization Approach to Selecting a Transport Connection in City Public Transport

    NASA Astrophysics Data System (ADS)

    Kul'ka, Jozef; Mantič, Martin; Kopas, Melichar; Faltinová, Eva; Kachman, Daniel

    2017-02-01

    The article presents a heuristic optimization approach for selecting a suitable transport connection in the framework of a city public transport system. The methodology was applied to part of the public transport network in Košice, the second largest city in the Slovak Republic, whose public transport network forms a complex system consisting of three transport modes: bus, tram, and trolley-bus. The solution focused on examining the individual transport services and their interconnections at relevant interchange points.

  11. Multicast Routing and Wavelength Assignment with Shared Protection in Multi-Fiber WDM Mesh Networks: Optimal and Heuristic Solutions

    NASA Astrophysics Data System (ADS)

    Woradit, Kampol; Guyot, Matthieu; Vanichchanunt, Pisit; Saengudomlert, Poompat; Wuttisittikulkij, Lunchakorn

    While the problem of multicast routing and wavelength assignment (MC-RWA) in optical wavelength division multiplexing (WDM) networks has been investigated, relatively few researchers have considered network survivability for multicasting. This paper provides an optimization framework to solve the MC-RWA problem in a multi-fiber WDM network that can recover from a single-link failure with shared protection. Using the light-tree (LT) concept to support multicast sessions, we consider two protection strategies that try to reduce service disruptions after a link failure. The first strategy, called light-tree reconfiguration (LTR) protection, computes a new multicast LT for each session affected by the failure. The second strategy, called optical branch reconfiguration (OBR) protection, tries to restore a logical connection between two adjacent multicast members disconnected by the failure. To solve the MC-RWA problem optimally, we propose an integer linear programming (ILP) formulation that minimizes the total number of fibers required for both working and backup traffic. The ILP formulation takes into account joint routing of working and backup traffic, the wavelength continuity constraint, and the limited splitting degree of multicast-capable optical cross-connects (MC-OXCs). After showing some numerical results for optimal solutions, we propose heuristic algorithms that reduce the computational complexity and make the problem solvable for large networks. Numerical results suggest that the proposed heuristic yields efficient solutions compared to optimal solutions obtained from exact optimization.

  12. Design and usability of heuristic‐based deliberation tools for women facing amniocentesis

    PubMed Central

    Durand, Marie‐Anne; Wegwarth, Odette; Boivin, Jacky; Elwyn, Glyn

    2011-01-01

    Background: Evidence suggests that in decision contexts characterized by uncertainty and time constraints (e.g. health-care decisions), fast and frugal decision-making strategies (heuristics) may perform better than complex rules of reasoning. Objective: To examine whether it is possible to design deliberation components in decision support interventions using simple models (fast and frugal heuristics). Design: The ‘Take The Best’ heuristic (i.e. selection of a ‘most important reason’) and the ‘Tallying’ integration algorithm (i.e. unitary weighing of pros and cons) were used to develop two deliberation components embedded in a Web-based decision support intervention for women facing amniocentesis testing. Ten researchers (recruited from 15), nine health-care providers (recruited from 28) and ten pregnant women (recruited from 14) who had recently been offered amniocentesis testing appraised evolving versions of ‘your most important reason’ (Take The Best) and ‘weighing it up’ (Tallying). Results: Most researchers found the tools useful in facilitating decision making, although they emphasized the need for simple instructions and clear layouts. Health-care providers, however, expressed concerns regarding the usability and clarity of the tools. By contrast, 7 out of 10 pregnant women found the tools useful in weighing up the pros and cons of each option and helpful in structuring and clarifying their thoughts and visualizing their decision efforts. Several pregnant women felt that ‘weighing it up’ and ‘your most important reason’ were not appropriate when facing such a difficult and emotional decision. Conclusion: Theoretical approaches based on fast and frugal heuristics can be used to develop deliberation tools that provide helpful support to patients facing real-world decisions about amniocentesis. PMID:21241434

  13. Social heuristics and social roles: Intuition favors altruism for women but not for men.

    PubMed

    Rand, David G; Brescoll, Victoria L; Everett, Jim A C; Capraro, Valerio; Barcelo, Hélène

    2016-04-01

    Are humans intuitively altruistic, or does altruism require self-control? A theory of social heuristics, whereby intuitive responses favor typically successful behaviors, suggests that the answer may depend on who you are. In particular, evidence suggests that women are expected to behave altruistically, and are punished for failing to be altruistic, to a much greater extent than men. Thus, women (but not men) may internalize altruism as their intuitive response. Indeed, a meta-analysis of 13 new experiments and 9 experiments from other groups found that promoting intuition relative to deliberation increased giving in a Dictator Game among women, but not among men (Study 1, N = 4,366). Furthermore, this effect was shown to be moderated by explicit sex role identification (Study 2, N = 1,831): the more women described themselves using traditionally masculine attributes (e.g., dominance, independence) relative to traditionally feminine attributes (e.g., warmth, tenderness), the more deliberation reduced their altruism. Our findings shed light on the connection between gender and altruism, and highlight the importance of social heuristics in human prosociality. (c) 2016 APA, all rights reserved.

  14. A Comparison of Heuristic and Human Performance on Open Versions of the Traveling Salesperson Problem

    ERIC Educational Resources Information Center

    MacGregor, James N.; Chronicle, Edward P.; Ormerod, Thomas C.

    2006-01-01

    We compared the performance of three heuristics with that of subjects on variants of a well-known combinatorial optimization task, the Traveling Salesperson Problem (TSP). The present task consisted of finding the shortest path through an array of points from one side of the array to the other. Like the standard TSP, the task is computationally…

  15. A three-stage heuristic for harvest scheduling with access road network development

    Treesearch

    Mark M. Clark; Russell D. Meller; Timothy P. McDonald

    2000-01-01

    In this article we present a new model for the scheduling of forest harvesting with spatial and temporal constraints. Our approach is unique in that we incorporate access road network development into the harvest scheduling selection process. Due to the difficulty of solving the problem optimally, we develop a heuristic that consists of a solution construction stage...

  16. Efficient Network Coding-Based Loss Recovery for Reliable Multicast in Wireless Networks

    NASA Astrophysics Data System (ADS)

    Chi, Kaikai; Jiang, Xiaohong; Ye, Baoliu; Horiguchi, Susumu

    Recently, network coding has been applied to the loss recovery of reliable multicast in wireless networks [19], where multiple lost packets are XOR-ed together as one packet and forwarded via single retransmission, resulting in a significant reduction of bandwidth consumption. In this paper, we first prove that maximizing the number of lost packets for XOR-ing, which is the key part of the available network coding-based reliable multicast schemes, is actually a complex NP-complete problem. To address this limitation, we then propose an efficient heuristic algorithm for finding an approximately optimal solution of this optimization problem. Furthermore, we show that the packet coding principle of maximizing the number of lost packets for XOR-ing sometimes cannot fully exploit the potential coding opportunities, and we then further propose new heuristic-based schemes with a new coding principle. Simulation results demonstrate that the heuristic-based schemes have very low computational complexity and can achieve almost the same transmission efficiency as the current coding-based high-complexity schemes. Furthermore, the heuristic-based schemes with the new coding principle not only have very low complexity, but also slightly outperform the current high-complexity ones.
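
    Because maximizing the number of lost packets to XOR is NP-complete, a greedy sketch conveys the flavour of such a heuristic (an illustrative stand-in, not the authors' algorithm): grow the XOR set while every receiver still misses at most one packet in it, so each receiver can recover its own loss from the combined retransmission.

    ```python
    # Minimal greedy sketch for picking packets to XOR into one retransmission.
    # loss[r]: the set of packet ids receiver r failed to get (assumed input).
    def greedy_xor_set(loss):
        all_lost = sorted(set().union(*loss.values()))
        chosen = []
        for p in all_lost:
            trial = chosen + [p]
            # decodable for r iff r misses at most one packet in the XOR set
            if all(len(lost.intersection(trial)) <= 1 for lost in loss.values()):
                chosen = trial
        return chosen

    # e.g. greedy_xor_set({"r1": {1}, "r2": {2}, "r3": {3, 4}}) -> [1, 2, 3]
    ```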

  17. An Improved Heuristic Method for Subgraph Isomorphism Problem

    NASA Astrophysics Data System (ADS)

    Xiang, Yingzhuo; Han, Jiesi; Xu, Haijiang; Guo, Xin

    2017-09-01

    This paper focuses on the subgraph isomorphism (SI) problem. We present an improved genetic algorithm, a heuristic method to search for the optimal solution. The contribution of this paper is that we design a dedicated crossover algorithm and a new fitness function to measure the evolution process. Experiments show that our improved genetic algorithm performs better than other heuristic methods. For large graphs, such as subgraphs of 40 nodes, our algorithm outperforms the traditional tree search algorithms. We find that the performance of our improved genetic algorithm does not decrease as the number of nodes in the prototype graphs grows.

  18. Single-cultivar extra virgin olive oil classification using a potentiometric electronic tongue.

    PubMed

    Dias, Luís G; Fernandes, Andreia; Veloso, Ana C A; Machado, Adélio A S C; Pereira, José A; Peres, António M

    2014-10-01

    Label authentication of monovarietal extra virgin olive oils is of great importance. A novel approach based on a potentiometric electronic tongue is proposed to classify oils obtained from single olive cultivars (Portuguese cvs. Cobrançosa, Madural, Verdeal Transmontana; Spanish cvs. Arbequina, Hojiblanca, Picual). A meta-heuristic simulated annealing algorithm was applied to select the most informative sets of sensors to establish predictive linear discriminant models. Olive oils were correctly classified according to olive cultivar (sensitivities greater than 97%) and each Spanish olive oil was satisfactorily discriminated from the Portuguese ones with the exception of cv. Arbequina (sensitivities from 61% to 98%). Also, the discriminant ability was related to the polar compounds contents of olive oils and so, indirectly, with organoleptic properties like bitterness, astringency or pungency. Therefore the proposed E-tongue can be foreseen as a useful auxiliary tool for trained sensory panels for the classification of monovarietal extra virgin olive oils. Copyright © 2014 Elsevier Ltd. All rights reserved.
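
    A minimal sketch of the selection idea, under stated assumptions: simulated annealing over fixed-size sensor subsets scored by cross-validated linear discriminant analysis, with sensor signals `X` and cultivar labels `y` assumed given and an illustrative linear cooling schedule.

    ```python
    import math
    import random
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score

    # Minimal simulated-annealing sensor selection sketch (X: samples x sensors).
    def sa_sensor_selection(X, y, n_keep=10, iters=500, t0=0.1):
        n = X.shape[1]

        def acc(cols):  # cross-validated classification accuracy
            return cross_val_score(LinearDiscriminantAnalysis(),
                                   X[:, cols], y, cv=5).mean()

        cur = random.sample(range(n), n_keep)
        cur_acc = acc(cur)
        best, best_acc = cur[:], cur_acc
        for k in range(iters):
            temp = t0 * (1 - k / iters)            # linear cooling
            cand = cur[:]                          # swap one sensor at random
            cand[random.randrange(n_keep)] = random.choice(
                [i for i in range(n) if i not in cur])
            cand_acc = acc(cand)
            gain = cand_acc - cur_acc
            if gain > 0 or random.random() < math.exp(gain / max(temp, 1e-9)):
                cur, cur_acc = cand, cand_acc
                if cur_acc > best_acc:
                    best, best_acc = cur[:], cur_acc
        return best, best_acc
    ```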

  19. Fabrication, characterization, and heuristic trade space exploration of magnetically actuated Miura-Ori origami structures

    NASA Astrophysics Data System (ADS)

    Cowan, Brett; von Lockette, Paris R.

    2017-04-01

    The authors develop magnetically actuated Miura-Ori structures through observation, experiment, and computation using an initially heuristic strategy followed by trade space visualization and optimization. The work is novel, especially within origami engineering, in that beyond final target shape approximation, Miura-Ori structures in this work are additionally evaluated for the shape approximation while folding and for their efficient use of their embedded actuators. The structures consisted of neodymium magnets placed on the panels of silicone elastomer substrates cast in the Miura-Ori folding pattern. Initially four configurations, arrangements of magnets on the panels, were selected based on heuristic arguments that (1) maximized the amount of magnetic torque applied to the creases and (2) reduced the number of magnets needed to affect all creases in the pattern. The results of experimental and computational performance metrics were used in a weighted sum model to predict the optimum configuration, which was then fabricated and experimentally characterized for comparison to the initial prototypes. As expected, optimization of magnet placement and orientation was effective at increasing the degree of theoretical useful work. Somewhat unexpectedly, however, trade space results showed that even after optimization, the configuration with the most number of magnets was least effective, per magnet, at directing its actuation to the structure’s creases. Overall, though the winning configuration experimentally outperformed its initial, non-optimal counterparts, results showed that the choice of optimum configuration was heavily dependent on the weighting factors. These results highlight both the ability of the Miura-Ori to be actuated with external magnetic stimuli, the effectiveness of a heuristic design approach that focuses on the actuation mechanism, and the need to address path-dependent metrics in assessing performance in origami folding structures.
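
    The weighted sum model at the heart of the configuration choice is simple to sketch; the metrics and weights below are placeholders, not the study's.

    ```python
    # Minimal weighted sum model sketch: normalize each metric to [0, 1], then
    # rank configurations by their weighted score (larger metric = better here).
    def weighted_sum(configs, weights):
        metrics = list(weights)
        lo_hi = {m: (min(c[m] for c in configs.values()),
                     max(c[m] for c in configs.values())) for m in metrics}

        def norm(m, v):
            lo, hi = lo_hi[m]
            return 0.0 if hi == lo else (v - lo) / (hi - lo)

        return max(configs, key=lambda k: sum(weights[m] * norm(m, configs[k][m])
                                              for m in metrics))
    ```

    The paper's observation that the winner depends heavily on the weighting factors falls out directly: rerunning with different `weights` can change which configuration scores best.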

  20. On the Psychology of the Recognition Heuristic: Retrieval Primacy as a Key Determinant of Its Use

    ERIC Educational Resources Information Center

    Pachur, Thorsten; Hertwig, Ralph

    2006-01-01

    The recognition heuristic is a prime example of a boundedly rational mind tool that rests on an evolved capacity, recognition, and exploits environmental structures. When originally proposed, it was conjectured that no other probabilistic cue reverses the recognition-based inference (D. G. Goldstein & G. Gigerenzer, 2002). More recent studies…

  1. Optimally Stopped Optimization

    NASA Astrophysics Data System (ADS)

    Vinci, Walter; Lidar, Daniel

    We combine the fields of heuristic optimization and optimal stopping. We propose a strategy for benchmarking randomized optimization algorithms that minimizes the expected total cost for obtaining a good solution with an optimal number of calls to the solver. To do so, rather than letting the objective function alone define a cost to be minimized, we introduce a further cost-per-call of the algorithm. We show that this problem can be formulated using optimal stopping theory. The expected cost is a flexible figure of merit for benchmarking probabilistic solvers that can be computed when the optimal solution is not known, and that avoids the biases and arbitrariness that affect other measures. The optimal stopping formulation of benchmarking directly leads to a real-time, optimal-utilization strategy for probabilistic optimizers with practical impact. We apply our formulation to benchmark the performance of a D-Wave 2X quantum annealer and the HFS solver, a specialized classical heuristic algorithm designed for low tree-width graphs. On a set of frustrated-loop instances with planted solutions defined on up to N = 1098 variables, the D-Wave device is one to two orders of magnitude faster than the HFS solver.
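
    The expected-cost figure of merit is easy to estimate empirically. A minimal sketch, assuming a fixed cost `c` per call and an i.i.d. sample of past solver outcomes: score each stopping point n by c·n plus a bootstrapped estimate of the best of n outcomes (the paper derives the stopping strategy from optimal stopping theory rather than by brute-force scanning).

    ```python
    import random

    # Minimal sketch: pick the number of solver calls minimizing expected cost.
    def optimal_calls(outcomes, c, n_max=50, trials=2000):
        def expected_cost(n):
            # bootstrap E[min of n outcomes] from the empirical sample
            mins = [min(random.choices(outcomes, k=n)) for _ in range(trials)]
            return c * n + sum(mins) / trials
        return min(range(1, n_max + 1), key=expected_cost)
    ```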

  2. Modeling and optimization by particle swarm embedded neural network for adsorption of zinc (II) by palm kernel shell based activated carbon from aqueous environment.

    PubMed

    Karri, Rama Rao; Sahu, J N

    2018-01-15

    Zn (II) is one of the common heavy metal pollutants found in industrial effluents. Removal of pollutants from industrial effluents can be accomplished by various techniques, of which adsorption has been found to be an efficient method. The application of adsorption is limited by the high cost of adsorbents. In this regard, a low-cost adsorbent produced from palm oil kernel shell based agricultural waste is examined for its efficiency in removing Zn (II) from waste water and aqueous solution. The influence of independent process variables such as initial concentration, pH, residence time, activated carbon (AC) dosage, and process temperature on the removal of Zn (II) by palm kernel shell based AC in a batch adsorption process is studied systematically. Based on the design-of-experiments matrix, 50 experimental runs are performed with each process variable in the experimental range. The optimal values of the process variables for achieving maximum removal efficiency are studied using response surface methodology (RSM) and artificial neural network (ANN) approaches. A quadratic model, consisting of first-order and second-order regression terms, is developed using analysis of variance in the RSM-CCD framework. Particle swarm optimization, a meta-heuristic method, is embedded in the ANN architecture to optimize the search over the network's weight space. The optimized trained neural network describes the testing and validation data well, with R^2 equal to 0.9106 and 0.9279, respectively. The outcomes indicate the superiority of the ANN-PSO model predictions over the quadratic model predictions provided by RSM. Copyright © 2017 Elsevier Ltd. All rights reserved.
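
    To make the embedding concrete, the sketch below builds the objective a swarm optimizer can minimize: the flattened weight vector of a hypothetical one-hidden-layer network scored by mean squared error; the paper's actual architecture, data handling, and PSO coupling differ.

    ```python
    import math

    # Minimal sketch: an ANN-weights objective for a swarm optimizer.
    # X: list of input rows, y: list of targets (assumed given).
    def make_ann_objective(X, y, n_hidden=5):
        n_in = len(X[0])
        n_w = n_hidden * (n_in + 1) + n_hidden + 1   # weights plus biases

        def mse(w):
            err = 0.0
            for xi, yi in zip(X, y):
                hidden = []
                for j in range(n_hidden):
                    base = j * (n_in + 1)            # this unit's weights + bias
                    s = w[base + n_in] + sum(w[base + k] * xi[k]
                                             for k in range(n_in))
                    hidden.append(math.tanh(s))
                out = n_hidden * (n_in + 1)          # output layer weights
                pred = w[-1] + sum(w[out + j] * hidden[j]
                                   for j in range(n_hidden))
                err += (pred - yi) ** 2
            return err / len(X)

        return mse, n_w  # hand mse and the dimension n_w to any PSO routine
    ```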

  3. Heuristic rules embedded genetic algorithm for in-core fuel management optimization

    NASA Astrophysics Data System (ADS)

    Alim, Fatih

    The objective of this study was to develop a unique methodology and a practical tool for designing the loading pattern (LP) and burnable poison (BP) pattern for a given Pressurized Water Reactor (PWR) core. Because of the large number of possible combinations for the fuel assembly (FA) loading in the core, the design of the core configuration is a complex optimization problem. It requires finding an optimal FA arrangement and BP placement in order to achieve maximum cycle length while satisfying the safety constraints. Genetic Algorithms (GA) have already been used to solve the LP optimization problem for both PWRs and Boiling Water Reactors (BWR). The GA, a stochastic method, works with a group of solutions and uses random variables to make decisions. Based on the theory of evolution, the GA involves natural selection and reproduction of the individuals in the population for the next generation. The GA works by creating an initial population, evaluating it, and then improving the population by using evolutionary operators. To solve this optimization problem, an LP optimization package, the GARCO (Genetic Algorithm Reactor Code Optimization) code, is developed in the framework of this thesis. This code is applicable to all types of PWR cores having different geometries and structures, with an unlimited number of FA types in the inventory. To reach this goal, an innovative GA is developed by modifying the classical representation of the genotype. To obtain the best result in a shorter time, not only the representation but also the algorithm is changed, so as to use in-core fuel management heuristic rules. The improved GA code was tested to demonstrate and verify the advantages of the new enhancements. The developed methodology is explained in this thesis, and preliminary results are shown for the hexagonal-geometry VVER-1000 reactor core and the TMI-1 PWR. The core physics code used for the VVER in this research is Moby-Dick, which was developed by SKODA Inc. to analyze the VVER. The SIMULATE-3 code, an advanced two-group nodal code, is used to analyze the TMI-1.

  4. Model Specification Searches Using Ant Colony Optimization Algorithms

    ERIC Educational Resources Information Center

    Marcoulides, George A.; Drezner, Zvi

    2003-01-01

    Ant colony optimization is a recently proposed heuristic procedure inspired by the behavior of real ants. This article applies the procedure to model specification searches in structural equation modeling and reports the results. The results demonstrate the capabilities of ant colony optimization algorithms for conducting automated searches.
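
    A minimal sketch of the idea on binary include/exclude decisions, the flavour relevant to specification searches: pheromone levels bias which parameters each ant frees, and the best model found reinforces them. Here `score` would wrap a model-fit criterion; this is an illustrative stand-in, not the authors' exact procedure.

    ```python
    import random

    # Minimal ant colony sketch over n_items binary include/exclude decisions.
    def aco_select(score, n_items, n_ants=20, iters=100, rho=0.1):
        tau = [1.0] * n_items                  # pheromone per decision
        best, best_score = None, float("-inf")
        for _ in range(iters):
            for _ in range(n_ants):
                # include item i with probability tau[i] / (tau[i] + 1)
                pick = [random.random() < tau[i] / (tau[i] + 1.0)
                        for i in range(n_items)]
                s = score(pick)
                if s > best_score:
                    best, best_score = pick, s
            # evaporate pheromone, then deposit along the best solution so far
            tau = [(1 - rho) * t + (rho if best[i] else 0.0)
                   for i, t in enumerate(tau)]
        return best, best_score
    ```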

  5. Performance index and meta-optimization of a direct search optimization method

    NASA Astrophysics Data System (ADS)

    Krus, P.; Ölvander, J.

    2013-10-01

    Design optimization is becoming an increasingly important design tool, often using simulation as part of the evaluation of the objective function. A measure of the efficiency of an optimization algorithm is of great importance when comparing methods. The main contribution of this article is the introduction of a single performance criterion, the entropy rate index based on Shannon's information theory, taking both reliability and rate of convergence into account. It can also be used to characterize the difficulty of different optimization problems. Such a performance criterion can also be used to optimize the optimization algorithm itself. In this article the Complex-RF optimization method is described and its performance evaluated and optimized using the established performance criterion. Finally, in order to predict the resources needed for optimization, an objective function temperament factor is defined that indicates the degree of difficulty of the objective function.

  6. L.U.St: a tool for approximated maximum likelihood supertree reconstruction.

    PubMed

    Akanni, Wasiu A; Creevey, Christopher J; Wilkinson, Mark; Pisani, Davide

    2014-06-12

    Supertrees combine disparate, partially overlapping trees to generate a synthesis that provides a high-level perspective that cannot be attained from the inspection of individual phylogenies. Supertrees can be seen as meta-analytical tools that can be used to make inferences based on the results of previous scientific studies. Their meta-analytical application has increased in popularity since it was realised that the power of statistical tests for the study of evolutionary trends critically depends on the use of taxon-dense phylogenies. Furthermore, supertrees have found applications in phylogenomics, where they are used to combine gene trees and recover species phylogenies based on genome-scale data sets. Here, we present the L.U.St package, a python tool for approximate maximum likelihood supertree inference, and illustrate its application using a genomic data set for the placental mammals. L.U.St allows the calculation of the approximate likelihood of a supertree given a set of input trees, performs heuristic searches to look for the supertree of highest likelihood, and performs statistical tests of two or more supertrees. To this end, L.U.St implements a winning-sites test allowing ranking of a collection of a priori selected hypotheses, given as a collection of input supertree topologies. It also outputs a file of input-tree-wise likelihood scores that can be used as input to CONSEL for calculation of standard tests of two trees (e.g. Kishino-Hasegawa, Shimodaira-Hasegawa and Approximately Unbiased tests). This is the first fully parametric implementation of a supertree method; it has clearly understood properties and provides several advantages over currently available supertree approaches. It is easy to implement and works on any platform that has python installed. bitBucket page - https://afro-juju@bitbucket.org/afro-juju/l.u.st.git. Davide.Pisani@bristol.ac.uk.

  7. Offshore wind farm layout optimization

    NASA Astrophysics Data System (ADS)

    Elkinton, Christopher Neil

    Offshore wind energy technology is maturing in Europe and is poised to make a significant contribution to the U.S. energy production portfolio. Building on the knowledge the wind industry has gained to date, this dissertation investigates the influences of different site conditions on offshore wind farm micrositing---the layout of individual turbines within the boundaries of a wind farm. For offshore wind farms, these conditions include, among others, the wind and wave climates, water depths, and soil conditions at the site. An analysis tool has been developed that is capable of estimating the cost of energy (COE) from offshore wind farms. For this analysis, the COE has been divided into several modeled components: major costs (e.g. turbines, electrical interconnection, maintenance, etc.), energy production, and energy losses. By treating these component models as functions of site-dependent parameters, the analysis tool can investigate the influence of these parameters on the COE. Some parameters result in simultaneous increases of both energy and cost. In these cases, the analysis tool was used to determine the value of the parameter that yielded the lowest COE and, thus, the best balance of cost and energy. The models have been validated and generally compare favorably with existing offshore wind farm data. The analysis technique was then paired with optimization algorithms to form a tool with which to design offshore wind farm layouts for which the COE was minimized. Greedy heuristic and genetic optimization algorithms have been tuned and implemented. The use of these two algorithms in series has been shown to produce the best, most consistent solutions. The influences of site conditions on the COE have been studied further by applying the analysis and optimization tools to the initial design of a small offshore wind farm near the town of Hull, Massachusetts. The results of an initial full-site analysis and optimization were used to constrain the boundaries of the farm. A more thorough optimization highlighted the features of the area that would result in a minimized COE. The results showed reasonable layout designs and COE estimates that are consistent with existing offshore wind farms.

  8. External validity and anchoring heuristics: application of DUNDRUM-1 to secure service gatekeeping in South Wales.

    PubMed

    Lawrence, Daniel; Davies, Tracey-Lee; Bagshaw, Ruth; Hewlett, Paul; Taylor, Pamela; Watt, Andrew

    2018-02-01

    Aims and method: Structured clinical judgement tools provide scope for the standardisation of forensic service gatekeeping and also allow identification of heuristics in this decision process. The DUNDRUM-1 triage tool was completed retrospectively for 121 first-time referrals to forensic services in South Wales. Fifty were admitted to medium security, 49 to low security and 22 remained in open conditions. Results: DUNDRUM-1 total scores differed appropriately between different levels of security. However, regression revealed heuristic anchoring on the 'legal process' and 'immediacy of risk due to mental disorder' items. Clinical implications: Patient placement was broadly aligned with DUNDRUM-1 recommendations. However, not all triage items informed gatekeeping decisions. It remains to be seen whether decisions anchored in this way are effective. Declaration of interest: Dr Mark Freestone gave permission for AUC values from Freestone et al. (2015) to be presented here for comparison.

  9. SPARSE: quadratic time simultaneous alignment and folding of RNAs without sequence-based heuristics.

    PubMed

    Will, Sebastian; Otto, Christina; Miladi, Milad; Möhl, Mathias; Backofen, Rolf

    2015-08-01

    RNA-Seq experiments have revealed a multitude of novel ncRNAs. The gold standard for their analysis based on simultaneous alignment and folding suffers from extreme time complexity of [Formula: see text]. Subsequently, numerous faster 'Sankoff-style' approaches have been suggested. Commonly, the performance of such methods relies on sequence-based heuristics that restrict the search space to optimal or near-optimal sequence alignments; however, the accuracy of sequence-based methods breaks down for RNAs with sequence identities below 60%. Alignment approaches like LocARNA that do not require sequence-based heuristics have been limited to high complexity ([Formula: see text] quartic time). Breaking this barrier, we introduce the novel Sankoff-style algorithm 'sparsified prediction and alignment of RNAs based on their structure ensembles (SPARSE)', which runs in quadratic time without sequence-based heuristics. To achieve this low complexity, on par with sequence alignment algorithms, SPARSE features strong sparsification based on structural properties of the RNA ensembles. Following PMcomp, SPARSE gains further speed-up from lightweight energy computation. Although all existing lightweight Sankoff-style methods restrict Sankoff's original model by disallowing loop deletions and insertions, SPARSE transfers the Sankoff algorithm to the lightweight energy model completely for the first time. Compared with LocARNA, SPARSE achieves similar alignment and better folding quality in significantly less time (speedup: 3.7). At similar run-time, it aligns low sequence identity instances substantially more accurately than RAF, which uses sequence-based heuristics. © The Author 2015. Published by Oxford University Press.

  10. Performance tradeoffs in static and dynamic load balancing strategies

    NASA Technical Reports Server (NTRS)

    Iqbal, M. A.; Saltz, J. H.; Bokhari, S. H.

    1986-01-01

    The problem of uniformly distributing the load of a parallel program over a multiprocessor system was considered. A program was analyzed whose structure permits the computation of the optimal static solution. Then four strategies for load balancing were described and their performance compared. The strategies are: (1) the optimal static assignment algorithm, which is guaranteed to yield the best static solution; (2) the static binary dissection method, which is very fast but sub-optimal; (3) the greedy algorithm, a static fully polynomial time approximation scheme, which approximates the optimal solution to arbitrary accuracy; and (4) the predictive dynamic load balancing heuristic, which uses information on the precedence relationships within the program and outperforms any of the static methods. It is also shown that the overhead incurred by the dynamic heuristic is reduced considerably if it is started off with a static assignment provided by any of the other three strategies.

  11. Précis of Simple heuristics that make us smart.

    PubMed

    Todd, P M; Gigerenzer, G

    2000-10-01

    How can anyone be rational in a world where knowledge is limited, time is pressing, and deep thought is often an unattainable luxury? Traditional models of unbounded rationality and optimization in cognitive science, economics, and animal behavior have tended to view decision-makers as possessing supernatural powers of reason, limitless knowledge, and endless time. But understanding decisions in the real world requires a more psychologically plausible notion of bounded rationality. In Simple heuristics that make us smart (Gigerenzer et al. 1999), we explore fast and frugal heuristics--simple rules in the mind's adaptive toolbox for making decisions with realistic mental resources. These heuristics can enable both living organisms and artificial systems to make smart choices quickly and with a minimum of information by exploiting the way that information is structured in particular environments. In this précis, we show how simple building blocks that control information search, stop search, and make decisions can be put together to form classes of heuristics, including: ignorance-based and one-reason decision making for choice, elimination models for categorization, and satisficing heuristics for sequential search. These simple heuristics perform comparably to more complex algorithms, particularly when generalizing to new data--that is, simplicity leads to robustness. We present evidence regarding when people use simple heuristics and describe the challenges to be addressed by this research program.

  12. Heuristic approach to image registration

    NASA Astrophysics Data System (ADS)

    Gertner, Izidor; Maslov, Igor V.

    2000-08-01

    Image registration, i.e. the correct mapping of images obtained from different sensor readings onto a common reference frame, is a critical part of multi-sensor ATR/AOR systems based on readings from different types of sensors. In order to fuse two different sensor readings of the same object, the readings have to be put into a common coordinate system. This task can be formulated as an optimization problem over the space of all possible affine transformations of an image. In this paper, a combination of heuristic methods is explored to register gray-scale images. A modification of the Genetic Algorithm is used as the first step in the global search for the optimal transformation. It covers the entire search space with (randomly or heuristically) scattered probe points and helps significantly reduce the search space to a subspace of potentially most successful transformations. Due to its discrete character, however, the Genetic Algorithm in general cannot converge once it comes close to the optimum. Its termination point can be specified either as some predefined number of generations or as the achievement of a certain acceptable convergence level. To refine the search, the potentially optimal subspaces are then searched with Tabu Search and Simulated Annealing, methods that are more delicate and efficient for local search.
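
    The two-stage structure described here translates directly into code. The sketch below is a minimal illustration, with random probing standing in for the GA stage and a plain simulated-annealing loop standing in for the Tabu/SA refinement; the similarity-transform parameterization, parameter ranges, and cooling schedule are illustrative assumptions, not the authors' settings.

    ```python
    import numpy as np
    from scipy.ndimage import affine_transform

    def warp(img, theta, tx, ty, scale):
        """Apply a similarity transform (rotation, translation, scale)."""
        c, s = np.cos(theta), np.sin(theta)
        mat = np.array([[c, -s], [s, c]]) / scale
        return affine_transform(img, mat, offset=(tx, ty), order=1)

    def mismatch(ref, mov, params):
        """Registration cost: mean squared intensity difference."""
        return np.mean((ref - warp(mov, *params)) ** 2)

    def register(ref, mov, n_probes=200, n_anneal=500, seed=0):
        rng = np.random.default_rng(seed)
        lo = np.array([-np.pi, -20.0, -20.0, 0.8])
        hi = np.array([ np.pi,  20.0,  20.0, 1.25])
        # Stage 1: scattered probe points over the whole transform space.
        probes = rng.uniform(lo, hi, size=(n_probes, 4))
        best = min(probes, key=lambda p: mismatch(ref, mov, p))
        cost = mismatch(ref, mov, best)
        # Stage 2: simulated-annealing refinement around the best probe.
        T = 1.0
        for _ in range(n_anneal):
            cand = np.clip(best + rng.normal(0, 0.05, 4) * (hi - lo), lo, hi)
            c = mismatch(ref, mov, cand)
            if c < cost or rng.random() < np.exp((cost - c) / T):
                best, cost = cand, c
            T *= 0.99  # geometric cooling
        return best, cost
    ```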

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Firestone, Ryan; Marnay, Chris

    The on-site generation of electricity can offer building owners and occupiers financial benefits as well as social benefits such as reduced grid congestion, improved energy efficiency, and reduced greenhouse gas emissions. Combined heat and power (CHP), or cogeneration, systems make use of the waste heat from the generator for site heating needs. Real-time optimal dispatch of CHP systems is difficult to determine because of complicated electricity tariffs and uncertainty in CHP equipment availability, energy prices, and system loads. Typically, CHP systems use simple heuristic control strategies. This paper describes a method of determining optimal control in real-time and applies it to a light industrial site in San Diego, California, to examine: 1) the added benefit of optimal over heuristic controls, 2) the price elasticity of the system, and 3) the site-attributable greenhouse gas emissions, all under three different tariff structures. Results suggest that heuristic controls are adequate under the current tariff structure and relatively high electricity prices, capturing 97 percent of the value of the distributed generation system. Even more value could be captured by simply not running the CHP system during times of unusually high natural gas prices. Under hypothetical real-time pricing of electricity, heuristic controls would capture only 70 percent of the value of distributed generation.

  14. Optimism in the face of uncertainty supported by a statistically-designed multi-armed bandit algorithm.

    PubMed

    Kamiura, Moto; Sano, Kohei

    2017-10-01

    The principle of optimism in the face of uncertainty is known as a heuristic in sequential decision-making problems. The Overtaking method based on this principle is an effective algorithm for solving multi-armed bandit problems, but in previous work it was defined only through a set of heuristic formulation patterns. The objective of the present paper is to redefine the value functions of the Overtaking method and to unify their formulation. The unified Overtaking method is associated with upper bounds of confidence intervals of expected rewards in the statistical sense, and this unification enhances the universality of the method. Consequently, we obtain a new version of the Overtaking method for exponentially distributed rewards, analyze it numerically, and show that it outperforms the UCB algorithm on average. The present study suggests that the principle of optimism in the face of uncertainty should be regarded as the statistics-based consequence of the law of large numbers for the sample mean of rewards and the estimation of upper bounds of expected rewards, rather than as a heuristic, in the context of multi-armed bandit problems. Copyright © 2017 Elsevier B.V. All rights reserved.
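
    For context, the UCB baseline mentioned above can be sketched in a few lines. This is the standard UCB1 rule, not the Overtaking method itself, and the exponential reward distributions are just an illustrative setup matching the study's setting.

    ```python
    import math, random

    def ucb1(pull, n_arms, horizon):
        """Play each arm once, then pick the arm maximizing mean + sqrt(2 ln t / n)."""
        counts = [0] * n_arms
        sums = [0.0] * n_arms
        for a in range(n_arms):          # initialization: one pull per arm
            sums[a] += pull(a); counts[a] = 1
        for t in range(n_arms, horizon):
            ucb = [sums[a] / counts[a] + math.sqrt(2 * math.log(t + 1) / counts[a])
                   for a in range(n_arms)]
            a = max(range(n_arms), key=lambda i: ucb[i])
            sums[a] += pull(a); counts[a] += 1
        return sums, counts

    # Example with exponentially distributed rewards.
    means = [1.0, 1.5, 2.0]
    pull = lambda a: random.expovariate(1.0 / means[a])
    sums, counts = ucb1(pull, n_arms=3, horizon=10_000)
    print(counts)  # the highest-mean arm should dominate
    ```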

  15. Sidelobe reduction and capacity improvement of open-loop collaborative beamforming in wireless sensor networks

    PubMed Central

    2017-01-01

    Collaborative beamforming (CBF) with a finite number of collaborating nodes (CNs) produces sidelobes that are highly dependent on the collaborating nodes' locations. The sidelobes cause interference and affect the communication rate of unintended receivers located within the transmission range. Nulling is not possible in an open-loop CBF since the collaborating nodes are unable to receive feedback from the receivers. Hence, overall sidelobe reduction is required to avoid interference in the directions of the unintended receivers. However, the impact of sidelobe reduction on the capacity improvement at the unintended receiver has never been reported in previous works. In this paper, the effect of peak sidelobe (PSL) reduction in CBF on the capacity of an unintended receiver is analyzed. Three meta-heuristic optimization methods are applied to perform PSL minimization, namely the genetic algorithm (GA), particle swarm optimization (PSO) and a simplified version of PSO called the weightless swarm algorithm (WSA). An average reduction of 20 dB in PSL alongside a 162% capacity improvement is achieved in the worst case scenario with the WSA optimization. It is discovered that PSL minimization in CBF provides capacity improvement at an unintended receiver only if the CBF cluster is small and dense. PMID:28464000

  16. Automatically Generated Algorithms for the Vertex Coloring Problem

    PubMed Central

    Contreras Bolton, Carlos; Gatica, Gustavo; Parada, Víctor

    2013-01-01

    The vertex coloring problem is a classical problem in combinatorial optimization that consists of assigning a color to each vertex of a graph such that no adjacent vertices share the same color, minimizing the number of colors used. Despite the various practical applications that exist for this problem, its NP-hardness still represents a computational challenge. Some of the best computational results obtained for this problem are consequences of hybridizing the various known heuristics. Automatically exploring the space formed by combining these techniques to find the most adequate combination has received less attention. In this paper, we propose exploring the heuristics space for the vertex coloring problem using evolutionary algorithms. We automatically generate three new algorithms by combining elementary heuristics. To evaluate the new algorithms, a computational experiment was performed that allowed comparing them numerically with existing heuristics. The obtained algorithms present an average 29.97% relative error, while four other heuristics selected from the literature present a 59.73% error, considering 29 of the more difficult instances in the DIMACS benchmark. PMID:23516506

  17. An efficient heuristic method for dynamic portfolio selection problem under transaction costs and uncertain conditions

    NASA Astrophysics Data System (ADS)

    Najafi, Amir Abbas; Pourahmadi, Zahra

    2016-04-01

    Selecting the optimal combination of assets in a portfolio is one of the most important decisions in investment management. As investment is a long-term activity, considering the portfolio optimization problem over just a single period may forfeit opportunities that could be exploited in a long-term view. Hence, we extend the problem from a single-period to a multi-period model. We include trading costs and uncertain conditions in this model, which makes it more realistic and complex, and we propose an efficient heuristic method to tackle it. The efficiency of the method is examined and compared with the results of rolling single-period optimization and the buy-and-hold method, which shows the superiority of the proposed method.

  18. Optimal Assembly of Psychological and Educational Tests.

    ERIC Educational Resources Information Center

    van der Linden, Wim J.

    1998-01-01

    Reviews optimal test-assembly literature and introduces the contributions to this special issue. Discusses four approaches to computerized test assembly: (1) heuristic-based test assembly; (2) 0-1 linear programming; (3) network-flow programming; and (4) an optimal design approach. Contains a bibliography of 90 sources on test assembly.…

  19. Ckmeans.1d.dp: Optimal k-means Clustering in One Dimension by Dynamic Programming.

    PubMed

    Wang, Haizhou; Song, Mingzhou

    2011-12-01

    The heuristic k-means algorithm, widely used for cluster analysis, does not guarantee optimality. We developed a dynamic programming algorithm for optimal one-dimensional clustering. The algorithm is implemented as an R package called Ckmeans.1d.dp. We demonstrate its advantage in optimality and runtime over the standard iterative k-means algorithm.
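
    The core of the approach is a textbook dynamic program over sorted data: D[m][i], the optimal cost of splitting the first i points into m clusters, is minimized over the start of the last cluster, with cluster costs computed in O(1) from prefix sums. The sketch below is a simple O(k n^2) Python illustration of this recurrence, not the optimized implementation in the R package.

    ```python
    import numpy as np

    def kmeans_1d_dp(x, k):
        x = np.sort(np.asarray(x, dtype=float))
        n = len(x)
        s1 = np.concatenate(([0.0], np.cumsum(x)))        # prefix sums
        s2 = np.concatenate(([0.0], np.cumsum(x * x)))    # prefix sums of squares

        def sse(i, j):  # within-cluster sum of squares for x[i..j], inclusive
            m = j - i + 1
            return (s2[j + 1] - s2[i]) - (s1[j + 1] - s1[i]) ** 2 / m

        D = np.full((k + 1, n + 1), np.inf)  # D[m][i]: best cost of x[0..i-1] in m clusters
        B = np.zeros((k + 1, n + 1), dtype=int)
        D[0][0] = 0.0
        for m in range(1, k + 1):
            for i in range(m, n + 1):
                for j in range(m - 1, i):    # last cluster is x[j..i-1]
                    c = D[m - 1][j] + sse(j, i - 1)
                    if c < D[m][i]:
                        D[m][i], B[m][i] = c, j
        # Recover cluster boundaries by backtracking.
        bounds, i = [], n
        for m in range(k, 0, -1):
            bounds.append((B[m][i], i)); i = B[m][i]
        return D[k][n], bounds[::-1]

    cost, clusters = kmeans_1d_dp([1, 2, 3, 10, 11, 12, 30], k=3)
    print(cost, clusters)  # expected clusters: (0, 3), (3, 6), (6, 7)
    ```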

  20. New insights into diversification of hyper-heuristics.

    PubMed

    Ren, Zhilei; Jiang, He; Xuan, Jifeng; Hu, Yan; Luo, Zhongxuan

    2014-10-01

    There has been a growing research trend of applying hyper-heuristics for problem solving, due to their ability to balance intensification and diversification with low level heuristics. Traditionally, the diversification mechanism is mostly realized by perturbing the incumbent solutions to escape from local optima. In this paper, we report our attempt toward providing a new diversification mechanism based on the concept of instance perturbation. In contrast to existing approaches, the proposed mechanism achieves diversification by perturbing the instance being solved, rather than the solutions. To tackle the challenge of incorporating instance perturbation into hyper-heuristics, we also design a new hyper-heuristic framework HIP-HOP (a recursive acronym: HIP-HOP is an instance perturbation-based hyper-heuristic optimization procedure), which employs a grammar-guided high level strategy to manipulate the low level heuristics. With the expressive power of the grammar, constraints such as the feasibility of the output solution can easily be satisfied. Numerical results and statistical tests over both the Ising spin glass problem and the p-median problem instances show that HIP-HOP is able to achieve promising performance. Furthermore, runtime distribution analysis reveals that, although being relatively slow at the beginning, HIP-HOP is able to achieve competitive solutions once given sufficient time.

  1. Heuristic lipophilicity potential for computer-aided rational drug design: Optimizations of screening functions and parameters

    NASA Astrophysics Data System (ADS)

    Du, Qishi; Mezey, Paul G.

    1998-09-01

    In this research we test and compare three possible atom-based screening functions used in the heuristic molecular lipophilicity potential (HMLP). Screening function 1 is a power distance-dependent function, $b_i / |R_i - r|^{\gamma}$; screening function 2 is an exponential distance-dependent function, $b_i \exp(-|R_i - r| / d_0)$; and screening function 3 is a weighted distance-dependent function, $\mathrm{sign}(b_i)\,\exp\!\big[\xi\,(|R_i - r| / |b_i|)\big]$. For every screening function, the parameters ($\gamma$, $d_0$, and $\xi$) are optimized using 41 common organic molecules of 4 types of compounds: aliphatic alcohols, aliphatic carboxylic acids, aliphatic amines, and aliphatic alkanes. The results of the calculations show that screening function 3 cannot give chemically reasonable results; however, both the power screening function and the exponential screening function give chemically satisfactory results. There are two notable differences between screening functions 1 and 2. First, the exponential screening function has larger values at short distance than the power screening function; therefore more influence from the nearest neighbors is involved using screening function 2 than screening function 1. Second, the power screening function has larger values at long distance than the exponential screening function; therefore screening function 1 is affected by atoms at long distance more than screening function 2. For screening function 1, the suitable range of parameter $d_0$ is $1.5 < d_0 < 3.0$, and $d_0 = 2.0$ is recommended. The HMLP developed in this research provides a potential tool for computer-aided three-dimensional drug design.
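
    A small sketch of the three candidate screening functions as reconstructed above, evaluated for a probe point r; the parameter values and the atom data are illustrative assumptions.

    ```python
    import numpy as np

    def screen1(b, R, r, gamma=2.0):
        """Power distance dependence: b_i / |R_i - r|^gamma."""
        d = np.linalg.norm(R - r, axis=1)
        return b / d ** gamma

    def screen2(b, R, r, d0=2.0):
        """Exponential distance dependence: b_i * exp(-|R_i - r| / d0)."""
        d = np.linalg.norm(R - r, axis=1)
        return b * np.exp(-d / d0)

    def screen3(b, R, r, xi=1.0):
        """Weighted form: sign(b_i) * exp(xi * |R_i - r| / |b_i|)."""
        d = np.linalg.norm(R - r, axis=1)
        return np.sign(b) * np.exp(xi * d / np.abs(b))

    b = np.array([0.5, -0.3, 0.8])                      # atomic lipophilicity parameters
    R = np.random.default_rng(1).normal(size=(3, 3))    # atom coordinates (3 atoms, 3D)
    r = np.zeros(3)                                     # probe point
    print(screen1(b, R, r), screen2(b, R, r), screen3(b, R, r))
    ```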

  2. Joint Power Charging and Routing in Wireless Rechargeable Sensor Networks.

    PubMed

    Jia, Jie; Chen, Jian; Deng, Yansha; Wang, Xingwei; Aghvami, Abdol-Hamid

    2017-10-09

    The development of wireless power transfer (WPT) technology has inspired the transition from traditional battery-based wireless sensor networks (WSNs) towards wireless rechargeable sensor networks (WRSNs). While extensive efforts have been made to improve charging efficiency, little has been done for routing optimization. In this work, we present a joint optimization model to maximize both charging efficiency and routing structure. By analyzing the structure of the optimization model, we first decompose the problem and propose a heuristic algorithm to find the optimal charging efficiency for the predefined routing tree. Furthermore, by coding the many-to-one communication topology as an individual, we further propose to apply a genetic algorithm (GA) for the joint optimization of both routing and charging. The genetic operations, including tree-based recombination and mutation, are proposed to obtain a fast convergence. Our simulation results show that the heuristic algorithm reduces the number of resident locations and the total moving distance. We also show that our proposed algorithm achieves a higher charging efficiency compared with existing algorithms.

  3. Sniffer Channel Selection for Monitoring Wireless LANs

    NASA Astrophysics Data System (ADS)

    Song, Yuan; Chen, Xian; Kim, Yoo-Ah; Wang, Bing; Chen, Guanling

    Wireless sniffers are often used to monitor APs in wireless LANs (WLANs) for network management, fault detection, traffic characterization, and optimizing deployment. It is cost effective to deploy single-radio sniffers that can monitor multiple nearby APs. However, since nearby APs often operate on orthogonal channels, a sniffer needs to switch among multiple channels to monitor its nearby APs. In this paper, we formulate and solve two optimization problems on sniffer channel selection. Both problems require that each AP be monitored by at least one sniffer. In addition, one optimization problem requires minimizing the maximum number of channels that a sniffer listens to, and the other requires minimizing the total number of channels that the sniffers listen to. We propose a novel LP-relaxation based algorithm, and two simple greedy heuristics for the above two optimization problems. Through simulation, we demonstrate that all the algorithms are effective in achieving their optimization goals, and the LP-based algorithm outperforms the greedy heuristics.
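
    One plausible greedy heuristic in the spirit described, for the minimize-total-channels variant, is a set-cover-style rule: repeatedly pick the (sniffer, channel) pair that newly covers the most unmonitored APs. The sketch below is an interpretation under stated assumptions, not the paper's exact algorithm, and the inputs are illustrative.

    ```python
    def greedy_channels(aps, in_range):
        """aps: {ap: channel}; in_range: {sniffer: set of APs it can hear}."""
        uncovered = set(aps)
        assignment = {s: set() for s in in_range}   # sniffer -> channels to listen on
        while uncovered:
            best, gain = None, 0
            for s, hearable in in_range.items():
                for ch in {aps[a] for a in hearable}:
                    covers = {a for a in hearable if aps[a] == ch} & uncovered
                    if len(covers) > gain:
                        best, gain = (s, ch, covers), len(covers)
            if best is None:
                raise ValueError("some AP cannot be monitored by any sniffer")
            s, ch, covers = best
            assignment[s].add(ch)
            uncovered -= covers
        return assignment

    aps = {"A": 1, "B": 6, "C": 6, "D": 11}
    in_range = {"s1": {"A", "B"}, "s2": {"B", "C", "D"}}
    print(greedy_channels(aps, in_range))  # e.g. {'s1': {1}, 's2': {6, 11}}
    ```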

  5. Mixed Integer Programming and Heuristic Scheduling for Space Communication Networks

    NASA Technical Reports Server (NTRS)

    Cheung, Kar-Ming; Lee, Charles H.

    2012-01-01

    We developed a framework and the mathematical formulation for optimizing communication networks using mixed integer programming. The design yields a system that is much smaller, in search space size, than the earlier approach. Our constrained network optimization takes into account the dynamics of link performance within the network along with mission and operation requirements. A unique penalty function is introduced to transform the mixed integer program into the more manageable problem of searching a continuous space. We propose to solve the constrained optimization problem in two stages: first using the heuristic Particle Swarm Optimization algorithm to get a good initial starting point, and then feeding the result into the Sequential Quadratic Programming algorithm to achieve the final optimal schedule. We demonstrate this planning and scheduling methodology with a scenario of 20 spacecraft and 3 ground stations of a Deep Space Network site. Our approach and framework are simple and flexible, so that problems with a larger number of constraints and larger networks can be easily adapted and solved.
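
    The two-stage idea, penalize the discrete structure into a continuous landscape, locate a good basin with a population heuristic, then polish with SQP, can be sketched as follows. Differential evolution stands in for the paper's Particle Swarm step, and the toy objective and integrality penalty are assumptions for illustration only.

    ```python
    import numpy as np
    from scipy.optimize import differential_evolution, minimize

    def objective(x):
        return (x[0] - 1.5) ** 2 + (x[1] - 0.5) ** 2

    def penalized(x, w=50.0):
        # Penalty pushing x[0] toward integer values (a soft integrality constraint).
        frac = np.sin(np.pi * x[0]) ** 2
        return objective(x) + w * frac

    bounds = [(0.0, 4.0), (-1.0, 1.0)]
    # Stage 1: heuristic global search on the penalized, continuous problem.
    coarse = differential_evolution(penalized, bounds, seed=3, tol=1e-8)
    # Stage 2: sequential quadratic programming from the heuristic's solution.
    fine = minimize(penalized, coarse.x, method="SLSQP", bounds=bounds)
    print(coarse.x, "->", fine.x)  # expect x[0] near an integer (1 or 2)
    ```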

  6. Microscopy as a statistical, Rényi-Ulam, half-lie game: a new heuristic search strategy to accelerate imaging.

    PubMed

    Drumm, Daniel W; Greentree, Andrew D

    2017-11-07

    Finding a fluorescent target in a biological environment is a common and pressing microscopy problem. This task is formally analogous to the canonical search problem. In ideal (noise-free, truthful) search problems, the well-known binary search is optimal. The case of half-lies, where one of two responses to a search query may be deceptive, introduces a richer, Rényi-Ulam problem and is particularly relevant to practical microscopy. We analyse microscopy in the contexts of Rényi-Ulam games and half-lies, developing a new family of heuristics. We show the cost of insisting on verification by positive result in search algorithms; for the zero-half-lie case bisectioning with verification incurs a 50% penalty in the average number of queries required. The optimal partitioning of search spaces directly following verification in the presence of random half-lies is determined. Trisectioning with verification is shown to be the most efficient heuristic of the family in a majority of cases.

  7. Visualization for Hyper-Heuristics. Front-End Graphical User Interface

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kroenung, Lauren

    Modern society is faced with ever more complex problems, many of which can be formulated as generate-and-test optimization problems. General-purpose optimization algorithms are not well suited for real-world scenarios where many instances of the same problem class need to be repeatedly and efficiently solved, because they are not targeted to a particular scenario. Hyper-heuristics automate the design of algorithms to create a custom algorithm for a particular scenario. While such automated design has great advantages, it can often be difficult to understand exactly how a design was derived and why it should be trusted. This project aims to address these issues of usability by creating an easy-to-use graphical user interface (GUI) for hyper-heuristics to support practitioners, as well as scientific visualization of the produced automated designs. My contributions to this project are exhibited in the user-facing portion of the developed system and the detailed scientific visualizations created from back-end data.

  8. Application of heuristic satellite plan synthesis algorithms to requirements of the WARC-88 allotment plan

    NASA Technical Reports Server (NTRS)

    Heyward, Ann O.; Reilly, Charles H.; Walton, Eric K.; Mata, Fernando; Olen, Carl

    1990-01-01

    Creation of an Allotment Plan for the Fixed Satellite Service at the 1988 Space World Administrative Radio Conference (WARC) represented a complex satellite plan synthesis problem, involving a large number of planned and existing systems. Solutions to this problem at WARC-88 required the use of both automated and manual procedures to develop an acceptable set of system positions. Development of an Allotment Plan may also be attempted through solution of an optimization problem, known as the Satellite Location Problem (SLP). Three automated heuristic procedures, developed specifically to solve SLP, are presented. The heuristics are then applied to two specific WARC-88 scenarios. Solutions resulting from the fully automated heuristics are then compared with solutions obtained at WARC-88 through a combination of both automated and manual planning efforts.

  9. Parameter meta-optimization of metaheuristics of solving specific NP-hard facility location problem

    NASA Astrophysics Data System (ADS)

    Skakov, E. S.; Malysh, V. N.

    2018-03-01

    The aim of this work is to create an evolutionary method for optimizing the values of the control parameters of metaheuristics for solving the NP-hard facility location problem. A system analysis of the process of tuning optimization algorithm parameters is carried out. The problem of finding the parameters of a metaheuristic algorithm is formulated as a meta-optimization problem, and an evolutionary metaheuristic has been chosen to perform this meta-optimization. Thus, the approach proposed in this work can be called a “meta-metaheuristic”. A computational experiment proving the effectiveness of the procedure for tuning the control parameters of metaheuristics has been performed.

  10. MetaUniDec: High-Throughput Deconvolution of Native Mass Spectra

    NASA Astrophysics Data System (ADS)

    Reid, Deseree J.; Diesing, Jessica M.; Miller, Matthew A.; Perry, Scott M.; Wales, Jessica A.; Montfort, William R.; Marty, Michael T.

    2018-04-01

    The expansion of native mass spectrometry (MS) methods for both academic and industrial applications has created a substantial need for analysis of large native MS datasets. Existing software tools are poorly suited for high-throughput deconvolution of native electrospray mass spectra from intact proteins and protein complexes. The UniDec Bayesian deconvolution algorithm is uniquely well suited for high-throughput analysis due to its speed and robustness but was previously tailored towards individual spectra. Here, we optimized UniDec for deconvolution, analysis, and visualization of large data sets. This new module, MetaUniDec, centers around a hierarchical data format 5 (HDF5) format for storing datasets that significantly improves speed, portability, and file size. It also includes code optimizations to improve speed and a new graphical user interface for visualization, interaction, and analysis of data. To demonstrate the utility of MetaUniDec, we applied the software to analyze automated collision voltage ramps with a small bacterial heme protein and large lipoprotein nanodiscs. Upon increasing collisional activation, bacterial heme-nitric oxide/oxygen binding (H-NOX) protein shows a discrete loss of bound heme, and nanodiscs show a continuous loss of lipids and charge. By using MetaUniDec to track changes in peak area or mass as a function of collision voltage, we explore the energetic profile of collisional activation in an ultra-high mass range Orbitrap mass spectrometer.

  11. How the twain can meet: Prospect theory and models of heuristics in risky choice.

    PubMed

    Pachur, Thorsten; Suter, Renata S; Hertwig, Ralph

    2017-03-01

    Two influential approaches to modeling choice between risky options are algebraic models (which focus on predicting the overt decisions) and models of heuristics (which are also concerned with capturing the underlying cognitive process). Because they rest on fundamentally different assumptions and algorithms, the two approaches are usually treated as antithetical, or even incommensurable. Drawing on cumulative prospect theory (CPT; Tversky & Kahneman, 1992) as the currently most influential instance of a descriptive algebraic model, we demonstrate how the two modeling traditions can be linked. CPT's algebraic functions characterize choices in terms of psychophysical (diminishing sensitivity to probabilities and outcomes) as well as psychological (risk aversion and loss aversion) constructs. Models of heuristics characterize choices as rooted in simple information-processing principles such as lexicographic and limited search. In computer simulations, we estimated CPT's parameters for choices produced by various heuristics. The resulting CPT parameter profiles portray each of the choice-generating heuristics in psychologically meaningful ways-capturing, for instance, differences in how the heuristics process probability information. Furthermore, CPT parameters can reflect a key property of many heuristics, lexicographic search, and track the environment-dependent behavior of heuristics. Finally, we show, both in an empirical and a model recovery study, how CPT parameter profiles can be used to detect the operation of heuristics. We also address the limits of CPT's ability to capture choices produced by heuristics. Our results highlight an untapped potential of CPT as a measurement tool to characterize the information processing underlying risky choice. Copyright © 2017 Elsevier Inc. All rights reserved.
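
    For reference, the CPT value and probability-weighting functions whose parameter profiles the study estimates take the form given by Tversky and Kahneman (1992):

    $$v(x) = \begin{cases} x^{\alpha} & x \ge 0, \\ -\lambda\,(-x)^{\beta} & x < 0, \end{cases} \qquad w(p) = \frac{p^{\gamma}}{\big(p^{\gamma} + (1-p)^{\gamma}\big)^{1/\gamma}},$$

    where $\alpha$ and $\beta$ capture diminishing sensitivity to outcomes, $\lambda$ captures loss aversion, and $\gamma$ governs the curvature of probability weighting; separate weighting exponents for gains and losses are sometimes estimated as well.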

  12. Performance of Optimization Heuristics for the Operational Planning of Multi-energy Storage Systems

    NASA Astrophysics Data System (ADS)

    Haas, J.; Schradi, J.; Nowak, W.

    2016-12-01

    In the transition to low-carbon energy sources, energy storage systems (ESS) will play an increasingly important role. Particularly in the context of solar power challenges (variability, uncertainty), ESS can provide valuable services: energy shifting, ramping, robustness against forecast errors, frequency support, etc. However, these qualities are rarely modelled in the operational planning of power systems because of the computational burden involved, especially when multiple ESS technologies are involved. This work assesses two optimization heuristics for speeding up the solution of the optimal operation problem, comparing their accuracy (in terms of costs) and speed against a reference solution. The first heuristic (H1) is based on a merit order. Here, the ESS are sorted from lower to higher operational costs (including cycling costs). For each time step, the cheapest available ESS is used first, followed by the second one and so on, until matching the net load (demand minus available renewable generation). The second heuristic (H2) uses the Fourier transform to detect the main frequencies that compose the net load. A specific ESS is assigned to each frequency range, aiming to smoothen the net load. Finally, the reference solution is obtained with a mixed integer linear program (MILP). H1, H2 and MILP are subject to technical constraints (energy/power balance, ramping rates, on/off states...). Costs due to operation, replacement (cycling) and unserved energy are considered. Four typical days of a system with a high share of solar energy were used in several test cases, varying the resolution from one second to fifteen minutes. H1 and H2 achieve accuracies of about 90% and 95% on average, and speed-ups of two to three and one to two orders of magnitude, respectively. The use of these heuristics looks promising in the context of planning the expansion of power systems, especially when their loss of accuracy is outweighed by solar or wind forecast errors.
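
    The merit-order heuristic H1 is straightforward to sketch: at each time step the cheapest storage is dispatched first until the net load is met. The storage parameters and net-load series below are illustrative assumptions, and technical constraints beyond simple power and energy limits are omitted.

    ```python
    def merit_order_dispatch(net_load, storages):
        """storages: list of dicts, each with operational 'cost', power limit
        'p_max', energy capacity 'e_max' and current state of charge 'e'."""
        schedule = []
        for load in net_load:               # positive: discharge needed; negative: charge
            step, residual = {}, load
            for s in sorted(storages, key=lambda s: s["cost"]):  # cheapest first
                if residual > 0:            # discharge, limited by power and stored energy
                    p = min(residual, s["p_max"], s["e"])
                else:                       # charge, limited by power and free capacity
                    p = -min(-residual, s["p_max"], s["e_max"] - s["e"])
                s["e"] -= p
                residual -= p
                step[s["name"]] = p
            step["unserved"] = residual     # what no storage could cover
            schedule.append(step)
        return schedule

    storages = [
        {"name": "battery", "cost": 1.0, "p_max": 5.0, "e_max": 10.0, "e": 5.0},
        {"name": "pumped_hydro", "cost": 2.0, "p_max": 8.0, "e_max": 40.0, "e": 20.0},
    ]
    for row in merit_order_dispatch([6.0, -4.0, 12.0], storages):
        print(row)
    ```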

  13. Parallel heuristics for scalable community detection

    DOE PAGES

    Lu, Hao; Halappanavar, Mahantesh; Kalyanaraman, Ananth

    2015-08-14

    Community detection has become a fundamental operation in numerous graph-theoretic applications. Despite its potential for application, there is only limited support for community detection on large-scale parallel computers, largely owing to the irregular and inherently sequential nature of the underlying heuristics. In this paper, we present parallelization heuristics for fast community detection using the Louvain method as the serial template. The Louvain method is an iterative heuristic for modularity optimization. Originally developed in 2008, the method has become increasingly popular owing to its ability to detect high modularity community partitions in a fast and memory-efficient manner. However, the method is also inherently sequential, thereby limiting its scalability. Here, we observe certain key properties of this method that present challenges for its parallelization, and consequently propose heuristics that are designed to break the sequential barrier. For evaluation purposes, we implemented our heuristics using OpenMP multithreading, and tested them over real world graphs derived from multiple application domains. Compared to the serial Louvain implementation, our parallel implementation is able to produce community outputs with a higher modularity for most of the inputs tested, in comparable number or fewer iterations, while providing real speedups of up to 16x using 32 threads.

  14. Order-Constrained Solutions in K-Means Clustering: Even Better than Being Globally Optimal

    ERIC Educational Resources Information Center

    Steinley, Douglas; Hubert, Lawrence

    2008-01-01

    This paper proposes an order-constrained K-means cluster analysis strategy, and implements that strategy through an auxiliary quadratic assignment optimization heuristic that identifies an initial object order. A subsequent dynamic programming recursion is applied to optimally subdivide the object set subject to the order constraint. We show that…

  15. From task characteristics to learning: A systematic review.

    PubMed

    Wielenga-Meijer, Etty G A; Taris, Toon W; Kompier, Michiel A J; Wigboldus, Daniël H J

    2010-10-01

    Although many theoretical approaches propose that job characteristics affect employee learning, the question is why and how job characteristics influence learning. The present study reviews the evidence on the relationships among learning antecedents (i.e., job characteristics: demands, variety, autonomy and feedback), learning processes (including motivational, meta-cognitive, cognitive and behavioral processes) and learning consequences. Building on an integrative heuristic model, we quantitatively reviewed 85 studies published between 1969 and 2005. Our analyses revealed strong evidence for a positive relation between job demands and autonomy on the one hand and motivational and meta-cognitive learning processes on the other. Furthermore, these learning processes were positively related to learning consequences. © 2010 The Authors. Scandinavian Journal of Psychology © 2010 The Scandinavian Psychological Associations.

  16. The Historical Ideal-Type as a Heuristic Device for Academic Storytelling by Sport Scholars

    ERIC Educational Resources Information Center

    Tutka, Patrick; Seifried, Chad

    2015-01-01

    The goal of this research endeavor is to take the previous calls of sport scholars to expand into alternative research approaches (e.g., history, case study, law reviews, philosophy, etc.) and to show how storytelling can be an effective tool through the use of a heuristic device. The present analysis attempts to focus on the usage of the…

  17. Attractors in Sequence Space: Agent-Based Exploration of MHC I Binding Peptides.

    PubMed

    Jäger, Natalie; Wisniewska, Joanna M; Hiss, Jan A; Freier, Anja; Losch, Florian O; Walden, Peter; Wrede, Paul; Schneider, Gisbert

    2010-01-12

    Ant Colony Optimization (ACO) is a meta-heuristic that utilizes a computational analogue of ant trail pheromones to solve combinatorial optimization problems. The size of the ant colony and the representation of the ants' pheromone trails are specific to the given optimization problem. In the present study, we employed ACO to generate novel peptides that stabilize the MHC I protein on the plasma membrane of a murine lymphoma cell line. A jury of feedforward neural network classifiers served as the fitness function for peptide design by ACO. Bioactive murine MHC I H-2K(b) stabilizing as well as nonstabilizing octapeptides were designed, synthesized and tested. These peptides reveal residue motifs that are relevant for MHC I receptor binding. We demonstrate how the performance of the implemented ACO algorithm depends on the colony size and the size of the search space. The actual peptide design process by ACO constitutes a search path in sequence space that can be visualized as trajectories on a self-organizing map (SOM). By projecting the sequence space onto a SOM, we visualize the convergence of the different solutions that emerge during the optimization process in sequence space. The SOM representation reveals attractors in sequence space for MHC I binding peptides. The combination of ACO and SOM enables systematic peptide optimization. This technique allows for the rational design of various types of bioactive peptides with minimal experimental effort. Here, we demonstrate its successful application to the design of MHC-I binding and nonbinding peptides which exhibit substantial bioactivity in a cell-based assay. Copyright © 2010 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
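
    The essentials, a pheromone matrix over (position, amino acid) pairs that biases sequence construction and is reinforced by high-scoring ants, can be sketched as follows. The scoring function is a placeholder standing in for the paper's neural-network jury, and the colony parameters are illustrative assumptions.

    ```python
    import random

    AA = "ACDEFGHIKLMNPQRSTVWY"
    LENGTH = 8

    def score(peptide):
        """Placeholder fitness rewarding hydrophobic residues at two positions;
        the study instead used a jury of feedforward neural networks."""
        return sum(peptide[i] in "FYLM" for i in (4, 7)) / 2.0

    def aco(n_ants=30, n_iter=50, rho=0.1, seed=0):
        random.seed(seed)
        tau = [{a: 1.0 for a in AA} for _ in range(LENGTH)]  # pheromone trails
        best, best_fit = None, -1.0
        for _ in range(n_iter):
            ants = []
            for _ in range(n_ants):       # construct peptides position by position
                pep = "".join(
                    random.choices(AA, weights=[tau[i][a] for a in AA])[0]
                    for i in range(LENGTH))
                ants.append((score(pep), pep))
            ants.sort(reverse=True)
            if ants[0][0] > best_fit:
                best_fit, best = ants[0]
            for i in range(LENGTH):       # evaporate, then reinforce elite trails
                for a in AA:
                    tau[i][a] *= (1.0 - rho)
                for fit, pep in ants[:5]:
                    tau[i][pep[i]] += fit
        return best, best_fit

    print(aco())
    ```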

  18. Fuzzy Random λ-Mean SAD Portfolio Selection Problem: An Ant Colony Optimization Approach

    NASA Astrophysics Data System (ADS)

    Thakur, Gour Sundar Mitra; Bhattacharyya, Rupak; Mitra, Swapan Kumar

    2010-10-01

    To reach an investment goal, one has to select a combination of securities among different portfolios containing large numbers of securities. The past records of each security alone do not guarantee its future return. As there are many uncertain factors which directly or indirectly influence the stock market, and there are also some newer stock markets which do not have enough historical data, experts' expectations and experience must be combined with past records to generate an effective portfolio selection model. In this paper the return of a security is assumed to be a Fuzzy Random Variable Set (FRVS), where returns are sets of random numbers which are in turn fuzzy numbers. A new λ-Mean Semi Absolute Deviation (λ-MSAD) portfolio selection model is developed. The subjective opinions of the investors on the rates of return of each security are taken into consideration by introducing a pessimistic-optimistic parameter vector λ. The λ-MSAD model is preferred as it uses the absolute deviation of the rate of return of a portfolio instead of the variance as the measure of risk. As this model can be reduced to a Linear Programming Problem (LPP), it can be solved much faster than quadratic programming problems. Ant Colony Optimization (ACO) is used for solving the portfolio selection problem. ACO is a paradigm for designing meta-heuristic algorithms for combinatorial optimization problems. Data from the BSE is used for illustration.

  19. Multicriteria meta-heuristics for AGV dispatching control based on computational intelligence.

    PubMed

    Naso, David; Turchiano, Biagio

    2005-04-01

    In many manufacturing environments, automated guided vehicles are used to move the processed materials between various pickup and delivery points. The assignment of vehicles to unit loads is a complex problem that is often solved in real-time with simple dispatching rules. This paper proposes an automated guided vehicle dispatching approach based on computational intelligence. We adopt a fuzzy multicriteria decision strategy to simultaneously take into account multiple aspects in every dispatching decision. Since the typical short-term view of dispatching rules is one of the main limitations of such real-time assignment heuristics, we also incorporate into the multicriteria algorithm a specific heuristic rule that takes into account the empty-vehicle travel on a longer time horizon. Moreover, we adopt a genetic algorithm to tune the weights associated with each decision criterion in the global decision algorithm. The proposed approach is validated by means of a comparison with other dispatching rules, and with other recently proposed multicriteria dispatching strategies also based on computational intelligence. The analysis of the results obtained by the proposed dispatching approach in both nominal and perturbed operating conditions (congestions, faults) confirms its effectiveness.

  20. Electron Beam Melting and Refining of Metals: Computational Modeling and Optimization

    PubMed Central

    Vutova, Katia; Donchev, Veliko

    2013-01-01

    Computational modeling offers an opportunity for a better understanding and investigation of thermal transfer mechanisms. It can be used for the optimization of the electron beam melting process and for obtaining new materials with improved characteristics that have many applications in the power industry, medicine, instrument engineering, electronics, etc. A time-dependent 3D axis-symmetrical heat model for the simulation of thermal transfer in metal ingots solidified in a water-cooled crucible during electron beam melting and refining (EBMR) is developed. The model predicts the change in the temperature field in the cast ingot during the interaction of the beam with the material. A modified Pismen-Rekford numerical scheme to discretize the analytical model is developed. The equation systems describing the thermal processes and the main characteristics of the developed numerical method are presented. In order to optimize the technological regimes, different criteria for better refinement and for obtaining dendritic crystal structures are proposed. Analytical problems of mathematical optimization are formulated, discretized and heuristically solved by cluster methods. Using simulation results that are important for practice, suggestions can be made for the optimization of EBMR technology. The proposed tool is important and useful for the study, control and optimization of EBMR process parameters and for improving the quality of the newly produced materials. PMID:28788351

  1. A comparison of Heuristic method and Llewellyn’s rules for identification of redundant constraints

    NASA Astrophysics Data System (ADS)

    Estiningsih, Y.; Farikhin; Tjahjana, R. H.

    2018-03-01

    Modelling and solving practical optimization problems are important techniques in linear programming. Redundant constraints are considered for their effects on general linear programming problems. Identifying and removing redundant constraints avoids all the calculations associated with them when solving an associated linear programming problem. Many methods have been proposed for the identification of redundant constraints. This paper presents a comparison of the Heuristic method and Llewellyn's rules for the identification of redundant constraints.
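
    Both approaches approximate what an exact LP test establishes: a constraint a_k·x <= b_k is redundant if maximizing a_k·x subject to the remaining constraints cannot exceed b_k. The sketch below implements this standard test (not either of the compared methods) on an illustrative system.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    def redundant_constraints(A, b):
        """Return indices of constraints in A x <= b, x >= 0 that are redundant."""
        A, b = np.asarray(A, float), np.asarray(b, float)
        redundant = []
        for k in range(len(b)):
            others = [i for i in range(len(b)) if i != k]
            # linprog minimizes, so maximize a_k.x by negating the objective.
            res = linprog(-A[k], A_ub=A[others], b_ub=b[others],
                          bounds=[(0, None)] * A.shape[1], method="highs")
            if res.status == 0 and -res.fun <= b[k] + 1e-9:
                redundant.append(k)
        return redundant

    A = [[1, 1], [2, 2], [1, 0]]
    b = [4, 10, 3]
    print(redundant_constraints(A, b))  # [1]: 2x+2y <= 10 is implied by x+y <= 4
    ```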

  2. Query Optimization in Distributed Databases.

    DTIC Science & Technology

    1982-10-01

    general, the strategy a31 a11 a 3 is more time consuming than the strategy a, a, and usually we do not use it. Since the semijoin of R.XJ> RS requires... is the study of the analytic behavior of those heuristic algorithms. Although some analytic results of worst case and average case analysis are difficult to obtain, some...

  3. Solving multi-objective job shop scheduling problems using a non-dominated sorting genetic algorithm

    NASA Astrophysics Data System (ADS)

    Piroozfard, Hamed; Wong, Kuan Yew

    2015-05-01

    The effort of finding optimal schedules for job shop scheduling problems is highly important for many real-world industrial applications. In this paper, a multi-objective job shop scheduling problem that simultaneously minimizes makespan and tardiness is taken into account. The problem is considered to be more complex due to the multiple business criteria that must be satisfied. To solve the problem more efficiently and to obtain a set of non-dominated solutions, a meta-heuristic-based non-dominated sorting genetic algorithm is presented. In addition, a task-based representation is used for solution encoding, and tournament selection based on rank and crowding distance is applied for offspring selection. Swapping and insertion mutations are employed to increase the diversity of the population and to perform an intensive search. To evaluate the modified non-dominated sorting genetic algorithm, a set of modified benchmark job shop problems obtained from the OR-Library is used, and the results are assessed based on the number of non-dominated solutions and the quality of the schedules obtained by the algorithm.
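
    The ranking step at the heart of such algorithms, sorting solutions into successive Pareto fronts by domination counts, can be sketched compactly; the (makespan, tardiness) pairs below are illustrative.

    ```python
    def non_dominated_sort(objs):
        """Rank solutions (tuples of objectives to minimize) into Pareto fronts."""
        n = len(objs)
        dominated_by = [set() for _ in range(n)]   # solutions that i dominates
        for i in range(n):
            for j in range(n):
                if i != j and all(a <= b for a, b in zip(objs[i], objs[j])) \
                          and any(a < b for a, b in zip(objs[i], objs[j])):
                    dominated_by[i].add(j)
        count = [sum(i in dominated_by[j] for j in range(n)) for i in range(n)]
        fronts, current = [], [i for i in range(n) if count[i] == 0]
        while current:
            fronts.append(current)
            nxt = []
            for i in current:              # peeling off a front frees its successors
                for j in dominated_by[i]:
                    count[j] -= 1
                    if count[j] == 0:
                        nxt.append(j)
            current = nxt
        return fronts

    schedules = [(10, 3), (12, 1), (11, 4), (15, 5)]
    print(non_dominated_sort(schedules))  # [[0, 1], [2], [3]]
    ```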

  4. Towards global optimization with adaptive simulated annealing

    NASA Astrophysics Data System (ADS)

    Forbes, Gregory W.; Jones, Andrew E.

    1991-01-01

    The structure of the simulated annealing algorithm is presented and its rationale is discussed. A unifying heuristic is then introduced which serves as a guide in the design of all of the sub-components of the algorithm. Simply put, this heuristic principle states that at every cycle in the algorithm the occupation density should be kept as close as possible to the equilibrium distribution. This heuristic has been used as a guide to develop novel step generation and temperature control methods intended to improve the efficiency of the simulated annealing algorithm. The resulting algorithm has been used in attempts to locate good solutions for one of the lens design problems associated with this conference, viz. the "monochromatic quartet", and a sample of the results is presented. 1. Global optimization in the context of lens design. Whatever the context, optimization algorithms relate to problems that take the following form: given some configuration space with coordinates $r = (x_1, \ldots, x_n)$ and a merit function written as $f(r)$, find the point $r$ where $f(r)$ takes its lowest value, that is, find the global minimum. In many cases there is also a set of auxiliary constraints that must be met, so the problem statement becomes: find the global minimum of the merit function within the region defined by $E_j(r) = 0$, $j = 1, 2, \ldots, p$, and $I_j(r) \ge 0$, $j = 1, 2, \ldots, q$.
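
    A minimal sketch of the annealing loop such heuristics govern: Metropolis acceptance with a slow geometric cooling schedule, so the occupation density can track the equilibrium distribution. The step-size coupling to temperature, the schedule, and the one-dimensional test function are illustrative assumptions, not the authors' method.

    ```python
    import math, random

    def anneal(f, x0, step=0.5, T0=1.0, alpha=0.995, n_iter=20_000, seed=0):
        random.seed(seed)
        x, fx = x0, f(x0)
        best, fbest = x, fx
        T = T0
        for _ in range(n_iter):
            cand = x + random.gauss(0.0, step * T / T0)  # shrink steps as T falls
            fc = f(cand)
            # Metropolis rule: always accept improvements, sometimes accept worse.
            if fc <= fx or random.random() < math.exp((fx - fc) / T):
                x, fx = cand, fc
            if fx < fbest:
                best, fbest = x, fx
            T *= alpha   # slow geometric cooling keeps the walk near equilibrium
        return best, fbest

    # A simple multimodal test function.
    f = lambda x: x * x + 2.0 * math.sin(5.0 * x) + 2.0
    print(anneal(f, x0=4.0))
    ```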

  5. A Multi-Start Evolutionary Local Search for the Two-Echelon Location Routing Problem

    NASA Astrophysics Data System (ADS)

    Nguyen, Viet-Phuong; Prins, Christian; Prodhon, Caroline

    This paper presents a new hybrid metaheuristic combining a greedy randomized adaptive search procedure (GRASP) and an evolutionary/iterated local search (ELS/ILS), using a tabu list, to solve the two-echelon location routing problem (LRP-2E). The GRASP uses in turn three constructive heuristics followed by local search to generate the initial solutions. From a solution of the GRASP, an intensification strategy is carried out by dynamically alternating between ELS and ILS. In this phase, each child is obtained by mutation and evaluated through a splitting procedure on a giant tour, followed by a local search. The tabu list, defined by two characteristics of a solution (total cost and number of trips), is used to avoid searching a space already explored. The results show that our metaheuristic clearly outperforms all previously published methods on LRP-2E benchmark instances. Furthermore, it is competitive with the best meta-heuristic published for the single-echelon LRP.

  6. MEvoLib v1.0: the first molecular evolution library for Python.

    PubMed

    Álvarez-Jarreta, Jorge; Ruiz-Pesini, Eduardo

    2016-10-28

    Molecular evolution studies involve many computationally hard problems that are solved, in most cases, with heuristic algorithms providing a nearly optimal solution. Hence, diverse software tools exist for the different stages involved in a molecular evolution workflow. We present MEvoLib, the first molecular evolution library for Python, providing a framework to work with the different tools and methods involved in the common tasks of molecular evolution workflows. In contrast with existing bioinformatics libraries, MEvoLib is focused on the stages involved in molecular evolution studies, enclosing the set of tools with a common purpose in a single high-level interface with fast access to their frequent parameterizations. Gene clustering from partial or complete sequences has been improved with a new method that integrates accessible external information (e.g. GenBank's feature data). Moreover, MEvoLib adjusts the fetching process from NCBI databases to optimize download bandwidth usage. In addition, it has been implemented using parallelization techniques to cope with even large-scale scenarios. MEvoLib is the first library for Python designed to facilitate molecular evolution research for both expert and novice users. Its unique interface for each common task comprises several tools with their most used parameterizations. It also includes a method that takes advantage of biological knowledge to improve the gene partitioning of sequence datasets. Additionally, its implementation incorporates parallelization techniques to reduce computational costs when handling very large input datasets.

  7. Optimal pattern distributions in Rete-based production systems

    NASA Technical Reports Server (NTRS)

    Scott, Stephen L.

    1994-01-01

    Since its introduction into the AI community in the early 1980s, the Rete algorithm has been widely used. This algorithm has formed the basis for many AI tools, including NASA's CLIPS. One drawback of Rete-based implementations, however, is that the network structures used internally by the Rete algorithm make it sensitive to the arrangement of individual patterns within rules. Thus, while rules may be more or less arbitrarily placed within source files, the distribution of individual patterns within these rules can significantly affect the overall system performance. Some heuristics have been proposed to optimize pattern placement; however, these suggestions can be conflicting. This paper describes a systematic effort to measure the effect of pattern distribution on production system performance. An overview of the Rete algorithm is presented to provide context. A description of the methods used to explore the pattern ordering problem is presented, using internal production system metrics such as the number of partial matches, and coarse-grained operating system data such as memory usage and time. The results of this study should be of interest to those developing and optimizing software for Rete-based production systems.

  8. Lifelong Optimization

    DTIC Science & Technology

    2015-04-13

    To cope with dynamic, online optimisation problems with uncertainty, we developed some powerful and sophisticated techniques for learning heuristics... Optimization solvers should learn to improve their performance over time. By learning both during the course of solving an optimization...

  9. Quasi-Optimal Elimination Trees for 2D Grids with Singularities

    DOE PAGES

    Paszyńska, A.; Paszyński, M.; Jopek, K.; ...

    2015-01-01

    We construct quasi-optimal elimination trees for 2D finite element meshes with singularities. These trees minimize the complexity of the solution of the discrete system. The computational cost estimates of the elimination process model the execution of the multifrontal algorithms in serial and in parallel shared-memory executions. Since the meshes considered are a subspace of all possible mesh partitions, we call these minimizers quasi-optimal. We minimize the cost functionals using dynamic programming. Finding these minimizers is more computationally expensive than solving the original algebraic system. Nevertheless, from the insights provided by the analysis of the dynamic programming minima, we propose a heuristic construction of the elimination trees that has cost $O(N_e \log N_e)$, where $N_e$ is the number of elements in the mesh. We show that this heuristic ordering has similar computational cost to the quasi-optimal elimination trees found with dynamic programming and outperforms state-of-the-art alternatives in our numerical experiments.

  10. Quasi-Optimal Elimination Trees for 2D Grids with Singularities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paszyńska, A.; Paszyński, M.; Jopek, K.

    We construct quasi-optimal elimination trees for 2D finite element meshes with singularities. These trees minimize the complexity of the solution of the discrete system. The computational cost estimates of the elimination process model the execution of the multifrontal algorithms in serial and in parallel shared-memory executions. Since the meshes considered are a subspace of all possible mesh partitions, we call these minimizers quasi-optimal. We minimize the cost functionals using dynamic programming. Finding these minimizers is more computationally expensive than solving the original algebraic system. Nevertheless, from the insights provided by the analysis of the dynamic programming minima, we propose a heuristic construction of the elimination trees that has cost O(Ne log Ne), where Ne is the number of elements in the mesh. We show that this heuristic ordering has similar computational cost to the quasi-optimal elimination trees found with dynamic programming and outperforms state-of-the-art alternatives in our numerical experiments.

  11. An Efficient Optimization Method for Solving Unsupervised Data Classification Problems.

    PubMed

    Shabanzadeh, Parvaneh; Yusof, Rubiyah

    2015-01-01

    Unsupervised data classification (or clustering) analysis is one of the most useful tools and a descriptive task in data mining that seeks to classify homogeneous groups of objects based on similarity and is used in many medical disciplines and various applications. In general, there is no single algorithm that is suitable for all types of data, conditions, and applications. Each algorithm has its own advantages, limitations, and deficiencies. Hence, research into novel and effective approaches for unsupervised data classification is still active. In this paper a heuristic algorithm, the Biogeography-Based Optimization (BBO) algorithm, which is inspired by the natural distribution of species across geographic habitats, was adapted for data clustering problems by modifying its main operators. Like other population-based algorithms, BBO starts with an initial population of candidate solutions to an optimization problem and an objective function that is evaluated for them. To evaluate the performance of the proposed algorithm, an assessment was carried out on six medical and real-life datasets, and the results were compared with eight well-known and recent unsupervised data classification algorithms. Numerical results demonstrate that the proposed evolutionary optimization algorithm is efficient for unsupervised data classification.
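
    A minimal sketch of how a biogeography-style migration operator can drive clustering: each habitat encodes k centroids, habitat suitability is the (negative) clustering cost, and good habitats emigrate their centroids to poor ones. The operators below are generic BBO with a random-reset mutation, not the authors' modified versions:

    ```python
    import numpy as np

    def cluster_cost(centroids, X):
        # Sum of squared distances from each point to its nearest centroid.
        d = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(-1)
        return d.min(axis=1).sum()

    def bbo_clustering(X, k, pop_size=20, generations=100, p_mutate=0.02, seed=0):
        rng = np.random.default_rng(seed)
        lo, hi = X.min(axis=0), X.max(axis=0)
        # Each habitat encodes one candidate solution: k centroids.
        pop = rng.uniform(lo, hi, size=(pop_size, k, X.shape[1]))
        for _ in range(generations):
            cost = np.array([cluster_cost(h, X) for h in pop])
            pop = pop[np.argsort(cost)]                      # best habitat first
            mu = 1.0 - np.arange(pop_size) / (pop_size - 1)  # emigration rates
            lam = 1.0 - mu                                   # immigration rates
            new_pop = pop.copy()
            for i in range(1, pop_size):                     # index 0 is elitist
                for j in range(k):
                    if rng.random() < lam[i]:                # migrate centroid j
                        src = rng.choice(pop_size, p=mu / mu.sum())
                        new_pop[i, j] = pop[src, j]
                    if rng.random() < p_mutate:              # random-reset mutation
                        new_pop[i, j] = rng.uniform(lo, hi)
            pop = new_pop
        cost = np.array([cluster_cost(h, X) for h in pop])
        return pop[cost.argmin()]

    X = np.random.default_rng(1).normal(size=(60, 2))
    print(bbo_clustering(X, k=3, generations=40))
    ```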

  12. An Optimal Hierarchical Decision Model for a Regional Logistics Network with Environmental Impact Consideration

    PubMed Central

    Zhang, Dezhi; Li, Shuangyan

    2014-01-01

    This paper proposes a new model of simultaneous optimization of three-level logistics decisions, for logistics authorities, logistics operators, and logistics users, for a regional logistics network with environmental impact consideration. The proposed model addresses the interaction among the three logistics players in a completely competitive logistics service market with CO2 emission charges. We also explicitly incorporate the impacts of the scale economies of the logistics park and the logistics users' demand elasticity into the model. The logistics authorities aim to maximize the total social welfare of the system, considering the demand of green logistics development by two different methods: optimal location of logistics nodes and charging a CO2 emission tax. Logistics operators are assumed to compete with logistics service fare and frequency, while logistics users minimize their own perceived logistics disutility given logistics operators' service fare and frequency. A heuristic algorithm based on the multinomial logit model is presented for the three-level decision model, and a numerical example is given to illustrate the above optimal model and its algorithm. The proposed model provides a useful tool for modeling competitive logistics services and evaluating logistics policies at the strategic level. PMID:24977209

  13. An optimal hierarchical decision model for a regional logistics network with environmental impact consideration.

    PubMed

    Zhang, Dezhi; Li, Shuangyan; Qin, Jin

    2014-01-01

    This paper proposes a new model of simultaneous optimization of three-level logistics decisions, for logistics authorities, logistics operators, and logistics users, for a regional logistics network with environmental impact consideration. The proposed model addresses the interaction among the three logistics players in a completely competitive logistics service market with CO2 emission charges. We also explicitly incorporate the impacts of the scale economies of the logistics park and the logistics users' demand elasticity into the model. The logistics authorities aim to maximize the total social welfare of the system, considering the demand of green logistics development by two different methods: optimal location of logistics nodes and charging a CO2 emission tax. Logistics operators are assumed to compete with logistics service fare and frequency, while logistics users minimize their own perceived logistics disutility given logistics operators' service fare and frequency. A heuristic algorithm based on the multinomial logit model is presented for the three-level decision model, and a numerical example is given to illustrate the above optimal model and its algorithm. The proposed model provides a useful tool for modeling competitive logistics services and evaluating logistics policies at the strategic level.

  14. A Model for Developing Meta-Cognitive Tools in Teacher Apprenticeships

    ERIC Educational Resources Information Center

    Bray, Paige; Schatz, Steven

    2013-01-01

    This research investigates a model for developing meta-cognitive tools to be used by pre-service teachers during apprenticeship (student teaching) experience to operationalise the epistemological model of Cook and Brown (2009). Meta-cognitive tools have proven to be effective for increasing performance and retention of undergraduate students.…

  15. Towards improving searches for optimal phylogenies.

    PubMed

    Ford, Eric; St John, Katherine; Wheeler, Ward C

    2015-01-01

    Finding the optimal evolutionary history for a set of taxa is a challenging computational problem, even when restricting possible solutions to be "tree-like" and focusing on the maximum-parsimony optimality criterion. This has led to much work on using heuristic tree searches to find approximate solutions. We present an approach for finding exact optimal solutions that employs and complements the current heuristic methods for finding optimal trees. Given a set of taxa and a set of aligned sequences of characters, there may be subsets of characters that are compatible, and for each such subset there is an associated (possibly partially resolved) phylogeny with edges corresponding to each character state change. These perfect phylogenies serve as anchor trees for our constrained search space. We show that, for sequences with compatible sites, the parsimony score of any tree T is at least the parsimony score of the anchor trees plus the number of inferred changes between T and the anchor trees. As the maximum-parsimony optimality score is additive, the sum of the lower bounds on compatible character partitions provides a lower bound on the complete alignment of characters. This yields a region in the space of trees within which the best tree is guaranteed to be found; limiting the search for the optimal tree to this region can significantly reduce the number of trees that must be examined in a search of the space of trees. We analyze this method empirically using four different biological data sets as well as surveying 400 data sets from the TreeBASE repository, demonstrating the effectiveness of our technique in reducing the number of steps in exact heuristic searches for trees under the maximum-parsimony optimality criterion. © The Author(s) 2014. Published by Oxford University Press, on behalf of the Society of Systematic Biologists. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
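
    For context, the per-character parsimony score that these bounds constrain can be computed with Fitch's classic small-parsimony algorithm; a minimal sketch on toy data for a rooted binary tree:

    ```python
    def fitch(tree, states):
        """Fitch small-parsimony score of a rooted binary tree for one character.
        tree: nested 2-tuples with taxon-name leaves; states: taxon -> state.
        Returns (possible ancestral states, minimum number of changes)."""
        if isinstance(tree, str):                      # leaf node
            return {states[tree]}, 0
        (ls, lc), (rs, rc) = (fitch(child, states) for child in tree)
        if ls & rs:                                    # children agree: intersect
            return ls & rs, lc + rc
        return ls | rs, lc + rc + 1                    # disagree: union, +1 change

    leaf_states = {"A": "G", "B": "G", "C": "T", "D": "T"}
    print(fitch((("A", "B"), ("C", "D")), leaf_states))  # one change suffices
    ```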

  16. Evolutionary Artificial Neural Network Weight Tuning to Optimize Decision Making for an Abstract Game

    DTIC Science & Technology

    2010-03-01

    separate LoA heuristic. If any of the examined heuristics produced a competitive player, then the final measurement was a success. Barring that, a...if offline training actually results in a successful player. Whereas offline learning plays many games and then trains as many networks as desired...a competitive Lines of Action player, shedding light on the difficulty of developing a neural network to model such a large and complex solution

  17. Intelligent Space Tube Optimization for speeding ground water remedial design.

    PubMed

    Kalwij, Ineke M; Peralta, Richard C

    2008-01-01

    An innovative Intelligent Space Tube Optimization (ISTO) two-stage approach facilitates solving complex nonlinear flow and contaminant transport management problems. It reduces the computational effort of designing optimal ground water remediation systems and strategies for an assumed set of wells. ISTO's stage 1 defines an adaptive mobile space tube that lengthens toward the optimal solution. The space tube has overlapping multidimensional subspaces. Stage 1 generates several strategies within the space tube, trains neural surrogate simulators (NSS) using the limited space tube data, and optimizes using an advanced genetic algorithm (AGA) with NSS. Stage 1 speeds the evaluation of assumed well locations and combinations. For a large complex plume of solvents and explosives, ISTO stage 1 reaches within 10% of the optimal solution 25% faster than an efficient AGA coupled with comprehensive tabu search (AGCT) does by itself. ISTO input parameters include the space tube radius and the number of strategies used to train NSS per cycle. Larger radii can speed convergence to optimality for optimizations that achieve it but might increase the number of optimizations reaching it. ISTO stage 2 automatically refines the NSS-AGA stage 1 optimal strategy using heuristic optimization (we used AGCT), without using NSS surrogates. Stage 2 explores the entire solution space. ISTO is applicable to many heuristic optimization settings in which the numerical simulator is computationally intensive and one would like to reduce that burden.

  18. Iterative pass optimization of sequence data

    NASA Technical Reports Server (NTRS)

    Wheeler, Ward C.

    2003-01-01

    The problem of determining the minimum-cost hypothetical ancestral sequences for a given cladogram is known to be NP-complete. This "tree alignment" problem has motivated the considerable effort placed in multiple sequence alignment procedures. Wheeler in 1996 proposed a heuristic method, direct optimization, to calculate cladogram costs without the intervention of multiple sequence alignment. This method, though more efficient in time and more effective in cladogram length than many alignment-based procedures, greedily optimizes nodes based on descendent information only. In their proposal of an exact multiple alignment solution, Sankoff et al. in 1976 described a heuristic procedure--the iterative improvement method--to create alignments at internal nodes by solving a series of median problems. The combination of a three-sequence direct optimization with iterative improvement and a branch-length-based cladogram cost procedure provides an algorithm that frequently results in superior (i.e., lower) cladogram costs. This iterative pass optimization is both computation and memory intensive, but economies can be made to reduce this burden. An example in arthropod systematics is discussed. © 2003 The Willi Hennig Society. Published by Elsevier Science (USA). All rights reserved.

  19. Integrated optimization of location assignment and sequencing in multi-shuttle automated storage and retrieval systems under modified 2n-command cycle pattern

    NASA Astrophysics Data System (ADS)

    Yang, Peng; Peng, Yongfei; Ye, Bin; Miao, Lixin

    2017-09-01

    This article explores the integrated optimization problem of location assignment and sequencing in multi-shuttle automated storage/retrieval systems under the modified 2n-command cycle pattern. The decision of storage and retrieval (S/R) location assignment and S/R request sequencing are jointly considered. An integer quadratic programming model is formulated to describe this integrated optimization problem. The optimal travel cycles for multi-shuttle S/R machines can be obtained to process S/R requests in the storage and retrieval request order lists by solving the model. The small-sized instances are optimally solved using CPLEX. For large-sized problems, two tabu search algorithms are proposed, in which the first come, first served and nearest neighbour are used to generate initial solutions. Various numerical experiments are conducted to examine the heuristics' performance and the sensitivity of algorithm parameters. Furthermore, the experimental results are analysed from the viewpoint of practical application, and a parameter list for applying the proposed heuristics is recommended under different real-life scenarios.
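
    A generic tabu-search skeleton of the kind applied here; the paper's neighborhood moves and tabu attributes are specific to S/R request sequencing, so the toy problem below (sorting a permutation by pairwise swaps) is purely illustrative:

    ```python
    import itertools
    from collections import deque

    def tabu_search(initial, neighbors, cost, tenure=7, iterations=200):
        """Skeleton tabu search: always move to the best non-tabu neighbor,
        even if it is worse, so the search can escape local optima."""
        current = best = initial
        tabu = deque([initial], maxlen=tenure)
        for _ in range(iterations):
            candidates = [n for n in neighbors(current) if n not in tabu]
            if not candidates:
                break
            current = min(candidates, key=cost)
            tabu.append(current)
            if cost(current) < cost(best):
                best = current
        return best

    # Toy sequencing problem: sort a permutation by pairwise swaps.
    target = (0, 1, 2, 3, 4)

    def cost(p):
        return sum(a != b for a, b in zip(p, target))

    def neighbors(p):
        swaps = []
        for i, j in itertools.combinations(range(len(p)), 2):
            q = list(p)
            q[i], q[j] = q[j], q[i]
            swaps.append(tuple(q))
        return swaps

    print(tabu_search((4, 3, 2, 1, 0), neighbors, cost))  # reaches (0, 1, 2, 3, 4)
    ```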

  20. Coordinated distribution network control of tap changer transformers, capacitors and PV inverters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ceylan, Oğuzhan; Liu, Guodong; Tomsovic, Kevin

    A power distribution system operates most efficiently with voltage deviations along a feeder kept to a minimum, and it must ensure all voltages remain within specified limits. Recently, with the increased integration of photovoltaics, the variable power output has led to increased voltage fluctuations and violation of operating limits. This study proposes an optimization model based on a recently developed heuristic search method, grey wolf optimization, to coordinate the various distribution controllers. Several case studies on IEEE 33 and 69 bus test systems, modified by including tap changing transformers, capacitors and photovoltaic solar panels, are performed. Simulation results are compared to two other heuristic-based optimization methods: harmony search and differential evolution. Finally, the simulation results show the effectiveness of the method and indicate that using the reactive power outputs of PVs facilitates a better voltage magnitude profile.

  1. Coordinated distribution network control of tap changer transformers, capacitors and PV inverters

    DOE PAGES

    Ceylan, Oğuzhan; Liu, Guodong; Tomsovic, Kevin

    2017-06-08

    A power distribution system operates most efficiently with voltage deviations along a feeder kept to a minimum, and it must ensure all voltages remain within specified limits. Recently, with the increased integration of photovoltaics, the variable power output has led to increased voltage fluctuations and violation of operating limits. This study proposes an optimization model based on a recently developed heuristic search method, grey wolf optimization, to coordinate the various distribution controllers. Several case studies on IEEE 33 and 69 bus test systems, modified by including tap changing transformers, capacitors and photovoltaic solar panels, are performed. Simulation results are compared to two other heuristic-based optimization methods: harmony search and differential evolution. Finally, the simulation results show the effectiveness of the method and indicate that using the reactive power outputs of PVs facilitates a better voltage magnitude profile.

  2. Size-guided multi-seed heuristic method for geometry optimization of clusters: Application to benzene clusters.

    PubMed

    Takeuchi, Hiroshi

    2018-05-08

    Since searching for the global minimum on the potential energy surface of a cluster is very difficult, many geometry optimization methods have been proposed, in which initial geometries are randomly generated and subsequently improved with different algorithms. In this study, a size-guided multi-seed heuristic method is developed and applied to benzene clusters. It produces initial configurations of the cluster with n molecules from the lowest-energy configurations of the cluster with n - 1 molecules (seeds). The initial geometries are further optimized with the geometrical perturbations previously used for molecular clusters. These steps are repeated until n reaches a predefined size. The method locates putative global minima of benzene clusters with up to 65 molecules. The performance of the method is discussed in terms of computational cost, rates of locating the global minima, and energies of initial geometries. © 2018 Wiley Periodicals, Inc.
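
    The size-guided idea can be sketched with atomic Lennard-Jones clusters standing in for benzene molecules; the molecular geometry, the authors' perturbation operators, and the seed-selection details are all simplified away here, so this is an illustrative sketch only:

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def lj_energy(flat):
        # Total Lennard-Jones energy of an atomic cluster (reduced units).
        x = flat.reshape(-1, 3)
        i, j = np.triu_indices(len(x), k=1)
        r = np.linalg.norm(x[i] - x[j], axis=1)
        return float(np.sum(4.0 * (r ** -12 - r ** -6)))

    def random_unit(rng):
        v = rng.normal(size=3)
        return v / np.linalg.norm(v)

    def grow(seeds, n_trials=20, seed=0):
        """Size-guided step: build size-n candidates from the best size-(n-1)
        minima (the seeds) by adding one atom outside the cluster, then relax."""
        rng = np.random.default_rng(seed)
        grown = []
        for s in seeds:
            for _ in range(n_trials):
                radius = np.linalg.norm(s, axis=1).max() + 1.0
                x0 = np.vstack([s, radius * random_unit(rng)]).ravel()
                res = minimize(lj_energy, x0, method="L-BFGS-B")
                grown.append((res.fun, res.x.reshape(-1, 3)))
        grown.sort(key=lambda t: t[0])
        return [g for _, g in grown[:len(seeds)]]  # lowest-energy configurations

    # Grow from the two-atom minimum up to five atoms, reusing seeds each size.
    seeds = [np.array([[0.0, 0.0, 0.0], [0.0, 0.0, 1.12]])]
    for n in range(3, 6):
        seeds = grow(seeds)
    print(lj_energy(seeds[0].ravel()))  # energy of the best 5-atom minimum found
    ```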

  3. A heuristic for deriving the optimal number and placement of reconnaissance sensors

    NASA Astrophysics Data System (ADS)

    Nanda, S.; Weeks, J.; Archer, M.

    2008-04-01

    A key to mastering asymmetric warfare is the acquisition of accurate intelligence on adversaries and their assets in urban and open battlefields. To achieve this, one needs adequate numbers of tactical sensors placed in locations to optimize coverage, where optimality is realized by covering a given area of interest with the least number of sensors, or covering the largest possible subsection of an area of interest with a fixed set of sensors. Unfortunately, neither problem admits a polynomial time algorithm as a solution, and therefore, the placement of such sensors must utilize intelligent heuristics instead. In this paper, we present a scheme implemented on parallel SIMD processing architectures to yield significantly faster results, and that is highly scalable with respect to dynamic changes in the area of interest. Furthermore, the solution to the first problem immediately translates to serve as a solution to the latter if and when any sensors are rendered inoperable.

  4. Single machine total completion time minimization scheduling with a time-dependent learning effect and deteriorating jobs

    NASA Astrophysics Data System (ADS)

    Wang, Ji-Bo; Wang, Ming-Zheng; Ji, Ping

    2012-05-01

    In this article, we consider a single machine scheduling problem with a time-dependent learning effect and deteriorating jobs. By the effects of time-dependent learning and deterioration, we mean that the job processing time is defined by a function of its starting time and total normal processing time of jobs in front of it in the sequence. The objective is to determine an optimal schedule so as to minimize the total completion time. This problem remains open for the case of -1 < a < 0, where a denotes the learning index; we show that an optimal schedule of the problem is V-shaped with respect to job normal processing times. Three heuristic algorithms utilising the V-shaped property are proposed, and computational experiments show that the last heuristic algorithm performs effectively and efficiently in obtaining near-optimal solutions.

  5. Thermodynamic heuristics with case-based reasoning: combined insights for RNA pseudoknot secondary structure.

    PubMed

    Al-Khatib, Ra'ed M; Rashid, Nur'Aini Abdul; Abdullah, Rosni

    2011-08-01

    The secondary structure of RNA pseudoknots has been extensively inferred and scrutinized by computational approaches. Experimental methods for determining RNA structure are time consuming and tedious; therefore, predictive computational approaches are required. Predicting the most accurate and energy-stable pseudoknot RNA secondary structure has been proven to be an NP-hard problem. In this paper, a new RNA folding approach, termed MSeeker, is presented; it includes KnotSeeker (a heuristic method) and Mfold (a thermodynamic algorithm). The global optimization of this thermodynamic heuristic approach was further enhanced by using a case-based reasoning technique as a local optimization method. MSeeker is a proposed algorithm for predicting RNA pseudoknot structure from individual sequences, especially long ones. This research demonstrates that MSeeker improves the sensitivity and specificity of existing RNA pseudoknot structure predictions. The performance and structural results from this proposed method were evaluated against seven other state-of-the-art pseudoknot prediction methods. The MSeeker method had better sensitivity than the DotKnot, FlexStem, HotKnots, pknotsRG, ILM, NUPACK and pknotsRE methods, with 79% of the predicted pseudoknot base-pairs being correct.

  6. Rarity-weighted richness: a simple and reliable alternative to integer programming and heuristic algorithms for minimum set and maximum coverage problems in conservation planning.

    PubMed

    Albuquerque, Fabio; Beier, Paul

    2015-01-01

    Here we report that prioritizing sites in order of rarity-weighted richness (RWR) is a simple, reliable way to identify sites that represent all species in the fewest number of sites (the minimum set problem) or to identify sites that represent the largest number of species within a given number of sites (the maximum coverage problem). We compared the number of species represented in sites prioritized by RWR to numbers of species represented in sites prioritized by the Zonation software package for 11 datasets in which the size of individual planning units (sites) ranged from <1 ha to 2,500 km². On average, RWR solutions were more efficient than Zonation solutions. Integer programming remains the only guaranteed way to find an optimal solution, and heuristic algorithms remain superior for conservation prioritizations that consider compactness and multiple near-optimal solutions in addition to species representation. But because RWR can be implemented easily and quickly in R or a spreadsheet, it is an attractive alternative to integer programming or heuristic algorithms in some conservation prioritization contexts.
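
    The score itself is simple to compute: a species occurring in s sites receives weight 1/s, and a site's RWR is the sum of the weights of the species present. A minimal sketch with a hypothetical presence-absence matrix:

    ```python
    import numpy as np

    # presence[i, j] = 1 if species j occurs in site i (hypothetical toy data).
    presence = np.array([
        [1, 1, 0, 0],
        [1, 0, 1, 0],
        [0, 0, 1, 1],
        [1, 1, 1, 0],
    ])

    # Rarity weight of a species: 1 / number of sites it occupies.
    rarity = 1.0 / presence.sum(axis=0)

    # RWR of a site: sum of the rarity weights of the species it contains.
    rwr = presence @ rarity

    # Prioritize sites in descending order of RWR.
    priority = np.argsort(-rwr)
    print(priority, np.round(rwr[priority], 3))
    ```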

  7. Simple heuristics and rules of thumb: where psychologists and behavioural biologists might meet.

    PubMed

    Hutchinson, John M C; Gigerenzer, Gerd

    2005-05-31

    The Centre for Adaptive Behaviour and Cognition (ABC) has hypothesised that much human decision-making can be described by simple algorithmic process models (heuristics). This paper explains this approach and relates it to research in biology on rules of thumb, which we also review. As an example of a simple heuristic, consider the lexicographic strategy of Take The Best for choosing between two alternatives: cues are searched in turn until one discriminates, then search stops and all other cues are ignored. Heuristics consist of building blocks, and building blocks exploit evolved or learned abilities such as recognition memory; it is the complexity of these abilities that allows the heuristics to be simple. Simple heuristics have an advantage in making decisions fast and with little information, and in avoiding overfitting. Furthermore, humans are observed to use simple heuristics. Simulations show that the statistical structures of different environments affect which heuristics perform better, a relationship referred to as ecological rationality. We contrast ecological rationality with the stronger claim of adaptation. Rules of thumb from biology provide clearer examples of adaptation because animals can be studied in the environments in which they evolved. The range of examples is also much more diverse. To investigate them, biologists have sometimes used similar simulation techniques to ABC, but many examples depend on empirically driven approaches. ABC's theoretical framework can be useful in connecting some of these examples, particularly the scattered literature on how information from different cues is integrated. Optimality modelling is usually used to explain less detailed aspects of behaviour but might more often be redirected to investigate rules of thumb.
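
    Take The Best is easy to state in code: inspect cues in order of validity and let the first discriminating cue decide. The cue values below are hypothetical and serve only to illustrate the stopping rule:

    ```python
    def take_the_best(cues_a, cues_b, cue_order):
        """Lexicographic Take The Best: inspect cues in validity order and stop
        at the first cue that discriminates; all remaining cues are ignored.
        Cue values are 1 (points to a higher criterion value) or 0."""
        for cue in cue_order:
            if cues_a[cue] != cues_b[cue]:      # first discriminating cue decides
                return "A" if cues_a[cue] else "B"
        return "guess"                          # no cue discriminates

    # Which of two cities is larger? (hypothetical cue values)
    a = {"is_capital": 1, "has_top_team": 1, "on_major_river": 1}
    b = {"is_capital": 0, "has_top_team": 0, "on_major_river": 1}
    print(take_the_best(a, b, ["is_capital", "has_top_team", "on_major_river"]))
    ```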

  8. MetaGenyo: a web tool for meta-analysis of genetic association studies.

    PubMed

    Martorell-Marugan, Jordi; Toro-Dominguez, Daniel; Alarcon-Riquelme, Marta E; Carmona-Saez, Pedro

    2017-12-16

    Genetic association studies (GAS) aim to evaluate the association between genetic variants and phenotypes. In the last few years, the number of studies of this type has increased exponentially, but the results are not always reproducible due to experimental designs, low sample sizes and other methodological errors. In this field, meta-analysis techniques are becoming very popular tools to combine results across studies, increase statistical power and resolve discrepancies in genetic association studies. A meta-analysis summarizes research findings, increases statistical power and enables the identification of genuine associations between genotypes and phenotypes. Meta-analysis techniques are increasingly used in GAS, but the number of published meta-analyses containing errors is also increasing. Although there are several software packages that implement meta-analysis, none of them is specifically designed for genetic association studies, and in most cases their use requires advanced programming or scripting expertise. We have developed MetaGenyo, a web tool for meta-analysis in GAS. MetaGenyo implements a complete and comprehensive workflow that can be executed in an easy-to-use environment without programming knowledge. MetaGenyo has been developed to guide users through the main steps of a GAS meta-analysis, covering the Hardy-Weinberg test, statistical association for different genetic models, analysis of heterogeneity, testing for publication bias, subgroup analysis and robustness testing of the results. MetaGenyo is a useful tool to conduct comprehensive genetic association meta-analyses. The application is freely available at http://bioinfo.genyo.es/metagenyo/.

  9. Reserve design to maximize species persistence

    Treesearch

    Robert G. Haight; Laurel E. Travis

    2008-01-01

    We develop a reserve design strategy to maximize the probability of species persistence predicted by a stochastic, individual-based, metapopulation model. Because the population model does not fit exact optimization procedures, our strategy involves deriving promising solutions from theory, obtaining promising solutions from a simulation optimization heuristic, and...

  10. An analysis of iterated local search for job-shop scheduling.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whitley, L. Darrell; Howe, Adele E.; Watson, Jean-Paul

    2003-08-01

    Iterated local search, or ILS, is among the most straightforward meta-heuristics for local search. ILS employs both small-step and large-step move operators. Search proceeds via iterative modifications to a single solution, in distinct alternating phases. In the first phase, local neighborhood search (typically greedy descent) is used in conjunction with the small-step operator to transform solutions into local optima. In the second phase, the large-step operator is applied to generate perturbations to the local optima obtained in the first phase. Ideally, when local neighborhood search is applied to the resulting solution, search will terminate at a different local optimum, i.e., the large-step perturbations should be sufficiently large to enable escape from the attractor basins of local optima. ILS has proven capable of delivering excellent performance on numerous NP-hard optimization problems [LMS03]. However, despite its simplicity, very little is known about why ILS can be so effective, and under what conditions. The goal of this paper is to advance the state-of-the-art in the analysis of meta-heuristics by providing answers to this research question. The authors focus on characterizing both the relationship between the structure of the underlying search space and ILS performance, and the dynamic behavior of ILS. The analysis proceeds in the context of the job-shop scheduling problem (JSP) [Tai94]. They begin by demonstrating that the attractor basins of local optima in the JSP are surprisingly weak and can be escaped with high probability by accepting a short random sequence of less-fit neighbors. This result is used to develop a new ILS algorithm for the JSP, I-JAR, whose performance is competitive with tabu search on difficult benchmark instances. They conclude by developing a very accurate behavioral model of I-JAR, which yields significant insights into the dynamics of search. The analysis is based on a set of 100 random 10 x 10 problem instances, in addition to some widely used benchmark instances. Both I-JAR and the tabu search algorithm they consider are based on the N1 move operator introduced by van Laarhoven et al. [vLAL92]. The N1 operator induces a connected search space, such that it is always possible to move from an arbitrary solution to an optimal solution; this property is integral to the development of a behavioral model of I-JAR. However, much of the analysis generalizes to other move operators, including that of Nowicki and Smutnicki [NS96]. Finally, the models are based on the distance between two solutions, which they take to be the well-known disjunctive graph distance [MBK99].
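
    The two-phase structure described above fits in a few lines. The sketch below is a generic ILS skeleton with a toy integer problem; the JSP-specific N1 operator and the acceptance details of I-JAR are not reproduced:

    ```python
    import random

    def iterated_local_search(initial, local_search, perturb, cost,
                              iterations=100, seed=0):
        """Generic ILS: alternate greedy descent (small-step operator) with a
        larger random perturbation (large-step operator) that should be just
        big enough to escape the attractor basin of the current local optimum."""
        rng = random.Random(seed)
        current = best = local_search(initial)
        for _ in range(iterations):
            candidate = local_search(perturb(current, rng))
            if cost(candidate) <= cost(current):   # accept equal-or-better optima
                current = candidate
            if cost(current) < cost(best):
                best = current
        return best

    # Toy usage: minimize a rugged function over the integers.
    def cost(x):
        return (x - 37) ** 2 + 10 * (x % 7)

    def local_search(x):
        while True:
            step = min((x - 1, x, x + 1), key=cost)
            if step == x:
                return x
            x = step

    def perturb(x, rng):
        return x + rng.randint(-20, 20)

    print(iterated_local_search(0, local_search, perturb, cost))
    ```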

  11. Test Scheduling for Core-Based SOCs Using Genetic Algorithm Based Heuristic Approach

    NASA Astrophysics Data System (ADS)

    Giri, Chandan; Sarkar, Soumojit; Chattopadhyay, Santanu

    This paper presents a Genetic algorithm (GA) based solution to co-optimize test scheduling and wrapper design for core based SOCs. Core testing solutions are generated as a set of wrapper configurations, represented as rectangles with width equal to the number of TAM (Test Access Mechanism) channels and height equal to the corresponding testing time. A locally optimal best-fit heuristic bin-packing algorithm has been used to determine the placement of rectangles, minimizing the overall test time, whereas the GA has been utilized to generate the sequence of rectangles to be considered for placement. Experimental results on ITC'02 benchmark SOCs show that the proposed method provides better solutions compared to recent works reported in the literature.
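
    The division of labor can be sketched as follows: the GA searches over orderings of core tests, and a best-fit decoder places each test (a rectangle: TAM channels x test time) at the earliest feasible position. The decoder below is a simplified stand-in for the paper's bin-packing heuristic, with hypothetical test data:

    ```python
    def decode(sequence, n_channels):
        """Best-fit decoding of a core-test sequence. Each test is a pair
        (width = TAM channels needed, height = test time). Returns the makespan."""
        free = [0.0] * n_channels                # next free time per TAM channel
        for w, h in sequence:
            # Try every contiguous band of w channels; best fit = earliest start.
            starts = [max(free[i:i + w]) for i in range(n_channels - w + 1)]
            i = min(range(len(starts)), key=starts.__getitem__)
            start = starts[i]
            for c in range(i, i + w):
                free[c] = start + h
        return max(free)

    tests = [(2, 5.0), (1, 3.0), (3, 2.0), (2, 4.0)]
    print(decode(tests, n_channels=4))  # a GA would search over orderings of `tests`
    ```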

  12. Social projection to ingroups and outgroups: a review and meta-analysis.

    PubMed

    Robbins, Jordan M; Krueger, Joachim I

    2005-01-01

    Social projection is the tendency to expect similarities between oneself and others. A review of the literature and a meta-analysis reveal that projection is stronger when people make judgments about ingroups than when they make judgments about outgroups. Analysis of moderator variables further reveals that ingroup projection is stronger for laboratory groups than for real social categories. The mode of analysis (i.e., nomothetic vs. idiographic) and the order of judgments (i.e., self or group judged first) have no discernable effects. Outgroup projection is positive, but small in size. Together, these findings support the view that projection can serve as an egocentric heuristic for inductive reasoning. The greater strength of ingroup projection can contribute to ingroup-favoritism, perceptions of ingroup homogeneity, and cooperation with ingroup members.

  13. Optimally Stopped Optimization

    NASA Astrophysics Data System (ADS)

    Vinci, Walter; Lidar, Daniel A.

    2016-11-01

    We combine the fields of heuristic optimization and optimal stopping. We propose a strategy for benchmarking randomized optimization algorithms that minimizes the expected total cost for obtaining a good solution with an optimal number of calls to the solver. To do so, rather than letting the objective function alone define a cost to be minimized, we introduce a further cost-per-call of the algorithm. We show that this problem can be formulated using optimal stopping theory. The expected cost is a flexible figure of merit for benchmarking probabilistic solvers that can be computed when the optimal solution is not known and that avoids the biases and arbitrariness that affect other measures. The optimal stopping formulation of benchmarking directly leads to a real-time optimal-utilization strategy for probabilistic optimizers with practical impact. We apply our formulation to benchmark simulated annealing on a class of maximum-2-satisfiability (MAX2SAT) problems. We also compare the performance of a D-Wave 2X quantum annealer to the Hamze-Freitas-Selby (HFS) solver, a specialized classical heuristic algorithm designed for low-tree-width graphs. On a set of frustrated-loop instances with planted solutions defined on up to N = 1098 variables, the D-Wave device is 2 orders of magnitude faster than the HFS solver, and, modulo known caveats related to suboptimal annealing times, exhibits identical scaling with problem size.
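
    A simplified reading of the figure of merit: if each solver call incurs a cost c and one stops after R independent calls, the quantity to minimize is E[min of R objective values] + R·c. The Monte Carlo sketch below uses this fixed-R simplification rather than the paper's sequential optimal-stopping rule, and the observed values are synthetic:

    ```python
    import numpy as np

    def expected_total_cost(samples, cost_per_call, max_calls=50):
        """Estimate E[min of R objective values] + R * cost_per_call from
        observed objective values of independent solver calls, and return
        the R that minimizes it (Monte Carlo, fixed-R simplification)."""
        rng = np.random.default_rng(0)
        samples = np.asarray(samples, dtype=float)
        expected = []
        for r in range(1, max_calls + 1):
            draws = rng.choice(samples, size=(2000, r), replace=True)
            expected.append(draws.min(axis=1).mean() + r * cost_per_call)
        best_r = int(np.argmin(expected)) + 1
        return best_r, expected[best_r - 1]

    # Hypothetical objective values from repeated randomized-solver runs.
    observed = np.random.default_rng(1).normal(loc=10.0, scale=2.0, size=200)
    print(expected_total_cost(observed, cost_per_call=0.05))
    ```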

  14. Nanoethics and the breaching of boundaries: a heuristic for going from encouragement to a fuller integration of ethical, legal and social issues and science : commentary on: "Adding to the mix: integrating ELSI into a National Nanoscale Science and Technology Center".

    PubMed

    Tuma, Julio R

    2011-12-01

    The intersection of ELSI and science forms a complicated nexus yet their integration is an important goal both for society and for the successful advancement of science. In what follows, I present a heuristic that makes boundary identification and crossing an important tool in the discovery of potential areas of ethical, legal, and social concern in science. A dynamic and iterative application of the heuristic can lead towards a fuller integration and appreciation of the concerns of ELSI and of science from both sides of the divide.

  15. SPARSE: quadratic time simultaneous alignment and folding of RNAs without sequence-based heuristics

    PubMed Central

    Will, Sebastian; Otto, Christina; Miladi, Milad; Möhl, Mathias; Backofen, Rolf

    2015-01-01

    Motivation: RNA-Seq experiments have revealed a multitude of novel ncRNAs. The gold standard for their analysis based on simultaneous alignment and folding suffers from extreme time complexity of O(n^6). Subsequently, numerous faster ‘Sankoff-style’ approaches have been suggested. Commonly, the performance of such methods relies on sequence-based heuristics that restrict the search space to optimal or near-optimal sequence alignments; however, the accuracy of sequence-based methods breaks down for RNAs with sequence identities below 60%. Alignment approaches like LocARNA that do not require sequence-based heuristics have been limited to high complexity (≥ quartic time). Results: Breaking this barrier, we introduce the novel Sankoff-style algorithm ‘sparsified prediction and alignment of RNAs based on their structure ensembles (SPARSE)’, which runs in quadratic time without sequence-based heuristics. To achieve this low complexity, on par with sequence alignment algorithms, SPARSE features strong sparsification based on structural properties of the RNA ensembles. Following PMcomp, SPARSE gains further speed-up from lightweight energy computation. Although all existing lightweight Sankoff-style methods restrict Sankoff’s original model by disallowing loop deletions and insertions, SPARSE transfers the Sankoff algorithm to the lightweight energy model completely for the first time. Compared with LocARNA, SPARSE achieves similar alignment and better folding quality in significantly less time (speedup: 3.7). At similar run-time, it aligns low sequence identity instances substantially more accurately than RAF, which uses sequence-based heuristics. Availability and implementation: SPARSE is freely available at http://www.bioinf.uni-freiburg.de/Software/SPARSE. Contact: backofen@informatik.uni-freiburg.de Supplementary information: Supplementary data are available at Bioinformatics online. PMID:25838465

  16. A Critical Examination of the Introduction of Drug Detection Dogs for Policing of Illicit Drugs in New South Wales, Australia Using Kingdon's "Multiple Streams" Heuristic

    ERIC Educational Resources Information Center

    Lancaster, Kari; Ritter, Alison; Hughes, Caitlin; Hoppe, Robert

    2017-01-01

    This paper critically analyses the introduction of drug detection dogs as a tool for policing of illicit drugs in New South Wales, Australia. Using Kingdon's "multiple streams" heuristic as a lens for analysis, we identify how the issue of drugs policing became prominent on the policy agenda, and the conditions under which the…

  17. Engaging Undergraduate Math Majors in Geoscience Research using Interactive Simulations and Computer Art

    NASA Astrophysics Data System (ADS)

    Matott, L. S.; Hymiak, B.; Reslink, C. F.; Baxter, C.; Aziz, S.

    2012-12-01

    As part of the NSF-sponsored 'URGE (Undergraduate Research Group Experiences) to Compute' program, Dr. Matott has been collaborating with talented Math majors to explore the design of cost-effective systems to safeguard groundwater supplies from contaminated sites. Such activity is aided by a combination of groundwater modeling, simulation-based optimization, and high-performance computing - disciplines largely unfamiliar to the students at the outset of the program. To help train and engage the students, a number of interactive and graphical software packages were utilized. Examples include: (1) a tutorial for exploring the behavior of evolutionary algorithms and other heuristic optimizers commonly used in simulation-based optimization; (2) an interactive groundwater modeling package for exploring alternative pump-and-treat containment scenarios at a contaminated site in Billings, Montana; (3) the R software package for visualizing various concepts related to subsurface hydrology; and (4) a job visualization tool for exploring the behavior of numerical experiments run on a large distributed computing cluster. Further engagement and excitement in the program was fostered by entering (and winning) a computer art competition run by the Coalition for Academic Scientific Computation (CASC). The winning submission visualizes an exhaustively mapped optimization cost surface and dramatically illustrates the phenomena of artificial minima - valley locations that correspond to designs whose costs are only partially optimal.

  18. On the psychology of the recognition heuristic: retrieval primacy as a key determinant of its use.

    PubMed

    Pachur, Thorsten; Hertwig, Ralph

    2006-09-01

    The recognition heuristic is a prime example of a boundedly rational mind tool that rests on an evolved capacity, recognition, and exploits environmental structures. When originally proposed, it was conjectured that no other probabilistic cue reverses the recognition-based inference (D. G. Goldstein & G. Gigerenzer, 2002). More recent studies challenged this view and gave rise to the argument that recognition enters inferences just like any other probabilistic cue. By linking research on the heuristic with research on recognition memory, the authors argue that the retrieval of recognition information is not tantamount to the retrieval of other probabilistic cues. Specifically, the retrieval of subjective recognition precedes that of an objective probabilistic cue and occurs at little to no cognitive cost. This retrieval primacy gives rise to 2 predictions, both of which have been empirically supported: Inferences in line with the recognition heuristic (a) are made faster than inferences inconsistent with it and (b) are more prevalent under time pressure. Suspension of the heuristic, in contrast, requires additional time, and direct knowledge of the criterion variable, if available, can trigger such suspension. Copyright 2006 APA

  19. Three hybridization models based on local search scheme for job shop scheduling problem

    NASA Astrophysics Data System (ADS)

    Balbi Fraga, Tatiana

    2015-05-01

    This work presents three different hybridization models based on the general schema of Local Search Heuristics, named Hybrid Successive Application, Hybrid Neighborhood, and Hybrid Improved Neighborhood. Although similar approaches may already have been presented in the literature in other contexts, in this work these models are applied to analyze the solution of the job shop scheduling problem, with the heuristics Taboo Search and Particle Swarm Optimization. Besides, we investigate some aspects that must be considered in order to achieve better solutions than those obtained by the original heuristics. The results demonstrate that the algorithms derived from these three hybrid models are more robust than the original algorithms and able to obtain better results than those found by Taboo Search alone.

  20. Parental investment: how an equity motive can produce inequality.

    PubMed

    Hertwig, Ralph; Davis, Jennifer Nerissa; Sulloway, Frank J

    2002-09-01

    The equity heuristic is a decision rule specifying that parents should attempt to subdivide resources more or less equally among their children. This investment rule coincides with the prescription from optimality models in economics and biology in cases in which expected future return for each offspring is equal. In this article, the authors present a counterintuitive implication of the equity heuristic: Whereas an equity motive produces a fair distribution at any given point in time, it yields a cumulative distribution of investments that is unequal. The authors test this analytical observation against evidence reported in studies exploring parental investment and show how the equity heuristic can provide an explanation of why the literature reports a diversity of birth order effects with respect to parental resource allocation.
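
    The counterintuitive implication is easy to reproduce: divide each year's resources equally among the children currently dependent, and the cumulative totals come out unequal, with middleborns receiving the least. A minimal simulation (birth spacing and length of the dependency period are arbitrary choices):

    ```python
    def cumulative_shares(n_children, spacing=2, horizon=18):
        """Per-child cumulative investment when each year's resources are split
        equally among currently dependent children (the equity heuristic).
        Middleborns never enjoy an exclusive period, so they accumulate less."""
        births = [i * spacing for i in range(n_children)]
        totals = [0.0] * n_children
        for year in range(births[-1] + horizon):
            present = [i for i, b in enumerate(births) if b <= year < b + horizon]
            for i in present:
                totals[i] += 1.0 / len(present)   # equal split of this year's unit
        return totals

    print(cumulative_shares(3))  # first- and lastborn exceed the middleborn
    ```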

  1. Development of a 3D log sawing optimization system for small sawmills in central Appalachia, US

    Treesearch

    Wenshu Lin; Jingxin Wang; Edward Thomas

    2011-01-01

    A 3D log sawing optimization system was developed to perform log generation, opening face determination, sawing simulation, and lumber grading using 3D modeling techniques. Heuristic and dynamic programming algorithms were used to determine opening face and grade sawing optimization. Positions and shapes of internal log defects were predicted using a model developed by...

  2. Sootblowing optimization for improved boiler performance

    DOEpatents

    James, John Robert; McDermott, John; Piche, Stephen; Pickard, Fred; Parikh, Neel J.

    2012-12-25

    A sootblowing control system that uses predictive models to bridge the gap between sootblower operation and boiler performance goals. The system uses predictive modeling and heuristics (rules) associated with different zones in a boiler to determine an optimal sequence of sootblower operations and achieve boiler performance targets. The system performs the sootblower optimization while observing any operational constraints placed on the sootblowers.

  3. Sootblowing optimization for improved boiler performance

    DOEpatents

    James, John Robert; McDermott, John; Piche, Stephen; Pickard, Fred; Parikh, Neel J

    2013-07-30

    A sootblowing control system that uses predictive models to bridge the gap between sootblower operation and boiler performance goals. The system uses predictive modeling and heuristics (rules) associated with different zones in a boiler to determine an optimal sequence of sootblower operations and achieve boiler performance targets. The system performs the sootblower optimization while observing any operational constraints placed on the sootblowers.

  4. Quad-rotor flight path energy optimization

    NASA Astrophysics Data System (ADS)

    Kemper, Edward

    Quad-Rotor unmanned aerial vehicles (UAVs) have been a popular area of research and development in the last decade, especially with the advent of affordable microcontrollers like the MSP 430 and the Raspberry Pi. Path-energy optimization is an area that is well developed for linear systems. In this thesis, this idea of path-energy optimization is extended to the nonlinear model of the quad-rotor UAV. The classical optimization technique is adapted to the nonlinear model that is derived for the problem at hand, resulting in a set of partial differential equations and boundary value conditions to solve these equations. Then, different techniques to implement energy optimization algorithms are tested using simulations in Python. First, a purely nonlinear approach is used. This method is shown to be computationally intensive, with no practical solution available in a reasonable amount of time. Second, heuristic techniques to minimize the energy of the flight path are tested, using Ziegler-Nichols' proportional integral derivative (PID) controller tuning technique. Finally, a brute-force look-up-table-based PID controller is used. Simulation results of the heuristic method show that both reliable control of the system and path-energy optimization are achieved in a reasonable amount of time.
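
    For reference, the classic Ziegler-Nichols closed-loop recipe mentioned above: raise the proportional gain until the loop sustains a steady oscillation, note the ultimate gain K_u and the oscillation period T_u, and set the PID gains from those two numbers. The plant values below are hypothetical:

    ```python
    def ziegler_nichols_pid(k_u, t_u):
        """Classic Ziegler-Nichols tuning: K_u is the ultimate (critical) gain,
        T_u the period of the sustained oscillation at that gain. Returns
        (Kp, Ki, Kd) for a parallel-form PID controller."""
        kp = 0.6 * k_u
        ti = 0.5 * t_u             # integral time
        td = 0.125 * t_u           # derivative time
        return kp, kp / ti, kp * td

    print(ziegler_nichols_pid(k_u=4.0, t_u=2.0))  # hypothetical plant values
    ```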

  5. Gene selection using hybrid binary black hole algorithm and modified binary particle swarm optimization.

    PubMed

    Pashaei, Elnaz; Pashaei, Elham; Aydin, Nizamettin

    2018-04-14

    In cancer classification, gene selection is an important data preprocessing technique, but it is a difficult task due to the large search space. Accordingly, the objective of this study is to develop a hybrid meta-heuristic Binary Black Hole Algorithm (BBHA) and Binary Particle Swarm Optimization (BPSO) (4-2) model that emphasizes gene selection. In this model, the BBHA is embedded in the BPSO (4-2) algorithm to make the BPSO (4-2) more effective and to facilitate the exploration and exploitation of the BPSO (4-2) algorithm to further improve the performance. This model has been associated with the Random Forest Recursive Feature Elimination (RF-RFE) pre-filtering technique. The classifiers evaluated in the proposed framework are Sparse Partial Least Squares Discriminant Analysis (SPLSDA), k-nearest neighbor, and Naive Bayes. The performance of the proposed method was evaluated on two benchmark and three clinical microarrays. The experimental results and statistical analysis confirm the better performance of the BPSO (4-2)-BBHA compared with the BBHA, the BPSO (4-2) and several state-of-the-art methods in terms of avoiding local minima, convergence rate, accuracy and number of selected genes. The results also show that the BPSO (4-2)-BBHA model can successfully identify known biologically and statistically significant genes from the clinical datasets. Copyright © 2018 Elsevier Inc. All rights reserved.

  6. Comparing the performance of expert user heuristics and an integer linear program in aircraft carrier deck operations.

    PubMed

    Ryan, Jason C; Banerjee, Ashis Gopal; Cummings, Mary L; Roy, Nicholas

    2014-06-01

    Planning operations across a number of domains can be considered as resource allocation problems with timing constraints. An unexplored instance of such a problem domain is the aircraft carrier flight deck, where, in current operations, replanning is done without the aid of any computerized decision support. Rather, veteran operators employ a set of experience-based heuristics to quickly generate new operating schedules. These expert user heuristics are neither codified nor evaluated by the United States Navy; they have grown solely from the convergent experiences of supervisory staff. As unmanned aerial vehicles (UAVs) are introduced in the aircraft carrier domain, these heuristics may require alterations due to differing capabilities. The inclusion of UAVs also allows for new opportunities for on-line planning and control, providing an alternative to the current heuristic-based replanning methodology. To investigate these issues formally, we have developed a decision support system for flight deck operations that utilizes a conventional integer linear program-based planning algorithm. In this system, a human operator sets both the goals and constraints for the algorithm, which then returns a proposed schedule for operator approval. As a part of validating this system, the performance of this collaborative human-automation planner was compared with that of the expert user heuristics over a set of test scenarios. The resulting analysis shows that human heuristics often outperform the plans produced by an optimization algorithm, but are also often more conservative.

  7. Heuristics as Bayesian inference under extreme priors.

    PubMed

    Parpart, Paula; Jones, Matt; Love, Bradley C

    2018-05-01

    Simple heuristics are often regarded as tractable decision strategies because they ignore a great deal of information in the input data. One puzzle is why heuristics can outperform full-information models, such as linear regression, which make full use of the available information. These "less-is-more" effects, in which a relatively simpler model outperforms a more complex model, are prevalent throughout cognitive science, and are frequently argued to demonstrate an inherent advantage of simplifying computation or ignoring information. In contrast, we show at the computational level (where algorithmic restrictions are set aside) that it is never optimal to discard information. Through a formal Bayesian analysis, we prove that popular heuristics, such as tallying and take-the-best, are formally equivalent to Bayesian inference under the limit of infinitely strong priors. Varying the strength of the prior yields a continuum of Bayesian models with the heuristics at one end and ordinary regression at the other. Critically, intermediate models perform better across all our simulations, suggesting that down-weighting information with the appropriate prior is preferable to entirely ignoring it. Rather than because of their simplicity, our analyses suggest heuristics perform well because they implement strong priors that approximate the actual structure of the environment. We end by considering how new heuristics could be derived by infinitely strengthening the priors of other Bayesian models. These formal results have implications for work in psychology, machine learning and economics. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
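
    The prior-strength continuum can be demonstrated with ordinary ridge regression as a stand-in for the paper's formal construction: at lambda = 0 the posterior mean is the least-squares solution, and as lambda grows it tends to X^T y / lambda, whose relative profile matches the raw cue-criterion covariances that simple heuristics effectively exploit. A sketch on synthetic correlated cues:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n, d = 200, 5
    common = rng.normal(size=(n, 1))            # shared variance: correlated cues
    X = 0.7 * common + 0.7 * rng.normal(size=(n, d))
    w_true = np.array([3.0, 2.0, 1.0, 0.5, 0.1])
    y = X @ w_true + rng.normal(scale=0.5, size=n)

    def ridge(X, y, lam):
        # Posterior mean under a zero-mean Gaussian prior; lam is prior strength.
        return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

    for lam in [0.0, 10.0, 1000.0, 1e6]:
        w = ridge(X, y, lam)
        # Normalized weight profile: lam = 0 is ordinary least squares; as lam
        # grows, the profile moves toward that of X.T @ y, the raw
        # cue-criterion covariances.
        print(f"lam={lam:>9}", np.round(w / np.abs(w).max(), 3))
    ```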

  8. Development of the PEBLebl Traveling Salesman Problem Computerized Testbed

    ERIC Educational Resources Information Center

    Mueller, Shane T.; Perelman, Brandon S.; Tan, Yin Yin; Thanasuan, Kejkaew

    2015-01-01

    The traveling salesman problem (TSP) is a combinatorial optimization problem that requires finding the shortest path through a set of points ("cities") that returns to the starting point. Because humans provide heuristic near-optimal solutions to Euclidean versions of the problem, it has sometimes been used to investigate human visual…

  9. A generalized network flow model for the multi-mode resource-constrained project scheduling problem with discounted cash flows

    NASA Astrophysics Data System (ADS)

    Chen, Miawjane; Yan, Shangyao; Wang, Sin-Siang; Liu, Chiu-Lan

    2015-02-01

    An effective project schedule is essential for enterprises to increase their efficiency of project execution, to maximize profit, and to minimize wastage of resources. Heuristic algorithms have been developed to efficiently solve the complicated multi-mode resource-constrained project scheduling problem with discounted cash flows (MRCPSPDCF) that characterize real problems. However, the solutions obtained in past studies have been approximate and are difficult to evaluate in terms of optimality. In this study, a generalized network flow model, embedded in a time-precedence network, is proposed to formulate the MRCPSPDCF with the payment at activity completion times. Mathematically, the model is formulated as an integer network flow problem with side constraints, which can be efficiently solved for optimality, using existing mathematical programming software. To evaluate the model performance, numerical tests are performed. The test results indicate that the model could be a useful planning tool for project scheduling in the real world.

  10. Statistical process control using optimized neural networks: a case study.

    PubMed

    Addeh, Jalil; Ebrahimzadeh, Ata; Azarbad, Milad; Ranaee, Vahid

    2014-09-01

    The most common statistical process control (SPC) tools employed for monitoring process changes are control charts. A control chart demonstrates that the process has altered by generating an out-of-control signal. This study investigates the design of an accurate system for control chart pattern (CCP) recognition in two respects. First, an efficient system is introduced that includes two main modules: a feature extraction module and a classifier module. In the feature extraction module, a proper set of shape and statistical features is proposed as the efficient characteristics of the patterns. In the classifier module, several neural networks, such as the multilayer perceptron, probabilistic neural network and radial basis function, are investigated. Based on an experimental study, the best classifier is chosen in order to recognize the CCPs. Second, a hybrid heuristic recognition system is introduced based on the cuckoo optimization algorithm (COA) to improve the generalization performance of the classifier. The simulation results show that the proposed algorithm has high recognition accuracy. Copyright © 2013 ISA. Published by Elsevier Ltd. All rights reserved.

  11. Approximation algorithms for the min-power symmetric connectivity problem

    NASA Astrophysics Data System (ADS)

    Plotnikov, Roman; Erzin, Adil; Mladenovic, Nenad

    2016-10-01

    We consider the NP-hard problem of synthesizing an optimal spanning communication subgraph in a given arbitrary simple edge-weighted graph. This problem arises in wireless networks when minimizing total transmission power consumption. We propose several new heuristics based on the variable neighborhood search metaheuristic for the approximate solution of the problem. We performed a numerical experiment in which all proposed algorithms were executed on randomly generated test samples. For these instances, on average, our algorithms outperform the previously known heuristics.
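
    The variable neighborhood search template underlying such heuristics, as a generic sketch; the problem-specific shaking moves on spanning subgraphs are not reproduced, and the toy integer problem exists only to make the skeleton runnable:

    ```python
    import random

    def variable_neighborhood_search(x0, shakes, local_search, cost,
                                     max_rounds=50, seed=0):
        """Basic VNS: shake the incumbent in increasingly large neighborhoods,
        locally improve, and restart from the smallest neighborhood whenever
        the shaken-and-improved candidate is better."""
        rng = random.Random(seed)
        x = local_search(x0)
        for _ in range(max_rounds):
            k = 0
            while k < len(shakes):
                candidate = local_search(shakes[k](x, rng))
                if cost(candidate) < cost(x):
                    x, k = candidate, 0     # improvement: back to first neighborhood
                else:
                    k += 1                  # no luck: try a larger neighborhood
        return x

    # Toy usage: minimize a rugged function over the integers.
    def cost(x):
        return (x - 42) ** 2 + 15 * (x % 5)

    def local_search(x):
        while True:
            step = min((x - 1, x, x + 1), key=cost)
            if step == x:
                return x
            x = step

    shakes = [lambda x, r: x + r.randint(-3, 3),
              lambda x, r: x + r.randint(-25, 25)]
    print(variable_neighborhood_search(0, shakes, local_search, cost))
    ```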

  12. Feature selection methods for big data bioinformatics: A survey from the search perspective.

    PubMed

    Wang, Lipo; Wang, Yaoli; Chang, Qing

    2016-12-01

    This paper surveys main principles of feature selection and their recent applications in big data bioinformatics. Instead of the commonly used categorization into filter, wrapper, and embedded approaches to feature selection, we formulate feature selection as a combinatorial optimization or search problem and categorize feature selection methods into exhaustive search, heuristic search, and hybrid methods, where heuristic search methods may further be categorized into those with or without data-distilled feature ranking measures. Copyright © 2016 Elsevier Inc. All rights reserved.

  13. A Meta-Cognitive Tool for Courseware Development, Maintenance, and Reuse

    ERIC Educational Resources Information Center

    Coffey, John W.

    2007-01-01

    Novak and Iuli [Novak, J. D. & Iuli, R. J. (1991). The use of meta-cognitive tools to facilitate knowledge production. In "A paper presented at the fourth Florida AI research symposium (FLAIRS '91)," Pensacola Beach, FL, May, 1991.] discuss the use of Concept Maps as meta-cognitive tools that help people to think about thinking. This work…

  14. Teaching meta-analysis using MetaLight.

    PubMed

    Thomas, James; Graziosi, Sergio; Higgins, Steve; Coe, Robert; Torgerson, Carole; Newman, Mark

    2012-10-18

    Meta-analysis is a statistical method for combining the results of primary studies. It is often used in systematic reviews and is increasingly a method and topic that appears in student dissertations. MetaLight is a freely available software application that runs simple meta-analyses and contains specific functionality to facilitate the teaching and learning of meta-analysis. While there are many courses and resources for meta-analysis available and numerous software applications to run meta-analyses, there are few pieces of software which are aimed specifically at helping those teaching and learning meta-analysis. Valuable teaching time can be spent learning the mechanics of a new software application, rather than on the principles and practices of meta-analysis. We discuss ways in which the MetaLight tool can be used to present some of the main issues involved in undertaking and interpreting a meta-analysis. While there are many software tools available for conducting meta-analysis, in the context of a teaching programme such software can require expenditure both in terms of money and in terms of the time it takes to learn how to use it. MetaLight was developed specifically as a tool to facilitate the teaching and learning of meta-analysis and we have presented here some of the ways it might be used in a training situation.

  15. The E-health Literacy Demands of Australia's My Health Record: A Heuristic Evaluation of Usability.

    PubMed

    Walsh, Louisa; Hemsley, Bronwyn; Allan, Meredith; Adams, Natalie; Balandin, Susan; Georgiou, Andrew; Higgins, Isabel; McCarthy, Shaun; Hill, Sophie

    2017-01-01

    My Health Record is Australia's electronic personal health record system, which was introduced in July 2012. As of August 2017, approximately 21 percent of Australia's total population was registered to use My Health Record. Internationally, usability issues have been shown to negatively influence the uptake and use of electronic health record systems, and this scenario may particularly affect people who have low e-health literacy. It is likely that usability issues are negatively affecting the uptake and use of My Health Record in Australia. The objective of this study was to identify potential e-health literacy-related usability issues within My Health Record through a heuristic evaluation method. Between September 14 and October 12, 2016, three of the authors conducted a heuristic evaluation of the two consumer-facing components of My Health Record: the information website and the electronic health record itself. These two components were evaluated against two sets of heuristics: the Health Literacy Online checklist and the Monkman Heuristics. The Health Literacy Online checklist and Monkman Heuristics are evidence-based checklists of web design elements with a focus on design for audiences with low health literacy. During this heuristic evaluation, the investigators individually navigated through the consumer-facing components of My Health Record, recording instances where My Health Record did not conform to the checklist criteria. After the individual evaluations were completed, the investigators conferred and aggregated their results. From this process, a list of usability violations was constructed. When evaluated against the Health Literacy Online Checklist, the information website demonstrated violations in 12 of 35 criteria, and the electronic health record demonstrated violations in 16 of 35 criteria. When evaluated against the Monkman Heuristics, the information website demonstrated violations in 7 of 11 criteria, and the electronic health record demonstrated violations in 9 of 11 criteria. The identified violations included usability issues with the reading levels used within My Health Record, the graphic design elements, the layout of web pages, and a lack of images and audiovisual tools to support learning. Other important usability issues included a lack of translated resources, difficulty using accessibility tools, and complexity of the registration processes. My Health Record is an important piece of technology that has the potential to facilitate better communication between consumers and their health providers. However, this heuristic evaluation demonstrated that many usability-related elements of My Health Record cater poorly to users at risk of having low e-health literacy. Usability issues have been identified as an important barrier to use of personal health records internationally, and the findings of this heuristic evaluation demonstrate that usability issues may be substantial barriers to the uptake and use of My Health Record.

  16. ITO-based evolutionary algorithm to solve traveling salesman problem

    NASA Astrophysics Data System (ADS)

    Dong, Wenyong; Sheng, Kang; Yang, Chuanhua; Yi, Yunfei

    2014-03-01

    In this paper, an ITO algorithm inspired by the ITO stochastic process is proposed for the Traveling Salesman Problem (TSP). Many meta-heuristic methods have been successfully applied to TSP; however, ITO, as a member of this family, has yet to be demonstrated on it. Starting from the design of the key operators, including the move operator and the wave operator, we present an ITO-based method for TSP; moreover, the algorithm's performance under different parameter sets and the maintenance of population diversity information are also studied.
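
    The abstract names but does not define ITO's move and wave operators; for orientation, here is a sketch of the classical 2-opt move that many TSP local searches and metaheuristics build on (it is not the ITO operator itself), applied to a random stand-in instance.

    ```python
    import math
    import random

    def tour_length(tour, pts):
        return sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
                   for i in range(len(tour)))

    def two_opt(tour, pts, iters=20000):
        # 2-opt move: reverse a segment of the tour; keep the result
        # whenever it shortens the total length.
        tour = tour[:]
        best = tour_length(tour, pts)
        n = len(tour)
        for _ in range(iters):
            i, j = sorted(random.sample(range(n), 2))
            if j - i < 2:
                continue
            cand = tour[:i] + tour[i:j][::-1] + tour[j:]
            length = tour_length(cand, pts)
            if length < best:
                tour, best = cand, length
        return tour, best

    random.seed(1)
    pts = [(random.random(), random.random()) for _ in range(30)]
    tour, best = two_opt(list(range(30)), pts)
    print("tour length: %.3f" % best)
    ```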

  17. A criterion autoscheduler for long range planning

    NASA Technical Reports Server (NTRS)

    Sponsler, Jeffrey L.

    1994-01-01

    A constraint-based scheduling system called SPIKE is used to create long-term schedules for the Hubble Space Telescope. A meta-level scheduler called the Criterion Autoscheduler for Long range planning (CASL) was created to guide SPIKE's schedule generation according to the agenda of the planning scientists. It is proposed that sufficient flexibility exists in a schedule to allow high-level planning heuristics to be applied without adversely affecting crucial constraints such as spacecraft efficiency. This hypothesis is supported by the test data described here.

  18. EDNA: Expert fault digraph analysis using CLIPS

    NASA Technical Reports Server (NTRS)

    Dixit, Vishweshwar V.

    1990-01-01

    Traditionally, fault models are represented by trees. Recently, digraph models have been proposed (Sack). Digraph models closely imitate the real system dependencies and hence are easy to develop, validate, and maintain. However, they can also contain directed cycles, and analysis algorithms are hard to find; available algorithms tend to be complicated and slow. On the other hand, tree analysis (VGRH, Tayl) is well understood and rooted in a vast research effort and analytical techniques, and tree analysis algorithms are sophisticated and orders of magnitude faster. Transformation of a cyclic digraph into trees (CLP, LP) is a viable approach to blend the advantages of both representations. Neither the digraphs nor the trees provide the ability to handle heuristic knowledge, so an expert system to capture the engineering knowledge is essential. We propose such an approach here, namely expert network analysis, which combines the digraph representation with tree algorithms. The models are augmented by probabilistic and heuristic knowledge. CLIPS, an expert system shell from NASA-JSC, will be used to develop a tool. The technique provides the ability to handle probabilities and heuristic knowledge, and mixed analysis, with only some nodes carrying probabilities, is possible. The tool provides a graphics interface for input, query, and update. With this combined approach, it is expected to be a valuable tool in the design process as well as in the capture of final design knowledge.

  19. The identification of complete domains within protein sequences using accurate E-values for semi-global alignment

    PubMed Central

    Kann, Maricel G.; Sheetlin, Sergey L.; Park, Yonil; Bryant, Stephen H.; Spouge, John L.

    2007-01-01

    The sequencing of complete genomes has created a pressing need for automated annotation of gene function. Because domains are the basic units of protein function and evolution, a gene can be annotated from a domain database by aligning domains to the corresponding protein sequence. Ideally, complete domains are aligned to protein subsequences, in a ‘semi-global alignment’. Local alignment, which aligns pieces of domains to subsequences, is common in high-throughput annotation applications, however. It is a mature technique, with the heuristics and accurate E-values required for screening large databases and evaluating the screening results. Hidden Markov models (HMMs) provide an alternative theoretical framework for semi-global alignment, but their use is limited because they lack heuristic acceleration and accurate E-values. Our new tool, GLOBAL, overcomes some limitations of previous semi-global HMMs: it has accurate E-values and the possibility of the heuristic acceleration required for high-throughput applications. Moreover, according to a standard of truth based on protein structure, two semi-global HMM alignment tools (GLOBAL and HMMer) had comparable performance in identifying complete domains, but distinctly outperformed two tools based on local alignment. When searching for complete protein domains, therefore, GLOBAL avoids disadvantages commonly associated with HMMs, yet maintains their superior retrieval performance. PMID:17596268

  20. Automated planning of tangential breast intensity-modulated radiotherapy using heuristic optimization.

    PubMed

    Purdie, Thomas G; Dinniwell, Robert E; Letourneau, Daniel; Hill, Christine; Sharpe, Michael B

    2011-10-01

    We present an automated technique for two-field tangential breast intensity-modulated radiotherapy (IMRT) treatment planning. A total of 158 planned patients with Stage 0, I, and II breast cancer treated using whole-breast IMRT were retrospectively replanned using automated treatment planning tools. The tools developed are integrated into the existing clinical treatment planning system (Pinnacle(3)) and are designed to perform the manual volume delineation, beam placement, and IMRT treatment planning steps carried out by the treatment planning radiation therapist. The automated algorithm, using only the radio-opaque markers placed at CT simulation as inputs, optimizes the tangential beam parameters to geometrically minimize the amount of lung and heart treated while covering the whole-breast volume. The IMRT parameters are optimized according to the automatically delineated whole-breast volume. The mean time to generate a complete treatment plan was 6 min, 50 s ± 1 min 12 s. For the automated plans, 157 of 158 plans (99%) were deemed clinically acceptable, and 138 of 158 plans (87%) were deemed clinically improved or equal to the corresponding clinical plan when reviewed in a randomized, double-blinded study by one experienced breast radiation oncologist. In addition, the automated plans were overall dosimetrically equivalent to the clinical plans when scored for target coverage and lung and heart doses. We have developed robust and efficient automated tools for fully inverse-planned tangential breast IMRT planning that can be readily integrated into clinical practice. The tools produce clinically acceptable plans using only the common anatomic landmarks from the CT simulation process as an input. We anticipate the tools will improve patient access to high-quality IMRT treatment by simplifying the planning process and will reduce the effort and cost of incorporating more advanced planning into clinical practice. Crown Copyright © 2011. Published by Elsevier Inc. All rights reserved.

  1. Double Sampling with Multiple Imputation to Answer Large Sample Meta-Research Questions: Introduction and Illustration by Evaluating Adherence to Two Simple CONSORT Guidelines

    PubMed Central

    Capers, Patrice L.; Brown, Andrew W.; Dawson, John A.; Allison, David B.

    2015-01-01

    Background: Meta-research can involve manual retrieval and evaluation of research, which is resource intensive. Creation of high throughput methods (e.g., search heuristics, crowdsourcing) has improved the feasibility of large meta-research questions, but possibly at the cost of accuracy. Objective: To evaluate the use of double sampling combined with multiple imputation (DS + MI) to address meta-research questions, using as an example adherence of PubMed entries to two simple Consolidated Standards of Reporting Trials (CONSORT) guidelines for titles and abstracts. Methods: For the DS large sample, we retrieved all PubMed entries satisfying the filters: RCT, human, abstract available, and English language (n = 322,107). For the DS subsample, we randomly sampled 500 entries from the large sample. The large sample was evaluated with a lower-rigor, higher-throughput (RLOTHI) method using search heuristics, while the subsample was evaluated using a higher-rigor, lower-throughput (RHITLO) human rating method. Multiple imputation of the missing-completely-at-random RHITLO data for the large sample was informed by: RHITLO data from the subsample; RLOTHI data from the large sample; whether a study was an RCT; and country and year of publication. Results: The RHITLO and RLOTHI methods in the subsample largely agreed (phi coefficients: title = 1.00, abstract = 0.92). Compliance with abstract and title criteria has increased over time, with non-US countries improving more rapidly. DS + MI logistic regression estimates were more precise than subsample estimates (e.g., 95% CI for change in title and abstract compliance by year: subsample RHITLO 1.050–1.174 vs. DS + MI 1.082–1.151). As evidence of improved accuracy, DS + MI coefficient estimates were closer to RHITLO than the large sample RLOTHI. Conclusion: Our results support our hypothesis that DS + MI would result in improved precision and accuracy. This method is flexible and may provide a practical way to examine large corpora of literature. PMID:25988135

  2. Optimizing the Placement of Burnable Poisons in PWRs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yilmaz, Serkan; Ivanov, Kostadin; Levine, Samuel

    2005-07-15

    The principal focus of this work is on developing a practical tool for designing the minimum amount of burnable poisons (BPs) for a pressurized water reactor using a typical Three Mile Island Unit 1 2-yr cycle as the reference design. The results of this study are to be applied to future reload designs. A new method, the Modified Power Shape Forced Diffusion (MPSFD) method, is presented that initially computes the BP cross section to force the power distribution into a desired shape. The method employs a simple formula that expresses the BP cross section as a function of the difference between the calculated radial power distributions (RPDs) and the limit set for the maximum RPD. This method places BPs into all fresh fuel assemblies (FAs) having an RPD greater than the limit. The MPSFD method then reduces the BP content by reducing the BPs in fresh FAs with the lowest RPDs. Finally, the minimum BP content is attained via a heuristic fine-tuning procedure. This new BP design program has been automated by incorporating the new MPSFD method in conjunction with the heuristic fine-tuning program. The program has automatically produced excellent results for the reference core, and has the potential to reduce fuel costs and save manpower.

  3. VHP - An environment for the remote visualization of heuristic processes

    NASA Technical Reports Server (NTRS)

    Crawford, Stuart L.; Leiner, Barry M.

    1991-01-01

    A software system called VHP is introduced that permits the visualization of heuristic algorithms on both resident and remote hardware platforms. VHP is based on the DCF tool for interprocess communication and is applicable to remote algorithms running on different types of hardware and written in languages other than that of VHP. The VHP system is of particular interest for applications in which the visualization of remote processes is required, such as robotics for telescience.

  4. Influence maximization in complex networks through optimal percolation

    NASA Astrophysics Data System (ADS)

    Morone, Flaviano; Makse, Hernán A.

    2015-08-01

    The whole frame of interconnections in complex networks hinges on a specific set of structural nodes, much smaller than the total size, which, if activated, would cause the spread of information to the whole network, or, if immunized, would prevent the diffusion of a large scale epidemic. Localizing this optimal, that is, minimal, set of structural nodes, called influencers, is one of the most important problems in network science. Despite the vast use of heuristic strategies to identify influential spreaders, the problem remains unsolved. Here we map the problem onto optimal percolation in random networks to identify the minimal set of influencers, which arises by minimizing the energy of a many-body system, where the form of the interactions is fixed by the non-backtracking matrix of the network. Big data analyses reveal that the set of optimal influencers is much smaller than the one predicted by previous heuristic centralities. Remarkably, a large number of previously neglected weakly connected nodes emerges among the optimal influencers. These are topologically tagged as low-degree nodes surrounded by hierarchical coronas of hubs, and are uncovered only through the optimal collective interplay of all the influencers in the network. The present theoretical framework may hold a larger degree of universality, being applicable to other hard optimization problems exhibiting a continuous transition from a known phase.
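
    The adaptive heuristic proposed in this work, Collective Influence, scores node i as CI_l(i) = (k_i - 1) * sum of (k_j - 1) over the nodes j on the frontier of the ball of radius l around i, and repeatedly removes the top scorer. Below is a minimal, unoptimized sketch of that scoring and greedy loop using networkx on an arbitrary random graph; the authors' scalable implementation updates scores with a max-heap rather than the full rescan used here.

    ```python
    import networkx as nx

    def collective_influence(G, i, ell=2):
        # CI_l(i) = (k_i - 1) * sum_{j at distance exactly l from i} (k_j - 1)
        dist = nx.single_source_shortest_path_length(G, i, cutoff=ell)
        frontier = [j for j, d in dist.items() if d == ell]
        return (G.degree(i) - 1) * sum(G.degree(j) - 1 for j in frontier)

    def top_influencers(G, n_remove, ell=2):
        # Adaptive greedy: remove the highest-CI node, then rescore, since
        # each removal changes the scores of nearby nodes.
        G = G.copy()
        chosen = []
        for _ in range(n_remove):
            best = max(G.nodes, key=lambda i: collective_influence(G, i, ell))
            chosen.append(best)
            G.remove_node(best)
        return chosen

    G = nx.erdos_renyi_graph(500, 0.008, seed=7)
    print("influencers:", top_influencers(G, 10))
    ```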

  6. Discriminative motif optimization based on perceptron training

    PubMed Central

    Patel, Ronak Y.; Stormo, Gary D.

    2014-01-01

    Motivation: Generating accurate transcription factor (TF) binding site motifs from data generated by next-generation sequencing, especially ChIP-seq, is challenging. The challenge arises because a typical experiment reports a large number of sequences bound by a TF, each of which is relatively long. Most traditional motif finders are slow in handling such enormous amounts of data. To overcome this limitation, tools have been developed that trade accuracy for speed by using heuristic discrete search strategies or limited optimization of identified seed motifs. Such strategies may not fully use the information in the input sequences; nevertheless, the motifs they produce often form good seeds and can be further improved with appropriate scoring functions and rapid optimization. Results: We report a tool named discriminative motif optimizer (DiMO). DiMO takes a seed motif along with a positive and a negative database and improves the motif based on a discriminative strategy. We use the area under the receiver-operating characteristic curve (AUC) as a measure of the discriminating power of motifs, and a strategy based on perceptron training that maximizes AUC rapidly in a discriminative manner. Using DiMO, on a large test set of 87 TFs from human, Drosophila and yeast, we show that it is possible to significantly improve motifs identified by nine motif finders. The motifs are generated/optimized using training sets and evaluated on test sets. The AUC is improved for almost 90% of the TFs on test sets, and the magnitude of the increase is up to 39%. Availability and implementation: DiMO is available at http://stormo.wustl.edu/DiMO. Contact: rpatel@genetics.wustl.edu, ronakypatel@gmail.com PMID:24369152

  7. Nonlinear Multidimensional Assignment Problems Efficient Conic Optimization Methods and Applications

    DTIC Science & Technology

    2015-06-24

    Arizona State University, School of Mathematical & Statistical Sciences. The major goals of this project were completed: the exact solution of previously unsolved challenging combinatorial optimization… The Directional Sensor Problem, a combinatorial optimization problem, was solved in two ways: first, heuristically in an engineering fashion, and second, exactly.

  8. Fruit fly optimization based least square support vector regression for blind image restoration

    NASA Astrophysics Data System (ADS)

    Zhang, Jiao; Wang, Rui; Li, Junshan; Yang, Yawei

    2014-11-01

    The goal of image restoration is to reconstruct the original scene from a degraded observation. It is a critical and challenging task in image processing. Classical restorations require explicit knowledge of the point spread function (PSF) and a description of the noise as priors. However, this is not practical for much real image processing, so the recovery needs to be a blind image restoration scenario. Since blind deconvolution is an ill-posed problem, many blind restoration methods need to make additional assumptions to construct restrictions. Due to differences in PSF and noise energy, blurred images can be quite different, and it is difficult to achieve a good balance between proper assumptions and high restoration quality in blind deconvolution. Recently, machine learning techniques have been applied to blind image restoration. Least squares support vector regression (LSSVR) has been shown to offer strong potential in estimation and forecasting problems. Therefore, this paper proposes an LSSVR-based image restoration method. However, selecting the optimal parameters for a support vector machine is essential to the training result. As a novel meta-heuristic algorithm, the fruit fly optimization algorithm (FOA) can be used to handle optimization problems and has the advantage of fast convergence to the global optimal solution. In the proposed method, the training samples are created from a neighborhood in the degraded image to the central pixel in the original image. The mapping between the degraded image and the original image is learned by training LSSVR, and the two parameters of LSSVR are optimized through FOA. The fitness function of FOA is calculated by the restoration error function. With the acquired mapping, the degraded image can be recovered. Experimental results show the proposed method can obtain a satisfactory restoration effect. Compared with BP neural network regression, the SVR method and the Lucy-Richardson algorithm, it speeds up the restoration and performs better. Both objective and subjective restoration performances are studied in the comparison experiments.
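
    The paper's fitness function (the LSSVR restoration error) and parameter ranges are not given in the abstract, so the sketch below only illustrates the fruit fly optimization loop itself for two positive parameters; the paraboloid fitness is a hypothetical stand-in for the restoration error over the LSSVR regularization and kernel-width parameters.

    ```python
    import math
    import random

    def foa_minimize(fitness, iters=200, n_flies=25):
        # Fruit fly optimization: per parameter, a fly at (x, y) encodes the
        # candidate value 1 / sqrt(x^2 + y^2) (its "smell concentration");
        # the swarm relocates to the best fly found so far ("vision").
        swarm = [[random.uniform(0, 10), random.uniform(0, 10)] for _ in range(2)]
        best_val, best_params = float("inf"), None
        for _ in range(iters):
            flies = []
            for _ in range(n_flies):
                pos, params = [], []
                for sx, sy in swarm:
                    x = sx + random.uniform(-1, 1)   # random flight direction
                    y = sy + random.uniform(-1, 1)
                    pos.append((x, y))
                    params.append(1.0 / math.hypot(x, y))
                flies.append((fitness(params), pos, params))
            val, pos, params = min(flies, key=lambda t: t[0])
            if val < best_val:
                best_val, best_params = val, params
                swarm = [list(p) for p in pos]
        return best_params, best_val

    # Stand-in fitness; the paper would plug in the LSSVR restoration error.
    fit = lambda p: (p[0] - 0.5) ** 2 + (p[1] - 0.05) ** 2
    print(foa_minimize(fit))
    ```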

  9. Open shop scheduling problem to minimize total weighted completion time

    NASA Astrophysics Data System (ADS)

    Bai, Danyu; Zhang, Zhihai; Zhang, Qiang; Tang, Mengqian

    2017-01-01

    A given number of jobs in an open shop scheduling environment must each be processed for given amounts of time on each of a given set of machines in an arbitrary sequence. This study aims to achieve a schedule that minimizes total weighted completion time. Owing to the strong NP-hardness of the problem, the weighted shortest processing time block (WSPTB) heuristic is presented to obtain approximate solutions for large-scale problems. Performance analysis proves the asymptotic optimality of the WSPTB heuristic in the sense of probability limits. The largest weight block rule is provided to seek optimal schedules in polynomial time for a special case. A hybrid discrete differential evolution algorithm is designed to obtain high-quality solutions for moderate-scale problems. Simulation experiments demonstrate the effectiveness of the proposed algorithms.
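
    The block heuristic itself is specified in the paper rather than the abstract, but it builds on the classical weighted-shortest-processing-time ratio rule; as background, here is a sketch of that rule on a single machine, where sequencing by decreasing w/p (Smith's rule) is provably optimal for total weighted completion time.

    ```python
    def wspt_order(jobs):
        # Smith's rule: sequence jobs by decreasing weight / processing time.
        return sorted(jobs, key=lambda j: j["w"] / j["p"], reverse=True)

    def total_weighted_completion(seq):
        t = total = 0
        for j in seq:
            t += j["p"]              # completion time of this job
            total += j["w"] * t
        return total

    jobs = [{"p": 3, "w": 1}, {"p": 1, "w": 4}, {"p": 2, "w": 2}]
    seq = wspt_order(jobs)
    print([(j["p"], j["w"]) for j in seq], total_weighted_completion(seq))
    # -> [(1, 4), (2, 2), (3, 1)] 16
    ```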

  10. The development of a culture of problem solving with secondary students through heuristic strategies

    NASA Astrophysics Data System (ADS)

    Eisenmann, Petr; Novotná, Jarmila; Přibyl, Jiří; Břehovský, Jiří

    2015-12-01

    The article reports the results of a longitudinal research study conducted in three mathematics classes in Czech schools with 62 pupils aged 12-18 years. The pupils were exposed to the use of selected heuristic strategies in mathematical problem solving for a period of 16 months, through solving problems whose solution was most efficient when heuristic strategies were used. The authors conducted a two-dimensional classification of the use of heuristic strategies based on the work of Pólya (2004) and Schoenfeld (1985). We developed a tool that allows for the description of a pupil's ability to solve problems. Named the Culture of Problem Solving (CPS), this tool consists of four components: intelligence, text comprehension, creativity and the ability to use existing knowledge. The pupils' success rate in problem solving and the changes in some of the CPS factors pre- and post-experiment were monitored. The pupils appeared to improve considerably in the creativity component. In addition, the results indicate a positive change in the students' attitude to problem solving. As far as the teachers participating in the experiment are concerned, a significant change was a shift in their teaching style toward a more constructivist, inquiry-based approach, as well as in their willingness to accept a student's non-standard approach to solving a problem. Another important outcome of the research was the identification of the heuristic strategies that can be taught via long-term guided solutions of suitable problems and those that cannot. Those that can be taught include systematic experimentation, guess-check-revise and introduction of an auxiliary element. Those that cannot be taught (or can only be taught with difficulty) include the strategies of specification and generalization and analogy.

  11. Optimization Techniques for Analysis of Biological and Social Networks

    DTIC Science & Technology

    2012-03-28

    …analyzing a new metaheuristic technique, variable objective search; experimentation and application: implement the proposed algorithms, test and fine…; alternative mathematical programming formulations, their theoretical analysis, the development of exact algorithms, and heuristics. Originally, clusters… systematic fashion under a unifying theoretical and algorithmic framework. Keywords: Optimization, Complex Networks, Social Network Analysis, Computational…

  12. A chance constraint estimation approach to optimizing resource management under uncertainty

    Treesearch

    Michael Bevers

    2007-01-01

    Chance-constrained optimization is an important method for managing risk arising from random variations in natural resource systems, but the probabilistic formulations often pose mathematical programming problems that cannot be solved with exact methods. A heuristic estimation method for these problems is presented that combines a formulation for order statistic...

  13. Heuristic Implementation of Dynamic Programming for Matrix Permutation Problems in Combinatorial Data Analysis

    ERIC Educational Resources Information Center

    Brusco, Michael J.; Kohn, Hans-Friedrich; Stahl, Stephanie

    2008-01-01

    Dynamic programming methods for matrix permutation problems in combinatorial data analysis can produce globally-optimal solutions for matrices up to size 30x30, but are computationally infeasible for larger matrices because of enormous computer memory requirements. Branch-and-bound methods also guarantee globally-optimal solutions, but computation…

  14. Relating interesting quantitative time series patterns with text events and text features

    NASA Astrophysics Data System (ADS)

    Wanner, Franz; Schreck, Tobias; Jentner, Wolfgang; Sharalieva, Lyubka; Keim, Daniel A.

    2013-12-01

    In many application areas, the key to successful data analysis is the integrated analysis of heterogeneous data. One example is the financial domain, where time-dependent and highly frequent quantitative data (e.g., trading volume and price information) and textual data (e.g., economic and political news reports) need to be considered jointly. Data analysis tools need to support an integrated analysis, which allows studying the relationships between textual news documents and quantitative properties of the stock market price series. In this paper, we describe a workflow and tool that allows a flexible formation of hypotheses about text features and their combinations, which reflect quantitative phenomena observed in stock data. To support such an analysis, we combine the analysis steps for frequent quantitative and text-oriented data using an existing Apriori method. First, based on heuristics, we extract interesting intervals and patterns in large time series data. The visual analysis supports the analyst in exploring parameter combinations and their results. The identified time series patterns are then input for the second analysis step, in which all identified intervals of interest are analyzed for frequent patterns co-occurring with financial news. An Apriori method supports the discovery of such sequential temporal patterns. Then, various text features, such as the degree of sentence nesting, noun phrase complexity, and vocabulary richness, are extracted from the news to obtain meta patterns. Meta patterns are defined by a specific combination of text features which significantly differ from the text features of the remaining news data. Our approach combines a portfolio of visualization and analysis techniques, including time-, cluster- and sequence visualization and analysis functionality. We provide two case studies showing the effectiveness of our combined quantitative and textual analysis workflow. The workflow can also be generalized to other application domains such as data analysis of smart grids, cyber-physical systems or the security of critical infrastructure, where the data consists of a combination of quantitative and textual time series data.

  15. Optimizing Negotiation Conflict in the Cloud Service Negotiation Framework Using Probabilistic Decision Making Model

    PubMed Central

    Rajavel, Rajkumar; Thangarathinam, Mala

    2015-01-01

    Optimization of negotiation conflict in the cloud service negotiation framework is identified as one of the major challenging issues. This negotiation conflict occurs during the bilateral negotiation process between the participants due to misperception, aggressive behavior, and uncertain preferences and goals concerning their opponents. Existing research work focuses on the prerequest context of negotiation conflict optimization by grouping similar negotiation pairs using distance, binary, context-dependent, and fuzzy similarity approaches. To some extent, these approaches can maximize the success rate and minimize the communication overhead among the participants. To further optimize the success rate and communication overhead, the proposed research work introduces a novel probabilistic decision making model for optimizing the negotiation conflict in the long-term negotiation context. This decision model formulates the problem of managing the different types of negotiation conflict that occur during the negotiation process as a multistage Markov decision problem. At each stage of the negotiation process, the proposed decision model generates a heuristic decision based on past negotiation state information without causing any break-off among the participants. In addition, this heuristic decision, based on a stochastic decision tree scenario, can maximize the revenue among the participants in the cloud service negotiation framework. PMID:26543899

  17. Doubly Bayesian Analysis of Confidence in Perceptual Decision-Making.

    PubMed

    Aitchison, Laurence; Bang, Dan; Bahrami, Bahador; Latham, Peter E

    2015-10-01

    Humans stand out from other animals in that they are able to explicitly report on the reliability of their internal operations. This ability, which is known as metacognition, is typically studied by asking people to report their confidence in the correctness of some decision. However, the computations underlying confidence reports remain unclear. In this paper, we present a fully Bayesian method for directly comparing models of confidence. Using a visual two-interval forced-choice task, we tested whether confidence reports reflect heuristic computations (e.g. the magnitude of sensory data) or Bayes optimal ones (i.e. how likely a decision is to be correct given the sensory data). In a standard design in which subjects were first asked to make a decision, and only then gave their confidence, subjects were mostly Bayes optimal. In contrast, in a less-commonly used design in which subjects indicated their confidence and decision simultaneously, they were roughly equally likely to use the Bayes optimal strategy or to use a heuristic but suboptimal strategy. Our results suggest that, while people's confidence reports can reflect Bayes optimal computations, even a small unusual twist or additional element of complexity can prevent optimality.

  18. A Lifetime Maximization Relay Selection Scheme in Wireless Body Area Networks.

    PubMed

    Zhang, Yu; Zhang, Bing; Zhang, Shi

    2017-06-02

    Network lifetime is one of the most important metrics in Wireless Body Area Networks (WBANs). In this paper, a relay selection scheme is proposed under the topology constraints specified in the IEEE 802.15.6 standard to maximize the lifetime of WBANs, by formulating and solving an optimization problem in which the relay selection of each node acts as the optimization variable. Considering the diversity of the sensor nodes in WBANs, the optimization problem takes not only the energy consumption rate but also the energy difference among sensor nodes into account to improve network lifetime performance. Since the problem is Non-deterministic Polynomial-hard (NP-hard) and intractable, a heuristic solution is then designed to address the optimization rapidly. The simulation results indicate that the proposed relay selection scheme outperforms existing algorithms in network lifetime and that the heuristic solution has low time complexity with only a negligible performance gap from the optimal value. Furthermore, we also conduct simulations based on a general WBAN model to comprehensively illustrate the advantages of the proposed algorithm. At the end of the evaluation, we validate the feasibility of our proposed scheme via an implementation discussion.

  19. Simulation-based planning for theater air warfare

    NASA Astrophysics Data System (ADS)

    Popken, Douglas A.; Cox, Louis A., Jr.

    2004-08-01

    Planning for Theatre Air Warfare can be represented as a hierarchy of decisions. At the top level, surviving airframes must be assigned to roles (e.g., Air Defense, Counter Air, Close Air Support, and AAF Suppression) in each time period in response to changing enemy air defense capabilities, remaining targets, and roles of opposing aircraft. At the middle level, aircraft are allocated to specific targets to support their assigned roles. At the lowest level, routing and engagement decisions are made for individual missions. The decisions at each level form a set of time-sequenced Courses of Action taken by opposing forces. This paper introduces a set of simulation-based optimization heuristics operating within this planning hierarchy to optimize allocations of aircraft. The algorithms estimate distributions for stochastic outcomes of the pairs of Red/Blue decisions. Rather than using traditional stochastic dynamic programming to determine optimal strategies, we use an innovative combination of heuristics, simulation-optimization, and mathematical programming. Blue decisions are guided by a stochastic hill-climbing search algorithm while Red decisions are found by optimizing over a continuous representation of the decision space. Stochastic outcomes are then provided by fast, Lanchester-type attrition simulations. This paper summarizes preliminary results from top and middle level models.

  20. An efficient approach to improve the usability of e-learning resources: the role of heuristic evaluation.

    PubMed

    Davids, Mogamat Razeen; Chikte, Usuf M E; Halperin, Mitchell L

    2013-09-01

    Optimizing the usability of e-learning materials is necessary to maximize their potential educational impact, but this is often neglected when time and other resources are limited, leading to the release of materials that cannot deliver the desired learning outcomes. As clinician-teachers in a resource-constrained environment, we investigated whether heuristic evaluation of our multimedia e-learning resource by a panel of experts would be an effective and efficient alternative to testing with end users. We engaged six inspectors, whose expertise included usability, e-learning, instructional design, medical informatics, and the content area of nephrology. They applied a set of commonly used heuristics to identify usability problems, assigning severity scores to each problem. The identification of serious problems was compared with problems previously found by user testing. The panel completed their evaluations within 1 wk and identified a total of 22 distinct usability problems, 11 of which were considered serious. The problems violated the heuristics of visibility of system status, user control and freedom, match with the real world, intuitive visual layout, consistency and conformity to standards, aesthetic and minimalist design, error prevention and tolerance, and help and documentation. Compared with user testing, heuristic evaluation found most, but not all, of the serious problems. Combining heuristic evaluation and user testing, with each involving a small number of participants, may be an effective and efficient way of improving the usability of e-learning materials. Heuristic evaluation should ideally be used first to identify the most obvious problems and, once these are fixed, should be followed by testing with typical end users.

  1. Dissociable meta-analytic brain networks contribute to coordinated emotional processing.

    PubMed

    Riedel, Michael C; Yanes, Julio A; Ray, Kimberly L; Eickhoff, Simon B; Fox, Peter T; Sutherland, Matthew T; Laird, Angela R

    2018-06-01

    Meta-analytic techniques for mining the neuroimaging literature continue to exert an impact on our conceptualization of functional brain networks contributing to human emotion and cognition. Traditional theories regarding the neurobiological substrates contributing to affective processing are shifting from regional- towards more network-based heuristic frameworks. To elucidate differential brain network involvement linked to distinct aspects of emotion processing, we applied an emergent meta-analytic clustering approach to the extensive body of affective neuroimaging results archived in the BrainMap database. Specifically, we performed hierarchical clustering on the modeled activation maps from 1,747 experiments in the affective processing domain, resulting in five meta-analytic groupings of experiments demonstrating whole-brain recruitment. Behavioral inference analyses conducted for each of these groupings suggested dissociable networks supporting: (1) visual perception within primary and associative visual cortices, (2) auditory perception within primary auditory cortices, (3) attention to emotionally salient information within insular, anterior cingulate, and subcortical regions, (4) appraisal and prediction of emotional events within medial prefrontal and posterior cingulate cortices, and (5) induction of emotional responses within amygdala and fusiform gyri. These meta-analytic outcomes are consistent with a contemporary psychological model of affective processing in which emotionally salient information from perceived stimuli are integrated with previous experiences to engender a subjective affective response. This study highlights the utility of using emergent meta-analytic methods to inform and extend psychological theories and suggests that emotions are manifest as the eventual consequence of interactions between large-scale brain networks. © 2018 Wiley Periodicals, Inc.

  2. Heuristic use of perceptual evidence leads to dissociation between performance and metacognitive sensitivity.

    PubMed

    Maniscalco, Brian; Peters, Megan A K; Lau, Hakwan

    2016-04-01

    Zylberberg, Barttfeld, and Sigman (Frontiers in Integrative Neuroscience, 6:79, 2012) found that confidence decisions, but not perceptual decisions, are insensitive to evidence against a selected perceptual choice. We present a signal detection theoretic model to formalize this insight, which gave rise to a counter-intuitive empirical prediction: that depending on the observer's perceptual choice, increasing task performance can be associated with decreasing metacognitive sensitivity (i.e., the trial-by-trial correspondence between confidence and accuracy). The model also provides an explanation as to why metacognitive sensitivity tends to be less than optimal in actual subjects. These predictions were confirmed robustly in a psychophysics experiment. In a second experiment we found that, in at least some subjects, the effects were replicated even under performance feedback designed to encourage optimal behavior. However, some subjects did show improvement under feedback, suggesting the tendency to ignore evidence against a selected perceptual choice may be a heuristic adopted by the perceptual decision-making system, rather than reflecting inherent biological limitations. We present a Bayesian modeling framework that explains why this heuristic strategy may be advantageous in real-world contexts.

  3. Multi-objective optimization in spatial planning: Improving the effectiveness of multi-objective evolutionary algorithms (non-dominated sorting genetic algorithm II)

    NASA Astrophysics Data System (ADS)

    Karakostas, Spiros

    2015-05-01

    The multi-objective nature of most spatial planning initiatives and the numerous constraints that are introduced in the planning process by decision makers, stakeholders, etc., synthesize a complex spatial planning context in which the concept of solid and meaningful optimization is a unique challenge. This article investigates new approaches to enhance the effectiveness of multi-objective evolutionary algorithms (MOEAs) via the adoption of a well-known metaheuristic: the non-dominated sorting genetic algorithm II (NSGA-II). In particular, the contribution of a sophisticated crossover operator coupled with an enhanced initialization heuristic is evaluated against a series of metrics measuring the effectiveness of MOEAs. Encouraging results emerge for both the convergence rate of the evolutionary optimization process and the occupation of valuable regions of the objective space by non-dominated solutions, facilitating the work of spatial planners and decision makers. Based on the promising behaviour of both heuristics, topics for further research are proposed to improve their effectiveness.
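
    The enhanced crossover and initialization operators are the paper's contribution and are not sketched here; for context, the ranking step at the core of NSGA-II is plain non-dominated sorting, shown below in a naive form (assuming minimization of all objectives).

    ```python
    def dominates(a, b):
        # a dominates b if it is no worse in every objective and strictly
        # better in at least one (minimization).
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

    def non_dominated_sort(points):
        # Peel off successive Pareto fronts, as in NSGA-II's ranking step.
        remaining = list(range(len(points)))
        fronts = []
        while remaining:
            front = [i for i in remaining
                     if not any(dominates(points[j], points[i])
                                for j in remaining if j != i)]
            fronts.append(front)
            remaining = [i for i in remaining if i not in front]
        return fronts

    pts = [(1, 5), (2, 3), (4, 1), (3, 4), (5, 5)]
    print(non_dominated_sort(pts))   # [[0, 1, 2], [3, 4]]
    ```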

  4. Search asymmetries: parallel processing of uncertain sensory information.

    PubMed

    Vincent, Benjamin T

    2011-08-01

    What is the mechanism underlying search phenomena such as search asymmetry? Two-stage models such as Feature Integration Theory and Guided Search propose parallel pre-attentive processing followed by serial post-attentive processing. They claim search asymmetry effects are indicative of finding pairs of features, one processed in parallel, the other in serial. An alternative proposal is that a 1-stage parallel process is responsible, and search asymmetries occur when one stimulus has greater internal uncertainty associated with it than another. While the latter account is simpler, only a few studies have set out to empirically test its quantitative predictions, and many researchers still subscribe to the 2-stage account. This paper examines three separate parallel models (Bayesian optimal observer, max rule, and a heuristic decision rule). All three parallel models can account for search asymmetry effects and I conclude that either people can optimally utilise the uncertain sensory data available to them, or are able to select heuristic decision rules which approximate optimal performance. Copyright © 2011 Elsevier Ltd. All rights reserved.
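
    As a hedged illustration of one of the models examined (the max rule, not the paper's exact simulations), the sketch below decides "target present" when the maximum of noisy item responses exceeds a criterion; making one stimulus type internally noisier than the other produces an accuracy asymmetry between the two search arrangements. All numbers are invented.

    ```python
    import random

    def max_rule_trial(n_items, present, mu_t, sd_t, sd_d, crit):
        # Each display item evokes a noisy internal response (distractor
        # mean 0); respond "present" if the maximum exceeds the criterion.
        resp = [random.gauss(0.0, sd_d) for _ in range(n_items - present)]
        if present:
            resp.append(random.gauss(mu_t, sd_t))
        return max(resp) > crit

    def accuracy(mu_t, sd_t, sd_d, n_items=8, trials=20000, crit=1.5):
        correct = 0
        for t in range(trials):
            present = t % 2 == 0
            said = max_rule_trial(n_items, present, mu_t, sd_t, sd_d, crit)
            correct += said == present
        return correct / trials

    random.seed(0)
    # Same mean separation; only which stimulus carries the extra internal
    # uncertainty is swapped between the two arrangements.
    print("noisy target     :", accuracy(mu_t=2.0, sd_t=1.5, sd_d=0.5))
    print("noisy distractors:", accuracy(mu_t=2.0, sd_t=0.5, sd_d=1.5))
    ```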

  5. Approximation, abstraction and decomposition in search and optimization

    NASA Technical Reports Server (NTRS)

    Ellman, Thomas

    1992-01-01

    In this paper, I discuss four different areas of my research. One portion of my research has focused on automatic synthesis of search control heuristics for constraint satisfaction problems (CSPs). I have developed techniques for automatically synthesizing two types of heuristics for CSPs; filtering functions, for example, are used to remove portions of a search space from consideration. A second portion of my research has focused on automatic synthesis of hierarchic algorithms for solving CSPs: I have developed a technique for constructing hierarchic problem solvers based on numeric interval algebra. A third portion is focused on automatic decomposition of design optimization problems; we are using the design of racing yacht hulls as a testbed domain for this research, since decomposition is especially important in the design of complex physical shapes such as yacht hulls. The final portion of my research is focused on intelligent model selection in design optimization; the model selection problem results from the difficulty of using exact models to analyze the performance of candidate designs.

  6. Tools of the Future: How Decision Tree Analysis Will Impact Mission Planning

    NASA Technical Reports Server (NTRS)

    Otterstatter, Matthew R.

    2005-01-01

    The universe is infinitely complex; however, the human mind has a finite capacity. The multitude of possible variables, metrics, and procedures in mission planning are far too many to address exhaustively. This is unfortunate because, in general, considering more possibilities leads to more accurate and more powerful results. To compensate, we can get more insightful results by employing our greatest tool, the computer. The power of the computer will be utilized through a technology that considers every possibility, decision tree analysis. Although decision trees have been used in many other fields, this is innovative for space mission planning. Because this is a new strategy, no existing software is able to completely accommodate all of the requirements. This was determined through extensive research and testing of current technologies. It was necessary to create original software, for which a short-term model was finished this summer. The model was built into Microsoft Excel to take advantage of the familiar graphical interface for user input, computation, and viewing output. Macros were written to automate the process of tree construction, optimization, and presentation. The results are useful and promising. If this tool is successfully implemented in mission planning, our reliance on old-fashioned heuristics, an error-prone shortcut for handling complexity, will be reduced. The computer algorithms involved in decision trees will revolutionize mission planning. The planning will be faster and smarter, leading to optimized missions with the potential for more valuable data.
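
    The entry does not give the model's internals, but the core computation in any decision tree analysis is the standard rollback: chance nodes take probability-weighted averages and decision nodes take the best branch. A minimal sketch with an invented two-step planning tree:

    ```python
    def rollback(node):
        # Expected value at chance nodes, best alternative at decision nodes.
        if node["kind"] == "leaf":
            return node["value"]
        if node["kind"] == "chance":
            return sum(p * rollback(child) for p, child in node["branches"])
        return max(rollback(child) for _, child in node["branches"])

    def best_choice(node):
        # Name the branch with the highest rolled-back value at a decision node.
        return max(node["branches"], key=lambda b: rollback(b[1]))[0]

    # Hypothetical plan: choose an observation mode, then face weather risk.
    tree = {"kind": "decision", "branches": [
        ("high-res", {"kind": "chance", "branches": [
            (0.6, {"kind": "leaf", "value": 100}),
            (0.4, {"kind": "leaf", "value": 10})]}),
        ("survey", {"kind": "leaf", "value": 55}),
    ]}
    print(rollback(tree), best_choice(tree))   # 64.0 high-res
    ```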

  7. An Evolutionary Optimization of the Refueling Simulation for a CANDU Reactor

    NASA Astrophysics Data System (ADS)

    Do, Q. B.; Choi, H.; Roh, G. H.

    2006-10-01

    This paper presents a multi-cycle and multi-objective optimization method for the refueling simulation of a 713 MWe Canada deuterium uranium (CANDU-6) reactor based on a genetic algorithm, an elitism strategy and a heuristic rule. The proposed algorithm searches for the optimal refueling patterns for a single cycle that maximize the average discharge burnup, minimize the maximum channel power and minimize the change in the zone controller unit water fills, while satisfying the most important safety-related neutronic parameters of the reactor core. The heuristic rule generates an initial population of individuals very close to a feasible solution, which reduces the computing time of the optimization process. The multi-cycle optimization is carried out based on the single-cycle refueling simulation. The proposed approach was verified by a refueling simulation of a natural uranium CANDU-6 reactor for an operation period of 6 months at an equilibrium state and compared with the experience-based automatic refueling simulation and the generalized perturbation theory. The comparison has shown that the simulation results are consistent with each other and that the proposed approach is a reasonable optimization method for the refueling simulation, controlling all the safety-related parameters of the reactor core during the simulation.
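
    The reactor physics is far outside the scope of a sketch, but the optimization skeleton the abstract describes (a genetic algorithm with elitism, seeded by a heuristic rule near feasibility) can be illustrated on a toy bit-string objective; everything below is a stand-in.

    ```python
    import random

    def ga(fitness, n_bits=40, pop_size=40, gens=100, elite=2,
           heuristic_seed=None, p_mut=0.02):
        # Generic GA with elitism; optionally seeded near a heuristic
        # solution, mirroring the paper's use of a heuristic rule to start
        # close to feasibility and cut optimization time.
        def rand_ind():
            return [random.randint(0, 1) for _ in range(n_bits)]
        def mutate(ind):
            return [b ^ (random.random() < p_mut) for b in ind]
        pop = [mutate(list(heuristic_seed)) if heuristic_seed else rand_ind()
               for _ in range(pop_size)]
        for _ in range(gens):
            pop.sort(key=fitness, reverse=True)
            nxt = pop[:elite]                    # elitism: keep the best as-is
            while len(nxt) < pop_size:
                a, b = random.sample(pop[:pop_size // 2], 2)  # truncation selection
                cut = random.randrange(1, n_bits)
                nxt.append(mutate(a[:cut] + b[cut:]))         # one-point crossover
            pop = nxt
        return max(pop, key=fitness)

    random.seed(3)
    onemax = lambda ind: sum(ind)          # toy stand-in objective
    near_optimal = [1] * 30 + [0] * 10     # "heuristic" starting solution
    print(onemax(ga(onemax, heuristic_seed=near_optimal)))
    ```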

  8. “Up Means Good”

    PubMed Central

    Tourangeau, Roger

    2013-01-01

    This paper presents results from six experiments that examine the effect of the position of an item on the screen on the evaluative ratings it receives. The experiments are based on the idea that respondents expect “good” things—those they view positively—to be higher up on the screen than “bad” things. The experiments use items on different topics (Congress and HMOs, a variety of foods, and six physician specialties) and different methods for varying their vertical position on the screen. A meta-analysis of all six experiments demonstrates a small but reliable effect of the item’s screen position on mean ratings of the item; the ratings are significantly more positive when the item appears in a higher position on the screen than when it appears farther down. These results are consistent with the hypothesis that respondents follow the “Up means good” heuristic, using the vertical position of the item as a cue in evaluating it. Respondents seem to rely on heuristics both in interpreting response scales and in forming judgments. PMID:24634546

  9. A heuristic evaluation of long-term global sea level acceleration

    NASA Astrophysics Data System (ADS)

    Spada, Giorgio; Olivieri, Marco; Galassi, Gaia

    2015-05-01

    In view of the scientific and social implications, global mean sea level rise (GMSLR), its possible causes, and its future trend have long been a challenge. For the twentieth century, reconstructions generally indicate a rate of GMSLR in the range of 1.5 to 2.0 mm yr-1. However, the existence of nonlinear trends is still debated, and current estimates of the secular acceleration are subject to ample uncertainties. Here we use various GMSLR estimates published in scholarly journals since the 1940s for a heuristic assessment of global sea level acceleration. The approach, an alternative to sea level reconstructions, is based on simple statistical methods and exploits the principles of meta-analysis. Our results point to a global sea level acceleration of 0.54 ± 0.27 mm/yr/century (1σ) between 1898 and 1975. This supports independent estimates and suggests that a sea level acceleration since the early 1900s is more likely than currently believed.

  10. A Matter of Time: Faster Percolator Analysis via Efficient SVM Learning for Large-Scale Proteomics.

    PubMed

    Halloran, John T; Rocke, David M

    2018-05-04

    Percolator is an important tool for greatly improving the results of a database search and subsequent downstream analysis. Using support vector machines (SVMs), Percolator recalibrates peptide-spectrum matches based on the learned decision boundary between targets and decoys. To improve analysis time for large-scale data sets, we update Percolator's SVM learning engine through software and algorithmic optimizations rather than heuristic approaches that necessitate the careful study of their impact on learned parameters across different search settings and data sets. We show that by optimizing Percolator's original learning algorithm, l2-SVM-MFN, large-scale SVM learning requires nearly only a third of the original runtime. Furthermore, we show that by employing the widely used Trust Region Newton (TRON) algorithm instead of l2-SVM-MFN, large-scale Percolator SVM learning is reduced to nearly only a fifth of the original runtime. Importantly, these speedups only affect the speed at which Percolator converges to a global solution and do not alter recalibration performance. The upgraded versions of both l2-SVM-MFN and TRON are optimized within the Percolator codebase for multithreaded and single-thread use and are available under Apache license at bitbucket.org/jthalloran/percolator_upgrade.

  11. EnRICH: Extraction and Ranking using Integration and Criteria Heuristics.

    PubMed

    Zhang, Xia; Greenlee, M Heather West; Serb, Jeanne M

    2013-01-15

    High throughput screening technologies enable biologists to generate candidate genes at a rate that, due to time and cost constraints, far outpaces what can be studied by experimental approaches in the laboratory. Thus, it has become increasingly important to prioritize candidate genes for experiments. To accomplish this, researchers need to apply selection requirements based on their knowledge, which necessitates qualitative integration of heterogeneous data sources and filtration using multiple criteria. A similar approach can also be applied to putative candidate gene relationships. While automation can assist in this routine and imperative procedure, flexibility in data sources and criteria must not be sacrificed. A tool that can optimize the trade-off between automation and flexibility, simultaneously filtering and qualitatively integrating data, is needed to prioritize candidate genes and generate composite networks from heterogeneous data sources. We developed the Java application EnRICH (Extraction and Ranking using Integration and Criteria Heuristics) to address this need. Here we present a case study in which we used EnRICH to integrate and filter multiple candidate gene lists in order to identify potential retinal disease genes. As a result of this procedure, a candidate pool of several hundred genes was narrowed down to five candidate genes, of which four are confirmed retinal disease genes and one is associated with a retinal disease state. We developed a platform-independent tool that is able to qualitatively integrate multiple heterogeneous datasets and use different selection criteria to filter each of them, provided the datasets are tables that have distinct identifiers (required) and attributes (optional). With the flexibility to specify data sources and filtering criteria, EnRICH automatically prioritizes candidate genes or gene relationships for biologists based on their specific requirements. Here, we also demonstrate that this tool can be effectively and easily used to apply highly specific user-defined criteria and can efficiently identify high-quality candidate genes from relatively sparse datasets.

  12. Selected Topics on Decision Making for Electric Vehicles

    NASA Astrophysics Data System (ADS)

    Sweda, Timothy Matthew

    Electric vehicles (EVs) are an attractive alternative to conventional gasoline-powered vehicles due to their lower emissions, fuel costs, and maintenance costs. Range anxiety, or the fear of running out of charge prior to reaching one's destination, remains a significant concern, however. In this dissertation, we address the issue of range anxiety by developing a set of decision support tools for both charging infrastructure providers and EV drivers. In Chapter 1, we present an agent-based information system for identifying patterns in residential EV ownership and driving activities to enable strategic deployment of new charging infrastructure. Driver agents consider their own driving activities within the simulated environment, in addition to the presence of charging stations and the vehicle ownership of others in their social networks, when purchasing a new vehicle. The Chicagoland area is used as a case study to demonstrate the model, and several deployment scenarios are analyzed. In Chapter 2, we address the problem of finding an optimal recharging policy for an EV along a given path. The path consists of a sequence of nodes, each representing a charging station, and the driver must decide where to stop and how much to recharge at each stop. We present efficient algorithms for finding an optimal policy in general instances with deterministic travel costs and homogeneous charging stations, and also for two specialized cases. In addition, we develop two heuristic procedures that we characterize analytically and explore empirically. We further analyze and test our solution methods on model variations that include stochastic travel costs and nonhomogeneous charging stations. In Chapter 3, we study the problem of finding an optimal routing and recharging policy for an electric vehicle in a grid network. Each node in the network represents a charging station and has an associated probability of being available at any point in time or occupied by another vehicle. We present an efficient algorithm for finding an optimal a priori route and recharging policy as well as heuristic methods for finding adaptive policies. We conduct numerical experiments to demonstrate the empirical performance of our solutions.
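
    The recharging-along-a-path problem in Chapter 2 admits a compact dynamic program under simplifying assumptions. The sketch below is not the dissertation's algorithm: it assumes integer charge units, deterministic per-leg energy use, and a per-unit price at each station, all hypothetical.

      import math

      def optimal_recharging(energy, price, capacity):
          """DP sketch for recharging along a fixed path of charging stations.

          Hypothetical simplification of the problem described above: charge
          is discretized into integer units, energy[i] units are consumed on
          the leg after station i, and price[i] is the per-unit cost at
          station i. Returns the minimum total recharging cost when starting
          empty at station 0.
          """
          n = len(price)
          INF = math.inf
          # cost[q] = cheapest way to stand at the current station with charge q
          cost = [0.0] + [INF] * capacity
          for i in range(n - 1):
              # recharge at station i: buying one more unit is a single
              # transition, so a forward sweep handles any recharge amount
              for q in range(1, capacity + 1):
                  cost[q] = min(cost[q], cost[q - 1] + price[i])
              # drive leg i -> i+1, consuming energy[i] units
              need = energy[i]
              nxt = [INF] * (capacity + 1)
              for q in range(need, capacity + 1):
                  nxt[q - need] = cost[q]
              cost = nxt
          return min(cost)

      # toy usage: 4 stations, cheap power early, expensive later -> 13.0
      print(optimal_recharging(energy=[3, 2, 4], price=[1.0, 5.0, 2.0, 9.0], capacity=5))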

  13. The E-health Literacy Demands of Australia's My Health Record: A Heuristic Evaluation of Usability

    PubMed Central

    Walsh, Louisa; Hemsley, Bronwyn; Allan, Meredith; Adams, Natalie; Balandin, Susan; Georgiou, Andrew; Higgins, Isabel; McCarthy, Shaun; Hill, Sophie

    2017-01-01

    Background My Health Record is Australia's electronic personal health record system, which was introduced in July 2012. As of August 2017, approximately 21 percent of Australia's total population was registered to use My Health Record. Internationally, usability issues have been shown to negatively influence the uptake and use of electronic health record systems, and this scenario may particularly affect people who have low e-health literacy. It is likely that usability issues are negatively affecting the uptake and use of My Health Record in Australia. Objective To identify potential e-health literacy–related usability issues within My Health Record through a heuristic evaluation method. Methods Between September 14 and October 12, 2016, three of the authors conducted a heuristic evaluation of the two consumer-facing components of My Health Record—the information website and the electronic health record itself. These two components were evaluated against two sets of heuristics—the Health Literacy Online checklist and the Monkman Heuristics. The Health Literacy Online checklist and Monkman Heuristics are evidence-based checklists of web design elements with a focus on design for audiences with low health literacy. During this heuristic evaluation, the investigators individually navigated through the consumer-facing components of My Health Record, recording instances where the My Health Record did not conform to the checklist criteria. After the individual evaluations were completed, the investigators conferred and aggregated their results. From this process, a list of usability violations was constructed. Results When evaluated against the Health Literacy Online Checklist, the information website demonstrated violations in 12 of 35 criteria, and the electronic health record demonstrated violations in 16 of 35 criteria. When evaluated against the Monkman Heuristics, the information website demonstrated violations in 7 of 11 criteria, and the electronic health record demonstrated violations in 9 of 11 criteria. The identified violations included usability issues with the reading levels used within My Health Record, the graphic design elements, the layout of web pages, and a lack of images and audiovisual tools to support learning. Other important usability issues included a lack of translated resources, difficulty using accessibility tools, and complexity of the registration processes. Conclusion My Health Record is an important piece of technology that has the potential to facilitate better communication between consumers and their health providers. However, this heuristic evaluation demonstrated that many usability-related elements of My Health Record cater poorly to users at risk of having low e-health literacy. Usability issues have been identified as an important barrier to use of personal health records internationally, and the findings of this heuristic evaluation demonstrate that usability issues may be substantial barriers to the uptake and use of My Health Record. PMID:29118683

  14. Managing complex research datasets using electronic tools: A meta-analysis exemplar

    PubMed Central

    Brown, Sharon A.; Martin, Ellen E.; Garcia, Theresa J.; Winter, Mary A.; García, Alexandra A.; Brown, Adama; Cuevas, Heather E.; Sumlin, Lisa L.

    2013-01-01

    Meta-analyses of broad scope and complexity require investigators to organize many study documents and manage communication among several research staff. Commercially available electronic tools, e.g., EndNote, Adobe Acrobat Pro, Blackboard, Excel, and IBM SPSS Statistics (SPSS), are useful for organizing and tracking the meta-analytic process, as well as enhancing communication among research team members. The purpose of this paper is to describe the electronic processes we designed, using commercially available software, for an extensive quantitative model-testing meta-analysis we are conducting. Specific electronic tools improved the efficiency of (a) locating and screening studies, (b) screening and organizing studies and other project documents, (c) extracting data from primary studies, (d) checking data accuracy and analyses, and (e) communication among team members. The major limitation in designing and implementing a fully electronic system for meta-analysis was the requisite upfront time to: decide on which electronic tools to use, determine how these tools would be employed, develop clear guidelines for their use, and train members of the research team. The electronic process described here has been useful in streamlining the process of conducting this complex meta-analysis and enhancing communication and sharing documents among research team members. PMID:23681256

  15. Managing complex research datasets using electronic tools: a meta-analysis exemplar.

    PubMed

    Brown, Sharon A; Martin, Ellen E; Garcia, Theresa J; Winter, Mary A; García, Alexandra A; Brown, Adama; Cuevas, Heather E; Sumlin, Lisa L

    2013-06-01

    Meta-analyses of broad scope and complexity require investigators to organize many study documents and manage communication among several research staff. Commercially available electronic tools, for example, EndNote, Adobe Acrobat Pro, Blackboard, Excel, and IBM SPSS Statistics (SPSS), are useful for organizing and tracking the meta-analytic process as well as enhancing communication among research team members. The purpose of this article is to describe the electronic processes designed, using commercially available software, for an extensive, quantitative model-testing meta-analysis. Specific electronic tools improved the efficiency of (a) locating and screening studies, (b) screening and organizing studies and other project documents, (c) extracting data from primary studies, (d) checking data accuracy and analyses, and (e) communication among team members. The major limitation in designing and implementing a fully electronic system for meta-analysis was the requisite upfront time to decide on which electronic tools to use, determine how these tools would be used, develop clear guidelines for their use, and train members of the research team. The electronic process described here has been useful in streamlining the process of conducting this complex meta-analysis and enhancing communication and sharing documents among research team members.

  16. A model for distribution centers location-routing problem on a multimodal transportation network with a meta-heuristic solving approach

    NASA Astrophysics Data System (ADS)

    Fazayeli, Saeed; Eydi, Alireza; Kamalabadi, Isa Nakhai

    2017-07-01

    Nowadays, organizations have to compete with different competitors at regional, national and international levels, so they have to improve their competitive capabilities to survive. Undertaking activities on a global scale requires a proper distribution system that can take advantage of different transportation modes. Accordingly, the present paper addresses a location-routing problem on a multimodal transportation network. The introduced problem pursues four objectives simultaneously, which form the main contribution of the paper: determining multimodal routes between the supplier and distribution centers, locating mode-changing facilities, locating distribution centers, and determining product delivery tours from the distribution centers to retailers. An integer linear programming model is presented for the problem, and a genetic algorithm with a new chromosome structure is proposed to solve it. The proposed chromosome consists of two different parts, one for the multimodal transportation part of the model and one for the location-routing part. Based on data published in the literature, two numerical cases of different sizes were generated and solved. Also, different cost scenarios were designed to better analyze model and algorithm performance. Results show that the algorithm can effectively solve large-size problems within a reasonable time, whereas the GAMS software failed to reach an optimal solution even within much longer times.
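
    The paper's exact encoding is not reproduced here, but a two-part chromosome of the kind described can be sketched as follows: one gene segment selects a transportation mode per multimodal leg, the other permutes retailers into delivery tours. The mode names and genetic operators are illustrative assumptions.

      import random

      # Illustrative two-part chromosome, loosely following the paper's idea:
      # part 1 encodes the transportation mode chosen on each multimodal leg,
      # part 2 encodes delivery tours as a permutation of retailers.
      MODES = ["road", "rail", "sea"]   # hypothetical mode set

      def random_chromosome(n_legs, n_retailers):
          mode_genes = [random.randrange(len(MODES)) for _ in range(n_legs)]
          tour_genes = random.sample(range(n_retailers), n_retailers)
          return mode_genes, tour_genes

      def crossover(parent_a, parent_b):
          """One-point crossover on the mode part; a simplified order
          crossover on the tour part, so the child tour stays a valid
          permutation."""
          modes_a, tour_a = parent_a
          modes_b, tour_b = parent_b
          cut = random.randrange(1, len(modes_a))
          child_modes = modes_a[:cut] + modes_b[cut:]
          # keep a slice from parent A, fill the rest in parent B's order
          i, j = sorted(random.sample(range(len(tour_a)), 2))
          slice_a = tour_a[i:j]
          child_tour = slice_a + [g for g in tour_b if g not in slice_a]
          return child_modes, child_tour

      random.seed(1)
      p1 = random_chromosome(n_legs=4, n_retailers=6)
      p2 = random_chromosome(n_legs=4, n_retailers=6)
      print(crossover(p1, p2))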

  17. A model for distribution centers location-routing problem on a multimodal transportation network with a meta-heuristic solving approach

    NASA Astrophysics Data System (ADS)

    Fazayeli, Saeed; Eydi, Alireza; Kamalabadi, Isa Nakhai

    2018-07-01

    Nowadays, organizations have to compete with different competitors at regional, national and international levels, so they have to improve their competitive capabilities to survive. Undertaking activities on a global scale requires a proper distribution system that can take advantage of different transportation modes. Accordingly, the present paper addresses a location-routing problem on a multimodal transportation network. The introduced problem pursues four objectives simultaneously, which form the main contribution of the paper: determining multimodal routes between the supplier and distribution centers, locating mode-changing facilities, locating distribution centers, and determining product delivery tours from the distribution centers to retailers. An integer linear programming model is presented for the problem, and a genetic algorithm with a new chromosome structure is proposed to solve it. The proposed chromosome consists of two different parts, one for the multimodal transportation part of the model and one for the location-routing part. Based on data published in the literature, two numerical cases of different sizes were generated and solved. Also, different cost scenarios were designed to better analyze model and algorithm performance. Results show that the algorithm can effectively solve large-size problems within a reasonable time, whereas the GAMS software failed to reach an optimal solution even within much longer times.

  18. Multi-objective Analysis for a Sequencing Planning of Mixed-model Assembly Line

    NASA Astrophysics Data System (ADS)

    Shimizu, Yoshiaki; Waki, Toshiya; Yoo, Jae Kyu

    Diversified customer demands are raising the importance of just-in-time and agile manufacturing much more than before. Accordingly, the introduction of mixed-model assembly lines has become popular as a way to realize small-lot, multi-kind production. Since such a line produces various kinds of products on the same assembly line, rational management is of special importance. With this point of view, this study focuses on a sequencing problem for a mixed-model assembly line that includes a paint line as its preceding process. By taking the paint line into account, reducing work-in-process (WIP) inventory between these heterogeneous lines becomes a major concern of the sequencing problem, besides improving production efficiency. We have formulated the sequencing problem as a bi-objective optimization problem that prevents various line stoppages and reduces the volume of WIP inventory simultaneously, and we have proposed a practical method for the multi-objective analysis. For this purpose, we applied the weighting method to derive the Pareto front, and the resulting scalarized problem is solved by a meta-heuristic method, simulated annealing (SA). Through numerical experiments, we verified the validity of the proposed approach and discussed the significance of trade-off analysis between the conflicting objectives.
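
    As a sketch of the weighting method combined with SA, the following code scalarizes two dummy objectives with a weight w and anneals over job sequences; sweeping w traces an approximation of the Pareto front. The objective functions are placeholders, not the paper's line-stoppage or WIP models.

      import math
      import random

      def simulated_annealing(seq, cost, t0=10.0, cooling=0.995, n_iter=5000):
          """Generic SA over permutations using adjacent-swap neighborhoods."""
          current, best = seq[:], seq[:]
          t = t0
          for _ in range(n_iter):
              cand = current[:]
              i = random.randrange(len(cand) - 1)
              cand[i], cand[i + 1] = cand[i + 1], cand[i]   # swap neighbors
              delta = cost(cand) - cost(current)
              if delta < 0 or random.random() < math.exp(-delta / t):
                  current = cand
                  if cost(current) < cost(best):
                      best = current[:]
              t *= cooling
          return best

      # Weighted-sum scalarization of two objectives (hypothetical stand-ins
      # for line-stoppage risk and WIP inventory); sweep w for the Pareto front.
      def f_stoppage(seq):   # dummy objective 1
          return sum(abs(a - b) for a, b in zip(seq, seq[1:]))

      def f_wip(seq):        # dummy objective 2
          return sum(i * x for i, x in enumerate(seq))

      random.seed(0)
      jobs = list(range(8))
      for w in (0.1, 0.5, 0.9):
          cost = lambda s, w=w: w * f_stoppage(s) + (1 - w) * f_wip(s)
          best = simulated_annealing(jobs, cost)
          print(w, f_stoppage(best), f_wip(best))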

  19. Optimization Models for Scheduling of Jobs

    PubMed Central

    Indika, S. H. Sathish; Shier, Douglas R.

    2006-01-01

    This work is motivated by a particular scheduling problem that is faced by logistics centers that perform aircraft maintenance and modification. Here we concentrate on a single facility (hangar) which is equipped with several work stations (bays). Specifically, a number of jobs have already been scheduled for processing at the facility; the starting times, durations, and work station assignments for these jobs are assumed to be known. We are interested in how best to schedule a number of new jobs that the facility will be processing in the near future. We first develop a mixed integer quadratic programming model (MIQP) for this problem. Since the exact solution of this MIQP formulation is time consuming, we develop a heuristic procedure, based on existing bin packing techniques. This heuristic is further enhanced by application of certain local optimality conditions. PMID:27274921
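
    Since the heuristic builds on bin-packing techniques, a classical first-fit decreasing (FFD) baseline is sketched below; the authors' procedure additionally handles fixed prior jobs and local-optimality enhancements, which are omitted here.

      def first_fit_decreasing(durations, capacity):
          """Classical FFD bin-packing heuristic: sort jobs by decreasing
          size, place each into the first bin (work station) with room,
          opening a new bin when none fits. A generic baseline, not the
          paper's enhanced procedure."""
          bins = []  # each bin is [remaining_capacity, [jobs]]
          for job in sorted(durations, reverse=True):
              for b in bins:
                  if b[0] >= job:
                      b[0] -= job
                      b[1].append(job)
                      break
              else:
                  bins.append([capacity - job, [job]])
          return [jobs for _, jobs in bins]

      # toy usage: seven jobs packed into stations of capacity 10
      print(first_fit_decreasing([7, 5, 4, 3, 2, 2, 1], capacity=10))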

  20. Optimal and heuristic algorithms of planning of low-rise residential buildings

    NASA Astrophysics Data System (ADS)

    Kartak, V. M.; Marchenko, A. A.; Petunin, A. A.; Sesekin, A. N.; Fabarisova, A. I.

    2017-10-01

    The problem of the optimal layout of a low-rise residential building is considered. Each apartment must be no smaller than the corresponding apartment from the proposed list, all requests must be fulfilled, and the excess of the total area over the total area of the apartments from the list must be minimized. The differences in area arise from the discreteness of the distances between bearing walls and a number of other technological constraints. It is shown that this problem is NP-hard. The authors built an integer linear model and conducted a qualitative analysis of it, and also developed a heuristic algorithm for solving problems of high dimension. A computational experiment was conducted that confirms the efficiency of the proposed approach. Practical recommendations on the use of the proposed algorithms are given.

  1. Implied alignment: a synapomorphy-based multiple-sequence alignment method and its use in cladogram search

    NASA Technical Reports Server (NTRS)

    Wheeler, Ward C.

    2003-01-01

    A method to align sequence data based on parsimonious synapomorphy schemes generated by direct optimization (DO; earlier termed optimization alignment) is proposed. DO directly diagnoses sequence data on cladograms without an intervening multiple-alignment step, thereby creating topology-specific, dynamic homology statements. Hence, no multiple-alignment is required to generate cladograms. Unlike general and globally optimal multiple-alignment procedures, the method described here, implied alignment (IA), takes these dynamic homologies and traces them back through a single cladogram, linking the unaligned sequence positions in the terminal taxa via DO transformation series. These "lines of correspondence" link ancestor-descendent states and, when displayed as linearly arrayed columns without hypothetical ancestors, are largely indistinguishable from standard multiple alignment. Since this method is based on synapomorphy, the treatment of certain classes of insertion-deletion (indel) events may be different from that of other alignment procedures. As with all alignment methods, results are dependent on parameter assumptions such as indel cost and transversion:transition ratios. Such an IA could be used as a basis for phylogenetic search, but this would be questionable, since the homologies derived from the implied alignment depend on its natal cladogram, with any variance between DO and IA-based search due to the heuristic approach. The utility of this procedure in heuristic cladogram searches using DO and the improvement of heuristic cladogram cost calculations are discussed. © 2003 The Willi Hennig Society. Published by Elsevier Science (USA). All rights reserved.

  2. Applying User-Centered Design Methods to the Development of an mHealth Application for Use in the Hospital Setting by Patients and Care Partners.

    PubMed

    Couture, Brittany; Lilley, Elizabeth; Chang, Frank; DeBord Smith, Ann; Cleveland, Jessica; Ergai, Awatef; Katsulis, Zachary; Benneyan, James; Gershanik, Esteban; Bates, David W; Collins, Sarah A

    2018-04-01

    Developing an optimized and user-friendly mHealth application for patients and family members in the hospital environment presents unique challenges given the diverse patient population and patients' various states of well-being. This article describes user-centered design methods and results for developing the patient and family facing user interface and functionality of MySafeCare, a safety reporting tool for hospitalized patients and their family members. Individual and group usability sessions were conducted with specific testing scenarios for participants to follow to test the usability and functionality of the tool. Participants included patients, family members, and Patient and Family Advisory Council (PFAC) members. Engagement rounds were also conducted on study units and lessons learned provided additional information to the usability work. Usability results were aligned with Nielsen's Usability Heuristics. Eleven patients and family members and 25 PFAC members participated in usability testing, and over 250 patients and family members were engaged during research team rounding. Specific themes resulting from the usability testing sessions influenced the changes made to the user interface design, workflow functionality, and terminology. User-centered design should focus on workflow functionality, terminology, and user interface issues for mHealth applications. These themes illustrated issues aligned with four of Nielsen's Usability Heuristics: match between system and the real world, consistency and standards, flexibility and efficiency of use, and aesthetic and minimalist design. We identified workflow and terminology issues that may be specific to the use of an mHealth application focused on safety and used by hospitalized patients and their families. Schattauer GmbH Stuttgart.

  3. Optimization of Boiling Water Reactor Loading Pattern Using Two-Stage Genetic Algorithm

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kobayashi, Yoko; Aiyoshi, Eitaro

    2002-10-15

    A new two-stage optimization method based on genetic algorithms (GAs) using an if-then heuristic rule was developed to generate optimized boiling water reactor (BWR) loading patterns (LPs). In the first stage, the LP is optimized using an improved GA operator. In the second stage, an exposure-dependent control rod pattern (CRP) is sought using GA with an if-then heuristic rule. The procedure of the improved GA is based on deterministic operators that consist of crossover, mutation, and selection. The handling of the encoding technique and constraint conditions by that GA reflects the peculiar characteristics of the BWR. In addition, strategies such as elitism and self-reproduction are effectively used in order to improve the search speed. The LP evaluations were performed with a three-dimensional diffusion code that coupled neutronic and thermal-hydraulic models. Strong axial heterogeneities and constraints dependent on three dimensions have always necessitated the use of three-dimensional core simulators for BWRs, so that optimization of computational efficiency is required. The proposed algorithm is demonstrated by successfully generating LPs for an actual BWR plant in two phases. One phase is only LP optimization applying the Haling technique. The other phase is an LP optimization that considers the CRP during reactor operation. In test calculations, candidates that shuffled fresh and burned fuel assemblies within a reasonable computation time were obtained.

  4. Augmented neural networks and problem structure-based heuristics for the bin-packing problem

    NASA Astrophysics Data System (ADS)

    Kasap, Nihat; Agarwal, Anurag

    2012-08-01

    In this article, we report on a research project where we applied augmented-neural-networks (AugNNs) approach for solving the classical bin-packing problem (BPP). AugNN is a metaheuristic that combines a priority rule heuristic with the iterative search approach of neural networks to generate good solutions fast. This is the first time this approach has been applied to the BPP. We also propose a decomposition approach for solving harder BPP, in which subproblems are solved using a combination of AugNN approach and heuristics that exploit the problem structure. We discuss the characteristics of problems on which such problem structure-based heuristics could be applied. We empirically show the effectiveness of the AugNN and the decomposition approach on many benchmark problems in the literature. For the 1210 benchmark problems tested, 917 problems were solved to optimality and the average gap between the obtained solution and the upper bound for all the problems was reduced to under 0.66% and computation time averaged below 33 s per problem. We also discuss the computational complexity of our approach.

  5. Implementing and Bounding a Cascade Heuristic for Large-Scale Optimization

    DTIC Science & Technology

    2017-06-01

    solving the monolith, we develop a method for producing lower bounds to the optimal objective function value. To do this, we solve a new integer...as developing and analyzing methods for producing lower bounds to the optimal objective function value of the seminal problem monolith, which this...length of the window decreases, the end effects of the model typically increase (Zerr, 2016). There are four primary methods for correcting end

  6. Heuristic decomposition for non-hierarchic systems

    NASA Technical Reports Server (NTRS)

    Bloebaum, Christina L.; Hajela, P.

    1991-01-01

    Design and optimization is substantially more complex in multidisciplinary and large-scale engineering applications due to the existing inherently coupled interactions. The paper introduces a quasi-procedural methodology for multidisciplinary optimization that is applicable for nonhierarchic systems. The necessary decision-making support for the design process is provided by means of an embedded expert systems capability. The method employs a decomposition approach whose modularity allows for implementation of specialized methods for analysis and optimization within disciplines.

  7. SAGA: A project to automate the management of software production systems

    NASA Technical Reports Server (NTRS)

    Campbell, Roy H.; Beckman, Carol S.; Benzinger, Leonora; Beshers, George; Hammerslag, David; Kimball, John; Kirslis, Peter A.; Render, Hal; Richards, Paul; Terwilliger, Robert

    1985-01-01

    The SAGA system is a software environment that is designed to support most of the software development activities that occur in a software lifecycle. The system can be configured to support specific software development applications using given programming languages, tools, and methodologies. Meta-tools are provided to ease configuration. The SAGA system consists of a small number of software components that are adapted by the meta-tools into specific tools for use in the software development application. The modules are designed so that the meta-tools can construct an environment which is both integrated and flexible. The SAGA project is documented in the several papers presented here.

  8. a New Multimodal Multi-Criteria Route Planning Model by Integrating a Fuzzy-Ahp Weighting Method and a Simulated Annealing Algorithm

    NASA Astrophysics Data System (ADS)

    Ghaderi, F.; Pahlavani, P.

    2015-12-01

    A multimodal multi-criteria route planning (MMRP) system provides an optimal multimodal route from an origin point to a destination point considering two or more criteria, in such a way that this route can be a combination of public and private transportation modes. In this paper, simulated annealing (SA) and the fuzzy analytical hierarchy process (fuzzy AHP) were combined in order to find this route. In this regard, firstly, the criteria that are significant for users in their trip were determined. Then the weight of each criterion was calculated using the fuzzy AHP weighting method. The most important characteristic of this weighting method is the use of fuzzy numbers, which helps users to express their uncertainty in pairwise comparisons of criteria. After determining the criteria weights, the proposed SA algorithm was used to determine an optimal route from an origin to a destination; one of the most important problems in a meta-heuristic algorithm is becoming trapped in local minima, which SA is designed to escape. In this study, five transportation modes, including subway, bus rapid transit (BRT), taxi, walking, and bus, were considered for moving between nodes. Also, the fare, the time, the user's inconvenience, and the length of the path were considered as effective criteria for solving the problem. The proposed model was implemented for an area in the centre of Tehran with a GUI in the MATLAB programming language. The results showed the high efficiency and speed of the proposed algorithm, which support our analyses.

  9. Development of a smart home simulator for use as a heuristic tool for management of sensor distribution.

    PubMed

    Poland, Michael P; Nugent, Chris D; Wang, Hui; Chen, Liming

    2009-01-01

    Smart Homes offer potential solutions for various forms of independent living for the elderly. The assistive and protective environment afforded by smart homes offers a safe, relatively inexpensive, dependable and viable alternative for vulnerable inhabitants. Nevertheless, the success of a smart home rests upon the quality of information its decision support system receives, and this in turn places great importance on the issue of correct sensor deployment. In this article we present a software tool that has been developed to address the elusive issue of sensor distribution within smart homes. Details of the tool will be presented, and it will be shown how it can be used to emulate any real-world environment whereby virtual sensor distributions can be rapidly implemented and assessed without the requirement for physical deployment for evaluation. As such, this approach offers the potential of tailoring sensor distributions to the specific needs of a patient in a non-invasive manner. The heuristics-based tool presented here has been developed as the first part of a three-stage project.

  10. Kant on biological teleology: Towards a two-level interpretation.

    PubMed

    Quarfood, Marcel

    2006-12-01

    According to Kant's Critique of the power of judgment, teleological considerations are unavoidable for conceptualizing organisms. Does this mean that teleology is more than merely heuristic? Kant stresses the regulative status of teleological attributions, but sometimes he seems to treat teleology as a constitutive condition for biology. To clarify this issue, the concept of natural purpose and its role for biology are examined. I suggest that the concept serves an identificatory function: it singles out objects as natural purposes, whereby the special science of biology is constituted. This relative constitutivity of teleology is explicated by means of a distinction of levels: on the object level of biological science, teleology is taken as constitutive, though it is merely regulative on the philosophical meta level. This distinction also concerns the place of Aristotelian teleology in Kant: on the object level, the Aristotelian view is accepted, whereas on the meta level, an agnostic stance is taken concerning teleology.

  11. Public Inaccuracy in Meta-perceptions of Climate Change

    NASA Astrophysics Data System (ADS)

    Swim, J.; Fraser, J.

    2012-12-01

    Public perceptions of climate change and meta-perceptions of the public's and climate scientists' perceptions of climate change were assessed to benchmark the National Network for Climate Change Interpretation's impacts. Meta-perceptions are important to examine because they can have implications for willingness to take action to address climate change. For instance, recent research suggests that a tendency to misperceive that there is disagreement among climate scientists is predictive of lack of support for climate change policies. Underestimating public concern about climate change could also be problematic: it could lead individuals to withdraw from personal efforts to reduce impact and from engaging others in discussions about climate change. The presented results demonstrate that respondents in a national survey underestimated the percent of the public who were very concerned, concerned or cautious about climate change and overestimated the extent to which others were disengaged, doubtful, or non-believers. They underestimated the percent of the public who likely believed that humans caused climate change and overestimated the percent who believed climate change was neither happening nor human induced. Finally, they underestimated the percent of the public who believed climate change threatened ocean health. The results also explore sources of misperceptions. First, correlates with TV viewing habits suggest that inaccuracy is a result of too little attention to network news, with one exception: greater attention to FOX among doubters reduced accuracy. Second, adding to other evidence that basic cognitive heuristics (such as the availability heuristic) influence perceptions of climate change, we show that false consensus effects account for meta-perceptions of the public's and climate scientists' beliefs. The false consensus effect, in combination with underestimating concern among the public, results in those most concerned about climate change, and those who believe it to be human caused, being more accurate in their meta-perceptions than their disbelieving counterparts. Yet even this group underestimates the public's concern about climate change, and the presence of the false consensus effect suggests that greater accuracy is not a result of greater knowledge about others' beliefs but rather a result of personal cognitive or motivational biases counteracting a general trend toward underestimating the general public's concern. We conclude that there is a need to inform the public that the majority of the public believes climate change is human caused and impacts ocean health, and to increase the public's confidence in climate scientists' agreement about the existence, causes, and impacts of climate change.

  12. Overcoming redundancies in bedside nursing assessments by validating a parsimonious meta-tool: findings from a methodological exercise study.

    PubMed

    Palese, Alvisa; Marini, Eva; Guarnier, Annamaria; Barelli, Paolo; Zambiasi, Paola; Allegrini, Elisabetta; Bazoli, Letizia; Casson, Paola; Marin, Meri; Padovan, Marisa; Picogna, Michele; Taddia, Patrizia; Chiari, Paolo; Salmaso, Daniele; Marognolli, Oliva; Canzan, Federica; Ambrosi, Elisa; Saiani, Luisa; Grassetti, Luca

    2016-10-01

    There is growing interest in validating tools aimed at supporting the clinical decision-making process and research. However, an increased bureaucratization of clinical practice and redundancies in the measures collected have been reported by clinicians. Redundancies in clinical assessments affect both patients and nurses negatively. To validate a meta-tool measuring the risks/problems currently estimated by multiple tools used in daily practice. A secondary analysis of a database was performed, using cross-validation and longitudinal study designs. In total, 1464 patients admitted to 12 medical units in 2012 were assessed at admission with the Brass, Barthel, Conley and Braden tools. Pertinent outcomes such as the occurrence of post-discharge need for resources and functional decline at discharge, as well as falls and pressure sores, were measured. Explorative factor analysis of each tool, inter-tool correlations and a conceptual evaluation of the redundant/similar items across tools were performed. Then, the validation of the meta-tool was performed through explorative factor analysis, confirmatory factor analysis and structural equation modelling to establish the ability of the meta-tool to predict the outcomes estimated by the original tools. High correlations between the tools emerged (r from 0.428 to 0.867), with a common variance from 18.3% to 75.1%. Through a conceptual evaluation and explorative factor analysis, the items were reduced from 42 to 20, and the three factors that emerged were confirmed by confirmatory factor analysis. According to the structural equation model results, two of the three emergent factors predicted the outcomes. From the initial 42 items, the meta-tool is composed of 20 items capable of predicting the outcomes as with the original tools. © 2016 John Wiley & Sons, Ltd.

  13. Data Farming and Defense Applications

    NASA Technical Reports Server (NTRS)

    Horne, Gary; Meyer, Ted

    2011-01-01

    Data farming uses simulation modeling, high-performance computing, experimental design and analysis to examine questions of interest with large possibility spaces. This methodology allows for the examination of whole landscapes of potential outcomes and provides the capability of executing enough experiments so that outliers might be captured and examined for insights. It can be used to conduct sensitivity studies, to support validation and verification of models, to iteratively optimize outputs using heuristic search and discovery, and as an aid to decision-makers in understanding complex relationships of factors. In this paper we describe efforts at the Naval Postgraduate School in developing these new and emerging tools. We also discuss data farming in the context of application to questions inherent in military decision-making. The particular application we illustrate here is social network modeling to support the countering of improvised explosive devices.

  14. Performance Analysis and Portability of the PLUM Load Balancing System

    NASA Technical Reports Server (NTRS)

    Oliker, Leonid; Biswas, Rupak; Gabow, Harold N.

    1998-01-01

    The ability to dynamically adapt an unstructured mesh is a powerful tool for solving computational problems with evolving physical features; however, an efficient parallel implementation is rather difficult. To address this problem, we have developed PLUM, an automatic portable framework for performing adaptive numerical computations in a message-passing environment. PLUM requires that all data be globally redistributed after each mesh adaption to achieve load balance. We present an algorithm for minimizing this remapping overhead by guaranteeing an optimal processor reassignment. We also show that the data redistribution cost can be significantly reduced by applying our heuristic processor reassignment algorithm to the default mapping of the parallel partitioner. Portability is examined by comparing performance on a SP2, an Origin2000, and a T3E. Results show that PLUM can be successfully ported to different platforms without any code modifications.

  15. Scalable Heuristics for Planning, Placement and Sizing of Flexible AC Transmission System Devices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frolov, Vladmir; Backhaus, Scott N.; Chertkov, Michael

    Aiming to relieve transmission grid congestion and improve or extend the feasibility domain of operations, we build optimization heuristics, generalizing standard AC Optimal Power Flow (OPF), for placement and sizing of Flexible Alternating Current Transmission System (FACTS) devices of the Series Compensation (SC) and Static VAR Compensation (SVC) type. One use of these devices is in resolving the case when the AC OPF solution does not exist because of congestion. Another application is developing a long-term investment strategy for placement and sizing of the SC and SVC devices to reduce operational cost and improve power system operation. SC and SVC devices are represented by modification of the transmission line inductances and reactive power nodal corrections, respectively. We find one placement and sizing of FACTS devices for multiple scenarios and optimal settings for each scenario simultaneously. Our solution of the nonlinear and nonconvex generalized AC-OPF consists of building a convergent sequence of convex optimizations containing only linear constraints and shows good computational scaling to larger systems. The approach is illustrated on single- and multi-scenario examples of the Matpower case-30 model.

  16. Mitigation of epidemics in contact networks through optimal contact adaptation

    PubMed Central

    Youssef, Mina; Scoglio, Caterina

    2013-01-01

    This paper presents an optimal control problem formulation to minimize the total number of infection cases during the spread of susceptible-infected-recovered (SIR) epidemics in contact networks. In the new approach, contact weights are reduced among nodes and a global minimum contact level is preserved in the network. In addition, the infection cost and the cost associated with the contact reduction are linearly combined in a single objective function. Hence, the optimal control formulation addresses the tradeoff between minimization of total infection cases and minimization of contact weight reduction. Using Pontryagin's theorem, the obtained solution is a unique candidate representing the dynamical weighted contact network. To find a near-optimal solution in a decentralized way, we propose two heuristics based on a Bang-Bang control function and on a piecewise nonlinear control function, respectively. We perform extensive simulations to evaluate the two heuristics on different networks. Our results show that the piecewise nonlinear control function outperforms the well-known Bang-Bang control function in minimizing both the total number of infection cases and the reduction of contact weights. Finally, our results show awareness of the infection level at which the mitigation strategies are effectively applied to the contact weights. PMID:23906209
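
    A Bang-Bang control function of the kind compared here can be illustrated on a toy well-mixed SIR model: contacts are cut to a floor whenever prevalence exceeds a threshold. This caricature ignores the paper's network structure and optimal-control machinery; all parameter values are made up.

      def sir_with_bangbang(beta=0.3, gamma=0.1, threshold=0.05, floor=0.3,
                            i0=0.01, days=200):
          """Toy well-mixed SIR with a bang-bang contact policy: contacts are
          cut to `floor` of normal whenever infections exceed `threshold`,
          else left at full strength. A caricature of the paper's
          network-level contact-weight controls."""
          s, i, r = 1.0 - i0, i0, 0.0
          total_infected = i0
          for _ in range(days):
              u = floor if i > threshold else 1.0   # bang-bang control signal
              new_inf = u * beta * s * i
              new_rec = gamma * i
              s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
              total_infected += new_inf
          return total_infected

      print("attack rate, no control :", round(sir_with_bangbang(floor=1.0), 3))
      print("attack rate, bang-bang  :", round(sir_with_bangbang(floor=0.3), 3))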

  17. Mitigation of epidemics in contact networks through optimal contact adaptation.

    PubMed

    Youssef, Mina; Scoglio, Caterina

    2013-08-01

    This paper presents an optimal control problem formulation to minimize the total number of infection cases during the spread of susceptible-infected-recovered (SIR) epidemics in contact networks. In the new approach, contact weights are reduced among nodes and a global minimum contact level is preserved in the network. In addition, the infection cost and the cost associated with the contact reduction are linearly combined in a single objective function. Hence, the optimal control formulation addresses the tradeoff between minimization of total infection cases and minimization of contact weight reduction. Using Pontryagin's theorem, the obtained solution is a unique candidate representing the dynamical weighted contact network. To find a near-optimal solution in a decentralized way, we propose two heuristics based on a Bang-Bang control function and on a piecewise nonlinear control function, respectively. We perform extensive simulations to evaluate the two heuristics on different networks. Our results show that the piecewise nonlinear control function outperforms the well-known Bang-Bang control function in minimizing both the total number of infection cases and the reduction of contact weights. Finally, our results show awareness of the infection level at which the mitigation strategies are effectively applied to the contact weights.

  18. Social Milieu Oriented Routing: A New Dimension to Enhance Network Security in WSNs.

    PubMed

    Liu, Lianggui; Chen, Li; Jia, Huiling

    2016-02-19

    In large-scale wireless sensor networks (WSNs), in order to enhance network security, it is crucial for a trustor node to perform social milieu oriented routing to a target trustee node to carry out trust evaluation. This challenging social milieu oriented routing with more than one end-to-end Quality of Trust (QoT) constraint has proved to be NP-complete. Heuristic algorithms with polynomial and pseudo-polynomial-time complexities are often used to deal with this challenging problem. However, existing solutions cannot guarantee the efficiency of searching; that is, they can hardly avoid obtaining partial optimal solutions during a searching process. Quantum annealing (QA) uses delocalization and tunneling to avoid falling into local minima without sacrificing execution time. This has been shown to be a promising approach to many optimization problems in recently published literature. In this paper, for the first time, with the help of a novel approach, that is, configuration path-integral Monte Carlo (CPIMC) simulations, a QA-based optimal social trust path (QA_OSTP) selection algorithm is applied to the extraction of the optimal social trust path in large-scale WSNs. Extensive experiments have been conducted, and the experiment results demonstrate that QA_OSTP outperforms its heuristic opponents.

  19. Using scenarios and personas to enhance the effectiveness of heuristic usability evaluations for older adults and their care team.

    PubMed

    Kneale, Laura; Mikles, Sean; Choi, Yong K; Thompson, Hilaire; Demiris, George

    2017-09-01

    Using heuristics to evaluate user experience is a common methodology for human-computer interaction studies. One challenge of this method is the inability to tailor results towards specific end-user needs. This manuscript reports on a method that uses validated scenarios and personas of older adults and care team members to enhance heuristic evaluations of the usability of commercially available personal health records for homebound older adults. Our work extends the Chisnell and Redish heuristic evaluation methodology by using a protocol that relies on multiple expert reviews of each system. It further standardizes the heuristic evaluation process through the incorporation of task-based scenarios. We were able to use the modified version of the Chisnell and Redish heuristic evaluation methodology to identify potential usability challenges of two commercially available personal health record systems. This allowed us to: (1) identify potential usability challenges for specific types of users, (2) describe improvements that would be valuable to all end-users of the system, and (3) better understand how the interactions of different users may vary within a single personal health record. The methodology described in this paper may help designers of consumer health information technology tools, such as personal health records, understand the needs of diverse end-user populations. Such methods may be particularly helpful when designing systems for populations that are difficult to recruit for end-user evaluations through traditional methods. Copyright © 2017 Elsevier Inc. All rights reserved.

  20. HSA: a heuristic splice alignment tool.

    PubMed

    Bu, Jingde; Chi, Xuebin; Jin, Zhong

    2013-01-01

    RNA-Seq is a revolutionary transcriptome sequencing technology and a representative of next-generation sequencing (NGS). With the high-throughput sequencing of RNA-Seq, we can acquire much more information, such as differential expression and novel splice variants, from deep sequence analysis and data mining. But the short read length poses a great challenge to alignment, especially when reads span two or more exons. A two-step heuristic splice alignment tool was developed in this investigation. First, raw reads are mapped to the reference with an unspliced aligner (BWA); second, the initially unmapped reads are split into three equal short reads (seeds), each seed is aligned to the reference, hits are filtered, the possible split position of the read is searched for, and hits are extended to a complete match. Compared with other splice alignment tools such as SOAPsplice and TopHat2, HSA performs better in call rate and efficiency, though its results are somewhat less accurate. HSA is an effective spliced aligner for RNA-Seq read mapping and is available at https://github.com/vlcc/HSA.
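
    The seed-splitting step of HSA's second stage is easy to sketch; actual alignment of each seed requires a reference index (e.g., BWA) and is omitted. The function name and return format below are assumptions, not HSA's API.

      def split_into_seeds(read, n_seeds=3):
          """Split an unmapped read into roughly equal seeds, as in HSA's
          second step; each seed would then be aligned independently and its
          hits extended. Returns (offset, subsequence) pairs; the offsets are
          needed later to infer the split position within the read."""
          k = len(read) // n_seeds
          seeds = []
          for j in range(n_seeds):
              start = j * k
              end = start + k if j < n_seeds - 1 else len(read)
              seeds.append((start, read[start:end]))
          return seeds

      print(split_into_seeds("ACGTACGTACGTACGTACGTACGT"))   # 24 bp -> three 8-mers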

  1. Heuristic-based information acquisition and decision making among pilots.

    PubMed

    Wiggins, Mark W; Bollwerk, Sandra

    2006-01-01

    This research was designed to examine the impact of heuristic-based approaches to the acquisition of task-related information on the selection of an optimal alternative during simulated in-flight decision making. The work integrated features of naturalistic and normative decision making and strategies of information acquisition within a computer-based, decision support framework. The study comprised two phases, the first of which involved familiarizing pilots with three different heuristic-based strategies of information acquisition: frequency, elimination by aspects, and majority of confirming decisions. The second stage enabled participants to choose one of the three strategies of information acquisition to resolve a fourth (choice) scenario. The results indicated that task-oriented experience, rather than the information acquisition strategies, predicted the selection of the optimal alternative. It was also evident that of the three strategies available, the elimination by aspects information acquisition strategy was preferred by most participants. It was concluded that task-oriented experience, rather than the process of information acquisition, predicted task accuracy during the decision-making task. It was also concluded that pilots have a preference for one particular approach to information acquisition. Applications of outcomes of this research include the development of decision support systems that adapt to the information-processing capabilities and preferences of users.

  2. Accelerated Profile HMM Searches

    PubMed Central

    Eddy, Sean R.

    2011-01-01

    Profile hidden Markov models (profile HMMs) and probabilistic inference methods have made important contributions to the theory of sequence database homology search. However, practical use of profile HMM methods has been hindered by the computational expense of existing software implementations. Here I describe an acceleration heuristic for profile HMMs, the “multiple segment Viterbi” (MSV) algorithm. The MSV algorithm computes an optimal sum of multiple ungapped local alignment segments using a striped vector-parallel approach previously described for fast Smith/Waterman alignment. MSV scores follow the same statistical distribution as gapped optimal local alignment scores, allowing rapid evaluation of significance of an MSV score and thus facilitating its use as a heuristic filter. I also describe a 20-fold acceleration of the standard profile HMM Forward/Backward algorithms using a method I call “sparse rescaling”. These methods are assembled in a pipeline in which high-scoring MSV hits are passed on for reanalysis with the full HMM Forward/Backward algorithm. This accelerated pipeline is implemented in the freely available HMMER3 software package. Performance benchmarks show that the use of the heuristic MSV filter sacrifices negligible sensitivity compared to unaccelerated profile HMM searches. HMMER3 is substantially more sensitive and 100- to 1000-fold faster than HMMER2. HMMER3 is now about as fast as BLAST for protein searches. PMID:22039361

  3. Heuristic-based scheduling algorithm for high level synthesis

    NASA Technical Reports Server (NTRS)

    Mohamed, Gulam; Tan, Han-Ngee; Chng, Chew-Lye

    1992-01-01

    A new scheduling algorithm is proposed which uses a combination of a resource utilization chart, a heuristic algorithm to estimate the minimum number of hardware units based on operator mobilities, and a list-scheduling technique to achieve fast and near-optimal schedules. The schedule time of this algorithm is almost independent of the length of operator mobilities, as can be seen from the benchmark example presented (a fifth-order digital elliptic wave filter) when the cycle time was increased from 17 to 18 and then to 21 cycles. It is implemented in C on a SUN3/60 workstation.

  4. A method for brain 3D surface reconstruction from MR images

    NASA Astrophysics Data System (ADS)

    Zhao, De-xin

    2014-09-01

    Because encephalic tissues are highly irregular, three-dimensional (3D) modeling of the brain always leads to complicated computation. In this paper, we explore an efficient method for brain surface reconstruction from magnetic resonance (MR) images of the head, which is helpful for surgery planning and tumor localization. A heuristic algorithm is proposed for surface triangle mesh generation with preserved features, in which the diagonal length is used as the heuristic information to optimize the shape of the triangles. The experimental results show that our approach not only reduces the computational complexity, but also completes the 3D visualization with good quality.

  5. Urban Growth Modeling Using Cellular Automata with Multi-Temporal Remote Sensing Images Calibrated by the Artificial Bee Colony Optimization Algorithm.

    PubMed

    Naghibi, Fereydoun; Delavar, Mahmoud Reza; Pijanowski, Bryan

    2016-12-14

    Cellular Automata (CA) is one of the most common techniques used to simulate the urbanization process. CA-based urban models use transition rules to deliver spatial patterns of urban growth and urban dynamics over time. Determining the optimum transition rules of the CA is a critical step because of the heterogeneity and nonlinearities existing among urban growth driving forces. Recently, new CA models integrated with optimization methods based on swarm intelligence algorithms were proposed to overcome this drawback. The Artificial Bee Colony (ABC) algorithm is an advanced meta-heuristic swarm intelligence-based algorithm. Here, we propose a novel CA-based urban change model that uses the ABC algorithm to extract optimum transition rules. We applied the proposed ABC-CA model to simulate future urban growth in Urmia (Iran) with multi-temporal Landsat images from 1997, 2006 and 2015. Validation of the simulation results was made through statistical methods such as overall accuracy, the figure of merit and total operating characteristics (TOC). Additionally, we calibrated the CA model by ant colony optimization (ACO) to assess the performance of our proposed model versus similar swarm intelligence algorithm methods. We showed that the overall accuracy and the figure of merit of the ABC-CA model are 90.1% and 51.7%, which are 2.9% and 8.8% higher than those of the ACO-CA model, respectively. Moreover, the allocation disagreement of the simulation results for the ABC-CA model is 9.9%, which is 2.9% less than that of the ACO-CA model. Finally, the ABC-CA model also outperforms the ACO-CA model with fewer quantity and allocation errors and slightly more hits.
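
    For readers unfamiliar with ABC, a minimal continuous-optimization version is sketched below (employed and scout phases only, omitting the onlooker's probabilistic selection). It is a generic optimizer, not the paper's ABC-CA calibration, whose objective instead scores candidate transition rules against observed urban change.

      import numpy as np

      def abc_minimize(f, bounds, n_bees=20, limit=10, n_iter=100, seed=0):
          """Minimal Artificial Bee Colony sketch for continuous minimization.
          Simplified: employed and scout phases only; real ABC also has an
          onlooker phase with fitness-proportional source selection."""
          rng = np.random.default_rng(seed)
          lo, hi = np.array(bounds, dtype=float).T
          dim = len(bounds)
          food = rng.uniform(lo, hi, (n_bees, dim))   # food sources = candidates
          fit = np.apply_along_axis(f, 1, food)
          trials = np.zeros(n_bees, dtype=int)
          for _ in range(n_iter):
              for i in range(n_bees):                 # employed-bee phase
                  k = rng.integers(n_bees)
                  while k == i:
                      k = rng.integers(n_bees)
                  j = rng.integers(dim)
                  cand = food[i].copy()
                  cand[j] += rng.uniform(-1, 1) * (food[i, j] - food[k, j])
                  cand = np.clip(cand, lo, hi)
                  if f(cand) < fit[i]:
                      food[i], fit[i] = cand, f(cand)
                      trials[i] = 0
                  else:
                      trials[i] += 1
              for i in np.where(trials > limit)[0]:   # scout phase: abandon stagnant sources
                  food[i] = rng.uniform(lo, hi)
                  fit[i] = f(food[i])
                  trials[i] = 0
          best = fit.argmin()
          return food[best], fit[best]

      # toy usage: minimize the sphere function over [-5, 5]^3
      x, v = abc_minimize(lambda x: float(np.sum(x**2)), bounds=[(-5, 5)] * 3)
      print(x, v)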

  6. Urban Growth Modeling Using Cellular Automata with Multi-Temporal Remote Sensing Images Calibrated by the Artificial Bee Colony Optimization Algorithm

    PubMed Central

    Naghibi, Fereydoun; Delavar, Mahmoud Reza; Pijanowski, Bryan

    2016-01-01

    Cellular Automata (CA) is one of the most common techniques used to simulate the urbanization process. CA-based urban models use transition rules to deliver spatial patterns of urban growth and urban dynamics over time. Determining the optimum transition rules of the CA is a critical step because of the heterogeneity and nonlinearities existing among urban growth driving forces. Recently, new CA models integrated with optimization methods based on swarm intelligence algorithms were proposed to overcome this drawback. The Artificial Bee Colony (ABC) algorithm is an advanced meta-heuristic swarm intelligence-based algorithm. Here, we propose a novel CA-based urban change model that uses the ABC algorithm to extract optimum transition rules. We applied the proposed ABC-CA model to simulate future urban growth in Urmia (Iran) with multi-temporal Landsat images from 1997, 2006 and 2015. Validation of the simulation results was made through statistical methods such as overall accuracy, the figure of merit and total operating characteristics (TOC). Additionally, we calibrated the CA model by ant colony optimization (ACO) to assess the performance of our proposed model versus similar swarm intelligence algorithm methods. We showed that the overall accuracy and the figure of merit of the ABC-CA model are 90.1% and 51.7%, which are 2.9% and 8.8% higher than those of the ACO-CA model, respectively. Moreover, the allocation disagreement of the simulation results for the ABC-CA model is 9.9%, which is 2.9% less than that of the ACO-CA model. Finally, the ABC-CA model also outperforms the ACO-CA model with fewer quantity and allocation errors and slightly more hits. PMID:27983633

  7. Optimal shortening of uniform covering arrays

    PubMed Central

    Rangel-Valdez, Nelson; Avila-George, Himer; Carrizalez-Turrubiates, Oscar

    2017-01-01

    Software test suites based on the concept of interaction testing are very useful for testing software components in an economical way. Test suites of this kind may be created using mathematical objects called covering arrays. A covering array, denoted by CA(N; t, k, v), is an N × k array over Z_v = {0, ..., v-1} with the property that every N × t sub-array covers all t-tuples of Z_v^t at least once. Covering arrays can be used to test systems in which failures occur as a result of interactions among components or subsystems. They are often used in areas such as hardware Trojan detection, software testing, and network design. Because system testing is expensive, it is critical to reduce the amount of testing required. This paper addresses the Optimal Shortening of Covering ARrays (OSCAR) problem, an optimization problem whose objective is to construct, from an existing covering array matrix of uniform level, an array with dimensions of (N − δ) × (k − Δ) such that the number of missing t-tuples is minimized. Two applications of the OSCAR problem are (a) to produce smaller covering arrays from larger ones and (b) to obtain quasi-covering arrays (covering arrays in which the number of missing t-tuples is small) to be used as input to a meta-heuristic algorithm that produces covering arrays. In addition, it is proven that the OSCAR problem is NP-complete, and twelve different algorithms are proposed to solve it. An experiment was performed on 62 problem instances, and the results demonstrate the effectiveness of solving the OSCAR problem to facilitate the construction of new covering arrays. PMID:29267343
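
    The OSCAR objective, the number of missing t-tuples, is straightforward to compute directly; the sketch below does so by enumerating column combinations. It illustrates the quantity being minimized, not any of the paper's twelve algorithms.

      from itertools import combinations

      def missing_tuples(A, t, v):
          """Count t-tuples not covered by array A (the OSCAR objective).
          A is an N x k list of rows over {0, ..., v-1}; for every combination
          of t columns, each of the v**t value tuples must appear in some row."""
          k = len(A[0])
          missing = 0
          for cols in combinations(range(k), t):
              seen = {tuple(row[c] for c in cols) for row in A}
              missing += v ** t - len(seen)
          return missing

      # toy usage: a CA(4; 2, 3, 2) -- every pair of columns covers all 4 pairs
      ca = [[0, 0, 0],
            [0, 1, 1],
            [1, 0, 1],
            [1, 1, 0]]
      print(missing_tuples(ca, t=2, v=2))      # 0: nothing missing
      print(missing_tuples(ca[:3], t=2, v=2))  # dropping a row leaves 3 gaps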

  8. An effective PSO-based memetic algorithm for flow shop scheduling.

    PubMed

    Liu, Bo; Wang, Ling; Jin, Yi-Hui

    2007-02-01

    This paper proposes an effective particle swarm optimization (PSO)-based memetic algorithm (MA) for the permutation flow shop scheduling problem (PFSSP) with the objective to minimize the maximum completion time, which is a typical non-deterministic polynomial-time (NP) hard combinatorial optimization problem. In the proposed PSO-based MA (PSOMA), both PSO-based searching operators and some special local searching operators are designed to balance the exploration and exploitation abilities. In particular, the PSOMA applies the evolutionary searching mechanism of PSO, which is characterized by individual improvement, population cooperation, and competition to effectively perform exploration. On the other hand, the PSOMA utilizes several adaptive local searches to perform exploitation. First, to make PSO suitable for solving PFSSP, a ranked-order value rule based on random key representation is presented to convert the continuous position values of particles to job permutations. Second, to generate an initial swarm with certain quality and diversity, the famous Nawaz-Enscore-Ham (NEH) heuristic is incorporated into the initialization of the population. Third, to balance the exploration and exploitation abilities, after the standard PSO-based searching operation, a new local search technique named NEH_1 insertion is probabilistically applied to some good particles selected by using a roulette wheel mechanism with a specified probability. Fourth, to enrich the searching behaviors and to avoid premature convergence, a simulated annealing (SA)-based local search with multiple different neighborhoods is designed and incorporated into the PSOMA. Meanwhile, an effective adaptive meta-Lamarckian learning strategy is employed to decide which neighborhood to use in the SA-based local search. Finally, to further enhance the exploitation ability, a pairwise-based local search is applied after the SA-based search. Simulation results based on benchmarks demonstrate the effectiveness of the PSOMA. Additionally, the effects of some parameters on optimization performances are also discussed.
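
    The ranked-order value rule amounts to ranking the position vector: the dimension holding the smallest continuous value receives the first rank, and the ranks form the job permutation. A one-function illustration of this random-key idea (not the authors' exact code):

    ```python
    def rov_permutation(position):
        """Map a particle's continuous position vector to a job permutation.
        The dimension with the smallest value receives rank 0, and so on."""
        order = sorted(range(len(position)), key=position.__getitem__)
        perm = [0] * len(position)
        for rank, dim in enumerate(order):
            perm[dim] = rank
        return perm

    print(rov_permutation([0.8, 0.1, 0.5]))  # -> [2, 0, 1]
    ```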

  9. A meta-analysis of mismatch negativity in children with attention deficit-hyperactivity disorders.

    PubMed

    Cheng, Chia-Hsiung; Chan, Pei-Ying S; Hsieh, Yu-Wei; Chen, Kuan-Fu

    2016-01-26

    Mismatch negativity (MMN) is an optimal neurophysiological signal to assess the integrity of auditory sensory memory and involuntary attention switching. The generation of MMN is independent of overt behavioral requirements, concentration or motivation, and thus serves as a suitable tool to study perceptual function in children with attention deficit-hyperactivity disorder (ADHD). However, it remains unclear whether children with ADHD show altered MMN responses. We therefore performed a meta-analysis of peer-reviewed MMN studies that targeted both typically developing children and children with ADHD to examine the pooled effect size. Articles published between 1990 and 2014 were searched in PubMed, Medline, Cochrane, and CINAHL. The mean effect size and a 95% confidence interval (CI) were estimated. Six studies, consisting of 10 individual investigations, were included in the final analysis. A significant effect size of 0.28 was found (p=0.028, 95% CI 0.03-0.53). These results were also free from publication bias and heterogeneity. In conclusion, our meta-analysis suggests that children with ADHD demonstrate a reduced MMN amplitude compared to healthy controls. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  10. An international road map to improve pain assessment in people with impaired cognition: the development of the Pain Assessment in Impaired Cognition (PAIC) meta-tool.

    PubMed

    Corbett, Anne; Achterberg, Wilco; Husebo, Bettina; Lobbezoo, Frank; de Vet, Henrica; Kunz, Miriam; Strand, Liv; Constantinou, Marios; Tudose, Catalina; Kappesser, Judith; de Waal, Margot; Lautenbacher, Stefan

    2014-12-10

    Pain is common in people with dementia, yet identification is challenging. A number of pain assessment tools exist, utilizing observation of pain-related behaviours, vocalizations and facial expressions. Whilst they have been developed robustly, these tools often lack sufficient evidence of psychometric properties, such as reliability, face and construct validity, responsiveness and usability, and are not internationally implemented. The EU-COST initiative "Pain in impaired cognition, especially dementia" aims to combine the expertise of clinicians and researchers to address this important issue by building on previous research in the area, identifying existing pain assessment tools for dementia, and developing consensus on items for a new universal meta-tool for use in research and clinical settings. This paper reports on the initial phase of this collaborative task. All existing observational pain behaviour tools were identified and their elements categorised using a three-step reduction process. Selection and refinement of items for the draft Pain Assessment in Impaired Cognition (PAIC) meta-tool was achieved through scrutiny of the evidence, consensus of expert opinion, frequency of use and alignment with the American Geriatric Society (AGS) guidelines. The main aim of this process was to identify key items with potential empirical, rather than theoretical, value to take forward for testing. Twelve eligible assessment tools were identified, and pain items were categorised into behaviour, facial expression and vocalisation in line with the AGS guidelines (Domains 1-3). These were refined to create the PAIC meta-tool for validation and further refinement. A decision was made to create a comprehensive toolkit to support the core assessment tool, providing additional resources for the assessment of overlapping symptoms in dementia, including AGS Domains 4-6, identification of specific types of pain and assessment of the duration and location of pain. This multidisciplinary, cross-cultural initiative has created a draft meta-tool for capturing pain behaviour to be used across languages and cultures, based on the most promising items used in existing tools. The draft PAIC meta-tool will now be taken forward for evaluation according to the COSMIN guidelines and the EU-COST protocol in order to exclude invalid items, refine included items and optimise the meta-tool.

  11. Exact and Heuristic Algorithms for Runway Scheduling

    NASA Technical Reports Server (NTRS)

    Malik, Waqar A.; Jung, Yoon C.

    2016-01-01

    This paper explores the Single Runway Scheduling (SRS) problem with arrivals, departures, and crossing aircraft on the airport surface. Constraints for wake vortex separations, departure area navigation separations, and departure time window restrictions are explicitly considered. The main objective of this research is to develop exact and heuristic algorithms that can be used in real-time decision support tools for Air Traffic Control Tower (ATCT) controllers. The paper provides a multi-objective dynamic programming (DP) based algorithm that finds the exact solution to the SRS problem, but may prove unusable in a real-time environment due to large computation times for moderately sized problems. We next propose a second algorithm that uses heuristics to restrict the search space of the DP-based algorithm. A third algorithm based on a combination of insertion and local search (ILS) heuristics is then presented. Simulations conducted for the east side of Dallas/Fort Worth International Airport allow comparison of the three proposed algorithms and indicate that the ILS algorithm performs favorably in its ability to find efficient solutions and in its computation times.
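
    For any fixed sequence, the implied runway times follow from a forward recursion over pairwise separation requirements; this recursion is the basic building block inside both a DP over sequences and the heuristics. An illustrative sketch (the separation matrix and release times are invented values, and real wake-vortex rules are richer):

    ```python
    def sequence_times(seq, release, sep, wclass):
        """Earliest runway-use times for a fixed aircraft sequence.
        release : earliest feasible time per aircraft
        sep     : sep[a][b] = minimum gap when class a leads class b
        wclass  : weight-class index per aircraft"""
        times, prev = {}, None
        for ac in seq:
            t = release[ac]
            if prev is not None:
                t = max(t, times[prev] + sep[wclass[prev]][wclass[ac]])
            times[ac] = t
            prev = ac
        return times

    # Two classes (0 = heavy, 1 = small); a heavy leading a small needs a longer gap.
    sep = [[60, 120],
           [60, 60]]
    print(sequence_times([0, 1, 2], release=[0, 30, 40], sep=sep, wclass=[0, 1, 1]))
    ```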

  12. Regional Value Analysis at Threat Evaluation

    DTIC Science & Technology

    2014-06-01

    ...targets based on information entropy and fuzzy optimization theory. In: Industrial Engineering and Engineering Management (IEEM), 2011 IEEE... Assignment by Virtual Permutation and Tabu Search Heuristics. In: IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, 2010

  13. Massively Parallel Dantzig-Wolfe Decomposition Applied to Traffic Flow Scheduling

    NASA Technical Reports Server (NTRS)

    Rios, Joseph Lucio; Ross, Kevin

    2009-01-01

    Optimal scheduling of air traffic over the entire National Airspace System is a computationally difficult task. To speed computation, Dantzig-Wolfe decomposition is applied to a known linear integer programming approach for assigning delays to flights. The optimization model is proven to have the block-angular structure necessary for Dantzig-Wolfe decomposition. The subproblems for this decomposition are solved in parallel via independent computation threads. Experimental evidence suggests that as the number of subproblems/threads increases (and their respective sizes decrease), the solution quality, convergence, and runtime improve. A demonstration of this is provided by using one flight per subproblem, which is the finest possible decomposition. This results in thousands of subproblems and associated computation threads. This massively parallel approach is compared to one with few threads and to standard (non-decomposed) approaches in terms of solution quality and runtime. Since this method generally provides a non-integral (relaxed) solution to the original optimization problem, two heuristics are developed to generate an integral solution. Dantzig-Wolfe followed by these heuristics can provide a near-optimal (sometimes optimal) solution to the original problem hundreds of times faster than standard (non-decomposed) approaches. In addition, when massive decomposition is employed, the solution is shown to be more likely integral, which obviates the need for an integerization step. These results indicate that nationwide, real-time, high fidelity, optimal traffic flow scheduling is achievable for (at least) 3 hour planning horizons.

  14. MetaCRAST: reference-guided extraction of CRISPR spacers from unassembled metagenomes.

    PubMed

    Moller, Abraham G; Liang, Chun

    2017-01-01

    Clustered regularly interspaced short palindromic repeat (CRISPR) systems are the adaptive immune systems of bacteria and archaea against viral infection. While CRISPRs have been exploited as a tool for genetic engineering, their spacer sequences can also provide valuable insights into microbial ecology by linking environmental viruses to their microbial hosts. Despite this importance, metagenomic CRISPR detection remains a major challenge. Here we present a reference-guided CRISPR spacer detection tool (Metagenomic CRISPR Reference-Aided Search Tool, MetaCRAST) that constrains searches based on user-specified direct repeats (DRs). These DRs can be expected from assembly or from taxonomic profiles of metagenomes. We compared the performance of MetaCRAST to those of two existing metagenomic CRISPR detection tools, Crass and MinCED, using both real and simulated acid mine drainage (AMD) and enhanced biological phosphorus removal (EBPR) metagenomes. Our evaluation shows that MetaCRAST improves CRISPR spacer detection in real metagenomes compared to the de novo CRISPR detection methods Crass and MinCED. Evaluation on simulated metagenomes shows that it performs better than the de novo tools for Illumina metagenomes and comparably for 454 metagenomes. Its dependence on read length and community composition, its run time, and its accuracy are also comparable to those of these tools. MetaCRAST is implemented in Perl, parallelizable through the Many Core Engine (MCE), and takes metagenomic sequence reads and direct repeat queries (FASTA or FASTQ) as input. It is freely available for download at https://github.com/molleraj/MetaCRAST.
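
    The underlying reference-guided idea, locating the user-supplied direct repeats in a read and reporting the sequence between consecutive hits as a candidate spacer, can be illustrated in a few lines. This is a conceptual Python sketch with exact matching only, not the MetaCRAST Perl implementation (which also tolerates mismatches and parallelizes via MCE):

    ```python
    import re

    def candidate_spacers(read, dr, min_len=20, max_len=60):
        """Substrings between consecutive exact matches of direct repeat dr."""
        hits = [m.start() for m in re.finditer(re.escape(dr), read)]
        spacers = []
        for a, b in zip(hits, hits[1:]):
            spacer = read[a + len(dr):b]
            if min_len <= len(spacer) <= max_len:
                spacers.append(spacer)
        return spacers

    # Synthetic read: repeat-spacer-repeat-spacer-repeat.
    read = "GTTTTAGA" + "ACGT" * 6 + "GTTTTAGA" + "TTGCA" * 5 + "GTTTTAGA"
    print(candidate_spacers(read, "GTTTTAGA"))
    ```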

  15. Two-machine flow shop scheduling integrated with preventive maintenance planning

    NASA Astrophysics Data System (ADS)

    Wang, Shijin; Liu, Ming

    2016-02-01

    This paper investigates an integrated optimisation problem of production scheduling and preventive maintenance (PM) in a two-machine flow shop, with the time to failure of each machine subject to a Weibull probability distribution. The objective is to find the optimal job sequence and the optimal PM decisions before each job such that the expected makespan is minimised. To investigate the value of the integrated scheduling solution, computational experiments on small-scale problems with different configurations are conducted with a total enumeration method, and the results are compared with those of scheduling without maintenance but with machine degradation, and of individual job scheduling combined with independent PM planning. Then, for large-scale problems, four genetic algorithm (GA) based heuristics are proposed. The numerical results for several large problem sizes and different configurations indicate the potential benefits of the integrated scheduling solution, and also show that the proposed GA-based heuristics are efficient for the integrated problem.
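
    The deterministic kernel of such an evaluation, the makespan of a job sequence on two machines, is a short recursion over completion times; in the integrated problem, a PM decision before a job would add the maintenance duration and change the expected processing time. A minimal sketch of the kernel only (the Weibull expected-makespan machinery is omitted):

    ```python
    def two_machine_makespan(seq, p1, p2):
        """Completion-time recursion for a two-machine permutation flow shop."""
        c1 = c2 = 0
        for j in seq:
            c1 += p1[j]               # machine 1 processes jobs back to back
            c2 = max(c1, c2) + p2[j]  # machine 2 waits for the job and for itself
        return c2

    p1, p2 = [3, 5, 1], [6, 2, 2]
    print(two_machine_makespan([2, 0, 1], p1, p2))  # Johnson-style order -> 12
    ```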

  16. Simultaneous delivery time and aperture shape optimization for the volumetric-modulated arc therapy (VMAT) treatment planning problem

    NASA Astrophysics Data System (ADS)

    Mahnam, Mehdi; Gendreau, Michel; Lahrichi, Nadia; Rousseau, Louis-Martin

    2017-07-01

    In this paper, we propose a novel heuristic algorithm for the volumetric-modulated arc therapy treatment planning problem, optimizing the trade-off between delivery time and treatment quality. We present a new mixed integer programming model in which the multi-leaf collimator leaf positions, gantry speed, and dose rate are determined simultaneously. Our heuristic is based on column generation; the aperture configuration is modeled in the columns and the dose distribution and time restriction in the rows. To reduce the number of voxels and increase the efficiency of the master model, we aggregate similar voxels using a clustering technique. The efficiency of the algorithm and the treatment quality are evaluated on a benchmark clinical prostate cancer case. The computational results show that a high-quality treatment is achievable using a four-thread CPU. Finally, we analyze the effects of the various parameters and two leaf-motion strategies.

  17. Adaptive decision rules for the acquisition of nature reserves.

    PubMed

    Turner, Will R; Wilcove, David S

    2006-04-01

    Although reserve-design algorithms have shown promise for increasing the efficiency of conservation planning, recent work casts doubt on the usefulness of some of these approaches in practice. Using three data sets that vary widely in size and complexity, we compared various decision rules for acquiring reserve networks over multiyear periods. We explored three factors that are often important in real-world conservation efforts: uncertain availability of sites for acquisition, degradation of sites, and overall budget constraints. We evaluated the relative strengths and weaknesses of existing optimal and heuristic decision rules and developed a new set of adaptive decision rules that combine the strengths of existing optimal and heuristic approaches. All three of the new adaptive rules performed better than the existing rules we tested under virtually all scenarios of site availability, site degradation, and budget constraints. Moreover, the adaptive rules required no additional data beyond what was readily available and were relatively easy to compute.

  18. Minimal-delay traffic grooming for WDM star networks

    NASA Astrophysics Data System (ADS)

    Choi, Hongsik; Garg, Nikhil; Choi, Hyeong-Ah

    2003-10-01

    All-optical networks face the challenge of reducing slower opto-electronic conversions by managing the assignment of traffic streams to wavelengths in an intelligent manner, while at the same time utilizing bandwidth resources to the maximum. This challenge becomes harder in networks closer to the end users, which carry traffic streams that individually cannot saturate a single wavelength and that outnumber the usable wavelengths; the resulting traffic grooming requires costly traffic analysis at access nodes. We study the problem of traffic grooming that reduces the need to analyze traffic, for the class of network architecture most used in Metropolitan Area Networks: the star network. Since the problem is NP-complete, we provide an efficient greedy heuristic with a twice-optimal bound, which can be used to intelligently groom traffic at the LANs and reduce latency at the access nodes. Simulation results show that our greedy heuristic achieves a near-optimal solution.
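
    To convey the flavor of such a greedy grooming heuristic (this is not the authors' algorithm), a first-fit-decreasing packing of sub-wavelength traffic streams onto wavelengths looks like this:

    ```python
    def first_fit_decreasing(demands, capacity):
        """Greedily pack traffic streams onto wavelengths of fixed capacity.
        Returns one list of stream indices per wavelength used."""
        order = sorted(range(len(demands)), key=lambda i: -demands[i])
        groups, loads = [], []
        for i in order:
            for g, load in enumerate(loads):
                if load + demands[i] <= capacity:
                    groups[g].append(i)
                    loads[g] += demands[i]
                    break
            else:                       # no wavelength fits: open a new one
                groups.append([i])
                loads.append(demands[i])
        return groups

    print(first_fit_decreasing([3, 7, 2, 5, 4], capacity=10))
    ```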

  19. Self-Adaptive Stepsize Search Applied to Optimal Structural Design

    NASA Astrophysics Data System (ADS)

    Nolle, L.; Bland, J. A.

    Structural engineering often involves the design of space frames that are required to resist predefined external forces without exhibiting plastic deformation. The weight of the structure, and hence the weight of its constituent members, has to be as low as possible for economic reasons without violating any of the load constraints. Design spaces are usually vast and the computational costs of analyzing a single design are usually high. Therefore, not every possible design can be evaluated for real-world problems. In this work, a standard structural design problem, the 25-bar problem, has been solved using self-adaptive stepsize search (SASS), a relatively new search heuristic. This algorithm has only one control parameter and therefore overcomes a drawback of modern search heuristics, i.e. the need to first find a set of optimum control parameter settings for the problem at hand. In this work, SASS outperforms simulated annealing, genetic algorithms, tabu search and ant colony optimization.
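
    The defining feature of SASS, a step size that adapts itself from the search's own feedback, can be sketched in a few lines. A toy illustration on a sphere test function (the adaptation factor and the test problem are illustrative choices, not the paper's 25-bar setup):

    ```python
    import random

    def sass_minimize(f, x0, step=1.0, factor=2.0, max_iter=5000):
        """Hill climber with a self-adapting step size: expand the step after an
        improving move, contract it after a failed one; factor is the lone knob."""
        x, fx = list(x0), f(x0)
        for _ in range(max_iter):
            cand = [xi + random.gauss(0, step) for xi in x]
            fc = f(cand)
            if fc < fx:
                x, fx = cand, fc
                step *= factor        # success: search more boldly
            else:
                step /= factor        # failure: search more locally
        return x, fx

    sphere = lambda x: sum(xi * xi for xi in x)
    print(sass_minimize(sphere, [5.0, -3.0, 2.0]))
    ```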

  20. Multidimensional heuristic process for high-yield production of astaxanthin and fragrance molecules in Escherichia coli.

    PubMed

    Zhang, Congqiang; Seow, Vui Yin; Chen, Xixian; Too, Heng-Phon

    2018-05-11

    Optimization of metabolic pathways consisting of a large number of genes is challenging. Multivariate modular methods (MMMs) are currently available solutions, in which reduced regulatory complexity is achieved by grouping multiple genes into modules. However, these methods work well for balancing the inter-module but not the intra-module activities. In addition, application of MMMs to the 15-step heterologous route of astaxanthin biosynthesis has met with limited success. Here, we expand the solution space of MMMs and develop a multidimensional heuristic process (MHP). MHP can simultaneously balance different modules by varying promoter strength and coordinate intra-module activities by using ribosome binding sites (RBSs) and enzyme variants. Consequently, MHP increases enantiopure 3S,3'S-astaxanthin production to 184 mg l^-1 day^-1 or 320 mg l^-1. Similarly, MHP improves the yields of nerolidol and linalool. MHP may be useful for optimizing other complex biochemical pathways.

  1. A New Combinatorial Optimization Approach for Integrated Feature Selection Using Different Datasets: A Prostate Cancer Transcriptomic Study

    PubMed Central

    Puthiyedth, Nisha; Riveros, Carlos; Berretta, Regina; Moscato, Pablo

    2015-01-01

    Background The joint study of multiple datasets has become a common technique for increasing statistical power in detecting biomarkers obtained from smaller studies. The approach generally followed is based on the fact that as the total number of samples increases, we expect to have greater power to detect associations of interest. This methodology has been applied to genome-wide association and transcriptomic studies due to the availability of datasets in the public domain. While this approach is well established in biostatistics, the introduction of new combinatorial optimization models to address this issue has not been explored in depth. In this study, we introduce a new model for the integration of multiple datasets and we show its application in transcriptomics. Methods We propose a new combinatorial optimization problem that addresses the core issue of biomarker detection in integrated datasets. Optimal solutions for this model deliver a feature selection from a panel of prospective biomarkers. The model we propose is a generalised version of the (α,β)-k-Feature Set problem. We illustrate the performance of this new methodology via a challenging meta-analysis task involving six prostate cancer microarray datasets. The results are then compared to the popular RankProd meta-analysis tool and to what can be obtained by analysing the individual datasets by statistical and combinatorial methods alone. Results Application of the integrated method resulted in a more informative signature than the rank-based meta-analysis or individual dataset results, and overcomes problems arising from real world datasets. The set of genes identified is highly significant in the context of prostate cancer. The method used does not rely on homogenisation or transformation of values to a common scale, and at the same time is able to capture markers associated with subgroups of the disease. PMID:26106884

  2. Proteomics Versus Clinical Data and Stochastic Local Search Based Feature Selection for Acute Myeloid Leukemia Patients' Classification.

    PubMed

    Chebouba, Lokmane; Boughaci, Dalila; Guziolowski, Carito

    2018-06-04

    The use of data from high-throughput technologies in drug-target problems has become widespread over the last few decades. This study proposes a meta-heuristic framework using stochastic local search (SLS) combined with random forest (RF), where the aim is to specify the most important genes and proteins leading to the best classification of Acute Myeloid Leukemia (AML) patients. First, we use a stochastic local search meta-heuristic as a feature selection technique to select the most significant proteins to be used in the classification step. Then we apply RF to classify new patients into their corresponding classes. The evaluation technique is to run the RF classifier on the training data to obtain a model; this model is then applied to the test data to predict the appropriate class. We use as metrics the balanced accuracy (BAC) and the area under the receiver operating characteristic curve (AUROC) to measure the performance of our model. The proposed method is evaluated on the dataset from the DREAM 9 challenge. The comparison is made with a pure random forest (without feature selection) and with the two best-ranked results of the DREAM 9 challenge. We used three types of data: only clinical data, only proteomics data, and clinical and proteomics data combined. The numerical results show that the highest scores are obtained when using clinical data alone, and the lowest when using proteomics data alone. Further, our method achieves promising results compared to the methods presented in the DREAM challenge.
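
    The two-stage interplay can be sketched with scikit-learn: a stochastic local search flips one feature in or out and keeps the flip when the cross-validated balanced accuracy of a random forest does not degrade. Illustrative only, on synthetic data rather than the DREAM 9 set:

    ```python
    import random
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    X, y = make_classification(n_samples=200, n_features=40, n_informative=5,
                               random_state=0)

    def score(mask):
        cols = [i for i, b in enumerate(mask) if b]
        if not cols:
            return 0.0
        clf = RandomForestClassifier(n_estimators=50, random_state=0)
        return cross_val_score(clf, X[:, cols], y, cv=3,
                               scoring="balanced_accuracy").mean()

    random.seed(0)
    mask = [random.random() < 0.2 for _ in range(X.shape[1])]
    best = score(mask)
    for _ in range(50):                  # stochastic local search: flip one bit
        j = random.randrange(len(mask))
        mask[j] = not mask[j]
        s = score(mask)
        if s >= best:
            best = s                     # keep the flip
        else:
            mask[j] = not mask[j]        # undo it
    print(best, [i for i, b in enumerate(mask) if b])
    ```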

  3. A genetic meta-algorithm-assisted inversion approach: hydrogeological study for the determination of volumetric rock properties and matrix and fluid parameters in unsaturated formations

    NASA Astrophysics Data System (ADS)

    Szabó, Norbert Péter

    2018-03-01

    An evolutionary inversion approach is suggested for the interpretation of nuclear and resistivity logs measured by direct-push tools in shallow unsaturated sediments. The efficiency of formation evaluation is improved by estimating simultaneously (1) the petrophysical properties that vary rapidly along a drill hole with depth and (2) the zone parameters that can be treated as constant, in one inversion procedure. In the workflow, the fractional volumes of water, air, matrix and clay are estimated in adjacent depths by linearized inversion, whereas the clay and matrix properties are updated using a float-encoded genetic meta-algorithm. The proposed inversion method provides an objective estimate of the zone parameters that appear in the tool response equations applied to solve the forward problem, which can significantly increase the reliability of the petrophysical model as opposed to setting these parameters arbitrarily. The global optimization meta-algorithm not only assures the best fit between the measured and calculated data but also gives a reliable solution, practically independent of the initial model, as laboratory data are unnecessary in the inversion procedure. The feasibility test uses engineering geophysical sounding logs observed in an unsaturated loessy-sandy formation in Hungary. The multi-borehole extension of the inversion technique is developed to determine the petrophysical properties and their estimation errors along a profile of drill holes. The genetic meta-algorithmic inversion method is recommended for hydrogeophysical logging applications of various kinds to automatically extract the volumetric ratios of rock and fluid constituents as well as the most important zone parameters in a reliable inversion procedure.

  4. Diaphragm and Lung Ultrasound to Predict Weaning Outcome: Systematic Review and Meta-Analysis.

    PubMed

    Llamas-Álvarez, Ana M; Tenza-Lozano, Eva M; Latour-Pérez, Jaime

    2017-12-01

    Deciding the optimal timing for extubation in patients who are mechanically ventilated can be challenging, and traditional weaning predictor tools are not very accurate. The aim of this systematic review and meta-analysis was to assess the accuracy of lung and diaphragm ultrasound for predicting weaning outcomes in critically ill adults. MEDLINE, the Cochrane Library, Web of Science, Scopus, LILACS, Teseo, Tesis Doctorales en Red, and OpenGrey were searched, and the bibliographies of relevant studies were reviewed. Two researchers independently selected studies that met the inclusion criteria and assessed study quality in accordance with the Quality Assessment of Diagnostic Accuracy Studies-2 tool. The summary receiver-operating characteristic curve and pooled diagnostic OR (DOR) were estimated by using a bivariate random-effects analysis. Sources of heterogeneity were explored by using predefined subgroup analyses and bivariate meta-regression. Nineteen studies involving 1,071 people were included in the analysis. For diaphragm thickening fraction, the area under the summary receiver-operating characteristic curve was 0.87, and the DOR was 21 (95% CI, 11-40). Regarding diaphragmatic excursion, pooled sensitivity was 75% (95% CI, 65-85); pooled specificity, 75% (95% CI, 60-85); and DOR, 10 (95% CI, 4-24). For lung ultrasound, the area under the summary receiver-operating characteristic curve was 0.77, and the DOR was 38 (95% CI, 7-198). Based on bivariate meta-regression analysis, a significantly higher specificity for diaphragm thickening fraction and a higher sensitivity for diaphragmatic excursion were detected in studies with applicability concerns. Lung and diaphragm ultrasound can help predict weaning outcome, but their accuracy may vary depending on the patient subpopulation. Copyright © 2017 American College of Chest Physicians. Published by Elsevier Inc. All rights reserved.

  5. A simple and fast heuristic for protein structure comparison.

    PubMed

    Pelta, David A; González, Juan R; Moreno Vega, Marcos

    2008-03-25

    Protein structure comparison is a key problem in bioinformatics. Several methods exist for protein comparison, the solution of the Maximum Contact Map Overlap problem (MAX-CMO) being one of the available alternatives. Although this problem may be solved using exact algorithms, researchers require approximate algorithms that obtain good-quality solutions using fewer computational resources than the former. We propose a variable neighborhood search metaheuristic for solving MAX-CMO. We analyze this strategy in two respects: 1) from an optimization point of view, the strategy is tested on two different datasets, obtaining errors of 3.5% (over 2702 pairs) and 1.7% (over 161 pairs) with respect to optimal values, thus leading to highly accurate solutions in a simpler and less expensive way than exact algorithms; 2) in terms of protein structure classification, we conduct experiments on three datasets and show that it is feasible to detect structural similarities at SCOP's family and CATH's architecture levels using normalized overlap values. Some limitations, and the role of normalization, are outlined for classification at SCOP's fold level. We designed, implemented and tested a new tool for solving MAX-CMO, based on a well-known metaheuristic technique. The good balance between solution quality and computational effort makes it a valuable tool. Moreover, to the best of our knowledge, this is the first time the MAX-CMO measure has been tested at SCOP's fold and CATH's architecture levels, with encouraging results.
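
    Variable neighborhood search has a compact generic skeleton: shake the incumbent in the k-th neighborhood, apply local search, and restart from the first neighborhood whenever an improvement is found. A generic sketch with the MAX-CMO-specific moves abstracted into callbacks (a toy one-dimensional objective stands in for contact map overlap):

    ```python
    import random

    def vns(x0, f, shake, local_search, k_max=3, max_iter=100):
        """Generic Variable Neighborhood Search (minimization).
        shake(x, k)     : random solution in the k-th neighborhood of x
        local_search(x) : locally improved solution starting from x"""
        x, fx = x0, f(x0)
        for _ in range(max_iter):
            k = 1
            while k <= k_max:
                xp = local_search(shake(x, k))
                fp = f(xp)
                if fp < fx:
                    x, fx, k = xp, fp, 1  # improvement: back to first neighborhood
                else:
                    k += 1                # none: widen the neighborhood
        return x, fx

    f = lambda x: (x - 3.0) ** 2
    shake = lambda x, k: x + random.uniform(-k, k)
    local_search = lambda x: min((x + d for d in (-0.1, 0.0, 0.1)), key=f)
    print(vns(10.0, f, shake, local_search))
    ```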

  6. Contribution of Geographic Information Systems and location models to planning of wastewater systems.

    PubMed

    Leitão, J P; Matos, J S; Gonçalves, A B; Matos, J L

    2005-01-01

    This paper presents the contributions of Geographic Information Systems (GIS) and location models towards planning regional wastewater systems (sewers and wastewater treatment plants) serving small agglomerations, i.e. agglomerations with fewer than 2,000 inhabitants. The main goal was to develop a decision support tool for tracing and locating regional wastewater systems. The main results of the model are expressed in terms of the number, capacity and location of Wastewater Treatment Plants (WWTP) and the length of main sewers. The decision process concerning the location and capacity of wastewater systems has a number of parameters that can be optimized. These parameters include the total sewer length and the number, capacity and location of WWTP. The optimization of these parameters should lead to the minimization of the construction and operation costs of the integrated system. Location models have been considered as tools for decision support, mainly when a geo-referenced database is available. In these cases, GIS may play an important role in the analysis of data and results, especially in the preliminary stage of planning and design. After selecting the spatial location model and the heuristics, two greedy algorithms were implemented in Visual Basic for Applications in the ArcGIS software environment. To illustrate the application of these algorithms, a case study was developed in a rural area located in the central part of Portugal.

  7. SURE (Science User Resource Expert): A science planning and scheduling assistant for a resource based environment

    NASA Technical Reports Server (NTRS)

    Thalman, Nancy E.; Sparn, Thomas P.

    1990-01-01

    SURE (Science User Resource Expert) is one of three components that compose SURPASS (the Science User Resource Planning and Scheduling System). This system is a planning and scheduling tool which supports distributed planning and scheduling based on resource allocation and optimization. Currently, SURE is being used within SURPASS by the UARS (Upper Atmospheric Research Satellite) SOLSTICE instrument to build a daily science plan and activity schedule, and in a prototyping effort with NASA GSFC to demonstrate distributed planning and scheduling for the SOLSTICE II instrument on the EOS platform. For the SOLSTICE application, SURE utilizes a rule-based system. Developing a rule-based program using Ada CLIPS, as opposed to conventional programming, allows the science planning and scheduling heuristics to be captured in rules and provides flexibility in inserting or removing rules as the scientific objectives and mission constraints change. The SURE system's role as a component of SURPASS, the purpose of the SURE planning and scheduling tool, the SURE knowledge base, and the software architecture of the SURE component are described.

  8. A Scheme to Smooth Aggregated Traffic from Sensors with Periodic Reports

    PubMed Central

    Oh, Sungmin; Jang, Ju Wook

    2017-01-01

    The possibility of smoothing aggregated traffic from sensors with varying reporting periods and frame sizes to be carried on an access link is investigated. A straightforward optimization would take O(p^n) time, whereas our heuristic scheme takes O(np) time, where n and p denote the number of sensors and the size of the periods, respectively. Our heuristic scheme performs local optimization sensor by sensor, proceeding from the smallest period to the largest. This is based on the observation that sensors with larger periods have more choices of offset with which to avoid traffic peaks than sensors with smaller periods. A MATLAB simulation shows that our scheme outperforms the known scheme by M. Grenier et al. in a similar situation (aggregating periodic traffic in a controller area network) for almost all possible permutations. The performance of our scheme is very close to that of the straightforward optimization, which compares all possible permutations. We expect that our scheme will greatly contribute to smoothing the traffic from an ever-increasing number of IoT sensors to the gateway, reducing the burden on the access link to the Internet. PMID:28273831
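
    The sensor-by-sensor local optimization can be sketched directly: visit sensors in order of increasing period and give each one the offset that minimizes the resulting peak load over the hyperperiod. An illustrative sketch with unit frame sizes (not the paper's exact formulation; math.lcm requires Python 3.9+):

    ```python
    from math import lcm

    def assign_offsets(periods):
        """Greedy offset assignment, smallest periods first (unit frame sizes)."""
        H = lcm(*periods)                       # hyperperiod
        load = [0] * H
        offsets = {}
        for s in sorted(range(len(periods)), key=lambda s: periods[s]):
            p = periods[s]
            best = min(range(p),
                       key=lambda o: max(load[t] + (1 if t % p == o else 0)
                                         for t in range(H)))
            offsets[s] = best
            for t in range(best, H, p):
                load[t] += 1
        return offsets, max(load)

    print(assign_offsets([2, 4, 4, 8]))         # offsets chosen to flatten the peak
    ```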

  9. Ant system: optimization by a colony of cooperating agents.

    PubMed

    Dorigo, M; Maniezzo, V; Colorni, A

    1996-01-01

    An analogy with the way ant colonies function has suggested the definition of a new computational paradigm, which we call the ant system (AS). We propose it as a viable new approach to stochastic combinatorial optimization. The main characteristics of this model are positive feedback, distributed computation, and the use of a constructive greedy heuristic. Positive feedback accounts for rapid discovery of good solutions, distributed computation avoids premature convergence, and the greedy heuristic helps find acceptable solutions in the early stages of the search process. We apply the proposed methodology to the classical traveling salesman problem (TSP) and report simulation results. We also discuss parameter selection and the early setups of the model, and compare it with tabu search and simulated annealing on the TSP. To demonstrate the robustness of the approach, we show how the ant system can be applied to other optimization problems such as the asymmetric traveling salesman problem, the quadratic assignment problem, and job-shop scheduling. Finally, we discuss the salient characteristics of the AS: global data structure revision, distributed communication, and probabilistic transitions.
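
    The AS transition rule (move probability proportional to pheromone^alpha times visibility^beta) and the evaporate-then-deposit pheromone update fit in a short sketch. Illustrative parameter values on a 4-city TSP:

    ```python
    import random

    def ant_system_tsp(dist, n_ants=10, n_iter=100, alpha=1.0, beta=2.0,
                       rho=0.5, Q=1.0):
        """Minimal Ant System for the TSP (positive inter-city distances)."""
        n = len(dist)
        tau = [[1.0] * n for _ in range(n)]             # pheromone trails
        best_tour, best_len = None, float("inf")
        for _ in range(n_iter):
            tours = []
            for _ in range(n_ants):
                tour = [random.randrange(n)]
                while len(tour) < n:
                    i = tour[-1]
                    cand = [j for j in range(n) if j not in tour]
                    w = [tau[i][j] ** alpha * (1.0 / dist[i][j]) ** beta
                         for j in cand]
                    tour.append(random.choices(cand, weights=w)[0])
                length = sum(dist[tour[k]][tour[(k + 1) % n]] for k in range(n))
                tours.append((tour, length))
                if length < best_len:
                    best_tour, best_len = tour[:], length
            for i in range(n):                          # evaporation
                for j in range(n):
                    tau[i][j] *= 1.0 - rho
            for tour, length in tours:                  # deposit Q/L on used edges
                for k in range(n):
                    a, b = tour[k], tour[(k + 1) % n]
                    tau[a][b] += Q / length
                    tau[b][a] += Q / length
        return best_tour, best_len

    dist = [[0, 2, 9, 10],
            [2, 0, 6, 4],
            [9, 6, 0, 8],
            [10, 4, 8, 0]]
    print(ant_system_tsp(dist))
    ```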

  10. A Scheme to Smooth Aggregated Traffic from Sensors with Periodic Reports.

    PubMed

    Oh, Sungmin; Jang, Ju Wook

    2017-03-03

    The possibility of smoothing aggregated traffic from sensors with varying reporting periods and frame sizes to be carried on an access link is investigated. A straightforward optimization would take O(p^n) time, whereas our heuristic scheme takes O(np) time, where n and p denote the number of sensors and the size of the periods, respectively. Our heuristic scheme performs local optimization sensor by sensor, proceeding from the smallest period to the largest. This is based on the observation that sensors with larger periods have more choices of offset with which to avoid traffic peaks than sensors with smaller periods. A MATLAB simulation shows that our scheme outperforms the known scheme by M. Grenier et al. in a similar situation (aggregating periodic traffic in a controller area network) for almost all possible permutations. The performance of our scheme is very close to that of the straightforward optimization, which compares all possible permutations. We expect that our scheme will greatly contribute to smoothing the traffic from an ever-increasing number of IoT sensors to the gateway, reducing the burden on the access link to the Internet.

  11. Learning as a Problem Solving Tool. Technical Report CS74018-R.

    ERIC Educational Resources Information Center

    Claybrook, Billy G.

    This paper explores the use of learning as a practical tool in problem solving. The idea that learning should and eventually will be a vital component of most Artificial Intelligence programs is pursued. Current techniques in learning systems are compared. A detailed discussion of the problems of representing, modifying, and creating heuristics is…

  12. Students' Understanding of Chemical Formulae: A Review of Empirical Research

    ERIC Educational Resources Information Center

    Taskin, Vahide; Bernholt, Sascha

    2014-01-01

    The fluent use of the chemical language is a major tool for successfully passing chemistry courses at school or university as well as for working as a chemist, since chemical formulae are both a descriptive and a heuristic tool. However, numerous studies have revealed remarkable difficulties of students with chemical formulae both at school and at…

  13. Integrating automated structured analysis and design with Ada programming support environments

    NASA Technical Reports Server (NTRS)

    Hecht, Alan; Simmons, Andy

    1986-01-01

    Ada Programming Support Environments (APSE) include many powerful tools that address the implementation of Ada code. These tools do not address the entire software development process. Structured analysis is a methodology that addresses the creation of complete and accurate system specifications. Structured design takes a specification, derives a plan to decompose the system into subcomponents, and provides heuristics to optimize the software design to minimize errors and maintenance. It can also support the creation of usable modules. Studies have shown that most software errors result from poor system specifications, and that these errors become more expensive to fix as the development process continues. Structured analysis and design help to uncover errors in the early stages of development. The APSE tools help to ensure that the code produced is correct, and aid in finding obscure coding errors. However, they do not have the capability to detect errors in specifications or to detect poor designs. An automated system for structured analysis and design, TEAMWORK, which can be integrated with an APSE to support software systems development from specification through implementation, is described. These tools complement each other to help developers improve quality and productivity, as well as to reduce development and maintenance costs. Complete system documentation and reusable code also result from the use of these tools. Integrating an APSE with automated tools for structured analysis and design provides capabilities and advantages beyond those realized with any of these systems used by themselves.

  14. Gravity inversion of a fault by Particle swarm optimization (PSO).

    PubMed

    Toushmalani, Reza

    2013-01-01

    Particle swarm optimization (PSO) is a heuristic global optimization algorithm based on swarm intelligence; it derives from research on the movement behavior of bird flocks and fish schools. In this paper we introduce and use this method for the gravity inverse problem. We discuss the solution of the inverse problem of determining the shape of a fault whose gravity anomaly is known. Application of the proposed algorithm to this problem has proven its capability to deal with difficult optimization problems. The technique proved to work efficiently when tested on a number of models.
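
    The canonical PSO velocity and position update that such a study builds on is short. A minimal sketch on a toy misfit (inertia and acceleration coefficients are common textbook values; the gravity forward model for the fault is not reproduced here):

    ```python
    import random

    def pso_minimize(f, bounds, n_particles=20, n_iter=200, w=0.7, c1=1.5, c2=1.5):
        dim = len(bounds)
        X = [[random.uniform(lo, hi) for lo, hi in bounds]
             for _ in range(n_particles)]
        V = [[0.0] * dim for _ in range(n_particles)]
        P, pf = [x[:] for x in X], [f(x) for x in X]     # personal bests
        g = min(range(n_particles), key=pf.__getitem__)
        G, gf = P[g][:], pf[g]                           # global best
        for _ in range(n_iter):
            for i in range(n_particles):
                for d in range(dim):
                    r1, r2 = random.random(), random.random()
                    V[i][d] = (w * V[i][d] + c1 * r1 * (P[i][d] - X[i][d])
                               + c2 * r2 * (G[d] - X[i][d]))
                    lo, hi = bounds[d]
                    X[i][d] = min(max(X[i][d] + V[i][d], lo), hi)
                fx = f(X[i])
                if fx < pf[i]:
                    P[i], pf[i] = X[i][:], fx
                    if fx < gf:
                        G, gf = X[i][:], fx
        return G, gf

    # Toy misfit with a known minimum standing in for the gravity-anomaly fit.
    print(pso_minimize(lambda z: (z[0] - 2.5) ** 2 + (z[1] + 1.0) ** 2,
                       bounds=[(0, 10), (-5, 5)]))
    ```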

  15. MetaABC--an integrated metagenomics platform for data adjustment, binning and clustering.

    PubMed

    Su, Chien-Hao; Hsu, Ming-Tsung; Wang, Tse-Yi; Chiang, Sufeng; Cheng, Jen-Hao; Weng, Francis C; Kao, Cheng-Yan; Wang, Daryi; Tsai, Huai-Kuang

    2011-08-15

    MetaABC is a metagenomic platform that integrates several binning tools coupled with methods for removing artifacts, analyzing unassigned reads and controlling sampling biases. It allows users to arrive at a better interpretation via a series of distinct combinations of analysis tools. After execution, MetaABC provides outputs in various visual formats such as tables, pie and bar charts, and clustering result diagrams. Availability: MetaABC source code and documentation are available at http://bits2.iis.sinica.edu.tw/MetaABC/. Contact: dywang@gate.sinica.edu.tw; hktsai@iis.sinica.edu.tw. Supplementary data are available at Bioinformatics online.

  16. Column generation algorithms for virtual network embedding in flexi-grid optical networks.

    PubMed

    Lin, Rongping; Luo, Shan; Zhou, Jingwei; Wang, Sheng; Chen, Bin; Zhang, Xiaoning; Cai, Anliang; Zhong, Wen-De; Zukerman, Moshe

    2018-04-16

    Network virtualization provides means for efficient management of network resources by embedding multiple virtual networks (VNs) that efficiently share the same substrate network. Such virtual network embedding (VNE) gives rise to a challenging problem: how to optimize resource allocation to VNs and guarantee their performance requirements. In this paper, we provide VNE algorithms for efficient management of flexi-grid optical networks. We provide an exact algorithm aiming to minimize the total embedding cost, in terms of spectrum cost and computation cost, for a single VN request. Then, to achieve scalability, we also develop a heuristic algorithm for the same problem. We apply these two algorithms to a dynamic traffic scenario where many VN requests arrive one by one. We first demonstrate by simulations, for the case of a six-node network, that the heuristic algorithm obtains blocking probabilities very close to those of the exact algorithm (about 0.2% higher). Then, for a network of realistic size (namely, USnet), we demonstrate that the blocking probability of our new heuristic algorithm is about one order of magnitude lower than that of a simpler heuristic algorithm, which was a component of an earlier published algorithm.

  17. Fitness landscapes, heuristics and technological paradigms: A critique on random search models in evolutionary economics

    NASA Astrophysics Data System (ADS)

    Frenken, Koen

    2001-06-01

    The biological evolution of complex organisms, in which the functioning of genes is interdependent, has been analyzed as "hill-climbing" on NK fitness landscapes through random mutation and natural selection. In evolutionary economics, NK fitness landscapes have been used to simulate the evolution of complex technological systems containing elements that are interdependent in their functioning. In these models, economic agents randomly search for new technological designs by trial and error and run the risk of ending up in sub-optimal solutions due to interdependencies between the elements in a complex system. These models of random search are legitimate for reasons of modeling simplicity, but remain limited because they ignore the fact that agents can apply heuristics. A specific heuristic is one that sequentially optimises functions according to their ranking by users of the system. To model this heuristic, a generalized NK model is developed. In this model, core elements that influence many functions can be distinguished from peripheral elements that affect few functions. The concept of paradigmatic search can then be analytically defined as search that leaves core elements intact while concentrating on improving functions by mutation of peripheral elements.
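
    The NK fitness function at the heart of these models is compact: locus i contributes a random value determined by its own state and the states of K other loci, and fitness is the mean contribution. A minimal sketch with binary loci and lazily drawn contribution tables (a fixed cyclic neighbor pattern is assumed; the generalized model would instead give core elements many dependents and peripheral elements few):

    ```python
    import random

    def make_nk_landscape(N, K, seed=0):
        """Return a fitness function over length-N binary genomes."""
        rng = random.Random(seed)
        # Locus i depends on itself and the next K loci (cyclically).
        deps = [[i] + [(i + j) % N for j in range(1, K + 1)] for i in range(N)]
        tables = [{} for _ in range(N)]

        def fitness(genome):
            total = 0.0
            for i in range(N):
                key = tuple(genome[d] for d in deps[i])
                if key not in tables[i]:
                    tables[i][key] = rng.random()   # contribution drawn on demand
                total += tables[i][key]
            return total / N

        return fitness

    f = make_nk_landscape(N=8, K=2)
    print(f([0, 1, 0, 0, 1, 1, 0, 1]))
    ```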

  18. Species tree inference by minimizing deep coalescences.

    PubMed

    Than, Cuong; Nakhleh, Luay

    2009-09-01

    In a 1997 seminal paper, W. Maddison proposed minimizing deep coalescences, or MDC, as an optimization criterion for inferring the species tree from a set of incongruent gene trees, assuming the incongruence is exclusively due to lineage sorting. In a subsequent paper, Maddison and Knowles provided and implemented a search heuristic for optimizing the MDC criterion, given a set of gene trees. However, the heuristic is not guaranteed to compute optimal solutions, and its hill-climbing search makes it slow in practice. In this paper, we provide two exact solutions to the problem of inferring the species tree from a set of gene trees under the MDC criterion. In other words, our solutions are guaranteed to find the tree that minimizes the total number of deep coalescences from a set of gene trees. One solution is based on a novel integer linear programming (ILP) formulation, and another is based on a simple dynamic programming (DP) approach. Powerful ILP solvers, such as CPLEX, make the first solution appealing, particularly for very large-scale instances of the problem, whereas the DP-based solution eliminates dependence on proprietary tools, and its simplicity makes it easy to integrate with other genomic events that may cause gene tree incongruence. Using the exact solutions, we analyze a data set of 106 loci from eight yeast species, a data set of 268 loci from eight Apicomplexan species, and several simulated data sets. We show that the MDC criterion provides very accurate estimates of the species tree topologies, and that our solutions are very fast, thus allowing for the accurate analysis of genome-scale data sets. Further, the efficiency of the solutions allows for quick exploration of sub-optimal solutions, which is important for a parsimony-based criterion such as MDC, as we show. We show that searching for the species tree in the compatibility graph of the clusters induced by the gene trees may be sufficient in practice, a finding that helps ameliorate the computational requirements of optimization solutions. Further, we study the statistical consistency and convergence rate of the MDC criterion, as well as its optimality in inferring the species tree. Finally, we show how our solutions can be used to identify potential horizontal gene transfer events that may have caused some of the incongruence in the data, thus augmenting Maddison's original framework. We have implemented our solutions in the PhyloNet software package, which is freely available at: http://bioinfo.cs.rice.edu/phylonet.

  19. Complex interventions can increase osteoporosis investigations and treatment: a systematic review and meta-analysis.

    PubMed

    Kastner, M; Perrier, L; Munce, S E P; Adhihetty, C C; Lau, A; Hamid, J; Treister, V; Chan, J; Lai, Y; Straus, S E

    2018-01-01

    Osteoporosis affects over 200 million people worldwide. Despite available guidelines, care for these patients remains sub-optimal. We developed an osteoporosis tool to address the multiple dimensions of chronic disease management. Findings from its evaluation showed a significant increase from baseline in osteoporosis investigations and treatment, so we are revising this tool to cover multiple chronic conditions, including an update of the evidence about osteoporosis. Our objective was to conduct a systematic review of osteoporosis interventions in adults at risk for osteoporosis. We searched bibliometric databases for randomized controlled trials (RCTs) in any language evaluating osteoporosis disease management interventions in adults at risk for osteoporosis. Reviewer pairs independently screened citations and full-text articles, extracted data, and assessed risk of bias. Analysis included random-effects meta-analysis. Primary outcomes were osteoporosis investigations and treatment, and fragility fractures. Fifty-five RCTs and one companion report were included in the analysis, representing 165,703 patients. Our findings from 55 RCTs and 18 subgroup meta-analyses showed that complex implementation interventions with multiple components, consisting of at least education + feedback + follow-up, significantly increased the initiation of osteoporosis medications, and interventions with at least education + follow-up significantly increased the initiation of osteoporosis investigations. No significant impact was found for any type of intervention in reducing fractures. Complex interventions that include at least education + follow-up or feedback have the most potential for increasing osteoporosis investigations and treatment. Patient education appears to be an important component of osteoporosis disease management.

  20. RSQRT: AN HEURISTIC FOR ESTIMATING THE NUMBER OF CLUSTERS TO REPORT.

    PubMed

    Carlis, John; Bruso, Kelsey

    2012-03-01

    Clustering can be a valuable tool for analyzing large datasets, such as in e-commerce applications. Anyone who clusters must choose how many item clusters, K, to report. Unfortunately, one must guess at K or some related parameter. Elsewhere we introduced a strongly supported heuristic, RSQRT, which predicts K as a function of the attribute or item count, depending on attribute scales. We conducted a second analysis in which we sought confirmation of the heuristic, analyzing data sets from the UCI machine learning benchmark repository. For the 25 studies where sufficient detail was available, we again found strong support. Also, in a side-by-side comparison of 28 studies, the RSQRT-predicted K and the Bayesian information criterion (BIC)-predicted K are the same. RSQRT has a lower cost of O(log log n) versus O(n^2) for BIC, and is more widely applicable. Using RSQRT prospectively could be much better than merely guessing.

  1. RSQRT: AN HEURISTIC FOR ESTIMATING THE NUMBER OF CLUSTERS TO REPORT

    PubMed Central

    Bruso, Kelsey

    2012-01-01

    Clustering can be a valuable tool for analyzing large datasets, such as in e-commerce applications. Anyone who clusters must choose how many item clusters, K, to report. Unfortunately, one must guess at K or some related parameter. Elsewhere we introduced a strongly supported heuristic, RSQRT, which predicts K as a function of the attribute or item count, depending on attribute scales. We conducted a second analysis in which we sought confirmation of the heuristic, analyzing data sets from the UCI machine learning benchmark repository. For the 25 studies where sufficient detail was available, we again found strong support. Also, in a side-by-side comparison of 28 studies, the RSQRT-predicted K and the Bayesian information criterion (BIC)-predicted K are the same. RSQRT has a lower cost of O(log log n) versus O(n^2) for BIC, and is more widely applicable. Using RSQRT prospectively could be much better than merely guessing. PMID:22773923

  2. Figure and ground in physician misdiagnosis: metacognition and diagnostic norms.

    PubMed

    Hamm, Robert M

    2014-01-01

    Meta-cognitive awareness, or self-reflection informed by the "heuristics and biases" theory of how experts make cognitive errors, has been offered as a partial solution to diagnostic errors in medicine. I argue that this approach is neither as easy nor as effective as one might hope. We should also promote mastery of the basic principles of diagnosis in medical school, continuing medical education, and routine reflection and review. While it may seem difficult to attend to both levels simultaneously, there is more to be gained from attending to both than from focusing only on one.

  3. On the relationship between the causal-inference and meta-analytic paradigms for the validation of surrogate endpoints.

    PubMed

    Alonso, Ariel; Van der Elst, Wim; Molenberghs, Geert; Buyse, Marc; Burzykowski, Tomasz

    2015-03-01

    The increasing cost of drug development has raised the demand for surrogate endpoints when evaluating new drugs in clinical trials. However, over the years it has become clear that surrogate endpoints need to be statistically evaluated and deemed valid before they can be used as substitutes for "true" endpoints in clinical studies. Nowadays, two paradigms, based on causal inference and meta-analysis, dominate the scene. Nonetheless, although the literature emanating from these paradigms is extensive, the relationship between them has until now largely been left unexplored. In the present work, we discuss the conceptual frameworks underlying both approaches and study the relationship between them using theoretical elements and the analysis of a real case study. Furthermore, we show, on the one hand, that the meta-analytic approach can be embedded within a causal-inference framework and, on the other, that it can be heuristically justified why surrogate endpoints successfully evaluated using this approach will often be appealing from a causal-inference perspective as well. A newly developed and user-friendly R package, Surrogate, is provided to carry out the evaluation exercise. © 2014, The International Biometric Society.

  4. A new numerical approach to solve Thomas-Fermi model of an atom using bio-inspired heuristics integrated with sequential quadratic programming.

    PubMed

    Raja, Muhammad Asif Zahoor; Zameer, Aneela; Khan, Aziz Ullah; Wazwaz, Abdul Majid

    2016-01-01

    In this study, a novel bio-inspired computing approach is developed to analyze the dynamics of the nonlinear singular Thomas-Fermi equation (TFE), arising in potential and charge density models of an atom, by exploiting the strength of a finite difference scheme (FDS) for discretization and optimization through genetic algorithms (GAs) hybridized with sequential quadratic programming (SQP). The FDS procedures are used to transform the TFE differential equation into a system of nonlinear equations. A fitness function is constructed based on the residual error of the constituent equations in the mean-square sense and is formulated as a minimization problem. Optimization of the parameters of the system is carried out with GAs, used as a tool for viable global search, integrated with the SQP algorithm for rapid refinement of the results. The design scheme is applied to solve the TFE for five different scenarios by taking various step sizes and different input intervals. Comparison of the proposed results with state-of-the-art numerical and analytical solutions reveals the worth of our scheme in terms of accuracy and convergence. The reliability and effectiveness of the proposed scheme are validated by consistently obtaining optimal values of statistical performance indices, calculated for a sufficiently large number of independent runs, to establish its significance.
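
    The hybridization pattern, a coarse GA for global exploration followed by an SQP polish of the best individual, can be sketched with SciPy's SLSQP on a toy residual. This is a schematic of the pattern only, with an invented stand-in residual rather than the discretized Thomas-Fermi equations:

    ```python
    import random
    import numpy as np
    from scipy.optimize import minimize

    def residual(u):
        # Invented stand-in for the mean-squared residual of the FDS equations.
        target = np.linspace(0.0, 1.0, u.size) ** 1.5
        return float(np.mean((u - target) ** 2))

    def ga(pop_size=30, dim=20, gens=60, seed=0):
        random.seed(seed)
        np.random.seed(seed)
        pop = [np.random.rand(dim) for _ in range(pop_size)]
        for _ in range(gens):
            pop.sort(key=residual)
            parents = pop[: pop_size // 2]              # truncation selection
            children = []
            while len(children) < pop_size - len(parents):
                a, b = random.sample(parents, 2)
                cut = random.randrange(1, dim)          # one-point crossover
                child = np.concatenate([a[:cut], b[cut:]])
                child += np.random.normal(0, 0.05, dim) * (np.random.rand(dim) < 0.1)
                children.append(np.clip(child, 0.0, 1.0))
            pop = parents + children
        return min(pop, key=residual)

    coarse = ga()                                          # global exploration
    polished = minimize(residual, coarse, method="SLSQP")  # rapid local refinement
    print(residual(coarse), polished.fun)
    ```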

  5. Approximating Optimal Behavioural Strategies Down to Rules-of-Thumb: Energy Reserve Changes in Pairs of Social Foragers

    PubMed Central

    Rands, Sean A.

    2011-01-01

    Functional explanations of behaviour often propose optimal strategies for organisms to follow. These ‘best’ strategies could be difficult to perform given biological constraints such as neural architecture and physiological constraints. Instead, simple heuristics or ‘rules-of-thumb’ that approximate these optimal strategies may instead be performed. From a modelling perspective, rules-of-thumb are also useful tools for considering how group behaviour is shaped by the behaviours of individuals. Using simple rules-of-thumb reduces the complexity of these models, but care needs to be taken to use rules that are biologically relevant. Here, we investigate the similarity between the outputs of a two-player dynamic foraging game (which generated optimal but complex solutions) and a computational simulation of the behaviours of the two members of a foraging pair, who instead followed a rule-of-thumb approximation of the game's output. The original game generated complex results, and we demonstrate here that the simulations following the much-simplified rules-of-thumb also generate complex results, suggesting that the rule-of-thumb was sufficient to make some of the model outcomes unpredictable. There was some agreement between both modelling techniques, but some differences arose – particularly when pair members were not identical in how they gained and lost energy. We argue that exploring how rules-of-thumb perform in comparison to their optimal counterparts is an important exercise for biologically validating the output of agent-based models of group behaviour. PMID:21765938

  6. "Up Means Good": The Effect of Screen Position on Evaluative Ratings in Web Surveys.

    PubMed

    Tourangeau, Roger; Couper, Mick P; Conrad, Frederick G

    2013-01-01

    This paper presents results from six experiments that examine the effect of the position of an item on the screen on the evaluative ratings it receives. The experiments are based on the idea that respondents expect "good" things (those they view positively) to be higher up on the screen than "bad" things. The experiments use items on different topics (Congress and HMOs, a variety of foods, and six physician specialties) and different methods for varying their vertical position on the screen. A meta-analysis of all six experiments demonstrates a small but reliable effect of the item's screen position on mean ratings of the item; the ratings are significantly more positive when the item appears in a higher position on the screen than when it appears farther down. These results are consistent with the hypothesis that respondents follow the "Up means good" heuristic, using the vertical position of the item as a cue in evaluating it. Respondents seem to rely on heuristics both in interpreting response scales and in forming judgments.
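
    The meta-analytic step described here is a standard inverse-variance pooling; a generic sketch follows, with invented effect sizes and standard errors standing in for the six experiments' data.

      # Fixed-effect (inverse-variance) pooling of per-experiment effects.
      # The numbers below are placeholders, not the paper's data.
      effects = [0.12, 0.08, 0.15, 0.05, 0.10, 0.09]   # per-experiment estimates (assumed)
      ses     = [0.06, 0.05, 0.07, 0.04, 0.05, 0.06]   # standard errors (assumed)

      weights = [1 / se**2 for se in ses]
      pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
      pooled_se = (1 / sum(weights)) ** 0.5
      print(f"pooled effect = {pooled:.3f} +/- {1.96 * pooled_se:.3f} (95% CI half-width)")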

  7. Heuristics Applied in the Development of Advanced Space Mission Concepts

    NASA Technical Reports Server (NTRS)

    Nilsen, Erik N.

    1998-01-01

    Advanced mission studies are the first step in determining the feasibility of a given space exploration concept. A space scientist develops a science goal in the exploration of space. This may be a new observation method, a new instrument, or a mission concept to explore a solar system body. In order to determine the feasibility of a deep space mission, a concept study is convened to determine the technology needs and estimated cost of performing that mission. Heuristics are one method of defining viable mission and systems architectures that can be assessed for technology readiness and cost. Developing a viable architecture depends to a large extent upon extending the existing body of knowledge and applying it in new ways. These heuristics have evolved over time to include methods for estimating technical complexity, technology development, cost modeling, and mission risk in the unique context of deep space missions. This paper examines the processes involved in performing these advanced concept studies and analyzes the application of heuristics in the development of an advanced in-situ planetary mission. The Venus Surface Sample Return mission study provides a context for the examination of the heuristics applied in the development of the mission and systems architecture. This study is illustrative of the effort involved in the initial assessment of an advanced mission concept, and of the knowledge and tools that are applied.

  8. A Microsoft-Excel-based tool for running and critically appraising network meta-analyses--an overview and application of NetMetaXL.

    PubMed

    Brown, Stephen; Hutton, Brian; Clifford, Tammy; Coyle, Doug; Grima, Daniel; Wells, George; Cameron, Chris

    2014-09-29

    The use of network meta-analysis has increased dramatically in recent years. WinBUGS, a freely available Bayesian software package, has been the most widely used software package to conduct network meta-analyses. However, the learning curve for WinBUGS can be daunting, especially for new users. Furthermore, critical appraisal of network meta-analyses conducted in WinBUGS can be challenging given its limited data manipulation capabilities and the fact that generation of graphical output from network meta-analyses often relies on different software packages than the analyses themselves. We developed a freely available Microsoft-Excel-based tool called NetMetaXL, programmed in Visual Basic for Applications, which provides an interface for conducting a Bayesian network meta-analysis using WinBUGS from within Microsoft Excel. This tool allows the user to easily prepare and enter data, set model assumptions, and run the network meta-analysis, with results being automatically displayed in an Excel spreadsheet. It also contains macros that use NetMetaXL's interface to generate evidence network diagrams, forest plots, league tables of pairwise comparisons, probability plots (rankograms), and inconsistency plots within Microsoft Excel. All figures generated are publication quality, thereby increasing the efficiency of knowledge transfer and manuscript preparation. We demonstrate the application of NetMetaXL using data from a previously published network meta-analysis comparing combined resynchronization and implantable defibrillator therapy in left ventricular dysfunction, replicating results from the previous publication while demonstrating result summaries generated by the software. Use of the freely available NetMetaXL successfully demonstrated its ability to make running network meta-analyses more accessible to novice WinBUGS users by allowing analyses to be conducted entirely within Microsoft Excel. NetMetaXL also allows for more efficient and transparent critical appraisal of network meta-analyses, enhanced standardization of reporting, and integration with health economic evaluations, which are frequently Excel-based.

  9. Properties of heuristic search strategies

    NASA Technical Reports Server (NTRS)

    Vanderbrug, G. J.

    1973-01-01

    A directed graph is used to model the search space of a state space representation with single-input operators, an AND/OR graph is used for problem reduction representations, and a theorem-proving graph is used for state space representations with multiple-input operators. These three graph models and heuristic strategies for searching them are surveyed. The completeness, admissibility, and optimality properties of search strategies that use the evaluation function f = (1 − ω)g + ωh are presented and interpreted using a representation of the search process in the plane. The use of multiple-output operators to imply dependent successors, and thus obtain a formalism which includes all three types of representations, is discussed.
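
    The evaluation function f = (1 − ω)g + ωh interpolates between uniform-cost search (ω = 0), A*-like search (ω = 0.5, up to scaling), and greedy best-first search (ω = 1). A minimal best-first sketch using this function follows; the grid world and Manhattan heuristic are illustrative assumptions.

      # Best-first search ordered by f = (1 - w)*g + w*h.
      import heapq

      def weighted_search(start, goal, neighbors, h, w=0.5):
          frontier = [((1 - w) * 0 + w * h(start), 0, start, [start])]
          seen = {}
          while frontier:
              f, g, node, path = heapq.heappop(frontier)
              if node == goal:
                  return path
              if node in seen and seen[node] <= g:
                  continue                      # already expanded more cheaply
              seen[node] = g
              for nxt, cost in neighbors(node):
                  g2 = g + cost
                  heapq.heappush(frontier,
                                 ((1 - w) * g2 + w * h(nxt), g2, nxt, path + [nxt]))
          return None

      # Example: 4-connected 10x10 grid with a Manhattan-distance heuristic.
      def nbrs(p):
          x, y = p
          return [((x+dx, y+dy), 1) for dx, dy in ((1,0),(-1,0),(0,1),(0,-1))
                  if 0 <= x+dx < 10 and 0 <= y+dy < 10]

      print(weighted_search((0, 0), (9, 9), nbrs, lambda p: (9-p[0]) + (9-p[1])))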

  10. Energy-Efficient Next-Generation Passive Optical Networks Based on Sleep Mode and Heuristic Optimization

    NASA Astrophysics Data System (ADS)

    Zulai, Luis G. T.; Durand, Fábio R.; Abrão, Taufik

    2015-05-01

    In this article, an energy-efficiency mechanism for next-generation passive optical networks (PONs) is investigated through heuristic particle swarm optimization. The next-generation architecture considered combines 10-gigabit Ethernet with wavelength division multiplexing and optical code division multiplexing over a PON, building on a legacy 10-gigabit Ethernet PON with the advantage of using only a single en/decoder pair for the optical code division multiplexing technology, thus eliminating the en/decoder at each optical network unit (ONU). The proposed joint mechanism combines the sleep-mode power-saving scheme of the 10-gigabit Ethernet PON with a power control procedure that adjusts the transmitted power of the active ONUs so as to maximize the overall energy efficiency of the network. The particle swarm optimization based power control algorithm establishes the optimal transmitted power of each ONU according to the network's predefined quality-of-service requirements. The objective is to control the power consumption of each ONU according to its traffic demand by adjusting its transmitter power, maximizing the number of transmitted bits with minimum energy consumption and thus achieving maximal system energy efficiency. Numerical results reveal that the proposed particle swarm optimization based sleep-mode mechanism can save 75% of energy consumption, compared with 55% when only a sleep-mode-based mechanism is deployed.
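
    A bare-bones sketch of PSO-based power control follows; the link model, QoS floor, power bounds, and swarm constants are invented stand-ins for the article's network model, intended only to show the mechanics of the algorithm.

      # Toy PSO: choose per-ONU transmit powers maximizing bits per unit power
      # subject to a soft QoS floor. All constants are assumptions.
      import numpy as np

      rng = np.random.default_rng(3)
      n_onu, n_particles = 8, 30
      gain = rng.uniform(0.4, 1.0, n_onu)              # per-ONU channel gain (assumed)

      def neg_efficiency(p):
          snr = gain * p / 0.1                          # toy link model
          bits = np.log2(1 + snr)                       # Shannon-style rate proxy
          penalty = np.sum(np.maximum(1.0 - bits, 0)) * 10.0   # QoS floor (assumed)
          return -(bits.sum() / p.sum()) + penalty      # minimize negative efficiency

      pos = rng.uniform(0.1, 1.0, (n_particles, n_onu))
      vel = np.zeros_like(pos)
      pbest, pbest_f = pos.copy(), np.array([neg_efficiency(p) for p in pos])
      gbest = pbest[np.argmin(pbest_f)]
      for it in range(200):
          r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
          vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
          pos = np.clip(pos + vel, 0.05, 1.0)           # power bounds (assumed)
          f = np.array([neg_efficiency(p) for p in pos])
          better = f < pbest_f
          pbest[better], pbest_f[better] = pos[better], f[better]
          gbest = pbest[np.argmin(pbest_f)]
      print("best powers:", np.round(gbest, 3), "objective:", -neg_efficiency(gbest))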

  11. Dynamic Staffing and Rescheduling in Software Project Management: A Hybrid Approach.

    PubMed

    Ge, Yujia; Xu, Bin

    2016-01-01

    Resource allocation can be influenced by various dynamic elements, such as the skills of engineers and the growth of those skills, which requires managers to find an effective and efficient tool to support their staffing decision-making processes. Rescheduling happens commonly and frequently during project execution: control decisions have to be made when new resources are added or tasks are changed. In this paper we propose a software project staffing model that considers dynamic elements of staff productivity, with an optimizer based on a Genetic Algorithm (GA) and Hill Climbing (HC). Since a newly generated reschedule dramatically different from the initial schedule could cause an obvious increase in shifting cost, our rescheduling strategies consider both efficiency and stability. The results of real-world case studies and extensive simulation experiments show that our proposed method is effective and achieves performance comparable to other heuristic algorithms in most cases.
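
    The hybrid can be sketched compactly: a GA searches over task-to-engineer assignments under a fitness that mixes completion time with a stability penalty, and hill climbing refines the GA's best schedule. All task sizes, weights, and settings below are illustrative assumptions, not the paper's model.

      # Toy GA + hill-climbing rescheduler balancing load and schedule stability.
      import random

      random.seed(7)
      n_tasks, n_eng = 12, 3
      effort = [random.randint(1, 9) for _ in range(n_tasks)]
      initial = [random.randrange(n_eng) for _ in range(n_tasks)]  # pre-change schedule

      def cost(assign, shift_weight=0.5):
          load = [0] * n_eng
          for t, e in enumerate(assign):
              load[e] += effort[t]
          shifts = sum(a != b for a, b in zip(assign, initial))    # stability penalty
          return max(load) + shift_weight * shifts

      def hill_climb(assign):
          improved = True
          while improved:
              improved = False
              for t in range(n_tasks):
                  for e in range(n_eng):
                      cand = assign[:t] + [e] + assign[t+1:]
                      if cost(cand) < cost(assign):
                          assign, improved = cand, True
          return assign

      pop = [[random.randrange(n_eng) for _ in range(n_tasks)] for _ in range(30)]
      for gen in range(40):
          pop.sort(key=cost)
          parents = pop[:10]
          children = []
          for _ in range(20):
              a, b = random.sample(parents, 2)
              cut = random.randrange(n_tasks)
              child = a[:cut] + b[cut:]                 # one-point crossover
              if random.random() < 0.3:                 # mutation
                  child[random.randrange(n_tasks)] = random.randrange(n_eng)
              children.append(child)
          pop = parents + children
      best = hill_climb(min(pop, key=cost))             # HC refinement of GA's best
      print("cost:", cost(best), "assignment:", best)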

  12. Efficient computation paths for the systematic analysis of sensitivities

    NASA Astrophysics Data System (ADS)

    Greppi, Paolo; Arato, Elisabetta

    2013-01-01

    A systematic sensitivity analysis requires computing the model on all points of a multi-dimensional grid covering the domain of interest, defined by the ranges of variability of the inputs. The key issues in performing such analyses efficiently on algebraic models are handling solution failures within and close to the feasible region and minimizing the total iteration count. Scanning the domain in the obvious order is sub-optimal in terms of total iterations and is likely to cause many solution failures. The problem of choosing a better order can be translated geometrically into finding Hamiltonian paths on certain grid graphs. This work proposes two such paths: one based on a mixed-radix Gray code, and a quasi-spiral path produced by a novel heuristic algorithm. Some simple, easy-to-visualize examples are presented, followed by performance results for the quasi-spiral algorithm and the practical application of the different paths in a process simulation tool.
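
    The Gray-code ordering is easy to reproduce: in a reflected mixed-radix Gray code, consecutive grid points differ in exactly one coordinate by one step, so each solve can warm-start from its predecessor. A minimal generator follows (a sketch of the general idea; the paper's quasi-spiral path is not reproduced here).

      # Reflected mixed-radix Gray code over a grid with the given radices.
      def gray_path(radices):
          if not radices:
              yield ()
              return
          first, rest = radices[0], radices[1:]
          for i, tail in enumerate(gray_path(rest)):
              # Alternate sweep direction of the first coordinate so that
              # consecutive points differ in exactly one coordinate by +/-1.
              order = range(first) if i % 2 == 0 else reversed(range(first))
              for v in order:
                  yield (v,) + tail

      # Example: a 3 x 2 x 4 grid; verify the Hamiltonian single-step property.
      path = list(gray_path((3, 2, 4)))
      assert len(path) == 24 and len(set(path)) == 24
      assert all(sum(abs(a - b) for a, b in zip(p, q)) == 1
                 for p, q in zip(path, path[1:]))
      print(path[:6])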

  13. Social biases determine spatiotemporal sparseness of ciliate mating heuristics.

    PubMed

    Clark, Kevin B

    2012-01-01

    Ciliates become highly social, even displaying animal-like qualities, in the joint presence of aroused conspecifics and nonself mating pheromones. Pheromone detection putatively helps trigger instinctual and learned courtship and dominance displays from which social judgments are made about the availability, compatibility, and fitness representativeness or likelihood of prospective mates and rivals. In earlier studies, I demonstrated the heterotrich Spirostomum ambiguum improves mating competence by effecting preconjugal strategies and inferences in mock social trials via behavioral heuristics built from Hebbian-like associative learning. Heuristics embody serial patterns of socially relevant action that evolve into ordered, topologically invariant computational networks supporting intra- and intermate selection. S. ambiguum employs heuristics to acquire, store, plan, compare, modify, select, and execute sets of mating propaganda. One major adaptive constraint over formation and use of heuristics involves a ciliate's initial subjective bias, responsiveness, or preparedness, as defined by Stevens' Law of subjective stimulus intensity, for perceiving the meaningfulness of mechanical pressures accompanying cell-cell contacts and additional perimating events. This bias controls durations and valences of nonassociative learning, search rates for appropriate mating strategies, potential net reproductive payoffs, levels of social honesty and deception, successful error diagnosis and correction of mating signals, use of insight or analysis to solve mating dilemmas, bioenergetics expenditures, and governance of mating decisions by classical or quantum statistical mechanics. I now report this same social bias also differentially affects the spatiotemporal sparseness, as measured with metric entropy, of ciliate heuristics. Sparseness plays an important role in neural systems through optimizing the specificity, efficiency, and capacity of memory representations. The present findings indicate sparseness performs a similar function in single aneural cells by tuning the size and density of encoded computational architectures useful for decision making in social contexts.
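
    Metric entropy of a discretized behavior stream can be estimated, for example, from the growth rate of Shannon block entropies; the sketch below uses a toy symbol sequence rather than ciliate data, and such finite-sample estimates are known to be biased.

      # Entropy-rate estimate from block entropies H(k) - H(k-1).
      from collections import Counter
      from math import log2

      def block_entropy(seq, k):
          blocks = [tuple(seq[i:i+k]) for i in range(len(seq) - k + 1)]
          counts = Counter(blocks)
          n = len(blocks)
          return -sum(c/n * log2(c/n) for c in counts.values())

      seq = [0, 1, 0, 0, 1, 1, 0, 1] * 200      # toy behavioral symbol stream
      for k in (1, 2, 3, 4):
          rate = block_entropy(seq, k) - (block_entropy(seq, k-1) if k > 1 else 0.0)
          print(f"k={k}: entropy-rate estimate {rate:.3f} bits/symbol")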

  14. Vervet monkeys use paths consistent with context-specific spatial movement heuristics.

    PubMed

    Teichroeb, Julie A

    2015-10-01

    Animal foraging routes are analogous to the computationally demanding "traveling salesman problem" (TSP), where individuals must find the shortest path among several locations before returning to the start. Humans approximate solutions to TSPs using simple heuristics or "rules of thumb," but our knowledge of how other animals solve multidestination routing problems is incomplete. Most nonhuman primate species have shown limited ability to route plan. However, captive vervets were shown to solve a TSP for six sites. These results were consistent with either planning three steps ahead or a risk-avoidance strategy. I investigated how wild vervet monkeys (Chlorocebus pygerythrus) solved a path problem with six equally rewarding food sites, where the site arrangement allowed assessment of whether vervets found the shortest route and/or used paths consistent with one of three simple navigation heuristics. Single vervets took the shortest possible path in fewer than half of the trials, usually in ways consistent with the most efficient heuristic (the convex hull). When in competition, vervets' paths were consistent with different, more efficient heuristics depending on their dominance rank (a cluster strategy for dominants and the nearest-neighbor rule for subordinates). These results suggest that, like humans, vervets may solve multidestination routing problems by applying simple, adaptive, context-specific "rules of thumb." The heuristics that were consistent with vervet paths in this study are the same as some of those asserted to be used by humans. These spatial movement strategies may have common evolutionary roots and be part of a universal mental navigational toolkit. Alternatively, they may have emerged through convergent evolution as the optimal way to solve multidestination routing problems.
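
    To make the comparison concrete, the sketch below contrasts the exhaustive optimum with the nearest-neighbor rule of thumb on an invented six-site layout (an open path from a fixed start is assumed; the convex-hull heuristic would additionally require a hull-insertion routine, omitted here).

      # Exhaustive optimum vs. nearest-neighbor heuristic for six sites.
      from itertools import permutations
      from math import dist

      sites = [(0, 0), (2, 5), (5, 1), (6, 6), (1, 3), (4, 4)]   # assumed layout
      start = (0, 0)

      def length(path):
          pts = [start] + list(path)
          return sum(dist(a, b) for a, b in zip(pts, pts[1:]))

      best = min(permutations(sites[1:]), key=length)            # exhaustive optimum

      def nearest_neighbor(start, rest):
          path, here, rest = [], start, set(rest)
          while rest:
              here = min(rest, key=lambda p: dist(here, p))      # greedy next site
              path.append(here)
              rest.remove(here)
          return path

      nn = nearest_neighbor(start, sites[1:])
      print(f"optimal {length(best):.2f} vs nearest-neighbor {length(nn):.2f}")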

  15. Meta-analyzing dependent correlations: an SPSS macro and an R script.

    PubMed

    Cheung, Shu Fai; Chan, Darius K-S

    2014-06-01

    The presence of dependent correlation is a common problem in meta-analysis. Cheung and Chan (2004, 2008) have shown that samplewise-adjusted procedures perform better than the more commonly adopted simple within-sample mean procedures. However, samplewise-adjusted procedures have rarely been applied in meta-analytic reviews, probably due to the lack of suitable ready-to-use programs. In this article, we compare the samplewise-adjusted procedures with existing procedures to handle dependent effect sizes, and present the samplewise-adjusted procedures in a way that will make them more accessible to researchers conducting meta-analysis. We also introduce two tools, an SPSS macro and an R script, that researchers can apply to their meta-analyses; these tools are compatible with existing meta-analysis software packages.
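
    For orientation, the baseline "simple within-sample mean" procedure can be sketched as follows: dependent correlations are averaged within each sample, and the per-sample means are pooled across samples. The samplewise-adjusted procedures the authors recommend additionally correct the variance of the averaged correlation, which this toy, with invented numbers, omits.

      # Simple within-sample mean handling of dependent correlations.
      samples = [
          {"n": 120, "rs": [0.30, 0.25, 0.35]},   # dependent correlations (assumed)
          {"n":  80, "rs": [0.10, 0.20]},
          {"n": 200, "rs": [0.40]},
      ]

      pooled_num = pooled_den = 0.0
      for s in samples:
          r_bar = sum(s["rs"]) / len(s["rs"])     # one effect size per sample
          pooled_num += s["n"] * r_bar
          pooled_den += s["n"]
      print(f"n-weighted mean correlation: {pooled_num / pooled_den:.3f}")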

  16. The Efficacy of Violence Prediction: A Meta-Analytic Comparison of Nine Risk Assessment Tools

    ERIC Educational Resources Information Center

    Yang, Min; Wong, Stephen C. P.; Coid, Jeremy

    2010-01-01

    Actuarial risk assessment tools are used extensively to predict future violence, but previous studies comparing their predictive accuracies have produced inconsistent findings as a result of various methodological issues. We conducted meta-analyses of the effect sizes of 9 commonly used risk assessment tools and their subscales to compare their…
