Statistical Mechanics of Combinatorial Auctions
NASA Astrophysics Data System (ADS)
Galla, Tobias; Leone, Michele; Marsili, Matteo; Sellitto, Mauro; Weigt, Martin; Zecchina, Riccardo
2006-09-01
Combinatorial auctions are formulated as frustrated lattice gases on sparse random graphs, allowing the determination of the optimal revenue by methods of statistical physics. Transitions between computationally easy and hard regimes are found and interpreted in terms of the geometric structure of the space of solutions. We introduce an iterative algorithm to solve intermediate and large instances, and discuss competing states of optimal revenue and maximal number of satisfied bidders. The algorithm can be generalized to the hard phase and to more sophisticated auction protocols.
NASA Astrophysics Data System (ADS)
Gen, Mitsuo; Lin, Lin
Many combinatorial optimization problems from industrial engineering and operations research in the real world are very complex in nature and quite hard to solve by conventional techniques. Since the 1960s, there has been an increasing interest in imitating living beings to solve such hard combinatorial optimization problems. Simulating the natural evolutionary process of human beings results in stochastic optimization techniques called evolutionary algorithms (EAs), which can often outperform conventional optimization methods when applied to difficult real-world problems. In this survey paper, we provide a comprehensive survey of the current state of the art in the use of EAs in manufacturing and logistics systems. To demonstrate that EAs are powerful and broadly applicable stochastic search and optimization techniques, we deal with the following engineering design problems: transportation planning models, layout design models and two-stage logistics models in logistics systems; and job-shop scheduling and resource-constrained project scheduling in manufacturing systems.
Combining local search with co-evolution in a remarkably simple way
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boettcher, S.; Percus, A.
2000-05-01
The authors explore a new general-purpose heuristic for finding high-quality solutions to hard optimization problems. The method, called extremal optimization, is inspired by self-organized criticality, a concept introduced to describe emergent complexity in physical systems. In contrast to genetic algorithms, which operate on an entire gene-pool of possible solutions, extremal optimization successively replaces extremely undesirable elements of a single sub-optimal solution with new, random ones. Large fluctuations, or avalanches, ensue that efficiently explore many local optima. Drawing upon models used to simulate far-from-equilibrium dynamics, extremal optimization complements heuristics inspired by equilibrium statistical physics, such as simulated annealing. With only one adjustable parameter, its performance has proved competitive with more elaborate methods, especially near phase transitions. Phase transitions are found in many combinatorial optimization problems, and have been conjectured to occur in the region of parameter space containing the hardest instances. We demonstrate how extremal optimization can be implemented for a variety of hard optimization problems. We believe that this will be a useful tool in the investigation of phase transitions in combinatorial optimization, thereby helping to elucidate the origin of computational complexity.
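As a concrete illustration of the extremal optimization move described above, here is a minimal Python sketch of tau-EO applied to a toy max-cut instance; the graph, the local-fitness definition, the parameter values, and the power-law rank selection are illustrative assumptions, not the authors' implementation.

```python
import random

def tau_eo_maxcut(edges, n, tau=1.4, steps=20000, seed=0):
    """Toy tau-EO for max-cut: rank vertices by local fitness (fraction of
    incident edges currently cut), pick a poorly ranked one via a power law,
    and flip it unconditionally."""
    rng = random.Random(seed)
    adj = [[] for _ in range(n)]
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    spin = [rng.choice((-1, 1)) for _ in range(n)]

    def cut(s):
        return sum(1 for u, v in edges if s[u] != s[v])

    best_val, best_spin = cut(spin), spin[:]
    for _ in range(steps):
        fit = sorted((sum(spin[i] != spin[j] for j in adj[i]) / max(1, len(adj[i])), i)
                     for i in range(n))                  # worst vertices first
        u = 1.0 - rng.random()                           # u in (0, 1]
        k = min(int(u ** (-1.0 / (tau - 1.0))), n) - 1   # P(rank k) ~ k^(-tau)
        _, i = fit[k]
        spin[i] = -spin[i]                               # no acceptance test in EO
        c = cut(spin)
        if c > best_val:
            best_val, best_spin = c, spin[:]
    return best_val, best_spin

# tiny example: a 5-cycle (optimal cut value is 4)
print(tau_eo_maxcut([(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)], 5)[0])
```

The single parameter tau biases the update toward the currently worst-performing variables; values near 1 give an almost random walk, while large values give an almost greedy update.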
Expected Fitness Gains of Randomized Search Heuristics for the Traveling Salesperson Problem.
Nallaperuma, Samadhi; Neumann, Frank; Sudholt, Dirk
2017-01-01
Randomized search heuristics are frequently applied to NP-hard combinatorial optimization problems. The runtime analysis of randomized search heuristics has contributed tremendously to our theoretical understanding. Recently, randomized search heuristics have been examined regarding their achievable progress within a fixed-time budget. We follow this approach and present a fixed-budget analysis for an NP-hard combinatorial optimization problem. We consider the well-known Traveling Salesperson Problem (TSP) and analyze the fitness increase that randomized search heuristics are able to achieve within a given fixed-time budget. In particular, we analyze Manhattan and Euclidean TSP instances and Randomized Local Search (RLS), (1+1) EA and (1+[Formula: see text]) EA algorithms for the TSP in a smoothed complexity setting, and derive the lower bounds of the expected fitness gain for a specified number of generations.
Coelho, V N; Coelho, I M; Souza, M J F; Oliveira, T A; Cota, L P; Haddad, M N; Mladenovic, N; Silva, R C P; Guimarães, F G
2016-01-01
This article presents an Evolution Strategy (ES)-based algorithm, designed to self-adapt its mutation operators, guiding the search through the solution space using a Self-Adaptive Reduced Variable Neighborhood Search procedure. In view of the specific local search operators for each individual, the proposed population-based approach also fits into the context of Memetic Algorithms. The proposed variant uses the Greedy Randomized Adaptive Search Procedure with different greedy parameters to generate its initial population, providing an interesting exploration-exploitation balance. To validate the proposal, this framework is applied to solve three different NP-hard combinatorial optimization problems: an Open-Pit-Mining Operational Planning Problem with dynamic allocation of trucks, an Unrelated Parallel Machine Scheduling Problem with Setup Times, and the calibration of a hybrid fuzzy model for Short-Term Load Forecasting. Computational results point out the convergence of the proposed model and highlight its ability to combine move operations from distinct neighborhood structures during the optimization. The results gathered and reported in this article represent collective evidence of the performance of the method on challenging combinatorial optimization problems from different application domains. The proposed evolution strategy demonstrates an ability to adapt the strength of the mutation disturbance during the generations of its evolution process. The effectiveness of the proposal motivates the application of this novel evolutionary framework to other combinatorial optimization problems.
Fuel management optimization using genetic algorithms and code independence
DOE Office of Scientific and Technical Information (OSTI.GOV)
DeChaine, M.D.; Feltus, M.A.
1994-12-31
Fuel management optimization is a hard problem for traditional optimization techniques. Loading pattern optimization is a large combinatorial problem without analytical derivative information. Therefore, methods designed for continuous functions, such as linear programming, do not always work well. Genetic algorithms (GAs) address these problems and, therefore, appear ideal for fuel management optimization. They do not require derivative information and work well with combinatorial functions. GAs are a stochastic method based on concepts from biological genetics. They take a group of candidate solutions, called the population, and use selection, crossover, and mutation operators to create the next generation of better solutions. The selection operator is a "survival-of-the-fittest" operation and chooses the solutions for the next generation. The crossover operator is analogous to biological mating, where children inherit a mixture of traits from their parents, and the mutation operator makes small random changes to the solutions.
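The selection, crossover, and mutation operators described above can be sketched generically as follows; the bit-string encoding, tournament selection, and all parameter values are illustrative assumptions, not the loading-pattern optimization code discussed in the abstract.

```python
import random

def genetic_algorithm(fitness, n_bits, pop_size=40, generations=100,
                      p_cross=0.8, p_mut=0.02, seed=0):
    """Minimal GA on bit strings: tournament selection ("survival of the
    fittest"), one-point crossover (children mix parents' traits), and
    bit-flip mutation (small random changes)."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]

    def tournament():
        a, b = rng.sample(pop, 2)
        return a if fitness(a) >= fitness(b) else b

    for _ in range(generations):
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = tournament(), tournament()
            if rng.random() < p_cross:
                cut = rng.randrange(1, n_bits)
                c1, c2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
            else:
                c1, c2 = p1[:], p2[:]
            for child in (c1, c2):
                for i in range(n_bits):
                    if rng.random() < p_mut:
                        child[i] ^= 1          # bit-flip mutation
                nxt.append(child)
        pop = nxt[:pop_size]
    return max(pop, key=fitness)

# toy usage: maximize the number of ones in the bit string
best = genetic_algorithm(lambda x: sum(x), n_bits=30)
print(sum(best))
```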
Genetic algorithms for the vehicle routing problem
NASA Astrophysics Data System (ADS)
Volna, Eva
2016-06-01
The Vehicle Routing Problem (VRP) is one of the most challenging combinatorial optimization tasks. The problem consists in designing the optimal set of routes for a fleet of vehicles in order to serve a given set of customers. Evolutionary algorithms are general iterative algorithms for combinatorial optimization. These algorithms have been found to be very effective and robust in solving numerous problems from a wide range of application domains. The problem is known to be NP-hard; hence many heuristic procedures for its solution have been suggested. For such problems it is often desirable to obtain approximate solutions, provided they can be found fast enough and are sufficiently accurate for the purpose. In this paper we have performed an experimental study that indicates that genetic algorithms are well suited to the vehicle routing problem.
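As a hedged sketch of how a candidate VRP solution might be scored inside such an algorithm, the snippet below computes the total travelled distance of a route list; the route-list encoding, depot index, and Euclidean distances are assumptions made for illustration only.

```python
import math

def route_cost(routes, coords, depot=0):
    """Total travelled distance for a VRP solution given as a list of routes,
    each route being the ordered customer indices served by one vehicle;
    every vehicle starts and ends at the depot."""
    def dist(a, b):
        (x1, y1), (x2, y2) = coords[a], coords[b]
        return math.hypot(x1 - x2, y1 - y2)

    total = 0.0
    for route in routes:
        stops = [depot] + list(route) + [depot]
        total += sum(dist(u, v) for u, v in zip(stops, stops[1:]))
    return total

# toy instance: depot at the origin, four customers, two vehicles
coords = [(0, 0), (1, 0), (2, 1), (-1, 2), (-2, -1)]
print(round(route_cost([[1, 2], [3, 4]], coords), 3))
```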
Statistical physics of hard combinatorial optimization: Vertex cover problem
NASA Astrophysics Data System (ADS)
Zhao, Jin-Hua; Zhou, Hai-Jun
2014-07-01
Typical-case computational complexity is a research topic at the boundary of computer science, applied mathematics, and statistical physics. In the last twenty years, the replica-symmetry-breaking mean field theory of spin glasses and the associated message-passing algorithms have greatly deepened our understanding of typical-case computational complexity. In this paper, we use the vertex cover problem, a basic nondeterministic-polynomial (NP)-complete combinatorial optimization problem of wide application, as an example to introduce the statistical physics methods and algorithms. We do not go into the technical details but emphasize mainly the intuitive physical meanings of the message-passing equations. An unfamiliar reader should be able to understand, to a large extent, the physics behind the mean field approaches and to adapt the mean field methods to other optimization problems.
NASA Astrophysics Data System (ADS)
Zheng, Genrang; Lin, ZhengChun
The problem of winner determination in combinatorial auctions is a hot topic in electronic business and an NP-hard problem. A Hybrid Artificial Fish Swarm Algorithm (HAFSA), which combines a First Suite Heuristic Algorithm (FSHA) with the Artificial Fish Swarm Algorithm (AFSA), is proposed to solve the problem after analysing it on the basis of AFSA theory. Experimental results show that the HAFSA is a rapid and efficient algorithm for winner determination. Compared with an Ant Colony Optimization algorithm, it shows good performance and broad application prospects.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hunt, H.B. III; Rosenkrantz, D.J.; Stearns, R.E.
We study both the complexity and approximability of various graph and combinatorial problems specified using two dimensional narrow periodic specifications (see [CM93, HW92, KMW67, KO91, Or84b, Wa93]). The following two general kinds of results are presented. (1) We prove that a number of natural graph and combinatorial problems are NEXPTIME- or EXPSPACE-complete when instances are so specified; (2) In contrast, we prove that the optimization versions of several of these NEXPTIME-, EXPSPACE-complete problems have polynomial time approximation algorithms with constant performance guarantees. Moreover, some of these problems even have polynomial time approximation schemes. We also sketch how our NEXPTIME-hardness results can be used to prove analogous NEXPTIME-hardness results for problems specified using other kinds of succinct specification languages. Our results provide the first natural problems for which there is a proven exponential (and possibly doubly exponential) gap between the complexities of finding exact and approximate solutions.
NASA Astrophysics Data System (ADS)
Doerr, Timothy; Alves, Gelio; Yu, Yi-Kuo
2006-03-01
Typical combinatorial optimizations are NP-hard; however, for a particular class of cost functions the corresponding combinatorial optimizations can be solved in polynomial time. This suggests a way to efficiently find approximate solutions: find a transformation that makes the cost function as similar as possible to that of the solvable class. After keeping many high-ranking solutions using the approximate cost function, one may then re-assess these solutions with the full cost function to find the best approximate solution. Under this approach, it is important to be able to assess the quality of the solutions obtained, e.g., by finding the true ranking of the kth best approximate solution when all possible solutions are considered exhaustively. To tackle this statistical issue, we provide a systematic method starting with a scaling function generated from the finite number of high-ranking solutions followed by a convergent iterative mapping. This method, useful in a variant of the directed paths in random media problem proposed here, can also provide a statistical significance assessment for one of the most important proteomic tasks: peptide sequencing using tandem mass spectrometry data.
OPTIMIZING THROUGH CO-EVOLUTIONARY AVALANCHES
DOE Office of Scientific and Technical Information (OSTI.GOV)
S. BOETTCHER; A. PERCUS
2000-08-01
We explore a new general-purpose heuristic for finding high-quality solutions to hard optimization problems. The method, called extremal optimization, is inspired by "self-organized criticality," a concept introduced to describe emergent complexity in many physical systems. In contrast to Genetic Algorithms, which operate on an entire "gene-pool" of possible solutions, extremal optimization successively replaces extremely undesirable elements of a sub-optimal solution with new, random ones. Large fluctuations, called "avalanches," ensue that efficiently explore many local optima. Drawing upon models used to simulate far-from-equilibrium dynamics, extremal optimization complements approximation methods inspired by equilibrium statistical physics, such as simulated annealing. With only one adjustable parameter, its performance has proved competitive with more elaborate methods, especially near phase transitions. Those phase transitions are found in the parameter space of most optimization problems, and have recently been conjectured to be the origin of some of the hardest instances in computational complexity. We will demonstrate how extremal optimization can be implemented for a variety of combinatorial optimization problems. We believe that extremal optimization will be a useful tool in the investigation of phase transitions in combinatorial optimization problems, hence valuable in elucidating the origin of computational complexity.
Landscape Encodings Enhance Optimization
Klemm, Konstantin; Mehta, Anita; Stadler, Peter F.
2012-01-01
Hard combinatorial optimization problems deal with the search for the minimum cost solutions (ground states) of discrete systems under strong constraints. A transformation of state variables may enhance computational tractability. It has been argued that these state encodings are to be chosen invertible to retain the original size of the state space. Here we show how redundant non-invertible encodings enhance optimization by enriching the density of low-energy states. In addition, smooth landscapes may be established on encoded state spaces to guide local search dynamics towards the ground state. PMID:22496860
Directed Bee Colony Optimization Algorithm to Solve the Nurse Rostering Problem.
Rajeswari, M; Amudhavel, J; Pothula, Sujatha; Dhavachelvan, P
2017-01-01
The Nurse Rostering Problem (NRP) is an NP-hard combinatorial optimization and scheduling problem of assigning a set of nurses to shifts per day while considering both hard and soft constraints. A novel metaheuristic technique is required for solving the NRP. This work proposes a metaheuristic technique called the Directed Bee Colony Optimization Algorithm using the Modified Nelder-Mead Method for solving the NRP. To solve the NRP, the authors used a multiobjective mathematical programming model and proposed a methodology for the adaptation of a Multiobjective Directed Bee Colony Optimization (MODBCO). MODBCO is used successfully for solving multiobjective scheduling optimization problems. MODBCO is an integration of deterministic local search, a multiagent particle system environment, and the honey bee decision-making process. The performance of the algorithm is assessed using the standard dataset INRC2010, which reflects many real-world cases that vary in size and complexity. The experimental analysis uses statistical tools to show the uniqueness of the algorithm on the assessment criteria.
Directed Bee Colony Optimization Algorithm to Solve the Nurse Rostering Problem
Amudhavel, J.; Pothula, Sujatha; Dhavachelvan, P.
2017-01-01
The Nurse Rostering Problem (NRP) is an NP-hard combinatorial optimization and scheduling problem of assigning a set of nurses to shifts per day while considering both hard and soft constraints. A novel metaheuristic technique is required for solving the NRP. This work proposes a metaheuristic technique called the Directed Bee Colony Optimization Algorithm using the Modified Nelder-Mead Method for solving the NRP. To solve the NRP, the authors used a multiobjective mathematical programming model and proposed a methodology for the adaptation of a Multiobjective Directed Bee Colony Optimization (MODBCO). MODBCO is used successfully for solving multiobjective scheduling optimization problems. MODBCO is an integration of deterministic local search, a multiagent particle system environment, and the honey bee decision-making process. The performance of the algorithm is assessed using the standard dataset INRC2010, which reflects many real-world cases that vary in size and complexity. The experimental analysis uses statistical tools to show the uniqueness of the algorithm on the assessment criteria. PMID:28473849
A quantum annealing approach for fault detection and diagnosis of graph-based systems
NASA Astrophysics Data System (ADS)
Perdomo-Ortiz, A.; Fluegemann, J.; Narasimhan, S.; Biswas, R.; Smelyanskiy, V. N.
2015-02-01
Diagnosing the minimal set of faults capable of explaining a set of given observations, e.g., from sensor readouts, is a hard combinatorial optimization problem usually tackled with artificial intelligence techniques. We present the mapping of this combinatorial problem to quadratic unconstrained binary optimization (QUBO), and the experimental results of instances embedded onto a quantum annealing device with 509 quantum bits. Besides being the first time a quantum approach has been proposed for problems in the advanced diagnostics community, to the best of our knowledge this work is also the first research utilizing the route Problem → QUBO → Direct embedding into quantum hardware, where we are able to implement and tackle problem instances with sizes that go beyond previously reported toy-model proof-of-principle quantum annealing implementations; this is a significant leap in the solution of problems via direct-embedding adiabatic quantum optimization. We discuss some of the programmability challenges in the current generation of the quantum device as well as a few possible ways to extend this work to more complex arbitrary network graphs.
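For readers unfamiliar with the QUBO form mentioned above, the sketch below evaluates the energy E(x) = x^T Q x over binary vectors and brute-forces a tiny instance; the matrix and the exhaustive solver are purely illustrative (the paper embeds far larger instances directly on quantum hardware).

```python
from itertools import product

def qubo_energy(Q, x):
    """E(x) = sum_{i,j} Q[i][j] * x_i * x_j for a binary vector x."""
    n = len(x)
    return sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(n))

def brute_force_qubo(Q):
    """Exhaustively minimize a small QUBO (only feasible for a few bits)."""
    n = len(Q)
    return min(product((0, 1), repeat=n), key=lambda x: qubo_energy(Q, x))

# toy 3-variable QUBO: diagonal entries act as linear terms,
# off-diagonal entries as pairwise couplings
Q = [[-1,  2,  0],
     [ 0, -1,  2],
     [ 0,  0, -1]]
x_best = brute_force_qubo(Q)
print(x_best, qubo_energy(Q, x_best))   # (1, 0, 1) with energy -2
```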
Nash Social Welfare in Multiagent Resource Allocation
NASA Astrophysics Data System (ADS)
Ramezani, Sara; Endriss, Ulle
We study different aspects of the multiagent resource allocation problem when the objective is to find an allocation that maximizes Nash social welfare, the product of the utilities of the individual agents. The Nash solution is an important welfare criterion that combines efficiency and fairness considerations. We show that the problem of finding an optimal outcome is NP-hard for a number of different languages for representing agent preferences; we establish new results regarding convergence to Nash-optimal outcomes in a distributed negotiation framework; and we design and test algorithms similar to those applied in combinatorial auctions for computing such an outcome directly.
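A small sketch of the Nash social welfare objective discussed above, assuming additive agent utilities and a brute-force search over allocations; both are simplifying assumptions for illustration, not the preference languages or algorithms studied in the paper.

```python
from itertools import product
from math import prod

def nash_social_welfare(allocation, utilities):
    """allocation[g] = agent receiving good g; utilities[a][g] = value of
    good g to agent a (additive utilities assumed). Returns the product of
    the agents' total utilities."""
    totals = [0.0] * len(utilities)
    for g, a in enumerate(allocation):
        totals[a] += utilities[a][g]
    return prod(totals)

def best_allocation(utilities, n_goods):
    """Exhaustive search over all assignments of goods to agents."""
    n_agents = len(utilities)
    return max(product(range(n_agents), repeat=n_goods),
               key=lambda alloc: nash_social_welfare(alloc, utilities))

# 2 agents, 3 goods
utilities = [[4, 1, 2],
             [1, 3, 3]]
alloc = best_allocation(utilities, 3)
print(alloc, nash_social_welfare(alloc, utilities))
```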
Solving the Container Stowage Problem (CSP) using Particle Swarm Optimization (PSO)
NASA Astrophysics Data System (ADS)
Matsaini; Santosa, Budi
2018-04-01
The Container Stowage Problem (CSP) is the problem of arranging containers on ships subject to rules concerning total weight, the weight of each stack, destination, equilibrium, and the placement of containers on the vessel. The container stowage problem is a combinatorial problem that is hard to solve by enumeration; it is NP-hard. Therefore, to find a solution, metaheuristics are preferred. The objective of solving the problem is to minimize the amount of shifting so that the unloading time is minimized. Particle Swarm Optimization (PSO) is proposed to solve the problem. The implementation of PSO is combined with several rules: stack position change rules, stack changes based on destination, and stack changes based on the weight type of the stacks (light, medium, and heavy). The proposed method was applied to five different cases. The results were compared to Bee Swarm Optimization (BSO) and a heuristic method. Relative to the heuristic, PSO achieved a mean gap of 0.87% and a time gap of 60 seconds, while BSO achieved a mean gap of 2.98% and a time gap of 459.6 seconds.
Zalesak, J; Todt, J; Pitonak, R; Köpf, A; Weißenbacher, R; Sartory, B; Burghammer, M; Daniel, R; Keckes, J
2016-12-01
Because of the tremendous variability of crystallite sizes and shapes in nano-materials, it is challenging to assess the corresponding size-property relationships and to identify microstructures with particular physical properties or even optimized functions. This task is especially difficult for nanomaterials formed by self-organization, where the spontaneous evolution of microstructure and properties is coupled. In this work, two compositionally graded TiAlN films were (i) grown using chemical vapour deposition by applying a varying ratio of reacting gases and (ii) subsequently analysed using cross-sectional synchrotron X-ray nanodiffraction, electron microscopy and nanoindentation in order to evaluate the microstructure and hardness depth gradients. The results indicate the formation of self-organized hexagonal-cubic and cubic-cubic nanolamellae with varying compositions and thicknesses in the range of ∼3-15 nm across the film thicknesses, depending on the actual composition of the reactive gas mixtures. On the basis of the occurrence of the nanolamellae and their correlation with the local film hardness, progressively narrower ranges of the composition and hardness were refined in three steps. The third film was produced using an AlCl3/TiCl4 precursor ratio of ∼1.9, resulting in the formation of an optimized lamellar microstructure with ∼1.3 nm thick cubic Ti(Al)N and ∼12 nm thick cubic Al(Ti)N nanolamellae which exhibits a maximal hardness of ∼36 GPa and an indentation modulus of ∼522 GPa. The presented approach of an iterative nanoscale search based on the application of cross-sectional synchrotron X-ray nanodiffraction and cross-sectional nanoindentation allows one to refine the relationship between (i) varying deposition conditions, (ii) gradients of microstructure and (iii) gradients of mechanical properties in nanostructured materials prepared as thin films. This is done in a combinatorial way in order to screen a wide range of deposition conditions, while identifying those that result in the formation of a particular microstructure with optimized functional attributes.
Combinatorial Effects of Arginine and Fluoride on Oral Bacteria
Zheng, X.; Cheng, X.; Wang, L.; Qiu, W.; Wang, S.; Zhou, Y.; Li, M.; Li, Y.; Cheng, L.; Li, J.; Zhou, X.
2015-01-01
Dental caries is closely associated with the microbial disequilibrium between acidogenic/aciduric pathogens and alkali-generating commensal residents within the dental plaque. Fluoride is a widely used anticaries agent, which promotes tooth hard-tissue remineralization and suppresses bacterial activities. Recent clinical trials have shown that oral hygiene products containing both fluoride and arginine possess a greater anticaries effect compared with those containing fluoride alone, indicating synergy between fluoride and arginine in caries management. Here, we hypothesize that arginine may augment the ecological benefit of fluoride by enriching alkali-generating bacteria in the plaque biofilm and thus synergizes with fluoride in controlling dental caries. Specifically, we assessed the combinatory effects of NaF/arginine on planktonic and biofilm cultures of Streptococcus mutans, Streptococcus sanguinis, and Porphyromonas gingivalis with checkerboard microdilution assays. The optimal NaF/arginine combinations were selected, and their combinatory effects on microbial composition were further examined in single-, dual-, and 3-species biofilm using bacterial species–specific fluorescence in situ hybridization and quantitative polymerase chain reaction. We found that arginine synergized with fluoride in suppressing acidogenic S. mutans in both planktonic and biofilm cultures. In addition, the NaF/arginine combination synergistically reduced S. mutans but enriched S. sanguinis within the multispecies biofilms. More importantly, the optimal combination of NaF/arginine maintained a “streptococcal pressure” against the potential growth of oral anaerobe P. gingivalis within the alkalized biofilm. Taken together, we conclude that the combinatory application of fluoride and arginine has a potential synergistic effect in maintaining a healthy oral microbial equilibrium and thus represents a promising ecological approach to caries management. PMID:25477312
Hypergraph-Based Combinatorial Optimization of Matrix-Vector Multiplication
ERIC Educational Resources Information Center
Wolf, Michael Maclean
2009-01-01
Combinatorial scientific computing plays an important enabling role in computational science, particularly in high performance scientific computing. In this thesis, we will describe our work on optimizing matrix-vector multiplication using combinatorial techniques. Our research has focused on two different problems in combinatorial scientific…
Smooth Constrained Heuristic Optimization of a Combinatorial Chemical Space
2015-05-01
ARL-TR-7294, May 2015. US Army Research Laboratory. Smooth Constrained Heuristic Optimization of a Combinatorial Chemical Space, by Berend Christopher...
Neural Meta-Memes Framework for Combinatorial Optimization
NASA Astrophysics Data System (ADS)
Song, Li Qin; Lim, Meng Hiot; Ong, Yew Soon
In this paper, we present a Neural Meta-Memes Framework (NMMF) for combinatorial optimization. NMMF is a framework which models basic optimization algorithms as memes and manages them dynamically when solving combinatorial problems. NMMF encompasses neural networks which serve as the overall planner/coordinator to balance the workload between memes. We show the efficacy of the proposed NMMF through empirical study on a class of combinatorial problem, the quadratic assignment problem (QAP).
Combinatorial effects of arginine and fluoride on oral bacteria.
Zheng, X; Cheng, X; Wang, L; Qiu, W; Wang, S; Zhou, Y; Li, M; Li, Y; Cheng, L; Li, J; Zhou, X; Xu, X
2015-02-01
Dental caries is closely associated with the microbial disequilibrium between acidogenic/aciduric pathogens and alkali-generating commensal residents within the dental plaque. Fluoride is a widely used anticaries agent, which promotes tooth hard-tissue remineralization and suppresses bacterial activities. Recent clinical trials have shown that oral hygiene products containing both fluoride and arginine possess a greater anticaries effect compared with those containing fluoride alone, indicating synergy between fluoride and arginine in caries management. Here, we hypothesize that arginine may augment the ecological benefit of fluoride by enriching alkali-generating bacteria in the plaque biofilm and thus synergizes with fluoride in controlling dental caries. Specifically, we assessed the combinatory effects of NaF/arginine on planktonic and biofilm cultures of Streptococcus mutans, Streptococcus sanguinis, and Porphyromonas gingivalis with checkerboard microdilution assays. The optimal NaF/arginine combinations were selected, and their combinatory effects on microbial composition were further examined in single-, dual-, and 3-species biofilm using bacterial species-specific fluorescence in situ hybridization and quantitative polymerase chain reaction. We found that arginine synergized with fluoride in suppressing acidogenic S. mutans in both planktonic and biofilm cultures. In addition, the NaF/arginine combination synergistically reduced S. mutans but enriched S. sanguinis within the multispecies biofilms. More importantly, the optimal combination of NaF/arginine maintained a "streptococcal pressure" against the potential growth of oral anaerobe P. gingivalis within the alkalized biofilm. Taken together, we conclude that the combinatory application of fluoride and arginine has a potential synergistic effect in maintaining a healthy oral microbial equilibrium and thus represents a promising ecological approach to caries management. © International & American Associations for Dental Research 2014.
Pourhassan, Mojgan; Neumann, Frank
2018-06-22
The generalized travelling salesperson problem is an important NP-hard combinatorial optimization problem for which meta-heuristics, such as local search and evolutionary algorithms, have been used very successfully. Two hierarchical approaches with different neighbourhood structures, namely a Cluster-Based approach and a Node-Based approach, have been proposed by Hu and Raidl (2008) for solving this problem. In this paper, local search algorithms and simple evolutionary algorithms based on these approaches are investigated from a theoretical perspective. For local search algorithms, we point out the complementary abilities of the two approaches by presenting instances where they mutually outperform each other. Afterwards, we introduce an instance which is hard for both approaches when initialized on a particular point of the search space, but where a variable neighbourhood search combining them finds the optimal solution in polynomial time. Then we turn our attention to analysing the behaviour of simple evolutionary algorithms that use these approaches. We show that the Node-Based approach solves the hard instance of the Cluster-Based approach presented in Corus et al. (2016) in polynomial time. Furthermore, we prove an exponential lower bound on the optimization time of the Node-Based approach for a class of Euclidean instances.
Wieberger, Florian; Kolb, Tristan; Neuber, Christian; Ober, Christopher K; Schmidt, Hans-Werner
2013-04-08
In this article we present several developed and improved combinatorial techniques to optimize the processing conditions and material properties of organic thin films. The combinatorial approach allows the investigation of multi-variable dependencies and is an ideal tool for investigating organic thin films intended for high-performance applications. In this context we develop and establish the reliable preparation of gradients of material composition, temperature, exposure, and immersion time. Furthermore we demonstrate the smart application of combinations of composition and processing gradients to create combinatorial libraries. First, a binary combinatorial library is created by applying two gradients perpendicular to each other. A third gradient is carried out in very small areas and arranged matrix-like over the entire binary combinatorial library, resulting in a ternary combinatorial library. Ternary combinatorial libraries allow precise trends to be identified for the optimization of multi-variable-dependent processes, which is demonstrated on the lithographic patterning process. Here we verify conclusively the strong interaction, and thus the interdependency, of variables in the preparation and properties of complex organic thin film systems. The established gradient preparation techniques are not limited to lithographic patterning. It is possible to utilize and transfer the reported combinatorial techniques to other multi-variable-dependent processes and to investigate and optimize thin film layers and devices for optical, electro-optical, and electronic applications.
Number Partitioning via Quantum Adiabatic Computation
NASA Technical Reports Server (NTRS)
Smelyanskiy, Vadim N.; Toussaint, Udo
2002-01-01
We study both analytically and numerically the complexity of the adiabatic quantum evolution algorithm applied to random instances of combinatorial optimization problems. We use as an example the NP-complete set partition problem and obtain an asymptotic expression for the minimal gap separating the ground and excited states of a system during the execution of the algorithm. We show that for computationally hard problem instances the size of the minimal gap scales exponentially with the problem size. This result is in qualitative agreement with the direct numerical simulation of the algorithm for small instances of the set partition problem. We describe the statistical properties of the optimization problem that are responsible for the exponential behavior of the algorithm.
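To make the set partition cost function concrete: a spin assignment s_i in {-1, +1} splits the numbers into two subsets, and the energy is the squared residue E(s) = (sum_i s_i a_i)^2, which is zero exactly when the two subsets balance. The sketch below brute-forces a tiny instance; the instance and the exhaustive search are illustrative only.

```python
from itertools import product

def residue_energy(numbers, spins):
    """E(s) = (sum_i s_i * a_i)^2; zero iff the two subsets have equal sums."""
    return sum(s * a for s, a in zip(spins, numbers)) ** 2

def brute_force_partition(numbers):
    """Exact minimum over all 2^n spin assignments (small n only)."""
    return min(product((-1, 1), repeat=len(numbers)),
               key=lambda s: residue_energy(numbers, s))

nums = [8, 7, 6, 5, 4]
s = brute_force_partition(nums)
print(s, residue_energy(nums, s))   # a perfect partition has energy 0
```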
NASA Astrophysics Data System (ADS)
Doerr, Timothy P.; Alves, Gelio; Yu, Yi-Kuo
2005-08-01
Typical combinatorial optimizations are NP-hard; however, for a particular class of cost functions the corresponding combinatorial optimizations can be solved in polynomial time using the transfer matrix technique or, equivalently, the dynamic programming approach. This suggests a way to efficiently find approximate solutions-find a transformation that makes the cost function as similar as possible to that of the solvable class. After keeping many high-ranking solutions using the approximate cost function, one may then re-assess these solutions with the full cost function to find the best approximate solution. Under this approach, it is important to be able to assess the quality of the solutions obtained, e.g., by finding the true ranking of the kth best approximate solution when all possible solutions are considered exhaustively. To tackle this statistical issue, we provide a systematic method starting with a scaling function generated from the finite number of high-ranking solutions followed by a convergent iterative mapping. This method, useful in a variant of the directed paths in random media problem proposed here, can also provide a statistical significance assessment for one of the most important proteomic tasks-peptide sequencing using tandem mass spectrometry data. For directed paths in random media, the scaling function depends on the particular realization of randomness; in the mass spectrometry case, the scaling function is spectrum-specific.
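A loose sketch of the keep-and-re-rank strategy described above, under assumed toy cost functions: a cheap surrogate cost selects the top-k candidates, which are then re-scored with the full cost. The surrogate, the instance, and the exhaustive candidate generator are illustrative and are not the transfer-matrix construction used by the authors.

```python
import heapq
from itertools import permutations

def rerank_top_k(candidates, approx_cost, full_cost, k=10):
    """Keep the k best candidates under the cheap surrogate cost, then
    re-assess only those with the expensive full cost."""
    shortlist = heapq.nsmallest(k, candidates, key=approx_cost)
    return min(shortlist, key=full_cost)

# toy example: tours over 6 points, scored by a surrogate (x-distances only)
# versus the full Euclidean tour length
pts = [(0, 0), (3, 4), (1, 1), (5, 2), (2, 6), (4, 3)]

def full_cost(tour):
    return sum(((pts[a][0] - pts[b][0]) ** 2 + (pts[a][1] - pts[b][1]) ** 2) ** 0.5
               for a, b in zip(tour, tour[1:] + tour[:1]))

def approx_cost(tour):
    return sum(abs(pts[a][0] - pts[b][0]) for a, b in zip(tour, tour[1:] + tour[:1]))

best = rerank_top_k(list(permutations(range(6))), approx_cost, full_cost, k=20)
print(best, round(full_cost(best), 3))
```

The point of the strategy is that the expensive full cost is evaluated only k times instead of over the whole candidate space.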
Hernando, Leticia; Mendiburu, Alexander; Lozano, Jose A
2013-01-01
The solution of many combinatorial optimization problems is carried out by metaheuristics, which generally make use of local search algorithms. These algorithms use some kind of neighborhood structure over the search space. The performance of the algorithms strongly depends on the properties that the neighborhood imposes on the search space. One of these properties is the number of local optima. Given an instance of a combinatorial optimization problem and a neighborhood, the estimation of the number of local optima can help not only to measure the complexity of the instance, but also to choose the most convenient neighborhood to solve it. In this paper we review and evaluate several methods to estimate the number of local optima in combinatorial optimization problems. The methods reviewed not only come from the combinatorial optimization literature, but also from the statistical literature. A thorough evaluation in synthetic as well as real problems is given. We conclude by providing recommendations of methods for several scenarios.
ERIC Educational Resources Information Center
Kolata, Gina
1985-01-01
To determine how hard it is for computers to solve problems, researchers have classified groups of problems (polynomial hierarchy) according to how much time they seem to require for their solutions. A difficult and complex proof is offered which shows that a combinatorial approach (using Boolean circuits) may resolve the problem. (JN)
Lim, Wee Loon; Wibowo, Antoni; Desa, Mohammad Ishak; Haron, Habibollah
2016-01-01
The quadratic assignment problem (QAP) is an NP-hard combinatorial optimization problem with a wide variety of applications. Biogeography-based optimization (BBO), a relatively new optimization technique based on the biogeography concept, uses the idea of the migration strategy of species to derive algorithms for solving optimization problems. It has been shown that BBO provides performance on a par with other optimization methods. A classical BBO algorithm employs the mutation operator as its diversification strategy. However, this process will often ruin the quality of solutions in QAP. In this paper, we propose a hybrid technique to overcome the weakness of the classical BBO algorithm in solving QAP, by replacing the mutation operator with a tabu search procedure. Our experiments using the benchmark instances from QAPLIB show that the proposed hybrid method is able to find good solutions for them within reasonable computational times. Out of 61 benchmark instances tested, the proposed method is able to obtain the best known solutions for 57 of them. PMID:26819585
Lim, Wee Loon; Wibowo, Antoni; Desa, Mohammad Ishak; Haron, Habibollah
2016-01-01
The quadratic assignment problem (QAP) is an NP-hard combinatorial optimization problem with a wide variety of applications. Biogeography-based optimization (BBO), a relatively new optimization technique based on the biogeography concept, uses the idea of the migration strategy of species to derive algorithms for solving optimization problems. It has been shown that BBO provides performance on a par with other optimization methods. A classical BBO algorithm employs the mutation operator as its diversification strategy. However, this process will often ruin the quality of solutions in QAP. In this paper, we propose a hybrid technique to overcome the weakness of the classical BBO algorithm in solving QAP, by replacing the mutation operator with a tabu search procedure. Our experiments using the benchmark instances from QAPLIB show that the proposed hybrid method is able to find good solutions for them within reasonable computational times. Out of 61 benchmark instances tested, the proposed method is able to obtain the best known solutions for 57 of them.
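For reference, the QAP objective optimized by the hybrid method above is the sum over all pairs i, j of flow[i][j] * dist[perm[i]][perm[j]], taken over permutations perm. The sketch below evaluates it and applies a plain pairwise-swap local search; this is a simplification for illustration, not the BBO-with-tabu-search algorithm of the paper.

```python
from itertools import combinations

def qap_cost(perm, flow, dist):
    """Cost of assigning facility i to location perm[i]."""
    n = len(perm)
    return sum(flow[i][j] * dist[perm[i]][perm[j]]
               for i in range(n) for j in range(n))

def swap_local_search(perm, flow, dist):
    """First-improvement local search over pairwise swaps."""
    perm = list(perm)
    best = qap_cost(perm, flow, dist)
    improved = True
    while improved:
        improved = False
        for i, j in combinations(range(len(perm)), 2):
            perm[i], perm[j] = perm[j], perm[i]
            c = qap_cost(perm, flow, dist)
            if c < best:
                best, improved = c, True
            else:
                perm[i], perm[j] = perm[j], perm[i]   # undo the swap
    return perm, best

# toy 3x3 instance
flow = [[0, 3, 1], [3, 0, 2], [1, 2, 0]]
dist = [[0, 1, 4], [1, 0, 2], [4, 2, 0]]
print(swap_local_search([0, 1, 2], flow, dist))
```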
A comparison of approaches for finding minimum identifying codes on graphs
NASA Astrophysics Data System (ADS)
Horan, Victoria; Adachi, Steve; Bak, Stanley
2016-05-01
In order to formulate mathematical conjectures likely to be true, a number of base cases must be determined. However, many combinatorial problems are NP-hard and the computational complexity makes this research approach difficult using a standard brute force approach on a typical computer. One sample problem explored is that of finding a minimum identifying code. To work around the computational issues, a variety of methods are explored and consist of a parallel computing approach using MATLAB, an adiabatic quantum optimization approach using a D-Wave quantum annealing processor, and lastly using satisfiability modulo theory (SMT) and corresponding SMT solvers. Each of these methods requires the problem to be formulated in a unique manner. In this paper, we address the challenges of computing solutions to this NP-hard problem with respect to each of these methods.
Using Ant Colony Optimization for Routing in VLSI Chips
NASA Astrophysics Data System (ADS)
Arora, Tamanna; Moses, Melanie
2009-04-01
Rapid advances in VLSI technology have increased the number of transistors that fit on a single chip to about two billion. A frequent problem in the design of such high performance and high density VLSI layouts is that of routing the wires that connect such large numbers of components. Most wire-routing problems are computationally hard. The quality of any routing algorithm is judged by the extent to which it satisfies routing constraints and design objectives. Some of the broader design objectives include minimizing total routed wire length and minimizing total capacitance induced in the chip, both of which serve to minimize the power consumed by the chip. Ant Colony Optimization (ACO) algorithms provide a multi-agent framework for combinatorial optimization by combining memory, stochastic decisions, and strategies of collective and distributed learning by ant-like agents. This paper applies ACO to the NP-hard problem of finding optimal routes for interconnect routing on VLSI chips. The constraints on interconnect routing are used by the ants as heuristics which guide their search process. We found that ACO algorithms were able to successfully incorporate multiple constraints and route interconnects on a suite of benchmark chips. On average, the algorithm routed with a total wire length 5.5% less than other established routing algorithms.
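A compact sketch of the ACO mechanics described above (pheromone-and-heuristic tour construction followed by evaporation and reinforcement), applied to a toy TSP rather than to VLSI interconnect routing; all parameter values and the reinforcement rule are illustrative assumptions.

```python
import math, random

def aco_tsp(coords, n_ants=10, iters=100, alpha=1.0, beta=3.0, rho=0.5, seed=0):
    """Basic ant system: ants build tours with probability proportional to
    pheromone**alpha * (1/distance)**beta; pheromone then evaporates and is
    reinforced along the best tour found so far."""
    rng = random.Random(seed)
    n = len(coords)
    d = [[math.dist(coords[i], coords[j]) or 1e-9 for j in range(n)] for i in range(n)]
    tau = [[1.0] * n for _ in range(n)]

    def tour_len(t):
        return sum(d[a][b] for a, b in zip(t, t[1:] + t[:1]))

    best, best_len = None, float("inf")
    for _ in range(iters):
        for _ in range(n_ants):
            tour = [rng.randrange(n)]
            unvisited = set(range(n)) - {tour[0]}
            while unvisited:
                i, cand = tour[-1], list(unvisited)
                w = [(tau[i][j] ** alpha) * ((1.0 / d[i][j]) ** beta) for j in cand]
                nxt = rng.choices(cand, weights=w)[0]
                tour.append(nxt)
                unvisited.remove(nxt)
            L = tour_len(tour)
            if L < best_len:
                best, best_len = tour, L
        for i in range(n):                                 # evaporation
            for j in range(n):
                tau[i][j] *= (1.0 - rho)
        for a, b in zip(best, best[1:] + best[:1]):        # reinforcement
            tau[a][b] += 1.0 / best_len
            tau[b][a] += 1.0 / best_len
    return best, best_len

coords = [(0, 0), (0, 2), (2, 2), (2, 0), (1, 3)]
print(aco_tsp(coords))
```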
Causal gene identification using combinatorial V-structure search.
Cai, Ruichu; Zhang, Zhenjie; Hao, Zhifeng
2013-07-01
With the advances of biomedical techniques in the last decade, the costs of human genomic sequencing and genomic activity monitoring are coming down rapidly. To support the huge genome-based business in the near future, researchers are eager to find killer applications based on human genome information. Causal gene identification is one of the most promising applications, which may help potential patients to estimate the risk of certain genetic diseases and locate the target gene for further genetic therapy. Unfortunately, existing pattern recognition techniques, such as Bayesian networks, cannot be directly applied to find the accurate causal relationship between genes and diseases. This is mainly due to the insufficient number of samples and the extremely high dimensionality of the gene space. In this paper, we present the first practical solution to causal gene identification, utilizing a new combinatorial formulation over V-Structures commonly used in conventional Bayesian networks, by exploring the combinations of significant V-Structures. We prove the NP-hardness of the combinatorial search problem under general settings on the significance measure on the V-Structures, and present a greedy algorithm to find sub-optimal results. Extensive experiments show that our proposal is both scalable and effective, particularly with interesting findings on the causal genes over real human genome data. Copyright © 2013 Elsevier Ltd. All rights reserved.
The checkpoint ordering problem
Hungerländer, P.
2017-01-01
We suggest a new variant of a row layout problem: Find an ordering of n departments with given lengths such that the total weighted sum of their distances to a given checkpoint is minimized. The Checkpoint Ordering Problem (COP) is of both theoretical and practical interest. It has several applications and is conceptually related to some well-studied combinatorial optimization problems, namely the Single-Row Facility Layout Problem, the Linear Ordering Problem and a variant of parallel machine scheduling. In this paper we study the complexity of the COP and its special cases. The general version of the COP with an arbitrary but fixed number of checkpoints is NP-hard in the weak sense. We propose both a dynamic programming algorithm and an integer linear programming approach for the COP. Our computational experiments indicate that the COP is hard to solve in practice. While the run time of the dynamic programming algorithm strongly depends on the lengths of the departments, the integer linear programming approach is able to solve instances with up to 25 departments to optimality. PMID:29170574
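A sketch of the COP objective described above: departments are placed consecutively in a given order and the weighted distances of their centres to the checkpoint are summed. The consecutive layout starting from position 0, the centre-based distance, the instance data, and the brute-force search are modelling assumptions for illustration only.

```python
from itertools import permutations

def cop_cost(order, lengths, weights, checkpoint):
    """Weighted sum of distances from department centres to the checkpoint,
    with departments placed consecutively from position 0 in the given order."""
    pos, cost = 0.0, 0.0
    for dept in order:
        centre = pos + lengths[dept] / 2.0
        cost += weights[dept] * abs(centre - checkpoint)
        pos += lengths[dept]
    return cost

def brute_force_cop(lengths, weights, checkpoint):
    """Exhaustive search over all orderings (small n only)."""
    n = len(lengths)
    return min(permutations(range(n)),
               key=lambda o: cop_cost(o, lengths, weights, checkpoint))

lengths, weights = [4, 2, 6, 3], [1, 5, 2, 4]
order = brute_force_cop(lengths, weights, checkpoint=0.0)
print(order, cop_cost(order, lengths, weights, 0.0))
```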
Multiple-variable neighbourhood search for the single-machine total weighted tardiness problem
NASA Astrophysics Data System (ADS)
Chung, Tsui-Ping; Fu, Qunjie; Liao, Ching-Jong; Liu, Yi-Ting
2017-07-01
The single-machine total weighted tardiness (SMTWT) problem is a typical discrete combinatorial optimization problem in the scheduling literature. The problem has been proved to be NP-hard and thus provides a challenging area for metaheuristics, especially the variable neighbourhood search algorithm. In this article, a multiple variable neighbourhood search (m-VNS) algorithm with multiple neighbourhood structures is proposed to solve the problem. Special mechanisms, named matching and strengthening operations, are employed in the algorithm, which has an auto-revising local search procedure to explore the solution space beyond local optimality. Two aspects, searching direction and searching depth, are considered, and neighbourhood structures are systematically exchanged. Experimental results show that the proposed m-VNS algorithm outperforms all the compared algorithms in solving the SMTWT problem.
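The total weighted tardiness objective minimized by the m-VNS can be sketched as follows, together with a simple adjacent-swap descent; the tiny instance and this single neighbourhood are illustrative assumptions, not the paper's neighbourhood structures.

```python
def total_weighted_tardiness(sequence, p, d, w):
    """Sum over jobs of w_j * max(0, C_j - d_j) on a single machine, where
    C_j is the completion time of job j under the given sequence."""
    t, cost = 0, 0
    for j in sequence:
        t += p[j]                          # completion time of job j
        cost += w[j] * max(0, t - d[j])    # weighted tardiness
    return cost

def adjacent_swap_descent(sequence, p, d, w):
    """Simple first-improvement local search over adjacent transpositions."""
    seq = list(sequence)
    best = total_weighted_tardiness(seq, p, d, w)
    improved = True
    while improved:
        improved = False
        for i in range(len(seq) - 1):
            seq[i], seq[i + 1] = seq[i + 1], seq[i]
            c = total_weighted_tardiness(seq, p, d, w)
            if c < best:
                best, improved = c, True
            else:
                seq[i], seq[i + 1] = seq[i + 1], seq[i]   # undo the swap
    return seq, best

# toy instance: processing times, due dates, weights
p, d, w = [3, 2, 4, 1], [4, 6, 5, 3], [2, 1, 3, 4]
print(adjacent_swap_descent([0, 1, 2, 3], p, d, w))
```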
Bifurcation-based adiabatic quantum computation with a nonlinear oscillator network.
Goto, Hayato
2016-02-22
The dynamics of nonlinear systems qualitatively change depending on their parameters, which is called bifurcation. A quantum-mechanical nonlinear oscillator can yield a quantum superposition of two oscillation states, known as a Schrödinger cat state, via quantum adiabatic evolution through its bifurcation point. Here we propose a quantum computer comprising such quantum nonlinear oscillators, instead of quantum bits, to solve hard combinatorial optimization problems. The nonlinear oscillator network finds optimal solutions via quantum adiabatic evolution, where nonlinear terms are increased slowly, in contrast to conventional adiabatic quantum computation or quantum annealing, where quantum fluctuation terms are decreased slowly. As a result of numerical simulations, it is concluded that quantum superposition and quantum fluctuation work effectively to find optimal solutions. It is also notable that the present computer is analogous to neural computers, which are also networks of nonlinear components. Thus, the present scheme will open new possibilities for quantum computation, nonlinear science, and artificial intelligence.
Discrete Optimization Model for Vehicle Routing Problem with Scheduling Side Constraints
NASA Astrophysics Data System (ADS)
Juliandri, Dedy; Mawengkang, Herman; Bu'ulolo, F.
2018-01-01
The Vehicle Routing Problem (VRP) is an important element of many logistic systems involving the routing and scheduling of vehicles from a depot to a set of customer nodes. It is a hard combinatorial optimization problem with the objective of finding an optimal set of routes used by a fleet of vehicles to serve the demands of a set of customers; the vehicles are required to return to the depot after serving the customers' demand. The problem incorporates time windows, fleet and driver scheduling, and pick-up and delivery within the planning horizon. The goal is to determine the fleet and driver scheduling and the routing policies of the vehicles. The objective is to minimize the overall costs of all routes over the planning horizon. We model the problem as a linear mixed integer program and develop a combination of heuristics and an exact method for solving the model.
Bifurcation-based adiabatic quantum computation with a nonlinear oscillator network
NASA Astrophysics Data System (ADS)
Goto, Hayato
2016-02-01
The dynamics of nonlinear systems qualitatively change depending on their parameters, which is called bifurcation. A quantum-mechanical nonlinear oscillator can yield a quantum superposition of two oscillation states, known as a Schrödinger cat state, via quantum adiabatic evolution through its bifurcation point. Here we propose a quantum computer comprising such quantum nonlinear oscillators, instead of quantum bits, to solve hard combinatorial optimization problems. The nonlinear oscillator network finds optimal solutions via quantum adiabatic evolution, where nonlinear terms are increased slowly, in contrast to conventional adiabatic quantum computation or quantum annealing, where quantum fluctuation terms are decreased slowly. As a result of numerical simulations, it is concluded that quantum superposition and quantum fluctuation work effectively to find optimal solutions. It is also notable that the present computer is analogous to neural computers, which are also networks of nonlinear components. Thus, the present scheme will open new possibilities for quantum computation, nonlinear science, and artificial intelligence.
An Integrated Method Based on PSO and EDA for the Max-Cut Problem.
Lin, Geng; Guan, Jian
2016-01-01
The max-cut problem is an NP-hard combinatorial optimization problem with many real-world applications. In this paper, we propose an integrated method based on particle swarm optimization and estimation of distribution algorithms (PSO-EDA) for solving the max-cut problem. The integrated algorithm overcomes the shortcomings of particle swarm optimization and estimation of distribution algorithms. To enhance the performance of the PSO-EDA, a fast local search procedure is applied. In addition, a path relinking procedure is developed to intensify the search. To evaluate the performance of PSO-EDA, extensive experiments were carried out on two sets of benchmark instances with 800 to 20,000 vertices from the literature. Computational results and comparisons show that PSO-EDA significantly outperforms the existing PSO-based and EDA-based algorithms for the max-cut problem. Compared with other best performing algorithms, PSO-EDA is able to find very competitive results in terms of solution quality.
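The max-cut objective targeted by PSO-EDA, together with the kind of fast single-flip local search mentioned above, can be sketched as follows; the toy graph and the loop structure are assumptions for illustration only.

```python
def cut_value(partition, edges, weights=None):
    """Total weight of edges whose endpoints lie on different sides
    (partition[v] is 0 or 1)."""
    if weights is None:
        weights = [1.0] * len(edges)
    return sum(w for (u, v), w in zip(edges, weights) if partition[u] != partition[v])

def single_flip_descent(partition, edges, weights=None):
    """Flip any vertex that increases the cut, until no flip helps."""
    part = list(partition)
    best = cut_value(part, edges, weights)
    improved = True
    while improved:
        improved = False
        for v in range(len(part)):
            part[v] ^= 1
            c = cut_value(part, edges, weights)
            if c > best:
                best, improved = c, True
            else:
                part[v] ^= 1                   # undo the flip
    return part, best

# toy graph: a 4-cycle plus one chord
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
print(single_flip_descent([0, 0, 0, 0], edges))
```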
Solving optimization problems by the public goods game
NASA Astrophysics Data System (ADS)
Javarone, Marco Alberto
2017-09-01
We introduce a method based on the Public Goods Game for solving optimization tasks. In particular, we focus on the Traveling Salesman Problem, an NP-hard problem whose search space grows exponentially with the number of cities. The proposed method considers a population whose agents are each provided with a random solution to the given problem. Agents interact by playing the Public Goods Game, using the fitness of their solutions as the currency of the game. Notably, agents with better solutions provide higher contributions, while those with worse ones tend to imitate the solutions of richer agents to increase their fitness. Numerical simulations show that the proposed method allows exact solutions, as well as suboptimal ones, to be computed in the considered search spaces. As a result, beyond proposing a new heuristic for combinatorial optimization problems, our work aims to highlight the potential of evolutionary game theory beyond its current horizons.
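A loose sketch of the imitation dynamic described above on a toy TSP: agents hold candidate tours, and in each round a poorer agent copies (and slightly mutates) the tour of a better-performing peer. The payoff accounting of the actual Public Goods Game is omitted here, so this should be read as illustrative only.

```python
import math, random

def imitation_tsp(coords, n_agents=20, rounds=300, seed=0):
    """Population-based heuristic: fitness = -tour length; each round every
    agent compares itself with a random peer and, if worse, copies the
    peer's tour and applies a small two-city swap mutation."""
    rng = random.Random(seed)
    n = len(coords)

    def length(t):
        return sum(math.dist(coords[a], coords[b]) for a, b in zip(t, t[1:] + t[:1]))

    agents = []
    for _ in range(n_agents):
        t = list(range(n))
        rng.shuffle(t)
        agents.append(t)

    for _ in range(rounds):
        for i in range(n_agents):
            j = rng.randrange(n_agents)
            if length(agents[i]) > length(agents[j]):   # agent i is poorer: imitate j
                t = agents[j][:]
                a, b = rng.sample(range(n), 2)
                t[a], t[b] = t[b], t[a]                  # small mutation
                agents[i] = t

    best = min(agents, key=length)
    return best, length(best)

coords = [(0, 0), (0, 3), (3, 3), (3, 0), (1, 1), (2, 2)]
print(imitation_tsp(coords))
```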
Lexicographic goal programming and assessment tools for a combinatorial production problem.
DOT National Transportation Integrated Search
2008-01-01
NP-complete combinatorial problems often necessitate the use of near-optimal solution techniques including heuristics and metaheuristics. The addition of multiple optimization criteria can further complicate comparison of these solution technique...
Microbatteries for Combinatorial Studies of Conventional Lithium-Ion Batteries
NASA Technical Reports Server (NTRS)
West, William; Whitacre, Jay; Bugga, Ratnakumar
2003-01-01
Integrated arrays of microscopic solid-state batteries have been demonstrated in a continuing effort to develop microscopic sources of power and of voltage reference circuits to be incorporated into low-power integrated circuits. Perhaps even more importantly, arrays of microscopic batteries can be fabricated and tested in combinatorial experiments directed toward optimization and discovery of battery materials. The value of the combinatorial approach to optimization and discovery has been proven in the optoelectronic, pharmaceutical, and bioengineering industries. Depending on the specific application, the combinatorial approach can involve the investigation of hundreds or even thousands of different combinations; hence, it is time-consuming and expensive to attempt to implement the combinatorial approach by building and testing full-size, discrete cells and batteries. The conception of microbattery arrays makes it practical to bring the advantages of the combinatorial approach to the development of batteries.
Block clustering based on difference of convex functions (DC) programming and DC algorithms.
Le, Hoai Minh; Le Thi, Hoai An; Dinh, Tao Pham; Huynh, Van Ngai
2013-10-01
We investigate difference of convex functions (DC) programming and the DC algorithm (DCA) to solve the block clustering problem in the continuous framework, which traditionally requires solving a hard combinatorial optimization problem. DC reformulation techniques and exact penalty in DC programming are developed to build an appropriate equivalent DC program of the block clustering problem. They lead to an elegant and explicit DCA scheme for the resulting DC program. Computational experiments show the robustness and efficiency of the proposed algorithm and its superiority over standard algorithms such as two-mode K-means, two-mode fuzzy clustering, and block classification EM.
Aghamohammadi, Hossein; Saadi Mesgari, Mohammad; Molaei, Damoon; Aghamohammadi, Hasan
2013-01-01
Location-allocation is a combinatorial optimization problem and is non-deterministic polynomial-time hard (NP-hard). Therefore, the solution approach for such a problem should shift from exact to heuristic or metaheuristic methods due to the complexity of the problem. Locating medical centers and allocating the injured of an earthquake to them is of high importance in earthquake disaster management, so developing a proper method will reduce the time of the relief operation and will consequently decrease the number of fatalities. This paper presents the development of a heuristic method based on two nested genetic algorithms to optimize this location-allocation problem by using the abilities of a Geographic Information System (GIS). In the proposed method, the outer genetic algorithm is applied to the location part of the problem and the inner genetic algorithm is used to optimize the resource allocation. The final outcome of the implemented method includes the spatial locations of the new required medical centers. The method also calculates how many of the injured at each demand point should be taken to each of the existing and new medical centers. The results of the proposed method showed the high performance of the designed structure in solving a capacitated location-allocation problem that may arise in a disaster situation when injured people have to be taken to medical centers in a reasonable time.
Bifurcation-based approach reveals synergism and optimal combinatorial perturbation.
Liu, Yanwei; Li, Shanshan; Liu, Zengrong; Wang, Ruiqi
2016-06-01
Cells accomplish the process of fate decisions and form terminal lineages through a series of binary choices in which cells switch stable states from one branch to another as the interacting strengths of regulatory factors continuously vary. Various combinatorial effects may occur because almost all regulatory processes are managed in a combinatorial fashion. Combinatorial regulation is crucial for cell fate decisions because it may effectively integrate many different signaling pathways to meet the higher regulation demand during cell development. However, whether the contribution of combinatorial regulation to the state transition is better than that of a single perturbation and, if so, what the optimal combination strategy is, seem to be significant issues from the point of view of both biology and mathematics. Using the approaches of combinatorial perturbations and bifurcation analysis, we provide a general framework for the quantitative analysis of synergism in molecular networks. Different from the known methods, the bifurcation-based approach depends only on stable state responses to stimuli because the state transition induced by combinatorial perturbations occurs between stable states. More importantly, an optimal combinatorial perturbation strategy can be determined by investigating the relationship between the bifurcation curve of a synergistic perturbation pair and the level set of a specific objective function. The approach is applied to two models, i.e., a theoretical multistable decision model and a biologically realistic CREB model, to show its validity, although the approach holds for a general class of biological systems.
Gobin, Oliver C; Schüth, Ferdi
2008-01-01
Genetic algorithms are widely used to solve and optimize combinatorial problems and are more often applied for library design in combinatorial chemistry. Because of their flexibility, however, their implementation can be challenging. In this study, the influence of the representation of solid catalysts on the performance of genetic algorithms was systematically investigated on the basis of a new, constrained, multiobjective, combinatorial test problem with properties common to problems in combinatorial materials science. Constraints were satisfied by penalty functions, repair algorithms, or special representations. The tests were performed using three state-of-the-art evolutionary multiobjective algorithms by performing 100 optimization runs for each algorithm and test case. Experimental data obtained during the optimization of a noble metal-free solid catalyst system active in the selective catalytic reduction of nitric oxide with propene was used to build up a predictive model to validate the results of the theoretical test problem. A significant influence of the representation on the optimization performance was observed. Binary encodings were found to be the preferred encoding in most of the cases, and depending on the experimental test unit, repair algorithms or penalty functions performed best.
Quantum Resonance Approach to Combinatorial Optimization
NASA Technical Reports Server (NTRS)
Zak, Michail
1997-01-01
It is shown that quantum resonance can be used for combinatorial optimization. The advantage of the approach is that the computing time is independent of the dimensionality of the problem. As an example, the solution to a constraint satisfaction problem of exponential complexity is demonstrated.
A methodology to find the elementary landscape decomposition of combinatorial optimization problems.
Chicano, Francisco; Whitley, L Darrell; Alba, Enrique
2011-01-01
A small number of combinatorial optimization problems have search spaces that correspond to elementary landscapes, where the objective function f is an eigenfunction of the Laplacian that describes the neighborhood structure of the search space. Many problems are not elementary; however, the objective function of a combinatorial optimization problem can always be expressed as a superposition of multiple elementary landscapes if the underlying neighborhood used is symmetric. This paper presents theoretical results that provide the foundation for algebraic methods that can be used to decompose the objective function of an arbitrary combinatorial optimization problem into a sum of subfunctions, where each subfunction is an elementary landscape. Many steps of this process can be automated, and indeed a software tool could be developed that assists the researcher in finding a landscape decomposition. This methodology is then used to show that the subset sum problem is a superposition of two elementary landscapes, and to show that the quadratic assignment problem is a superposition of three elementary landscapes.
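As a concrete illustration of what "elementary" means, the sketch below numerically verifies Grover's wave equation, avg over N(x) of f(y) = f(x) + (k/d)(f̄ − f(x)), for the OneMax function under the single-bit-flip neighborhood (an assumed toy landscape, not one of the problems decomposed in the paper).

```python
from itertools import product

# Check Grover's wave equation for an elementary landscape:
#   avg_{y in N(x)} f(y) = f(x) + (k/d) * (f_bar - f(x))
# Toy case: OneMax on n-bit strings, neighborhood = single bit flips (d = n), k = 2.
n = 8
f = lambda x: sum(x)                                 # OneMax objective
space = list(product((0, 1), repeat=n))
f_bar = sum(f(x) for x in space) / len(space)        # mean over the whole search space

k, d = 2, n
for x in space:
    neigh_avg = sum(f(x[:i] + (1 - x[i],) + x[i + 1:]) for i in range(n)) / n
    expected = f(x) + (k / d) * (f_bar - f(x))
    assert abs(neigh_avg - expected) < 1e-9

print("OneMax satisfies the wave equation with k =", k, "and d =", d)
```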
Optimizing Irregular Applications for Energy and Performance on the Tilera Many-core Architecture
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chavarría-Miranda, Daniel; Panyala, Ajay R.; Halappanavar, Mahantesh
Optimizing applications simultaneously for energy and performance is a complex problem. High performance, parallel, irregular applications are notoriously hard to optimize due to their data-dependent memory accesses, lack of structured locality and complex data structures and code patterns. Irregular kernels are growing in importance in applications such as machine learning, graph analytics and combinatorial scientific computing. Performance- and energy-efficient implementation of these kernels on modern, energy efficient, multicore and many-core platforms is therefore an important and challenging problem. We present results from optimizing two irregular applications, the Louvain method for community detection (Grappolo) and high-performance conjugate gradient (HPCCG), on the Tilera many-core system. We have significantly extended MIT's OpenTuner auto-tuning framework to conduct a detailed study of platform-independent and platform-specific optimizations to improve performance as well as reduce total energy consumption. We explore the optimization design space along three dimensions: memory layout schemes, compiler-based code transformations, and optimization of parallel loop schedules. Using auto-tuning, we demonstrate whole node energy savings of up to 41% relative to a baseline instantiation, and up to 31% relative to manually optimized variants.
Chen, Xin; Wu, Qiong; Sun, Ruimin; Zhang, Louxin
2012-01-01
The discovery of single-nucleotide polymorphisms (SNPs) has important implications in a variety of genetic studies on human diseases and biological functions. One valuable approach proposed for SNP discovery is based on base-specific cleavage and mass spectrometry. However, it is still very challenging to achieve the full potential of this SNP discovery approach. In this study, we formulate two new combinatorial optimization problems. While both problems are aimed at reconstructing the sample sequence that would attain the minimum number of SNPs, they search over different candidate sequence spaces. The first problem, denoted as SNP-MSP, limits its search to sequences whose in silico predicted mass spectra have all their signals contained in the measured mass spectra. In contrast, the second problem, denoted as SNP-MSQ, limits its search to sequences whose in silico predicted mass spectra instead contain all the signals of the measured mass spectra. We present an exact dynamic programming algorithm for solving the SNP-MSP problem and also show that the SNP-MSQ problem is NP-hard by a reduction from a restricted variation of the 3-partition problem. We believe that an efficient solution to either problem above could offer a seamless integration of information in four complementary base-specific cleavage reactions, thereby improving the capability of the underlying biotechnology for sensitive and accurate SNP discovery.
NASA Astrophysics Data System (ADS)
Dao, Son Duy; Abhary, Kazem; Marian, Romeo
2017-06-01
Integration of production planning and scheduling is a class of problems commonly found in manufacturing industry. This class of problems associated with precedence constraint has been previously modeled and optimized by the authors, in which, it requires a multidimensional optimization at the same time: what to make, how many to make, where to make and the order to make. It is a combinatorial, NP-hard problem, for which no polynomial time algorithm is known to produce an optimal result on a random graph. In this paper, the further development of Genetic Algorithm (GA) for this integrated optimization is presented. Because of the dynamic nature of the problem, the size of its solution is variable. To deal with this variability and find an optimal solution to the problem, GA with new features in chromosome encoding, crossover, mutation, selection as well as algorithm structure is developed herein. With the proposed structure, the proposed GA is able to "learn" from its experience. Robustness of the proposed GA is demonstrated by a complex numerical example in which performance of the proposed GA is compared with those of three commercial optimization solvers.
Nonlinear Multidimensional Assignment Problems Efficient Conic Optimization Methods and Applications
2015-06-24
Performing organization: Arizona State University, School of Mathematical & Statistical Sciences. Abstract: The major goals of this project were completed: the exact solution of previously unsolved challenging combinatorial optimization… combinatorial optimization problem, the Directional Sensor Problem, was solved in two ways. First, heuristically in an engineering fashion and second, exactly…
TARCMO: Theory and Algorithms for Robust, Combinatorial, Multicriteria Optimization
2016-11-28
Contents fragments: 4.6 On the Recoverable Robust Traveling Salesman Problem; 4.7 A Bicriteria Approach to Robust Optimization… The traveling salesman problem (TSP) is a well-known combinatorial optimization… procedure for the robust traveling salesman problem. While this iterative algorithm results in an optimal solution to the robust TSP, computation…
Bifurcation-based adiabatic quantum computation with a nonlinear oscillator network
Goto, Hayato
2016-01-01
The dynamics of nonlinear systems qualitatively change depending on their parameters, which is called bifurcation. A quantum-mechanical nonlinear oscillator can yield a quantum superposition of two oscillation states, known as a Schrödinger cat state, via quantum adiabatic evolution through its bifurcation point. Here we propose a quantum computer comprising such quantum nonlinear oscillators, instead of quantum bits, to solve hard combinatorial optimization problems. The nonlinear oscillator network finds optimal solutions via quantum adiabatic evolution, where nonlinear terms are increased slowly, in contrast to conventional adiabatic quantum computation or quantum annealing, where quantum fluctuation terms are decreased slowly. As a result of numerical simulations, it is concluded that quantum superposition and quantum fluctuation work effectively to find optimal solutions. It is also notable that the present computer is analogous to neural computers, which are also networks of nonlinear components. Thus, the present scheme will open new possibilities for quantum computation, nonlinear science, and artificial intelligence. PMID:26899997
Chang, Yuchao; Tang, Hongying; Cheng, Yongbo; Zhao, Qin; Li, Baoqing; Yuan, Xiaobing
2017-07-19
Routing protocols based on topology control are significantly important for improving network longevity in wireless sensor networks (WSNs). Traditionally, some WSN routing protocols distribute uneven network traffic load to sensor nodes, which is not optimal for improving network longevity. Differently to conventional WSN routing protocols, we propose a dynamic hierarchical protocol based on combinatorial optimization (DHCO) to balance energy consumption of sensor nodes and to improve WSN longevity. For each sensor node, the DHCO algorithm obtains the optimal route by establishing a feasible routing set instead of selecting the cluster head or the next hop node. The process of obtaining the optimal route can be formulated as a combinatorial optimization problem. Specifically, the DHCO algorithm is carried out by the following procedures. It employs a hierarchy-based connection mechanism to construct a hierarchical network structure in which each sensor node is assigned to a special hierarchical subset; it utilizes the combinatorial optimization theory to establish the feasible routing set for each sensor node, and takes advantage of the maximum-minimum criterion to obtain their optimal routes to the base station. Various results of simulation experiments show effectiveness and superiority of the DHCO algorithm in comparison with state-of-the-art WSN routing algorithms, including low-energy adaptive clustering hierarchy (LEACH), hybrid energy-efficient distributed clustering (HEED), genetic protocol-based self-organizing network clustering (GASONeC), and double cost function-based routing (DCFR) algorithms.
A modified genetic algorithm with fuzzy roulette wheel selection for job-shop scheduling problems
NASA Astrophysics Data System (ADS)
Thammano, Arit; Teekeng, Wannaporn
2015-05-01
The job-shop scheduling problem is one of the most difficult production planning problems. Since it is in the NP-hard class, a recent trend in solving the job-shop scheduling problem is shifting towards the use of heuristic and metaheuristic algorithms. This paper proposes a novel metaheuristic algorithm, which is a modification of the genetic algorithm. This proposed algorithm introduces two new concepts to the standard genetic algorithm: (1) fuzzy roulette wheel selection and (2) the mutation operation with tabu list. The proposed algorithm has been evaluated and compared with several state-of-the-art algorithms in the literature. The experimental results on 53 JSSPs show that the proposed algorithm is very effective in solving the combinatorial optimization problems. It outperforms all state-of-the-art algorithms on all benchmark problems in terms of the ability to achieve the optimal solution and the computational time.
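For readers unfamiliar with the selection step being modified, the sketch below shows plain fitness-proportional (roulette wheel) selection for a minimization objective such as makespan; the paper's fuzzy membership weighting and tabu-list mutation are not reproduced here.

```python
import random

def roulette_select(population, fitness, k):
    """Fitness-proportional (roulette wheel) selection of k parents.

    For a minimisation problem such as makespan in job-shop scheduling,
    fitnesses are inverted so that shorter schedules get larger slices.
    """
    inv = [1.0 / (1.0 + f) for f in fitness]      # smaller makespan -> bigger slice
    total = sum(inv)
    cum, acc = [], 0.0
    for w in inv:
        acc += w
        cum.append(acc)
    parents = []
    for _ in range(k):
        r = random.uniform(0, total)
        for individual, threshold in zip(population, cum):
            if r <= threshold:
                parents.append(individual)
                break
    return parents

# toy usage: five candidate schedules with their (synthetic) makespans
pop = ["s1", "s2", "s3", "s4", "s5"]
makespans = [120, 95, 110, 140, 90]
print(roulette_select(pop, makespans, 3))
```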
Constant Communities in Complex Networks
NASA Astrophysics Data System (ADS)
Chakraborty, Tanmoy; Srinivasan, Sriram; Ganguly, Niloy; Bhowmick, Sanjukta; Mukherjee, Animesh
2013-05-01
Identifying community structure is a fundamental problem in network analysis. Most community detection algorithms are based on optimizing a combinatorial parameter, for example modularity. This optimization is generally NP-hard, so merely changing the vertex order can alter the assignment of vertices to communities. However, there has been little study of how vertex ordering influences the results of community detection algorithms. Here we identify and study the properties of invariant groups of vertices (constant communities) whose assignment to communities is, quite remarkably, not affected by vertex ordering. The percentage of constant communities can vary across different applications, and based on empirical results we propose metrics to evaluate these communities. Using constant communities as a pre-processing step, one can significantly reduce the variation of the results. Finally, we present a case study on a phoneme network and illustrate that constant communities, quite strikingly, form the core functional units of the larger communities.
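The following sketch illustrates the order-dependence phenomenon on a toy graph: an order-sensitive label-propagation routine (an assumption chosen for brevity, not one of the modularity-based algorithms studied in the paper) is run with many shuffled vertex orders, and the vertex pairs that always end up in the same community are reported.

```python
import random
from itertools import combinations

random.seed(1)

# two loosely connected cliques as a toy graph
edges = [(0, 1), (0, 2), (1, 2), (2, 3), (3, 4), (3, 5), (4, 5), (2, 4)]
nodes = sorted({u for e in edges for u in e})
adj = {v: set() for v in nodes}
for u, v in edges:
    adj[u].add(v)
    adj[v].add(u)

def label_propagation(order):
    """Asynchronous label propagation; the result depends on the visit order."""
    label = {v: v for v in nodes}
    for _ in range(20):                     # bounded number of sweeps
        changed = False
        for v in order:
            counts = {}
            for u in adj[v]:
                counts[label[u]] = counts.get(label[u], 0) + 1
            best = max(sorted(counts), key=lambda l: counts[l])
            if best != label[v]:
                label[v], changed = best, True
        if not changed:
            break
    return label

# run with many shuffled vertex orders; keep pairs that always share a community
constant_pairs = set(combinations(nodes, 2))
for _ in range(50):
    order = nodes[:]
    random.shuffle(order)
    lab = label_propagation(order)
    constant_pairs = {(u, v) for (u, v) in constant_pairs if lab[u] == lab[v]}

print("vertex pairs always grouped together:", sorted(constant_pairs))
```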
Human Performance on the Traveling Salesman and Related Problems: A Review
ERIC Educational Resources Information Center
MacGregor, James N.; Chu, Yun
2011-01-01
The article provides a review of recent research on human performance on the traveling salesman problem (TSP) and related combinatorial optimization problems. We discuss what combinatorial optimization problems are, why they are important, and why they may be of interest to cognitive scientists. We next describe the main characteristics of human…
ERIC Educational Resources Information Center
Brusco, Michael J.; Kohn, Hans-Friedrich; Stahl, Stephanie
2008-01-01
Dynamic programming methods for matrix permutation problems in combinatorial data analysis can produce globally-optimal solutions for matrices up to size 30x30, but are computationally infeasible for larger matrices because of enormous computer memory requirements. Branch-and-bound methods also guarantee globally-optimal solutions, but computation…
Osaba, E.; Carballedo, R.; Diaz, F.; Onieva, E.; de la Iglesia, I.; Perallos, A.
2014-01-01
Since their first formulation, genetic algorithms (GAs) have been one of the most widely used techniques to solve combinatorial optimization problems. The basic structure of the GAs is known by the scientific community, and thanks to their easy application and good performance, GAs are the focus of a lot of research works annually. Although throughout history there have been many studies analyzing various concepts of GAs, in the literature there are few studies that analyze objectively the influence of using blind crossover operators for combinatorial optimization problems. For this reason, in this paper a deep study on the influence of using them is conducted. The study is based on a comparison of nine techniques applied to four well-known combinatorial optimization problems. Six of the techniques are GAs with different configurations, and the remaining three are evolutionary algorithms that focus exclusively on the mutation process. Finally, to perform a reliable comparison of these results, a statistical study of them is made, performing the normal distribution z-test. PMID:25165731
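The statistical comparison mentioned above can be sketched as follows; the two-sample z-test below uses synthetic tour lengths and standard normal-approximation formulas, and is only meant to show the shape of the test, not the paper's actual data.

```python
from math import sqrt, erf
from statistics import mean, variance

def z_test(sample_a, sample_b):
    """Two-sample z-test on mean objective values returned by two algorithms."""
    na, nb = len(sample_a), len(sample_b)
    ma, mb = mean(sample_a), mean(sample_b)
    va, vb = variance(sample_a), variance(sample_b)
    z = (ma - mb) / sqrt(va / na + vb / nb)
    p = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))   # two-sided p-value
    return z, p

# toy tour lengths from 30 runs of two GA configurations (synthetic numbers)
ga_blind   = [1012, 1030, 998, 1045, 1021, 1008] * 5
ga_classic = [990, 1005, 987, 1010, 995, 1001] * 5
z, p = z_test(ga_blind, ga_classic)
print(f"z = {z:.2f}, p = {p:.4f}")
```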
Optimal weighted combinatorial forecasting model of QT dispersion of ECGs in Chinese adults.
Wen, Zhang; Miao, Ge; Xinlei, Liu; Minyi, Cen
2016-07-01
This study aims to provide a scientific basis for unifying the reference value standard of QT dispersion of ECGs in Chinese adults. Three predictive models, a regression model, a principal component model, and an artificial neural network model, are combined to establish an optimal weighted combination model, and the combined model is verified and compared against each single model. The optimal weighted combinatorial model reduces the prediction risk of any single model and improves prediction precision. The geographical distribution of the reference value of QT dispersion in Chinese adults was then mapped precisely using kriging methods. Once the geographical factors of a particular area are obtained, the reference value of QT dispersion of Chinese adults in that area can be estimated with the optimal weighted combinatorial model, and the reference value anywhere in China can be read from the geographical distribution map.
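One common way to obtain such combination weights is a least-squares fit of the single-model predictions to the observed values, as sketched below on synthetic numbers; the exact weighting scheme used in the paper may differ.

```python
import numpy as np

# predictions of QT dispersion from three single models on the same reference sites
# (synthetic numbers for illustration)
pred_regression = np.array([38.0, 41.5, 36.2, 44.0, 39.5])
pred_pca        = np.array([37.1, 42.0, 35.8, 43.2, 40.1])
pred_ann        = np.array([38.8, 40.9, 36.9, 44.6, 39.0])
observed        = np.array([38.2, 41.7, 36.0, 44.1, 39.8])

# stack single-model predictions and solve least squares for combination weights
X = np.column_stack([pred_regression, pred_pca, pred_ann])
w, *_ = np.linalg.lstsq(X, observed, rcond=None)
w = w / w.sum()                      # normalise so the weights sum to one

combined = X @ w
print("weights:", np.round(w, 3))
print("combined mean absolute error:", np.round(np.abs(combined - observed).mean(), 3))
```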
Wang, Lipo; Li, Sa; Tian, Fuyu; Fu, Xiuju
2004-10-01
Recently Chen and Aihara have demonstrated both experimentally and mathematically that their chaotic simulated annealing (CSA) has better search ability for solving combinatorial optimization problems compared to both the Hopfield-Tank approach and stochastic simulated annealing (SSA). However, CSA may not find a globally optimal solution no matter how slowly annealing is carried out, because the chaotic dynamics are completely deterministic. In contrast, SSA tends to settle down to a global optimum if the temperature is reduced sufficiently slowly. Here we combine the best features of both SSA and CSA, thereby proposing a new approach for solving optimization problems, i.e., stochastic chaotic simulated annealing, by using a noisy chaotic neural network. We show the effectiveness of this new approach with two difficult combinatorial optimization problems, i.e., a traveling salesman problem and a channel assignment problem for cellular mobile communications.
MGA trajectory planning with an ACO-inspired algorithm
NASA Astrophysics Data System (ADS)
Ceriotti, Matteo; Vasile, Massimiliano
2010-11-01
Given a set of celestial bodies, the problem of finding an optimal sequence of swing-bys, deep space manoeuvres (DSM) and transfer arcs connecting the elements of the set is combinatorial in nature. The number of possible paths grows exponentially with the number of celestial bodies. Therefore, the design of an optimal multiple gravity assist (MGA) trajectory is a NP-hard mixed combinatorial-continuous problem. Its automated solution would greatly improve the design of future space missions, allowing the assessment of a large number of alternative mission options in a short time. This work proposes to formulate the complete automated design of a multiple gravity assist trajectory as an autonomous planning and scheduling problem. The resulting scheduled plan will provide the optimal planetary sequence and a good estimation of the set of associated optimal trajectories. The trajectory model consists of a sequence of celestial bodies connected by two-dimensional transfer arcs containing one DSM. For each transfer arc, the position of the planet and the spacecraft, at the time of arrival, are matched by varying the pericentre of the preceding swing-by, or the magnitude of the launch excess velocity, for the first arc. For each departure date, this model generates a full tree of possible transfers from the departure to the destination planet. Each leaf of the tree represents a planetary encounter and a possible way to reach that planet. An algorithm inspired by ant colony optimization (ACO) is devised to explore the space of possible plans. The ants explore the tree from departure to destination adding one node at the time: every time an ant is at a node, a probability function is used to select a feasible direction. This approach to automatic trajectory planning is applied to the design of optimal transfers to Saturn and among the Galilean moons of Jupiter. Solutions are compared to those found through more traditional genetic-algorithm techniques.
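The branch-selection step of such an ACO-style tree search can be sketched as below: the next celestial body is drawn with probability proportional to pheromone^alpha * heuristic^beta. The bodies, pheromone values and heuristic values are placeholders, not the paper's trajectory model.

```python
import random

def choose_branch(children, pheromone, heuristic, alpha=1.0, beta=2.0):
    """Pick the next node in the tree with probability proportional to
    pheromone^alpha * heuristic^beta (the standard ACO selection rule)."""
    weights = [pheromone[c] ** alpha * heuristic[c] ** beta for c in children]
    total = sum(weights)
    r = random.uniform(0.0, total)
    acc = 0.0
    for c, w in zip(children, weights):
        acc += w
        if r <= acc:
            return c
    return children[-1]

# toy: from Earth the ant may fly by Venus, Mars or Jupiter next
children = ["Venus", "Mars", "Jupiter"]
pheromone = {"Venus": 1.4, "Mars": 0.8, "Jupiter": 0.6}   # deposited by earlier ants
heuristic = {"Venus": 0.9, "Mars": 0.7, "Jupiter": 0.3}   # e.g. inverse of an assumed delta-v estimate
print(choose_branch(children, pheromone, heuristic))
```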
NASA Astrophysics Data System (ADS)
Kumar, Vijay M.; Murthy, ANN; Chandrashekara, K.
2012-05-01
The production planning problem of a flexible manufacturing system (FMS) concerns decisions that have to be made before an FMS begins to produce parts according to a given production plan during an upcoming planning horizon. The main aspect of production planning is the machine loading problem, in which a subset of jobs to be manufactured is selected and their operations are assigned to the relevant machines. Such problems are not only combinatorial optimization problems, but also non-deterministic polynomial-time hard, making it difficult to obtain satisfactory solutions using traditional optimization techniques. In this paper, an attempt has been made to address the machine loading problem with the objectives of minimizing system unbalance and maximizing throughput simultaneously, while satisfying the system constraints related to available machining time and tool slots, by designing and using a meta-hybrid heuristic technique based on genetic algorithm and particle swarm optimization. The results reported in this paper demonstrate the model efficiency and examine the performance of the system with respect to measures such as throughput and system utilization.
Aerospace applications of integer and combinatorial optimization
NASA Technical Reports Server (NTRS)
Padula, S. L.; Kincaid, R. K.
1995-01-01
Research supported by NASA Langley Research Center includes many applications of aerospace design optimization and is conducted by teams of applied mathematicians and aerospace engineers. This paper investigates the benefits from this combined expertise in solving combinatorial optimization problems. Applications range from the design of large space antennas to interior noise control. A typical problem, for example, seeks the optimal locations for vibration-damping devices on a large space structure and is expressed as a mixed/integer linear programming problem with more than 1500 design variables.
Structure-based design of combinatorial mutagenesis libraries
Verma, Deeptak; Grigoryan, Gevorg; Bailey-Kellogg, Chris
2015-01-01
The development of protein variants with improved properties (thermostability, binding affinity, catalytic activity, etc.) has greatly benefited from the application of high-throughput screens evaluating large, diverse combinatorial libraries. At the same time, since only a very limited portion of sequence space can be experimentally constructed and tested, an attractive possibility is to use computational protein design to focus libraries on a productive portion of the space. We present a general-purpose method, called “Structure-based Optimization of Combinatorial Mutagenesis” (SOCoM), which can optimize arbitrarily large combinatorial mutagenesis libraries directly based on structural energies of their constituents. SOCoM chooses both positions and substitutions, employing a combinatorial optimization framework based on library-averaged energy potentials in order to avoid explicitly modeling every variant in every possible library. In case study applications to green fluorescent protein, β-lactamase, and lipase A, SOCoM optimizes relatively small, focused libraries whose variants achieve energies comparable to or better than previous library design efforts, as well as larger libraries (previously not designable by structure-based methods) whose variants cover greater diversity while still maintaining substantially better energies than would be achieved by representative random library approaches. By allowing the creation of large-scale combinatorial libraries based on structural calculations, SOCoM promises to increase the scope of applicability of computational protein design and improve the hit rate of discovering beneficial variants. While designs presented here focus on variant stability (predicted by total energy), SOCoM can readily incorporate other structure-based assessments, such as the energy gap between alternative conformational or bound states. PMID:25611189
Combinatorial Optimization in Project Selection Using Genetic Algorithm
NASA Astrophysics Data System (ADS)
Dewi, Sari; Sawaluddin
2018-01-01
This paper discusses the problem of project selection in the presence of two objective functions, maximizing profit and minimizing cost, under limited resource availability and available time, so that resources must be allocated to each selected project. These resources are human resources, machine resources, and raw material resources, and the total expenditure must not exceed a predetermined budget. The problem is therefore formulated mathematically as a multi-objective model with constraints that must be satisfied. To assist the project selection process, a multi-objective combinatorial optimization approach is used to obtain an optimal solution for the selection of the right projects. A multi-objective genetic algorithm is then described as one multi-objective combinatorial optimization method that simplifies the project selection process at large scale.
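For a very small instance the bi-objective selection can be checked exactly, which is useful as a baseline for the genetic algorithm; the sketch below enumerates feasible project subsets and extracts the Pareto set for (profit, cost) using made-up numbers.

```python
from itertools import combinations, chain

# (profit, cost, people, machines, material, time) per candidate project -- synthetic
projects = {
    "P1": (120,  80, 4, 2, 30, 5),
    "P2": ( 90,  50, 3, 1, 20, 4),
    "P3": (150, 110, 6, 3, 40, 7),
    "P4": ( 60,  30, 2, 1, 10, 3),
    "P5": (110,  70, 5, 2, 25, 6),
}
limits = {"people": 10, "machines": 5, "material": 70, "time": 12, "budget": 200}

def feasible(subset):
    cost   = sum(projects[p][1] for p in subset)
    people = sum(projects[p][2] for p in subset)
    mach   = sum(projects[p][3] for p in subset)
    mat    = sum(projects[p][4] for p in subset)
    time   = sum(projects[p][5] for p in subset)
    return (cost <= limits["budget"] and people <= limits["people"] and
            mach <= limits["machines"] and mat <= limits["material"] and
            time <= limits["time"])

def objectives(subset):
    return (sum(projects[p][0] for p in subset),    # maximise profit
            sum(projects[p][1] for p in subset))    # minimise cost

all_subsets = chain.from_iterable(combinations(projects, r)
                                  for r in range(1, len(projects) + 1))
candidates = [(s, objectives(s)) for s in all_subsets if feasible(s)]

def dominated(a, b):          # does objective vector b dominate a?
    return b[0] >= a[0] and b[1] <= a[1] and b != a

pareto = [(s, o) for s, o in candidates
          if not any(dominated(o, o2) for _, o2 in candidates)]
for subset, (profit, cost) in sorted(pareto, key=lambda t: -t[1][0]):
    print(subset, "profit:", profit, "cost:", cost)
```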
NASA Astrophysics Data System (ADS)
Hartmann, Alexander K.; Weigt, Martin
2005-10-01
A concise, comprehensive introduction to the topic of statistical physics of combinatorial optimization, bringing together theoretical concepts and algorithms from computer science with analytical methods from physics. The result bridges the gap between statistical physics and combinatorial optimization, investigating problems taken from theoretical computing, such as the vertex-cover problem, with the concepts and methods of theoretical physics. The authors cover rapid developments and analytical methods that are both extremely complex and spread by word-of-mouth, providing all the necessary basics in required detail. Throughout, the algorithms are shown with examples and calculations, while the proofs are given in a way suitable for graduate students, post-docs, and researchers. Ideal for newcomers to this young, multidisciplinary field.
Tang, Hongying; Cheng, Yongbo; Zhao, Qin; Li, Baoqing; Yuan, Xiaobing
2017-01-01
Routing protocols based on topology control are significantly important for improving network longevity in wireless sensor networks (WSNs). Traditionally, some WSN routing protocols distribute uneven network traffic load to sensor nodes, which is not optimal for improving network longevity. Differently to conventional WSN routing protocols, we propose a dynamic hierarchical protocol based on combinatorial optimization (DHCO) to balance energy consumption of sensor nodes and to improve WSN longevity. For each sensor node, the DHCO algorithm obtains the optimal route by establishing a feasible routing set instead of selecting the cluster head or the next hop node. The process of obtaining the optimal route can be formulated as a combinatorial optimization problem. Specifically, the DHCO algorithm is carried out by the following procedures. It employs a hierarchy-based connection mechanism to construct a hierarchical network structure in which each sensor node is assigned to a special hierarchical subset; it utilizes the combinatorial optimization theory to establish the feasible routing set for each sensor node, and takes advantage of the maximum–minimum criterion to obtain their optimal routes to the base station. Various results of simulation experiments show effectiveness and superiority of the DHCO algorithm in comparison with state-of-the-art WSN routing algorithms, including low-energy adaptive clustering hierarchy (LEACH), hybrid energy-efficient distributed clustering (HEED), genetic protocol-based self-organizing network clustering (GASONeC), and double cost function-based routing (DCFR) algorithms. PMID:28753962
Quadratic constrained mixed discrete optimization with an adiabatic quantum optimizer
NASA Astrophysics Data System (ADS)
Chandra, Rishabh; Jacobson, N. Tobias; Moussa, Jonathan E.; Frankel, Steven H.; Kais, Sabre
2014-07-01
We extend the family of problems that may be implemented on an adiabatic quantum optimizer (AQO). When a quadratic optimization problem has at least one set of discrete controls and the constraints are linear, we call this a quadratic constrained mixed discrete optimization (QCMDO) problem. QCMDO problems are NP-hard, and no efficient classical algorithm for their solution is known. Included in the class of QCMDO problems are combinatorial optimization problems constrained by a linear partial differential equation (PDE) or system of linear PDEs. An essential complication commonly encountered in solving this type of problem is that the linear constraint may introduce many intermediate continuous variables into the optimization while the computational cost grows exponentially with problem size. We resolve this difficulty by developing a constructive mapping from QCMDO to quadratic unconstrained binary optimization (QUBO) such that the size of the QUBO problem depends only on the number of discrete control variables. With a suitable embedding, taking into account the physical constraints of the realizable coupling graph, the resulting QUBO problem can be implemented on an existing AQO. The mapping itself is efficient, scaling cubically with the number of continuous variables in the general case and linearly in the PDE case if an efficient preconditioner is available.
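The PDE-elimination step that makes the paper's mapping efficient is not reproduced here; the sketch below only shows the generic final step of folding a linear equality constraint on binary variables into a QUBO matrix through a quadratic penalty, with a brute-force check on a toy instance.

```python
import numpy as np
from itertools import product

# minimise x^T Q0 x  subject to  A x = b,  x binary  (toy, made-up numbers)
Q0 = np.array([[1.0, -2.0,  0.0],
               [0.0,  3.0, -1.0],
               [0.0,  0.0,  2.0]])
A = np.array([[1.0, 1.0, 1.0]])
b = np.array([2.0])
rho = 10.0                      # penalty weight, assumed large enough to enforce A x = b

# (Ax - b)^T (Ax - b) = x^T A^T A x - 2 b^T A x + b^T b; since x_i^2 = x_i,
# the linear term -2 b^T A x folds onto the diagonal of the QUBO matrix.
Q = Q0 + rho * (A.T @ A)
Q += np.diag(-2.0 * rho * (b @ A))
offset = rho * float(b @ b)     # constant, irrelevant to the argmin

# brute-force check over the 2^3 binary assignments
best = min(product((0, 1), repeat=3), key=lambda x: np.array(x) @ Q @ np.array(x))
x = np.array(best)
print("best x:", best, "constraint A x =", A @ x, "original objective:", x @ Q0 @ x)
```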
Aerospace Applications of Integer and Combinatorial Optimization
NASA Technical Reports Server (NTRS)
Padula, S. L.; Kincaid, R. K.
1995-01-01
Research supported by NASA Langley Research Center includes many applications of aerospace design optimization and is conducted by teams of applied mathematicians and aerospace engineers. This paper investigates the benefits from this combined expertise in formulating and solving integer and combinatorial optimization problems. Applications range from the design of large space antennas to interior noise control. A typical problem, for example, seeks the optimal locations for vibration-damping devices on an orbiting platform and is expressed as a mixed/integer linear programming problem with more than 1500 design variables.
Methodologies in determining mechanical properties of thin films using nanoindentation
NASA Astrophysics Data System (ADS)
Han, Seung Min Jane
Thin films are critical components of microelectronic and MEMS devices, and evaluating their mechanical properties is of current interest. As the dimensions of the devices become smaller and smaller, however, understanding the mechanical properties of materials at sub-micron length scales becomes more challenging. The conventional methods for evaluating strengths of materials in bulk form cannot be applied, and new methodologies are required for accurately evaluating mechanical properties of thin films. In this work, development of methodologies using the nanoindenter was pursued in three parts: (1) creation of a new method for extracting thin film hardness, (2) use of combinatorial methods for determining compositions with desired mechanical properties, and (3) use of microcompression testing of sub-micron sized pillars to understand plasticity in Al-Sc multilayers. The existing nanoindentation hardness model by Oliver & Pharr is unable to accurately determine the hardness of thin films on substrates with an elastic mismatch. Thus, a new method of analysis for extracting thin film hardness from film/substrate systems, that eliminates the effect of elastic mismatch of the underlying substrate, surface roughness, and also pile-up/sink-in, is needed. Such a method was developed in the first part of this study. The feasibility of using the nanoindentation hardness together with combinatorial methods to efficiently scan through mechanical properties of Ti-Al metallic alloys was examined in the second part of this study. The combinatorial approach provides an efficient method that can be used to determine alloy compositions that might merit further exploration and development as bulk materials. Finally, the mechanical properties of Al-Al3Sc multilayers with bilayer periods ranging from 6-100 nm were examined using microcompression. The sub-micron sized pillars were prepared using the focused ion beam (FIB) and compression tested with the flat tip of the nanoindenter. The measured yield strengths show the trend of increasing strength with decreasing bilayer period, and agree with the nanoindentation hardness results using the suitable Tabor correction factor. Strain softening was observed at large strains, and a new model for the true stress and true strain was developed to account for the inhomogeneous deformation geometry.
Liu, Zhi-Hua; Xie, Shangxian; Lin, Furong; Jin, Mingjie; Yuan, Joshua S
2018-01-01
Lignin valorization has recently been considered to be an essential process for sustainable and cost-effective biorefineries. Lignin represents a potential new feedstock for value-added products. Oleaginous bacteria such as Rhodococcus opacus can produce intracellular lipids from biodegradation of aromatic substrates. These lipids can be used for biofuel production, which can potentially replace petroleum-derived chemicals. However, the low reactivity of lignin produced from pretreatment and the underdeveloped fermentation technology hindered lignin bioconversion to lipids. In this study, combinatorial pretreatment with an optimized fermentation strategy was evaluated to improve lignin valorization into lipids using R. opacus PD630. As opposed to single pretreatment, combinatorial pretreatment produced a 12.8-75.6% higher lipid concentration in fermentation using lignin as the carbon source. Gas chromatography-mass spectrometry analysis showed that combinatorial pretreatment released more aromatic monomers, which could be more readily utilized by lignin-degrading strains. Three detoxification strategies were used to remove potential inhibitors produced from pretreatment. After heating detoxification of the lignin stream, the lipid concentration further increased by 2.9-9.7%. Different fermentation strategies were evaluated in scale-up lipid fermentation using a 2.0-l fermenter. With laccase treatment of the lignin stream produced from combinatorial pretreatment, the highest cell dry weight and lipid concentration were 10.1 and 1.83 g/l, respectively, in fed-batch fermentation, with a total soluble substrate concentration of 40 g/l. The improvement of the lipid fermentation performance may have resulted from lignin depolymerization by the combinatorial pretreatment and laccase treatment, reduced inhibition effects by fed-batch fermentation, adequate oxygen supply, and an accurate pH control in the fermenter. Overall, these results demonstrate that combinatorial pretreatment, together with fermentation optimization, favorably improves lipid production using lignin as the carbon source. Combinatorial pretreatment integrated with fed-batch fermentation was an effective strategy to improve the bioconversion of lignin into lipids, thus facilitating lignin valorization in biorefineries.
A path following algorithm for the graph matching problem.
Zaslavskiy, Mikhail; Bach, Francis; Vert, Jean-Philippe
2009-12-01
We propose a convex-concave programming approach for the labeled weighted graph matching problem. The convex-concave programming formulation is obtained by rewriting the weighted graph matching problem as a least-square problem on the set of permutation matrices and relaxing it to two different optimization problems: a quadratic convex and a quadratic concave optimization problem on the set of doubly stochastic matrices. The concave relaxation has the same global minimum as the initial graph matching problem, but the search for its global minimum is also a hard combinatorial problem. We, therefore, construct an approximation of the concave problem solution by following a solution path of a convex-concave problem obtained by linear interpolation of the convex and concave formulations, starting from the convex relaxation. This method makes it easy to integrate information on graph label similarities into the optimization problem and, therefore, to perform labeled weighted graph matching. The algorithm is compared with some of the best performing graph matching methods on four data sets: simulated graphs, QAPLib, retina vessel images, and handwritten Chinese characters. In all cases, the results are competitive with the state of the art.
Ant colony optimization for solving university facility layout problem
NASA Astrophysics Data System (ADS)
Mohd Jani, Nurul Hafiza; Mohd Radzi, Nor Haizan; Ngadiman, Mohd Salihin
2013-04-01
Quadratic assignment problems (QAP) are classified as NP-hard. The QAP has been used to model many problems in areas such as operational research, combinatorial data analysis, and parallel and distributed computing, as well as optimization problems such as graph partitioning and the Traveling Salesman Problem (TSP). In the literature, researchers use exact algorithms, heuristic algorithms, and metaheuristic approaches to solve QAP instances. The QAP is widely applied to facility layout problems (FLP). In this paper we use the QAP to model a university facility layout problem in which 8 facilities need to be assigned to 8 locations. Hence we have modeled a QAP with n ≤ 10 and developed an Ant Colony Optimization (ACO) algorithm to solve the university facility layout problem. The objective is to assign n facilities to n locations such that the total product of flows and distances is minimized, where flow is the movement of lecturers from one facility to another and distance is the distance between the locations of two facilities. The objective of the QAP is thus to minimize the total walking of lecturers between their destinations.
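The QAP objective used here is simple to state in code; the sketch below computes total flow × distance for an assignment and brute-forces a smaller, made-up 4-facility instance (the paper's 8×8 instance and ACO solver are not reproduced).

```python
from itertools import permutations

# flow[i][j]: weekly lecturer movements between facilities i and j (synthetic)
flow = [[0, 5, 2, 4],
        [5, 0, 3, 0],
        [2, 3, 0, 6],
        [4, 0, 6, 0]]
# dist[a][b]: walking distance between locations a and b (synthetic)
dist = [[ 0, 10, 20, 30],
        [10,  0, 15, 25],
        [20, 15,  0, 12],
        [30, 25, 12,  0]]

def qap_cost(perm):
    """perm[i] is the location assigned to facility i."""
    n = len(perm)
    return sum(flow[i][j] * dist[perm[i]][perm[j]] for i in range(n) for j in range(n))

best = min(permutations(range(4)), key=qap_cost)
print("best assignment:", best, "total flow x distance:", qap_cost(best))
```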
Antolini, Ermete
2017-02-13
Combinatorial chemistry and high-throughput screening represent an innovative and rapid tool to prepare and evaluate a large number of new materials, saving time and expense for research and development. Considering that the activity and selectivity of catalysts depend on complex kinetic phenomena, making their development largely empirical in practice, they are prime candidates for combinatorial discovery and optimization. This review presents an overview of recent results of combinatorial screening of low-temperature fuel cell electrocatalysts for methanol oxidation. Optimum catalyst compositions obtained by combinatorial screening were compared with those of bulk catalysts, and the effect of the library geometry on the screening of catalyst composition is highlighted.
Optimization of Highway Work Zone Decisions Considering Short-Term and Long-Term Impacts
2010-01-01
…combination of lane closure and traffic control strategies which can minimize the one-time work zone cost. Considering the complex and combinatorial nature of this optimization problem, a heuristic… Notation: NV, the number of vehicle classes; NPV, net present value; p′(t), adjusted traffic diversion rate at time t; p(t), natural diversion rate.
Genetic algorithm parameters tuning for resource-constrained project scheduling problem
NASA Astrophysics Data System (ADS)
Tian, Xingke; Yuan, Shengrui
2018-04-01
The resource-constrained project scheduling problem (RCPSP) is an important class of scheduling problem. To achieve a given optimization goal, such as the shortest duration, the smallest cost, or resource balance, the start and finish of all tasks must be arranged while satisfying the project timing constraints and resource constraints. In theory the problem is NP-hard, and its models are abundant; many combinatorial optimization problems, such as job-shop scheduling and flow-shop scheduling, are special cases of the RCPSP. The genetic algorithm (GA) has been used to deal with the classical RCPSP and has achieved remarkable results, and many scholars have studied improved genetic algorithms that solve the RCPSP more efficiently and accurately. However, these studies do not optimize the main parameters of the genetic algorithm; the parameters are generally chosen empirically, which cannot guarantee that they are optimal. This paper addresses this blind selection of parameters in the process of solving the RCPSP: we perform a sampling analysis, establish a proxy (surrogate) model, and ultimately determine the optimal parameters.
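The sampling step can be sketched as follows: candidate (population size, crossover rate, mutation rate) triples are drawn at random and scored by running a tiny GA on a cheap stand-in objective (OneMax rather than the RCPSP); the surrogate-model fitting described in the paper is omitted, and all settings are assumptions.

```python
import random

random.seed(2)
N_BITS = 40

def run_ga(pop_size, crossover_rate, mutation_rate, gens=60):
    """Tiny GA on OneMax used as a cheap stand-in fitness landscape."""
    fitness = lambda ind: sum(ind)
    pop = [[random.randint(0, 1) for _ in range(N_BITS)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        next_pop = pop[:2]                                  # elitism
        while len(next_pop) < pop_size:
            a, b = random.sample(pop[:pop_size // 2], 2)    # truncation selection
            child = a[:]
            if random.random() < crossover_rate:
                cut = random.randrange(N_BITS)
                child = a[:cut] + b[cut:]
            child = [1 - g if random.random() < mutation_rate else g for g in child]
            next_pop.append(child)
        pop = next_pop
    return max(fitness(ind) for ind in pop)

# sample candidate parameter settings instead of picking them by experience
samples = [(random.randint(20, 100),            # population size
            random.uniform(0.5, 1.0),           # crossover rate
            random.uniform(0.001, 0.1))         # mutation rate
           for _ in range(15)]
scored = [(run_ga(*s), s) for s in samples]
best_score, best_params = max(scored)
print("best sampled parameters:", best_params, "fitness:", best_score)
```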
A mixed analog/digital chaotic neuro-computer system for quadratic assignment problems.
Horio, Yoshihiko; Ikeguchi, Tohru; Aihara, Kazuyuki
2005-01-01
We construct a mixed analog/digital chaotic neuro-computer prototype system for quadratic assignment problems (QAPs). The QAP is one of the difficult NP-hard problems, and includes several real-world applications. Chaotic neural networks have been used to solve combinatorial optimization problems through chaotic search dynamics, which efficiently searches optimal or near optimal solutions. However, preliminary experiments have shown that, although it obtained good feasible solutions, the Hopfield-type chaotic neuro-computer hardware system could not obtain the optimal solution of the QAP. Therefore, in the present study, we improve the system performance by adopting a solution construction method, which constructs a feasible solution using the analog internal state values of the chaotic neurons at each iteration. In order to include the construction method into our hardware, we install a multi-channel analog-to-digital conversion system to observe the internal states of the chaotic neurons. We show experimentally that a great improvement in the system performance over the original Hopfield-type chaotic neuro-computer is obtained. That is, we obtain the optimal solution for the size-10 QAP in less than 1000 iterations. In addition, we propose a guideline for parameter tuning of the chaotic neuro-computer system according to the observation of the internal states of several chaotic neurons in the network.
NASA Astrophysics Data System (ADS)
Chandra, Rishabh
Partial differential equation-constrained combinatorial optimization (PDECCO) problems are a mixture of continuous and discrete optimization problems. PDECCO problems have discrete controls, but since the partial differential equations (PDE) are continuous, the optimization space is continuous as well. Such problems have several applications, such as gas/water network optimization, traffic optimization, micro-chip cooling optimization, etc. Currently, no efficient classical algorithm which guarantees a global minimum for PDECCO problems exists. A new mapping has been developed that transforms PDECCO problem, which only have linear PDEs as constraints, into quadratic unconstrained binary optimization (QUBO) problems that can be solved using an adiabatic quantum optimizer (AQO). The mapping is efficient, it scales polynomially with the size of the PDECCO problem, requires only one PDE solve to form the QUBO problem, and if the QUBO problem is solved correctly and efficiently on an AQO, guarantees a global optimal solution for the original PDECCO problem.
Solving Set Cover with Pairs Problem using Quantum Annealing
NASA Astrophysics Data System (ADS)
Cao, Yudong; Jiang, Shuxian; Perouli, Debbie; Kais, Sabre
2016-09-01
Here we consider using quantum annealing to solve Set Cover with Pairs (SCP), an NP-hard combinatorial optimization problem that plays an important role in networking, computational biology, and biochemistry. We show an explicit construction of Ising Hamiltonians whose ground states encode the solution of SCP instances. We numerically simulate the time-dependent Schrödinger equation in order to test the performance of quantum annealing for random instances and compare with that of simulated annealing. We also discuss explicit embedding strategies for realizing our Hamiltonian construction on the D-Wave type restricted Ising Hamiltonian based on Chimera graphs. Our embedding on the Chimera graph preserves the structure of the original SCP instance; in particular, the embeddings for general complete bipartite graphs and logical disjunctions may be of broader use than the specific problem we deal with.
Distributed Combinatorial Optimization Using Privacy on Mobile Phones
NASA Astrophysics Data System (ADS)
Ono, Satoshi; Katayama, Kimihiro; Nakayama, Shigeru
This paper proposes a method for distributed combinatorial optimization which uses mobile phones as computers. In the proposed method, an ordinary computer generates solution candidates and mobile phones evaluate them by referring to private information and preferences. Users therefore do not have to send their private data to any other computer and do not have to refrain from inputting their preferences, so they can obtain satisfactory solutions. Experimental results show that the proposed method solved room assignment problems without sending users' private information to a server.
Combinatorial investigation of Fe–B thin-film nanocomposites
Brunken, Hayo; Grochla, Dario; Savan, Alan; Kieschnick, Michael; Meijer, Jan D; Ludwig, Alfred
2011-01-01
Combinatorial magnetron sputter deposition from elemental targets was used to create Fe–B composition spread type thin film materials libraries on thermally oxidized 4-in. Si wafers. The materials libraries consisting of wedge-type multilayer thin films were annealed at 500 or 700 °C to transform the multilayers into multiphase alloys. The libraries were characterized by nuclear reaction analysis, Rutherford backscattering, nanoindentation, vibrating sample magnetometry, x-ray diffraction (XRD) and transmission electron microscopy (TEM). Young's modulus and hardness values were related to the annealing parameters, structure and composition of the films. The magnetic properties of the films were improved by annealing in a H2 atmosphere, showing a more than tenfold decrease in the coercive field values in comparison to those of the vacuum-annealed films. The hardness values increased from 8 to 18 GPa when the annealing temperature was increased from 500 to 700 °C. The appearance of Fe2B phases, as revealed by XRD and TEM, had a significant effect on the mechanical properties of the films. PMID:27877435
Lin-Gibson, Sheng; Sung, Lipiin; Forster, Aaron M; Hu, Haiqing; Cheng, Yajun; Lin, Nancy J
2009-07-01
Multicomponent formulations coupled with complex processing conditions govern the final properties of photopolymerizable dental composites. In this study, a single test substrate was fabricated to support multiple formulations with a gradient in degree of conversion (DC), allowing the evaluation of multiple processing conditions and formulations on one specimen. Mechanical properties and damage response were evaluated as a function of filler type/content and irradiation. DC, surface roughness, modulus, hardness, scratch deformation and cytotoxicity were quantified using techniques including near-infrared spectroscopy, laser confocal scanning microscopy, depth-sensing indentation, scratch testing and cell viability. Scratch parameters (depth, width, percent recovery) were correlated to composite modulus and hardness. Total filler content, nanofiller and irradiation time/intensity all affected the final properties, with the dominant factor for improved properties being a higher DC. This combinatorial platform accelerates the screening of dental composites through the direct comparison of properties and processing conditions across the same sample.
Two is better than one; toward a rational design of combinatorial therapy.
Chen, Sheng-Hong; Lahav, Galit
2016-12-01
Drug combination is an appealing strategy for combating the heterogeneity of tumors and the evolution of drug resistance. However, the rationale underlying combinatorial therapy is often not well established, due to a lack of understanding of the specific pathways responding to the drugs and of their temporal dynamics following each treatment. Here we present several emerging trends in harnessing properties of biological systems for the optimal design of drug combinations, including the type of drugs, specific concentration, sequence of addition and the temporal schedule of treatments. We highlight recent studies showing different approaches for efficient design of drug combinations, including single-cell signaling dynamics, adaptation and pathway crosstalk. Finally, we discuss novel and feasible approaches that can facilitate the optimal design of combinatorial therapy. Copyright © 2016 Elsevier Ltd. All rights reserved.
GALAXY: A new hybrid MOEA for the optimal design of Water Distribution Systems
NASA Astrophysics Data System (ADS)
Wang, Q.; Savić, D. A.; Kapelan, Z.
2017-03-01
A new hybrid optimizer, called genetically adaptive leaping algorithm for approximation and diversity (GALAXY), is proposed for dealing with the discrete, combinatorial, multiobjective design of Water Distribution Systems (WDSs), which is NP-hard and computationally intensive. The merit of GALAXY is its ability to alleviate to a great extent the parameterization issue and the high computational overhead. It follows the generational framework of Multiobjective Evolutionary Algorithms (MOEAs) and includes six search operators and several important strategies. These operators are selected based on their leaping ability in the objective space from the global and local search perspectives. These strategies steer the optimization and balance the exploration and exploitation aspects simultaneously. A highlighted feature of GALAXY lies in the fact that it eliminates majority of parameters, thus being robust and easy-to-use. The comparative studies between GALAXY and three representative MOEAs on five benchmark WDS design problems confirm its competitiveness. GALAXY can identify better converged and distributed boundary solutions efficiently and consistently, indicating a much more balanced capability between the global and local search. Moreover, its advantages over other MOEAs become more substantial as the complexity of the design problem increases.
Combinatorial optimization in foundry practice
NASA Astrophysics Data System (ADS)
Antamoshkin, A. N.; Masich, I. S.
2016-04-01
The multicriteria mathematical model of foundry production capacity planning is suggested in the paper. The model is produced in terms of pseudo-Boolean optimization theory. Different search optimization methods were used to solve the obtained problem.
Zhou, Yikang; Li, Gang; Dong, Junkai; Xing, Xin-Hui; Dai, Junbiao; Zhang, Chong
2018-05-01
With the rapidly growing ability to construct combinatorial metabolic pathways, finding the metabolic sweet spot has become the rate-limiting step. We here report an efficient machine-learning workflow in conjunction with the YeastFab Assembly strategy (MiYA) for combinatorially optimizing the large biosynthetic genotypic space of heterologous metabolic pathways in Saccharomyces cerevisiae. Using the β-carotene biosynthetic pathway as an example, we first demonstrate that MiYA has the power to search only a small fraction (2-5%) of the combinatorial space to precisely tune the expression level of each gene, using a machine-learning algorithm based on an ensemble of artificial neural networks (ANNs) to avoid over-fitting when dealing with a small number of training samples. We then applied MiYA to improve the biosynthesis of violacein. Fed with initial data from a colorimetric plate-based, pre-screened pool of 24 strains producing violacein, MiYA successfully predicted, and verified experimentally, the existence of a strain that showed a 2.42-fold titer improvement in violacein production among 3125 possible designs. Furthermore, MiYA was able to largely avoid the branch pathway of violacein biosynthesis that makes deoxyviolacein, and produces very pure violacein. Together, MiYA combines the advantages of standardized building blocks and machine learning to accelerate the Design-Build-Test-Learn (DBTL) cycle for combinatorial optimization of metabolic pathways, which could significantly accelerate the development of microbial cell factories. Copyright © 2018 International Metabolic Engineering Society. Published by Elsevier Inc. All rights reserved.
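The machine-learning step can be sketched roughly as below: an ensemble of small neural networks, trained on bootstrap resamples of a handful of measured designs, averages its predictions to rank all untested promoter-strength combinations. Encodings, hyperparameters and the synthetic response function are assumptions for illustration; this is not the MiYA code.

```python
import numpy as np
from itertools import product
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# 5 pathway genes, each with 5 promoter strength levels -> 5^5 = 3125 designs
levels = [0, 1, 2, 3, 4]
design_space = np.array(list(product(levels, repeat=5)), dtype=float)

# pretend we measured titres for 24 random designs (synthetic ground truth)
true_titre = lambda X: (2.0 * X[:, 0] + 1.5 * X[:, 2] - 0.3 * X[:, 0] * X[:, 3]
                        + rng.normal(0, 0.2, len(X)))
train_idx = rng.choice(len(design_space), 24, replace=False)
X_train = design_space[train_idx]
y_train = true_titre(X_train)

# ensemble of small ANNs trained on bootstrap resamples to limit over-fitting
preds = []
for seed in range(10):
    boot = rng.choice(len(X_train), len(X_train), replace=True)
    net = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=seed)
    net.fit(X_train[boot], y_train[boot])
    preds.append(net.predict(design_space))
mean_pred = np.mean(preds, axis=0)

best = design_space[np.argmax(mean_pred)]
print("next design to build/test:", best, "predicted titre:", round(mean_pred.max(), 2))
```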
Combinatorial chemical bath deposition of CdS contacts for chalcogenide photovoltaics
Mokurala, Krishnaiah; Baranowski, Lauryn L.; de Souza Lucas, Francisco W.; ...
2016-08-01
Contact layers play an important role in thin film solar cells, but new material development and optimization of its thickness is usually a long and tedious process. A high-throughput experimental approach has been used to accelerate the rate of research in photovoltaic (PV) light absorbers and transparent conductive electrodes, however the combinatorial research on contact layers is less common. Here, we report on the chemical bath deposition (CBD) of CdS thin films by combinatorial dip coating technique and apply these contact layers to Cu(In,Ga)Se2 (CIGSe) and Cu2ZnSnSe4 (CZTSe) light absorbers in PV devices. Combinatorial thickness steps of CdS thin films were achieved by removal of the substrate from the chemical bath, at regular intervals of time, and in equal distance increments. The trends in the photoconversion efficiency and in the spectral response of the PV devices as a function of thickness of CdS contacts were explained with the help of optical and morphological characterization of the CdS thin films. The maximum PV efficiency achieved for the combinatorial dip-coating CBD was similar to that for the PV devices processed using conventional CBD. Finally, the results of this study lead to the conclusion that combinatorial dip-coating can be used to accelerate the optimization of PV device performance of CdS and other candidate contact layers for a wide range of emerging absorbers.
Identification of combinatorial drug regimens for treatment of Huntington's disease using Drosophila
NASA Astrophysics Data System (ADS)
Agrawal, Namita; Pallos, Judit; Slepko, Natalia; Apostol, Barbara L.; Bodai, Laszlo; Chang, Ling-Wen; Chiang, Ann-Shyn; Michels Thompson, Leslie; Marsh, J. Lawrence
2005-03-01
We explore the hypothesis that the pathology of Huntington's disease involves multiple cellular mechanisms whose contributions to disease are incrementally additive or synergistic. We provide evidence that the photoreceptor neuron degeneration seen in flies expressing mutant human huntingtin correlates with widespread degenerative events in the Drosophila CNS. We use a Drosophila Huntington's disease model to establish dose regimens and protocols to assess the effectiveness of drug combinations used at low threshold concentrations. These proof-of-principle studies identify at least two potential combinatorial treatment options and illustrate a rapid and cost-effective paradigm for testing and optimizing combinatorial drug therapies while reducing side effects for patients with neurodegenerative disease. The potential for using prescreening in Drosophila to inform combinatorial therapies that are most likely to be effective for testing in mammals is discussed.
It looks easy! Heuristics for combinatorial optimization problems.
Chronicle, Edward P; MacGregor, James N; Ormerod, Thomas C; Burr, Alistair
2006-04-01
Human performance on instances of computationally intractable optimization problems, such as the travelling salesperson problem (TSP), can be excellent. We have proposed a boundary-following heuristic to account for this finding. We report three experiments with TSPs where the capacity to employ this heuristic was varied. In Experiment 1, participants free to use the heuristic produced solutions significantly closer to optimal than did those prevented from doing so. Experiments 2 and 3 together replicated this finding in larger problems and demonstrated that a potential confound had no effect. In all three experiments, performance was closely matched by a boundary-following model. The results implicate global rather than purely local processes. Humans may have access to simple, perceptually based, heuristics that are suited to some combinatorial optimization tasks.
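The boundary-following account above can be made concrete with a small sketch: start the tour from the convex hull (the boundary) of the city set and then add each interior city by cheapest insertion. This is an illustrative reading of the heuristic, not the authors' experimental code; the convex-hull routine, the insertion rule and the random instance are assumptions of the sketch.
import math, random

def convex_hull(points):
    # Monotone-chain convex hull; returns hull vertices in counter-clockwise order.
    pts = sorted(set(points))
    if len(pts) <= 2:
        return list(pts)
    def cross(o, a, b):
        return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def dist(a, b):
    return math.hypot(a[0]-b[0], a[1]-b[1])

def boundary_following_tour(points):
    tour = convex_hull(points)          # start with the boundary of the point set
    interior = [p for p in points if p not in tour]
    while interior:
        # Cheapest insertion: pick the (point, edge) pair with the smallest detour.
        best = None
        for p in interior:
            for i in range(len(tour)):
                a, b = tour[i], tour[(i+1) % len(tour)]
                detour = dist(a, p) + dist(p, b) - dist(a, b)
                if best is None or detour < best[0]:
                    best = (detour, p, i+1)
        _, p, pos = best
        tour.insert(pos, p)
        interior.remove(p)
    return tour

random.seed(0)
cities = [(random.random(), random.random()) for _ in range(30)]
tour = boundary_following_tour(cities)
length = sum(dist(tour[i], tour[(i+1) % len(tour)]) for i in range(len(tour)))
print(f"tour length over {len(cities)} cities: {length:.3f}")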
ERIC Educational Resources Information Center
Kittredge, Kevin W.; Marine, Susan S.; Taylor, Richard T.
2004-01-01
A molecule possessing other functional groups that could be hydrogenated is examined, and a variety of metal catalysts are evaluated under similar reaction conditions. Optimizing organic reactions is both time and labor intensive, and the use of a combinatorial parallel synthesis reactor proved to be a great time-saving device.
QAPgrid: A Two Level QAP-Based Approach for Large-Scale Data Analysis and Visualization
Inostroza-Ponta, Mario; Berretta, Regina; Moscato, Pablo
2011-01-01
Background The visualization of large volumes of data is a computationally challenging task that often promises rewarding new insights. There is great potential in the application of new algorithms and models from combinatorial optimisation. Datasets often contain "hidden regularities", and a combined identification and visualization method should reveal these structures and present them in a way that aids analysis. While several methodologies exist, including those that use non-linear optimization algorithms, severe limitations arise even when working with only a few hundred objects. Methodology/Principal Findings We present a new data visualization approach (QAPgrid) that reveals patterns of similarities and differences in large datasets of objects for which a similarity measure can be computed. Objects are assigned to positions on an underlying square grid in a two-dimensional space. We use the Quadratic Assignment Problem (QAP) as a mathematical model to provide an objective function for the assignment of objects to positions on the grid. We employ a Memetic Algorithm (a powerful metaheuristic) to tackle large instances of this NP-hard combinatorial optimization problem, and we show its performance on the visualization of real data sets. Conclusions/Significance Overall, the results show that the QAPgrid algorithm is able to produce a layout that represents the relationships between objects in the data set. Furthermore, it also represents the relationships between the clusters that are fed into the algorithm. We apply QAPgrid to an instance of 84 Indo-European languages, producing a near-optimal layout. Next, we produce a layout of 470 world universities with a high observed degree of correlation with the score used by the Academic Ranking of World Universities compiled by Shanghai Jiao Tong University, without the need for an ad hoc weighting of attributes. Finally, our Gene Ontology-based study on Saccharomyces cerevisiae demonstrates the scalability and precision of our method as a novel alternative tool for functional genomics. PMID:21267077
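As a rough illustration of the optimization model behind such a layout (not the memetic algorithm of the paper), the sketch below writes the grid placement as a QAP objective in which similar objects placed far apart are penalised, and improves a random assignment with a pairwise-swap local search. The similarity matrix and every parameter are invented for the example.
import itertools, random

def grid_distances(side):
    # Squared Euclidean distance between every pair of cells on a side x side grid.
    cells = [(r, c) for r in range(side) for c in range(side)]
    n = len(cells)
    return [[(cells[i][0]-cells[j][0])**2 + (cells[i][1]-cells[j][1])**2
             for j in range(n)] for i in range(n)]

def qap_cost(perm, sim, dist):
    # perm[i] = grid cell assigned to object i; similar objects far apart are penalised.
    n = len(perm)
    return sum(sim[i][j] * dist[perm[i]][perm[j]]
               for i in range(n) for j in range(n) if i != j)

def swap_local_search(sim, dist, iters=10000, seed=1):
    rng = random.Random(seed)
    n = len(sim)
    perm = list(range(n))
    rng.shuffle(perm)
    best = qap_cost(perm, sim, dist)
    for _ in range(iters):
        i, j = rng.sample(range(n), 2)
        perm[i], perm[j] = perm[j], perm[i]
        cost = qap_cost(perm, sim, dist)
        if cost <= best:
            best = cost                          # keep an improving (or equal) swap
        else:
            perm[i], perm[j] = perm[j], perm[i]  # undo a worsening swap
    return perm, best

side = 4                                         # 16 grid cells for 16 objects
rng = random.Random(0)
n = side * side
sim = [[0.0] * n for _ in range(n)]
for i, j in itertools.combinations(range(n), 2):
    sim[i][j] = sim[j][i] = rng.random()         # illustrative similarity matrix
perm, cost = swap_local_search(sim, grid_distances(side))
print("final layout cost:", round(cost, 2))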
Exact and Metaheuristic Approaches for a Bi-Objective School Bus Scheduling Problem.
Chen, Xiaopan; Kong, Yunfeng; Dang, Lanxue; Hou, Yane; Ye, Xinyue
2015-01-01
As a class of hard combinatorial optimization problems, the school bus routing problem has received considerable attention in recent decades. For a multi-school system, given the bus trips for each school, the school bus scheduling problem aims at optimizing bus schedules to serve all the trips within the school time windows. In this paper, we propose two approaches for solving the bi-objective school bus scheduling problem: an exact method based on mixed integer programming (MIP) and a metaheuristic method which combines simulated annealing with local search. We develop MIP formulations for homogeneous and heterogeneous fleet problems respectively and solve the models with the MIP solver CPLEX. The bus-type-based formulation for the heterogeneous fleet problem reduces model complexity in terms of the number of decision variables and constraints. The metaheuristic method is a two-stage framework for minimizing the number of buses to be used as well as the total travel distance of the buses. We evaluate the proposed MIP and the metaheuristic method on two benchmark datasets, showing that on both, our metaheuristic method significantly outperforms the respective state-of-the-art methods.
A 16-bit Coherent Ising Machine for One-Dimensional Ring and Cubic Graph Problems
NASA Astrophysics Data System (ADS)
Takata, Kenta; Marandi, Alireza; Hamerly, Ryan; Haribara, Yoshitaka; Maruo, Daiki; Tamate, Shuhei; Sakaguchi, Hiromasa; Utsunomiya, Shoko; Yamamoto, Yoshihisa
2016-09-01
Many tasks in modern life, such as planning efficient travel, processing images and optimizing integrated circuit designs, can be modeled as complex combinatorial optimization problems with binary variables. Such problems can be mapped to finding a ground state of the Ising Hamiltonian, and thus various physical systems have been studied to emulate and solve this Ising problem. Recently, networks of mutually injected optical oscillators, called coherent Ising machines, have been developed as promising solvers for the problem, benefiting from programmability, scalability and room-temperature operation. Here, we report a 16-bit coherent Ising machine based on a network of time-division-multiplexed femtosecond degenerate optical parametric oscillators. The system experimentally gives success rates of more than 99.6% for one-dimensional Ising ring and nondeterministic polynomial-time (NP) hard instances. The experimental and numerical results indicate that gradual pumping of the network, combined with the multiple spectral and temporal modes of the femtosecond pulses, can improve the computational performance of the Ising machine, offering a new path for tackling larger and more complex instances.
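To make the mapping mentioned above concrete, the toy sketch below encodes a small MAX-CUT instance as an Ising Hamiltonian and finds its ground state by brute-force enumeration. It only illustrates the problem encoding; the graph is invented and nothing here models the optical hardware described in the abstract.
import itertools

# Ising couplings: for MAX-CUT, put J_ij = +1 on each edge so that anti-aligned spins
# (s_i != s_j) lower the energy H(s) = sum over edges of J_ij * s_i * s_j.
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]   # small illustrative graph
n = 4

def ising_energy(spins):
    return sum(spins[i] * spins[j] for i, j in edges)

# Exhaustive search over the 2^n spin configurations (only viable for tiny n).
best = min(itertools.product([-1, +1], repeat=n), key=ising_energy)
cut = sum(1 for i, j in edges if best[i] != best[j])
print("ground state:", best, "energy:", ising_energy(best), "cut edges:", cut)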
Charleston, M A
1995-01-01
This article introduces a coherent language base for describing and working with characteristics of combinatorial optimization problems, which is at once general enough to be used in all such problems and precise enough to allow subtle concepts in this field to be discussed unambiguously. An example is provided of how this nomenclature is applied to an instance of the phylogeny problem. Also noted is the beneficial effect, on the landscape of the solution space, of transforming the observed data to account for multiple changes of character state.
Concept of combinatorial de novo design of drug-like molecules by particle swarm optimization.
Hartenfeller, Markus; Proschak, Ewgenij; Schüller, Andreas; Schneider, Gisbert
2008-07-01
We present a fast stochastic optimization algorithm for fragment-based molecular de novo design (COLIBREE, Combinatorial Library Breeding). The search strategy is based on a discrete version of particle swarm optimization. Molecules are represented by a scaffold, which remains constant during optimization, and variable linkers and side chains. Different linkers represent virtual chemical reactions. Side-chain building blocks were obtained from pseudo-retrosynthetic dissection of large compound databases. Here, ligand-based design was performed using chemically advanced template search (CATS) topological pharmacophore similarity to reference ligands as fitness function. A weighting scheme was included for particle swarm optimization-based molecular design, which permits the use of many reference ligands and allows for positive and negative design to be performed simultaneously. In a case study, the approach was applied to the de novo design of potential peroxisome proliferator-activated receptor subtype-selective agonists. The results demonstrate the ability of the technique to cope with large combinatorial chemistry spaces and its applicability to focused library design. The technique was able to perform exploitation of a known scheme and at the same time explorative search for novel ligands within the framework of a given molecular core structure. It thereby represents a practical solution for compound screening in the early hit and lead finding phase of a drug discovery project.
A preliminary study to metaheuristic approach in multilayer radiation shielding optimization
NASA Astrophysics Data System (ADS)
Arif Sazali, Muhammad; Rashid, Nahrul Khair Alang Md; Hamzah, Khaidzir
2018-01-01
Metaheuristics are high-level algorithmic concepts that can be used to develop heuristic optimization algorithms. One of their applications is finding optimal or near-optimal solutions to combinatorial optimization problems (COPs) such as scheduling, vehicle routing, and timetabling. Combinatorial optimization deals with finding optimal combinations or permutations of a given set of problem components when exhaustive search is not feasible. A radiation shield made of several layers of different materials can be regarded as a COP. The time taken to optimize the shield may be prohibitive when several parameters are involved, such as the number of materials, the thickness of the layers, and the arrangement of materials. Metaheuristics can be applied to reduce the optimization time, trading guaranteed optimal solutions for near-optimal solutions obtained in a comparably short amount of time. Applications of metaheuristics to radiation shield optimization are scarce. In this paper, we present a review of the suitability of metaheuristics for multilayer shielding design, specifically the genetic algorithm and the ant colony optimization (ACO) algorithm. We also propose an optimization model based on the ACO method.
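As a rough sketch of how ACO could be applied to the layer-selection problem described above, the toy code below lets ants choose one material per layer, scores each sequence with a made-up attenuation function standing in for a real transport calculation, and reinforces pheromone on the best sequence found. Material names, scores and parameters are all illustrative assumptions, not values from the paper.
import random

materials = ["lead", "poly", "boron", "steel"]      # candidate layer materials (illustrative)
attenuation = {"lead": 0.50, "poly": 0.20, "boron": 0.15, "steel": 0.35}  # made-up scores
n_layers, n_ants, n_iters, rho = 3, 10, 50, 0.1

def shield_quality(seq):
    # Toy objective: total attenuation plus a small bonus for alternating material types.
    score = sum(attenuation[m] for m in seq)
    score += 0.1 * sum(1 for a, b in zip(seq, seq[1:]) if a != b)
    return score

rng = random.Random(0)
# pheromone[layer][material] biases each layer's choice.
pheromone = [{m: 1.0 for m in materials} for _ in range(n_layers)]
best_seq, best_q = None, float("-inf")
for _ in range(n_iters):
    for _ in range(n_ants):
        seq = []
        for layer in range(n_layers):
            weights = [pheromone[layer][m] for m in materials]
            seq.append(rng.choices(materials, weights=weights, k=1)[0])
        q = shield_quality(seq)
        if q > best_q:
            best_seq, best_q = seq, q
    # Evaporate, then reinforce the best-so-far sequence.
    for layer in range(n_layers):
        for m in materials:
            pheromone[layer][m] *= (1.0 - rho)
        pheromone[layer][best_seq[layer]] += best_q
print("best layer sequence:", best_seq, "quality:", round(best_q, 2))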
A Combinatorial Platform for the Optimization of Peptidomimetic Methyl-Lysine Reader Antagonists
NASA Astrophysics Data System (ADS)
Barnash, Kimberly D.
Post-translational modification of histone N-terminal tails mediates chromatin compaction and, consequently, DNA replication, transcription, and repair. While numerous post-translational modifications decorate histone tails, lysine methylation is an abundant mark important for both gene activation and repression. Methyl-lysine (Kme) readers function through binding mono-, di-, or trimethyl-lysine. Chemical intervention of Kme readers faces numerous challenges due to the broad surface-groove interactions between readers and their cognate histone peptides; yet, the increasing interest in understanding chromatin-modifying complexes suggests tractable lead compounds for Kme readers are critical for elucidating the mechanisms of chromatin dysregulation in disease states and validating the druggability of these domains and complexes. The successful discovery of a peptide-derived chemical probe, UNC3866, for the Polycomb repressive complex 1 (PRC1) chromodomain Kme readers has proven the potential for selective peptidomimetic inhibition of reader function. Unfortunately, the systematic modification of peptides-to-peptidomimetics is a costly and inefficient strategy for target-class hit discovery against Kme readers. Through the exploration of biased chemical space via combinatorial on-bead libraries, we have developed two concurrent methodologies for Kme reader chemical probe discovery. We employ biased peptide combinatorial libraries as a hit discovery strategy with subsequent optimization via iterative targeted libraries. Peptide-to-peptidomimetic optimization through targeted library design was applied based on structure-guided library design around the interaction of the endogenous peptide ligand with three target Kme readers. Efforts targeting the WD40 reader EED led to the discovery of the 3-mer peptidomimetic ligand UNC5115 while combinatorial repurposing of UNC3866 for off-target chromodomains resulted in the discovery of UNC4991, a CDYL/2-selective ligand, and UNC4848, a MPP8 and CDYL/2 ligand. Ultimately, our efforts demonstrate the generalizability of a peptidomimetic combinatorial platform for the optimization of Kme reader ligands in a target class manner.
Network of time-multiplexed optical parametric oscillators as a coherent Ising machine
NASA Astrophysics Data System (ADS)
Marandi, Alireza; Wang, Zhe; Takata, Kenta; Byer, Robert L.; Yamamoto, Yoshihisa
2014-12-01
Finding the ground states of the Ising Hamiltonian maps to various combinatorial optimization problems in biology, medicine, wireless communications, artificial intelligence and social networks. So far, no efficient classical or quantum algorithm is known for these problems, and intensive research is focused on creating physical systems—Ising machines—capable of finding the absolute or approximate ground states of the Ising Hamiltonian. Here, we report an Ising machine using a network of degenerate optical parametric oscillators (OPOs). Spins are represented by the above-threshold binary phases of the OPOs, and the Ising couplings are realized by mutual injections. The network is implemented in a single OPO ring cavity with multiple trains of femtosecond pulses and configurable mutual couplings, and operates at room temperature. We programmed a small non-deterministic polynomial-time-hard problem on a 4-OPO Ising machine, and in 1,000 runs no computational error was detected.
Exploiting Quantum Resonance to Solve Combinatorial Problems
NASA Technical Reports Server (NTRS)
Zak, Michail; Fijany, Amir
2006-01-01
Quantum resonance would be exploited in a proposed quantum-computing approach to the solution of combinatorial optimization problems. In quantum computing in general, one takes advantage of the fact that an algorithm cannot be decoupled from the physical effects available to implement it. Prior approaches to quantum computing have involved exploitation of only a subset of known quantum physical effects, notably including parallelism and entanglement, but not including resonance. In the proposed approach, one would utilize the combinatorial properties of tensor-product decomposability of unitary evolution of many-particle quantum systems for physically simulating solutions to NP-complete problems (a class of problems that are intractable with respect to classical methods of computation). In this approach, reinforcement and selection of a desired solution would be executed by means of quantum resonance. Classes of NP-complete problems that are important in practice and could be solved by the proposed approach include planning, scheduling, search, and optimal design.
Combinatorial Methods for Exploring Complex Materials
NASA Astrophysics Data System (ADS)
Amis, Eric J.
2004-03-01
Combinatorial and high-throughput methods have changed the paradigm of pharmaceutical synthesis and have begun to have a similar impact on materials science research. Already there are examples of combinatorial methods used for inorganic materials, catalysts, and polymer synthesis. For many investigations the primary goal has been discovery of new material compositions that optimize properties such as phosphorescence or catalytic activity. In the midst of the excitement generated to "make things", another opportunity arises for materials science to "understand things" by using the efficiency of combinatorial methods. We have shown that combinatorial methods hold potential for rapid and systematic generation of experimental data over the multi-parameter space typical of investigations in polymer physics. We have applied the combinatorial approach to studies of polymer thin films, biomaterials, polymer blends, filled polymers, and semicrystalline polymers. By combining library fabrication, high-throughput measurements, informatics, and modeling we can demonstrate validation of the methodology, new observations, and developments toward predictive models. This talk will present some of our latest work with applications to coating stability, multi-component formulations, and nanostructure assembly.
Multiobjective optimization of combinatorial libraries.
Agrafiotis, D K
2002-01-01
Combinatorial chemistry and high-throughput screening have caused a fundamental shift in the way chemists contemplate experiments. Designing a combinatorial library is a controversial art that involves a heterogeneous mix of chemistry, mathematics, economics, experience, and intuition. Although there seems to be little agreement as to what constitutes an ideal library, one thing is certain: the quality of a design is seldom defined by a single property or measure. In most real-world applications, a good experiment requires the simultaneous optimization of several, often conflicting, design objectives, some of which may be vague and uncertain. In this paper, we discuss a class of algorithms for subset selection rooted in the principles of multiobjective optimization. Our approach is to employ an objective function that encodes all of the desired selection criteria, and then use a simulated annealing or evolutionary approach to identify the optimal (or a nearly optimal) subset from among the vast number of possibilities. Many design criteria can be accommodated, including diversity, similarity to known actives, predicted activity and/or selectivity determined by quantitative structure-activity relationship (QSAR) models or receptor binding models, enforcement of certain property distributions, reagent cost and availability, and many others. The method is robust, convergent, and extensible, offers the user full control over the relative significance of the various objectives in the final design, and permits the simultaneous selection of compounds from multiple libraries in full- or sparse-array format.
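A minimal sketch of the selection scheme described above, under purely illustrative assumptions: each virtual compound carries made-up scores standing in for predicted activity and reagent cost, a weighted-sum objective encodes the competing criteria, and simulated annealing swaps compounds in and out of the selected subset. A real library design would add pairwise terms such as diversity; none of the numbers below come from the paper.
import math, random

rng = random.Random(42)
n_pool, k = 200, 20                                   # pick 20 of 200 virtual compounds
# Illustrative per-compound scores standing in for predicted activity and reagent cost.
activity = [rng.random() for _ in range(n_pool)]
cost     = [rng.random() for _ in range(n_pool)]

def objective(subset, w_act=1.0, w_cost=0.5):
    # Weighted sum of the selection criteria; larger is better.
    return (w_act * sum(activity[i] for i in subset)
            - w_cost * sum(cost[i] for i in subset))

selected = set(rng.sample(range(n_pool), k))
current = objective(selected)
T = 1.0
for step in range(5000):
    out_i = rng.choice(sorted(selected))
    in_i  = rng.choice([i for i in range(n_pool) if i not in selected])
    candidate = (selected - {out_i}) | {in_i}
    delta = objective(candidate) - current
    # Metropolis acceptance: always take improvements, sometimes take downhill moves.
    if delta > 0 or rng.random() < math.exp(delta / T):
        selected, current = candidate, current + delta
    T *= 0.999                                        # geometric cooling schedule
print("final objective:", round(current, 3))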
Guturu, Parthasarathy; Dantu, Ram
2008-06-01
Many graph- and set-theoretic problems, because of their tremendous application potential and theoretical appeal, have been well investigated by the researchers in complexity theory and were found to be NP-hard. Since the combinatorial complexity of these problems does not permit exhaustive searches for optimal solutions, only near-optimal solutions can be explored using either various problem-specific heuristic strategies or metaheuristic global-optimization methods, such as simulated annealing, genetic algorithms, etc. In this paper, we propose a unified evolutionary algorithm (EA) to the problems of maximum clique finding, maximum independent set, minimum vertex cover, subgraph and double subgraph isomorphism, set packing, set partitioning, and set cover. In the proposed approach, we first map these problems onto the maximum clique-finding problem (MCP), which is later solved using an evolutionary strategy. The proposed impatient EA with probabilistic tabu search (IEA-PTS) for the MCP integrates the best features of earlier successful approaches with a number of new heuristics that we developed to yield a performance that advances the state of the art in EAs for the exploration of the maximum cliques in a graph. Results of experimentation with the 37 DIMACS benchmark graphs and comparative analyses with six state-of-the-art algorithms, including two from the smaller EA community and four from the larger metaheuristics community, indicate that the IEA-PTS outperforms the EAs with respect to a Pareto-lexicographic ranking criterion and offers competitive performance on some graph instances when individually compared to the other heuristic algorithms. It has also successfully set a new benchmark on one graph instance. On another benchmark suite called Benchmarks with Hidden Optimal Solutions, IEA-PTS ranks second, after a very recent algorithm called COVER, among its peers that have experimented with this suite.
Tuning the physical properties of amorphous In–Zn–Sn–O thin films using combinatorial sputtering
Ndione, Paul F.; Zakutayev, A.; Kumar, M.; ...
2016-12-05
Transparent conductive oxides and amorphous oxide semiconductors are important materials for many modern technologies. Here, we explore the ternary indium zinc tin oxide (IZTO) using combinatorial synthesis and spatially resolved characterization. The electrical conductivity, work function, absorption onset, mechanical hardness, and elastic modulus of the optically transparent (>85%) amorphous IZTO thin films were found to be in the range of 10–2415 S/cm, 4.6–5.3 eV, 3.20–3.34 eV, 9.0–10.8 GPa, and 111–132 GPa, respectively, depending on the cation composition and the deposition conditions. Furthermore, this study enables control of IZTO performance over a broad range of cation compositions.
USDA-ARS?s Scientific Manuscript database
Ant Colony Optimization (ACO) refers to the family of algorithms inspired by the behavior of real ants and used to solve combinatorial problems such as the Traveling Salesman Problem (TSP). Optimal Foraging Theory (OFT) is an evolutionary principle wherein foraging organisms or insect parasites seek ...
Accurate multiple sequence-structure alignment of RNA sequences using combinatorial optimization.
Bauer, Markus; Klau, Gunnar W; Reinert, Knut
2007-07-27
The discovery of functional non-coding RNA sequences has led to an increasing interest in algorithms related to RNA analysis. Traditional sequence alignment algorithms, however, fail at computing reliable alignments of low-homology RNA sequences. The spatial conformation of RNA sequences largely determines their function, and therefore RNA alignment algorithms have to take structural information into account. We present a graph-based representation for sequence-structure alignments, which we model as an integer linear program (ILP). We sketch how we compute an optimal or near-optimal solution to the ILP using methods from combinatorial optimization, and present results on a recently published benchmark set for RNA alignments. The implementation of our algorithm yields better alignments in terms of two published scores than the other programs that we tested: This is especially the case with an increasing number of input sequences. Our program LARA is freely available for academic purposes from http://www.planet-lisa.net.
A gradient system solution to Potts mean field equations and its electronic implementation.
Urahama, K; Ueno, S
1993-03-01
A gradient system solution method is presented for solving Potts mean field equations for combinatorial optimization problems subject to winner-take-all constraints. In the proposed solution method the optimum solution is searched by using gradient descent differential equations whose trajectory is confined within the feasible solution space of optimization problems. This gradient system is proven theoretically to always produce a legal local optimum solution of combinatorial optimization problems. An elementary analog electronic circuit implementing the presented method is designed on the basis of current-mode subthreshold MOS technologies. The core constituent of the circuit is the winner-take-all circuit developed by Lazzaro et al. Correct functioning of the presented circuit is exemplified with simulations of the circuits implementing the scheme for solving the shortest path problems.
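A small software analogue of the mean-field dynamics described above (the analog circuit itself is not modelled here): each row of a toy assignment problem holds a probability vector over its allowed states, updated by a softmax of the local field, which plays the role of the winner-take-all constraint, while a penalty term discourages two rows from claiming the same column. Costs, the penalty weight and the annealing schedule are illustrative assumptions, and feasibility of the final rounding is not guaranteed for arbitrary settings.
import math, random

rng = random.Random(0)
N = 5
C = [[rng.random() for _ in range(N)] for _ in range(N)]   # illustrative assignment costs

# v[i][a] = soft "probability" that row i selects column a (one winner per row).
v = [[1.0 / N] * N for _ in range(N)]
T, lam = 1.0, 2.0                                          # temperature and conflict penalty

for sweep in range(200):
    for i in range(N):
        # Local field: own cost plus a penalty for columns other rows already claim.
        u = [C[i][a] + lam * sum(v[j][a] for j in range(N) if j != i) for a in range(N)]
        z = [math.exp(-ua / T) for ua in u]
        s = sum(z)
        v[i] = [za / s for za in z]                        # softmax = mean-field winner-take-all
    T = max(0.05, T * 0.97)                                # anneal the temperature

assignment = [max(range(N), key=lambda a: v[i][a]) for i in range(N)]
print("assignment:", assignment,
      "cost:", round(sum(C[i][assignment[i]] for i in range(N)), 3))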
Xiang, X D
Combinatorial materials synthesis methods and high-throughput evaluation techniques have been developed to accelerate the process of materials discovery and optimization and phase-diagram mapping. Analogous to integrated circuit chips, integrated materials chips containing thousands of discrete different compositions or continuous phase diagrams, often in the form of high-quality epitaxial thin films, can be fabricated and screened for interesting properties. Microspot x-ray method, various optical measurement techniques, and a novel evanescent microwave microscope have been used to characterize the structural, optical, magnetic, and electrical properties of samples on the materials chips. These techniques are routinely used to discover/optimize and map phase diagrams of ferroelectric, dielectric, optical, magnetic, and superconducting materials.
Turkett, Jeremy A; Bicker, Kevin L
2017-04-10
Growing prevalence of antibiotic resistant bacterial infections necessitates novel antimicrobials, which could be rapidly identified from combinatorial libraries. We report the use of the peptoid library agar diffusion (PLAD) assay to screen peptoid libraries against the ESKAPE pathogens, including the optimization of assay conditions for each pathogen. Work presented here focuses on the tailoring of combinatorial peptoid library design through a detailed study of how peptoid lipophilicity relates to antibacterial potency and mammalian cell toxicity. The information gleaned from this optimization was then applied using the aforementioned screening method to examine the relative potency of peptoid libraries against Staphylococcus aureus, Acinetobacter baumannii, and Enterococcus faecalis prior to and following functionalization with long alkyl tails. The data indicate that overall peptoid hydrophobicity and not simply alkyl tail length is strongly correlated with mammalian cell toxicity. Furthermore, this work demonstrates the utility of the PLAD assay in rapidly evaluating the effect of molecular property changes in similar libraries.
High performance genetic algorithm for VLSI circuit partitioning
NASA Astrophysics Data System (ADS)
Dinu, Simona
2016-12-01
Partitioning is one of the biggest challenges in computer-aided design for VLSI circuits (very large-scale integrated circuits). This work addresses the min-cut balanced circuit partitioning problem: dividing the graph that models the circuit into k almost equally sized sub-graphs while minimizing the number of edges cut, i.e., the number of edges connecting the sub-graphs. The problem may be formulated as a combinatorial optimization problem. It is known to be NP-hard, and thus it is important to design an efficient heuristic algorithm to solve it. The approach proposed in this study is a parallel implementation of a genetic algorithm, namely an island model. The information exchange between the evolving subpopulations is modeled using a fuzzy controller, which determines an optimal balance between exploration and exploitation of the solution space. The results of simulations show that the proposed algorithm outperforms the standard sequential genetic algorithm both in terms of solution quality and convergence speed. As a direction for future study, this research can be extended to incorporate local search operators that include problem-specific knowledge. In addition, the adaptive configuration of mutation and crossover rates is another avenue for future research.
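The sketch below illustrates only the core fitness evaluation such a GA needs for min-cut balanced partitioning (cut size plus a balance penalty), wrapped in a tiny sequential GA; the island model and the fuzzy controller of the paper are not reproduced, and the random graph and all parameters are assumptions of the example.
import random
from collections import Counter

def cut_and_balance_fitness(assign, edges, k, penalty=1.0):
    # Lower is better: edges crossing blocks plus a penalty for deviating from equal sizes.
    cut = sum(1 for u, v in edges if assign[u] != assign[v])
    sizes = Counter(assign)
    target = len(assign) / k
    imbalance = sum(abs(sizes.get(b, 0) - target) for b in range(k))
    return cut + penalty * imbalance

rng = random.Random(3)
n, k = 20, 2
edges = [(i, j) for i in range(n) for j in range(i + 1, n) if rng.random() < 0.2]

# Tiny generational GA: truncation selection, uniform crossover, point mutation.
pop = [[rng.randrange(k) for _ in range(n)] for _ in range(40)]
for gen in range(100):
    pop.sort(key=lambda a: cut_and_balance_fitness(a, edges, k))
    survivors = pop[:20]
    children = []
    while len(children) < 20:
        p1, p2 = rng.sample(survivors, 2)
        child = [p1[i] if rng.random() < 0.5 else p2[i] for i in range(n)]
        if rng.random() < 0.3:
            child[rng.randrange(n)] = rng.randrange(k)     # mutation
        children.append(child)
    pop = survivors + children
best = min(pop, key=lambda a: cut_and_balance_fitness(a, edges, k))
print("best fitness:", round(cut_and_balance_fitness(best, edges, k), 2))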
NASA Astrophysics Data System (ADS)
Kunze, Herb; La Torre, Davide; Lin, Jianyi
2017-01-01
We consider the inverse problem associated with IFSM: given a target function f, find an IFSM such that its fixed point f̄ is sufficiently close to f in the Lp distance. Forte and Vrscay [1] showed how to reduce this problem to a quadratic optimization model. In this paper, we extend the collage-based method developed by Kunze, La Torre and Vrscay ([2][3][4]) by proposing the minimization of the ℓ1-norm instead of the ℓ0-norm. In fact, optimization problems involving the ℓ0-norm are combinatorial in nature, and hence in general NP-hard. To overcome these difficulties, we introduce the ℓ1-norm and propose a Sequential Quadratic Programming algorithm to solve the corresponding inverse problem. As in Kunze, La Torre and Vrscay [3], in our formulation the minimization of the collage error is treated as a multi-criteria problem that includes three different and conflicting criteria, i.e., collage error, entropy and sparsity. This multi-criteria program is solved by means of a scalarization technique which reduces the model to a single-criterion program by combining all objective functions with different trade-off weights. The results of some numerical computations are presented.
Xu, Jiuping; Feng, Cuiying
2014-01-01
This paper presents an extension of the multimode resource-constrained project scheduling problem for a large scale construction project where multiple parallel projects and a fuzzy random environment are considered. By taking into account the most typical goals in project management, a cost/weighted makespan/quality trade-off optimization model is constructed. To deal with the uncertainties, a hybrid crisp approach is used to transform the fuzzy random parameters into fuzzy variables that are subsequently defuzzified using an expected value operator with an optimistic-pessimistic index. Then a combinatorial-priority-based hybrid particle swarm optimization algorithm is developed to solve the proposed model, where the combinatorial particle swarm optimization and priority-based particle swarm optimization are designed to assign modes to activities and to schedule activities, respectively. Finally, the results and analysis of a practical example at a large scale hydropower construction project are presented to demonstrate the practicality and efficiency of the proposed model and optimization method.
Combinatorial Multiobjective Optimization Using Genetic Algorithms
NASA Technical Reports Server (NTRS)
Crossley, William A.; Martin, Eric T.
2002-01-01
The research described in this document investigated multiobjective optimization approaches based upon the Genetic Algorithm (GA). Several versions of the GA have been adopted for multiobjective design, but, prior to this research, there had not been significant comparisons of the most popular strategies. The research effort first generalized the two-branch tournament genetic algorithm into an N-branch genetic algorithm; the N-branch GA was then compared with a version of the popular Multi-Objective Genetic Algorithm (MOGA). Because the genetic algorithm is well suited to combinatorial (mixed discrete/continuous) optimization problems, the GA can be used in the conceptual phase of design to combine selection (discrete variable) and sizing (continuous variable) tasks. Using a multiobjective formulation for the design of a 50-passenger aircraft to meet the competing objectives of minimizing takeoff gross weight and minimizing trip time, the GA generated a range of tradeoff designs that illustrate which aircraft features change from a low-weight, slow-trip-time design to a heavy-weight, short-trip-time design. Given the objective formulation and analysis methods used, the results of this study identify where turboprop-powered and turbofan-powered aircraft become more desirable for the 50-seat passenger application. This aircraft design application also begins to suggest how a combinatorial multiobjective optimization technique could be used to assist in the design of morphing aircraft.
Optimized Reaction Conditions for Amide Bond Formation in DNA-Encoded Combinatorial Libraries.
Li, Yizhou; Gabriele, Elena; Samain, Florent; Favalli, Nicholas; Sladojevich, Filippo; Scheuermann, Jörg; Neri, Dario
2016-08-08
DNA-encoded combinatorial libraries are increasingly being used as tools for the discovery of small organic binding molecules to proteins of biological or pharmaceutical interest. In the majority of cases, synthetic procedures for the formation of DNA-encoded combinatorial libraries incorporate at least one step of amide bond formation between amino-modified DNA and a carboxylic acid. We investigated reaction conditions and established a methodology by using 1-ethyl-3-(3-(dimethylamino)propyl)carbodiimide, 1-hydroxy-7-azabenzotriazole and N,N'-diisopropylethylamine (EDC/HOAt/DIPEA) in combination, which provided conversions greater than 75% for 423/543 (78%) of the carboxylic acids tested. These reaction conditions were efficient with a variety of primary and secondary amines, as well as with various types of amino-modified oligonucleotides. The reaction conditions, which also worked efficiently over a broad range of DNA concentrations and reaction scales, should facilitate the synthesis of novel DNA-encoded combinatorial libraries.
Exact model reduction of combinatorial reaction networks
Conzelmann, Holger; Fey, Dirk; Gilles, Ernst D
2008-01-01
Background Receptors and scaffold proteins usually possess a high number of distinct binding domains inducing the formation of large multiprotein signaling complexes. Due to combinatorial reasons the number of distinguishable species grows exponentially with the number of binding domains and can easily reach several millions. Even by including only a limited number of components and binding domains the resulting models are very large and hardly manageable. A novel model reduction technique allows the significant reduction and modularization of these models. Results We introduce methods that extend and complete the already introduced approach. For instance, we provide techniques to handle the formation of multi-scaffold complexes as well as receptor dimerization. Furthermore, we discuss a new modeling approach that allows the direct generation of exactly reduced model structures. The developed methods are used to reduce a model of EGF and insulin receptor crosstalk comprising 5,182 ordinary differential equations (ODEs) to a model with 87 ODEs. Conclusion The methods, presented in this contribution, significantly enhance the available methods to exactly reduce models of combinatorial reaction networks. PMID:18755034
A Discriminative Sentence Compression Method as Combinatorial Optimization Problem
NASA Astrophysics Data System (ADS)
Hirao, Tsutomu; Suzuki, Jun; Isozaki, Hideki
In the study of automatic summarization, the main research topic used to be `important sentence extraction', but nowadays `sentence compression' is a hot research topic. Conventional sentence compression methods usually transform a given sentence into a parse tree or a dependency tree and modify it to obtain a shorter sentence. However, this approach is sometimes too rigid. In this paper, we regard sentence compression as a combinatorial optimization problem that extracts an optimal subsequence of words. Hori et al. also proposed a similar method, but they used only a small number of features and their weights were tuned by hand. We introduce a large number of features, such as part-of-speech bigrams and word position in the sentence. Furthermore, we train the system by discriminative learning. According to our experiments, our method obtained better scores than other methods, with statistical significance.
Automatically Generated Algorithms for the Vertex Coloring Problem
Contreras Bolton, Carlos; Gatica, Gustavo; Parada, Víctor
2013-01-01
The vertex coloring problem is a classical problem in combinatorial optimization that consists of assigning a color to each vertex of a graph such that no adjacent vertices share the same color, minimizing the number of colors used. Despite the various practical applications that exist for this problem, its NP-hardness still represents a computational challenge. Some of the best computational results obtained for this problem are consequences of hybridizing the various known heuristics. Automatically searching the space formed by combining these techniques to find the most adequate combination has received less attention. In this paper, we propose exploring the heuristics space for the vertex coloring problem using evolutionary algorithms. We automatically generate three new algorithms by combining elementary heuristics. To evaluate the new algorithms, a computational experiment was performed that allowed comparing them numerically with existing heuristics. The obtained algorithms present an average relative error of 29.97%, while four other heuristics selected from the literature present a 59.73% error, considering 29 of the more difficult instances in the DIMACS benchmark. PMID:23516506
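One of the elementary heuristics that such generated algorithms typically build on is a greedy coloring in descending-degree order (Welsh-Powell style); a minimal sketch follows. The graph is invented, and the evolutionary combination of heuristics from the paper is not reproduced.
def greedy_coloring(adj):
    # Colour vertices in order of decreasing degree, giving each the smallest colour
    # not already used by a coloured neighbour (Welsh-Powell-style greedy heuristic).
    order = sorted(adj, key=lambda v: len(adj[v]), reverse=True)
    color = {}
    for v in order:
        used = {color[u] for u in adj[v] if u in color}
        c = 0
        while c in used:
            c += 1
        color[v] = c
    return color

# Small illustrative graph: a 5-cycle plus one chord.
adj = {0: {1, 4, 2}, 1: {0, 2}, 2: {1, 3, 0}, 3: {2, 4}, 4: {3, 0}}
coloring = greedy_coloring(adj)
print("colours used:", max(coloring.values()) + 1, coloring)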
Boyen, Peter; Van Dyck, Dries; Neven, Frank; van Ham, Roeland C H J; van Dijk, Aalt D J
2011-01-01
Correlated motif mining (CMM) is the problem of finding overrepresented pairs of patterns, called motifs, in sequences of interacting proteins. Algorithmic solutions for CMM thereby provide a computational method for predicting binding sites for protein interaction. In this paper, we adopt a motif-driven approach where the support of candidate motif pairs is evaluated in the network. We experimentally establish the superiority of the Chi-square-based support measure over other support measures. Furthermore, we show that CMM is an NP-hard problem for a large class of support measures (including Chi-square) and reformulate the search for correlated motifs as a combinatorial optimization problem. We then present the generic metaheuristic SLIDER, which uses steepest ascent with a neighborhood function based on sliding motifs and employs the Chi-square-based support measure. We show that SLIDER outperforms existing motif-driven CMM methods and scales to large protein-protein interaction networks. The SLIDER implementation and the data used in the experiments are available at http://bioinformatics.uhasselt.be.
A set partitioning reformulation for the multiple-choice multidimensional knapsack problem
NASA Astrophysics Data System (ADS)
Voß, Stefan; Lalla-Ruiz, Eduardo
2016-05-01
The Multiple-choice Multidimensional Knapsack Problem (MMKP) is a well-known NP-hard combinatorial optimization problem that has received a lot of attention from the research community, as it can be easily translated to several real-world problems arising in areas such as resource allocation, reliability engineering, cognitive radio networks, cloud computing, etc. In this regard, an exact model that is able to provide high-quality feasible solutions, either on its own or partially embedded in algorithmic schemes, is desirable. The MMKP basically consists of finding a subset of objects that maximizes the total profit while observing some capacity restrictions. In this article a reformulation of the MMKP as a set partitioning problem is proposed to allow for new insights into modelling the MMKP. The computational experimentation provides new insights into the problem itself and shows that the new model is able to improve on the best known results for some of the most common benchmark instances.
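The MMKP structure in miniature, as a hedged illustration rather than the set-partitioning model of the article: pick exactly one item from each group so that the summed resource use respects every capacity and the profit is maximal. Brute-force enumeration is used below and is viable only because the instance is tiny; all numbers are invented.
import itertools

# groups[g] = list of (profit, (resource_1, resource_2)) options; pick exactly one per group.
groups = [
    [(10, (3, 2)), (6, (1, 1)), (12, (4, 4))],
    [(7,  (2, 3)), (9, (3, 1))],
    [(5,  (1, 2)), (8, (2, 2)), (4, (1, 1))],
]
capacity = (7, 6)

best = None
for choice in itertools.product(*groups):
    # Sum resource usage over the chosen items, dimension by dimension.
    use = tuple(sum(item[1][d] for item in choice) for d in range(len(capacity)))
    if all(u <= c for u, c in zip(use, capacity)):
        profit = sum(item[0] for item in choice)
        if best is None or profit > best[0]:
            best = (profit, choice, use)
print("best profit:", best[0], "resource use:", best[2])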
Approximability of the d-dimensional Euclidean capacitated vehicle routing problem
NASA Astrophysics Data System (ADS)
Khachay, Michael; Dubinin, Roman
2016-10-01
The Capacitated Vehicle Routing Problem (CVRP) is a well-known intractable combinatorial optimization problem, which remains NP-hard even in the Euclidean plane. Since the introduction of this problem in the middle of the 20th century, many researchers have been involved in the study of its approximability. Most of the results obtained in this field are based on the well-known Iterated Tour Partition heuristic proposed by M. Haimovich and A. Rinnooy Kan in their celebrated paper, where they construct the first Polynomial Time Approximation Scheme (PTAS) for the single-depot CVRP in ℝ2. For decades, this result has been extended by many authors to numerous useful modifications of the problem taking into account multiple depots, pick-up and delivery options, time window restrictions, etc. But, to the best of our knowledge, almost none of these results go beyond the Euclidean plane. In this paper, we try to bridge this gap and propose an EPTAS for the Euclidean CVRP for any fixed dimension.
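A minimal sketch of the Iterated Tour Partition idea that this line of results builds on: take a travelling-salesman tour through the customers and cut it into consecutive segments of at most q customers, each served by one vehicle from the depot. The nearest-neighbour tour, the instance and the capacity are assumptions of the sketch; the analysis in the literature assumes a near-optimal tour rather than this naive one.
import math, random

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def nearest_neighbour_tour(points, start=0):
    # Naive TSP tour used only as input to the partition step.
    unvisited = set(range(len(points))) - {start}
    tour = [start]
    while unvisited:
        nxt = min(unvisited, key=lambda j: dist(points[tour[-1]], points[j]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

def iterated_tour_partition(depot, customers, q):
    # Split a customer tour into consecutive groups of at most q, one vehicle each.
    tour = nearest_neighbour_tour(customers)
    routes = [tour[i:i + q] for i in range(0, len(tour), q)]
    total = 0.0
    for r in routes:
        legs = [depot] + [customers[i] for i in r] + [depot]
        total += sum(dist(a, b) for a, b in zip(legs, legs[1:]))
    return routes, total

rng = random.Random(7)
depot = (0.5, 0.5)
customers = [(rng.random(), rng.random()) for _ in range(12)]
routes, total = iterated_tour_partition(depot, customers, q=4)
print(len(routes), "routes, total length", round(total, 3))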
Martín H., José Antonio
2013-01-01
Many practical problems in almost all scientific and technological disciplines have been classified as computationally hard (NP-hard or even NP-complete). In life sciences, combinatorial optimization problems frequently arise in molecular biology, e.g., genome sequencing; global alignment of multiple genomes; identifying siblings or discovery of dysregulated pathways. In almost all of these problems, there is the need for proving a hypothesis about certain property of an object that can be present if and only if it adopts some particular admissible structure (an NP-certificate) or be absent (no admissible structure), however, none of the standard approaches can discard the hypothesis when no solution can be found, since none can provide a proof that there is no admissible structure. This article presents an algorithm that introduces a novel type of solution method to “efficiently” solve the graph 3-coloring problem; an NP-complete problem. The proposed method provides certificates (proofs) in both cases: present or absent, so it is possible to accept or reject the hypothesis on the basis of a rigorous proof. It provides exact solutions and is polynomial-time (i.e., efficient) however parametric. The only requirement is sufficient computational power, which is controlled by the parameter . Nevertheless, here it is proved that the probability of requiring a value of to obtain a solution for a random graph decreases exponentially: , making tractable almost all problem instances. Thorough experimental analyses were performed. The algorithm was tested on random graphs, planar graphs and 4-regular planar graphs. The obtained experimental results are in accordance with the theoretical expected results. PMID:23349711
NASA Astrophysics Data System (ADS)
Green, Martin L.; Takeuchi, Ichiro; Hattrick-Simpers, Jason R.
2013-06-01
High throughput (combinatorial) materials science methodology is a relatively new research paradigm that offers the promise of rapid and efficient materials screening, optimization, and discovery. The paradigm started in the pharmaceutical industry but was rapidly adopted to accelerate materials research in a wide variety of areas. High throughput experiments are characterized by synthesis of a "library" sample that contains the materials variation of interest (typically composition), and rapid and localized measurement schemes that result in massive data sets. Because the data are collected at the same time on the same "library" sample, they can be highly uniform with respect to fixed processing parameters. This article critically reviews the literature pertaining to applications of combinatorial materials science for electronic, magnetic, optical, and energy-related materials. It is expected that high throughput methodologies will facilitate commercialization of novel materials for these critically important applications. Despite the overwhelming evidence presented in this paper that high throughput studies can effectively inform commercial practice, in our perception, it remains an underutilized research and development tool. Part of this perception may be due to the inaccessibility of proprietary industrial research and development practices, but clearly the initial cost and availability of high throughput laboratory equipment plays a role. Combinatorial materials science has traditionally been focused on materials discovery, screening, and optimization to combat the extremely high cost and long development times for new materials and their introduction into commerce. Going forward, combinatorial materials science will also be driven by other needs such as materials substitution and experimental verification of materials properties predicted by modeling and simulation, which have recently received much attention with the advent of the Materials Genome Initiative. Thus, the challenge for combinatorial methodology will be the effective coupling of synthesis, characterization and theory, and the ability to rapidly manage large amounts of data in a variety of formats.
Focusing on the golden ball metaheuristic: an extended study on a wider set of problems.
Osaba, E; Diaz, F; Carballedo, R; Onieva, E; Perallos, A
2014-01-01
Nowadays, the development of new metaheuristics for solving optimization problems is a topic of interest in the scientific community. A large number of techniques of this kind can be found in the literature, and many have been proposed recently, such as the artificial bee colony and the imperialist competitive algorithm. This paper focuses on one recently published technique, called Golden Ball (GB). The GB is a multiple-population metaheuristic based on soccer concepts. Although it was designed to solve combinatorial optimization problems, until now it has only been tested on two simple routing problems: the traveling salesman problem and the capacitated vehicle routing problem. In this paper, the GB is applied to four different combinatorial optimization problems. Two of them are routing problems that are more complex than the previously used ones: the asymmetric traveling salesman problem and the vehicle routing problem with backhauls. Additionally, one constraint satisfaction problem (the n-queens problem) and one combinatorial design problem (the one-dimensional bin packing problem) have also been used. The outcomes obtained by GB are compared with those obtained by two different genetic algorithms and two distributed genetic algorithms. Additionally, two statistical tests are conducted to compare these results.
An evolutionary strategy based on partial imitation for solving optimization problems
NASA Astrophysics Data System (ADS)
Javarone, Marco Alberto
2016-12-01
In this work we introduce an evolutionary strategy to solve combinatorial optimization tasks, i.e. problems characterized by a discrete search space. In particular, we focus on the Traveling Salesman Problem (TSP), a famous problem whose search space grows exponentially with the number of cities, making it NP-hard. Solutions of the TSP can be encoded as arrays of cities and evaluated by a fitness computed according to a cost function (e.g. the length of a path). Our method is based on the evolution of an agent population by means of an imitative mechanism we call 'partial imitation'. In particular, agents receive a random solution and then, interacting among themselves, may imitate the solutions of agents with a higher fitness. Since the imitation mechanism is only partial, agents copy only one entry (randomly chosen) of another array (i.e. solution). In doing so, the population converges towards a shared solution, behaving like a spin system undergoing a cooling process, i.e. driven towards an ordered phase. We highlight that the adopted 'partial imitation' mechanism allows the population to generate new solutions over time, before reaching the final equilibrium. Results of numerical simulations show that our method is able to find, in a finite time, both optimal and suboptimal solutions, depending on the size of the considered search space.
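A compact rendering of the update described above, under stated assumptions: the worse of two randomly paired agents copies a single randomly chosen city position from the better one, and the duplicate this creates is repaired by a swap (the repair rule and all parameters are assumptions of this sketch, not necessarily the paper's exact procedure).
import math, random

def tour_length(tour, pts):
    return sum(math.hypot(pts[a][0] - pts[b][0], pts[a][1] - pts[b][1])
               for a, b in zip(tour, tour[1:] + tour[:1]))

def partial_imitation(agent, model, rng):
    # Copy one randomly chosen position from the fitter agent's tour, then repair the
    # duplicate by swapping (repair rule is an assumption of this sketch).
    k = rng.randrange(len(agent))
    city = model[k]
    if agent[k] != city:
        j = agent.index(city)
        agent[j], agent[k] = agent[k], city

rng = random.Random(1)
n_cities, n_agents = 15, 30
pts = [(rng.random(), rng.random()) for _ in range(n_cities)]
agents = [rng.sample(range(n_cities), n_cities) for _ in range(n_agents)]

for step in range(20000):
    a, b = rng.sample(range(n_agents), 2)
    fa, fb = tour_length(agents[a], pts), tour_length(agents[b], pts)
    # The worse agent partially imitates the better one.
    if fa > fb:
        partial_imitation(agents[a], agents[b], rng)
    elif fb > fa:
        partial_imitation(agents[b], agents[a], rng)

best = min(agents, key=lambda t: tour_length(t, pts))
print("best tour length:", round(tour_length(best, pts), 3))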
Puthiyedth, Nisha; Riveros, Carlos; Berretta, Regina; Moscato, Pablo
2015-01-01
Background The joint study of multiple datasets has become a common technique for increasing statistical power in detecting biomarkers obtained from smaller studies. The approach generally followed is based on the fact that as the total number of samples increases, we expect to have greater power to detect associations of interest. This methodology has been applied to genome-wide association and transcriptomic studies due to the availability of datasets in the public domain. While this approach is well established in biostatistics, the introduction of new combinatorial optimization models to address this issue has not been explored in depth. In this study, we introduce a new model for the integration of multiple datasets and we show its application in transcriptomics. Methods We propose a new combinatorial optimization problem that addresses the core issue of biomarker detection in integrated datasets. Optimal solutions for this model deliver a feature selection from a panel of prospective biomarkers. The model we propose is a generalised version of the (α,β)-k-Feature Set problem. We illustrate the performance of this new methodology via a challenging meta-analysis task involving six prostate cancer microarray datasets. The results are then compared to the popular RankProd meta-analysis tool and to what can be obtained by analysing the individual datasets by statistical and combinatorial methods alone. Results Application of the integrated method resulted in a more informative signature than the rank-based meta-analysis or individual dataset results, and overcomes problems arising from real world datasets. The set of genes identified is highly significant in the context of prostate cancer. The method used does not rely on homogenisation or transformation of values to a common scale, and at the same time is able to capture markers associated with subgroups of the disease. PMID:26106884
Kwok, T; Smith, K A
2000-09-01
The aim of this paper is to study both the theoretical and experimental properties of chaotic neural network (CNN) models for solving combinatorial optimization problems. Previously we have proposed a unifying framework which encompasses the three main model types, namely, Chen and Aihara's chaotic simulated annealing (CSA) with decaying self-coupling, Wang and Smith's CSA with decaying timestep, and the Hopfield network with chaotic noise. Each of these models can be represented as a special case under the framework for certain conditions. This paper combines the framework with experimental results to provide new insights into the effect of the chaotic neurodynamics of each model. By solving the N-queen problem of various sizes with computer simulations, the CNN models are compared in different parameter spaces, with optimization performance measured in terms of feasibility, efficiency, robustness and scalability. Furthermore, characteristic chaotic neurodynamics crucial to effective optimization are identified, together with a guide to choosing the corresponding model parameters.
Podlewska, Sabina; Czarnecki, Wojciech M; Kafel, Rafał; Bojarski, Andrzej J
2017-02-27
The growing computational abilities of various tools applied in the broadly understood field of computer-aided drug design have led to the extreme popularity of virtual screening in the search for new biologically active compounds. Most often, the source of such molecules consists of commercially available compound databases, but they can also be searched for within libraries of structures generated in silico from existing ligands. Various computational combinatorial approaches are based solely on the chemical structure of compounds, using different types of substitutions to form new molecules. In this study, the starting point for combinatorial library generation was a fingerprint describing the optimal substructural composition with respect to activity toward a considered target, obtained using a machine learning-based optimization procedure. The systematic enumeration of all possible connections between preferred substructures resulted in the formation of target-focused libraries of new potential ligands. The compounds were initially assessed by machine learning methods using a hashed fingerprint to represent molecules; the distribution of their physicochemical properties was also investigated, as well as their synthetic accessibility. The examination of various fingerprints and machine learning algorithms indicated that the Klekota-Roth fingerprint and support vector machine were an optimal combination for such experiments. This study was performed for 8 protein targets, and the obtained compound sets and their characterization are publicly available at http://skandal.if-pan.krakow.pl/comb_lib/.
Optimizing Perioperative Decision Making: Improved Information for Clinical Workflow Planning
Doebbeling, Bradley N.; Burton, Matthew M.; Wiebke, Eric A.; Miller, Spencer; Baxter, Laurence; Miller, Donald; Alvarez, Jorge; Pekny, Joseph
2012-01-01
Perioperative care is complex and involves multiple interconnected subsystems. Delayed starts, prolonged cases and overtime are common. Surgical procedures account for 40–70% of hospital revenues and 30–40% of total costs. Most planning and scheduling in healthcare is done without modern planning tools, which have the potential to improve access by assisting in operations planning support. We identified key planning scenarios of interest to perioperative leaders in order to examine the feasibility of applying combinatorial optimization software to some of those planning issues in the operative setting. Perioperative leaders desire a broad range of tools for planning and assessing alternate solutions. Our models generated feasible solutions that varied as expected based on resource and policy assumptions, and identified better utilization of scarce resources. Combinatorial optimization modeling can effectively evaluate alternatives to support key decisions for planning clinical workflow and improving care efficiency and satisfaction. PMID:23304284
Awwal, Abdul; Diaz-Ramirez, Victor H.; Cuevas, Andres; ...
2014-10-23
Composite correlation filters are used for solving a wide variety of pattern recognition problems. These filters are given by a combination of several training templates chosen by a designer in an ad hoc manner. In this work, we present a new approach for the design of composite filters based on multi-objective combinatorial optimization. Given a vast search space of training templates, an iterative algorithm is used to synthesize a filter with an optimized performance in terms of several competing criteria. Furthermore, by employing a suggested binary-search procedure a filter bank with a minimum number of filters can be constructed, for a prespecified trade-off of performance metrics. Computer simulation results obtained with the proposed method in recognizing geometrically distorted versions of a target in cluttered and noisy scenes are discussed and compared in terms of recognition performance and complexity with existing state-of-the-art filters.
Rationally reduced libraries for combinatorial pathway optimization minimizing experimental effort.
Jeschek, Markus; Gerngross, Daniel; Panke, Sven
2016-03-31
Rational flux design in metabolic engineering approaches remains difficult since important pathway information is frequently not available. Therefore empirical methods are applied that randomly change absolute and relative pathway enzyme levels and subsequently screen for variants with improved performance. However, screening is often limited on the analytical side, generating a strong incentive to construct small but smart libraries. Here we introduce RedLibs (Reduced Libraries), an algorithm that allows for the rational design of smart combinatorial libraries for pathway optimization thereby minimizing the use of experimental resources. We demonstrate the utility of RedLibs for the design of ribosome-binding site libraries by in silico and in vivo screening with fluorescent proteins and perform a simple two-step optimization of the product selectivity in the branched multistep pathway for violacein biosynthesis, indicating a general applicability for the algorithm and the proposed heuristics. We expect that RedLibs will substantially simplify the refactoring of synthetic metabolic pathways.
Automatic Summarization as a Combinatorial Optimization Problem
NASA Astrophysics Data System (ADS)
Hirao, Tsutomu; Suzuki, Jun; Isozaki, Hideki
We derived the oracle summary with the highest ROUGE score that can be achieved by integrating sentence extraction with sentence compression from the reference abstract. The analysis of the oracle revealed that summarization systems have to assign an appropriate compression rate to each sentence in the document. In accordance with this observation, this paper proposes a summarization method cast as a combinatorial optimization: selecting the set of sentences that maximizes the sum of the sentence scores from a pool consisting of the sentences at various compression rates, subject to length constraints. The score of a sentence is defined by its compression rate, content words and positional information. The parameters for the compression rates and positional information are optimized by minimizing the loss between the scores of oracles and those of candidates. The results obtained on the TSC-2 corpus show that our method outperformed previous systems with statistical significance.
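In our notation (not the authors'), the selection step can be written as a knapsack-style 0-1 program, where x_i indicates whether candidate sentence i is selected, s_i is its score, ℓ_i its length, L the length limit, and C_j the set of candidates obtained from source sentence j at different compression rates (the last constraint, which we assume, keeps at most one compressed version per source sentence):

\[
\max_{x \in \{0,1\}^{n}} \ \sum_{i=1}^{n} s_i\, x_i
\qquad \text{s.t.} \qquad
\sum_{i=1}^{n} \ell_i\, x_i \le L,
\qquad
\sum_{i \in C_j} x_i \le 1 \ \ \text{for every source sentence } j .
\]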
Combinatorial optimization games
DOE Office of Scientific and Technical Information (OSTI.GOV)
Deng, X.; Ibaraki, Toshihide; Nagamochi, Hiroshi
1997-06-01
We introduce a general integer programming formulation for a class of combinatorial optimization games, which immediately allows us to improve the algorithmic result for finding imputations in the core (an important solution concept in cooperative game theory) of the network flow game on simple networks by Kalai and Zemel. An interesting result is a general theorem that the core for this class of games is nonempty if and only if a related linear program has an integer optimal solution. We study the properties for this mathematical condition to hold for several interesting problems, and apply them to resolve algorithmic and complexity issues for their cores along the following lines: decide whether the core is empty; if the core is nonempty, find an imputation in the core; given an imputation x, test whether x is in the core. We also explore the properties of totally balanced games in this succinct formulation of cooperative games.
Parallel tempering for the traveling salesman problem
DOE Office of Scientific and Technical Information (OSTI.GOV)
Percus, Allon; Wang, Richard; Hyman, Jeffrey
We explore the potential of parallel tempering as a combinatorial optimization method, applying it to the traveling salesman problem. We compare simulation results of parallel tempering with a benchmark implementation of simulated annealing, and study how different choices of parameters affect the relative performance of the two methods. We find that a straightforward implementation of parallel tempering can outperform simulated annealing in several crucial respects. When parameters are chosen appropriately, both methods yield a close approximation to the actual minimum distance for an instance with 200 nodes. However, parallel tempering yields more consistently accurate results when a series of independent simulations are performed. Our results suggest that parallel tempering might offer a simple but powerful alternative to simulated annealing for combinatorial optimization problems.
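For orientation, the sketch below shows a textbook-style parallel tempering loop for the TSP in Python: each replica performs Metropolis 2-opt moves at its own temperature, and neighbouring temperatures periodically attempt configuration swaps. It is a generic illustration under assumed parameter choices, not the benchmark implementation compared in the paper.

import math
import random

def tour_length(tour, dist):
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def two_opt(tour, rng):
    """Reverse a randomly chosen segment (standard 2-opt neighbourhood)."""
    i, j = sorted(rng.sample(range(len(tour)), 2))
    return tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]

def parallel_tempering(dist, temps, sweeps=5000, seed=1):
    rng = random.Random(seed)
    n = len(dist)
    replicas = [rng.sample(range(n), n) for _ in temps]
    energies = [tour_length(t, dist) for t in replicas]
    for _ in range(sweeps):
        # Metropolis update within each replica at its own temperature
        for k, T in enumerate(temps):
            cand = two_opt(replicas[k], rng)
            dE = tour_length(cand, dist) - energies[k]
            if dE <= 0 or rng.random() < math.exp(-dE / T):
                replicas[k], energies[k] = cand, energies[k] + dE
        # swap attempts between neighbouring temperatures
        for k in range(len(temps) - 1):
            arg = (1.0 / temps[k] - 1.0 / temps[k + 1]) * (energies[k] - energies[k + 1])
            if arg >= 0 or rng.random() < math.exp(arg):
                replicas[k], replicas[k + 1] = replicas[k + 1], replicas[k]
                energies[k], energies[k + 1] = energies[k + 1], energies[k]
    k_best = min(range(len(temps)), key=energies.__getitem__)
    return replicas[k_best], energies[k_best]

# toy usage: 20 random cities, Euclidean distances, 4 temperatures
pts = [(random.random(), random.random()) for _ in range(20)]
D = [[math.dist(p, q) for q in pts] for p in pts]
tour, length = parallel_tempering(D, temps=[0.02, 0.05, 0.1, 0.2])
print(round(length, 3))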
Investigations of quantum heuristics for optimization
NASA Astrophysics Data System (ADS)
Rieffel, Eleanor; Hadfield, Stuart; Jiang, Zhang; Mandra, Salvatore; Venturelli, Davide; Wang, Zhihui
We explore the design of quantum heuristics for optimization, focusing on the quantum approximate optimization algorithm, a metaheuristic developed by Farhi, Goldstone, and Gutmann. We develop specific instantiations of the quantum approximate optimization algorithm for a variety of challenging combinatorial optimization problems. Through theoretical analyses and numerical investigations of select problems, we provide insight into parameter setting and Hamiltonian design for quantum approximate optimization algorithms and related quantum heuristics, and into their implementation on hardware realizable in the near term.
NASA Astrophysics Data System (ADS)
Xue, Wei; Wang, Qi; Wang, Tianyu
2018-04-01
This paper presents an improved parallel combinatory spread spectrum (PC/SS) communication system based on a method of double information matching (DIM). Compared with the conventional PC/SS system, the new model retains the advantages of high transmission speed, large information capacity and high security. The traditional system, however, suffers from a high bit error rate (BER) because of its data-sequence mapping algorithm. By optimizing the mapping algorithm, the presented model achieves a lower BER and higher efficiency.
Hybrid Nested Partitions and Math Programming Framework for Large-scale Combinatorial Optimization
2010-03-31
optimization problems: 1) exact algorithms and 2) metaheuristic algorithms. This project will integrate concepts from these two technologies to develop... optimal solutions within an acceptable amount of computation time, and 2) metaheuristic algorithms such as genetic algorithms, tabu search, and the... integer programming decomposition approaches, such as Dantzig-Wolfe decomposition and Lagrangian relaxation, and metaheuristics such as the Nested
Development of the PEBLebl Traveling Salesman Problem Computerized Testbed
ERIC Educational Resources Information Center
Mueller, Shane T.; Perelman, Brandon S.; Tan, Yin Yin; Thanasuan, Kejkaew
2015-01-01
The traveling salesman problem (TSP) is a combinatorial optimization problem that requires finding the shortest path through a set of points ("cities") that returns to the starting point. Because humans provide heuristic near-optimal solutions to Euclidean versions of the problem, it has sometimes been used to investigate human visual…
CombiROC: an interactive web tool for selecting accurate marker combinations of omics data.
Mazzara, Saveria; Rossi, Riccardo L; Grifantini, Renata; Donizetti, Simone; Abrignani, Sergio; Bombaci, Mauro
2017-03-30
Diagnostic accuracy can be improved considerably by combining multiple markers, whose performance in identifying diseased subjects is usually assessed via receiver operating characteristic (ROC) curves. The selection of multimarker signatures is a complicated process that requires integration of data signatures with sophisticated statistical methods. We developed a user-friendly tool, called CombiROC, to help researchers accurately determine optimal marker combinations from diverse omics methods. With CombiROC, data from different domains, such as proteomics and transcriptomics, can be analyzed using sensitivity/specificity filters: the number of candidate marker panels arising from combinatorial analysis is easily optimized, bypassing limitations imposed by the nature of different experimental approaches. Leaving the user full control over the initial selection stringency, CombiROC computes sensitivity and specificity for all marker combinations, the performance of the best combinations and ROC curves for automatic comparisons, all visualized in a graphic interface. CombiROC was designed without hard-coded thresholds, allowing a custom fit to each specific dataset: this dramatically reduces the computational burden and lowers the false negative rates given by fixed thresholds. The application was validated with published data, confirming the marker combinations originally described or even finding new ones. CombiROC is a novel tool for the scientific community, freely available at http://CombiROC.eu.
Combinatorial therapy discovery using mixed integer linear programming.
Pang, Kaifang; Wan, Ying-Wooi; Choi, William T; Donehower, Lawrence A; Sun, Jingchun; Pant, Dhruv; Liu, Zhandong
2014-05-15
Combinatorial therapies play increasingly important roles in combating complex diseases. Owing to the huge cost associated with experimental methods for identifying optimal drug combinations, computational approaches can provide a guide to limit the search space and reduce cost. However, few computational approaches have been developed for this purpose, and thus there is a great need for new algorithms for drug combination prediction. Here we formulate the optimal combinatorial therapy problem as two complementary mathematical problems, Balanced Target Set Cover (BTSC) and Minimum Off-Target Set Cover (MOTSC). Given a disease gene set, BTSC seeks a balanced solution that maximizes the coverage of the disease genes and minimizes the off-target hits at the same time. MOTSC seeks full coverage of the disease gene set while minimizing the off-target set. Through simulation, both BTSC and MOTSC demonstrated a much faster running time than exhaustive search with the same accuracy. When applied to real disease gene sets, our algorithms not only identified known drug combinations, but also predicted novel drug combinations that are worth further testing. In addition, we developed a web-based tool to allow users to iteratively search for optimal drug combinations given a user-defined gene set. Our tool is freely available for noncommercial use at http://www.drug.liuzlab.org/. Contact: zhandong.liu@bcm.edu. Supplementary data are available at Bioinformatics online.
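As a concrete reading of the MOTSC idea (our notation and linearization, not taken from the paper): let x_j = 1 if drug j is selected, T_j and O_j be its on-target and off-target gene sets, D the disease gene set, and y_g = 1 if off-target gene g is hit by any selected drug. A mixed integer linear program then reads

\[
\min_{x,\,y \in \{0,1\}} \ \sum_{g \in O} y_g
\qquad \text{s.t.} \qquad
\sum_{j:\, g \in T_j} x_j \ \ge\ 1 \ \ \text{for all } g \in D,
\qquad
y_g \ \ge\ x_j \ \ \text{for all } j \text{ and } g \in O_j ,
\]

where O = ⋃_j O_j is the pool of possible off-target genes; the first constraint enforces full coverage of the disease genes and the second makes y_g count every off-target hit.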
Construction of a scFv Library with Synthetic, Non-combinatorial CDR Diversity.
Bai, Xuelian; Shim, Hyunbo
2017-01-01
Many large synthetic antibody libraries have been designed and constructed, and have successfully generated high-quality antibodies suitable for various demanding applications. While synthetic antibody libraries have many advantages, such as optimized framework sequences and a broader sequence landscape than natural antibodies, their sequence diversity is typically generated by random combinatorial synthetic processes, which cause the incorporation of many undesired CDR sequences. Here, we describe the construction of a synthetic scFv library using oligonucleotide mixtures that contain predefined, non-combinatorially synthesized CDR sequences. Each CDR is first inserted into a master scFv framework sequence and the resulting single-CDR libraries are subjected to a round of proofread panning. The proofread CDR sequences are then assembled to produce the final scFv library with six diversified CDRs.
Cascaded Optimization for a Persistent Data Ferrying Unmanned Aircraft
NASA Astrophysics Data System (ADS)
Carfang, Anthony
This dissertation develops and assesses a cascaded method for designing optimal periodic trajectories and link schedules for an unmanned aircraft to ferry data between stationary ground nodes. This results in a fast solution method without the need to artificially constrain system dynamics. Focusing on a fundamental ferrying problem that involves one source and one destination, but includes complex vehicle and Radio-Frequency (RF) dynamics, a cascaded structure to the system dynamics is uncovered. This structure is exploited by reformulating the nonlinear optimization problem into one that reduces the independent control to the vehicle's motion, while the link scheduling control is folded into the objective function and implemented as an optimal policy that depends on candidate motion control. This formulation is proven to maintain optimality while reducing computation time in comparison to traditional ferry optimization methods. The discrete link scheduling problem takes the form of a combinatorial optimization problem that is known to be NP-Hard. A derived necessary condition for optimality guides the development of several heuristic algorithms, specifically the Most-Data-First Algorithm and the Knapsack Adaptation. These heuristics are extended to larger ferrying scenarios, and assessed analytically and through Monte Carlo simulation, showing better throughput performance in the same order of magnitude of computation time in comparison to other common link scheduling policies. The cascaded optimization method is implemented with a novel embedded software system on a small, unmanned aircraft to validate the simulation results with field experiments. To address the sensitivity of results on trajectory tracking performance, a system that combines motion and link control with waypoint-based navigation is developed and assessed through field experiments. The data ferrying algorithms are further extended by incorporating a Gaussian process to opportunistically learn the RF environment. By continuously improving RF models, the cascaded planner can continually improve the ferrying system's overall performance.
Anatomy of the Attraction Basins: Breaking with the Intuition.
Hernando, Leticia; Mendiburu, Alexander; Lozano, Jose A
2018-05-22
Solving combinatorial optimization problems efficiently requires the development of algorithms that consider the specific properties of the problems. In this sense, local search algorithms are designed over a neighborhood structure that partially accounts for these properties. Given a neighborhood, the search space is usually interpreted as a natural landscape, with valleys and mountains. Under this perception, it is commonly believed that, if maximizing, the solutions located on the slopes of the same mountain belong to the same attraction basin, with the peaks of the mountains being the local optima. Unfortunately, this is a widespread but erroneous visualization of a combinatorial landscape. Thus, our aim is to clarify this aspect, providing a detailed analysis of, first, the existence of plateaus in which the local optima are involved, and second, the properties that define the topology of the attraction basins, so as to draw a reliable picture of the landscapes. Some of the features explored in this paper have never been examined before. Hence, new findings about the structure of the attraction basins are presented. The study focuses on instances of permutation-based combinatorial optimization problems considering the 2-exchange and the insert neighborhoods. As a consequence of this work, we break away from the widespread belief about the anatomy of attraction basins.
Fast and Efficient Discrimination of Traveling Salesperson Problem Stimulus Difficulty
ERIC Educational Resources Information Center
Dry, Matthew J.; Fontaine, Elizabeth L.
2014-01-01
The Traveling Salesperson Problem (TSP) is a computationally difficult combinatorial optimization problem. In spite of its relative difficulty, human solvers are able to generate close-to-optimal solutions in a close-to-linear time frame, and it has been suggested that this is due to the visual system's inherent sensitivity to certain geometric…
Limpoco, F Ted; Bailey, Ryan C
2011-09-28
We directly monitor in parallel and in real time the temporal profiles of polymer brushes simultaneously grown via multiple ATRP reaction conditions on a single substrate using arrays of silicon photonic microring resonators. In addition to probing relative polymerization rates, we show the ability to evaluate the dynamic properties of the in situ grown polymers. This presents a powerful new platform for studying modified interfaces that may allow for the combinatorial optimization of surface-initiated polymerization conditions.
Efficient search, mapping, and optimization of multi-protein genetic systems in diverse bacteria
Farasat, Iman; Kushwaha, Manish; Collens, Jason; Easterbrook, Michael; Guido, Matthew; Salis, Howard M
2014-01-01
Developing predictive models of multi-protein genetic systems to understand and optimize their behavior remains a combinatorial challenge, particularly when measurement throughput is limited. We developed a computational approach to build predictive models and identify optimal sequences and expression levels, while circumventing combinatorial explosion. Maximally informative genetic system variants were first designed by the RBS Library Calculator, an algorithm to design sequences for efficiently searching a multi-protein expression space across a > 10,000-fold range with tailored search parameters and well-predicted translation rates. We validated the algorithm's predictions by characterizing 646 genetic system variants, encoded in plasmids and genomes, expressed in six gram-positive and gram-negative bacterial hosts. We then combined the search algorithm with system-level kinetic modeling, requiring the construction and characterization of 73 variants to build a sequence-expression-activity map (SEAMAP) for a biosynthesis pathway. Using model predictions, we designed and characterized 47 additional pathway variants to navigate its activity space, find optimal expression regions with desired activity response curves, and relieve rate-limiting steps in metabolism. Creating sequence-expression-activity maps accelerates the optimization of many protein systems and allows previous measurements to quantitatively inform future designs. PMID:24952589
Azimi, Sayyed M; Sheridan, Steven D; Ghannad-Rezaie, Mostafa; Eimon, Peter M; Yanik, Mehmet Fatih
2018-05-01
Identification of optimal transcription-factor expression patterns to direct cellular differentiation along a desired pathway presents significant challenges. We demonstrate massively combinatorial screening of temporally varying mRNA transcription factors to direct differentiation of neural progenitor cells using a dynamically reconfigurable, magnetically guided spotting technology for localizing mRNA, enabling experiments on millimetre-sized spots. In addition, we present a time-interleaved delivery method that dramatically reduces fluctuations in the delivered transcription-factor copy numbers per cell. We screened combinatorial and temporal delivery of a pool of midbrain-specific transcription factors to augment the generation of dopaminergic neurons. We show that the combinatorial delivery of LMX1A, FOXA2 and PITX3 is highly effective in generating dopaminergic neurons from midbrain progenitors. We show that LMX1A significantly increases TH expression levels when delivered to neural progenitor cells either during proliferation or after induction of neural differentiation, while FOXA2 and PITX3 increase expression only when delivered prior to induction, demonstrating the temporal dependence of factor addition. © 2018, Azimi et al.
Dynamical analysis of continuous higher-order hopfield networks for combinatorial optimization.
Atencia, Miguel; Joya, Gonzalo; Sandoval, Francisco
2005-08-01
In this letter, the ability of higher-order Hopfield networks to solve combinatorial optimization problems is assessed by means of a rigorous analysis of their properties. The stability of the continuous network is almost completely clarified: (1) hyperbolic interior equilibria, which are unfeasible, are unstable; (2) the state cannot escape from the unitary hypercube; and (3) a Lyapunov function exists. Numerical methods used to implement the continuous equation on a computer should be designed with the aim of preserving these favorable properties. The case of nonhyperbolic fixed points, which occur when the Hessian of the target function is the null matrix, requires further study. We prove that these nonhyperbolic interior fixed points are unstable in networks with three neurons and order two. The conjecture that interior equilibria are unstable in the general case is left open.
Directed Differentiation of Embryonic Stem Cells Using a Bead-Based Combinatorial Screening Method
Tarunina, Marina; Hernandez, Diana; Johnson, Christopher J.; Rybtsov, Stanislav; Ramathas, Vidya; Jeyakumar, Mylvaganam; Watson, Thomas; Hook, Lilian; Medvinsky, Alexander; Mason, Chris; Choo, Yen
2014-01-01
We have developed a rapid, bead-based combinatorial screening method to determine optimal combinations of variables that direct stem cell differentiation to produce known or novel cell types having pre-determined characteristics. Here we describe three experiments comprising stepwise exposure of mouse or human embryonic cells to 10,000 combinations of serum-free differentiation media, through which we discovered multiple novel, efficient and robust protocols to generate a number of specific hematopoietic and neural lineages. We further demonstrate that the technology can be used to optimize existing protocols in order to substitute costly growth factors with bioactive small molecules and/or increase cell yield, and to identify in vitro conditions for the production of rare developmental intermediates such as an embryonic lymphoid progenitor cell that has not previously been reported. PMID:25251366
Combinatorial optimization problem solution based on improved genetic algorithm
NASA Astrophysics Data System (ADS)
Zhang, Peng
2017-08-01
The traveling salesman problem (TSP) is a classic combinatorial optimization problem and a simplified form of many complex problems. It is well understood that the parameters affecting the performance of a genetic algorithm mainly include the quality of the initial population, the population size, and the crossover and mutation probabilities. Accordingly, an improved genetic algorithm for solving the TSP is put forward. The population is graded according to individual similarity, and different operations are performed on individuals at different levels. In addition, an elitist retention strategy is adopted at each level, and the crossover and mutation operators are improved. Several experiments were designed to verify the feasibility of the algorithm. Analysis of the experimental results shows that the improved algorithm improves both the accuracy and the efficiency of the solution.
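For readers unfamiliar with permutation-encoded genetic operators, the snippet below shows one standard pair, order crossover (OX) and swap mutation, in Python; these are common textbook choices, not the specific improved operators proposed in the paper.

import random

def order_crossover(p1, p2, rng=random):
    """Order crossover (OX): keep a slice of parent 1, fill the remaining
    positions with the missing cities in the order they appear in parent 2."""
    n = len(p1)
    i, j = sorted(rng.sample(range(n), 2))
    child = [None] * n
    child[i:j + 1] = p1[i:j + 1]
    kept = set(p1[i:j + 1])
    fill = [c for c in p2[j + 1:] + p2[:j + 1] if c not in kept]
    for pos, c in zip(list(range(j + 1, n)) + list(range(i)), fill):
        child[pos] = c
    return child

def swap_mutation(tour, rng=random):
    """Exchange two randomly chosen cities."""
    a, b = rng.sample(range(len(tour)), 2)
    t = tour[:]
    t[a], t[b] = t[b], t[a]
    return t

print(order_crossover([0, 1, 2, 3, 4, 5], [5, 4, 3, 2, 1, 0]))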
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shendruk, Tyler N., E-mail: tyler.shendruk@physics.ox.ac.uk; Bertrand, Martin; Harden, James L.
2014-12-28
Given the ubiquity of depletion effects in biological and other soft matter systems, it is desirable to have coarse-grained Molecular Dynamics (MD) simulation approaches appropriate for the study of complex systems. This paper examines the use of two common truncated Lennard-Jones (Weeks-Chandler-Andersen (WCA)) potentials to describe a pair of colloidal particles in a thermal bath of depletants. The shifted-WCA model is the steeper of the two repulsive potentials considered, while the combinatorial-WCA model is the softer. It is found that the depletion-induced well depth for the combinatorial-WCA model is significantly deeper than the shifted-WCA model because the resulting overlap of the colloids yields extra accessible volume for depletants. For both shifted- and combinatorial-WCA simulations, the second virial coefficients and pair potentials between colloids are demonstrated to be well approximated by the Morphometric Thermodynamics (MT) model. This agreement suggests that the presence of depletants can be accurately modelled in MD simulations by implicitly including them through simple, analytical MT forms for depletion-induced interactions. Although both WCA potentials are found to be effective generic coarse-grained simulation approaches for studying depletion effects in complicated soft matter systems, combinatorial-WCA is the more efficient approach as depletion effects are enhanced at lower depletant densities. The findings indicate that for soft matter systems that are better modelled by potentials with some compressibility, predictions from hard-sphere systems could greatly underestimate the magnitude of depletion effects at a given depletant density.
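For reference, the WCA interaction discussed here is the Lennard-Jones potential cut at its minimum and shifted up so that it is purely repulsive; in standard notation,

\[
U_{\mathrm{WCA}}(r)=
\begin{cases}
4\varepsilon\left[\left(\dfrac{\sigma}{r}\right)^{12}-\left(\dfrac{\sigma}{r}\right)^{6}\right]+\varepsilon, & r \le 2^{1/6}\sigma,\\
0, & r > 2^{1/6}\sigma .
\end{cases}
\]

The shifted and combinatorial variants compared in the paper are two ways of building the colloid-depletant interaction from this base form for particles of unequal size, with the shifted version giving the steeper and the combinatorial version the softer effective repulsion, as stated above.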
Evolution, learning, and cognition
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Y.C.
1988-01-01
The book comprises more than fifteen articles in the areas of neural networks and connectionist systems, classifier systems, adaptive network systems, genetic algorithm, cellular automata, artificial immune systems, evolutionary genetics, cognitive science, optical computing, combinatorial optimization, and cybernetics.
Development of New Sensing Materials Using Combinatorial and High-Throughput Experimentation
NASA Astrophysics Data System (ADS)
Potyrailo, Radislav A.; Mirsky, Vladimir M.
New sensors with improved performance characteristics are needed for applications as diverse as bedside continuous monitoring, tracking of environmental pollutants, monitoring of food and water quality, monitoring of chemical processes, and safety in industrial, consumer, and automotive settings. Typical requirements in sensor improvement are selectivity, long-term stability, sensitivity, response time, reversibility, and reproducibility. Design of new sensing materials is the important cornerstone in the effort to develop new sensors. Often, sensing materials are too complex to predict their performance quantitatively in the design stage. Thus, combinatorial and high-throughput experimentation methodologies provide an opportunity to generate new required data to discover new sensing materials and/or to optimize existing material compositions. The goal of this chapter is to provide an overview of the key concepts of experimental development of sensing materials using combinatorial and high-throughput experimentation tools, and to promote additional fruitful interactions between computational scientists and experimentalists.
A combinatorial approach to the design of vaccines.
Martínez, Luis; Milanič, Martin; Legarreta, Leire; Medvedev, Paul; Malaina, Iker; de la Fuente, Ildefonso M
2015-05-01
We present two new combinatorial optimization problems and discuss their applications to the computational design of vaccines. In the shortest λ-superstring problem, given a family S1,...,S(k) of strings over a finite alphabet, a set T of "target" strings over that alphabet, and an integer λ, the task is to find a string of minimum length containing, for each i, at least λ of the target strings that are substrings of S(i). In the shortest λ-cover superstring problem, given a collection X1,...,X(n) of finite sets of strings over a finite alphabet and an integer λ, the task is to find a string of minimum length containing, for each i, at least λ elements of X(i) as substrings. The two problems are polynomially equivalent, and the shortest λ-cover superstring problem is a common generalization of two well-known combinatorial optimization problems, the shortest common superstring problem and the set cover problem. We present two approaches to obtain exact or approximate solutions to the shortest λ-superstring and λ-cover superstring problems: one based on integer programming, and a hill-climbing algorithm. An application is given to the computational design of vaccines, and the algorithms are applied to experimental data taken from patients infected by H5N1 and HIV-1.
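In symbols (our notation), the shortest λ-cover superstring problem asks for

\[
\min_{s}\ |s|
\quad\text{subject to}\quad
\bigl|\{\,x \in X_i : x \text{ is a substring of } s\,\}\bigr| \ \ge\ \lambda
\quad\text{for } i=1,\dots,n ,
\]

and with λ = 1 and singleton sets X_i = {S_i} it reduces to the shortest common superstring problem.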
1997-01-01
create a dependency tree containing an optimum set of n-1 first-order dependencies. To do this, first, we select an arbitrary bit Xroot to place at the...
- Set the root to an arbitrary bit Xroot.
- For all other bits Xi, set bestMatchingBitInTree[Xi] to Xroot.
- While not all bits have been...
Interference Aware Routing Using Spatial Reuse in Wireless Sensor Networks
2013-12-01
In practice there is no optimal STDMA algorithm due to the computational complexity of the STDMA implementation; therefore, the common approach is to...
NASA Astrophysics Data System (ADS)
Rahman, P. A.
2018-05-01
This scientific paper deals with a model of the knapsack optimization problem and a method of solving it based on directed combinatorial search in the boolean space. The author's specialized mathematical model for decomposing the search zone into separate search spheres, and the algorithm for distributing the search spheres across the cores of a multi-core processor, are also discussed. The paper provides an example of decomposing the search zone into several search spheres and distributing them across the cores of a quad-core processor. Finally, a formula offered by the author for estimating the theoretical maximum computational speed-up achievable by parallelizing the search zone into search spheres over an unlimited number of processor cores is also given.
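As background on the underlying problem (and not the author's parallel directed search), the standard 0/1 knapsack dynamic program fits in a few lines of Python:

def knapsack_01(values, weights, capacity):
    """Best total value achievable within the capacity, each item used at most once."""
    best = [0] * (capacity + 1)
    for v, w in zip(values, weights):
        # iterate capacities downwards so an item cannot be reused
        for c in range(capacity, w - 1, -1):
            best[c] = max(best[c], best[c - w] + v)
    return best[capacity]

print(knapsack_01([6, 10, 12], [1, 2, 3], 5))  # prints 22 (items of weight 2 and 3)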
j5 DNA assembly design automation.
Hillson, Nathan J
2014-01-01
Modern standardized methodologies, described in detail in the previous chapters of this book, have enabled the software-automated design of optimized DNA construction protocols. This chapter describes how to design (combinatorial) scar-less DNA assembly protocols using the web-based software j5. j5 helps biomedical and biotechnological researchers construct DNA by automating the design of optimized protocols for flanking homology sequence as well as type IIS endonuclease-mediated DNA assembly methodologies. Unlike any other software tool available today, j5 designs scar-less combinatorial DNA assembly protocols, performs a cost-benefit analysis to identify which portions of an assembly process would be less expensive to outsource to a DNA synthesis service provider, and designs hierarchical DNA assembly strategies to mitigate anticipated poor assembly junction sequence performance. Software integrated with j5 adds significant value to the j5 design process through graphical user-interface enhancement and downstream liquid-handling robotic laboratory automation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bernal, Andrés; Patiny, Luc; Castillo, Andrés M.
2015-02-21
Nuclear magnetic resonance (NMR) assignment of small molecules is presented as a typical example of a combinatorial optimization problem in chemical physics. Three strategies that help improve the efficiency of the solution search by the branch and bound method are presented: (1) reduction of the size of the solution space by resorting to a condensed structure formula, wherein symmetric nuclei are grouped together; (2) partitioning of the solution space based on symmetry, which becomes the basis for an efficient branching procedure; and (3) a criterion for selecting input restrictions that leads to increased gaps between branches and thus faster pruning of non-viable solutions. Although the examples chosen to illustrate this work focus on small-molecule NMR assignment, the results are generic and might help solve other combinatorial optimization problems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whitley, L. Darrell; Howe, Adele E.; Watson, Jean-Paul
2004-09-01
Tabu search is one of the most effective heuristics for locating high-quality solutions to a diverse array of NP-hard combinatorial optimization problems. Despite the widespread success of tabu search, researchers have a poor understanding of many key theoretical aspects of this algorithm, including models of the high-level run-time dynamics and identification of those search space features that influence problem difficulty. We consider these questions in the context of the job-shop scheduling problem (JSP), a domain where tabu search algorithms have been shown to be remarkably effective. Previously, we demonstrated that the mean distance between random local optima and the nearest optimal solution is highly correlated with problem difficulty for a well-known tabu search algorithm for the JSP introduced by Taillard. In this paper, we discuss various shortcomings of this measure and develop a new model of problem difficulty that corrects these deficiencies. We show that Taillard's algorithm can be modeled with high fidelity as a simple variant of a straightforward random walk. The random walk model accounts for nearly all of the variability in the cost required to locate both optimal and sub-optimal solutions to random JSPs, and provides an explanation for differences in the difficulty of random versus structured JSPs. Finally, we discuss and empirically substantiate two novel predictions regarding tabu search algorithm behavior. First, the method for constructing the initial solution is highly unlikely to impact the performance of tabu search. Second, tabu tenure should be selected to be as small as possible while simultaneously avoiding search stagnation; values larger than necessary lead to significant degradations in performance.
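For readers who want the shape of the algorithm being analysed, a generic tabu search skeleton looks roughly as follows in Python (a simplified sketch, not Taillard's JSP-specific implementation; the neighbourhood, cost function and aspiration rule are placeholders):

from collections import deque

def tabu_search(initial, neighbours, cost, tenure=8, iterations=500):
    """Generic tabu search: greedy best-admissible move with short-term move memory.

    `neighbours(x)` must yield (move, candidate) pairs with a hashable `move`.
    """
    current = best = initial
    best_cost = cost(best)
    tabu = deque(maxlen=tenure)          # recently applied moves
    for _ in range(iterations):
        admissible = [(m, y) for m, y in neighbours(current)
                      if m not in tabu or cost(y) < best_cost]  # aspiration criterion
        if not admissible:
            break
        move, current = min(admissible, key=lambda my: cost(my[1]))
        tabu.append(move)
        if cost(current) < best_cost:
            best, best_cost = current, cost(current)
    return best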
Jiménez-Moreno, Ester; Montalvillo-Jiménez, Laura; Santana, Andrés G; Gómez, Ana M; Jiménez-Osés, Gonzalo; Corzana, Francisco; Bastida, Agatha; Jiménez-Barbero, Jesús; Cañada, Francisco Javier; Gómez-Pinto, Irene; González, Carlos; Asensio, Juan Luis
2016-05-25
Development of strong and selective binders from promiscuous lead compounds represents one of the most expensive and time-consuming tasks in drug discovery. We herein present a novel fragment-based combinatorial strategy for the optimization of multivalent polyamine scaffolds as DNA/RNA ligands. Our protocol provides quick access to a large variety of regioisomer libraries that can be tested for selective recognition by combining microdialysis assays with simple isotope labeling and NMR experiments. To illustrate our approach, 20 small libraries comprising 100 novel kanamycin-B derivatives have been prepared and evaluated for selective binding to the ribosomal decoding A-site sequence. Contrary to the common view of NMR as a low-throughput technique, we demonstrate that our NMR methodology represents a valuable alternative for the detection and quantification of complex mixtures, even those composed of highly similar or structurally related derivatives, a common situation in the context of a lead optimization process. Furthermore, this study provides valuable clues about the structural requirements for selective A-site recognition.
Mondal, Milon; Radeva, Nedyalka; Fanlo-Virgós, Hugo; Otto, Sijbren; Klebe, Gerhard; Hirsch, Anna K H
2016-08-01
Fragment-based drug design (FBDD) affords active compounds for biological targets. While there are numerous reports on FBDD by fragment growing/optimization, fragment linking has rarely been reported. Dynamic combinatorial chemistry (DCC) has become a powerful hit-identification strategy for biological targets. We report the synergistic combination of fragment linking and DCC to identify inhibitors of the aspartic protease endothiapepsin. Based on X-ray crystal structures of endothiapepsin in complex with fragments, we designed a library of bis-acylhydrazones and used DCC to identify potent inhibitors. The most potent inhibitor exhibits an IC50 value of 54 nm, which represents a 240-fold improvement in potency compared to the parent hits. Subsequent X-ray crystallography validated the predicted binding mode, thus demonstrating the efficiency of the combination of fragment linking and DCC as a hit-identification strategy. This approach could be applied to a range of biological targets, and holds the potential to facilitate hit-to-lead optimization. © 2016 The Authors. Published by Wiley-VCH Verlag GmbH & Co. KGaA.
Smolensky, Paul; Goldrick, Matthew; Mathis, Donald
2014-08-01
Mental representations have continuous as well as discrete, combinatorial properties. For example, while predominantly discrete, phonological representations also vary continuously; this is reflected by gradient effects in instrumental studies of speech production. Can an integrated theoretical framework address both aspects of structure? The framework we introduce here, Gradient Symbol Processing, characterizes the emergence of grammatical macrostructure from the Parallel Distributed Processing microstructure (McClelland, Rumelhart, & The PDP Research Group, 1986) of language processing. The mental representations that emerge, Distributed Symbol Systems, have both combinatorial and gradient structure. They are processed through Subsymbolic Optimization-Quantization, in which an optimization process favoring representations that satisfy well-formedness constraints operates in parallel with a distributed quantization process favoring discrete symbolic structures. We apply a particular instantiation of this framework, λ-Diffusion Theory, to phonological production. Simulations of the resulting model suggest that Gradient Symbol Processing offers a way to unify accounts of grammatical competence with both discrete and continuous patterns in language performance. Copyright © 2013 Cognitive Science Society, Inc.
Design of focused and restrained subsets from extremely large virtual libraries.
Jamois, Eric A; Lin, Chien T; Waldman, Marvin
2003-11-01
With the current and ever-growing offering of reagents along with the vast palette of organic reactions, virtual libraries accessible to combinatorial chemists can reach sizes of billions of compounds or more. Extracting practical size subsets for experimentation has remained an essential step in the design of combinatorial libraries. A typical approach to computational library design involves enumeration of structures and properties for the entire virtual library, which may be unpractical for such large libraries. This study describes a new approach termed as on the fly optimization (OTFO) where descriptors are computed as needed within the subset optimization cycle and without intermediate enumeration of structures. Results reported herein highlight the advantages of coupling an ultra-fast descriptor calculation engine to subset optimization capabilities. We also show that enumeration of properties for the entire virtual library may not only be unpractical but also wasteful. Successful design of focused and restrained subsets can be achieved while sampling only a small fraction of the virtual library. We also investigate the stability of the method and compare results obtained from simulated annealing (SA) and genetic algorithms (GA).
Manipulating Tabu List to Handle Machine Breakdowns in Job Shop Scheduling Problems
NASA Astrophysics Data System (ADS)
Nababan, Erna Budhiarti; Salim Sitompul, Opim
2011-06-01
Machine breakdowns in a production schedule may occur at random, making the well-known hard combinatorial Job Shop Scheduling Problem (JSSP) even more complex. One of the popular techniques used to solve such combinatorial problems is tabu search. In this technique, recently applied moves are retained in a tabu list so that they are not revisited, in order to avoid returning to solutions that have been obtained previously. In this paper, we propose an algorithm that employs a second tabu list to keep broken machines, in addition to the tabu list that keeps the moves. The period for which the broken machines are kept on the list is categorized using a fuzzy membership function. Our technique is tested on the JSSP benchmark data available in the OR-Library. From the experiments, we find that our algorithm is promising in helping a decision maker cope with machine breakdowns.
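A minimal sketch of the two-list idea in Python (our illustration; the Move record, the linear stand-in for the fuzzy membership function and all parameter values are assumptions, not the paper's definitions):

from collections import deque, namedtuple

Move = namedtuple("Move", "job machine position")   # hypothetical move record

move_tabu = deque(maxlen=10)   # first tabu list: recently applied moves
machine_tabu = {}              # second tabu list: broken machine -> iterations left

def register_breakdown(machine, severity):
    """Keep a broken machine tabu for a severity-dependent number of iterations.
    The linear mapping below stands in for the fuzzy membership categories."""
    machine_tabu[machine] = 5 + round(20 * max(0.0, min(1.0, severity)))

def admissible(move):
    """A move is admissible if it is not a recent move and does not use a
    machine that is currently tabu because of a breakdown."""
    return move not in move_tabu and machine_tabu.get(move.machine, 0) == 0

def end_of_iteration():
    """Age out machine-tabu entries at the end of each tabu-search iteration."""
    for m in list(machine_tabu):
        machine_tabu[m] -= 1
        if machine_tabu[m] <= 0:
            del machine_tabu[m]

register_breakdown(machine=2, severity=0.7)
print(admissible(Move(job=1, machine=2, position=0)))   # False while machine 2 is down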
Optimal placement of tuning masses on truss structures by genetic algorithms
NASA Technical Reports Server (NTRS)
Ponslet, Eric; Haftka, Raphael T.; Cudney, Harley H.
1993-01-01
Optimal placement of tuning masses, actuators and other peripherals on large space structures is a combinatorial optimization problem. This paper surveys several techniques for solving this problem. The genetic algorithm approach to the solution of the placement problem is described in detail. An example of minimizing the difference between the two lowest frequencies of a laboratory truss by adding tuning masses is used for demonstrating some of the advantages of genetic algorithms. The relative efficiencies of different codings are compared using the results of a large number of optimization runs.
Performance evaluation of coherent Ising machines against classical neural networks
NASA Astrophysics Data System (ADS)
Haribara, Yoshitaka; Ishikawa, Hitoshi; Utsunomiya, Shoko; Aihara, Kazuyuki; Yamamoto, Yoshihisa
2017-12-01
The coherent Ising machine is expected to find near-optimal solutions to various combinatorial optimization problems, which has been experimentally confirmed with optical parametric oscillators and a field-programmable gate array circuit. Similar mathematical models were proposed three decades ago by Hopfield et al. in the context of classical neural networks. In this article, we compare the computational performance of both models.
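Both machines search for low-energy states of essentially the same objective, the Ising energy; in standard notation, with spins σ_i = ±1 and couplings J_ij and fields h_i encoding the combinatorial problem,

\[
H(\sigma) \;=\; -\sum_{i<j} J_{ij}\,\sigma_i\sigma_j \;-\; \sum_i h_i\,\sigma_i ,
\]

so that a ground state of H corresponds to an optimal assignment of the encoded problem.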
Lessel, Uta; Wellenzohn, Bernd; Fischer, J Robert; Rarey, Matthias
2012-02-27
A case study is presented illustrating the design of a focused CDK2 library. The scaffold of the library was detected by a feature trees search in a fragment space based on reactions from combinatorial chemistry. For the design the software LoFT (Library optimizer using Feature Trees) was used. The special feature called FTMatch was applied to restrict the parts of the queries where the reagents are permitted to match. This way a 3D scoring function could be simulated. Results were compared with alternative designs by GOLD docking and ROCS 3D alignments.
Decorated Heegaard Diagrams and Combinatorial Heegaard Floer Homology
NASA Astrophysics Data System (ADS)
Hammarsten, Carl
Heegaard Floer homology is a collection of invariants for closed oriented three-manifolds, introduced by Ozsvath and Szabo in 2001. The simplest version is defined as the homology of a chain complex coming from a Heegaard diagram of the three-manifold. In the original definition, the differentials count the number of points in certain moduli spaces of holomorphic disks, which are hard to compute in general. More recently, Sarkar and Wang (2006) and Ozsvath, Stipsicz and Szabo (2009) have determined combinatorial methods for computing this homology with Z2 coefficients. Both methods rely on the construction of very specific Heegaard diagrams for the manifold, which are generally very complicated. Given a decorated Heegaard diagram H for a closed oriented 3-manifold Y, that is, a Heegaard diagram together with a collection of embedded paths satisfying certain criteria, we describe a combinatorial recipe for a chain complex CF'-hat(H). If H satisfies some technical constraints we show that this chain complex is homotopy equivalent to the Heegaard Floer chain complex CF-hat(H) and hence has the Heegaard Floer homology HF-hat(Y) as its homology groups. Using branched spines we give an algorithm to construct a decorated Heegaard diagram satisfying the necessary technical constraints for every closed oriented 3-manifold Y. We present this diagram graphically in the form of a strip diagram.
NASA Astrophysics Data System (ADS)
Yan, Zongkai; Zhang, Xiaokun; Li, Guang; Cui, Yuxing; Jiang, Zhaolian; Liu, Wen; Peng, Zhi; Xiang, Yong
2018-01-01
Conventional wet-process methods for designing and preparing thin films remain challenging because they are time-consuming and inefficient, which hinders the development of novel materials. Herein, we present a high-throughput combinatorial technique for continuous thin-film preparation based on chemical bath deposition (CBD). The method is well suited to preparing high-throughput combinatorial material libraries of materials with low decomposition temperatures and high water or oxygen sensitivity at relatively high temperature. To test this system, a Cu(In, Ga)Se (CIGS) thin-film library doped with 0-19.04 at.% antimony (Sb) was taken as an example to systematically evaluate the effect of varying the Sb doping concentration on the grain growth, structure, morphology and electrical properties of CIGS thin films. Combined with Energy Dispersive Spectrometry (EDS), X-ray Photoelectron Spectroscopy (XPS), automated X-ray Diffraction (XRD) for rapid screening, and Localized Electrochemical Impedance Spectroscopy (LEIS), it was confirmed that this combinatorial high-throughput system can be used to systematically identify the composition with the optimal grain orientation growth, microstructure and electrical properties, by accurately monitoring the doping content and material composition. Based on the characterization results, an Sb2Se3 quasi-liquid-phase-promoted CIGS film-growth model is put forward. In addition to the CIGS thin films reported here, combinatorial CBD could also be applied to the high-throughput screening of other sulfide thin-film material systems.
An Introduction to Simulated Annealing
ERIC Educational Resources Information Center
Albright, Brian
2007-01-01
An attempt to model the physical process of annealing led to the development of a type of combinatorial optimization algorithm that takes on the problem of getting trapped in a local minimum. The author presents a Microsoft Excel spreadsheet that illustrates how this works.
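The core of the method is a single acceptance rule; here is a minimal Python version of the loop the spreadsheet implements (an illustration of the idea only, with an arbitrary toy objective; the article itself works in Microsoft Excel):

import math
import random

def simulated_annealing(initial, neighbour, cost, t0=1.0, cooling=0.995,
                        steps=10000, seed=0):
    """Minimal simulated annealing: always accept improvements, accept uphill
    moves with probability exp(-delta/T), and cool T geometrically."""
    rng = random.Random(seed)
    x, fx = initial, cost(initial)
    best, fbest = x, fx
    T = t0
    for _ in range(steps):
        y = neighbour(x, rng)
        fy = cost(y)
        if fy <= fx or rng.random() < math.exp(-(fy - fx) / T):
            x, fx = y, fy
            if fx < fbest:
                best, fbest = x, fx
        T *= cooling
    return best, fbest

# toy usage: minimize a bumpy 1-D function over the integers 0..100
f = lambda k: (k - 37) ** 2 + 25 * math.sin(k)
step = lambda k, rng: max(0, min(100, k + rng.choice([-3, -1, 1, 3])))
print(simulated_annealing(50, step, f))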
Chemical Compound Design Using Nuclear Charge Distributions
2012-03-01
Finding optimal solutions to design problems in chemistry is hampered by the combinatorially large search space. We develop a general theoretical ... framework for finding chemical compounds with prescribed properties using nuclear charge distributions. The key is the reformulation of the design
Optimization Strategies for Sensor and Actuator Placement
NASA Technical Reports Server (NTRS)
Padula, Sharon L.; Kincaid, Rex K.
1999-01-01
This paper provides a survey of actuator and sensor placement problems from a wide range of engineering disciplines and a variety of applications. Combinatorial optimization methods are recommended as a means for identifying sets of actuators and sensors that maximize performance. Several sample applications from NASA Langley Research Center, such as active structural acoustic control, are covered in detail. Laboratory and flight tests of these applications indicate that actuator and sensor placement methods are effective and important. Lessons learned in solving these optimization problems can guide future research.
Latimer, Luke N; Dueber, John E
2017-06-01
A common challenge in metabolic engineering is rapidly identifying rate-controlling enzymes in heterologous pathways for subsequent production improvement. We demonstrate a workflow to address this challenge and apply it to improving xylose utilization in Saccharomyces cerevisiae. For eight reactions required for conversion of xylose to ethanol, we screened enzymes for functional expression in S. cerevisiae, followed by a combinatorial expression analysis to achieve pathway flux balancing and identification of limiting enzymatic activities. In the next round of strain engineering, we increased the copy number of these limiting enzymes and again tested the eight-enzyme combinatorial expression library in this new background. This workflow yielded a strain that has a ∼70% increase in biomass yield and ∼240% increase in xylose utilization. Finally, we chromosomally integrated the expression library. This library enriched for strains with multiple integrations of the pathway, which likely were the result of tandem integrations mediated by promoter homology. Biotechnol. Bioeng. 2017;114: 1301-1309. © 2017 Wiley Periodicals, Inc.
Ito, Yoichiro; Yamanishi, Mamoru; Ikeuchi, Akinori; Imamura, Chie; Matsuyama, Takashi
2015-01-01
Combinatorial screening used together with a broad library of gene expression cassettes is expected to produce a powerful tool for the optimization of the simultaneous expression of multiple enzymes. Recently, we proposed a highly tunable protein expression system that utilized multiple genome-integrated target genes to fine-tune enzyme expression in yeast cells. This tunable system included a library of expression cassettes each composed of three gene-expression control elements that in different combinations produced a wide range of protein expression levels. In this study, four gene expression cassettes with graded protein expression levels were applied to the expression of three cellulases: cellobiohydrolase 1, cellobiohydrolase 2, and endoglucanase 2. After combinatorial screening for transgenic yeasts simultaneously secreting these three cellulases, we obtained strains with higher cellulase expressions than a strain harboring three cellulase-expression constructs within one high-performance gene expression cassette. These results show that our method will be of broad use throughout the field of metabolic engineering. PMID:26692026
Combinatorial Strategies for the Development of Bulk Metallic Glasses
NASA Astrophysics Data System (ADS)
Ding, Shiyan
The systematic identification of multi-component alloys out of the vast composition space is still a daunting task, especially in the development of bulk metallic glasses that are typically based on three or more elements. In order to address this challenge, combinatorial approaches have been proposed. However, previous attempts have not successfully coupled the synthesis of combinatorial libraries with high-throughput characterization methods. The goal of my dissertation is to develop efficient high-throughput characterization methods, optimized to identify glass formers systematically. Here, two innovative approaches have been developed. One is to measure the nucleation temperature in parallel for up to 800 compositions. The composition with the lowest nucleation temperature shows reasonable agreement with the best-known glass-forming composition. In addition, the thermoplastic formability of a metallic glass forming system is determined through blow molding a compositional library. Our results reveal that the composition with the largest thermoplastic deformation correlates well with the best-known formability composition. I have demonstrated both methods as powerful tools to develop new bulk metallic glasses.
Bioengineering Strategies for Designing Targeted Cancer Therapies
Wen, Xuejun
2014-01-01
The goals of bioengineering strategies for targeted cancer therapies are (1) to deliver a high dose of an anticancer drug directly to a cancer tumor, (2) to enhance drug uptake by malignant cells, and (3) to minimize drug uptake by nonmalignant cells. Effective cancer-targeting therapies will require both passive- and active targeting strategies and a thorough understanding of physiologic barriers to targeted drug delivery. Designing a targeted therapy includes the selection and optimization of a nanoparticle delivery vehicle for passive accumulation in tumors, a targeting moiety for active receptor-mediated uptake, and stimuli-responsive polymers for control of drug release. The future direction of cancer targeting is a combinatorial approach, in which targeting therapies are designed to use multiple targeting strategies. The combinatorial approach will enable combination therapy for delivery of multiple drugs and dual ligand targeting to improve targeting specificity. Targeted cancer treatments in development and the new combinatorial approaches show promise for improving targeted anticancer drug delivery and improving treatment outcomes. PMID:23768509
Yu, Xue; Chen, Wei-Neng; Gu, Tianlong; Zhang, Huaxiang; Yuan, Huaqiang; Kwong, Sam; Zhang, Jun
2018-07-01
This paper studies a specific class of multiobjective combinatorial optimization problems (MOCOPs), namely the permutation-based MOCOPs. Many commonly seen MOCOPs, e.g., the multiobjective traveling salesman problem (MOTSP) and the multiobjective project scheduling problem (MOPSP), belong to this problem class, and they can be very different. However, as the permutation-based MOCOPs share the inherent similarity that the structure of their search space is usually in the shape of a permutation tree, this paper proposes a generic multiobjective set-based particle swarm optimization methodology based on decomposition, termed MS-PSO/D. In order to accommodate the properties of permutation-based MOCOPs, MS-PSO/D utilizes an element-based representation and a constructive approach. Through this, feasible solutions under constraints can be generated step by step following the permutation-tree-shaped structure, and problem-related heuristic information is introduced in the constructive approach for efficiency. In order to address the multiobjective optimization issues, the decomposition strategy is employed, in which the problem is converted into multiple single-objective subproblems according to a set of weight vectors. Besides, a flexible mechanism for diversity control is provided in MS-PSO/D. Extensive experiments have been conducted to study MS-PSO/D on two permutation-based MOCOPs, namely the MOTSP and the MOPSP. Experimental results validate that the proposed methodology is promising.
NASA Astrophysics Data System (ADS)
Kasiviswanathan, Shiva Prasad; Pan, Feng
In the matrix interdiction problem, a real-valued matrix and an integer k are given. The objective is to remove a set of k matrix columns that minimizes, in the residual matrix, the sum of the row values, where the value of a row is defined to be the largest entry in that row. This combinatorial problem is closely related to the bipartite network interdiction problem, which can be applied to minimize the probability that an adversary can successfully smuggle weapons. After introducing the matrix interdiction problem, we study its computational complexity. We show that the matrix interdiction problem is NP-hard and that there exists a constant γ such that it is even NP-hard to approximate this problem within an n^γ additive factor. We also present an algorithm for this problem that achieves an (n - k) multiplicative approximation ratio.
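To make the objective concrete, the small Python sketch below (not from the paper) evaluates the residual-matrix value and solves tiny instances exactly by enumeration; it only illustrates the problem definition, since the paper shows the general problem is NP-hard.

```python
from itertools import combinations

def residual_value(matrix, removed_cols):
    """Sum over rows of the largest entry left after removing the given columns."""
    keep = [j for j in range(len(matrix[0])) if j not in removed_cols]
    return sum(max(row[j] for j in keep) for row in matrix)

def brute_force_interdiction(matrix, k):
    """Exact solution by enumerating all column subsets of size k (requires
    k < number of columns); only viable for tiny instances."""
    n_cols = len(matrix[0])
    best = min(combinations(range(n_cols), k),
               key=lambda cols: residual_value(matrix, set(cols)))
    return set(best), residual_value(matrix, set(best))
```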
Carbon Nanotubes by CVD and Applications
NASA Technical Reports Server (NTRS)
Cassell, Alan; Delzeit, Lance; Nguyen, Cattien; Stevens, Ramsey; Han, Jie; Meyyappan, M.; Arnold, James O. (Technical Monitor)
2001-01-01
Carbon nanotubes (CNTs) exhibit extraordinary mechanical and unique electronic properties and offer significant potential for structural, sensor, and nanoelectronics applications. An overview of CNTs, growth methods, properties and applications is provided. Single-wall and multi-wall CNTs have been grown by chemical vapor deposition. Catalyst development and optimization have been accomplished using combinatorial optimization methods. CNTs have also been grown from the tips of silicon cantilevers for use in atomic force microscopy.
NASA Astrophysics Data System (ADS)
Youl Jung, Kyeong
2010-08-01
Conventional solution-based combinatorial chemistry was combined with spray pyrolysis and applied to optimize the luminescence properties of (Yx, Gdy, Alz)BO3:Eu3+ red phosphor under vacuum ultraviolet (VUV) excitation. For the Y-Gd-Al ternary system, a compositional library was established to seek the optimal composition at which the highest luminescence under VUV (147 nm) excitation could be achieved. The Al content was found to mainly control the relative peak ratio (R/O) of red and orange colors due to the 5D0→7F2 and 5D0→7F1 transitions of Eu3+. The substitution of Gd atoms at Y sites did not change the R/O ratio, but helped to enhance the emission intensity. As a result, the 613 nm emission peak due to the 5D0→7F2 transition of Eu3+ was intensified by increasing the Al/Gd ratio at a fixed Y content, resulting in an improved color coordinate. Finally, the optimized host composition was (Y0.11, Gd0.10, Al0.79)BO3 in terms of the emission intensity at 613 nm and the color coordinate.
Legrand, Yves-Marie; van der Lee, Arie; Barboiu, Mihail
2007-11-12
In this paper we report an extended series of 2,6-(iminoarene)pyridine-type ZnII complexes [(Lii)2Zn]II, which were surveyed for their ability to self-exchange both their ligands and their aromatic arms and to form different homoduplex and heteroduplex complexes in solution. The self-sorting of heteroduplex complexes is likely to be the result of geometric constraints. Whereas the imine-exchange process occurs quantitatively in 1:1 mixtures of [(Lii)2Zn]II complexes, the octahedral coordination process around the metal ion defines spatial-frustrated exchanges that involve the selective formation of heterocomplexes of two, by two different substituents; the bulkiest ones (pyrene in principle) specifically interact with the pseudoterpyridine core, sterically hindering the least bulky ones, which are intermolecularly stacked with similar ligands of neighboring molecules. Such a self-sorting process defined by the specific self-constitution of the ligands exchanging their aromatic substituents is self-optimized by a specific control over their spatial orientation around a metal center within the complex. They ultimately show an improved charge-transfer energy function by virtue of the dynamic amplification of self-optimized heteroduplex architectures. These systems therefore illustrate the convergence of the combinatorial self-sorting of the dynamic combinatorial libraries (DCLs) strategy and the constitutional self-optimized function.
Adham, Manal T; Bentley, Peter J
2016-08-01
This paper proposes and evaluates a solution to the truck redistribution problem prominent in London's Santander Cycle scheme. Due to the complexity of this NP-hard combinatorial optimisation problem, no efficient optimisation techniques are known to solve the problem exactly. This motivates our use of the heuristic Artificial Ecosystem Algorithm (AEA) to find good solutions in a reasonable amount of time. The AEA is designed to take advantage of highly distributed computer architectures and adapt to changing problems. In the AEA a problem is first decomposed into its relative sub-components; they then evolve solution building blocks that fit together to form a single optimal solution. Three variants of the AEA centred on evaluating clustering methods are presented: the baseline AEA, the community-based AEA which groups stations according to journey flows, and the Adaptive AEA which actively modifies clusters to cater for changes in demand. We applied these AEA variants to the redistribution problem prominent in bike share schemes (BSS). The AEA variants are empirically evaluated using historical data from Santander Cycles to validate the proposed approach and prove its potential effectiveness. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
A hybrid quantum-inspired genetic algorithm for multiobjective flow shop scheduling.
Li, Bin-Bin; Wang, Ling
2007-06-01
This paper proposes a hybrid quantum-inspired genetic algorithm (HQGA) for the multiobjective flow shop scheduling problem (FSSP), which is a typical NP-hard combinatorial optimization problem with strong engineering backgrounds. On the one hand, a quantum-inspired GA (QGA) based on Q-bit representation is applied for exploration in the discrete 0-1 hyperspace by using the updating operator of quantum gate and genetic operators of Q-bit. Moreover, random-key representation is used to convert the Q-bit representation to job permutation for evaluating the objective values of the schedule solution. On the other hand, permutation-based GA (PGA) is applied for both performing exploration in permutation-based scheduling space and stressing exploitation for good schedule solutions. To evaluate solutions in multiobjective sense, a randomly weighted linear-sum function is used in QGA, and a nondominated sorting technique including classification of Pareto fronts and fitness assignment is applied in PGA with regard to both proximity and diversity of solutions. To maintain the diversity of the population, two trimming techniques for population are proposed. The proposed HQGA is tested based on some multiobjective FSSPs. Simulation results and comparisons based on several performance metrics demonstrate the effectiveness of the proposed HQGA.
Global gene expression analysis by combinatorial optimization.
Ameur, Adam; Aurell, Erik; Carlsson, Mats; Westholm, Jakub Orzechowski
2004-01-01
Generally, there is a trade-off between methods of gene expression analysis that are precise but labor-intensive, e.g. RT-PCR, and methods that scale up to global coverage but are not quite as quantitative, e.g. microarrays. In the present paper, we show how a known method of gene expression profiling (K. Kato, Nucleic Acids Res. 23, 3685-3690 (1995)), which relies on a fairly small number of steps, can be turned into a global gene expression measurement by advanced data post-processing, with potentially little loss of accuracy. Post-processing here entails solving an ancillary combinatorial optimization problem. Validation is performed on in silico experiments generated from the FANTOM database of full-length mouse cDNA. We present two variants of the method. One uses state-of-the-art commercial software for solving problems of this kind, the other a code developed by us specifically for this purpose, released in the public domain under the GPL license.
Combinatorial Optimization of Heterogeneous Catalysts Used in the Growth of Carbon Nanotubes
NASA Technical Reports Server (NTRS)
Cassell, Alan M.; Verma, Sunita; Delzeit, Lance; Meyyappan, M.; Han, Jie
2000-01-01
Libraries of liquid-phase catalyst precursor solutions were printed onto iridium-coated silicon substrates and evaluated for their effectiveness in catalyzing the growth of multi-walled carbon nanotubes (MWNTs) by chemical vapor deposition (CVD). The catalyst precursor solutions were composed of inorganic salts and a removable tri-block copolymer (EO)20(PO)70(EO)20 (EO = ethylene oxide, PO = propylene oxide) structure-directing agent (SDA), dissolved in ethanol/methanol mixtures. Sample libraries were quickly assayed using scanning electron microscopy after CVD growth to identify active catalysts and CVD conditions. Composition libraries and focus libraries were then constructed around the active spots identified in the discovery libraries to understand how catalyst precursor composition affects the yield, density, and quality of the nanotubes. Successful implementation of combinatorial optimization methods in the development of highly active, carbon nanotube catalysts is demonstrated, as well as the identification of catalyst formulations that lead to varying densities and shapes of aligned nanotube towers.
Kasperkiewicz, Paulina; Poreba, Marcin; Snipas, Scott J.; Parker, Heather; Winterbourn, Christine C.; Salvesen, Guy S.; Drag, Marcin
2014-01-01
The exploration of protease substrate specificity is generally restricted to naturally occurring amino acids, limiting the degree of conformational space that can be surveyed. We substantially enhanced this by incorporating 102 unnatural amino acids to explore the S1–S4 pockets of human neutrophil elastase. This approach provides hybrid natural and unnatural amino acid sequences, and thus we termed it the Hybrid Combinatorial Substrate Library. Library results were validated by the synthesis of individual tetrapeptide substrates, with the optimal substrate demonstrating more than three orders of magnitude higher catalytic efficiency than commonly used substrates of elastase. This optimal substrate was converted to an activity-based probe that demonstrated high selectivity and revealed the specific presence of active elastase during the process of neutrophil extracellular trap formation. We propose that this approach can be successfully used for any type of endopeptidase to deliver high activity and selectivity in substrates and probes. PMID:24550277
Optimizing Sensor and Actuator Arrays for ASAC Noise Control
NASA Technical Reports Server (NTRS)
Palumbo, Dan; Cabell, Ran
2000-01-01
This paper summarizes the development of an approach to optimizing the locations for arrays of sensors and actuators in active noise control systems. A type of directed combinatorial search, called Tabu Search, is used to select an optimal configuration from a much larger set of candidate locations. The benefit of using an optimized set is demonstrated. The importance of limiting actuator forces to realistic levels when evaluating the cost function is discussed. Results of flight testing an optimized system are presented. Although the technique has been applied primarily to Active Structural Acoustic Control systems, it can be adapted for use in other active noise control implementations.
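As a hedged illustration of this kind of directed combinatorial search (a generic sketch, not the specific implementation used in the paper), the Python code below applies a simple tabu search with an aspiration rule to choose m locations from a candidate list; `cost` is a placeholder for whatever noise-control performance metric is being minimized.

```python
import random

def tabu_search_placement(candidates, m, cost, iters=200, tenure=10):
    """Choose m locations from the `candidates` list so as to minimize
    cost(selected_set).  A move swaps one selected location for an unselected
    one; a location that leaves the set is tabu for `tenure` iterations unless
    the swap improves on the best cost found so far (aspiration)."""
    current = set(random.sample(candidates, m))
    best, best_cost = set(current), cost(current)
    tabu_until = {}  # location -> iteration index until which re-entry is tabu
    for it in range(iters):
        best_move, best_move_cost = None, float("inf")
        for out in current:
            for into in candidates:
                if into in current:
                    continue
                trial = (current - {out}) | {into}
                c = cost(trial)
                if tabu_until.get(into, -1) > it and c >= best_cost:
                    continue  # tabu move that does not beat the best so far
                if c < best_move_cost:
                    best_move, best_move_cost = (out, into), c
        if best_move is None:
            break
        out, into = best_move
        current = (current - {out}) | {into}
        tabu_until[out] = it + tenure
        if best_move_cost < best_cost:
            best, best_cost = set(current), best_move_cost
    return best, best_cost
```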
A Simple Combinatorial Codon Mutagenesis Method for Targeted Protein Engineering.
Belsare, Ketaki D; Andorfer, Mary C; Cardenas, Frida S; Chael, Julia R; Park, Hyun June; Lewis, Jared C
2017-03-17
Directed evolution is a powerful tool for optimizing enzymes, and mutagenesis methods that improve enzyme library quality can significantly expedite the evolution process. Here, we report a simple method for targeted combinatorial codon mutagenesis (CCM). To demonstrate the utility of this method for protein engineering, CCM libraries were constructed for cytochrome P450 BM3, Pfu prolyl oligopeptidase, and the flavin-dependent halogenase RebH; 10-26 sites were targeted for codon mutagenesis in each of these enzymes, and libraries with a tunable average of 1-7 codon mutations per gene were generated. Each of these libraries provided improved enzymes for their respective transformations, which highlights the generality, simplicity, and tunability of CCM for targeted protein engineering.
Combinatorial Algorithms for Portfolio Optimization Problems - Case of Risk Moderate Investor
NASA Astrophysics Data System (ADS)
Juarna, A.
2017-03-01
Portfolio optimization is the problem of finding an optimal combination of n stocks from N ≥ n available stocks that gives maximal aggregate return and minimal aggregate risk. In this paper, N = 43 stocks are taken from the IDX (Indonesia Stock Exchange) group of the 45 most-traded stocks, known as the LQ45, with p = 24 monthly returns for each stock spanning the interval 2013-2014. The problem is treated as a combinatorial one, and the algorithm is constructed based on two considerations: a risk-moderate type of investor and a maximum allowed correlation coefficient between every two eligible stocks. The main output of the algorithm is a set of curves of three portfolio attributes, namely the size, the ratio of return to risk, and the percentage of negative correlation coefficients between every two chosen stocks, as a function of the maximum allowed correlation coefficient between each two stocks. The output curves show that the portfolio contains three stocks with a return-to-risk ratio of 14.57 if the maximum allowed correlation coefficient between every two eligible stocks is negative, and contains 19 stocks with a maximum allowed correlation coefficient of 0.17 to reach the maximum return-to-risk ratio of 25.48.
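The abstract does not give the exact construction, so the hedged Python sketch below shows one simple greedy interpretation: rank stocks by return-to-risk and admit a stock only while all pairwise correlations with already-chosen stocks stay below the allowed threshold. All names are illustrative, not from the paper.

```python
import numpy as np

def build_portfolio(returns, max_corr):
    """Greedy sketch: rank stocks by return-to-risk (mean / std of monthly
    returns) and add a stock only if its correlation with every stock already
    chosen does not exceed max_corr.  `returns` is a (periods x stocks) array."""
    mu = returns.mean(axis=0)
    sigma = returns.std(axis=0, ddof=1)
    ratio = mu / sigma
    corr = np.corrcoef(returns, rowvar=False)
    chosen = []
    for i in np.argsort(-ratio):          # best return-to-risk first
        if all(corr[i, j] <= max_corr for j in chosen):
            chosen.append(int(i))
    port_series = returns[:, chosen].mean(axis=1)   # equally weighted portfolio
    return chosen, port_series.mean() / port_series.std(ddof=1)
```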
Solving multi-objective optimization problems in conservation with the reference point method
Dujardin, Yann; Chadès, Iadine
2018-01-01
Managing the biodiversity extinction crisis requires wise decision-making processes able to account for the limited resources available. In most decision problems in conservation biology, several conflicting objectives have to be taken into account. Most methods used in conservation either provide suboptimal solutions or use strong assumptions about the decision-maker’s preferences. Our paper reviews some of the existing approaches to solve multi-objective decision problems and presents new multi-objective linear programming formulations of two multi-objective optimization problems in conservation, allowing the use of a reference point approach. Reference point approaches solve multi-objective optimization problems by interactively representing the preferences of the decision-maker with a point in the criteria (objectives) space, called the reference point. We modelled and solved the following two problems in conservation: a dynamic multi-species management problem under uncertainty and a spatial allocation resource management problem. Results show that the reference point method outperforms classic methods while illustrating the use of an interactive methodology for solving combinatorial problems with multiple objectives. The method is general and can be adapted to a wide range of ecological combinatorial problems. PMID:29293650
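For readers unfamiliar with reference point approaches, the sketch below shows the standard augmented Chebyshev achievement scalarizing function evaluated over a finite candidate set; it is a generic illustration under the assumption that all objectives are minimized, not the paper's linear programming formulation.

```python
def achievement(objectives, reference, weights, rho=1e-4):
    """Augmented Chebyshev achievement scalarizing function (to be minimized),
    assuming every objective is to be minimized:
        max_i w_i*(f_i - r_i) + rho * sum_i w_i*(f_i - r_i)."""
    terms = [w * (f - r) for f, r, w in zip(objectives, reference, weights)]
    return max(terms) + rho * sum(terms)

def best_for_reference(candidates, reference, weights):
    """Among a finite set of candidate solutions (tuples of objective values),
    return the one closest to the decision-maker's reference point."""
    return min(candidates, key=lambda f: achievement(f, reference, weights))
```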
Synthesizing optimal waste blends
DOE Office of Scientific and Technical Information (OSTI.GOV)
Narayan, V.; Diwekar, W.M.; Hoza, M.
Vitrification of tank wastes to form glass is a technique that will be used for the disposal of high-level waste at Hanford. Process and storage economics show that minimizing the total number of glass logs produced is the key to keeping cost as low as possible. The amount of glass produced can be reduced by blending of the wastes. The optimal way to combine the tanks to minimize the volume of glass can be determined from a discrete blend calculation. However, this problem results in a combinatorial explosion as the number of tanks increases. Moreover, the property constraints make this problem highly nonconvex, where many algorithms get trapped in local minima. In this paper the authors examine the use of different combinatorial optimization approaches to solve this problem. A two-stage approach using a combination of simulated annealing and nonlinear programming (NLP) is developed. The results of different methods, such as the heuristics approach based on human knowledge and judgment, the mixed integer nonlinear programming (MINLP) approach with GAMS, and branch and bound with a lower bound derived from the structure of the given blending problem, are compared with this coupled simulated annealing and NLP approach.
Optimization of Coil Element Configurations for a Matrix Gradient Coil.
Kroboth, Stefan; Layton, Kelvin J; Jia, Feng; Littin, Sebastian; Yu, Huijun; Hennig, Jurgen; Zaitsev, Maxim
2018-01-01
Recently, matrix gradient coils (also termed multi-coils or multi-coil arrays) were introduced for imaging and B0 shimming with 24, 48, and even 84 coil elements. However, in imaging applications, providing one amplifier per coil element is not always feasible due to high cost and technical complexity. In this simulation study, we show that an 84-channel matrix gradient coil (head insert for brain imaging) is able to create a wide variety of field shapes even if the number of amplifiers is reduced. An optimization algorithm was implemented that obtains groups of coil elements, such that a desired target field can be created by driving each group with an amplifier. This limits the number of amplifiers to the number of coil element groups. Simulated annealing is used due to the NP-hard combinatorial nature of the given problem. Spherical harmonics up to the full third order within a sphere of 20-cm diameter in the center of the coil were investigated as target fields. We show that the median normalized least squares error for all target fields is below approximately 5% for 12 or more amplifiers. At the same time, the dissipated power stays within reasonable limits. With a relatively small set of amplifiers, switches can be used to sequentially generate spherical harmonics up to third order. The costs associated with a matrix gradient coil can be lowered, which increases the practical utility of matrix gradient coils.
Hydrogel design of experiments methodology to optimize hydrogel for iPSC-NPC culture.
Lam, Jonathan; Carmichael, S Thomas; Lowry, William E; Segura, Tatiana
2015-03-11
Bioactive signals can be incorporated in hydrogels to direct encapsulated cell behavior. Design of experiments methodology systematically varies the signals to determine the individual and combinatorial effects of each factor on cell activity. Using this approach enables the optimization of the concentrations of three ligands (RGD, YIGSR, IKVAV) for the survival and differentiation of neural progenitor cells. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
2012-01-01
Scenario-based training exemplifies the learning-by-doing approach to human performance improvement. In this paper, we enumerate ... through a narrative, mission, quest, or scenario. In this paper we argue for a combinatorial optimization search approach to selecting and ordering ... the role of an expert for the purposes of practicing skills and knowledge in realistic situations in a learning-by-doing approach to performance
Investigation and Implementation of Matrix Permanent Algorithms for Identity Resolution
2014-12-01
calculation of the permanent of a matrix whose dimension is a function of target count [21]. However, the optimal approach for computing the permanent is ... presently unclear. The primary objective of this project was to determine the optimal computing strategy(-ies) for the matrix permanent in tactical and ... solving various combinatorial problems (see [16] for details and applications to a wide variety of problems) and thus can be applied to compute a
Optimal placement of excitations and sensors for verification of large dynamical systems
NASA Technical Reports Server (NTRS)
Salama, M.; Rose, T.; Garba, J.
1987-01-01
The computationally difficult problem of the optimal placement of excitations and sensors to maximize the observed measurements is studied within the framework of combinatorial optimization, and is solved numerically using a variation of the simulated annealing heuristic algorithm. Results of numerical experiments, including a square plate and a 960 degrees-of-freedom Control of Flexible Structure (COFS) truss structure, are presented. Though the algorithm produces suboptimal solutions, its generality and simplicity allow the treatment of complex dynamical systems which would otherwise be difficult to handle.
An effective PSO-based memetic algorithm for flow shop scheduling.
Liu, Bo; Wang, Ling; Jin, Yi-Hui
2007-02-01
This paper proposes an effective particle swarm optimization (PSO)-based memetic algorithm (MA) for the permutation flow shop scheduling problem (PFSSP) with the objective to minimize the maximum completion time, which is a typical non-deterministic polynomial-time (NP) hard combinatorial optimization problem. In the proposed PSO-based MA (PSOMA), both PSO-based searching operators and some special local searching operators are designed to balance the exploration and exploitation abilities. In particular, the PSOMA applies the evolutionary searching mechanism of PSO, which is characterized by individual improvement, population cooperation, and competition to effectively perform exploration. On the other hand, the PSOMA utilizes several adaptive local searches to perform exploitation. First, to make PSO suitable for solving PFSSP, a ranked-order value rule based on random key representation is presented to convert the continuous position values of particles to job permutations. Second, to generate an initial swarm with certain quality and diversity, the famous Nawaz-Enscore-Ham (NEH) heuristic is incorporated into the initialization of population. Third, to balance the exploration and exploitation abilities, after the standard PSO-based searching operation, a new local search technique named NEH_1 insertion is probabilistically applied to some good particles selected by using a roulette wheel mechanism with a specified probability. Fourth, to enrich the searching behaviors and to avoid premature convergence, a simulated annealing (SA)-based local search with multiple different neighborhoods is designed and incorporated into the PSOMA. Meanwhile, an effective adaptive meta-Lamarckian learning strategy is employed to decide which neighborhood to be used in SA-based local search. Finally, to further enhance the exploitation ability, a pairwise-based local search is applied after the SA-based search. Simulation results based on benchmarks demonstrate the effectiveness of the PSOMA. Additionally, the effects of some parameters on optimization performances are also discussed.
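As a concrete illustration of the random-key / ranked-order-value idea described above (a generic sketch, not the paper's exact rule), a particle's continuous position vector can be decoded into a job permutation by sorting:

```python
import numpy as np

def ranked_order_value(position):
    """Decode a continuous particle position into a job permutation: the job
    with the smallest position value is scheduled first, and so on (a
    random-key / ranked-order-value decoding)."""
    return list(np.argsort(position))

# Example: position [0.8, 0.1, 0.5, 0.3] decodes to the permutation [1, 3, 2, 0].
```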
Automated Lead Optimization of MMP-12 Inhibitors Using a Genetic Algorithm.
Pickett, Stephen D; Green, Darren V S; Hunt, David L; Pardoe, David A; Hughes, Ian
2011-01-13
Traditional lead optimization projects involve long synthesis and testing cycles, favoring extensive structure-activity relationship (SAR) analysis and molecular design steps, in an attempt to limit the number of cycles that a project must run to optimize a development candidate. Microfluidic-based chemistry and biology platforms, with cycle times of minutes rather than weeks, lend themselves to unattended autonomous operation. The bottleneck in the lead optimization process is therefore shifted from synthesis or test to SAR analysis and design. As such, the way is open to an algorithm-directed process, without the need for detailed user data analysis. Here, we present results of two synthesis and screening experiments, undertaken using traditional methodology, to validate a genetic algorithm optimization process for future application to a microfluidic system. The algorithm has several novel features that are important for the intended application. For example, it is robust to missing data and can suggest compounds for retest to ensure reliability of optimization. The algorithm is first validated on a retrospective analysis of an in-house library embedded in a larger virtual array of presumed inactive compounds. In a second, prospective experiment with MMP-12 as the target protein, 140 compounds are submitted for synthesis over 10 cycles of optimization. Comparison is made to the results from the full combinatorial library that was synthesized manually and tested independently. The results show that compounds selected by the algorithm are heavily biased toward the more active regions of the library, while the algorithm is robust to both missing data (compounds where synthesis failed) and inactive compounds. This publication places the full combinatorial library and biological data into the public domain with the intention of advancing research into algorithm-directed lead optimization methods.
Automated Lead Optimization of MMP-12 Inhibitors Using a Genetic Algorithm
2010-01-01
Traditional lead optimization projects involve long synthesis and testing cycles, favoring extensive structure−activity relationship (SAR) analysis and molecular design steps, in an attempt to limit the number of cycles that a project must run to optimize a development candidate. Microfluidic-based chemistry and biology platforms, with cycle times of minutes rather than weeks, lend themselves to unattended autonomous operation. The bottleneck in the lead optimization process is therefore shifted from synthesis or test to SAR analysis and design. As such, the way is open to an algorithm-directed process, without the need for detailed user data analysis. Here, we present results of two synthesis and screening experiments, undertaken using traditional methodology, to validate a genetic algorithm optimization process for future application to a microfluidic system. The algorithm has several novel features that are important for the intended application. For example, it is robust to missing data and can suggest compounds for retest to ensure reliability of optimization. The algorithm is first validated on a retrospective analysis of an in-house library embedded in a larger virtual array of presumed inactive compounds. In a second, prospective experiment with MMP-12 as the target protein, 140 compounds are submitted for synthesis over 10 cycles of optimization. Comparison is made to the results from the full combinatorial library that was synthesized manually and tested independently. The results show that compounds selected by the algorithm are heavily biased toward the more active regions of the library, while the algorithm is robust to both missing data (compounds where synthesis failed) and inactive compounds. This publication places the full combinatorial library and biological data into the public domain with the intention of advancing research into algorithm-directed lead optimization methods. PMID:24900251
Kell, Douglas B
2012-03-01
A considerable number of areas of bioscience, including gene and drug discovery, metabolic engineering for the biotechnological improvement of organisms, and the processes of natural and directed evolution, are best viewed in terms of a 'landscape' representing a large search space of possible solutions or experiments populated by a considerably smaller number of actual solutions that then emerge. This is what makes these problems 'hard', but as such these are to be seen as combinatorial optimisation problems that are best attacked by heuristic methods known from that field. Such landscapes, which may also represent or include multiple objectives, are effectively modelled in silico, with modern active learning algorithms such as those based on Darwinian evolution providing guidance, using existing knowledge, as to what is the 'best' experiment to do next. An awareness, and the application, of these methods can thereby enhance the scientific discovery process considerably. This analysis fits comfortably with an emerging epistemology that sees scientific reasoning, the search for solutions, and scientific discovery as Bayesian processes. Copyright © 2012 WILEY Periodicals, Inc.
Creating IRT-Based Parallel Test Forms Using the Genetic Algorithm Method
ERIC Educational Resources Information Center
Sun, Koun-Tem; Chen, Yu-Jen; Tsai, Shu-Yen; Cheng, Chien-Fen
2008-01-01
In educational measurement, the construction of parallel test forms is often a combinatorial optimization problem that involves the time-consuming selection of items to construct tests having approximately the same test information functions (TIFs) and constraints. This article proposes a novel method, genetic algorithm (GA), to construct parallel…
AFLOW: An Automatic Framework for High-throughput Materials Discovery
2011-11-14
Computational materials HT applications include combinatorial discovery of superconductors [1] and Pareto-optimal search for alloys and catalysts [14, 15] ...
NASA Astrophysics Data System (ADS)
Ushijima, Timothy T.; Yeh, William W.-G.
2013-10-01
An optimal experimental design algorithm is developed to select locations for a network of observation wells that provide maximum information about unknown groundwater pumping in a confined, anisotropic aquifer. The design uses a maximal information criterion that chooses, among competing designs, the design that maximizes the sum of squared sensitivities while conforming to specified design constraints. The formulated optimization problem is non-convex and contains integer variables necessitating a combinatorial search. Given a realistic large-scale model, the size of the combinatorial search required can make the problem difficult, if not impossible, to solve using traditional mathematical programming techniques. Genetic algorithms (GAs) can be used to perform the global search; however, because a GA requires a large number of calls to a groundwater model, the formulated optimization problem still may be infeasible to solve. As a result, proper orthogonal decomposition (POD) is applied to the groundwater model to reduce its dimensionality. Then, the information matrix in the full model space can be searched without solving the full model. Results from a small-scale test case show identical optimal solutions among the GA, integer programming, and exhaustive search methods. This demonstrates the GA's ability to determine the optimal solution. In addition, the results show that a GA with POD model reduction is several orders of magnitude faster in finding the optimal solution than a GA using the full model. The proposed experimental design algorithm is applied to a realistic, two-dimensional, large-scale groundwater problem. The GA converged to a solution for this large-scale problem.
Combinatorial Pharmacophore-Based 3D-QSAR Analysis and Virtual Screening of FGFR1 Inhibitors
Zhou, Nannan; Xu, Yuan; Liu, Xian; Wang, Yulan; Peng, Jianlong; Luo, Xiaomin; Zheng, Mingyue; Chen, Kaixian; Jiang, Hualiang
2015-01-01
The fibroblast growth factor/fibroblast growth factor receptor (FGF/FGFR) signaling pathway plays crucial roles in cell proliferation, angiogenesis, migration, and survival. Aberration in FGFRs correlates with several malignancies and disorders. FGFRs have proved to be attractive targets for therapeutic intervention in cancer, and it is of high interest to find FGFR inhibitors with novel scaffolds. In this study, a combinatorial three-dimensional quantitative structure-activity relationship (3D-QSAR) model was developed based on previously reported FGFR1 inhibitors with diverse structural skeletons. This model was evaluated for its prediction performance on a diverse test set containing 232 FGFR inhibitors, and it yielded an SD value of 0.75 pIC50 units from measured inhibition affinities and a Pearson's correlation coefficient R^2 of 0.53. This result suggests that the combinatorial 3D-QSAR model could be used to search for new FGFR1 hit structures and predict their potential activity. To further evaluate the performance of the model, a decoy set validation was used to measure the efficiency of the model by calculating the EF (enrichment factor). Based on the combinatorial pharmacophore model, a virtual screening against the SPECS database was performed. Nineteen novel active compounds were successfully identified, which provide new chemical starting points for further structural optimization of FGFR1 inhibitors. PMID:26110383
Napolitano, Roberta; Soesbe, Todd C; De León-Rodríguez, Luis M; Sherry, A Dean; Udugamasooriya, D Gomika
2011-08-24
The sensitivity of magnetic resonance imaging (MRI) contrast agents is highly dependent on the rate of water exchange between the inner sphere of a paramagnetic ion and bulk water. Normally, identifying a paramagnetic complex that has optimal water exchange kinetics is done by synthesizing and testing one compound at a time. We report here a rapid, economical on-bead combinatorial synthesis of a library of imaging agents. Eighty different 1,4,7,10-tetraazacyclododecan-1,4,7,10-tetraacetic acid (DOTA)-tetraamide peptoid derivatives were prepared on beads using a variety of charged, uncharged but polar, hydrophobic, and variably sized primary amines. A single chemical exchange saturation transfer image of the on-bead library easily distinguished those compounds having the most favorable water exchange kinetics. This combinatorial approach will allow rapid screening of libraries of imaging agents to identify the chemical characteristics of a ligand that yield the most sensitive imaging agents. This technique could be automated and readily adapted to other types of MRI or magnetic resonance/positron emission tomography agents as well.
Kajiwara, Shota; Yamada, Ryosuke; Ogino, Hiroyasu
2018-04-10
Simple and cost-effective lipase expression host microorganisms are highly desirable. A combinatorial library strategy is used to improve the secretory expression of lipase from Bacillus thermocatenulatus (BTL2) in the culture supernatant of Saccharomyces cerevisiae. A plasmid library including expression cassettes composed of sequences encoding one of each of 15 promoters, 15 secretion signals, and 15 terminators derived from the yeast species S. cerevisiae, Pichia pastoris, and Hansenula polymorpha is constructed. The S. cerevisiae transformant YPH499/D4, comprising the H. polymorpha GAP promoter, S. cerevisiae SAG1 secretion signal, and P. pastoris AOX1 terminator, is selected by high-throughput screening. This transformant expresses BTL2 extracellularly at a level 130-fold higher than the control strain, comprising the S. cerevisiae PGK1 promoter, S. cerevisiae α-factor secretion signal, and S. cerevisiae PGK1 terminator, after cultivation for 72 h. This combinatorial library strategy holds promising potential for application in the optimization of the secretory expression of proteins in yeast. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Jézéquel, Laetitia; Loeper, Jacqueline; Pompon, Denis
2008-11-01
Combinatorial libraries coding for mosaic enzymes with predefined crossover points constitute useful tools to address and model structure-function relationships and for functional optimization of enzymes based on multivariate statistics. The presented method, called sequence-independent generation of a chimera-ordered library (SIGNAL), allows easy shuffling of any predefined amino acid segment between two or more proteins. This method is particularly well adapted to the exchange of protein structural modules. The procedure could also be well suited to generate ordered combinatorial libraries independent of sequence similarities in a robotized manner. Sequence segments to be recombined are first extracted by PCR from a single-stranded template coding for an enzyme of interest using a biotin-avidin-based method. This technique allows the reduction of parental template contamination in the final library. Specific PCR primers allow amplification of two complementary mosaic DNA fragments, overlapping in the region to be exchanged. Fragments are finally reassembled using a fusion PCR. The process is illustrated via the construction of a set of mosaic CYP2B enzymes using this highly modular approach.
Ant algorithms for discrete optimization.
Dorigo, M; Di Caro, G; Gambardella, L M
1999-01-01
This article presents an overview of recent work on ant algorithms, that is, algorithms for discrete optimization that took inspiration from the observation of ant colonies' foraging behavior, and introduces the ant colony optimization (ACO) metaheuristic. In the first part of the article the basic biological findings on real ants are reviewed and their artificial counterparts as well as the ACO metaheuristic are defined. In the second part of the article a number of applications of ACO algorithms to combinatorial optimization and routing in communications networks are described. We conclude with a discussion of related work and of some of the most important aspects of the ACO metaheuristic.
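To make the metaheuristic concrete, the sketch below is a minimal ACO implementation for the symmetric TSP (a textbook-style illustration, not any specific algorithm from the article): ants build tours using pheromone and inverse-distance information, pheromone evaporates, and good tours are reinforced.

```python
import random

def aco_tsp(dist, n_ants=20, n_iter=100, alpha=1.0, beta=2.0, rho=0.5, q=1.0):
    """Minimal ant colony optimization for the TSP.  `dist` is a symmetric
    matrix of positive city-to-city distances."""
    n = len(dist)
    tau = [[1.0] * n for _ in range(n)]          # pheromone levels
    best_tour, best_len = None, float("inf")
    for _ in range(n_iter):
        tours = []
        for _ in range(n_ants):
            tour = [random.randrange(n)]
            while len(tour) < n:
                i = tour[-1]
                choices = [j for j in range(n) if j not in tour]
                weights = [(tau[i][j] ** alpha) * ((1.0 / dist[i][j]) ** beta)
                           for j in choices]
                tour.append(random.choices(choices, weights=weights)[0])
            length = sum(dist[tour[k]][tour[(k + 1) % n]] for k in range(n))
            tours.append((tour, length))
            if length < best_len:
                best_tour, best_len = tour, length
        for i in range(n):                        # evaporation
            for j in range(n):
                tau[i][j] *= (1.0 - rho)
        for tour, length in tours:                # reinforcement
            for k in range(n):
                a, b = tour[k], tour[(k + 1) % n]
                tau[a][b] += q / length
                tau[b][a] += q / length
    return best_tour, best_len
```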
NASA Astrophysics Data System (ADS)
Vallet, B.; Soheilian, B.; Brédif, M.
2014-08-01
The 3D reconstruction of similar 3D objects detected in 2D faces a major issue when it comes to grouping the 2D detections into clusters to be used to reconstruct the individual 3D objects. Simple clustering heuristics fail as soon as similar objects are close. This paper formulates a framework to use the geometric quality of the reconstruction as a hint to do a proper clustering. We present a methodology to solve the resulting combinatorial optimization problem with some simplifications and approximations in order to make it tractable. The proposed method is applied to the reconstruction of 3D traffic signs from their 2D detections to demonstrate its capacity to solve ambiguities.
Sahib, Mouayad A.; Gambardella, Luca M.; Afzal, Wasif; Zamli, Kamal Z.
2016-01-01
Combinatorial test design is a test planning technique that aims to systematically reduce the number of test cases by choosing a subset of the test cases based on combinations of input variables. The subset covers all possible combinations of a given strength and hence tries to match the effectiveness of the exhaustive set. This mechanism of reduction has been used successfully in software testing research as t-way testing (where t indicates the interaction strength of combinations). Potentially, other systems may exhibit many similarities with this approach; hence, it could form an emerging application in different areas of research due to its usefulness. To this end, it has recently been applied in a few research areas successfully. In this paper, we explore the applicability of the combinatorial test design technique to the parameter design of a fractional-order proportional-integral-derivative (FOPID) controller for an automatic voltage regulator (AVR) system. Throughout the paper, we justify this new application theoretically and practically through simulations. In addition, we report on first experiments indicating its practical use in this field. We design different algorithms and adapt other strategies to cover all the combinations with an optimum and effective test set. Our findings indicate that combinatorial test design can find the combinations that lead to an optimum design. Besides this, we also found that by increasing the strength of combination, we can approach the optimum design, in the sense that with only a 4-way combinatorial set we can obtain the effectiveness of an exhaustive test set. This significantly reduces the number of tests needed and thus leads to an approach that optimizes the design of parameters quickly. PMID:27829025
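To illustrate the t-way idea in its simplest (2-way) form, the hedged Python sketch below greedily builds a small set of tests covering every pairwise combination of parameter levels; it is a generic covering-array sketch, not one of the authors' specific strategies, and `levels` would hold discretized parameter values (e.g., candidate FOPID gains).

```python
from itertools import combinations, product

def uncovered_pairs(tests, levels):
    """All (parameter, value) pairs of strength 2 not yet covered by `tests`.
    `levels` is a list of value lists, one per parameter."""
    needed = {((p1, v1), (p2, v2))
              for p1, p2 in combinations(range(len(levels)), 2)
              for v1 in levels[p1] for v2 in levels[p2]}
    for t in tests:
        for p1, p2 in combinations(range(len(levels)), 2):
            needed.discard(((p1, t[p1]), (p2, t[p2])))
    return needed

def greedy_pairwise(levels):
    """Greedily pick tests from the exhaustive set until every 2-way
    combination of parameter values is covered (a small covering array).
    Only practical when the full Cartesian product is enumerable."""
    tests, remaining = [], uncovered_pairs([], levels)
    candidates = list(product(*levels))
    while remaining:
        best = max(candidates, key=lambda t: sum(
            ((p1, t[p1]), (p2, t[p2])) in remaining
            for p1, p2 in combinations(range(len(levels)), 2)))
        tests.append(best)
        remaining = uncovered_pairs(tests, levels)
    return tests
```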
Luo, Li; Luo, Le; Zhang, Xinli; He, Xiaoli
2017-07-10
Accurate forecasting of hospital outpatient visits is beneficial for the reasonable planning and allocation of healthcare resources to meet medical demand. Given the multiple attributes of daily outpatient visits, such as randomness, cyclicity and trend, time series methods such as ARIMA can be a good choice for outpatient visit forecasting. On the other hand, hospital outpatient visits are also affected by doctors' scheduling, and these effects are not purely random. Taking this non-random component into account, this paper presents a new forecasting model that considers both cyclicity and the day-of-the-week effect. We formulate a seasonal ARIMA (SARIMA) model on the daily time series and a single exponential smoothing (SES) model on the day-of-the-week time series, and finally establish a combinatorial model by modifying them. The models are applied to 1 year of daily visit data for urban outpatients in two internal medicine departments of a large hospital in Chengdu, for forecasting daily outpatient visits about 1 week ahead. The proposed model is used to forecast the cross-sectional data for 7 consecutive days of daily outpatient visits over an 8-week period, based on 43 weeks of observation data during 1 year. The results show that the two single traditional models and the combinatorial model are simple to implement and computationally light, whilst being appropriate for short-term forecast horizons. Furthermore, the combinatorial model captures the comprehensive features of the time series data better. The combinatorial model achieves better prediction performance than the single models, with lower residual variance and a small mean residual error, and will be optimized further in the next research step.
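The abstract does not spell out how the two models are combined, so the Python sketch below (assuming statsmodels is available) simply averages a weekly-seasonal SARIMA forecast of the daily series with SES forecasts fitted to each day-of-week sub-series; the model orders and the 50/50 blend are illustrative assumptions, not the paper's calibrated model.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import SimpleExpSmoothing
from statsmodels.tsa.statespace.sarimax import SARIMAX

def combined_forecast(daily_visits: pd.Series, horizon: int = 7) -> np.ndarray:
    """Forecast the next `horizon` days by averaging (i) a weekly-seasonal
    SARIMA forecast of the full daily series and (ii) an SES forecast fitted
    separately to each day-of-week sub-series.  `daily_visits` must carry a
    daily DatetimeIndex."""
    sarima = SARIMAX(daily_visits, order=(1, 0, 1),
                     seasonal_order=(1, 0, 1, 7)).fit(disp=False)
    f_sarima = np.asarray(sarima.forecast(horizon))

    f_dow = np.empty(horizon)
    for h in range(horizon):
        day = daily_visits.index[-1] + pd.Timedelta(days=h + 1)
        sub = daily_visits[daily_visits.index.dayofweek == day.dayofweek]
        f_dow[h] = float(np.asarray(SimpleExpSmoothing(sub).fit().forecast(1))[0])

    return 0.5 * (f_sarima + f_dow)   # illustrative equal-weight combination
```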
NASA Astrophysics Data System (ADS)
Li, Yuzhong
When using a GA to solve the winner determination problem (WDP) with many bids and items under different distributions, the large search space and complex constraints make it easy to produce infeasible solutions, which affects the efficiency and solution quality of the algorithm. This paper presents an improved monkey-king genetic algorithm (MKGA) that includes three operators: preprocessing, bid insertion, and exchange recombination, and uses a monkey-king elite preservation strategy. Experimental results show that the improved MKGA outperforms the SGA in terms of required population size and computation. Problems that a traditional branch-and-bound algorithm finds hard to solve can be handled by the improved MKGA with better results.
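For readers unfamiliar with the WDP, the sketch below expresses the feasibility constraint in code: each bid names a price and a set of items, and no item may be awarded twice. The greedy baseline here only illustrates the problem (and one way to stay feasible), not the paper's MKGA operators.

```python
def greedy_wdp(bids):
    """Greedy baseline for the winner determination problem: each bid is a
    (price, set_of_items) pair; accept bids in decreasing order of price per
    item, skipping any bid that overlaps an item already awarded, so the
    allocation stays feasible by construction."""
    order = sorted(range(len(bids)),
                   key=lambda i: bids[i][0] / len(bids[i][1]), reverse=True)
    allocated, winners, revenue = set(), [], 0.0
    for i in order:
        price, items = bids[i]
        if allocated.isdisjoint(items):
            winners.append(i)
            allocated |= set(items)
            revenue += price
    return winners, revenue

# Example: greedy_wdp([(10, {"a", "b"}), (6, {"b"}), (5, {"c"})]) returns
# ([1, 2], 11.0), whereas the optimal allocation is bids 0 and 2 with revenue
# 15 -- illustrating why heuristics must search beyond greedy choices.
```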
Optical solver of combinatorial problems: nanotechnological approach.
Cohen, Eyal; Dolev, Shlomi; Frenkel, Sergey; Kryzhanovsky, Boris; Palagushkin, Alexandr; Rosenblit, Michael; Zakharov, Victor
2013-09-01
We present an optical computing system to solve NP-hard problems. As nano-optical computing is a promising venue for the next generation of computers performing parallel computations, we investigate the application of submicron, or even subwavelength, computing device designs. The system utilizes a setup of exponential sized masks with exponential space complexity produced in polynomial time preprocessing. The masks are later used to solve the problem in polynomial time. The size of the masks is reduced to nanoscaled density. Simulations were done to choose a proper design, and actual implementations show the feasibility of such a system.
Monkey search algorithm for ECE components partitioning
NASA Astrophysics Data System (ADS)
Kuliev, Elmar; Kureichik, Vladimir; Kureichik, Vladimir, Jr.
2018-05-01
The paper considers one of the important design problems – the partitioning of electronic computer equipment (ECE) components (blocks). It belongs to the NP-hard class of problems and has a combinatorial and logical nature. In the paper, the partitioning problem is formulated as the partition of a graph into parts. To solve the given problem, the authors suggest using a bioinspired approach based on a monkey search algorithm. Based on the developed software, computational experiments were carried out that show the algorithm's efficiency, as well as its recommended settings for obtaining more effective solutions in comparison with a genetic algorithm.
Laser-assisted development of titanium alloys: the search for new biomedical materials
NASA Astrophysics Data System (ADS)
Almeida, Amelia; Gupta, Dheeraj; Vilar, Rui
2011-02-01
Ti alloys used in prosthetic applications are mostly alloys initially developed for aeronautical applications, so their behavior was not optimized for medical use. A need remains to design new alloys for biomedical applications, where requirements such as biocompatibility, in-body durability, specific manufacturing ability, and cost effectiveness are considered. Materials for this application must present excellent biocompatibility, ductility, toughness and wear and corrosion resistance, a large laser processing window and low sensitivity to changes in the processing parameters. Laser deposition has been investigated in order to assess its applicability to laser-based manufactured implants. In this study, variable-powder-feed-rate laser cladding has been used as a method for the combinatorial investigation of new alloy systems; it offers a unique possibility for the rapid and exhaustive preparation of a whole range of alloys with compositions varying along a single clad track. This method was used to produce composition-gradient Ti-Mo alloys. Mo was used since it is among the few biocompatible, non-toxic β-Ti phase stabilizers. Alloy tracks with compositions in the range 0-19 wt.% Mo were produced and characterized in detail as a function of composition, using microscale testing procedures to screen for compositions with promising properties. Microstructural analysis showed that alloys with Mo content above 8% are fully formed of β phase grains. However, these β grains present a cellular substructure that is associated with a Ti and Mo segregation pattern occurring during solidification. Ultramicroindentation tests carried out to evaluate the alloys' hardness and Young's modulus showed that Ti-13%Mo alloys presented the lowest hardness and a Young's modulus (70 GPa) closer to that of bone than common Ti alloys, thus showing great potential for implant applications.
Cooperative combinatorial optimization: evolutionary computation case study.
Burgin, Mark; Eberbach, Eugene
2008-01-01
This paper presents a formalization of the notion of cooperation and competition of multiple systems that work toward a common optimization goal of the population using evolutionary computation techniques. It is proved that evolutionary algorithms are more expressive than conventional recursive algorithms, such as Turing machines. Three classes of evolutionary computations are introduced and studied: bounded finite, unbounded finite, and infinite computations. Universal evolutionary algorithms are constructed. Such properties of evolutionary algorithms as completeness, optimality, and search decidability are examined. A natural extension of evolutionary Turing machine (ETM) model is proposed to properly reflect phenomena of cooperation and competition in the whole population.
Swarm Intelligence Optimization and Its Applications
NASA Astrophysics Data System (ADS)
Ding, Caichang; Lu, Lu; Liu, Yuanchao; Peng, Wenxiu
Swarm Intelligence is a computational and behavioral metaphor for solving distributed problems inspired from biological examples provided by social insects such as ants, termites, bees, and wasps and by swarm, herd, flock, and shoal phenomena in vertebrates such as fish shoals and bird flocks. An example of successful research direction in Swarm Intelligence is ant colony optimization (ACO), which focuses on combinatorial optimization problems. Ant algorithms can be viewed as multi-agent systems (ant colony), where agents (individual ants) solve required tasks through cooperation in the same way that ants create complex social behavior from the combined efforts of individuals.
Zhang, H H; Gao, S; Chen, W; Shi, L; D'Souza, W D; Meyer, R R
2013-03-21
An important element of radiation treatment planning for cancer therapy is the selection of beam angles (out of all possible coplanar and non-coplanar angles in relation to the patient) in order to maximize the delivery of radiation to the tumor site and minimize radiation damage to nearby organs-at-risk. This category of combinatorial optimization problem is particularly difficult because direct evaluation of the quality of treatment corresponding to any proposed selection of beams requires the solution of a large-scale dose optimization problem involving many thousands of variables that represent doses delivered to volume elements (voxels) in the patient. However, if the quality of angle sets can be accurately estimated without expensive computation, a large number of angle sets can be considered, increasing the likelihood of identifying a very high quality set. Using a computationally efficient surrogate beam set evaluation procedure based on single-beam data extracted from plans employing equally-spaced beams (eplans), we have developed a global search metaheuristic process based on the nested partitions framework for this combinatorial optimization problem. The surrogate scoring mechanism allows us to assess thousands of beam set samples within a clinically acceptable time frame. Tests on difficult clinical cases demonstrate that the beam sets obtained via our method are of superior quality.
Zhang, H H; Gao, S; Chen, W; Shi, L; D’Souza, W D; Meyer, R R
2013-01-01
An important element of radiation treatment planning for cancer therapy is the selection of beam angles (out of all possible coplanar and non-coplanar angles in relation to the patient) in order to maximize the delivery of radiation to the tumor site and minimize radiation damage to nearby organs-at-risk. This category of combinatorial optimization problem is particularly difficult because direct evaluation of the quality of treatment corresponding to any proposed selection of beams requires the solution of a large-scale dose optimization problem involving many thousands of variables that represent doses delivered to volume elements (voxels) in the patient. However, if the quality of angle sets can be accurately estimated without expensive computation, a large number of angle sets can be considered, increasing the likelihood of identifying a very high quality set. Using a computationally efficient surrogate beam set evaluation procedure based on single-beam data extracted from plans employing equally-spaced beams (eplans), we have developed a global search metaheuristic process based on the Nested Partitions framework for this combinatorial optimization problem. The surrogate scoring mechanism allows us to assess thousands of beam set samples within a clinically acceptable time frame. Tests on difficult clinical cases demonstrate that the beam sets obtained via our method are of superior quality. PMID:23459411
Application of evolutionary computation in ECAD problems
NASA Astrophysics Data System (ADS)
Lee, Dae-Hyun; Hwang, Seung H.
1998-10-01
Design of modern electronic system is a complicated task which demands the use of computer- aided design (CAD) tools. Since a lot of problems in ECAD are combinatorial optimization problems, evolutionary computations such as genetic algorithms and evolutionary programming have been widely employed to solve those problems. We have applied evolutionary computation techniques to solve ECAD problems such as technology mapping, microcode-bit optimization, data path ordering and peak power estimation, where their benefits are well observed. This paper presents experiences and discusses issues in those applications.
Gooding, Owen W
2004-06-01
The use of parallel synthesis techniques with statistical design of experiments (DoE) methods is a powerful combination for the optimization of chemical processes. Advances in parallel synthesis equipment and easy-to-use software for statistical DoE have fueled a growing acceptance of these techniques in the pharmaceutical industry. As drug candidate structures become more complex at the same time that development timelines are compressed, these enabling technologies promise to become more important in the future.
Optimal Iterative Task Scheduling for Parallel Simulations.
1991-03-01
[Extraction residue from the report's reference list and table of contents. Recoverable items: a reference to Grimaldi, Ralph P., Discrete and Combinatorial Mathematics, Addison-Wesley, June 1989; section headings 4.8.1 "Problem Description" and 4.8.2 "Reasons for Level-Strate... Failure"; and appendix headings covering an A* overview, a sample A* section, and an evaluation.]
ERIC Educational Resources Information Center
Smolensky, Paul; Goldrick, Matthew; Mathis, Donald
2014-01-01
Mental representations have continuous as well as discrete, combinatorial properties. For example, while predominantly discrete, phonological representations also vary continuously; this is reflected by gradient effects in instrumental studies of speech production. Can an integrated theoretical framework address both aspects of structure? The…
Solforosi, Laura; Mancini, Nicasio; Canducci, Filippo; Clementi, Nicola; Sautto, Giuseppe Andrea; Diotti, Roberta Antonia; Clementi, Massimo; Burioni, Roberto
2012-07-01
A novel phagemid vector, named pCM, was optimized for the cloning and display of antibody fragment (Fab) libraries on the surface of filamentous phage. This vector contains two long DNA "stuffer" fragments for easier differentiation of the correctly cut forms of the vector. Moreover, in pCM the fragment at the heavy-chain cloning site contains an acid phosphatase-encoding gene allowing an easy distinction of the Escherichia coli cells containing the unmodified form of the phagemid versus the heavy-chain fragment coding cDNA. In pCM transcription of heavy-chain Fd/gene III and light chain is driven by a single lacZ promoter. The light chain is directed to the periplasm by the ompA signal peptide, whereas the heavy-chain Fd/coat protein III is trafficked by the pelB signal peptide. The phagemid pCM was used to generate a human combinatorial phage display antibody library that allowed the selection of a monoclonal Fab fragment antibody directed against the nucleoprotein (NP) of Influenza A virus.
NASA Astrophysics Data System (ADS)
Potyrailo, Radislav A.; Hassib, Lamyaa
2005-06-01
Multicomponent polymer-based formulations of optical sensor materials are difficult and time-consuming to optimize using conventional approaches. To address these challenges, our long-term goal is to determine relationships between sensor formulation and sensor response parameters using new scientific methodologies. As the first step, we have designed and implemented an automated analytical instrumentation infrastructure for combinatorial and high-throughput development of polymeric sensor materials for optical sensors. Our approach is based on the fabrication and performance screening of discrete and gradient sensor arrays. Simultaneous formation of multiple sensor coatings into discrete 4×6, 6×8, and 8×12 element arrays (3-15 μL volume per element) and their screening provides not only a well-recognized acceleration in the screening rate, but also considerably reduces or even eliminates sources of variability that randomly affect sensor response during conventional one-at-a-time sensor coating evaluation. The application of gradient sensor arrays provides additional capabilities for rapidly finding the optimal formulation parameters.
He, Jiankang; Du, Yanan; Guo, Yuqi; Hancock, Matthew J.; Wang, Ben; Shin, Hyeongho; Wu, Jinhui; Li, Dichen; Khademhosseini, Ali
2010-01-01
Combinatorial material synthesis is a powerful approach for creating composite material libraries for the high-throughput screening of cell–material interactions. Although current combinatorial screening platforms have been tremendously successful in identifying target (termed “hit”) materials from composite material libraries, new material synthesis approaches are needed to further optimize the concentrations and blending ratios of the component materials. Here we employed a microfluidic platform to rapidly synthesize composite materials containing cross-gradients of gelatin and chitosan for investigating cell–biomaterial interactions. The microfluidic synthesis of the cross-gradient was optimized experimentally and theoretically to produce quantitatively controllable variations in the concentrations and blending ratios of the two components. The anisotropic chemical compositions of the gelatin/chitosan cross-gradients were characterized by Fourier transform infrared spectrometry and X-ray photoelectron spectrometry. The three-dimensional (3D) porous gelatin/chitosan cross-gradient materials were shown to regulate the cellular morphology and proliferation of smooth muscle cells (SMCs) in a gradient-dependent manner. We envision that our microfluidic cross-gradient platform may accelerate the material development processes involved in a wide range of biomedical applications. PMID:20721897
Human Performance on Hard Non-Euclidean Graph Problems: Vertex Cover
ERIC Educational Resources Information Center
Carruthers, Sarah; Masson, Michael E. J.; Stege, Ulrike
2012-01-01
Recent studies on a computationally hard visual optimization problem, the Traveling Salesperson Problem (TSP), indicate that humans are capable of finding close to optimal solutions in near-linear time. The current study is a preliminary step in investigating human performance on another hard problem, the Minimum Vertex Cover Problem, in which…
Quantum annealing with parametrically driven nonlinear oscillators
NASA Astrophysics Data System (ADS)
Puri, Shruti
While progress has been made towards building Ising machines to solve hard combinatorial optimization problems, quantum speedups have so far been elusive. Furthermore, protecting annealers against decoherence and achieving long-range connectivity remain important outstanding challenges. With the hope of overcoming these challenges, I introduce a new paradigm for quantum annealing that relies on continuous variable states. Unlike the more conventional approach based on two-level systems, in this approach, quantum information is encoded in two coherent states that are stabilized by parametrically driving a nonlinear resonator. I will show that a fully connected Ising problem can be mapped onto a network of such resonators, and outline an annealing protocol based on adiabatic quantum computing. During the protocol, the resonators in the network evolve from vacuum to coherent states representing the ground state configuration of the encoded problem. In short, the system evolves between two classical states following non-classical dynamics. As will be supported by numerical results, this new annealing paradigm leads to superior noise resilience. Finally, I will discuss a realistic circuit QED realization of an all-to-all connected network of parametrically driven nonlinear resonators. The continuous variable nature of the states in the large Hilbert space of the resonator provides new opportunities for exploring quantum phase transitions and non-stoquastic dynamics during the annealing schedule.
NASA Astrophysics Data System (ADS)
Zittersteijn, M.; Vananti, A.; Schildknecht, T.; Dolado Perez, J. C.; Martinot, V.
2016-11-01
Currently several thousands of objects are being tracked in the MEO and GEO regions through optical means. The problem faced in this framework is that of Multiple Target Tracking (MTT). The MTT problem quickly becomes an NP-hard combinatorial optimization problem. This means that the effort required to solve the MTT problem increases exponentially with the number of tracked objects. In an attempt to find an approximate solution of sufficient quality, several Population-Based Meta-Heuristic (PBMH) algorithms are implemented and tested on simulated optical measurements. These first results show that one of the tested algorithms, namely the Elitist Genetic Algorithm (EGA), consistently displays the desired behavior of finding good approximate solutions before reaching the optimum. The results further suggest that the algorithm possesses a polynomial time complexity, as the computation times are consistent with a polynomial model. With the advent of improved sensors and a heightened interest in the problem of space debris, it is expected that the number of tracked objects will grow by an order of magnitude in the near future. This research aims to provide a method that can treat the association and orbit determination problems simultaneously, and is able to efficiently process large data sets with minimal manual intervention.
Colored Traveling Salesman Problem.
Li, Jun; Zhou, MengChu; Sun, Qirui; Dai, Xianzhong; Yu, Xiaolong
2015-11-01
The multiple traveling salesman problem (MTSP) is an important combinatorial optimization problem. It has been widely and successfully applied to practical cases in which multiple traveling individuals (salesmen) share a common workspace (city set). However, it cannot represent some application problems where multiple traveling individuals not only have their own exclusive tasks but also share a group of tasks with each other. This work proposes a new MTSP called the colored traveling salesman problem (CTSP) for handling such cases. Two types of city groups are defined, i.e., each group of exclusive cities of a single color for one salesman to visit and a group of shared cities of multiple colors allowing all salesmen to visit. Evidence shows that CTSP is NP-hard and that a multidepot MTSP and multiple single traveling salesman problems are its special cases. We present a genetic algorithm (GA) with dual-chromosome coding for CTSP and analyze the corresponding solution space. Then, the GA is improved by incorporating greedy, hill-climbing (HC), and simulated annealing (SA) operations to achieve better performance. Through experiments, the limitation of the exact solution method is revealed and the performance of the presented GAs is compared. The results suggest that SAGA achieves the best solution quality and that HCGA should be the choice for making a good tradeoff between solution quality and computing time.
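A minimal sketch of the dual-chromosome coding mentioned above is given below: one chromosome holds the city permutation and the other the salesman assignment, with exclusive (colored) cities pinned to their required salesman. The instance, city numbering, and helper names are hypothetical, and the paper's GA operators (greedy, hill-climbing, simulated annealing) are not reproduced.

```python
import random

# Hypothetical CTSP instance: 3 salesmen, cities 0-5 exclusive to one salesman
# (their "color"), cities 6-9 shared (any salesman may visit them).
EXCLUSIVE = {0: 0, 1: 0, 2: 1, 3: 1, 4: 2, 5: 2}   # city -> required salesman
SHARED = [6, 7, 8, 9]
CITIES = list(EXCLUSIVE) + SHARED
N_SALESMEN = 3

def random_individual():
    """Dual-chromosome coding: a city permutation plus a salesman assignment."""
    tour = random.sample(CITIES, len(CITIES))
    assign = [EXCLUSIVE.get(c, random.randrange(N_SALESMEN)) for c in tour]
    return tour, assign

def routes(individual):
    """Split the permutation into one route per salesman, preserving order."""
    tour, assign = individual
    per_salesman = {k: [] for k in range(N_SALESMEN)}
    for city, k in zip(tour, assign):
        per_salesman[k].append(city)
    return per_salesman

if __name__ == "__main__":
    ind = random_individual()
    print(routes(ind))
```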
Optimal Control Surface Layout for an Aeroservoelastic Wingbox
NASA Technical Reports Server (NTRS)
Stanford, Bret K.
2017-01-01
This paper demonstrates a technique for locating the optimal control surface layout of an aeroservoelastic Common Research Model wingbox, in the context of maneuver load alleviation and active flutter suppression. The combinatorial actuator layout design is solved using ideas borrowed from topology optimization, where the effectiveness of a given control surface is tied to a layout design variable, which varies from zero (the actuator is removed) to one (the actuator is retained). These layout design variables are optimized concurrently with a large number of structural wingbox sizing variables and control surface actuation variables, in order to minimize the sum of structural weight and actuator weight. Results are presented that demonstrate interdependencies between structural sizing patterns and optimal control surface layouts, for both static and dynamic aeroelastic physics.
Arbitrary norm support vector machines.
Huang, Kaizhu; Zheng, Danian; King, Irwin; Lyu, Michael R
2009-02-01
Support vector machines (SVM) are state-of-the-art classifiers. Typically L2-norm or L1-norm is adopted as a regularization term in SVMs, while other norm-based SVMs, for example, the L0-norm SVM or even the L(infinity)-norm SVM, are rarely seen in the literature. The major reason is that L0-norm describes a discontinuous and nonconvex term, leading to a combinatorially NP-hard optimization problem. In this letter, motivated by Bayesian learning, we propose a novel framework that can implement arbitrary norm-based SVMs in polynomial time. One significant feature of this framework is that only a sequence of sequential minimal optimization problems needs to be solved, thus making it practical in many real applications. The proposed framework is important in the sense that Bayesian priors can be efficiently plugged into most learning methods without knowing the explicit form. Hence, this builds a connection between Bayesian learning and the kernel machines. We derive the theoretical framework, demonstrate how our approach works on the L0-norm SVM as a typical example, and perform a series of experiments to validate its advantages. Experimental results on nine benchmark data sets are very encouraging. The implemented L0-norm is competitive with or even better than the standard L2-norm SVM in terms of accuracy, but with a reduced number of support vectors (9.46% of the number on average). When compared with another sparse model, the relevance vector machine, our proposed algorithm also demonstrates better sparse properties with a training speed over seven times faster.
Concepts and applications of "natural computing" techniques in de novo drug and peptide design.
Hiss, Jan A; Hartenfeller, Markus; Schneider, Gisbert
2010-05-01
Evolutionary algorithms, particle swarm optimization, and ant colony optimization have emerged as robust optimization methods for molecular modeling and peptide design. Such algorithms mimic combinatorial molecule assembly by using molecular fragments as building-blocks for compound construction, and relying on adaptation and emergence of desired pharmacological properties in a population of virtual molecules. Nature-inspired algorithms might be particularly suited for bioisosteric replacement or scaffold-hopping from complex natural products to synthetically more easily accessible compounds that are amenable to optimization by medicinal chemistry. The theory and applications of selected nature-inspired algorithms for drug design are reviewed, together with practical applications and a discussion of their advantages and limitations.
Harańczyk, Maciej; Gutowski, Maciej
2007-01-01
We describe a procedure for finding low-energy tautomers of a molecule. The procedure consists of (i) combinatorial generation of a library of tautomers, (ii) screening based on the results of geometry optimization of initial structures performed at the density functional level of theory, and (iii) final refinement of geometry for the top hits at the second-order Møller-Plesset level of theory followed by single-point energy calculations at the coupled cluster level of theory with single, double, and perturbative triple excitations. The library of initial structures of various tautomers is generated with TauTGen, a tautomer generator program. The procedure proved to be successful for those molecular systems for which common chemical knowledge had not been sufficient to predict the most stable structures.
Stochastic dynamics and combinatorial optimization
NASA Astrophysics Data System (ADS)
Ovchinnikov, Igor V.; Wang, Kang L.
2017-11-01
Natural dynamics is often dominated by sudden nonlinear processes such as neuroavalanches, gamma-ray bursts, solar flares, etc., that exhibit scale-free statistics much in the spirit of the logarithmic Richter scale for earthquake magnitudes. On phase diagrams, stochastic dynamical systems (DSs) exhibiting this type of dynamics belong to the finite-width phase (N-phase for brevity) that precedes ordinary chaotic behavior and that is known under such names as noise-induced chaos, self-organized criticality, dynamical complexity, etc. Within the recently proposed supersymmetric theory of stochastic dynamics, the N-phase can be roughly interpreted as the noise-induced “overlap” between integrable and chaotic deterministic dynamics. As a result, the N-phase dynamics inherits the properties of both. Here, we analyze this unique set of properties and conclude that the N-phase DSs must naturally be the most efficient optimizers: on one hand, N-phase DSs have integrable flows with well-defined attractors that can be associated with candidate solutions and, on the other hand, the noise-induced attractor-to-attractor dynamics in the N-phase is effectively chaotic or aperiodic so that a DS must avoid revisiting solutions/attractors thus accelerating the search for the best solution. Based on this understanding, we propose a method for stochastic dynamical optimization using the N-phase DSs. This method can be viewed as a hybrid of the simulated and chaotic annealing methods. Our proposition can result in a new generation of hardware devices for efficient solution of various search and/or combinatorial optimization problems.
Fast Optimization of LiMgMnOx/La2O3 Catalysts for the Oxidative Coupling of Methane.
Li, Zhinian; He, Lei; Wang, Shenliang; Yi, Wuzhong; Zou, Shihui; Xiao, Liping; Fan, Jie
2017-01-09
The development of efficient catalysts for the oxidative coupling of methane (OCM) reaction represents a grand challenge in the direct conversion of methane into other useful products. Here, we report that a newly developed combinatorial approach can be used for ultrafast optimization of La2O3-based multicomponent metal oxide catalysts in the OCM reaction. This new approach, which integrates inkjet printing assisted synthesis (IJP-A) with a multidimensional group testing strategy (m-GT), tactfully takes the place of the conventional high-throughput synthesis-and-screen experiment. Within just a week, 2048 formulated LiMgMnOx-La2O3 catalysts in a 64 × 8 × 8 × 8 × 8 = 262,144 compositional space were fabricated by IJP-A in a four-round synthesis-and-screen process, and an optimized formulation was successfully identified through only 4 × 8 = 32 tests via the m-GT screening strategy. The screening process identifies the most promising ternary composition region as Li0-0.48Mg0-6.54Mn0-0.62-La100Ox, with an external C2 yield of 10.87% at 700 °C. This C2 yield is two times as high as that of pure nano-La2O3. The good performance of the optimized catalyst formulation has been validated by manual preparation, which further proves the effectiveness of the new combinatorial methodology in the fast discovery of heterogeneous catalysts.
2014-03-27
[Extraction residue from the report's reference list and problem statement. Recoverable items: a 1959 Operations Research reference, "On a linear-programming, combinatorial approach to the traveling-salesman problem" (pp. 58-66); a reference to Daugherty, P. J., Myers, M. B.; and a problem statement noting that, as of 01 September 2013, the USAF is tracking 12,571 individual Class VII assets valued at $213.5 million for final disposition.]
Haverkamp, Alexander; Hansson, Bill S.; Knaden, Markus
2018-01-01
Insects, including those which provide vital ecosystem services as well as those which are devastating pests or disease vectors, locate their resources mainly based on olfaction. Understanding insect olfaction not only from a neurobiological but also from an ecological perspective is therefore crucial to balance insect control and conservation. However, among all sensory stimuli olfaction is particularly hard to grasp. Our chemical environment is made up of thousands of different compounds, which might again be detected by our nose in multiple ways. Due to this complexity, researchers have only recently begun to explore the chemosensory ecology of model organisms such as Drosophila, linking the tools of chemical ecology to those of neurogenetics. This cross-disciplinary approach has enabled several studies that range from single odors and their ecological relevance, via olfactory receptor genes and neuronal processing, up to the insects' behavior. We learned that the insect olfactory system employs strategies of combinatorial coding to process general odors as well as labeled lines for specific compounds that call for an immediate response. These studies opened new doors to the olfactory world in which insects feed, oviposit, and mate. PMID:29449815
Exact solution of large asymmetric traveling salesman problems.
Miller, D L; Pekny, J F
1991-02-15
The traveling salesman problem is one of a class of difficult problems in combinatorial optimization that is representative of a large number of important scientific and engineering problems. A survey is given of recent applications and methods for solving large problems. In addition, an algorithm for the exact solution of the asymmetric traveling salesman problem is presented along with computational results for several classes of problems. The results show that the algorithm performs remarkably well for some classes of problems, determining an optimal solution even for problems with large numbers of cities, yet for other classes, even small problems thwart determination of a provably optimal solution.
Improved mine blast algorithm for optimal cost design of water distribution systems
NASA Astrophysics Data System (ADS)
Sadollah, Ali; Guen Yoo, Do; Kim, Joong Hoon
2015-12-01
The design of water distribution systems is a large class of combinatorial, nonlinear optimization problems with complex constraints such as conservation of mass and energy equations. Since feasible solutions are often extremely complex, traditional optimization techniques are insufficient. Recently, metaheuristic algorithms have been applied to this class of problems because they are highly efficient. In this article, a recently developed optimizer called the mine blast algorithm (MBA) is considered. The MBA is improved and coupled with the hydraulic simulator EPANET to find the optimal cost design for water distribution systems. The performance of the improved mine blast algorithm (IMBA) is demonstrated using the well-known Hanoi, New York tunnels and Balerma benchmark networks. Optimization results obtained using IMBA are compared to those using MBA and other optimizers in terms of their minimum construction costs and convergence rates. For the complex Balerma network, IMBA offers the cheapest network design compared to other optimization algorithms.
Identifying Floppy and Rigid Regions in Proteins
NASA Astrophysics Data System (ADS)
Jacobs, D. J.; Thorpe, M. F.; Kuhn, L. A.
1998-03-01
In proteins it is possible to separate hard covalent forces involving bond lengths and bond angles from other weak forces. We model the microstructure of the protein as a generic bar-joint truss framework, where the hard covalent forces and strong hydrogen bonds are regarded as rigid bar constraints. We study the mechanical stability of proteins using FIRST (Floppy Inclusions and Rigid Substructure Topography), based on a recently developed combinatorial constraint counting algorithm (the 3D Pebble Game), which is a generalization of the 2D pebble game (D. J. Jacobs and M. F. Thorpe, "Generic Rigidity: The Pebble Game", Phys. Rev. Lett. 75, 4051-4054 (1995)) for the special class of bond-bending networks (D. J. Jacobs, "Generic Rigidity in Three Dimensional Bond-bending Networks", Preprint, Aug. 1997). This approach is useful in identifying rigid motifs and flexible linkages in proteins, and thereby determines the essential degrees of freedom. We will show some preliminary results from the FIRST analysis on the myohemerythrin and lysozyme proteins.
An Adaptive Niching Genetic Algorithm using a niche size equalization mechanism
NASA Astrophysics Data System (ADS)
Nagata, Yuichi
Niching GAs have been widely investigated as a way to apply genetic algorithms (GAs) to multimodal function optimization problems. In this paper, we suggest a new niching GA that attempts to form niches, each consisting of an equal number of individuals. The proposed GA can also be applied to combinatorial optimization problems by defining a distance metric in the search space. We apply the proposed GA to the job-shop scheduling problem (JSP) and demonstrate that the proposed niching method enhances the ability to maintain niches and improves the performance of GAs.
Saldaña, Erick; Siche, Raúl; da Silva Pinto, Jair Sebastião; de Almeida, Marcio Aurélio; Selani, Miriam Mabel; Rios-Mera, Juan; Contreras-Castillo, Carmen J
2018-02-01
This study aims to optimize simultaneously the lipid profile and instrumental hardness of low-fat mortadella. For lipid mixture optimization, the overlapping of surface boundaries was used to select the quantities of canola, olive, and fish oils, in order to maximize PUFAs, specifically the long-chain n-3 fatty acids (eicosapentaenoic-EPA, docosahexaenoic acids-DHA) using the minimum content of fish oil. Increased quantities of canola oil were associated with higher PUFA/SFA ratios. The presence of fish oil, even in small amounts, was effective in improving the nutritional quality of the mixture, showing lower n-6/n-3 ratios and significant levels of EPA and DHA. Thus, the optimal lipid mixture comprised of 20, 30 and 50% fish, olive and canola oils, respectively, which present PUFA/SFA (2.28) and n-6/n-3 (2.30) ratios within the recommendations of a healthy diet. Once the lipid mixture was optimized, components of the pre-emulsion used as fat replacer in the mortadella, such as lipid mixture (LM), sodium alginate (SA), and milk protein concentrate (PC), were studied to optimize hardness and springiness to target ranges of 13-16 N and 0.86-0.87, respectively. Results showed that springiness was not significantly affected by these variables. However, as the concentration of the three components increased, hardness decreased. Through the desirability function, the optimal proportions were 30% LM, 0.5% SA, and 0.5% PC. This study showed that the pre-emulsion decreases hardness of mortadella. In addition, response surface methodology was efficient to model lipid mixture and hardness, resulting in a product with improved texture and lipid quality.
Tabu Search enhances network robustness under targeted attacks
NASA Astrophysics Data System (ADS)
Sun, Shi-wen; Ma, Yi-lin; Li, Rui-qi; Wang, Li; Xia, Cheng-yi
2016-03-01
We focus on the optimization of network robustness with respect to intentional attacks on high-degree nodes. Given an existing network, this problem can be considered as a typical single-objective combinatorial optimization problem. Based on the heuristic Tabu Search optimization algorithm, a link-rewiring method is applied to reconstruct the network while keeping the degree of every node unchanged. Through numerical simulations, BA scale-free network and two real-world networks are investigated to verify the effectiveness of the proposed optimization method. Meanwhile, we analyze how the optimization affects other topological properties of the networks, including natural connectivity, clustering coefficient and degree-degree correlation. The current results can help to improve the robustness of existing complex real-world systems, as well as to provide some insights into the design of robust networks.
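A minimal sketch of the degree-preserving rewiring idea is shown below, assuming networkx for graph handling: robustness is measured by the average largest-component fraction under removal of high-degree nodes, and rewiring uses double edge swaps, accepted greedily rather than with the paper's full Tabu Search bookkeeping (tabu list, aspiration rules).

```python
import random
import networkx as nx

def robustness(G):
    """Targeted-attack robustness: average fraction of nodes in the largest
    connected component while hubs (highest current degree) are removed."""
    H = G.copy()
    n = G.number_of_nodes()
    total = 0.0
    for _ in range(n - 1):
        hub = max(H.degree, key=lambda kv: kv[1])[0]
        H.remove_node(hub)
        largest = max(len(c) for c in nx.connected_components(H))
        total += largest / n
    return total / n

def optimize_robustness(G, steps=50, seed=0):
    """Degree-preserving rewiring via double edge swaps, keeping a swap whenever
    it does not decrease robustness (a greedy stand-in for Tabu Search)."""
    rng = random.Random(seed)
    best, best_r = G.copy(), robustness(G)
    for _ in range(steps):
        trial = best.copy()
        nx.double_edge_swap(trial, nswap=1, max_tries=100, seed=rng)
        r = robustness(trial)
        if r >= best_r:
            best, best_r = trial, r
    return best, best_r

if __name__ == "__main__":
    G0 = nx.barabasi_albert_graph(50, 2, seed=1)
    G1, r1 = optimize_robustness(G0, steps=50)
    print("robustness before:", round(robustness(G0), 4), "after:", round(r1, 4))
```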
García-Pedrajas, Nicolás; Ortiz-Boyer, Domingo; Hervás-Martínez, César
2006-05-01
In this work we present a new approach to the crossover operator in the genetic evolution of neural networks. The most widely used evolutionary computation paradigm for neural network evolution is evolutionary programming. This paradigm is usually preferred due to the problems caused by the application of crossover to neural network evolution. However, crossover is the most innovative operator within the field of evolutionary computation. One of the most notorious problems with the application of crossover to neural networks is known as the permutation problem. This problem occurs because the same network can be represented in a genetic coding by many different codifications. Our approach modifies the standard crossover operator to take into account the special features of the individuals to be mated. We present a new model for mating individuals that considers the structure of the hidden layer and redefines the crossover operator. As each hidden node represents a non-linear projection of the input variables, we approach the crossover as a combinatorial optimization problem. We can formulate the problem as the extraction of a subset of near-optimal projections to create the hidden layer of the new network. This new approach is compared to a classical crossover on 25 real-world problems with excellent performance. Moreover, the networks obtained are much smaller than those obtained with the classical crossover operator.
Effect of the Implicit Combinatorial Model on Combinatorial Reasoning in Secondary School Pupils.
ERIC Educational Resources Information Center
Batanero, Carmen; And Others
1997-01-01
Elementary combinatorial problems may be classified into three different combinatorial models: (1) selection; (2) partition; and (3) distribution. The main goal of this research was to determine the effect of the implicit combinatorial model on pupils' combinatorial reasoning before and after instruction. Gives an analysis of variance of the…
Solution for a bipartite Euclidean traveling-salesman problem in one dimension
NASA Astrophysics Data System (ADS)
Caracciolo, Sergio; Di Gioacchino, Andrea; Gherardi, Marco; Malatesta, Enrico M.
2018-05-01
The traveling-salesman problem is one of the most studied combinatorial optimization problems, because of the simplicity in its statement and the difficulty in its solution. We characterize the optimal cycle for every convex and increasing cost function when the points are thrown independently and with an identical probability distribution in a compact interval. We compute the average optimal cost for every number of points when the distance function is the square of the Euclidean distance. We also show that the average optimal cost is not a self-averaging quantity by explicitly computing the variance of its distribution in the thermodynamic limit. Moreover, we prove that the cost of the optimal cycle is not smaller than twice the cost of the optimal assignment of the same set of points. Interestingly, this bound is saturated in the thermodynamic limit.
CAMELOT: Computational-Analytical Multi-fidElity Low-thrust Optimisation Toolbox
NASA Astrophysics Data System (ADS)
Di Carlo, Marilena; Romero Martin, Juan Manuel; Vasile, Massimiliano
2018-03-01
Computational-Analytical Multi-fidElity Low-thrust Optimisation Toolbox (CAMELOT) is a toolbox for the fast preliminary design and optimisation of low-thrust trajectories. It solves highly complex combinatorial problems to plan multi-target missions characterised by long spirals including different perturbations. To do so, CAMELOT implements a novel multi-fidelity approach combining analytical surrogate modelling and accurate computational estimations of the mission cost. Decisions are then made using two optimisation engines included in the toolbox, a single-objective global optimiser, and a combinatorial optimisation algorithm. CAMELOT has been applied to a variety of case studies: from the design of interplanetary trajectories to the optimal de-orbiting of space debris and from the deployment of constellations to on-orbit servicing. In this paper, the main elements of CAMELOT are described and two examples, solved using the toolbox, are presented.
Silva, Aleidy; Lee, Bai-Yu; Clemens, Daniel L; Kee, Theodore; Ding, Xianting; Ho, Chih-Ming; Horwitz, Marcus A
2016-04-12
Tuberculosis (TB) remains a major global public health problem, and improved treatments are needed to shorten duration of therapy, decrease disease burden, improve compliance, and combat emergence of drug resistance. Ideally, the most effective regimen would be identified by a systematic and comprehensive combinatorial search of large numbers of TB drugs. However, optimization of regimens by standard methods is challenging, especially as the number of drugs increases, because of the extremely large number of drug-dose combinations requiring testing. Herein, we used an optimization platform, feedback system control (FSC) methodology, to identify improved drug-dose combinations for TB treatment using a fluorescence-based human macrophage cell culture model of TB, in which macrophages are infected with isopropyl β-D-1-thiogalactopyranoside (IPTG)-inducible green fluorescent protein (GFP)-expressing Mycobacterium tuberculosis (Mtb). On the basis of only a single screening test and three iterations, we identified highly efficacious three- and four-drug combinations. To verify the efficacy of these combinations, we further evaluated them using a methodologically independent assay for intramacrophage killing of Mtb; the optimized combinations showed greater efficacy than the current standard TB drug regimen. Surprisingly, all top three- and four-drug optimized regimens included the third-line drug clofazimine, and none included the first-line drugs isoniazid and rifampin, which had insignificant or antagonistic impacts on efficacy. Because top regimens also did not include a fluoroquinolone or aminoglycoside, they are potentially of use for treating many cases of multidrug- and extensively drug-resistant TB. Our study shows the power of an FSC platform to identify promising previously unidentified drug-dose combinations for treatment of TB.
Cruz-Monteagudo, Maykel; Borges, Fernanda; Cordeiro, M Natália D S; Cagide Fajin, J Luis; Morell, Carlos; Ruiz, Reinaldo Molina; Cañizares-Carmenate, Yudith; Dominguez, Elena Rosa
2008-01-01
Up to now, very few applications of multiobjective optimization (MOOP) techniques to quantitative structure-activity relationship (QSAR) studies have been reported in the literature. However, none of them report the optimization of objectives related directly to the final pharmaceutical profile of a drug. In this paper, a MOOP method based on Derringer's desirability function that allows conducting global QSAR studies, simultaneously considering the potency, bioavailability, and safety of a set of drug candidates, is introduced. The results of the desirability-based MOOP (the levels of the predictor variables concurrently producing the best possible compromise between the properties determining an optimal drug candidate) are used for the implementation of a ranking method that is also based on the application of desirability functions. This method allows ranking drug candidates with unknown pharmaceutical properties from combinatorial libraries according to the degree of similarity with the previously determined optimal candidate. Application of this method will make it possible to filter the most promising drug candidates of a library (the best-ranked candidates), which should have the best pharmaceutical profile (the best compromise between potency, safety and bioavailability). In addition, a validation method of the ranking process, as well as a quantitative measure of the quality of a ranking, the ranking quality index (Psi), is proposed. The usefulness of the desirability-based methods of MOOP and ranking is demonstrated by its application to a library of 95 fluoroquinolones, reporting their gram-negative antibacterial activity and mammalian cell cytotoxicity. Finally, the combined use of the desirability-based methods of MOOP and ranking proposed here seems to be a valuable tool for rational drug discovery and development.
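For reference, a commonly used form of Derringer's desirability (here the larger-is-better, one-sided transform) and the overall desirability obtained as a geometric mean are written out below; the abstract does not reproduce the exact variant used, so this is the textbook form rather than the paper's.

```latex
% Larger-is-better (one-sided) Derringer desirability for predicted response Y_i,
% with acceptability bounds L_i < U_i and shape exponent s_i > 0:
d_i(Y_i) =
\begin{cases}
0, & Y_i \le L_i,\\
\left( \frac{Y_i - L_i}{U_i - L_i} \right)^{s_i}, & L_i < Y_i < U_i,\\
1, & Y_i \ge U_i,
\end{cases}
\qquad
D = \Bigl( \prod_{i=1}^{k} d_i \Bigr)^{1/k}.
```

The overall desirability D is zero whenever any single property is unacceptable, which is what makes it suitable for ranking candidates by their compromise across potency, safety, and bioavailability.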
Masmoudi, Fatma; Ben Khedher, Saoussen; Kamoun, Amel; Zouari, Nabil; Tounsi, Slim; Trigui, Mohamed
2017-04-01
This work is directed towards Bacillus amyloliquefaciens strain BLB371 metabolite production for the biocontrol of fungal phytopathogens. In order to maximise antifungal metabolite production by this strain, two approaches were combined: random mutagenesis and medium component optimization. After three rounds of mutagenesis, a hyperactive mutant, named M3-7, was obtained. It produces 7-fold more antifungal metabolites (1800 AU/mL) than the wild strain in MC medium. A hybrid design was applied to optimise a new medium to enhance antifungal metabolite production by M3-7. The new optimized medium (35 g/L of peptone, 32.5 g/L of sucrose, 10.5 g/L of yeast extract, 2.4 g/L of KH2PO4, 1.3 g/L of MgSO4 and 23 mg/L of MnSO4) achieved a 1.62-fold enhancement in antifungal compound production (3000 AU/mL) by this mutant, compared to that achieved in MC medium. Therefore, the combined effect of these two approaches (mutagenesis and medium component optimization) allowed a 12-fold improvement in antifungal activity (from 250 AU/mL to 3000 AU/mL). This improvement was confirmed against several phytopathogenic fungi with an increase of MIC and MFC of over 50%. More interestingly, total eradication of gray mold was obtained on tomato fruits infected by Botrytis cinerea and treated by M3-7, compared to those treated by BLB371. From a practical point of view, combining random mutagenesis and medium optimization can be considered an excellent tool for obtaining promising biological products useful against phytopathogenic fungi. Copyright © 2017 Elsevier GmbH. All rights reserved.
Control Coordination of Multiple Agents Through Decision Theoretic and Economic Methods
2003-02-01
[The line opens with standard report-documentation boilerplate. Recoverable abstract fragments: the authors investigated the design of test data for benchmarking such optimization algorithms; their other research on combinatorial auctions included ... an average combination rule; and they exemplified these theoretical results with experiments on stock market data, demonstrating how ensembles of classifiers can ...]
ERIC Educational Resources Information Center
MacGregor, James N.; Chronicle, Edward P.; Ormerod, Thomas C.
2006-01-01
We compared the performance of three heuristics with that of subjects on variants of a well-known combinatorial optimization task, the Traveling Salesperson Problem (TSP). The present task consisted of finding the shortest path through an array of points from one side of the array to the other. Like the standard TSP, the task is computationally…
Navigation Solution for a Multiple Satellite and Multiple Ground Architecture
2014-09-14
[Extraction residue from the report's table of contents and body. Recoverable fragments: contents entries for "Primer Vector Theory" and "The Traveling Salesman Problem"; a body passage framing the task as the Traveling Salesman problem [42], a nonlinear-programming, complete combinatorial optimization in which the orbital debris pieces relate ...; and a passage on impulsive maneuvers applying the findings to a Hohmann transfer with the addition of mid-course burns and wait times.]
Teixidó, Meritxell; Belda, Ignasi; Zurita, Esther; Llorà, Xavier; Fabre, Myriam; Vilaró, Senén; Albericio, Fernando; Giralt, Ernest
2005-12-01
The use of high-throughput methods in drug discovery allows the generation and testing of a large number of compounds, but at the price of providing redundant information. Evolutionary combinatorial chemistry combines the selection and synthesis of biologically active compounds with artificial intelligence optimization methods, such as genetic algorithms (GA). Drug candidates for the treatment of central nervous system (CNS) disorders must overcome the blood-brain barrier (BBB). This paper reports a new genetic algorithm that searches for the optimal physicochemical properties for peptide transport across the blood-brain barrier. A first generation of peptides has been generated and synthesized. Because of the high content of N-methyl amino acids in most of these peptides, their syntheses were especially challenging due to over-incorporations, deletions and DKP formations. Distinct fragmentation patterns during peptide cleavage have been identified. The first generation of peptides has been studied by evaluation techniques such as immobilized artificial membrane chromatography (IAMC), a cell-based assay, log P (octanol/water) calculations, etc. Finally, a second generation has been proposed. (c) 2005 European Peptide Society and John Wiley & Sons, Ltd.
Lin, Jingjing; Jing, Honglei
2016-01-01
Artificial immune system is one of the most recently introduced intelligence methods which was inspired by biological immune system. Most immune system inspired algorithms are based on the clonal selection principle, known as clonal selection algorithms (CSAs). When coping with complex optimization problems with the characteristics of multimodality, high dimension, rotation, and composition, the traditional CSAs often suffer from the premature convergence and unsatisfied accuracy. To address these concerning issues, a recombination operator inspired by the biological combinatorial recombination is proposed at first. The recombination operator could generate the promising candidate solution to enhance search ability of the CSA by fusing the information from random chosen parents. Furthermore, a modified hypermutation operator is introduced to construct more promising and efficient candidate solutions. A set of 16 common used benchmark functions are adopted to test the effectiveness and efficiency of the recombination and hypermutation operators. The comparisons with classic CSA, CSA with recombination operator (RCSA), and CSA with recombination and modified hypermutation operator (RHCSA) demonstrate that the proposed algorithm significantly improves the performance of classic CSA. Moreover, comparison with the state-of-the-art algorithms shows that the proposed algorithm is quite competitive. PMID:27698662
Liu, Chun; Kroll, Andreas
2016-01-01
Multi-robot task allocation determines the task sequence and distribution for a group of robots in multi-robot systems; it is a constrained combinatorial optimization problem that becomes more complex in the case of cooperative tasks, because these introduce additional spatial and temporal constraints. To solve multi-robot task allocation problems with cooperative tasks efficiently, a subpopulation-based genetic algorithm, a crossover-free genetic algorithm employing mutation operators and elitism selection in each subpopulation, is developed in this paper. Moreover, the impact of mutation operators (swap, insertion, inversion, displacement, and their various combinations) is analyzed when solving several industrial plant inspection problems. The experimental results show that: (1) the proposed genetic algorithm can obtain better solutions than the tested binary tournament genetic algorithm with partially mapped crossover; (2) inversion mutation performs better than the other tested mutation operators when solving problems without cooperative tasks, and the swap-inversion combination performs better than the other tested mutation operators/combinations when solving problems with cooperative tasks. As it is difficult to produce all desired effects with a single mutation operator, using multiple mutation operators (including both inversion and swap) is suggested when solving similar combinatorial optimization problems.
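A minimal sketch of the four permutation mutation operators named above (swap, insertion, inversion, displacement) is given below; the task sequence is represented as a plain list here, whereas the paper's encoding also carries robot assignments and cooperative-task constraints.

```python
import random

def swap(perm, rng=random):
    """Exchange two randomly chosen positions."""
    p = perm[:]
    i, j = rng.sample(range(len(p)), 2)
    p[i], p[j] = p[j], p[i]
    return p

def insertion(perm, rng=random):
    """Remove one element and reinsert it at another position."""
    p = perm[:]
    i, j = rng.sample(range(len(p)), 2)
    p.insert(j, p.pop(i))
    return p

def inversion(perm, rng=random):
    """Reverse the segment between two cut points."""
    p = perm[:]
    i, j = sorted(rng.sample(range(len(p)), 2))
    p[i:j + 1] = reversed(p[i:j + 1])
    return p

def displacement(perm, rng=random):
    """Move a whole segment to a new position."""
    p = perm[:]
    i, j = sorted(rng.sample(range(len(p)), 2))
    segment = p[i:j + 1]
    rest = p[:i] + p[j + 1:]
    k = rng.randrange(len(rest) + 1)
    return rest[:k] + segment + rest[k:]

if __name__ == "__main__":
    tasks = list(range(8))
    print(swap(tasks), insertion(tasks), inversion(tasks), displacement(tasks), sep="\n")
```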
Synthesis and characterization of catalysts and electrocatalysts using combinatorial methods
NASA Astrophysics Data System (ADS)
Ramanathan, Ramnarayanan
This thesis documents attempts at solving three problems. Bead-based parallel synthetic and screening methods based on matrix algorithms were developed. The method was applied to search for new heterogeneous catalysts for the dehydrogenation of methylcyclohexane. The most powerful use of the method to date was to optimize metal adsorption and evaluate catalysts as a function of incident energy, likely to be important in the future should the availability of energy become an optimization parameter. This work also highlighted the importance of the order of addition of metal salts on catalytic activity, and a portion of this work resulted in a patent with UOP LLC, Des Plaines, Illinois. Combinatorial methods were also investigated as a tool to search for carbon-monoxide-tolerant anode electrocatalysts and methanol-tolerant cathode electrocatalysts, resulting in the discovery of no new electrocatalysts. A physically intuitive scaling criterion was developed to analyze all experiments on electrocatalysts, providing insight for future experiments. We attempted to solve the CO poisoning problem in polymer electrolyte fuel cells using carbon molecular sieves as a separator. This approach was unsuccessful in solving the CO poisoning problem, possibly due to the tendency of the carbon molecular sieves to concentrate CO and CO2 in the pore walls.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mardirossian, Narbe; Head-Gordon, Martin
2016-06-07
A combinatorially optimized, range-separated hybrid, meta-GGA density functional with VV10 nonlocal correlation is presented in this paper. The final 12-parameter functional form is selected from approximately 10 × 10^9 candidate fits that are trained on a training set of 870 data points and tested on a primary test set of 2964 data points. The resulting density functional, ωB97M-V, is further tested for transferability on a secondary test set of 1152 data points. For comparison, ωB97M-V is benchmarked against 11 leading density functionals including M06-2X, ωB97X-D, M08-HX, M11, ωM05-D, ωB97X-V, and MN15. Encouragingly, the overall performance of ωB97M-V on nearly 5000 data points clearly surpasses that of all of the tested density functionals. Finally, in order to facilitate the use of ωB97M-V, its basis set dependence and integration grid sensitivity are thoroughly assessed, and recommendations that take into account both efficiency and accuracy are provided.
Discovery of Peptidomimetic Ligands of EED as Allosteric Inhibitors of PRC2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barnash, Kimberly D.; The, Juliana; Norris-Drouin, Jacqueline L.
The function of EED within polycomb repressive complex 2 (PRC2) is mediated by a complex network of protein–protein interactions. Allosteric activation of PRC2 by binding of methylated proteins to the embryonic ectoderm development (EED) aromatic cage is essential for full catalytic activity, but details of this regulation are not fully understood. EED's recognition of the product of PRC2 activity, histone H3 lysine 27 trimethylation (H3K27me3), stimulates PRC2 methyltransferase activity at adjacent nucleosomes, leading to H3K27me3 propagation and, ultimately, gene repression. By coupling combinatorial chemistry and structure-based design, we optimized a low-affinity methylated jumonji, AT-rich interactive domain 2 (Jarid2) peptide to a smaller, more potent peptidomimetic ligand (Kd = 1.14 ± 0.14 μM) of the aromatic cage of EED. Our strategy illustrates the effectiveness of applying combinatorial chemistry to achieve both ligand potency and property optimization. Furthermore, the resulting ligands, UNC5114 and UNC5115, demonstrate that targeted disruption of EED's reader function can lead to allosteric inhibition of PRC2 catalytic activity.
Robust quantum optimizer with full connectivity.
Nigg, Simon E; Lörch, Niels; Tiwari, Rakesh P
2017-04-01
Quantum phenomena have the potential to speed up the solution of hard optimization problems. For example, quantum annealing, based on the quantum tunneling effect, has recently been shown to scale exponentially better with system size than classical simulated annealing. However, current realizations of quantum annealers with superconducting qubits face two major challenges. First, the connectivity between the qubits is limited, excluding many optimization problems from a direct implementation. Second, decoherence degrades the success probability of the optimization. We address both of these shortcomings and propose an architecture in which the qubits are robustly encoded in continuous variable degrees of freedom. By leveraging the phenomenon of flux quantization, all-to-all connectivity with sufficient tunability to implement many relevant optimization problems is obtained without overhead. Furthermore, we demonstrate the robustness of this architecture by simulating the optimal solution of a small instance of the nondeterministic polynomial-time hard (NP-hard) and fully connected number partitioning problem in the presence of dissipation.
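For context, the number partitioning problem mentioned above has a standard Ising encoding (this is the textbook mapping, not necessarily the exact Hamiltonian realized in the proposed circuit): given numbers a_1, ..., a_N, the spins s_i = ±1 assign each number to one of the two subsets, and

```latex
H = \Bigl( \sum_{i=1}^{N} a_i \, s_i \Bigr)^{2}, \qquad s_i \in \{-1, +1\},
```

so the ground state H = 0 corresponds to a perfect partition, and the ground-state energy measures the best achievable imbalance; note that this encoding couples every spin to every other, which is why all-to-all connectivity matters.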
Optimization of Error-Bounded Lossy Compression for Hard-to-Compress HPC Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Di, Sheng; Cappello, Franck
Since today's scientific applications are producing vast amounts of data, compressing them before storage/transmission is critical. Results of existing compressors show two types of HPC data sets: highly compressible and hard to compress. In this work, we carefully design and optimize error-bounded lossy compression for hard-to-compress scientific data. We propose an optimized algorithm that can adaptively partition the HPC data into best-fit consecutive segments, each having mutually close data values, such that the compression condition can be optimized. Another significant contribution is the optimization of the shifting offset such that the XOR-leading-zero length between two consecutive unpredictable data points can be maximized. We finally devise an adaptive method to select the best-fit compressor at runtime for maximizing the compression factor. We evaluate our solution using 13 benchmarks based on real-world scientific problems, and we compare it with 9 other state-of-the-art compressors. Experiments show that our compressor can always guarantee the compression errors within the user-specified error bounds. Most importantly, our optimization can improve the compression factor effectively, by up to 49% for hard-to-compress data sets with similar compression/decompression time cost.
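The XOR-leading-zero idea can be illustrated with a few lines of Python (a standalone sketch; the adaptive segment partitioning and offset selection of the actual compressor are not reproduced here): two nearby doubles share a long run of identical leading bits, so their XOR has many leading zeros and the difference can be stored compactly.

```python
import struct

def xor_leading_zero_bits(a: float, b: float) -> int:
    """Number of identical leading bits between two doubles, i.e. the
    leading zeros of their bitwise XOR (64 means the values are identical)."""
    ua = struct.unpack("<Q", struct.pack("<d", a))[0]
    ub = struct.unpack("<Q", struct.pack("<d", b))[0]
    x = ua ^ ub
    if x == 0:
        return 64
    return 64 - x.bit_length()

if __name__ == "__main__":
    # Close values share many leading bits, so the XOR has a long zero prefix.
    print(xor_leading_zero_bits(1.000001, 1.000002))   # large count
    print(xor_leading_zero_bits(1.0, -250.75))         # small count (sign bit differs)
```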
Solving NP-Hard Problems with Physarum-Based Ant Colony System.
Liu, Yuxin; Gao, Chao; Zhang, Zili; Lu, Yuxiao; Chen, Shi; Liang, Mingxin; Tao, Li
2017-01-01
NP-hard problems exist in many real world applications. Ant colony optimization (ACO) algorithms can provide approximate solutions for those NP-hard problems, but the performance of ACO algorithms is significantly reduced due to premature convergence and weak robustness, etc. With these observations in mind, this paper proposes a Physarum-based pheromone matrix optimization strategy in ant colony system (ACS) for solving NP-hard problems such as traveling salesman problem (TSP) and 0/1 knapsack problem (0/1 KP). In the Physarum-inspired mathematical model, one of the unique characteristics is that critical tubes can be reserved in the process of network evolution. The optimized updating strategy employs the unique feature and accelerates the positive feedback process in ACS, which contributes to the quick convergence of the optimal solution. Some experiments were conducted using both benchmark and real datasets. The experimental results show that the optimized ACS outperforms other meta-heuristic algorithms in accuracy and robustness for solving TSPs. Meanwhile, the convergence rate and robustness for solving 0/1 KPs are better than those of classical ACS.
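As background, the sketch below shows a bare-bones ant-colony-style tour construction and pheromone update on a tiny, hypothetical TSP instance; the Physarum-inspired pheromone-matrix optimization that the paper adds on top of ACS is not reproduced, and the parameters are illustrative only.

```python
import math
import random

# Tiny symmetric TSP instance (hypothetical coordinates).
COORDS = [(0, 0), (1, 5), (5, 2), (6, 6), (8, 3), (2, 8)]
N = len(COORDS)
DIST = [[math.dist(a, b) for b in COORDS] for a in COORDS]

def build_tour(tau, alpha=1.0, beta=2.0, rng=random):
    """Construct one tour with the usual pheromone/heuristic roulette rule."""
    start = rng.randrange(N)
    tour, unvisited = [start], set(range(N)) - {start}
    while unvisited:
        i = tour[-1]
        cand = list(unvisited)
        weights = [tau[i][j] ** alpha * (1.0 / DIST[i][j]) ** beta for j in cand]
        j = rng.choices(cand, weights=weights)[0]
        tour.append(j)
        unvisited.remove(j)
    return tour

def tour_length(tour):
    return sum(DIST[tour[k]][tour[(k + 1) % N]] for k in range(N))

def ant_colony(iterations=100, ants=10, rho=0.1):
    tau = [[1.0] * N for _ in range(N)]
    best, best_len = None, float("inf")
    for _ in range(iterations):
        for _ in range(ants):
            t = build_tour(tau)
            length = tour_length(t)
            if length < best_len:
                best, best_len = t, length
        # Global update: evaporate, then reinforce the best-so-far tour.
        for i in range(N):
            for j in range(N):
                tau[i][j] *= (1 - rho)
        for k in range(N):
            i, j = best[k], best[(k + 1) % N]
            tau[i][j] += rho / best_len
            tau[j][i] += rho / best_len
    return best, best_len

if __name__ == "__main__":
    print(ant_colony())
```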
Global Optimal Trajectory in Chaos and NP-Hardness
NASA Astrophysics Data System (ADS)
Latorre, Vittorio; Gao, David Yang
This paper presents an unconventional theory and method for solving general nonlinear dynamical systems. Instead of direct iterative methods, the discretized nonlinear system is first formulated as a global optimization problem via the least squares method. A newly developed canonical duality theory shows that this nonconvex minimization problem can be solved deterministically in polynomial time if a global optimality condition is satisfied. The so-called pseudo-chaos produced by linear iterative methods is mainly due to intrinsic numerical error accumulation. Otherwise, the global optimization problem could be NP-hard and the nonlinear system can be truly chaotic. A conjecture is proposed, which reveals the connection between chaos in nonlinear dynamics and NP-hardness in computer science. The methodology and the conjecture are verified by applications to the well-known logistic equation, a forced memristive circuit and the Lorenz system. Computational results show that the canonical duality theory can be used to identify chaotic systems and to obtain realistic global optimal solutions in nonlinear dynamical systems. The method and results presented in this paper should bring some new insights into nonlinear dynamical systems and NP-hardness in computational complexity theory.
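The least-squares reformulation mentioned above can be written, for a generic explicit discretization x_{k+1} = f(x_k, t_k), as the trajectory-wide nonconvex program below; the canonical duality machinery that certifies global optimality is not reproduced here.

```latex
% Trajectory x_1, ..., x_K of the discretized system x_{k+1} = f(x_k, t_k),
% with x_0 fixed by the initial condition, recovered as the global minimizer of
\min_{x_1, \dots, x_K} \;\; \Pi(x) \;=\; \sum_{k=0}^{K-1} \bigl\| x_{k+1} - f(x_k, t_k) \bigr\|^{2}, \qquad \Pi(x) \ge 0,
```

where the global minimum Π = 0 is attained exactly on trajectories of the discretized dynamics, so any certified global minimizer reproduces the true discrete solution.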
MDTS: automatic complex materials design using Monte Carlo tree search.
M Dieb, Thaer; Ju, Shenghong; Yoshizoe, Kazuki; Hou, Zhufeng; Shiomi, Junichiro; Tsuda, Koji
2017-01-01
Complex materials design is often represented as a black-box combinatorial optimization problem. In this paper, we present a novel python library called MDTS (Materials Design using Tree Search). Our algorithm employs a Monte Carlo tree search approach, which has shown exceptional performance in computer Go game. Unlike evolutionary algorithms that require user intervention to set parameters appropriately, MDTS has no tuning parameters and works autonomously in various problems. In comparison to a Bayesian optimization package, our algorithm showed competitive search efficiency and superior scalability. We succeeded in designing large Silicon-Germanium (Si-Ge) alloy structures that Bayesian optimization could not deal with due to excessive computational cost. MDTS is available at https://github.com/tsudalab/MDTS.
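For orientation, Monte Carlo tree search typically balances exploration and exploitation with an upper-confidence-bound selection score of the form shown below; whether MDTS uses exactly this rule is not stated in the abstract, so treat it as the generic UCT score rather than the library's documented formula.

```latex
\mathrm{UCT}(j) \;=\; \bar{X}_j \;+\; C \sqrt{\frac{\ln n}{n_j}},
```

where X̄_j is the mean reward of child node j, n_j its visit count, n the parent's visit count, and C an exploration constant; having no problem-specific parameters beyond C is what allows such a search to run autonomously.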
A theoretical comparison of evolutionary algorithms and simulated annealing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hart, W.E.
1995-08-28
This paper theoretically compares the performance of simulated annealing and evolutionary algorithms. Our main result is that under mild conditions a wide variety of evolutionary algorithms can be shown to have greater performance than simulated annealing after a sufficiently large number of function evaluations. This class of EAs includes variants of evolution strategies and evolutionary programming, the canonical genetic algorithm, as well as a variety of genetic algorithms that have been applied to combinatorial optimization problems. The proof of this result is based on a performance analysis of a very general class of stochastic optimization algorithms, which has implications for the performance of a variety of other optimization algorithms.
Plasma Enhanced Growth of Carbon Nanotubes For Ultrasensitive Biosensors
NASA Technical Reports Server (NTRS)
Cassell, Alan M.; Li, J.; Ye, Q.; Koehne, J.; Chen, H.; Meyyappan, M.
2004-01-01
The multitude of considerations facing nanostructure growth and integration lends itself to combinatorial optimization approaches. Rapid optimization becomes even more important with wafer-scale growth and integration processes. Here we discuss methodology for developing plasma enhanced CVD growth techniques for achieving individual, vertically aligned carbon nanostructures that show excellent properties as ultrasensitive electrodes for nucleic acid detection. We utilize high throughput strategies for optimizing the upstream and downstream processing and integration of carbon nanotube electrodes as functional elements in various device types. An overview of ultrasensitive carbon nanotube based sensor arrays for electrochemical biosensing applications and the high throughput methodology utilized to combine novel electrode technology with conventional MEMS processing will be presented.
Adaptiveness in monotone pseudo-Boolean optimization and stochastic neural computation.
Grossi, Giuliano
2009-08-01
Hopfield neural network (HNN) is a nonlinear computational model successfully applied in finding near-optimal solutions of several difficult combinatorial problems. In many cases, the network energy function is obtained through a learning procedure so that its minima are states falling into a proper subspace (feasible region) of the search space. However, because of the network nonlinearity, a number of undesirable local energy minima emerge from the learning procedure, significantly affecting the network performance. In the neural model analyzed here, we combine both a penalty and a stochastic process in order to enhance the performance of a binary HNN. The penalty strategy allows us to gradually lead the search towards states representing feasible solutions, so avoiding oscillatory behaviors or asymptotically unstable convergence. The presence of stochastic dynamics potentially prevents the network from falling into shallow local minima of the energy function, i.e., minima quite far from the global optimum. Hence, for a given fixed network topology, the desired final distribution on the states can be reached by carefully modulating such a process. The model uses pseudo-Boolean functions to express both the problem constraints and the cost function; a combination of these two functions is then interpreted as the energy of the neural network. A wide variety of NP-hard problems fall in the class of problems that can be solved by the model at hand, particularly those having a monotonic quadratic pseudo-Boolean function as constraint function, that is, functions easily derived from closed algebraic expressions representing the constraint structure and easy (polynomial time) to maximize. We show the asymptotic convergence properties of this model, characterizing its state space distribution at thermal equilibrium in terms of a Markov chain, and give evidence of its ability to find high-quality solutions on benchmarks and randomly generated instances of two specific problems taken from computational graph theory.
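A minimal sketch of the flavor of dynamics described above: a quadratic cost and a penalty for constraint violations are combined into a single energy, and binary units are updated stochastically with a slowly decreasing temperature. The toy graph, the penalty weight, and the Metropolis-style acceptance rule are illustrative assumptions, not the exact model of the cited paper.

```python
# Toy stochastic binary-unit dynamics minimizing an energy built from a cost
# term plus a penalty for constraint violations (an independent-set-like toy).
# Hypothetical example, not the model of the cited paper.
import math
import random

random.seed(0)
N = 12
# Random adjacency matrix providing the illustrative constraint structure.
adj = [[0] * N for _ in range(N)]
for i in range(N):
    for j in range(i + 1, N):
        if random.random() < 0.3:
            adj[i][j] = adj[j][i] = 1

PENALTY = 2.0  # weight of the constraint-violation term

def energy(x):
    # Cost term rewards selected nodes; penalty term punishes adjacent pairs
    # that are both selected (a constraint violation).
    cost = -sum(x)
    violation = sum(adj[i][j] for i in range(N) for j in range(i + 1, N)
                    if x[i] and x[j])
    return cost + PENALTY * violation

def metropolis_step(x, T):
    # Flip one unit; accept with probability depending on the energy change.
    i = random.randrange(N)
    y = x[:]
    y[i] = 1 - y[i]
    dE = energy(y) - energy(x)
    if dE <= 0 or random.random() < math.exp(-dE / T):
        return y
    return x

x = [random.randint(0, 1) for _ in range(N)]
T = 2.0
for step in range(5000):
    x = metropolis_step(x, T)
    T = max(0.05, T * 0.999)   # slowly reduce the stochasticity
print("final state:", x, "energy:", energy(x))
```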
NASA Astrophysics Data System (ADS)
Masuda, Kazuaki; Aiyoshi, Eitaro
We propose a method for solving optimal price decision problems for simultaneous multi-article auctions. An auction problem, originally formulated as a combinatorial problem, determines, for every seller, whether or not to sell his/her article and, for every buyer, which article(s) to buy, so that the total utility of buyers and sellers is maximized. Using duality theory, we transform it equivalently into a dual problem in which the Lagrange multipliers are interpreted as the articles' transaction prices. As the dual problem is a continuous optimization problem with respect to the multipliers (i.e., the transaction prices), we propose a numerical method to solve it by applying heuristic global search methods. In this paper, Particle Swarm Optimization (PSO) is used to solve the dual problem, and experimental results are presented to show the validity of the proposed method.
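A hedged sketch of how PSO can be applied to a continuous vector of prices (Lagrange multipliers). The `dual_value` function below is a smooth placeholder, not the dual objective of the cited auction formulation, and the swarm parameters are arbitrary.

```python
# Generic particle swarm optimization (PSO) sketch over a vector of article
# prices, standing in for the continuous dual problem described above.
import random

random.seed(1)
N_ARTICLES = 5

def dual_value(prices):
    """Hypothetical smooth surrogate of a dual (price) objective to minimize."""
    return sum((p - t) ** 2 for p, t in zip(prices, [3, 1, 4, 1, 5]))

def pso(n_particles=20, iters=200, w=0.7, c1=1.5, c2=1.5):
    pos = [[random.uniform(0, 10) for _ in range(N_ARTICLES)]
           for _ in range(n_particles)]
    vel = [[0.0] * N_ARTICLES for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [dual_value(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(N_ARTICLES):
                r1, r2 = random.random(), random.random()
                # Standard velocity update: inertia + cognitive + social terms.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = max(0.0, pos[i][d] + vel[i][d])  # keep prices non-negative
            val = dual_value(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

print(pso())
```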
The disadvantage of combinatorial communication.
Lachmann, Michael; Bergstrom, Carl T.
2004-01-01
Combinatorial communication allows rapid and efficient transfer of detailed information, yet combinatorial communication is used by few, if any, non-human species. To complement recent studies illustrating the advantages of combinatorial communication, we highlight a critical disadvantage. We use the concept of information value to show that deception poses a greater and qualitatively different threat to combinatorial signalling than to non-combinatorial systems. This additional potential for deception may represent a strategic barrier that has prevented widespread evolution of combinatorial communication. Our approach has the additional benefit of drawing clear distinctions among several types of deception that can occur in communication systems. PMID:15556886
The disadvantage of combinatorial communication.
Lachmann, Michael; Bergstrom, Carl T
2004-11-22
Combinatorial communication allows rapid and efficient transfer of detailed information, yet combinatorial communication is used by few, if any, non-human species. To complement recent studies illustrating the advantages of combinatorial communication, we highlight a critical disadvantage. We use the concept of information value to show that deception poses a greater and qualitatively different threat to combinatorial signalling than to non-combinatorial systems. This additional potential for deception may represent a strategic barrier that has prevented widespread evolution of combinatorial communication. Our approach has the additional benefit of drawing clear distinctions among several types of deception that can occur in communication systems.
Chatterjee, Kaushik; Lin-Gibson, Sheng; Wallace, William E.; Parekh, Sapun H.; Lee, Young J.; Cicerone, Marcus T.; Young, Marian F.; Simon, Carl G.
2011-01-01
Cells are known to sense and respond to the physical properties of their environment and those of tissue scaffolds. Optimizing these cell-material interactions is critical in tissue engineering. In this work, a simple and inexpensive combinatorial platform was developed to rapidly screen three-dimensional (3D) tissue scaffolds and was applied to screen the effect of scaffold properties for tissue engineering of bone. Differentiation of osteoblasts was examined in poly(ethylene glycol) hydrogel gradients spanning a 30-fold range in compressive modulus (≈ 10 kPa to ≈ 300 kPa). Results demonstrate that material properties (gel stiffness) of scaffolds can be leveraged to induce cell differentiation in 3D culture as an alternative to biochemical cues such as soluble supplements, immobilized biomolecules and vectors, which are often expensive, labile and potentially carcinogenic. Gel moduli of ≈ 225 kPa and higher enhanced osteogenesis. Furthermore, it is proposed that material-induced cell differentiation can be modulated to engineer seamless tissue interfaces between mineralized bone tissue and softer tissues such as ligaments and tendons. This work presents a combinatorial method to screen biological response to 3D hydrogel scaffolds that more closely mimics the 3D environment experienced by cells in vivo. PMID:20378163
NASA Astrophysics Data System (ADS)
Potyrailo, Radislav A.; Chisholm, Bret J.; Olson, Daniel R.; Brennan, Michael J.; Molaison, Chris A.
2002-02-01
Design, validation, and implementation of an optical spectroscopic system for high-throughput analysis of combinatorially developed protective organic coatings are reported. Our approach replaces labor-intensive coating evaluation steps with an automated system that rapidly analyzes 8x6 arrays of coating elements that are deposited on a plastic substrate. Each coating element of the library is 10 mm in diameter and 2 to 5 micrometers thick. Performance of coatings is evaluated with respect to their resistance to wear abrasion because this parameter is one of the primary considerations in end-use applications. Upon testing, the organic coatings undergo changes that are impossible to quantitatively predict using existing knowledge. Coatings are abraded using industry-accepted abrasion test methods at single- or multiple-abrasion conditions, followed by high-throughput analysis of abrasion-induced light scatter. The developed automated system is optimized for the analysis of diffusively scattered light that corresponds to 0 to 30% haze. System precision of 0.1 to 2.5% relative standard deviation provides capability for the reliable ranking of coating performance. While the system was implemented for high-throughput screening of combinatorially developed organic protective coatings for automotive applications, it can be applied to a variety of other applications where materials ranking can be achieved using optical spectroscopic tools.
Developing recombinant antibodies for biomarker detection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baird, Cheryl L.; Fischer, Christopher J.; Pefaur, Noah B.
2010-10-01
Monoclonal antibodies (mAbs) have an essential role in biomarker validation and diagnostic assays. A barrier to pursuing these applications is the reliance on immunization and hybridomas to produce mAbs, which is time-consuming and may not yield the desired mAb. We recommend a process flow for affinity reagent production that utilizes combinatorial protein display systems (e.g., yeast surface display or phage display) rather than hybridomas. These systems link a selectable phenotype (binding conferred by an antibody fragment) with a means for recovering the encoding gene. Recombinant libraries obtained from immunizations can produce high-affinity antibodies (<10 nM) more quickly than other methods. Non-immune libraries provide an alternate route when immunizations are not possible, or when suitable mAbs are not recovered from an immune library. Directed molecular evolution (DME) is an integral part of optimizing mAbs obtained from combinatorial protein display, but can also be used on hybridoma-derived mAbs. Variants can easily be obtained and screened to increase the affinity of the parent mAb (affinity maturation). We discuss examples where DME has been used to tailor affinity reagents to specific applications. Combinatorial protein display also provides an accessible method for identifying antibody pairs, which are necessary for sandwich-type diagnostic assays.
Combinatorial influence of environmental parameters on transcription factor activity.
Knijnenburg, T A; Wessels, L F A; Reinders, M J T
2008-07-01
Cells receive a wide variety of environmental signals, which are often processed combinatorially to generate specific genetic responses. Changes in transcript levels, as observed across different environmental conditions, can, to a large extent, be attributed to changes in the activity of transcription factors (TFs). However, in unraveling these transcription regulation networks, the actual environmental signals are often not incorporated into the model, simply because they have not been measured. The unquantified heterogeneity of the environmental parameters across microarray experiments frustrates regulatory network inference. We propose an inference algorithm that models the influence of environmental parameters on gene expression. The approach is based on a yeast microarray compendium of chemostat steady-state experiments. Chemostat cultivation enables the accurate control and measurement of many of the key cultivation parameters, such as nutrient concentrations, growth rate and temperature. The observed transcript levels are explained by inferring the activity of TFs in response to combinations of cultivation parameters. The interplay between activated enhancers and repressors that bind a gene promoter determines the possible up- or downregulation of the gene. The model is translated into a linear integer optimization problem. The resulting regulatory network identifies the combinatorial effects of environmental parameters on TF activity and gene expression. The Matlab code is available from the authors upon request. Supplementary data are available at Bioinformatics online.
One step DNA assembly for combinatorial metabolic engineering.
Coussement, Pieter; Maertens, Jo; Beauprez, Joeri; Van Bellegem, Wouter; De Mey, Marjan
2014-05-01
The rapid and efficient assembly of multi-step metabolic pathways for generating microbial strains with desirable phenotypes is a critical procedure for metabolic engineering, and remains a significant challenge in synthetic biology. Although several DNA assembly methods have been developed and applied for metabolic pathway engineering, many of them are limited in their suitability for combinatorial pathway assembly. The introduction of transcriptional (promoters), translational (ribosome binding site (RBS)) and enzyme (mutant genes) variability to modulate pathway expression levels is essential for generating balanced metabolic pathways and maximizing the productivity of a strain. We report a novel, highly reliable and rapid single strand assembly (SSA) method for pathway engineering. The method was successfully optimized and applied to create constructs containing promoter, RBS and/or mutant enzyme libraries. To demonstrate its efficiency and reliability, the method was applied to fine-tune multi-gene pathways. Two promoter libraries were simultaneously introduced in front of two target genes, enabling orthogonal expression as demonstrated by principal component analysis. This shows that SSA will increase our ability to tune multi-gene pathways at all control levels for the biotechnological production of complex metabolites, achievable through the combinatorial modulation of transcription, translation and enzyme activity. Copyright © 2014 International Metabolic Engineering Society. Published by Elsevier Inc. All rights reserved.
Combinatorial and Algorithmic Rigidity: Beyond Two Dimensions
2012-12-01
problem. Manuscript, 2010. [35] G. Panina and I. Streinu. Flattening single-vertex origami: the non-expansive case. Computational Geometry: Theory and ... in 2008, under the DARPA solicitation "Mathematical Challenges, BAA 07-68". It addressed Mathematical Challenge Ten: Algorithmic Origami and ... a number of optimal algorithms and provided critical complexity analysis. The topic of algorithmic origami was successfully engaged from the same
Data-Driven Online and Real-Time Combinatorial Optimization
2013-10-30
Problem, the online Traveling Salesman Problem, and variations of the online Quota Hamiltonian Path Problem and the online Traveling ... has the lowest competitive ratio among all algorithms of this kind. Second, we consider the Online Traveling Salesman Problem, and consider randomized ... matroid secretary problem on a partition matroid. 6. Jaillet, P. and X. Lu. "Online Traveling Salesman Problems with Rejection Options", submitted
Towards a theory of automated elliptic mesh generation
NASA Technical Reports Server (NTRS)
Cordova, J. Q.
1992-01-01
The theory of elliptic mesh generation is reviewed and the fundamental problem of constructing computational space is discussed. It is argued that the construction of computational space is an NP-Complete problem and therefore requires a nonstandard approach for its solution. This leads to the development of graph-theoretic, combinatorial optimization and integer programming algorithms. Methods for the construction of two dimensional computational space are presented.
Siol, Sebastian; Dhakal, Tara P; Gudavalli, Ganesh S; Rajbhandari, Pravakar P; DeHart, Clay; Baranowski, Lauryn L; Zakutayev, Andriy
2016-06-08
High-throughput computational and experimental techniques have been used in the past to accelerate the discovery of new promising solar cell materials. An important part of the development of novel thin film solar cell technologies, which is still considered a bottleneck for both theory and experiment, is the search for alternative interfacial contact (buffer) layers. The research and development of contact materials is difficult due to the inherent complexity that arises from their interactions at the interface with the absorber. A promising alternative to the commonly used CdS buffer layer in thin film solar cells that contain absorbers with lower electron affinity can be found in β-In2S3. However, the synthesis conditions for the sputter deposition of this material are not well-established. Here, In2S3 is investigated as a solar cell contact material utilizing a high-throughput combinatorial screening of the temperature-flux parameter space, followed by a number of spatially resolved characterization techniques. It is demonstrated that, by tuning the sulfur partial pressure, phase pure β-In2S3 could be deposited using a broad range of substrate temperatures between 500 °C and ambient temperature. Combinatorial photovoltaic device libraries with Al/ZnO/In2S3/Cu2ZnSnS4/Mo/SiO2 structure were built at optimal processing conditions to investigate the feasibility of the sputtered In2S3 buffer layers and of an accelerated optimization of the device structure. The performance of the resulting In2S3/Cu2ZnSnS4 photovoltaic devices is on par with CdS/Cu2ZnSnS4 reference solar cells with similar values for short circuit currents and open circuit voltages, despite the overall quite low efficiency of the devices (∼2%). Overall, these results demonstrate how a high-throughput experimental approach can be used to accelerate the development of contact materials and facilitate the optimization of thin film solar cell devices.
Combinatorial and high-throughput screening of materials libraries: review of state of the art.
Potyrailo, Radislav; Rajan, Krishna; Stoewe, Klaus; Takeuchi, Ichiro; Chisholm, Bret; Lam, Hubert
2011-11-14
Rational materials design based on prior knowledge is attractive because it promises to avoid time-consuming synthesis and testing of numerous materials candidates. However, with the increasing complexity of materials, the scientific ability for rational materials design becomes progressively limited. As a result of this complexity, combinatorial and high-throughput (CHT) experimentation in materials science has been recognized as a new scientific approach to generate new knowledge. This review demonstrates the broad applicability of CHT experimentation technologies in the discovery and optimization of new materials. We discuss general principles of CHT materials screening, followed by a detailed discussion of high-throughput materials characterization approaches, advances in data analysis/mining, and new materials developments facilitated by CHT experimentation. We critically analyze results of materials development in the areas most impacted by the CHT approaches, such as catalysis, electronic and functional materials, polymer-based industrial coatings, sensing materials, and biomaterials.
Chan, Ting-Shan; Liu, Yao-Min; Liu, Ru-Shi
2008-01-01
The present investigation aims at the synthesis of KSr(1-x-y)PO4:Tb(3+)x,Eu(2+)y phosphors using the combinatorial chemistry method. We have developed square-type arrays consisting of 121 compositions to investigate the optimum composition and luminescence properties of the KSrPO4 host matrix under 365 nm ultraviolet (UV) light. The optimized compositions of the phosphors were found to be KSr0.93PO4:Tb(3+)0.07 (green) and KSr0.995PO4:Eu(2+)0.005 (blue). These phosphors showed thermal luminescence stability better than that of commercially available YAG:Ce at temperatures above 200 degrees C. The result indicates that KSr(1-x-y)PO4:Tb(3+)x,Eu(2+)y can be potentially useful as a UV radiation-converting phosphor for light-emitting diodes.
Martínez-Ceron, María C; Giudicessi, Silvana L; Marani, Mariela M; Albericio, Fernando; Cascone, Osvaldo; Erra-Balsells, Rosa; Camperi, Silvia A
2010-05-15
Optimization of bead analysis by matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS) after the screening of one-bead-one-peptide combinatorial libraries was achieved, involving the fine-tuning of the whole process. Guanidine was replaced by acetonitrile (MeCN)/acetic acid (AcOH)/water (H(2)O), improving matrix crystallization. Peptide-bead cleavage with NH(4)OH was cheaper and safer than, yet as efficient as, NH(3)/tetrahydrofuran (THF). Peptide elution in microtubes instead of placing the beads in the sample plate yielded more sample aliquots. Sample preparation by successive deposition of dry layers was better than the dried-droplet method. Among the matrices analyzed, alpha-cyano-4-hydroxycinnamic acid resulted in the best peptide ion yield. Cluster formation was minimized by the addition of additives to the matrix. Copyright 2010 Elsevier Inc. All rights reserved.
Perspective: Stochastic magnetic devices for cognitive computing
NASA Astrophysics Data System (ADS)
Roy, Kaushik; Sengupta, Abhronil; Shim, Yong
2018-06-01
Stochastic switching of nanomagnets can potentially enable probabilistic cognitive hardware consisting of noisy neural and synaptic components. Furthermore, computational paradigms inspired from the Ising computing model require stochasticity for achieving near-optimality in solutions to various types of combinatorial optimization problems such as the Graph Coloring Problem or the Travelling Salesman Problem. Achieving optimal solutions in such problems is computationally exhaustive, and natural annealing is required to arrive at near-optimal solutions. Stochastic switching of devices also finds use in applications involving Deep Belief Networks and Bayesian Inference. In this article, we provide a multi-disciplinary perspective across the stack of devices, circuits, and algorithms to illustrate how the stochastic switching dynamics of spintronic devices in the presence of thermal noise can provide a direct mapping to the computational units of such probabilistic intelligent systems.
Optimal mapping of irregular finite element domains to parallel processors
NASA Technical Reports Server (NTRS)
Flower, J.; Otto, S.; Salama, M.
1987-01-01
A mapping of the solution domain of n finite elements into N subdomains that may be processed in parallel by N processors is optimal if the subdomain decomposition results in a well-balanced workload distribution among the processors. The problem is discussed in the context of irregular finite element domains as an important aspect of the efficient utilization of the capabilities of emerging multiprocessor computers. Finding the optimal mapping is an intractable combinatorial optimization problem, for which a satisfactory approximate solution is obtained here by analogy to a method used in statistical mechanics for simulating the annealing process in solids. The simulated annealing analogy and algorithm are described, and numerical results are given for mapping an irregular two-dimensional finite element domain containing a singularity onto the Hypercube computer.
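A minimal simulated-annealing sketch in the spirit of the mapping approach described above, assuming a toy one-dimensional mesh and a made-up cost that combines load imbalance with the number of cut edges; the weights and cooling schedule are arbitrary.

```python
# Minimal simulated-annealing sketch for mapping n finite elements onto N
# processors. The toy 1D "mesh", cost weights, and cooling schedule are
# illustrative assumptions, not the settings of the cited work.
import math
import random

random.seed(2)
n_elem, n_proc = 40, 4
edges = [(i, i + 1) for i in range(n_elem - 1)]  # toy 1D mesh adjacency

def cost(assign, alpha=1.0, beta=0.5):
    # Load-imbalance term plus a communication term counting cut edges.
    loads = [assign.count(p) for p in range(n_proc)]
    imbalance = sum((l - n_elem / n_proc) ** 2 for l in loads)
    cut = sum(1 for a, b in edges if assign[a] != assign[b])
    return alpha * imbalance + beta * cut

assign = [random.randrange(n_proc) for _ in range(n_elem)]
T = 5.0
for step in range(20000):
    i = random.randrange(n_elem)                 # move one element ...
    trial = assign[:]
    trial[i] = random.randrange(n_proc)          # ... to a random processor
    dE = cost(trial) - cost(assign)
    if dE <= 0 or random.random() < math.exp(-dE / T):
        assign = trial                           # Metropolis acceptance
    T = max(0.01, T * 0.9995)                    # geometric cooling
print("final cost:", cost(assign))
```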
Three-dimensional unstructured grid generation via incremental insertion and local optimization
NASA Technical Reports Server (NTRS)
Barth, Timothy J.; Wiltberger, N. Lyn; Gandhi, Amar S.
1992-01-01
Algorithms for the generation of 3D unstructured surface and volume grids are discussed. These algorithms are based on incremental insertion and local optimization. The present algorithms are very general and permit local grid optimization based on various measures of grid quality. This is very important; unlike the 2D Delaunay triangulation, the 3D Delaunay triangulation appears not to have a lexicographic characterization of angularity. (The Delaunay triangulation is known to minimize the maximum containment sphere, but unfortunately this is not true lexicographically). Consequently, Delaunay triangulations in three-space can result in poorly shaped tetrahedral elements. Using the present algorithms, 3D meshes can be constructed which optimize a certain angle measure, albeit locally. We also discuss the combinatorial aspects of the algorithm as well as implementational details.
Ant Colony Optimization for Markowitz Mean-Variance Portfolio Model
NASA Astrophysics Data System (ADS)
Deng, Guang-Feng; Lin, Woo-Tsong
This work presents Ant Colony Optimization (ACO), which was initially developed to be a meta-heuristic for combinatorial optimization, for solving the cardinality constraints Markowitz mean-variance portfolio model (nonlinear mixed quadratic programming problem). To our knowledge, an efficient algorithmic solution for this problem has not been proposed until now. Using heuristic algorithms in this case is imperative. Numerical solutions are obtained for five analyses of weekly price data for the following indices for the period March, 1992 to September, 1997: Hang Seng 31 in Hong Kong, DAX 100 in Germany, FTSE 100 in UK, S&P 100 in USA and Nikkei 225 in Japan. The test results indicate that the ACO is much more robust and effective than Particle swarm optimization (PSO), especially for low-risk investment portfolios.
Aono, Masashi; Naruse, Makoto; Kim, Song-Ju; Wakabayashi, Masamitsu; Hori, Hirokazu; Ohtsu, Motoichi; Hara, Masahiko
2013-06-18
Biologically inspired computing devices and architectures are expected to overcome the limitations of conventional technologies in terms of solving computationally demanding problems, adapting to complex environments, reducing energy consumption, and so on. We previously demonstrated that a primitive single-celled amoeba (a plasmodial slime mold), which exhibits complex spatiotemporal oscillatory dynamics and sophisticated computing capabilities, can be used to search for a solution to a very hard combinatorial optimization problem. We successfully extracted the essential spatiotemporal dynamics by which the amoeba solves the problem. This amoeba-inspired computing paradigm can be implemented by various physical systems that exhibit suitable spatiotemporal dynamics resembling the amoeba's problem-solving process. In this Article, we demonstrate that photoexcitation transfer phenomena in certain quantum nanostructures mediated by optical near-field interactions generate the amoebalike spatiotemporal dynamics and can be used to solve the satisfiability problem (SAT), which is the problem of judging whether a given logical proposition (a Boolean formula) is self-consistent. SAT is related to diverse application problems in artificial intelligence, information security, and bioinformatics and is a crucially important nondeterministic polynomial time (NP)-complete problem, which is believed to become intractable for conventional digital computers when the problem size increases. We show that our amoeba-inspired computing paradigm dramatically outperforms a conventional stochastic search method. These results indicate the potential for developing highly versatile nanoarchitectonic computers that realize powerful solution searching with low energy consumption.
Evaluation of Recoverable-Robust Timetables on Tree Networks
NASA Astrophysics Data System (ADS)
D'Angelo, Gianlorenzo; di Stefano, Gabriele; Navarra, Alfredo
In the context of scheduling and timetabling, we study a challenging combinatorial problem which is interesting from both a practical and a theoretical point of view. The motivation behind it is to cope with scheduled activities which might be subject to unavoidable disturbances, such as delays, occurring during the operational phase. The idea is to preventively plan some extra time for the scheduled activities in order to be "prepared" if a delay occurs, and to absorb it without the necessity of re-scheduling the activities from scratch. This realizes the concept of designing so called robust timetables. During the planning phase, one has to consider recovery features that might be applied at runtime if delays occur. Such recovery capabilities are given as input along with the possible delays that must be considered. The objective is the minimization of the overall needed time. The quality of a robust timetable is measured by the price of robustness, i.e. the ratio between the cost of the robust timetable and that of a non-robust optimal timetable. The considered problem is known to be NP-hard. We propose a pseudo-polynomial time algorithm and apply it on random networks and real case scenarios provided by Italian railways. We evaluate the effect of robustness on the scheduling of the activities and provide the price of robustness with respect to different scenarios. We experimentally show the practical effectiveness and efficiency of the proposed algorithm.
Optimization design of urban expressway ramp control
NASA Astrophysics Data System (ADS)
Xu, Hongke; Li, Peiqi; Zheng, Jinnan; Sun, Xiuzhen; Lin, Shan
2017-05-01
In this paper, various types of expressway systems are analyzed, and a variety of signal combinations are proposed to mitigate traffic congestion; these combinations are then used to verify the effectiveness of the multi-signal combinatorial control strategy. The simulation software VISSIM was used to simulate the system. Based on a network model of 25 kinds of road-length combinations and the simulation results, an optimization scheme suitable for the practical road model is summarized. The simulation results show that the controller can reduce travel time by 25% under heavy traffic flow and improve road capacity by about 20%.
Optimization of the computational load of a hypercube supercomputer onboard a mobile robot.
Barhen, J; Toomarian, N; Protopopescu, V
1987-12-01
A combinatorial optimization methodology is developed, which enables the efficient use of hypercube multiprocessors onboard mobile intelligent robots dedicated to time-critical missions. The methodology is implemented in terms of large-scale concurrent algorithms based either on fast simulated annealing, or on nonlinear asynchronous neural networks. In particular, analytic expressions are given for the effect of single-neuron perturbations on the systems' configuration energy. Compact neuromorphic data structures are used to model effects such as precedence constraints, processor idling times, and task-schedule overlaps. Results for a typical robot-dynamics benchmark are presented.
Optimization of the computational load of a hypercube supercomputer onboard a mobile robot
NASA Technical Reports Server (NTRS)
Barhen, Jacob; Toomarian, N.; Protopopescu, V.
1987-01-01
A combinatorial optimization methodology is developed, which enables the efficient use of hypercube multiprocessors onboard mobile intelligent robots dedicated to time-critical missions. The methodology is implemented in terms of large-scale concurrent algorithms based either on fast simulated annealing, or on nonlinear asynchronous neural networks. In particular, analytic expressions are given for the effect of single-neuron perturbations on the systems' configuration energy. Compact neuromorphic data structures are used to model effects such as precedence constraints, processor idling times, and task-schedule overlaps. Results for a typical robot-dynamics benchmark are presented.
Applications of Derandomization Theory in Coding
NASA Astrophysics Data System (ADS)
Cheraghchi, Mahdi
2011-07-01
Randomized techniques play a fundamental role in theoretical computer science and discrete mathematics, in particular for the design of efficient algorithms and construction of combinatorial objects. The basic goal in derandomization theory is to eliminate or reduce the need for randomness in such randomized constructions. In this thesis, we explore some applications of the fundamental notions in derandomization theory to problems outside the core of theoretical computer science, and in particular, certain problems related to coding theory. First, we consider the wiretap channel problem, which involves a communication system in which an intruder can eavesdrop on a limited portion of the transmissions, and construct efficient and information-theoretically optimal communication protocols for this model. Then we consider the combinatorial group testing problem. In this classical problem, one aims to determine a set of defective items within a large population by asking a number of queries, where each query reveals whether a defective item is present within a specified group of items. We use randomness condensers to explicitly construct optimal, or nearly optimal, group testing schemes for a setting where the query outcomes can be highly unreliable, as well as the threshold model where a query returns positive if the number of defectives passes a certain threshold. Finally, we design ensembles of error-correcting codes that achieve the information-theoretic capacity of a large class of communication channels, and then use the obtained ensembles for construction of explicit capacity-achieving codes. [This is a shortened version of the actual abstract in the thesis.]
Ezra, Elishai; Maor, Idan; Bavli, Danny; Shalom, Itai; Levy, Gahl; Prill, Sebastian; Jaeger, Magnus S; Nahmias, Yaakov
2015-08-01
Microfluidic applications range from combinatorial synthesis to high throughput screening, with platforms integrating analog perfusion components, digitally controlled micro-valves and a range of sensors that demand a variety of communication protocols. Currently, discrete control units are used to regulate and monitor each component, resulting in scattered control interfaces that limit data integration and synchronization. Here, we present a microprocessor-based control unit, utilizing the MS Gadgeteer open framework that integrates all aspects of microfluidics through a high-current electronic circuit that supports and synchronizes digital and analog signals for perfusion components, pressure elements, and arbitrary sensor communication protocols using a plug-and-play interface. The control unit supports an integrated touch screen and TCP/IP interface that provides local and remote control of flow and data acquisition. To establish the ability of our control unit to integrate and synchronize complex microfluidic circuits we developed an equi-pressure combinatorial mixer. We demonstrate the generation of complex perfusion sequences, allowing the automated sampling, washing, and calibrating of an electrochemical lactate sensor continuously monitoring hepatocyte viability following exposure to the pesticide rotenone. Importantly, integration of an optical sensor allowed us to implement automated optimization protocols that require different computational challenges including: prioritized data structures in a genetic algorithm, distributed computational efforts in multiple-hill climbing searches and real-time realization of probabilistic models in simulated annealing. Our system offers a comprehensive solution for establishing optimization protocols and perfusion sequences in complex microfluidic circuits.
MASM: a market architecture for sensor management in distributed sensor networks
NASA Astrophysics Data System (ADS)
Viswanath, Avasarala; Mullen, Tracy; Hall, David; Garga, Amulya
2005-03-01
Rapid developments in sensor technology and its applications have energized research efforts towards devising a firm theoretical foundation for sensor management. Ubiquitous sensing, wide bandwidth communications and distributed processing provide both opportunities and challenges for sensor and process control and optimization. Traditional optimization techniques do not have the ability to simultaneously consider the wildly non-commensurate measures involved in sensor management in a single optimization routine. Market-oriented programming provides a valuable and principled paradigm to designing systems to solve this dynamic and distributed resource allocation problem. We have modeled the sensor management scenario as a competitive market, wherein the sensor manager holds a combinatorial auction to sell the various items produced by the sensors and the communication channels. However, standard auction mechanisms have been found not to be directly applicable to the sensor management domain. For this purpose, we have developed a specialized market architecture MASM (Market architecture for Sensor Management). In MASM, the mission manager is responsible for deciding task allocations to the consumers and their corresponding budgets and the sensor manager is responsible for resource allocation to the various consumers. In addition to having a modified combinatorial winner determination algorithm, MASM has specialized sensor network modules that address commensurability issues between consumers and producers in the sensor network domain. A preliminary multi-sensor, multi-target simulation environment has been implemented to test the performance of the proposed system. MASM outperformed the information theoretic sensor manager in meeting the mission objectives in the simulation experiments.
Optimization of personalized therapies for anticancer treatment.
Vazquez, Alexei
2013-04-12
As of today, there are hundreds of targeted therapies for the treatment of cancer, many of which have companion biomarkers that are in use to inform treatment decisions. If we were to consider this whole arsenal of targeted therapies as a treatment option for every patient, we would very soon reach a scenario where each patient is positive for several markers, suggesting treatment with several targeted therapies. Given the documented side effects of anticancer drugs, it is clear that such a strategy is unfeasible. Here, we propose a strategy that optimizes the design of combinatorial therapies to achieve the best response rates with minimal toxicity. In this methodology, markers are assigned to drugs such that we achieve a high overall response rate while using personalized combinations of minimal size. We tested this methodology in an in silico cancer patient cohort, constructed from in vitro data for 714 cell lines and 138 drugs reported by the Sanger Institute. Our analysis indicates that, even in the context of personalized medicine, combinations of three or more drugs are required to achieve high response rates. Furthermore, patient-to-patient variations in pharmacokinetics have a significant impact on the overall response rate: a 10-fold increase in pharmacokinetic variation resulted in a significant drop in the overall response rate. The design of optimal combinatorial therapy for anticancer treatment requires a transition from the one-drug/one-biomarker approach to global strategies that simultaneously assign markers to a catalog of drugs. The methodology reported here provides a framework to achieve this transition.
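A toy sketch of the flavor of design problem described above, assuming randomly generated responder sets: drugs (with their companion markers) are added greedily to a catalog until a size limit is reached or coverage no longer improves. This greedy set-cover heuristic is only an illustration, not the optimization method of the cited paper.

```python
# Toy greedy sketch: choose a small catalog of drugs (each with a companion
# marker) so that most simulated patients respond to at least one of them.
# Responder sets are random placeholders, not real cell-line data.
import random

random.seed(4)
N_PATIENTS, N_DRUGS = 200, 15
# responders[d] = set of patients predicted (via some marker) to respond to drug d.
responders = [set(p for p in range(N_PATIENTS) if random.random() < 0.15)
              for _ in range(N_DRUGS)]

def greedy_catalog(max_drugs=5):
    covered, chosen = set(), []
    while len(chosen) < max_drugs:
        # Pick the drug/marker rule covering the most still-uncovered patients.
        d = max(range(N_DRUGS), key=lambda d: len(responders[d] - covered))
        if not responders[d] - covered:
            break                    # no further gain in response rate
        chosen.append(d)
        covered |= responders[d]
    return chosen, len(covered) / N_PATIENTS

catalog, response_rate = greedy_catalog()
print("drugs in catalog:", catalog, "overall response rate: %.2f" % response_rate)
```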
Scott-Phillips, Thomas C; Blythe, Richard A
2013-11-06
In a combinatorial communication system, some signals consist of the combinations of other signals. Such systems are more efficient than equivalent, non-combinatorial systems, yet despite this they are rare in nature. Why? Previous explanations have focused on the adaptive limits of combinatorial communication, or on its purported cognitive difficulties, but neither of these explains the full distribution of combinatorial communication in the natural world. Here, we present a nonlinear dynamical model of the emergence of combinatorial communication that, unlike previous models, considers how initially non-communicative behaviour evolves to take on a communicative function. We derive three basic principles about the emergence of combinatorial communication. We hence show that the interdependence of signals and responses places significant constraints on the historical pathways by which combinatorial signals might emerge, to the extent that anything other than the most simple form of combinatorial communication is extremely unlikely. We also argue that these constraints can be bypassed if individuals have the socio-cognitive capacity to engage in ostensive communication. Humans, but probably no other species, have this ability. This may explain why language, which is massively combinatorial, is such an extreme exception to nature's general trend for non-combinatorial communication.
Robust quantum optimizer with full connectivity
Nigg, Simon E.; Lörch, Niels; Tiwari, Rakesh P.
2017-01-01
Quantum phenomena have the potential to speed up the solution of hard optimization problems. For example, quantum annealing, based on the quantum tunneling effect, has recently been shown to scale exponentially better with system size than classical simulated annealing. However, current realizations of quantum annealers with superconducting qubits face two major challenges. First, the connectivity between the qubits is limited, excluding many optimization problems from a direct implementation. Second, decoherence degrades the success probability of the optimization. We address both of these shortcomings and propose an architecture in which the qubits are robustly encoded in continuous variable degrees of freedom. By leveraging the phenomenon of flux quantization, all-to-all connectivity with sufficient tunability to implement many relevant optimization problems is obtained without overhead. Furthermore, we demonstrate the robustness of this architecture by simulating the optimal solution of a small instance of the nondeterministic polynomial-time hard (NP-hard) and fully connected number partitioning problem in the presence of dissipation. PMID:28435880
Perspective: Memcomputing: Leveraging memory and physics to compute efficiently
NASA Astrophysics Data System (ADS)
Di Ventra, Massimiliano; Traversa, Fabio L.
2018-05-01
It is well known that physical phenomena may be of great help in computing some difficult problems efficiently. A typical example is prime factorization that may be solved in polynomial time by exploiting quantum entanglement on a quantum computer. There are, however, other types of (non-quantum) physical properties that one may leverage to compute efficiently a wide range of hard problems. In this perspective, we discuss how to employ one such property, memory (time non-locality), in a novel physics-based approach to computation: Memcomputing. In particular, we focus on digital memcomputing machines (DMMs) that are scalable. DMMs can be realized with non-linear dynamical systems with memory. The latter property allows the realization of a new type of Boolean logic, one that is self-organizing. Self-organizing logic gates are "terminal-agnostic," namely, they do not distinguish between the input and output terminals. When appropriately assembled to represent a given combinatorial/optimization problem, the corresponding self-organizing circuit converges to the equilibrium points that express the solutions of the problem at hand. In doing so, DMMs take advantage of the long-range order that develops during the transient dynamics. This collective dynamical behavior, reminiscent of a phase transition, or even the "edge of chaos," is mediated by families of classical trajectories (instantons) that connect critical points of increasing stability in the system's phase space. The topological character of the solution search renders DMMs robust against noise and structural disorder. Since DMMs are non-quantum systems described by ordinary differential equations, not only can they be built in hardware with the available technology, they can also be simulated efficiently on modern classical computers. As an example, we will show the polynomial-time solution of the subset-sum problem for the worst cases, and point to other types of hard problems where simulations of DMMs' equations of motion on classical computers have already demonstrated substantial advantages over traditional approaches. We conclude this article by outlining further directions of study.
Galaxy Redshifts from Discrete Optimization of Correlation Functions
NASA Astrophysics Data System (ADS)
Lee, Benjamin C. G.; Budavári, Tamás; Basu, Amitabh; Rahman, Mubdi
2016-12-01
We propose a new method of constraining the redshifts of individual extragalactic sources based on celestial coordinates and their ensemble statistics. Techniques from integer linear programming (ILP) are utilized to optimize simultaneously for the angular two-point cross- and autocorrelation functions. Our novel formalism introduced here not only transforms the otherwise hopelessly expensive, brute-force combinatorial search into a linear system with integer constraints but also is readily implementable in off-the-shelf solvers. We adopt Gurobi, a commercial optimization solver, and use Python to build the cost function dynamically. The preliminary results on simulated data show potential for future applications to sky surveys by complementing and enhancing photometric redshift estimators. Our approach is the first application of ILP to astronomical analysis.
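A minimal sketch of formulating a toy assignment-style integer program in Python with the Gurobi solver, in the spirit of building the objective dynamically from data; the 3x3 score matrix and the one-to-one assignment constraints are hypothetical placeholders, unrelated to the actual correlation-function objective of the paper.

```python
# Toy integer linear program built with gurobipy (requires the gurobipy package
# and a Gurobi license). The score matrix and the one-to-one assignment
# constraints are hypothetical, not the cited correlation-function formulation.
import gurobipy as gp
from gurobipy import GRB

score = [[4, 1, 3],
         [2, 0, 5],
         [3, 2, 2]]
n = len(score)

m = gp.Model("toy_assignment")
# x[i, j] = 1 if source i is assigned candidate label j.
x = m.addVars(n, n, vtype=GRB.BINARY, name="x")
# Build the objective dynamically from the score table.
m.setObjective(gp.quicksum(score[i][j] * x[i, j]
                           for i in range(n) for j in range(n)),
               GRB.MAXIMIZE)
# Each source gets exactly one label, and each label is used exactly once.
m.addConstrs((x.sum(i, "*") == 1 for i in range(n)), name="source")
m.addConstrs((x.sum("*", j) == 1 for j in range(n)), name="label")
m.optimize()
if m.Status == GRB.OPTIMAL:
    assignment = [(i, j) for i in range(n) for j in range(n) if x[i, j].X > 0.5]
    print("optimal assignment:", assignment)
```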
Programmable synaptic devices for electronic neural nets
NASA Technical Reports Server (NTRS)
Moopenn, A.; Thakoor, A. P.
1990-01-01
The architecture, design, and operational characteristics of custom VLSI and thin film synaptic devices are described. The devices include CMOS-based synaptic chips containing 1024 reprogrammable synapses with a 6-bit dynamic range, and nonvolatile, write-once, binary synaptic arrays based on memory switching in hydrogenated amorphous silicon films. Their suitability for embodiment of fully parallel and analog neural hardware is discussed. Specifically, a neural network solution to an assignment problem of combinatorial global optimization, implemented in fully parallel hardware using the synaptic chips, is described. The network's ability to provide optimal and near-optimal solutions over a time scale of a few neuron time constants has been demonstrated and suggests a speedup improvement of several orders of magnitude over conventional search methods.
Fast globally optimal segmentation of 3D prostate MRI with axial symmetry prior.
Qiu, Wu; Yuan, Jing; Ukwatta, Eranga; Sun, Yue; Rajchl, Martin; Fenster, Aaron
2013-01-01
We propose a novel global optimization approach to segmenting a given 3D prostate T2w magnetic resonance (MR) image, which enforces the inherent axial symmetry of the prostate shape and simultaneously performs a sequence of 2D axial slice-wise segmentations with a global 3D coherence prior. We show that the proposed challenging combinatorial optimization problem can be solved globally and exactly by means of convex relaxation. With this regard, we introduce a novel coupled continuous max-flow model, which is dual to the studied convex relaxed optimization formulation and leads to an efficient multiplier augmented algorithm based on the modern convex optimization theory. Moreover, the new continuous max-flow based algorithm was implemented on GPUs to achieve a substantial improvement in computation. Experimental results using public and in-house datasets demonstrate great advantages of the proposed method in terms of both accuracy and efficiency.
NASA Astrophysics Data System (ADS)
Orito, Yukiko; Yamamoto, Hisashi; Tsujimura, Yasuhiro; Kambayashi, Yasushi
Portfolio optimization determines the proportion-weighted combination of assets in a portfolio in order to achieve investment targets. It is a multi-dimensional combinatorial optimization problem, and it is difficult for a portfolio constructed in a past period to keep its performance in a future period. In order to maintain good portfolio performance, we propose the extended information ratio as an objective function, using the information ratio, beta, prime beta, or correlation coefficient. We apply simulated annealing (SA) to optimize the portfolio employing the proposed ratio. For the SA, we generate neighbor solutions by an operation that changes the structure of the weights in the portfolio. In the numerical experiments, we show that our portfolios keep good performance even when the market trend of the future period differs from that of the past period.
Optimizing an Actuator Array for the Control of Multi-Frequency Noise in Aircraft Interiors
NASA Technical Reports Server (NTRS)
Palumbo, D. L.; Padula, S. L.
1997-01-01
Techniques developed for selecting an optimized actuator array for interior noise reduction at a single frequency are extended to the multi-frequency case. Transfer functions for 64 actuators were obtained at 5 frequencies from ground testing the rear section of a fully trimmed DC-9 fuselage. A single loudspeaker facing the left side of the aircraft was the primary source. A combinatorial search procedure (tabu search) was employed to find optimum actuator subsets of from 2 to 16 actuators. Noise reduction predictions derived from the transfer functions were used as a basis for evaluating actuator subsets during optimization. Results indicate that it is necessary to constrain actuator forces during optimization. Unconstrained optimizations selected actuators which require unrealistically large forces. Two methods of constraint are evaluated. It is shown that a fast, but approximate, method yields results equivalent to an accurate, but computationally expensive, method.
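A toy tabu-search sketch for selecting a fixed-size actuator subset, assuming a random surrogate score in place of the transfer-function-based noise-reduction predictions used in the paper; the tabu tenure and iteration counts are arbitrary.

```python
# Toy tabu search over actuator subsets. The "noise reduction" score is a
# random surrogate with made-up single and pairwise contributions, not the
# transfer-function model of the cited work.
import random

random.seed(3)
N_ACT, SUBSET = 64, 8
gain = [random.random() for _ in range(N_ACT)]
pair = [[0.1 * random.random() for _ in range(N_ACT)] for _ in range(N_ACT)]

def score(subset):
    s = sum(gain[i] for i in subset)
    s += sum(pair[i][j] for i in subset for j in subset if i < j)
    return s

def tabu_search(iters=500, tenure=20):
    current = random.sample(range(N_ACT), SUBSET)
    best, best_val = current[:], score(current)
    tabu = {}  # actuator index -> iteration until which re-adding it is tabu
    for it in range(iters):
        candidates = []
        # Evaluate all swaps: remove one actuator, add one not in the subset.
        for out in current:
            for new in range(N_ACT):
                if new in current:
                    continue
                trial = [a for a in current if a != out] + [new]
                val = score(trial)
                # Aspiration: allow tabu moves that beat the global best.
                if tabu.get(new, -1) < it or val > best_val:
                    candidates.append((val, out, new, trial))
        val, out, new, trial = max(candidates)
        current = trial
        tabu[out] = it + tenure  # recently removed actuators are tabu to re-add
        if val > best_val:
            best, best_val = trial[:], val
    return best, best_val

print(tabu_search())
```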
NASA Technical Reports Server (NTRS)
Phillips, K.
1976-01-01
A mathematical model for job scheduling in a specified context is presented. The model uses both linear programming and combinatorial methods. While designed with a view toward optimization of scheduling of facility and plant operations at the Deep Space Communications Complex, the context is sufficiently general to be widely applicable. The general scheduling problem including options for scheduling objectives is discussed and fundamental parameters identified. Mathematical algorithms for partitioning problems germane to scheduling are presented.
Combinatorial approaches to gene recognition.
Roytberg, M A; Astakhova, T V; Gelfand, M S
1997-01-01
Recognition of genes via exon assembly approaches leads naturally to the use of dynamic programming. We consider the general graph-theoretical formulation of the exon assembly problem and analyze in detail some specific variants: multicriterial optimization in the case of non-linear gene-scoring functions; context-dependent schemes for scoring exons and related procedures for exon filtering; and highly specific recognition of arbitrary gene segments, oligonucleotide probes and polymerase chain reaction (PCR) primers.
Improved Modeling of Side-Chain–Base Interactions and Plasticity in Protein–DNA Interface Design
Thyme, Summer B.; Baker, David; Bradley, Philip
2012-01-01
Combinatorial sequence optimization for protein design requires libraries of discrete side-chain conformations. The discreteness of these libraries is problematic, particularly for long, polar side chains, since favorable interactions can be missed. Previously, an approach to loop remodeling where protein backbone movement is directed by side-chain rotamers predicted to form interactions previously observed in native complexes (termed “motifs”) was described. Here, we show how such motif libraries can be incorporated into combinatorial sequence optimization protocols and improve native complex recapitulation. Guided by the motif rotamer searches, we made improvements to the underlying energy function, increasing recapitulation of native interactions. To further test the methods, we carried out a comprehensive experimental scan of amino acid preferences in the I-AniI protein–DNA interface and found that many positions tolerated multiple amino acids. This sequence plasticity is not observed in the computational results because of the fixed-backbone approximation of the model. We improved modeling of this diversity by introducing DNA flexibility and reducing the convergence of the simulated annealing algorithm that drives the design process. In addition to serving as a benchmark, this extensive experimental data set provides insight into the types of interactions essential to maintain the function of this potential gene therapy reagent. PMID:22426128
Combinatorial development of antibacterial Zr-Cu-Al-Ag thin film metallic glasses.
Liu, Yanhui; Padmanabhan, Jagannath; Cheung, Bettina; Liu, Jingbei; Chen, Zheng; Scanley, B Ellen; Wesolowski, Donna; Pressley, Mariyah; Broadbridge, Christine C; Altman, Sidney; Schwarz, Udo D; Kyriakides, Themis R; Schroers, Jan
2016-05-27
Metallic alloys are normally composed of multiple constituent elements in order to achieve integration of a plurality of properties required in technological applications. However, the conventional alloy development paradigm, based on a sequential trial-and-error approach, requires completely unrelated strategies to optimize compositions out of a vast phase space, making alloy development time consuming and labor intensive. Here, we challenge the conventional paradigm by proposing a combinatorial strategy that enables parallel screening of a multitude of alloys. Utilizing a typical metallic glass forming alloy system Zr-Cu-Al-Ag as an example, we demonstrate how glass formation and antibacterial activity, two unrelated properties, can be simultaneously characterized and the optimal composition can be efficiently identified. We found that in the Zr-Cu-Al-Ag alloy system a fully glassy phase can be obtained in a wide compositional range by co-sputtering, and antibacterial activity is strongly dependent on alloy composition. Our results indicate that antibacterial activity is sensitive to Cu and Ag while remaining essentially unchanged within a wide range of Zr and Al. The proposed strategy not only facilitates development of high-performing alloys, but also provides a tool to unveil the composition dependence of properties in a highly parallel fashion, which helps the development of new materials by design.
Combinatorial development of antibacterial Zr-Cu-Al-Ag thin film metallic glasses
NASA Astrophysics Data System (ADS)
Liu, Yanhui; Padmanabhan, Jagannath; Cheung, Bettina; Liu, Jingbei; Chen, Zheng; Scanley, B. Ellen; Wesolowski, Donna; Pressley, Mariyah; Broadbridge, Christine C.; Altman, Sidney; Schwarz, Udo D.; Kyriakides, Themis R.; Schroers, Jan
2016-05-01
Metallic alloys are normally composed of multiple constituent elements in order to achieve integration of a plurality of properties required in technological applications. However, the conventional alloy development paradigm, based on a sequential trial-and-error approach, requires completely unrelated strategies to optimize compositions out of a vast phase space, making alloy development time consuming and labor intensive. Here, we challenge the conventional paradigm by proposing a combinatorial strategy that enables parallel screening of a multitude of alloys. Utilizing a typical metallic glass forming alloy system Zr-Cu-Al-Ag as an example, we demonstrate how glass formation and antibacterial activity, two unrelated properties, can be simultaneously characterized and the optimal composition can be efficiently identified. We found that in the Zr-Cu-Al-Ag alloy system a fully glassy phase can be obtained in a wide compositional range by co-sputtering, and antibacterial activity is strongly dependent on alloy composition. Our results indicate that antibacterial activity is sensitive to Cu and Ag while remaining essentially unchanged within a wide range of Zr and Al. The proposed strategy not only facilitates development of high-performing alloys, but also provides a tool to unveil the composition dependence of properties in a highly parallel fashion, which helps the development of new materials by design.
Solving Connected Subgraph Problems in Wildlife Conservation
NASA Astrophysics Data System (ADS)
Dilkina, Bistra; Gomes, Carla P.
We investigate mathematical formulations and solution techniques for a variant of the Connected Subgraph Problem. Given a connected graph with costs and profits associated with the nodes, the goal is to find a connected subgraph that contains a subset of distinguished vertices. In this work we focus on the budget-constrained version, where we maximize the total profit of the nodes in the subgraph subject to a budget constraint on the total cost. We propose several mixed-integer formulations for enforcing the subgraph connectivity requirement, which plays a key role in the combinatorial structure of the problem. We show that a new formulation based on subtour elimination constraints is more effective at capturing the combinatorial structure of the problem, providing significant advantages over the previously considered encoding which was based on a single commodity flow. We test our formulations on synthetic instances as well as on real-world instances of an important problem in environmental conservation concerning the design of wildlife corridors. Our encoding results in a much tighter LP relaxation, and more importantly, it results in finding better integer feasible solutions as well as much better upper bounds on the objective (often proving optimality or within less than 1% of optimality), both when considering the synthetic instances as well as the real-world wildlife corridor instances.
EcoFlex: A Multifunctional MoClo Kit for E. coli Synthetic Biology.
Moore, Simon J; Lai, Hung-En; Kelwick, Richard J R; Chee, Soo Mei; Bell, David J; Polizzi, Karen Marie; Freemont, Paul S
2016-10-21
Golden Gate cloning is a prominent DNA assembly tool in synthetic biology for the assembly of plasmid constructs often used in combinatorial pathway optimization, with a number of assembly kits developed specifically for yeast and plant-based expression. However, its use for synthetic biology in commonly used bacterial systems such as Escherichia coli has surprisingly been overlooked. Here, we introduce EcoFlex, a simplified, modular package of DNA parts for a variety of applications in E. coli, cell-free protein synthesis, protein purification and hierarchical assembly of transcription units based on the MoClo assembly standard. The kit features a library of constitutive promoters, T7 expression, RBS strength variants, synthetic terminators, protein purification tags and fluorescent proteins. We validate EcoFlex by assembling a 68-part (20-gene), 31 kb plasmid, characterize library parts in vivo and in vitro, and perform combinatorial pathway assembly using pooled libraries of either fluorescent proteins or the biosynthetic genes for the antimicrobial pigment violacein as a proof of concept. To minimize pathway screening, we also introduce a secondary module design site to simplify MoClo pathway optimization. In summary, EcoFlex provides a standardized and multifunctional kit for a variety of applications in E. coli synthetic biology.
Improved modeling of side-chain--base interactions and plasticity in protein--DNA interface design.
Thyme, Summer B; Baker, David; Bradley, Philip
2012-06-08
Combinatorial sequence optimization for protein design requires libraries of discrete side-chain conformations. The discreteness of these libraries is problematic, particularly for long, polar side chains, since favorable interactions can be missed. Previously, an approach to loop remodeling where protein backbone movement is directed by side-chain rotamers predicted to form interactions previously observed in native complexes (termed "motifs") was described. Here, we show how such motif libraries can be incorporated into combinatorial sequence optimization protocols and improve native complex recapitulation. Guided by the motif rotamer searches, we made improvements to the underlying energy function, increasing recapitulation of native interactions. To further test the methods, we carried out a comprehensive experimental scan of amino acid preferences in the I-AniI protein-DNA interface and found that many positions tolerated multiple amino acids. This sequence plasticity is not observed in the computational results because of the fixed-backbone approximation of the model. We improved modeling of this diversity by introducing DNA flexibility and reducing the convergence of the simulated annealing algorithm that drives the design process. In addition to serving as a benchmark, this extensive experimental data set provides insight into the types of interactions essential to maintain the function of this potential gene therapy reagent. Published by Elsevier Ltd.
Combinatorial development of antibacterial Zr-Cu-Al-Ag thin film metallic glasses
Liu, Yanhui; Padmanabhan, Jagannath; Cheung, Bettina; Liu, Jingbei; Chen, Zheng; Scanley, B. Ellen; Wesolowski, Donna; Pressley, Mariyah; Broadbridge, Christine C.; Altman, Sidney; Schwarz, Udo D.; Kyriakides, Themis R.; Schroers, Jan
2016-01-01
Metallic alloys are normally composed of multiple constituent elements in order to integrate the plurality of properties required in technological applications. However, the conventional alloy development paradigm, based on a sequential trial-and-error approach, requires completely unrelated strategies to optimize compositions out of a vast phase space, making alloy development time consuming and labor intensive. Here, we challenge the conventional paradigm by proposing a combinatorial strategy that enables parallel screening of a multitude of alloys. Using the typical metallic glass forming alloy system Zr-Cu-Al-Ag as an example, we demonstrate how glass formation and antibacterial activity, two unrelated properties, can be simultaneously characterized and the optimal composition efficiently identified. We found that in the Zr-Cu-Al-Ag alloy system a fully glassy phase can be obtained over a wide compositional range by co-sputtering, and that antibacterial activity depends strongly on alloy composition. Our results indicate that antibacterial activity is sensitive to Cu and Ag while remaining essentially unchanged over a wide range of Zr and Al contents. The proposed strategy not only facilitates the development of high-performing alloys, but also provides a tool for unveiling the composition dependence of properties in a highly parallel fashion, which aids the development of new materials by design. PMID:27230692
Optimizing Variational Quantum Algorithms Using Pontryagin’s Minimum Principle
Yang, Zhi-Cheng; Rahmani, Armin; Shabani, Alireza; ...
2017-05-18
We use Pontryagin’s minimum principle to optimize variational quantum algorithms. We show that for a fixed computation time, the optimal evolution has a bang-bang (square pulse) form, both for closed and open quantum systems with Markovian decoherence. Our findings support the choice of evolution ansatz in the recently proposed quantum approximate optimization algorithm. Focusing on the Sherrington-Kirkpatrick spin glass as an example, we find a system-size independent distribution of the duration of pulses, with characteristic time scale set by the inverse of the coupling constants in the Hamiltonian. The optimality of the bang-bang protocols and the characteristic time scale of the pulses provide an efficient parametrization of the protocol and inform the search for effective hybrid (classical and quantum) schemes for tackling combinatorial optimization problems. Moreover, we find that the success rates of our optimal bang-bang protocols remain high even in the presence of weak external noise and coupling to a thermal bath.
Optimizing Variational Quantum Algorithms Using Pontryagin’s Minimum Principle
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, Zhi-Cheng; Rahmani, Armin; Shabani, Alireza
We use Pontryagin’s minimum principle to optimize variational quantum algorithms. We show that for a fixed computation time, the optimal evolution has a bang-bang (square pulse) form, both for closed and open quantum systems with Markovian decoherence. Our findings support the choice of evolution ansatz in the recently proposed quantum approximate optimization algorithm. Focusing on the Sherrington-Kirkpatrick spin glass as an example, we find a system-size independent distribution of the duration of pulses, with characteristic time scale set by the inverse of the coupling constants in the Hamiltonian. The optimality of the bang-bang protocols and the characteristic time scale of the pulses provide an efficient parametrization of the protocol and inform the search for effective hybrid (classical and quantum) schemes for tackling combinatorial optimization problems. Moreover, we find that the success rates of our optimal bang-bang protocols remain high even in the presence of weak external noise and coupling to a thermal bath.
ERIC Educational Resources Information Center
Duarte, Robert; Nielson, Janne T.; Dragojlovic, Veljko
2004-01-01
Combinatorial synthesis refers to a group of techniques aimed at synthesizing a large number of structurally diverse compounds. The synthesis of chemiluminescence esters using parallel combinatorial synthesis and mix-and-split combinatorial synthesis is demonstrated.
Hard decoding algorithm for optimizing thresholds under general Markovian noise
NASA Astrophysics Data System (ADS)
Chamberland, Christopher; Wallman, Joel; Beale, Stefanie; Laflamme, Raymond
2017-04-01
Quantum error correction is instrumental in protecting quantum systems from noise in quantum computing and communication settings. Pauli channels can be efficiently simulated and threshold values for Pauli error rates under a variety of error-correcting codes have been obtained. However, realistic quantum systems can undergo noise processes that differ significantly from Pauli noise. In this paper, we present an efficient hard decoding algorithm for optimizing thresholds and lowering failure rates of an error-correcting code under general completely positive and trace-preserving (i.e., Markovian) noise. We use our hard decoding algorithm to study the performance of several error-correcting codes under various non-Pauli noise models by computing threshold values and failure rates for these codes. We compare the performance of our hard decoding algorithm to decoders optimized for depolarizing noise and show improvements in thresholds and reductions in failure rates by several orders of magnitude. Our hard decoding algorithm can also be adapted to take advantage of a code's non-Pauli transversal gates to further suppress noise. For example, we show that using the transversal gates of the 5-qubit code allows arbitrary rotations around certain axes to be perfectly corrected. Furthermore, we show that Pauli twirling can increase or decrease the threshold depending upon the code properties. Lastly, we show that even if the physical noise model differs slightly from the hypothesized noise model used to determine an optimized decoder, failure rates can still be reduced by applying our hard decoding algorithm.
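As a small illustration of the Pauli twirling mentioned above (not of the authors' hard decoding algorithm), the sketch below approximates a non-Pauli channel by its Pauli-twirled counterpart; the amplitude-damping example and the damping parameter are assumptions chosen for the demo.

import numpy as np

# Single-qubit Pauli basis.
I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
Z = np.array([[1, 0], [0, -1]], dtype=complex)
paulis = [I, X, Y, Z]

def pauli_twirl_probs(kraus_ops):
    """Pauli-channel probabilities obtained by twirling a CPTP map with the
    given Kraus operators: p_i = sum_k |Tr(P_i K_k)|^2 / 4 (single qubit)."""
    return np.array([sum(abs(np.trace(P @ K)) ** 2 for K in kraus_ops) / 4
                     for P in paulis])

# Example: amplitude damping with decay probability gamma (hypothetical value).
gamma = 0.1
K0 = np.array([[1, 0], [0, np.sqrt(1 - gamma)]], dtype=complex)
K1 = np.array([[0, np.sqrt(gamma)], [0, 0]], dtype=complex)

p = pauli_twirl_probs([K0, K1])
print(dict(zip("IXYZ", np.round(p, 4))), "sum =", p.sum())   # probabilities sum to 1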
ACOustic: A Nature-Inspired Exploration Indicator for Ant Colony Optimization.
Sagban, Rafid; Ku-Mahamud, Ku Ruhana; Abu Bakar, Muhamad Shahbani
2015-01-01
A statistical machine learning indicator, ACOustic, is proposed to evaluate the exploration behavior in the iterations of ant colony optimization algorithms. This idea is inspired by the behavior of some parasites in their mimicry to the queens' acoustics of their ant hosts. The parasites' reaction results from their ability to indicate the state of penetration. The proposed indicator solves the problem of robustness that results from the difference of magnitudes in the distance's matrix, especially when combinatorial optimization problems with rugged fitness landscape are applied. The performance of the proposed indicator is evaluated against the existing indicators in six variants of ant colony optimization algorithms. Instances for travelling salesman problem and quadratic assignment problem are used in the experimental evaluation. The analytical results showed that the proposed indicator is more informative and more robust.
Assessing for Structural Understanding in Children's Combinatorial Problem Solving.
ERIC Educational Resources Information Center
English, Lyn
1999-01-01
Assesses children's structural understanding of combinatorial problems when presented in a variety of task situations. Provides an explanatory model of students' combinatorial understandings that informs teaching and assessment. Addresses several components of children's structural understanding of elementary combinatorial problems. (Contains 50…
System and method for bullet tracking and shooter localization
Roberts, Randy S [Livermore, CA]; Breitfeller, Eric F [Dublin, CA]
2011-06-21
A system and method of processing infrared imagery to determine projectile trajectories and the locations of shooters with a high degree of accuracy. The method includes image processing infrared image data to reduce noise and identify streak-shaped image features, using a Kalman filter to estimate optimal projectile trajectories, updating the Kalman filter with new image data, determining projectile source locations by solving a combinatorial least-squares solution for all optimal projectile trajectories, and displaying all of the projectile source locations. Such a shooter-localization system is of great interest for military and law enforcement applications to determine sniper locations, especially in urban combat scenarios.
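The Kalman-filter stage can be sketched generically; the constant-velocity state model, noise covariances and measurement values below are illustrative assumptions, since the patent's actual trajectory model is not reproduced here.

import numpy as np

# Generic 1D constant-velocity Kalman filter (position measured, velocity inferred).
dt = 0.01
F = np.array([[1, dt], [0, 1]])          # state transition (position, velocity)
H = np.array([[1, 0]])                   # measurement model: position only
Q = 1e-4 * np.eye(2)                     # process noise covariance (assumed)
R = np.array([[1e-2]])                   # measurement noise covariance (assumed)

x = np.array([[0.0], [0.0]])             # initial state estimate
P = np.eye(2)                            # initial estimate covariance

def kf_step(x, P, z):
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update with measurement z
    y = z - H @ x                        # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x, P

for z in [0.11, 0.25, 0.37, 0.52]:       # hypothetical streak positions per frame
    x, P = kf_step(x, P, np.array([[z]]))
print(x.ravel())                         # estimated position and velocity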
An application of different dioids in public key cryptography
DOE Office of Scientific and Technical Information (OSTI.GOV)
Durcheva, Mariana I., E-mail: mdurcheva66@gmail.com
2014-11-18
Dioids provide a natural framework for analyzing a broad class of discrete event dynamical systems, such as the design and analysis of bus and railway timetables, scheduling of high-throughput industrial processes, solution of combinatorial optimization problems, and the analysis and improvement of flow systems in communication networks. They have appeared in several branches of mathematics such as functional analysis, optimization, stochastic systems and dynamic programming, tropical geometry, and fuzzy logic. In this paper we show how to involve dioids in public key cryptography. The main goal is to create key-exchange protocols based on dioids. Additionally, a digital signature scheme is presented.
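To make the algebraic setting concrete, here is a toy Stickel-style key exchange over the max-plus dioid (tropical semiring). It only illustrates how commuting powers in a dioid yield a shared key; it is not the authors' protocol, the matrices and exponents are deliberately tiny, and such toy schemes are not secure in practice.

import numpy as np

def mp_mul(A, B):
    """Matrix product over the max-plus dioid: (A*B)[i,j] = max_k (A[i,k] + B[k,j])."""
    n, p = A.shape[0], B.shape[1]
    C = np.empty((n, p))
    for i in range(n):
        for j in range(p):
            C[i, j] = np.max(A[i, :] + B[:, j])
    return C

def mp_pow(A, k):
    """k-th max-plus power, k >= 1 (powers of A commute with each other)."""
    R = A.copy()
    for _ in range(k - 1):
        R = mp_mul(R, A)
    return R

rng = np.random.default_rng(0)
n = 3
W = rng.integers(0, 10, (n, n)).astype(float)   # public matrices (hypothetical)
A = rng.integers(0, 10, (n, n)).astype(float)
B = rng.integers(0, 10, (n, n)).astype(float)

m, k = 4, 3          # Alice's secret exponents
p, q = 2, 5          # Bob's secret exponents
U = mp_mul(mp_mul(mp_pow(A, m), W), mp_pow(B, k))   # Alice sends U
V = mp_mul(mp_mul(mp_pow(A, p), W), mp_pow(B, q))   # Bob sends V

# Both sides obtain A^(m+p) * W * B^(k+q), hence the same key.
K_alice = mp_mul(mp_mul(mp_pow(A, m), V), mp_pow(B, k))
K_bob = mp_mul(mp_mul(mp_pow(A, p), U), mp_pow(B, q))
print(np.array_equal(K_alice, K_bob))   # True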
Combinatorial influence of environmental parameters on transcription factor activity
Knijnenburg, T.A.; Wessels, L.F.A.; Reinders, M.J.T.
2008-01-01
Motivation: Cells receive a wide variety of environmental signals, which are often processed combinatorially to generate specific genetic responses. Changes in transcript levels, as observed across different environmental conditions, can, to a large extent, be attributed to changes in the activity of transcription factors (TFs). However, in unraveling these transcription regulation networks, the actual environmental signals are often not incorporated into the model, simply because they have not been measured. The unquantified heterogeneity of the environmental parameters across microarray experiments frustrates regulatory network inference. Results: We propose an inference algorithm that models the influence of environmental parameters on gene expression. The approach is based on a yeast microarray compendium of chemostat steady-state experiments. Chemostat cultivation enables the accurate control and measurement of many of the key cultivation parameters, such as nutrient concentrations, growth rate and temperature. The observed transcript levels are explained by inferring the activity of TFs in response to combinations of cultivation parameters. The interplay between activated enhancers and repressors that bind a gene promoter determine the possible up- or downregulation of the gene. The model is translated into a linear integer optimization problem. The resulting regulatory network identifies the combinatorial effects of environmental parameters on TF activity and gene expression. Availability: The Matlab code is available from the authors upon request. Contact: t.a.knijnenburg@tudelft.nl Supplementary information: Supplementary data are available at Bioinformatics online. PMID:18586711
Chimeric Rhinoviruses Displaying MPER Epitopes Elicit Anti-HIV Neutralizing Responses
Yi, Guohua; Lapelosa, Mauro; Bradley, Rachel; Mariano, Thomas M.; Dietz, Denise Elsasser; Hughes, Scott; Wrin, Terri; Petropoulos, Chris; Gallicchio, Emilio; Levy, Ronald M.; Arnold, Eddy; Arnold, Gail Ferstandig
2013-01-01
Background The development of an effective AIDS vaccine has been a formidable task, but remains a critical necessity. The well conserved membrane-proximal external region (MPER) of the HIV-1 gp41 glycoprotein is one of the crucial targets for AIDS vaccine development, as it has the necessary attribute of being able to elicit antibodies capable of neutralizing diverse isolates of HIV. Methodology/Principle Findings Guided by X-ray crystallography, molecular modeling, combinatorial chemistry, and powerful selection techniques, we designed and produced six combinatorial libraries of chimeric human rhinoviruses (HRV) displaying the MPER epitopes corresponding to mAbs 2F5, 4E10, and/or Z13e1, connected to an immunogenic surface loop of HRV via linkers of varying lengths and sequences. Not all libraries led to viable chimeric viruses with the desired sequences, but the combinatorial approach allowed us to examine large numbers of MPER-displaying chimeras. Among the chimeras were five that elicited antibodies capable of significantly neutralizing HIV-1 pseudoviruses from at least three subtypes, in one case leading to neutralization of 10 pseudoviruses from all six subtypes tested. Conclusions Optimization of these chimeras or closely related chimeras could conceivably lead to useful components of an effective AIDS vaccine. While the MPER of HIV may not be immunodominant in natural infection by HIV-1, its presence in a vaccine cocktail could provide critical breadth of protection. PMID:24039745
Parameter meta-optimization of metaheuristics of solving specific NP-hard facility location problem
NASA Astrophysics Data System (ADS)
Skakov, E. S.; Malysh, V. N.
2018-03-01
The aim of the work is to create an evolutionary method for optimizing the values of the control parameters of metaheuristics for solving the NP-hard facility location problem. A system analysis of the process of tuning optimization algorithm parameters is carried out. The problem of finding the parameters of a metaheuristic algorithm is formulated as a meta-optimization problem. An evolutionary metaheuristic has been chosen to perform the meta-optimization, so the approach proposed in this work can be called “meta-metaheuristic”. A computational experiment proving the effectiveness of the procedure for tuning the control parameters of metaheuristics has been performed.
FTMP - A highly reliable Fault-Tolerant Multiprocessor for aircraft
NASA Technical Reports Server (NTRS)
Hopkins, A. L., Jr.; Smith, T. B., III; Lala, J. H.
1978-01-01
The FTMP (Fault-Tolerant Multiprocessor) is a complex multiprocessor computer that employs a form of redundancy related to systems considered by Mathur (1971), in which each major module can substitute for any other module of the same type. Despite the conceptual simplicity of the redundancy form, the implementation has many intricacies owing partly to the low target failure rate, and partly to the difficulty of eliminating single-fault vulnerability. An extensive analysis of the computer through the use of such modeling techniques as Markov processes and combinatorial mathematics shows that for random hard faults the computer can meet its requirements. It is also shown that the maintenance scheduled at intervals of 200 hr or more can be adequate most of the time.
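The flavor of the Markov reliability modeling mentioned above can be sketched with a minimal continuous-time chain over the number of failed modules in one redundant group; the states, failure rate and reconfiguration rate below are hypothetical and far simpler than the FTMP analysis.

import numpy as np
from scipy.linalg import expm

lam = 1e-4      # per-hour failure rate of one module (hypothetical)
mu = 0.1        # per-hour repair/reconfiguration rate (hypothetical)

# States: 0 = all three modules good, 1 = one failed, 2 = system failure (absorbing).
Q = np.array([
    [-3 * lam,         3 * lam,      0.0],
    [      mu, -(mu + 2 * lam),  2 * lam],
    [     0.0,             0.0,      0.0],
])

t = 10.0                       # mission time in hours
P = expm(Q * t)                # state transition probabilities over the mission
print("P(system failure) =", P[0, 2])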
NASA Astrophysics Data System (ADS)
Tahernezhad-Javazm, Farajollah; Azimirad, Vahid; Shoaran, Maryam
2018-04-01
Objective. Considering the importance and the near-future development of noninvasive brain-machine interface (BMI) systems, this paper presents a comprehensive theoretical-experimental survey on the classification and evolutionary methods for BMI-based systems in which EEG signals are used. Approach. The paper is divided into two main parts. In the first part, a wide range of different types of the base and combinatorial classifiers including boosting and bagging classifiers and evolutionary algorithms are reviewed and investigated. In the second part, these classifiers and evolutionary algorithms are assessed and compared based on two types of relatively widely used BMI systems, sensory motor rhythm-BMI and event-related potentials-BMI. Moreover, in the second part, some of the improved evolutionary algorithms as well as bi-objective algorithms are experimentally assessed and compared. Main results. In this study two databases are used, and cross-validation accuracy (CVA) and stability to data volume (SDV) are considered as the evaluation criteria for the classifiers. According to the experimental results on both databases, regarding the base classifiers, linear discriminant analysis and support vector machines with respect to CVA evaluation metric, and naive Bayes with respect to SDV demonstrated the best performances. Among the combinatorial classifiers, four classifiers, Bagg-DT (bagging decision tree), LogitBoost, and GentleBoost with respect to CVA, and Bagging-LR (bagging logistic regression) and AdaBoost (adaptive boosting) with respect to SDV had the best performances. Finally, regarding the evolutionary algorithms, single-objective invasive weed optimization (IWO) and bi-objective nondominated sorting IWO algorithms demonstrated the best performances. Significance. We present a general survey on the base and the combinatorial classification methods for EEG signals (sensory motor rhythm and event-related potentials) as well as their optimization methods through the evolutionary algorithms. In addition, experimental and statistical significance tests are carried out to study the applicability and effectiveness of the reviewed methods.
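A minimal version of the cross-validation-accuracy comparison described above can be written with scikit-learn; the synthetic feature matrix stands in for extracted EEG features, so the numbers it prints are illustrative only.

import numpy as np
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.svm import SVC
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for extracted EEG features (the paper's data sets are not used here).
X, y = make_classification(n_samples=300, n_features=20, n_informative=8,
                           n_classes=2, random_state=0)

classifiers = {
    "LDA": LinearDiscriminantAnalysis(),
    "SVM": SVC(kernel="rbf", C=1.0),
    "NaiveBayes": GaussianNB(),
}
for name, clf in classifiers.items():
    scores = cross_val_score(clf, X, y, cv=10, scoring="accuracy")
    print(f"{name}: CVA = {scores.mean():.3f} +/- {scores.std():.3f}")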
Combinatorial structures to modeling simple games and applications
NASA Astrophysics Data System (ADS)
Molinero, Xavier
2017-09-01
We connect three different topics: combinatorial structures, game theory and chemistry. In particular, we establish the basis for representing some simple games, defined as influence games, and molecules, defined from atoms, using combinatorial structures. First, we characterize simple games as influence games using influence graphs. This lets us model simple games as combinatorial structures (from the viewpoint of structures or graphs). Second, we formally define molecules as combinations of atoms. This lets us model molecules as combinatorial structures (from the viewpoint of combinations). It remains open to generate such combinatorial structures using specific techniques such as genetic algorithms, (meta-)heuristic algorithms and parallel programming, among others.
Heterogeneous Catalysis: Understanding for Designing, and Designing for Applications.
Corma, Avelino
2016-05-17
"… Despite the introduction of high-throughput and combinatorial methods that certainly can be useful in the process of catalysts optimization, it is recognized that the generation of fundamental knowledge at the molecular level is key for the development of new concepts and for reaching the final objective of solid catalysts by design …" Read more in the Editorial by Avelino Corma. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Feng, Qiang; Chen, Yiran; Sun, Bo; Li, Songjie
2014-01-01
An optimization method for condition based maintenance (CBM) of an aircraft fleet considering prognostics uncertainty is proposed. The CBM and dispatch process of the fleet is analyzed first, and the alternative strategy sets for a single aircraft are given. Then, the optimization problem of fleet CBM with lower maintenance cost and dispatch risk is translated into a combinatorial optimization problem over single-aircraft strategies. The remaining useful life (RUL) distribution of the key line replaceable module (LRM) is transformed into the failure probability of the aircraft, and the fleet health status matrix is established. The calculation method for mission costs and risks based on the health status matrix and maintenance matrix is given. Further, an optimization method for fleet dispatch and CBM under acceptable risk is proposed based on an improved genetic algorithm. Finally, a fleet of 10 aircraft is studied to verify the proposed method. The results show that the method achieves optimization and control of the aircraft fleet oriented to mission success.
Chen, Yiran; Sun, Bo; Li, Songjie
2014-01-01
An optimization method for condition based maintenance (CBM) of an aircraft fleet considering prognostics uncertainty is proposed. The CBM and dispatch process of the fleet is analyzed first, and the alternative strategy sets for a single aircraft are given. Then, the optimization problem of fleet CBM with lower maintenance cost and dispatch risk is translated into a combinatorial optimization problem over single-aircraft strategies. The remaining useful life (RUL) distribution of the key line replaceable module (LRM) is transformed into the failure probability of the aircraft, and the fleet health status matrix is established. The calculation method for mission costs and risks based on the health status matrix and maintenance matrix is given. Further, an optimization method for fleet dispatch and CBM under acceptable risk is proposed based on an improved genetic algorithm. Finally, a fleet of 10 aircraft is studied to verify the proposed method. The results show that the method achieves optimization and control of the aircraft fleet oriented to mission success. PMID:24892046
1989-12-01
... to construct because the mechanism is a dispatching procedure. Since all nonpreemptive schedules are contained in the set of all preemptive schedules, the optimal value of T in the preemptive case is at least a lower bound on the optimal T for the nonpreemptive schedules. This principle is the ... adapt to changes in the environment. In hard real-time systems, tasks are also distinguished as preemptable and nonpreemptable. A task is preemptable ...
Finding Minimal Addition Chains with a Particle Swarm Optimization Algorithm
NASA Astrophysics Data System (ADS)
León-Javier, Alejandro; Cruz-Cortés, Nareli; Moreno-Armendáriz, Marco A.; Orantes-Jiménez, Sandra
Addition chains of minimal length are the basic building block for the optimal computation of finite field exponentiations, with important applications in error-correcting codes and cryptography. However, obtaining the shortest addition chain for a given exponent is an NP-hard problem. In this work we propose the adaptation of a Particle Swarm Optimization algorithm to deal with this problem. Our proposal is tested on several exponents whose addition chains are considered hard to find. We obtained very promising results.
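For very small exponents the shortest addition chain can still be found exactly, which makes the difficulty concrete; the iterative-deepening search below is a brute-force baseline, not the authors' PSO, and it becomes infeasible quickly as the exponent grows.

def shortest_addition_chain(target):
    """Iterative-deepening DFS for a minimal addition chain 1 = a0 < a1 < ... = target,
    where each element is the sum of two earlier elements. Small targets only."""
    def dfs(chain, depth_left):
        last = chain[-1]
        if last == target:
            return chain
        # Prune: even repeated doubling cannot reach the target in the remaining steps.
        if depth_left == 0 or last * (2 ** depth_left) < target:
            return None
        candidates = sorted({a + b for a in chain for b in chain}, reverse=True)
        for c in candidates:
            if last < c <= target:
                result = dfs(chain + [c], depth_left - 1)
                if result:
                    return result
        return None

    depth = 0
    while True:
        result = dfs([1], depth)
        if result:
            return result
        depth += 1

chain = shortest_addition_chain(31)
print(len(chain) - 1, chain)   # chain length (number of additions) and one optimal chain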
Fast Combinatorial Algorithm for the Solution of Linearly Constrained Least Squares Problems
Van Benthem, Mark H.; Keenan, Michael R.
2008-11-11
A fast combinatorial algorithm can significantly reduce the computational burden when solving general equality and inequality constrained least squares problems with large numbers of observation vectors. The combinatorial algorithm provides a mathematically rigorous solution and operates at great speed by reorganizing the calculations to take advantage of the combinatorial nature of the problems to be solved. The combinatorial algorithm exploits the structure that exists in large-scale problems in order to minimize the number of arithmetic operations required to obtain a solution.
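As a point of reference, the naive approach that the fast combinatorial algorithm improves upon is to solve each observation vector independently; the sketch below does exactly that with SciPy's nnls, whereas the paper's algorithm groups right-hand sides sharing the same passive-variable set so that factorizations are reused (problem sizes below are arbitrary).

import numpy as np
from scipy.optimize import nnls

# Naive baseline: solve min ||A x - b||_2 subject to x >= 0 separately for each
# observation vector (column of B).
rng = np.random.default_rng(1)
A = rng.random((50, 5))
B = rng.random((50, 1000))            # many observation vectors

X = np.column_stack([nnls(A, B[:, j])[0] for j in range(B.shape[1])])
print(X.shape)                         # (5, 1000) nonnegative solution matrix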
Making it stick: chasing the optimal stem cells for cardiac regeneration
Quijada, Pearl; Sussman, Mark A
2014-01-01
Despite the increasing use of stem cells for regenerative-based cardiac therapy, the optimal stem cell population(s) remains in a cloud of uncertainty. In the past decade, the field has witnessed a surge of researchers discovering stem cell populations reported to directly and/or indirectly contribute to cardiac regeneration through processes of cardiomyogenic commitment and/or release of cardioprotective paracrine factors. This review centers upon defining basic biological characteristics of stem cells used for sustaining cardiac integrity during disease and maintenance of communication between the cardiac environment and stem cells. Given the limited successes achieved so far in regenerative therapy, the future requires development of unprecedented concepts involving combinatorial approaches to create and deliver the optimal stem cell(s) that will enhance myocardial healing. PMID:25340282
Optimal placement of actuators and sensors in control augmented structural optimization
NASA Technical Reports Server (NTRS)
Sepulveda, A. E.; Schmit, L. A., Jr.
1990-01-01
A control-augmented structural synthesis methodology is presented in which actuator and sensor placement is treated in terms of (0,1) variables. Structural member sizes and control variables are treated simultaneously as design variables. A multiobjective utopian approach is used to obtain a compromise solution for inherently conflicting objective functions such as structural mass, control effort, and the number of actuators. Constraints are imposed on transient displacements, natural frequencies, actuator forces and dynamic stability, as well as on controllability and observability of the system. The combinatorial aspects of the mixed (0,1)-continuous variable design optimization problem are made tractable by combining approximation concepts with branch and bound techniques. Some numerical results for example problems are presented to illustrate the efficacy of the design procedure set forth.
A coherent Ising machine for 2000-node optimization problems
NASA Astrophysics Data System (ADS)
Inagaki, Takahiro; Haribara, Yoshitaka; Igarashi, Koji; Sonobe, Tomohiro; Tamate, Shuhei; Honjo, Toshimori; Marandi, Alireza; McMahon, Peter L.; Umeki, Takeshi; Enbutsu, Koji; Tadanaga, Osamu; Takenouchi, Hirokazu; Aihara, Kazuyuki; Kawarabayashi, Ken-ichi; Inoue, Kyo; Utsunomiya, Shoko; Takesue, Hiroki
2016-11-01
The analysis and optimization of complex systems can be reduced to mathematical problems collectively known as combinatorial optimization. Many such problems can be mapped onto ground-state search problems of the Ising model, and various artificial spin systems are now emerging as promising approaches. However, physical Ising machines have suffered from limited numbers of spin-spin couplings because of implementations based on localized spins, resulting in severe scalability problems. We report a 2000-spin network with all-to-all spin-spin couplings. Using a measurement and feedback scheme, we coupled time-multiplexed degenerate optical parametric oscillators to implement maximum cut problems on arbitrary graph topologies with up to 2000 nodes. Our coherent Ising machine outperformed simulated annealing in terms of accuracy and computation time for a 2000-node complete graph.
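The max-cut-to-Ising mapping used by such machines can be sketched classically; the simulated-annealing baseline below (random unweighted graph, hypothetical cooling schedule) minimizes the Ising energy, which is equivalent to maximizing the cut, and is the kind of classical reference the authors compare against.

import numpy as np

rng = np.random.default_rng(0)
n = 60
# Random unweighted graph (J_ij in {0, 1}). The Ising energy E = sum_{i<j} J_ij s_i s_j
# equals W_total - 2 * cut, so minimizing E maximizes the cut weight.
J = np.triu(rng.integers(0, 2, (n, n)), 1).astype(float)
J = J + J.T

def energy(s):
    return 0.5 * s @ J @ s

s = rng.choice([-1, 1], n)
T = 2.0
for sweep in range(2000):
    for i in range(n):
        dE = -2 * s[i] * (J[i] @ s)     # energy change from flipping spin i
        if dE < 0 or rng.random() < np.exp(-dE / T):
            s[i] = -s[i]
    T *= 0.998                          # geometric cooling (arbitrary schedule)

cut = sum(J[i, j] for i in range(n) for j in range(i + 1, n) if s[i] != s[j])
print("cut weight:", cut, "Ising energy:", energy(s))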
Su, Zhangli
2016-01-01
Combinatorial patterns of histone modifications are key indicators of different chromatin states. Most of the current approaches rely on the usage of antibodies to analyze combinatorial histone modifications. Here we detail an antibody-free method named MARCC (Matrix-Assisted Reader Chromatin Capture) to enrich combinatorial histone modifications. The combinatorial patterns are enriched on native nucleosomes extracted from cultured mammalian cells and prepared by micrococcal nuclease digestion. Such enrichment is achieved by recombinant chromatin-interacting protein modules, or so-called reader domains, which can bind in a combinatorial modification-dependent manner. The enriched chromatin can be quantified by western blotting or mass spectrometry for the co-existence of histone modifications, while the associated DNA content can be analyzed by qPCR or next-generation sequencing. Altogether, MARCC provides a reproducible, efficient and customizable solution to enrich and analyze combinatorial histone modifications. PMID:26131849
NASA Astrophysics Data System (ADS)
Venkata Subbaiah, K.; Raju, Ch.; Suresh, Ch.
2017-08-01
The present study compares conventional cutting inserts with wiper cutting inserts during the hard turning of AISI 4340 steel at different workpiece hardness levels. Type of insert, hardness, cutting speed, feed, and depth of cut are taken as process parameters. Taguchi's L18 orthogonal array was used to conduct the experimental tests. A parametric analysis was carried out to determine the influence of each process parameter on three important surface roughness characteristics (Ra, Rz, and Rt) and on the material removal rate. Taguchi-based Grey Relational Analysis (GRA) was used to optimize the process parameters for the individual responses and for the multi-response output. Additionally, analysis of variance (ANOVA) was applied to identify the most significant factor.
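The grey relational analysis step can be sketched in a few lines; the response matrix below is hypothetical, and the distinguishing coefficient of 0.5 is the customary default rather than a value taken from the paper.

import numpy as np

# Hypothetical response matrix: rows = experimental runs, columns =
# [Ra, Rz, Rt (smaller-the-better), MRR (larger-the-better)].
Y = np.array([
    [0.82, 4.1, 5.3, 120.0],
    [0.65, 3.6, 4.7, 150.0],
    [0.91, 4.8, 6.0, 180.0],
    [0.58, 3.2, 4.2, 110.0],
])
smaller_better = [True, True, True, False]

# Step 1: grey relational normalization to [0, 1].
Z = np.empty_like(Y)
for k, sb in enumerate(smaller_better):
    lo, hi = Y[:, k].min(), Y[:, k].max()
    Z[:, k] = (hi - Y[:, k]) / (hi - lo) if sb else (Y[:, k] - lo) / (hi - lo)

# Step 2: grey relational coefficients against the ideal sequence (all ones).
zeta = 0.5                           # distinguishing coefficient (customary default)
delta = np.abs(1.0 - Z)
xi = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())

# Step 3: grey relational grade = mean coefficient per run; rank the runs.
grade = xi.mean(axis=1)
print("grades:", np.round(grade, 3), "best run:", int(grade.argmax()) + 1)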
A New Approach for Proving or Generating Combinatorial Identities
ERIC Educational Resources Information Center
Gonzalez, Luis
2010-01-01
A new method for proving, in an immediate way, many combinatorial identities is presented. The method is based on a simple recursive combinatorial formula involving n + 1 arbitrary real parameters. Moreover, this formula enables one not only to prove, but also to generate many different combinatorial identities (without being required to know them a priori).
NASA Astrophysics Data System (ADS)
Strom, C. S.; Bennema, P.
1997-03-01
This work (Part II) explores the relation between units and morphology. It shows the equivalence in behaviour between the attachment energies and the results of Monte Carlo growth kinetics simulations. The energetically optimal combination of the F slices in 1 1 0, 0 1 1 and 1 1 1 in a monomolecular interpretation leads to unsatisfactory agreement with experimentally observed morphology. In a tetrameric (or octameric) interpretation, the unit cell must be subdivided self-consistently in terms of stable molecular clusters. Thus, the presence or absence of the 1 1 1 form functions as a direct experimental criterion for distinguishing between monomolecular growth layers, and tetrameric (or octameric) growth layers of the same composition, but subjected to the condition of combinatorial compatibility, as the F slices combine to produce the growth habit. When that condition is taken into account, the tetrameric (or octameric) theoretical morphology in the broken bond model is in good agreement with experiment over a wide range. Subjectmatter for future study is summarized.
Wafer-scale growth of VO2 thin films using a combinatorial approach
Zhang, Hai-Tian; Zhang, Lei; Mukherjee, Debangshu; Zheng, Yuan-Xia; Haislmaier, Ryan C.; Alem, Nasim; Engel-Herbert, Roman
2015-01-01
Transition metal oxides offer functional properties beyond conventional semiconductors. Bridging the gap between the fundamental research frontier in oxide electronics and their realization in commercial devices demands a wafer-scale growth approach for high-quality transition metal oxide thin films. Such a method requires excellent control over the transition metal valence state to avoid performance deterioration, which has been proved challenging. Here we present a scalable growth approach that enables a precise valence state control. By creating an oxygen activity gradient across the wafer, a continuous valence state library is established to directly identify the optimal growth condition. Single-crystalline VO2 thin films have been grown on wafer scale, exhibiting more than four orders of magnitude change in resistivity across the metal-to-insulator transition. It is demonstrated that ‘electronic grade' transition metal oxide films can be realized on a large scale using a combinatorial growth approach, which can be extended to other multivalent oxide systems. PMID:26450653
Fuentes, Paulina; Zhou, Fei; Erban, Alexander; Karcher, Daniel; Kopka, Joachim; Bock, Ralph
2016-06-14
Artemisinin-based therapies are the only effective treatment for malaria, the most devastating disease in human history. To meet the growing demand for artemisinin and make it accessible to the poorest, an inexpensive and rapidly scalable production platform is urgently needed. Here we have developed a new synthetic biology approach, combinatorial supertransformation of transplastomic recipient lines (COSTREL), and applied it to introduce the complete pathway for artemisinic acid, the precursor of artemisinin, into the high-biomass crop tobacco. We first introduced the core pathway of artemisinic acid biosynthesis into the chloroplast genome. The transplastomic plants were then combinatorially supertransformed with cassettes for all additional enzymes known to affect flux through the artemisinin pathway. By screening large populations of COSTREL lines, we isolated plants that produce more than 120 milligram artemisinic acid per kilogram biomass. Our work provides an efficient strategy for engineering complex biochemical pathways into plants and optimizing the metabolic output.
NASA Astrophysics Data System (ADS)
Martin, Brian
Combinatorial approaches have proven useful for rapid alloy fabrication and optimization. A new method of producing controlled isothermal gradients using the Gleeble Thermomechanical simulator has been developed, and demonstrated on the metastable beta-Ti alloy beta-21S, achieving a thermal gradient of 525-700 °C. This thermal gradient method has subsequently been coupled with existing combinatorial methods of producing composition gradients using the LENS(TM) additive manufacturing system, through the use of elemental blended powders. This has been demonstrated with a binary Ti-(0-15) wt% Cr build, which has subsequently been characterized with optical and electron microscopy, with special attention to the precipitate of TiCr2 Laves phases. The TiCr2 phase has been explored for its high temperature mechanical properties in a new oxidation resistant beta-Ti alloy, which serves as a demonstration of the new bicombinatorial methods developed as applied to a multicomponent alloy system.
Zunder, Eli R.; Finck, Rachel; Behbehani, Gregory K.; Amir, El-ad D.; Krishnaswamy, Smita; Gonzalez, Veronica D.; Lorang, Cynthia G.; Bjornson, Zach; Spitzer, Matthew H.; Bodenmiller, Bernd; Fantl, Wendy J.; Pe’er, Dana; Nolan, Garry P.
2015-01-01
SUMMARY Mass-tag cell barcoding (MCB) labels individual cell samples with unique combinatorial barcodes, after which they are pooled for processing and measurement as a single multiplexed sample. The MCB method eliminates variability between samples in antibody staining and instrument sensitivity, reduces antibody consumption, and shortens instrument measurement time. Here, we present an optimized MCB protocol with several improvements over previously described methods. The use of palladium-based labeling reagents expands the number of measurement channels available for mass cytometry and reduces interference with lanthanide-based antibody measurement. An error-detecting combinatorial barcoding scheme allows cell doublets to be identified and removed from the analysis. A debarcoding algorithm that is single cell-based rather than population-based improves the accuracy and efficiency of sample deconvolution. This debarcoding algorithm has been packaged into software that allows rapid and unbiased sample deconvolution. The MCB procedure takes 3–4 h, not including sample acquisition time of ~1 h per million cells. PMID:25612231
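The error-detecting idea behind such combinatorial barcoding can be sketched with a fixed-weight (k-of-n) code: every sample uses exactly k tags, so the union of two different barcodes carries more than k tags and is rejected. The channel count, k, and the toy assignment rule below are illustrative and are not the published debarcoding software.

from itertools import combinations

n_channels, k = 7, 3                     # 7 barcoding channels, exactly 3 tags per sample
codes = list(combinations(range(n_channels), k))
print(len(codes), "samples can be multiplexed")   # C(7, 3) = 35

def debarcode(observed_channels):
    """Assign a cell to a sample if its positive channels match exactly one valid
    k-of-n code; anything else (e.g. a doublet, which shows more than k positive
    channels) is rejected."""
    positives = tuple(sorted(observed_channels))
    if len(positives) != k:
        return None                       # doublet or staining failure
    return codes.index(positives)

print(debarcode({1, 4, 6}))               # -> sample index of a valid single cell
print(debarcode({1, 4, 6, 2, 3, 5}))      # -> None, union of two codes (doublet) rejected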
Combinatorics of least-squares trees.
Mihaescu, Radu; Pachter, Lior
2008-09-09
A recurring theme in the least-squares approach to phylogenetics has been the discovery of elegant combinatorial formulas for the least-squares estimates of edge lengths. These formulas have proved useful for the development of efficient algorithms, and have also been important for understanding connections among popular phylogeny algorithms. For example, the selection criterion of the neighbor-joining algorithm is now understood in terms of the combinatorial formulas of Pauplin for estimating tree length. We highlight a phylogenetically desirable property that weighted least-squares methods should satisfy, and provide a complete characterization of methods that satisfy the property. The necessary and sufficient condition is a multiplicative four-point condition that the variance matrix needs to satisfy. The proof is based on the observation that the Lagrange multipliers in the proof of the Gauss-Markov theorem are tree-additive. Our results generalize and complete previous work on ordinary least squares, balanced minimum evolution, and the taxon-weighted variance model. They also provide a time-optimal algorithm for computation.
NASA Astrophysics Data System (ADS)
Goto, Masahiro; Sasaki, Michiko; Xu, Yibin; Zhan, Tianzhuo; Isoda, Yukihiro; Shinohara, Yoshikazu
2017-06-01
p- and n-type bismuth telluride thin films have been synthesized by using a combinatorial sputter coating system (COSCOS). The crystal structure and crystal preferred orientation of the thin films were changed by controlling the coating condition of the radio frequency (RF) power during the sputter coating. As a result, the p- and n-type films and their dimensionless figure of merit (ZT) were optimized by the technique. The properties of the thin films such as the crystal structure, crystal preferred orientation, material composition and surface morphology were analyzed by X-ray diffraction, energy-dispersive X-ray spectroscopy and atomic force microscopy. Also, the thermoelectric properties of the Seebeck coefficient, electrical conductivity and thermal conductivity were measured. ZT for n- and p-type bismuth telluride thin films was found to be 0.27 and 0.40 at RF powers of 90 and 120 W, respectively. The proposed technology can be used to fabricate thermoelectric p-n modules of bismuth telluride without any doping process.
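For reference, the dimensionless figure of merit quoted above is the standard combination of the three measured transport properties,

ZT = \frac{S^{2}\,\sigma\,T}{\kappa},

where S is the Seebeck coefficient, \sigma the electrical conductivity, \kappa the thermal conductivity and T the absolute temperature; this is why the abstract reports exactly those three measurements alongside ZT.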
Zhang, Weizhe; Bai, Enci; He, Hui; Cheng, Albert M.K.
2015-01-01
Reducing energy consumption is becoming very important in order to keep battery life and lower overall operational costs for heterogeneous real-time multiprocessor systems. In this paper, we first formulate this as a combinatorial optimization problem. Then, a successful meta-heuristic, called Shuffled Frog Leaping Algorithm (SFLA) is proposed to reduce the energy consumption. Precocity remission and local optimal avoidance techniques are proposed to avoid the precocity and improve the solution quality. Convergence acceleration significantly reduces the search time. Experimental results show that the SFLA-based energy-aware meta-heuristic uses 30% less energy than the Ant Colony Optimization (ACO) algorithm, and 60% less energy than the Genetic Algorithm (GA) algorithm. Remarkably, the running time of the SFLA-based meta-heuristic is 20 and 200 times less than ACO and GA, respectively, for finding the optimal solution. PMID:26110406
An Analytical Framework for Runtime of a Class of Continuous Evolutionary Algorithms.
Zhang, Yushan; Hu, Guiwu
2015-01-01
Although there have been many studies on the runtime of evolutionary algorithms in discrete optimization, relatively few theoretical results have been proposed on continuous optimization, such as evolutionary programming (EP). This paper proposes an analysis of the runtime of two EP algorithms based on Gaussian and Cauchy mutations, using an absorbing Markov chain. Given a constant variation, we calculate the runtime upper bound of special Gaussian mutation EP and Cauchy mutation EP. Our analysis reveals that the upper bounds are impacted by individual number, problem dimension number n, searching range, and the Lebesgue measure of the optimal neighborhood. Furthermore, we provide conditions whereby the average runtime of the considered EP can be no more than a polynomial of n. The condition is that the Lebesgue measure of the optimal neighborhood is larger than a combinatorial calculation of an exponential and the given polynomial of n.
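The absorbing-Markov-chain machinery behind such runtime results can be sketched directly: with Q the transient-to-transient transition block, the expected number of steps to absorption from each transient state is t = (I - Q)^{-1} 1. The toy chain below is hypothetical and is not the EP model analyzed in the paper.

import numpy as np

# Toy absorbing chain: states 0..2 are transient, state 3 ("optimum found") absorbs.
P = np.array([
    [0.6, 0.3, 0.1, 0.0],
    [0.2, 0.5, 0.2, 0.1],
    [0.0, 0.3, 0.4, 0.3],
    [0.0, 0.0, 0.0, 1.0],
])
Q = P[:3, :3]                                    # transient-to-transient block
t = np.linalg.solve(np.eye(3) - Q, np.ones(3))   # expected steps to absorption
print(t)                                         # expected runtime from each transient state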
NASA Astrophysics Data System (ADS)
Samimi, Peyman
The relatively low oxidation resistance and subsequent surface embrittlement have often limited the use of titanium alloys in elevated temperature structural applications. Although extensive effort is spent to investigate the high temperature oxidation performance of titanium alloys, the studies are often constrained to complex technical titanium alloys and neither the mechanisms associated with evolution of the oxide scale nor the effect of oxygen ingress on the microstructure of the base metal are well-understood. In addition lack of systematic oxidation studies across a wider domain of the alloy composition has complicated the determination of composition-mechanism-property relationships. Clearly, it would be ideal to assess the influence of composition and exposure time on the oxidation resistance, independent of experimental variabilities regarding time, temperature and atmosphere as the potential source of error. Such studies might also provide a series of metrics (e.g., hardness, scale, etc) that could be interpreted together and related to the alloy composition. In this thesis a novel combinatorial approach was adopted whereby a series of compositionally graded specimens, (Ti-xMo, Ti-xCr, Ti-xAl and Ti-xW) were prepared using Laser Engineered Net Shaping (LENS(TM)) technology and exposed to still-air at 650 °C. (Abstract shortened by ProQuest.).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Van Duren, Jeroen K; Koch, Carl; Luo, Alan
The primary limitation of today’s lightweight structural alloys is that specific yield strengths (SYS) higher than 200 MPa x cc/g (typical value for titanium alloys) are extremely difficult to achieve. This holds true especially at a cost lower than 5 dollars/kg (typical value for magnesium alloys). Recently, high-entropy alloys (HEA) have shown promising SYS, yet the large composition space of HEA makes screening compositions complex and time-consuming. Over the course of this 2-year project we started from 150 billion compositions and reduced the number of potential low-density (<5 g/cc), low-cost (<5 dollars/kg) high-entropy alloy (LDHEA) candidates that are single-phase, disordered, solid-solution (SPSS) to a few thousand compositions. This was accomplished by means of machine learning to guide design for SPSS LDHEA, based on a combination of recursive partitioning, an extensive experimental HEA database compiled from 24 literature sources, and 91 calculated parameters serving as phenomenological selection rules. Machine learning shows an accuracy of 82% in identifying which compositions of a separate, smaller, experimental HEA database are SPSS HEA. Calculation of Phase Diagrams (CALPHAD) shows an accuracy of 71-77% for the alloys supported by the CALPHAD database, where 30% of the compiled HEA database is not supported by CALPHAD. In addition to machine learning and CALPHAD, a third tool was developed to aid design of SPSS LDHEA. Phase diagrams were calculated by constructing the Gibbs free energy convex hull based on easily accessible enthalpy and entropy terms. Surprisingly, accuracy was 78%. Pursuing these LDHEA candidates by high-throughput experimental methods resulted in SPSS LDHEA composed of transition metals (e.g. Cr, Mn, Fe, Ni, Cu) alloyed with Al, yet the high concentration of Al, necessary to bring the mass density below 5.0 g/cc, makes these materials hard and brittle, body-centered-cubic (BCC) alloys. A related, yet multi-phase BCC alloy, based on Al-Cr-Fe-Ni, shows compressive strain >10% and a specific compressive yield strength of 229 MPa x cc/g, yet does not show ductility in tensile tests due to cleavage. When replacing Cr in Al-Cr-Fe-based 4- and 5-element LDHEA with Mn, hardness drops 2x. Together with compression test results, including those on the ternaries Al-Cr-Fe and Al-Mn-Fe, this suggests that Al-Mn-Fe-based LDHEA are still worth pursuing. These initial results only represent one compressive stress-strain curve per composition without any property optimization. As such, reproducibility needs to be followed by optimization to show their full potential. When including Li, Mg, and Zn, a single-phase Li-Mg-Al-Ti-Zn LDHEA has been found with a specific ultimate compressive strength of 289 MPa x cc/g. Al-Ti-Mn-Zn showed a specific ultimate compressive strength of 73 MPa x cc/g. These initial results after hot isostatic pressing (HIP) of the ball-milled powders represent the lower end of what is possible, since no secondary processing (e.g. extrusion) has been performed to optimize strength and ductility. Compositions for multi-phase (e.g. dual-phase) LDHEA were identified largely by automated searches through CALPHAD databases, while screening for large face-centered-cubic (FCC) volume fractions, followed by experimental verification. This resulted in several new alloys. Li-Mg-Al-Mn-Fe and Mg-Mn-Fe-Co ball-milled powders upon HIP show specific ultimate compressive strengths of 198 MPa x cc/g and 45 MPa x cc/g, respectively.
Several malleable quaternary Al-Zn-based alloys have been found upon arc/induction melting, yet with limited specific compressive yield strength (<75 MPa x cc/g). These initial results are all without any optimization for strength and/or ductility. High-throughput experimentation allowed us to triple the experimental HEA database published over the past 10 years in less than 2 years, a rate 10x higher than previous methods. Furthermore, we showed that high-throughput thin-film combinatorial methods can be used to gain insight into isothermal phase diagram slices. Although it is straightforward to map hardness as a function of composition for sputtered thin-film compositional gradients by nano-indentation and compare the results to micro-indentation on bulk samples, the simultaneous impact of composition, roughness, film density, and microstructure on hardness requires monitoring all these properties as a function of location on the compositional gradient, including dissecting the impact of these 4 factors on the hardness map. These additional efforts impact throughput significantly. This work shows that much progress has been made over the years in predicting phase formation to aid the discovery of new alloys, yet that much work remains to predict phases more accurately for LDHEA, whether by CALPHAD or by other means. More importantly, more work needs to be done to predict mechanical properties of novel alloys, such as yield strength and ductility. Furthermore, this work shows the need for an empirical alloy database covering strategic points in a multi-dimensional composition space to allow faster and more accurate predictive interpolations to identify the oasis in the desert more quickly. Finally, this work suggests that it is worth pursuing a ductile alloy with a SYS > 300 MPa x cc/g in a mass density range of 6-7 g/cc, since the chances for a single-phase or majority-phase FCC alloy increase significantly. Today’s lightweight steels are in this density range.
Dynamic combinatorial libraries: new opportunities in systems chemistry.
Hunt, Rosemary A R; Otto, Sijbren
2011-01-21
Combinatorial chemistry is a tool for selecting molecules with special properties. Dynamic combinatorial chemistry started off aiming to be just that. However, unlike ordinary combinatorial chemistry, the interconnectedness of dynamic libraries gives them an extra dimension. An understanding of these molecular networks at systems level is essential for their use as a selection tool and creates exciting new opportunities in systems chemistry. In this feature article we discuss selected examples and considerations related to the advanced exploitation of dynamic combinatorial libraries for their originally conceived purpose of identifying strong binding interactions. Also reviewed are examples illustrating a trend towards increasing complexity in terms of network behaviour and reversible chemistry. Finally, new applications of dynamic combinatorial chemistry in self-assembly, transport and self-replication are discussed.
NASA Astrophysics Data System (ADS)
Tajuddin, Wan Ahmad
1994-02-01
Ease in finding the configuration at the global energy minimum in a symmetric neural network is important for combinatorial optimization problems. We carry out a comprehensive survey of available strategies for seeking global minima by comparing their performances in the binary representation problem. We recall our previous comparison of steepest descent with analog dynamics, genetic hill-climbing, simulated diffusion, simulated annealing, threshold accepting and simulated tunneling. To this, we add comparisons to other strategies including taboo search and one with field-ordered updating.
δ-Similar Elimination to Enhance Search Performance of Multiobjective Evolutionary Algorithms
NASA Astrophysics Data System (ADS)
Aguirre, Hernán; Sato, Masahiko; Tanaka, Kiyoshi
In this paper, we propose δ-similar elimination to improve the search performance of multiobjective evolutionary algorithms in combinatorial optimization problems. This method eliminates similar individuals in objective space to fairly distribute selection among the different regions of the instantaneous Pareto front. We investigate four eliminating methods analyzing their effects using NSGA-II. In addition, we compare the search performance of NSGA-II enhanced by our method and NSGA-II enhanced by controlled elitism.
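A minimal sketch of the elimination idea (one plausible variant, not necessarily any of the four methods studied in the paper): drop an individual whose objective vector lies within δ of an already kept one.

import numpy as np

def delta_similar_elimination(objectives, delta):
    """Keep an individual only if its objective vector is at least delta away
    (Euclidean distance, an illustrative choice) from every individual kept so far."""
    kept = []
    for i, f in enumerate(objectives):
        if all(np.linalg.norm(f - objectives[j]) >= delta for j in kept):
            kept.append(i)
    return kept

F = np.array([[1.0, 9.0], [1.05, 8.95], [3.0, 6.0], [3.01, 6.02], [7.0, 2.0]])
print(delta_similar_elimination(F, delta=0.5))   # -> [0, 2, 4]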
Orbit Clustering Based on Transfer Cost
NASA Technical Reports Server (NTRS)
Gustafson, Eric D.; Arrieta-Camacho, Juan J.; Petropoulos, Anastassios E.
2013-01-01
We propose using cluster analysis to perform quick screening for combinatorial global optimization problems. The key missing component currently preventing cluster analysis from use in this context is the lack of a useable metric function that defines the cost to transfer between two orbits. We study several proposed metrics and clustering algorithms, including k-means and the expectation maximization algorithm. We also show that proven heuristic methods such as the Q-law can be modified to work with cluster analysis.
Caracciolo, Sergio; Sicuro, Gabriele
2014-10-01
We discuss the equivalence relation between the Euclidean bipartite matching problem on the line and on the circumference and the Brownian bridge process on the same domains. The equivalence allows us to compute the correlation function and the optimal cost of the original combinatorial problem in the thermodynamic limit; moreover, we solve also the minimax problem on the line and on the circumference. The properties of the average cost and correlation functions are discussed.
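The one-dimensional case has a particularly clean combinatorial solution: with a convex cost such as squared distance, matching the sorted points of one color to the sorted points of the other in order is optimal. The sketch below checks this against brute force on a tiny random instance; the instance itself is arbitrary.

import numpy as np
from itertools import permutations

rng = np.random.default_rng(3)
n = 6
red, blue = np.sort(rng.random(n)), np.sort(rng.random(n))

# Sorted-order matching (optimal on the line for convex costs such as squared distance).
sorted_cost = np.sum((red - blue) ** 2)

# Brute-force check over all n! matchings (only feasible for tiny n).
brute_cost = min(np.sum((red - blue[list(p)]) ** 2) for p in permutations(range(n)))
print(np.isclose(sorted_cost, brute_cost))   # True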
An Evaluation of a Modified Simulated Annealing Algorithm for Various Formulations
1990-08-01
... trials of the k-th Markov chain, is sufficiently close to q(c_k), the stationary distribution at c_k: \|a(L_k, c_k) - q(c_k)\| < \epsilon. Requiring the final ... Wiley and Sons. Aarts, E. H. L., & Van Laarhoven, P. J. M. (1985). Statistical cooling: A general approach to combinatorial optimization problems ... Birkhoff, G. (1946). Tres observaciones sobre el algebra lineal, Rev. Univ. Nac. Tucuman Ser. A, 5, 147-151. Bohr, Niels (1913). Old quantum theory.
Selection criteria for wear resistant powder coatings under extreme erosive wear conditions
NASA Astrophysics Data System (ADS)
Kulu, P.; Pihl, T.
2002-12-01
Wear-resistant thermal spray coatings for sliding wear are hard but brittle (such as carbide and oxide based coatings), which makes them useless under impact loading conditions and sensitive to fatigue. Under extreme conditions of erosive wear (impact loading, high hardness of abrasives, and high velocity of abradant particles), composite coatings ensure optimal properties of hardness and toughness. The article describes tungsten carbide-cobalt (WC-Co) systems and self-fluxing alloys, containing tungsten carbide based hardmetal particles [NiCrSiB-(WC-Co)] deposited by the detonation gun, continuous detonation spraying, and spray fusion processes. Different powder compositions and processes were studied, and the effect of the coating structure and wear parameters on the wear resistance of coatings are evaluated. The dependence of the wear resistance of sprayed and fused coatings on their hardness is discussed, and hardness criteria for coating selection are proposed. The so-called “double cemented” structure of WC-Co based hardmetal or metal matrix composite coatings, as compared with a simple cobalt matrix containing particles of WC, was found optimal. Structural criteria for coating selection are provided. To assist the end user in selecting an optimal deposition method and materials, coating selection diagrams of wear resistance versus hardness are given. This paper also discusses the cost-effectiveness of coatings in the application areas that are more sensitive to cost, and composite coatings based on recycled materials are offered.
Lv, Xiaomei; Gu, Jiali; Wang, Fan; Xie, Wenping; Liu, Min; Ye, Lidan; Yu, Hongwei
2016-12-01
Metabolic engineering of microorganisms for heterologous biosynthesis is a promising route to sustainable chemical production which attracts increasing research and industrial interest. However, the efficiency of microbial biosynthesis is often restricted by insufficient activity of pathway enzymes and unbalanced utilization of metabolic intermediates. This work presents a combinatorial strategy integrating modification of multiple rate-limiting enzymes and modular pathway engineering to simultaneously improve intra- and inter-pathway balance, which might be applicable for a range of products, using isoprene as an example product. For intra-module engineering within the methylerythritol-phosphate (MEP) pathway, directed co-evolution of DXS/DXR/IDI was performed adopting a lycopene-indicated high-throughput screening method developed herein, leading to 60% improvement of isoprene production. In addition, inter-module engineering between the upstream MEP pathway and the downstream isoprene-forming pathway was conducted via promoter manipulation, which further increased isoprene production by 2.94-fold compared to the recombinant strain with solely protein engineering and 4.7-fold compared to the control strain containing wild-type enzymes. These results demonstrated the potential of pathway optimization in isoprene overproduction as well as the effectiveness of combining metabolic regulation and protein engineering in improvement of microbial biosynthesis. Biotechnol. Bioeng. 2016;113: 2661-2669. © 2016 Wiley Periodicals, Inc. © 2016 Wiley Periodicals, Inc.
NASA Astrophysics Data System (ADS)
Umam, M. I. H.; Santosa, B.
2018-04-01
Combinatorial optimization is frequently used to solve problems in science, engineering, and commercial applications. One combinatorial problem in the field of transportation is to find the shortest travel route from an initial point of departure to a point of destination while minimizing travel cost and travel time. When the distance from the initial node to the destination node is the same as the distance travelled back from the destination to the initial node, the problem is known as the Traveling Salesman Problem (TSP); otherwise it is called an Asymmetric Traveling Salesman Problem (ATSP). One of the most recent optimization techniques is Symbiotic Organisms Search (SOS). This paper discusses how to hybridize the SOS algorithm with variable neighborhood search (SOS-VNS) so that it can be applied to solve the ATSP. The proposed mechanism adds variable neighborhood search as a local search to generate a better initial solution and then modifies the parasitism phase by adapting a mutation mechanism. After this modification, the performance of the SOS-VNS algorithm is evaluated on several data sets, and the results are compared with the best known solutions and with other algorithms such as PSO and the original SOS algorithm. The SOS-VNS algorithm shows better results in terms of convergence, divergence and computing time.
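The abstract describes the variable neighborhood search component only at a high level. As a rough illustration of that component (not the authors' SOS-VNS implementation), the sketch below applies two simple neighborhoods, a city swap and a city reinsertion, to a tour over an asymmetric distance matrix; the neighborhoods, instance size, and stopping rule are illustrative assumptions.

import random

def tour_cost(tour, dist):
    """Total cost of a directed tour over an asymmetric distance matrix."""
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def swap_neighbor(tour):
    """Neighborhood 1: exchange two randomly chosen cities."""
    a, b = random.sample(range(len(tour)), 2)
    t = tour[:]
    t[a], t[b] = t[b], t[a]
    return t

def insertion_neighbor(tour):
    """Neighborhood 2: remove one city and reinsert it elsewhere."""
    t = tour[:]
    city = t.pop(random.randrange(len(t)))
    t.insert(random.randrange(len(t) + 1), city)
    return t

def vns(dist, iters=2000, seed=0):
    """Very small variable-neighborhood search: try neighborhood 1 and
    fall back to neighborhood 2 whenever it fails to improve."""
    random.seed(seed)
    n = len(dist)
    best = list(range(n))
    random.shuffle(best)
    best_cost = tour_cost(best, dist)
    neighborhoods = [swap_neighbor, insertion_neighbor]
    for _ in range(iters):
        k = 0
        while k < len(neighborhoods):
            cand = neighborhoods[k](best)
            c = tour_cost(cand, dist)
            if c < best_cost:          # improvement: restart from the first neighborhood
                best, best_cost, k = cand, c, 0
            else:                      # no improvement: move to the next neighborhood
                k += 1
    return best, best_cost

if __name__ == "__main__":
    # Tiny asymmetric instance (dist[i][j] != dist[j][i] in general).
    random.seed(1)
    n = 8
    dist = [[0 if i == j else random.randint(1, 20) for j in range(n)] for i in range(n)]
    print(vns(dist))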
cDREM: inferring dynamic combinatorial gene regulation.
Wise, Aaron; Bar-Joseph, Ziv
2015-04-01
Genes are often combinatorially regulated by multiple transcription factors (TFs). Such combinatorial regulation plays an important role in development and facilitates the ability of cells to respond to different stresses. While a number of approaches have utilized sequence and ChIP-based datasets to study combinatorial regulation, these have often ignored the combinatorial logic and the dynamics associated with such regulation. Here we present cDREM, a new method for reconstructing dynamic models of combinatorial regulation. cDREM integrates time series gene expression data with (static) protein interaction data. The method is based on a hidden Markov model and utilizes the sparse group Lasso to identify small subsets of combinatorially active TFs, their time of activation, and the logical function they implement. We tested cDREM on yeast and human data sets. Using yeast we show that the predicted combinatorial sets agree with other high throughput genomic datasets and improve upon prior methods developed to infer combinatorial regulation. Applying cDREM to study human response to flu, we were able to identify several combinatorial TF sets, some of which were known to regulate immune response while others represent novel combinations of important TFs.
NASA Technical Reports Server (NTRS)
Rash, James L.
2010-01-01
NASA's space data-communications infrastructure, the Space Network and the Ground Network, provide scheduled (as well as some limited types of unscheduled) data-communications services to user spacecraft via orbiting relay satellites and ground stations. An implementation of the methods and algorithms disclosed herein will be a system that produces globally optimized schedules with not only optimized service delivery by the space data-communications infrastructure but also optimized satisfaction of all user requirements and prescribed constraints, including radio frequency interference (RFI) constraints. Evolutionary search, a class of probabilistic strategies for searching large solution spaces, constitutes the essential technology in this disclosure. Also disclosed are methods and algorithms for optimizing the execution efficiency of the schedule-generation algorithm itself. The scheduling methods and algorithms as presented are adaptable to accommodate the complexity of scheduling the civilian and/or military data-communications infrastructure. Finally, the problem itself, and the methods and algorithms, are generalized and specified formally, with applicability to a very broad class of combinatorial optimization problems.
Combinatorial algorithms for design of DNA arrays.
Hannenhalli, Sridhar; Hubell, Earl; Lipshutz, Robert; Pevzner, Pavel A
2002-01-01
Optimal design of DNA arrays requires the development of algorithms with two-fold goals: reducing the effects caused by unintended illumination (the border length minimization problem) and reducing the complexity of masks (the mask decomposition problem). We describe algorithms that reduce the number of rectangles in mask decomposition by 20-30% compared to a standard array design, under the assumption that the arrangement of oligonucleotides on the array is fixed. This algorithm produces provably optimal solutions for all studied real instances of array design. We also address the difficult problem of finding an arrangement which minimizes the border length and propose a new idea of threading that significantly reduces the border length as compared to standard designs.
Statistical mechanics of budget-constrained auctions
NASA Astrophysics Data System (ADS)
Altarelli, F.; Braunstein, A.; Realpe-Gomez, J.; Zecchina, R.
2009-07-01
Finding the optimal assignment in budget-constrained auctions is a combinatorial optimization problem with many important applications, a notable example being in the sale of advertisement space by search engines (in this context the problem is often referred to as the off-line AdWords problem). On the basis of the cavity method of statistical mechanics, we introduce a message-passing algorithm that is capable of solving efficiently random instances of the problem extracted from a natural distribution, and we derive from its properties the phase diagram of the problem. As the control parameter (average value of the budgets) is varied, we find two phase transitions delimiting a region in which long-range correlations arise.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heredia-Langner, Alejandro; Amidan, Brett G.; Matzner, Shari
We present results from the optimization of a re-identification process using two sets of biometric data obtained from the Civilian American and European Surface Anthropometry Resource Project (CAESAR) database. The datasets contain real measurements of features for 2378 individuals in a standing (43 features) and seated (16 features) position. A genetic algorithm (GA) was used to search a large combinatorial space where different features are available between the probe (seated) and gallery (standing) datasets. Results show that optimized model predictions obtained using less than half of the 43 gallery features and data from roughly 16% of the individuals available produce better re-identification rates than two other approaches that use all the information available.
Optimization of Smart Structure for Improving Servo Performance of Hard Disk Drive
NASA Astrophysics Data System (ADS)
Kajiwara, Itsuro; Takahashi, Masafumi; Arisaka, Toshihiro
Head positioning accuracy of the hard disk drive should be improved to meet today's increasing performance demands. Vibration suppression of the arm in the hard disk drive is very important to enhance the servo bandwidth of the head positioning system. In this study, smart structure technology is introduced into the hard disk drive to suppress the vibration of the head actuator. It has been expected that the smart structure technology will contribute to the development of small and light-weight mechatronics devices with the required performance. First, modeling of the system is conducted with finite element method and modal analysis. Next, the actuator location and the control system are simultaneously optimized using genetic algorithm. Vibration control effect with the proposed vibration control mechanisms has been evaluated by some simulations.
Extremal Optimization: Methods Derived from Co-Evolution
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boettcher, S.; Percus, A.G.
1999-07-13
We describe a general-purpose method for finding high-quality solutions to hard optimization problems, inspired by self-organized critical models of co-evolution such as the Bak-Sneppen model. The method, called Extremal Optimization, successively eliminates extremely undesirable components of sub-optimal solutions, rather than "breeding" better components. In contrast to Genetic Algorithms, which operate on an entire "gene-pool" of possible solutions, Extremal Optimization improves on a single candidate solution by treating each of its components as species co-evolving according to Darwinian principles. Unlike Simulated Annealing, its non-equilibrium approach effects an algorithm requiring few parameters to tune. With only one adjustable parameter, its performance proves competitive with, and often superior to, more elaborate stochastic optimization procedures. We demonstrate it here on two classic hard optimization problems: graph partitioning and the traveling salesman problem.
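As a rough illustration of the idea (not the authors' code), the sketch below applies a tau-EO style update to balanced graph bipartitioning: each node's fitness is the fraction of its neighbours on the same side, a poorly fit node is chosen by power-law rank selection with a single parameter tau, and it is swapped with a random node of the other side. The parameter values and the random test graph are illustrative assumptions.

import random

def tau_eo_bipartition(adj, tau=1.4, steps=5000, seed=0):
    """Sketch of tau-EO for balanced graph bipartitioning.  A node with many
    cut edges has low fitness; ranks are sampled with P(rank k) ~ k**(-tau),
    worst first.  Moves are always accepted; only the best cut seen is kept."""
    random.seed(seed)
    n = len(adj)
    side = [i % 2 for i in range(n)]          # balanced initial partition
    random.shuffle(side)

    def cut(s):
        return sum(1 for u in range(n) for v in adj[u] if u < v and s[u] != s[v])

    def fitness(u):
        deg = len(adj[u]) or 1
        return sum(1 for v in adj[u] if side[v] == side[u]) / deg

    weights = [(k + 1) ** -tau for k in range(n)]   # rank 1 = worst-fit node
    best, best_cut = side[:], cut(side)
    for _ in range(steps):
        ranked = sorted(range(n), key=fitness)       # worst-fit nodes first
        u = random.choices(ranked, weights=weights)[0]
        v = random.choice([w for w in range(n) if side[w] != side[u]])
        side[u], side[v] = side[v], side[u]          # swap keeps the partition balanced
        c = cut(side)
        if c < best_cut:
            best, best_cut = side[:], c
    return best, best_cut

if __name__ == "__main__":
    # Small random graph given as adjacency lists.
    random.seed(1)
    n = 16
    adj = [set() for _ in range(n)]
    for _ in range(40):
        a, b = random.sample(range(n), 2)
        adj[a].add(b); adj[b].add(a)
    print(tau_eo_bipartition([sorted(s) for s in adj]))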
Lee, M L; Schneider, G
2001-01-01
Natural products were analyzed to determine whether they contain appealing novel scaffold architectures for potential use in combinatorial chemistry. Ring systems were extracted and clustered on the basis of structural similarity. Several such potential scaffolds for combinatorial chemistry were identified that are not present in current trade drugs. For one of these scaffolds a virtual combinatorial library was generated. Pharmacophoric properties of natural products, trade drugs, and the virtual combinatorial library were assessed using a self-organizing map. Obviously, current trade drugs and natural products have several topological pharmacophore patterns in common. These features can be systematically explored with selected combinatorial libraries based on a combination of natural product-derived and synthetic molecular building blocks.
Use of combinatorial chemistry to speed drug discovery.
Rádl, S
1998-10-01
IBC's International Conference on Integrating Combinatorial Chemistry into the Discovery Pipeline was held September 14-15, 1998. The program started with a pre-conference workshop on High-Throughput Compound Characterization and Purification. The agenda of the main conference was divided into sessions of Synthesis, Automation and Unique Chemistries; Integrating Combinatorial Chemistry, Medicinal Chemistry and Screening; Combinatorial Chemistry Applications for Drug Discovery; and Information and Data Management. This meeting was an excellent opportunity to see how big pharma, biotech and service companies are addressing the current bottlenecks in combinatorial chemistry to speed drug discovery. (c) 1998 Prous Science. All rights reserved.
Royston, Kendra J.; Udayakumar, Neha; Lewis, Kayla; Tollefsbol, Trygve O.
2017-01-01
With cancer often classified as a disease that has an important epigenetic component, natural compounds that have the ability to regulate the epigenome become ideal candidates for study. Humans have a complex diet, which illustrates the need to elucidate the mechanisms of interaction between these bioactive compounds in combination. The natural compounds withaferin A (WA), from the Indian winter cherry, and sulforaphane (SFN), from cruciferous vegetables, have numerous anti-cancer effects and some report their ability to regulate epigenetic processes. Our study is the first to investigate the combinatorial effects of low physiologically achievable concentrations of WA and SFN on breast cancer cell proliferation, histone deacetylase1 (HDAC1) and DNA methyltransferases (DNMTs). No adverse effects were observed on control cells at optimal concentrations. There was synergistic inhibition of cellular viability in MCF-7 cells and a greater induction of apoptosis with the combinatorial approach than with either compound administered alone in both MDA-MB-231 and MCF-7 cells. HDAC expression was down-regulated at multiple levels. Lastly, we determined the combined effects of these bioactive compounds on the pro-apoptotic BAX and anti-apoptotic BCL-2 and found decreases in BCL-2 and increases in BAX. Taken together, our findings demonstrate the ability of low concentrations of combinatorial WA and SFN to promote cancer cell death and regulate key epigenetic modifiers in human breast cancer cells. PMID:28534825
Royston, Kendra J; Udayakumar, Neha; Lewis, Kayla; Tollefsbol, Trygve O
2017-05-19
With cancer often classified as a disease that has an important epigenetic component, natural compounds that have the ability to regulate the epigenome become ideal candidates for study. Humans have a complex diet, which illustrates the need to elucidate the mechanisms of interaction between these bioactive compounds in combination. The natural compounds withaferin A (WA), from the Indian winter cherry, and sulforaphane (SFN), from cruciferous vegetables, have numerous anti-cancer effects and some report their ability to regulate epigenetic processes. Our study is the first to investigate the combinatorial effects of low physiologically achievable concentrations of WA and SFN on breast cancer cell proliferation, histone deacetylase1 (HDAC1) and DNA methyltransferases (DNMTs). No adverse effects were observed on control cells at optimal concentrations. There was synergistic inhibition of cellular viability in MCF-7 cells and a greater induction of apoptosis with the combinatorial approach than with either compound administered alone in both MDA-MB-231 and MCF-7 cells. HDAC expression was down-regulated at multiple levels. Lastly, we determined the combined effects of these bioactive compounds on the pro-apoptotic BAX and anti-apoptotic BCL-2 and found decreases in BCL-2 and increases in BAX. Taken together, our findings demonstrate the ability of low concentrations of combinatorial WA and SFN to promote cancer cell death and regulate key epigenetic modifiers in human breast cancer cells.
Cryptographic Combinatorial Securities Exchanges
NASA Astrophysics Data System (ADS)
Thorpe, Christopher; Parkes, David C.
We present a useful new mechanism that facilitates the atomic exchange of many large baskets of securities in a combinatorial exchange. Cryptography prevents information about the securities in the baskets from being exploited, enhancing trust. Our exchange offers institutions who wish to trade large positions a new alternative to existing methods of block trading: they can reduce transaction costs by taking advantage of other institutions’ available liquidity, while third party liquidity providers guarantee execution—preserving their desired portfolio composition at all times. In our exchange, institutions submit encrypted orders which are crossed, leaving a “remainder”. The exchange proves facts about the portfolio risk of this remainder to third party liquidity providers without revealing the securities in the remainder, the knowledge of which could also be exploited. The third parties learn either (depending on the setting) the portfolio risk parameters of the remainder itself, or how their own portfolio risk would change if they were to incorporate the remainder into a portfolio they submit. In one setting, these third parties submit bids on the commission, and the winner supplies necessary liquidity for the entire exchange to clear. This guaranteed clearing, coupled with external price discovery from the primary markets for the securities, sidesteps difficult combinatorial optimization problems. This latter method of proving how taking on the remainder would change risk parameters of one’s own portfolio, without revealing the remainder’s contents or its own risk parameters, is a useful protocol of independent interest.
NATURAL PRODUCTS: A CONTINUING SOURCE OF NOVEL DRUG LEADS
Cragg, Gordon M.; Newman, David J.
2013-01-01
1. Background: Nature has been a source of medicinal products for millennia, with many useful drugs developed from plant sources. Following discovery of the penicillins, drug discovery from microbial sources occurred, and diving techniques in the 1970s opened the seas. Combinatorial chemistry (late 1980s) shifted the focus of drug discovery efforts from Nature to the laboratory bench. 2. Scope of Review: This review traces natural products drug discovery, outlining important drugs from natural sources that revolutionized treatment of serious diseases. It is clear Nature will continue to be a major source of new structural leads, and effective drug development depends on multidisciplinary collaborations. 3. Major Conclusions: The explosion of genetic information led not only to novel screens, but the genetic techniques permitted the implementation of combinatorial biosynthetic technology and genome mining. The knowledge gained has allowed unknown molecules to be identified. These novel bioactive structures can be optimized by using combinatorial chemistry, generating new drug candidates for many diseases. 4. General Significance: The advent of genetic techniques that permitted the isolation/expression of biosynthetic cassettes from microbes may well be the new frontier for natural products lead discovery. It is now apparent that biodiversity may be much greater in those organisms. The numbers of potential species involved in the microbial world are many orders of magnitude greater than those of plants and multi-celled animals. Coupling these numbers to the number of currently unexpressed biosynthetic clusters now identified (>10 per species), the potential of microbial diversity remains essentially untapped. PMID:23428572
Sawama, Yoshinari; Masuda, Masahiro; Honda, Akie; Yokoyama, Hiroki; Park, Kwihwan; Yasukawa, Naoki; Monguchi, Yasunari; Sajiki, Hironao
2016-01-01
The deprotection of the methoxyphenylmethyl (MPM) ether and ester derivatives can be generally achieved by the combinatorial use of a catalytic Lewis acid and a stoichiometric nucleophile. The deprotections of 2,4-dimethoxyphenylmethyl (DMPM)-protected alcohols and carboxylic acids were found to be effectively catalyzed by iron(III) chloride without any additional nucleophile to form the deprotected mother alcohols and carboxylic acids in excellent yields. Since the present deprotection proceeds via the self-assembling mechanism of the 2,4-DMPM protective group itself to give the hardly soluble resorcinarene derivative as a precipitate, a rigorous purification by silica-gel column chromatography was unnecessary and sufficiently pure alcohols and carboxylic acids were easily obtained in satisfactory yields after simple filtration.
Effects of hard mask etch on final topography of advanced phase shift masks
NASA Astrophysics Data System (ADS)
Hortenbach, Olga; Rolff, Haiko; Lajn, Alexander; Baessler, Martin
2017-07-01
Continuous shrinking of semiconductor device dimensions demands steady improvements of the lithographic resolution at the wafer level. These requirements challenge the photomask industry to further improve mask quality in all relevant printing characteristics. In this paper, the topography of phase shift masks (PSM) was investigated. Effects of the hard mask etch on phase shift uniformity and mask absorber profile were studied. A design of experiments (DoE) method was used for process optimization, in which the gas composition, the bias power of the hard mask main etch, and the bias power of the over-etch were varied. In addition, the influence of the over-etch time was examined at the end of the experiment. Absorber depth uniformity, sidewall angle (SWA), reactive ion etch lag (RIE lag) and through-pitch (TP) dependence were analyzed. Measurements were performed by means of atomic force microscopy (AFM) using a critical dimension (CD) mode with a boot-shaped tip. Scanning electron microscope (SEM) cross-section images were prepared to verify the profile quality. Finally, CD analysis was performed to confirm the optimal etch conditions. A significant dependence of the absorber SWA on the hard mask (HM) etch conditions was observed, revealing an improvement potential for the mask absorber profile. It was found that the hard mask etch can leave a depth footprint in the absorber layer. Thus, the etch depth uniformity of the hard mask etch is crucial for achieving a uniform phase shift over the active mask area. The optimized hard mask etch process results in significantly improved mask topography without deterioration of tight CD specifications.
Li, Desheng
2014-01-01
This paper proposes a novel variant of cooperative quantum-behaved particle swarm optimization (CQPSO) algorithm with two mechanisms to reduce the search space and avoid the stagnation, called CQPSO-DVSA-LFD. One mechanism is called Dynamic Varying Search Area (DVSA), which takes charge of limiting the ranges of particles' activity into a reduced area. On the other hand, in order to escape the local optima, Lévy flights are used to generate the stochastic disturbance in the movement of particles. To test the performance of CQPSO-DVSA-LFD, numerical experiments are conducted to compare the proposed algorithm with different variants of PSO. According to the experimental results, the proposed method performs better than other variants of PSO on both benchmark test functions and the combinatorial optimization issue, that is, the job-shop scheduling problem.
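The Lévy-flight disturbance mentioned in the abstract is not specified in detail here; one common construction for heavy-tailed step lengths is Mantegna's algorithm, sketched below with illustrative parameter values. This is a generic sketch, not the paper's CQPSO-DVSA-LFD.

import math
import random

def levy_step(beta=1.5):
    """One common construction (Mantegna's algorithm) for a heavy-tailed,
    Levy-like step length with exponent beta in (1, 2)."""
    sigma_u = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2) /
               (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = random.gauss(0.0, sigma_u)
    v = random.gauss(0.0, 1.0)
    return u / abs(v) ** (1 / beta)

def perturb(position, scale=0.1, beta=1.5):
    """Levy-flight disturbance of a particle position: most steps are small,
    but occasional long jumps help the swarm escape local optima."""
    return [x + scale * levy_step(beta) for x in position]

if __name__ == "__main__":
    random.seed(0)
    x = [0.0, 0.0, 0.0]
    for _ in range(5):
        x = perturb(x)
        print([round(v, 3) for v in x])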
Forghani, Fereidoun; Park, Joong-Hyun; Oh, Deog-Hwan
2015-06-01
Slightly acidic electrolyzed water (SAEW) has been proven to be an effective sanitizer against microorganisms attached to foods. However, its physical properties and inactivation efficacy are affected by several factors, such as water hardness. Therefore, in this study the effect of water hardness on SAEW properties was studied. Pure cultures of foodborne bacteria were used in vitro and in vivo to evaluate the inactivation efficacy of the SAEWs produced. The results showed water hardness to be an important factor in the production of SAEW. Low water hardness may make further optimization of the production process necessary. In this study, the addition of 5% HCl and 2 M NaCl at a 1.5 mL/min flow rate was found to be the best electrolyte concentration for the optimization of SAEW production from low-hardness water (34 ± 2 mg/L). Furthermore, the results showed that pre-heating was a better approach than post-production heating of SAEW, resulting in higher ACC values and therefore better sanitization efficacy. Copyright © 2014 Elsevier Ltd. All rights reserved.
Scoring of Side-Chain Packings: An Analysis of Weight Factors and Molecular Dynamics Structures.
Colbes, Jose; Aguila, Sergio A; Brizuela, Carlos A
2018-02-26
The protein side-chain packing problem (PSCPP) is a central task in computational protein design. The problem is usually modeled as a combinatorial optimization problem, which consists of searching for a set of rotamers, from a given rotamer library, that minimizes a scoring function (SF). The SF is a weighted sum of terms that can be decomposed into physics-based and knowledge-based terms. Although there are many methods to obtain approximate solutions for this problem, all of them have similar performance, and there has not been a significant improvement in recent years. Studies on protein structure prediction and protein design revealed the limitations of current SFs in achieving further improvements for these two problems. In the same line, a recent work reported a similar result for the PSCPP. In this work, we ask whether or not this negative result regarding further improvements in performance is due to (i) an incorrect weighting of the SF terms or (ii) the constrained conformation resulting from the protein crystallization process. To analyze these questions, we (i) modeled the PSCPP as a bi-objective combinatorial optimization problem, optimizing at the same time the two most important terms of two SFs from state-of-the-art algorithms, and (ii) performed a preprocessing relaxation of the crystal structure through molecular dynamics to simulate the protein in the solvent and evaluated the performance of these two state-of-the-art SFs under these conditions. Our results indicate that (i) no matter what combination of weight factors we use, the current SFs will not lead to better performance, and (ii) the evaluated SFs will not be able to improve performance on relaxed structures. Furthermore, the experiments revealed that the SFs and the methods are biased toward crystallized structures.
Hybridization of decomposition and local search for multiobjective optimization.
Ke, Liangjun; Zhang, Qingfu; Battiti, Roberto
2014-10-01
Combining ideas from evolutionary algorithms, decomposition approaches, and Pareto local search, this paper suggests a simple yet efficient memetic algorithm for combinatorial multiobjective optimization problems: memetic algorithm based on decomposition (MOMAD). It decomposes a combinatorial multiobjective problem into a number of single objective optimization problems using an aggregation method. MOMAD evolves three populations: 1) population P(L) for recording the current solution to each subproblem; 2) population P(P) for storing starting solutions for Pareto local search; and 3) an external population P(E) for maintaining all the nondominated solutions found so far during the search. A problem-specific single objective heuristic can be applied to these subproblems to initialize the three populations. At each generation, a Pareto local search method is first applied to search a neighborhood of each solution in P(P) to update P(L) and P(E). Then a single objective local search is applied to each perturbed solution in P(L) for improving P(L) and P(E), and reinitializing P(P). The procedure is repeated until a stopping condition is met. MOMAD provides a generic hybrid multiobjective algorithmic framework in which problem specific knowledge, well developed single objective local search and heuristics and Pareto local search methods can be hybridized. It is a population based iterative method and thus an anytime algorithm. Extensive experiments have been conducted in this paper to study MOMAD and compare it with some other state-of-the-art algorithms on the multiobjective traveling salesman problem and the multiobjective knapsack problem. The experimental results show that our proposed algorithm outperforms or performs similarly to the best so far heuristics on these two problems.
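MOMAD itself combines decomposition, Pareto local search and single-objective local search; the minimal sketch below illustrates only the decomposition idea plus an external nondominated archive, using a bi-objective knapsack with a greedy subproblem solver. The weight vectors, instance, and greedy heuristic are illustrative assumptions, not the MOMAD algorithm.

import random

def dominates(a, b):
    """Pareto dominance for maximisation of two objectives."""
    return a[0] >= b[0] and a[1] >= b[1] and a != b

def greedy_knapsack(profit, weight, capacity):
    """Greedy heuristic for a single-objective (scalarised) knapsack subproblem."""
    order = sorted(range(len(profit)), key=lambda i: profit[i] / weight[i], reverse=True)
    chosen, load = [], 0
    for i in order:
        if load + weight[i] <= capacity:
            chosen.append(i)
            load += weight[i]
    return chosen

def decompose_and_archive(p1, p2, weight, capacity, n_sub=11):
    """Decompose a bi-objective knapsack into n_sub weighted-sum subproblems,
    solve each heuristically, and keep an external archive of nondominated points."""
    archive = []
    for k in range(n_sub):
        lam = k / (n_sub - 1)
        scalar = [lam * p1[i] + (1 - lam) * p2[i] for i in range(len(p1))]
        sol = greedy_knapsack(scalar, weight, capacity)
        point = (sum(p1[i] for i in sol), sum(p2[i] for i in sol))
        if not any(dominates(q, point) for q in archive):
            archive = [q for q in archive if not dominates(point, q)] + [point]
    return sorted(set(archive))

if __name__ == "__main__":
    random.seed(0)
    n = 30
    p1 = [random.randint(1, 20) for _ in range(n)]
    p2 = [random.randint(1, 20) for _ in range(n)]
    w = [random.randint(1, 20) for _ in range(n)]
    print(decompose_and_archive(p1, p2, w, capacity=100))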
MIFT: GIFT Combinatorial Geometry Input to VCS Code
1977-03-01
BRL Report No. 1967 (final report): MIFT: GIFT Combinatorial Geometry Input to VCS Code, Albert E. ... The code of the Vehicle Code System (VCS) called MORSE was modified to accept the GIFT combinatorial geometry package. GIFT, as opposed to the geometry package...
On k-ary n-cubes: Theory and applications
NASA Technical Reports Server (NTRS)
Mao, Weizhen; Nicol, David M.
1994-01-01
Many parallel processing networks can be viewed as graphs called k-ary n-cubes, whose special cases include rings, hypercubes and toruses. In this paper, combinatorial properties of k-ary n-cubes are explored. In particular, the problem of characterizing the subgraph of a given number of nodes with the maximum edge count is studied. These theoretical results are then used to compute a lower bounding function in branch-and-bound partitioning algorithms and to establish the optimality of some irregular partitions.
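For readers unfamiliar with the structure, the sketch below builds the k-ary n-cube as a graph (nodes are length-n vectors over Z_k, adjacent when they differ by plus or minus 1 modulo k in one coordinate) and counts the edges inside a given node subset, the quantity whose maximization is studied in the paper. The instance sizes are illustrative.

from itertools import product

def kary_ncube_edges(k, n):
    """All edges of the k-ary n-cube: nodes are length-n vectors over Z_k, and two
    nodes are adjacent if they differ by +/-1 (mod k) in exactly one coordinate.
    Generating only the +1 neighbour per dimension still yields every undirected
    edge, because the -1 neighbour produces the same edge from the other endpoint."""
    nodes = list(product(range(k), repeat=n))
    edges = set()
    for u in nodes:
        for d in range(n):
            v = list(u)
            v[d] = (v[d] + 1) % k
            edges.add(frozenset((u, tuple(v))))
    return nodes, edges

def edge_count(subset, edges):
    """Number of edges with both endpoints inside a given node subset."""
    s = set(subset)
    return sum(1 for e in edges if e <= s)

if __name__ == "__main__":
    nodes, edges = kary_ncube_edges(k=3, n=2)   # 3-ary 2-cube = 3x3 torus
    print(len(nodes), len(edges))               # 9 nodes, 18 edges
    print(edge_count(nodes[:4], edges))         # edges inside the first 4 nodes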
Probabilistic Analysis of Combinatorial Optimization Problems on Hypergraph Matchings
2012-02-01
Fragments from the report body: "per dimension" (recall that d is equal to the number of independent subsets of vertices V_k in the hypergraph H^d_{jn}, and n denotes the number of ...) disjoint solutions whose costs are iid random variables. First, recalling the interpretation of feasible MAP solutions as paths in the index graph G, we ... elements. On the other hand, recall that a (feasible) path in G can be described as a set of n vectors {(i_1^(1), ..., i_d^(1)), ..., (i_1^(n), ...
Feature selection methods for big data bioinformatics: A survey from the search perspective.
Wang, Lipo; Wang, Yaoli; Chang, Qing
2016-12-01
This paper surveys main principles of feature selection and their recent applications in big data bioinformatics. Instead of the commonly used categorization into filter, wrapper, and embedded approaches to feature selection, we formulate feature selection as a combinatorial optimization or search problem and categorize feature selection methods into exhaustive search, heuristic search, and hybrid methods, where heuristic search methods may further be categorized into those with or without data-distilled feature ranking measures. Copyright © 2016 Elsevier Inc. All rights reserved.
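As a small example of feature selection viewed as heuristic search, the sketch below performs greedy forward selection with a toy correlation-style scoring function; the scoring function and data are illustrative assumptions, not a method from the survey.

import random

def forward_selection(features, labels, score, max_features=5):
    """Greedy heuristic search over feature subsets: at each step add the single
    feature that improves the scoring function the most; stop when nothing helps."""
    n_feat = len(features[0])
    selected, best_score = [], float("-inf")
    while len(selected) < max_features:
        best_gain = None
        for f in range(n_feat):
            if f in selected:
                continue
            s = score(selected + [f], features, labels)
            if best_gain is None or s > best_gain[0]:
                best_gain = (s, f)
        if best_gain is None or best_gain[0] <= best_score:
            break                       # no remaining feature improves the score
        best_score, f = best_gain
        selected.append(f)
    return selected, best_score

def toy_score(subset, X, y):
    """Illustrative score: Pearson correlation between the subset's feature sum and the label."""
    pred = [sum(row[f] for f in subset) for row in X]
    mean_p, mean_y = sum(pred) / len(pred), sum(y) / len(y)
    cov = sum((p - mean_p) * (t - mean_y) for p, t in zip(pred, y))
    var = sum((p - mean_p) ** 2 for p in pred) * sum((t - mean_y) ** 2 for t in y)
    return cov / var ** 0.5 if var else 0.0

if __name__ == "__main__":
    random.seed(0)
    X = [[random.random() for _ in range(8)] for _ in range(50)]
    y = [row[0] + 0.5 * row[3] + 0.1 * random.random() for row in X]  # features 0 and 3 are informative
    print(forward_selection(X, y, toy_score))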
A stochastic discrete optimization model for designing container terminal facilities
NASA Astrophysics Data System (ADS)
Zukhruf, Febri; Frazila, Russ Bona; Burhani, Jzolanda Tsavalista
2017-11-01
As uncertainty essentially affects the total transportation cost, it remains an important issue in a container terminal that incorporates several modes and transshipment processes. This paper therefore presents a stochastic discrete optimization model for designing the container terminal, which involves decisions on facility improvement actions. The container terminal operation model is constructed by accounting for variations in demand and facility performance. In addition, to illustrate a conflicting issue that arises in practice in terminal operations, the model also takes into account the possible additional delay of facilities due to an increasing amount of equipment, especially container trucks. These variations are expected to reflect the uncertainty in container terminal operations. A Monte Carlo simulation is invoked to propagate the variations by following the observed distributions. The problem is formulated within the framework of combinatorial optimization to investigate the optimal facility improvement decision. A new variant of glow-worm swarm optimization (GSO), which has rarely been explored in the transportation field, is proposed for solving the optimization problem. The applicability of the model is tested using the actual characteristics of a container terminal.
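The Monte Carlo part of such a model can be illustrated with a very small sketch: the expected cost of each candidate improvement decision is estimated by averaging a cost model over random demand and equipment-performance draws. The cost model, distributions, and parameter values below are purely illustrative assumptions, not the authors' terminal model.

import random

def expected_cost(decision, n_samples=2000, seed=0):
    """Monte Carlo estimate of the expected operating cost of one candidate
    facility-improvement decision under random demand and equipment performance."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        demand = rng.gauss(1000, 150)                            # containers per day (assumed)
        handling_rate = decision["cranes"] * rng.gauss(25, 3)    # moves per hour per crane (assumed)
        truck_delay = 0.02 * decision["trucks"] ** 1.2           # congestion grows with fleet size (assumed)
        service_time = demand / max(handling_rate, 1e-6) + truck_delay
        total += service_time * 100 + decision["capex"]          # waiting cost plus investment
    return total / n_samples

if __name__ == "__main__":
    candidates = [
        {"cranes": 4, "trucks": 20, "capex": 400},
        {"cranes": 6, "trucks": 30, "capex": 700},
    ]
    best = min(candidates, key=expected_cost)
    print(best, round(expected_cost(best), 1))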
Pruning-Based, Energy-Optimal, Deterministic I/O Device Scheduling for Hard Real-Time Systems
2005-02-01
However, DPM via I/O device scheduling for hard real-time systems has received relatively little attention. In this paper, we present an offline I/O...polynomial time. We present experimental results to show that EDS and MDO reduce the energy consumption of I/O devices significantly for hard real-time systems.
The Role of the Goal in Solving Hard Computational Problems: Do People Really Optimize?
ERIC Educational Resources Information Center
Carruthers, Sarah; Stege, Ulrike; Masson, Michael E. J.
2018-01-01
The role that the mental, or internal, representation plays when people are solving hard computational problems has largely been overlooked to date, despite the reality that this internal representation drives problem solving. In this work we investigate how performance on versions of two hard computational problems differs based on what internal…
Parameterized Algorithmics for Finding Exact Solutions of NP-Hard Biological Problems.
Hüffner, Falk; Komusiewicz, Christian; Niedermeier, Rolf; Wernicke, Sebastian
2017-01-01
Fixed-parameter algorithms are designed to efficiently find optimal solutions to some computationally hard (NP-hard) problems by identifying and exploiting "small" problem-specific parameters. We survey practical techniques to develop such algorithms. Each technique is introduced and supported by case studies of applications to biological problems, with additional pointers to experimental results.
FOREWORD: Focus on Combinatorial Materials Science Focus on Combinatorial Materials Science
NASA Astrophysics Data System (ADS)
Chikyo, Toyohiro
2011-10-01
About 15 years have passed since the introduction of modern combinatorial synthesis and high-throughput techniques for the development of novel inorganic materials; however, similar methods existed before. The most famous was reported in 1970 by Hanak who prepared composition-spread films of metal alloys by sputtering mixed-material targets. Although this method was innovative, it was rarely used because of the large amount of data to be processed. This problem is solved in the modern combinatorial material research, which is strongly related to computer data analysis and robotics. This field is still at the developing stage and may be enriched by new methods. Nevertheless, given the progress in measurement equipment and procedures, we believe the combinatorial approach will become a major and standard tool of materials screening and development. The first article of this journal, published in 2000, was titled 'Combinatorial solid state materials science and technology', and this focus issue aims to reintroduce this topic to the Science and Technology of Advanced Materials audience. It covers recent progress in combinatorial materials research describing new results in catalysis, phosphors, polymers and metal alloys for shape memory materials. Sophisticated high-throughput characterization schemes and innovative synthesis tools are also presented, such as spray deposition using nanoparticles or ion plating. On a technical note, data handling systems are introduced to familiarize researchers with the combinatorial methodology. We hope that through this focus issue a wide audience of materials scientists can learn about recent and future trends in combinatorial materials science and high-throughput experimentation.
Combinatorial Nano-Bio Interfaces.
Cai, Pingqiang; Zhang, Xiaoqian; Wang, Ming; Wu, Yun-Long; Chen, Xiaodong
2018-06-08
Nano-bio interfaces are emerging from the convergence of engineered nanomaterials and biological entities. Despite rapid growth, clinical translation of biomedical nanomaterials is heavily compromised by the lack of comprehensive understanding of biophysicochemical interactions at nano-bio interfaces. In the past decade, a few investigations have adopted a combinatorial approach toward decoding nano-bio interfaces. Combinatorial nano-bio interfaces comprise the design of nanocombinatorial libraries and high-throughput bioevaluation. In this Perspective, we address challenges in combinatorial nano-bio interfaces and call for multiparametric nanocombinatorics (composition, morphology, mechanics, surface chemistry), multiscale bioevaluation (biomolecules, organelles, cells, tissues/organs), and the recruitment of computational modeling and artificial intelligence. Leveraging combinatorial nano-bio interfaces will shed light on precision nanomedicine and its potential applications.
Koyama, Michihisa; Tsuboi, Hideyuki; Endou, Akira; Takaba, Hiromitsu; Kubo, Momoji; Del Carpio, Carlos A; Miyamoto, Akira
2007-02-01
Computational chemistry can provide fundamental knowledge regarding various aspects of materials. While its impact in scientific research is greatly increasing, its contributions to industrially important issues are far from satisfactory. In order to realize industrial innovation by computational chemistry, a new concept "combinatorial computational chemistry" has been proposed by introducing the concept of combinatorial chemistry to computational chemistry. This combinatorial computational chemistry approach enables theoretical high-throughput screening for materials design. In this manuscript, we review the successful applications of combinatorial computational chemistry to deNO(x) catalysts, Fischer-Tropsch catalysts, lanthanoid complex catalysts, and cathodes of the lithium ion secondary battery.
Zhang, Guozhu; Xie, Changsheng; Zhang, Shunping; Zhao, Jianwei; Lei, Tao; Zeng, Dawen
2014-09-08
A combinatorial high-throughput temperature-programmed method to obtain the optimal operating temperature (OOT) of gas sensor materials is demonstrated here for the first time. A material library consisting of SnO2, ZnO, WO3, and In2O3 sensor films was fabricated by screen printing. Temperature-dependent conductivity curves were obtained by scanning this gas sensor library from 300 to 700 K in different atmospheres (dry air, formaldehyde, carbon monoxide, nitrogen dioxide, toluene and ammonia), giving the OOT of each sensor formulation as a function of the carrier and analyte gases. A comparative study of the temperature-programmed method and a conventional method showed good agreement in measured OOT.
'Extremotaxis': computing with a bacterial-inspired algorithm.
Nicolau, Dan V; Burrage, Kevin; Nicolau, Dan V; Maini, Philip K
2008-01-01
We present a general-purpose optimization algorithm inspired by "run-and-tumble", the biased random walk chemotactic swimming strategy used by the bacterium Escherichia coli to locate regions of high nutrient concentration. The method uses particles (corresponding to bacteria) that swim through the variable space (corresponding to the attractant concentration profile). By constantly performing temporal comparisons, the particles drift towards the minimum or maximum of the function of interest. We illustrate the use of our method with four examples. We also present a discrete version of the algorithm. The new algorithm is expected to be useful in combinatorial optimization problems involving many variables, where the functional landscape is apparently stochastic and has local minima, but preserves some derivative structure at intermediate scales.
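A minimal sketch of such a run-and-tumble minimiser is shown below: each particle keeps its direction while the objective improves and tumbles to a random new direction otherwise. The objective function, step size, and population size are illustrative assumptions.

import math
import random

def run_and_tumble(f, dim=2, n_particles=20, steps=500, step_size=0.05, seed=0):
    """Run-and-tumble minimisation: a particle keeps swimming in its current
    direction while f keeps decreasing (a 'run') and picks a fresh random
    direction when f gets worse (a 'tumble'), mimicking bacterial chemotaxis."""
    rng = random.Random(seed)

    def random_direction():
        v = [rng.gauss(0, 1) for _ in range(dim)]
        norm = math.sqrt(sum(x * x for x in v))
        return [x / norm for x in v]

    particles = [[rng.uniform(-3, 3) for _ in range(dim)] for _ in range(n_particles)]
    dirs = [random_direction() for _ in range(n_particles)]
    values = [f(p) for p in particles]
    best = min(zip(values, particles))
    for _ in range(steps):
        for i, p in enumerate(particles):
            new_p = [x + step_size * d for x, d in zip(p, dirs[i])]
            new_v = f(new_p)
            if new_v < values[i]:          # improving: keep running
                particles[i], values[i] = new_p, new_v
                best = min(best, (new_v, new_p))
            else:                          # worse: tumble to a new random direction
                dirs[i] = random_direction()
    return best

if __name__ == "__main__":
    sphere = lambda x: sum(v * v for v in x)
    print(run_and_tumble(sphere))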
Singh, Narender; Guha, Rajarshi; Giulianotti, Marc; Pinilla, Clemencia; Houghten, Richard; Medina-Franco, Jose L.
2009-01-01
A multiple criteria approach is presented, that is used to perform a comparative analysis of four recently developed combinatorial libraries to drugs, Molecular Libraries Small Molecule Repository (MLSMR) and natural products. The compound databases were assessed in terms of physicochemical properties, scaffolds and fingerprints. The approach enables the analysis of property space coverage, degree of overlap between collections, scaffold and structural diversity and overall structural novelty. The degree of overlap between combinatorial libraries and drugs was assessed using the R-NN curve methodology, which measures the density of chemical space around a query molecule embedded in the chemical space of a target collection. The combinatorial libraries studied in this work exhibit scaffolds that were not observed in the drug, MLSMR and natural products collections. The fingerprint-based comparisons indicate that these combinatorial libraries are structurally different to current drugs. The R-NN curve methodology revealed that a proportion of molecules in the combinatorial libraries are located within the property space of the drugs. However, the R-NN analysis also showed that there are a significant number of molecules in several combinatorial libraries that are located in sparse regions of the drug space. PMID:19301827
A combinatorial approach towards the design of nanofibrous scaffolds for chondrogenesis
NASA Astrophysics Data System (ADS)
Ahmed, Maqsood; Ramos, Tiago André Da Silva; Damanik, Febriyani; Quang Le, Bach; Wieringa, Paul; Bennink, Martin; van Blitterswijk, Clemens; de Boer, Jan; Moroni, Lorenzo
2015-10-01
The extracellular matrix (ECM) is a three-dimensional (3D) structure composed of proteinaceous fibres that provide physical and biological cues to direct cell behaviour. Here, we build a library of hybrid collagen-polymer fibrous scaffolds with nanoscale dimensions and screen them for their ability to grow chondrocytes for cartilage repair. Poly(lactic acid) and poly (lactic-co-glycolic acid) at two different monomer ratios (85:15 and 50:50) were incrementally blended with collagen. Physical properties (wettability and stiffness) of the scaffolds were characterized and related to biological performance (proliferation, ECM production, and gene expression) and structure-function relationships were developed. We found that soft scaffolds with an intermediate wettability composed of the highly biodegradable PLGA50:50 and collagen, in two ratios (40:60 and 60:40), were optimal for chondrogenic differentiation of ATDC5 cells as determined by increased ECM production and enhanced cartilage specific gene expression. Long-term cultures indicated a stable phenotype with minimal de-differentiation or hypertrophy. The combinatorial methodology applied herein is a promising approach for the design and development of scaffolds for regenerative medicine.
NASA Astrophysics Data System (ADS)
Vatutin, Eduard
2017-12-01
The article analyzes the effectiveness of heuristic methods based on limited depth-first search for obtaining decisions in the test problem of finding the shortest path in a graph. It briefly describes the group of methods, used to solve the problem, that are based on limiting the number of branches of the combinatorial search tree and limiting the depth of the analyzed subtree. A methodology for comparing experimental data to estimate solution quality is considered, based on computational experiments with samples of graphs of pseudo-random structure and selected numbers of vertices and arcs, using the BOINC platform. The experimental results identify the areas where the selected subset of heuristic methods is preferable, depending on the size of the problem and the strength of the constraints. It is shown that the considered pair of methods is ineffective for the selected problem and yields solutions of significantly lower quality than the ant colony optimization method and its modification with combinatorial returns.
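To make the class of methods concrete, the sketch below runs a depth-first search for a cheap path that limits both the search depth and the number of branches expanded per node; the limits, graph, and pruning rule are illustrative assumptions, not the implementation evaluated in the article.

def limited_dfs_shortest_path(graph, src, dst, max_depth=6, max_branch=3):
    """Heuristic shortest-path search: depth-first search that (i) never descends
    deeper than max_depth arcs and (ii) at each node expands only the max_branch
    cheapest outgoing arcs.  Returns the best path found (possibly suboptimal)."""
    best = {"cost": float("inf"), "path": None}

    def dfs(node, path, cost, depth):
        if cost >= best["cost"] or depth > max_depth:
            return
        if node == dst:
            best["cost"], best["path"] = cost, path[:]
            return
        # keep only the few cheapest branches: this is what makes the search heuristic
        for nxt, w in sorted(graph.get(node, {}).items(), key=lambda kv: kv[1])[:max_branch]:
            if nxt not in path:
                path.append(nxt)
                dfs(nxt, path, cost + w, depth + 1)
                path.pop()

    dfs(src, [src], 0, 0)
    return best["path"], best["cost"]

if __name__ == "__main__":
    graph = {
        "a": {"b": 2, "c": 5, "d": 9},
        "b": {"c": 1, "e": 7},
        "c": {"e": 2},
        "d": {"e": 1},
        "e": {},
    }
    print(limited_dfs_shortest_path(graph, "a", "e"))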
Diversity of bacteria and archaea from two shallow marine hydrothermal vents from Vulcano Island.
Antranikian, Garabed; Suleiman, Marcel; Schäfers, Christian; Adams, Michael W W; Bartolucci, Simonetta; Blamey, Jenny M; Birkeland, Nils-Kåre; Bonch-Osmolovskaya, Elizaveta; da Costa, Milton S; Cowan, Don; Danson, Michael; Forterre, Patrick; Kelly, Robert; Ishino, Yoshizumi; Littlechild, Jennifer; Moracci, Marco; Noll, Kenneth; Oshima, Tairo; Robb, Frank; Rossi, Mosè; Santos, Helena; Schönheit, Peter; Sterner, Reinhard; Thauer, Rudolf; Thomm, Michael; Wiegel, Jürgen; Stetter, Karl Otto
2017-07-01
To obtain new insights into community compositions of hyperthermophilic microorganisms, defined as having optimal growth temperatures of 80 °C and above, sediment and water samples were taken from two shallow marine hydrothermal vents (I and II) with temperatures of 100 °C at Vulcano Island, Italy. A combinatorial approach of denaturant gradient gel electrophoresis (DGGE) and metagenomic sequencing was used for microbial community analyses of the samples. In addition, enrichment cultures, growing anaerobically on selected polysaccharides such as starch and cellulose, were also analyzed by the combinatorial approach. Our results showed a high abundance of hyperthermophilic archaea, especially in sample II, and a comparable diverse archaeal community composition in both samples. In particular, the strains of the hyperthermophilic anaerobic genera Staphylothermus and Thermococcus, and strains of the aerobic hyperthermophilic genus Aeropyrum, were abundant. Regarding the bacterial community, ε-Proteobacteria, especially the genera Sulfurimonas and Sulfurovum, were highly abundant. The microbial diversity of the enrichment cultures changed significantly by showing a high dominance of archaea, particularly the genera Thermococcus and Palaeococcus, depending on the carbon source and the selected temperature.
A combinatorial approach towards the design of nanofibrous scaffolds for chondrogenesis.
Ahmed, Maqsood; Ramos, Tiago André da Silva; Damanik, Febriyani; Quang Le, Bach; Wieringa, Paul; Bennink, Martin; van Blitterswijk, Clemens; de Boer, Jan; Moroni, Lorenzo
2015-10-07
The extracellular matrix (ECM) is a three-dimensional (3D) structure composed of proteinaceous fibres that provide physical and biological cues to direct cell behaviour. Here, we build a library of hybrid collagen-polymer fibrous scaffolds with nanoscale dimensions and screen them for their ability to grow chondrocytes for cartilage repair. Poly(lactic acid) and poly (lactic-co-glycolic acid) at two different monomer ratios (85:15 and 50:50) were incrementally blended with collagen. Physical properties (wettability and stiffness) of the scaffolds were characterized and related to biological performance (proliferation, ECM production, and gene expression) and structure-function relationships were developed. We found that soft scaffolds with an intermediate wettability composed of the highly biodegradable PLGA50:50 and collagen, in two ratios (40:60 and 60:40), were optimal for chondrogenic differentiation of ATDC5 cells as determined by increased ECM production and enhanced cartilage specific gene expression. Long-term cultures indicated a stable phenotype with minimal de-differentiation or hypertrophy. The combinatorial methodology applied herein is a promising approach for the design and development of scaffolds for regenerative medicine.
Rapid NMR Assignments of Proteins by Using Optimized Combinatorial Selective Unlabeling.
Dubey, Abhinav; Kadumuri, Rajashekar Varma; Jaipuria, Garima; Vadrevu, Ramakrishna; Atreya, Hanudatta S
2016-02-15
A new approach for rapid resonance assignments in proteins based on amino acid selective unlabeling is presented. The method involves choosing a set of multiple amino acid types for selective unlabeling and identifying specific tripeptides surrounding the labeled residues from specific 2D NMR spectra in a combinatorial manner. The methodology directly yields sequence-specific assignments, without requiring a contiguous stretch of amino acid residues to be linked, and is applicable to deuterated proteins. We show that a 2D [(15)N,(1)H] HSQC spectrum together with two 2D spectra can result in ∼50% assignments. The methodology was applied to two proteins: an intrinsically disordered protein (12 kDa) and the 29 kDa (268 residue) α-subunit of Escherichia coli tryptophan synthase, which presents a challenging case with spectral overlaps and missing peaks. The method can augment existing approaches and will be useful for applications such as identifying active-site residues involved in ligand binding, phosphorylation, or protein-protein interactions, even prior to complete resonance assignments. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Combinatorial Approaches for the Identification of Brain Drug Delivery Targets
Stutz, Charles C.; Zhang, Xiaobin; Shusta, Eric V.
2018-01-01
The blood-brain barrier (BBB) represents a large obstacle for the treatment of central nervous system diseases. Targeting endogenous nutrient transporters that transcytose the BBB is one promising approach to selectively and noninvasively deliver a drug payload to the brain. The main limitations of the currently employed transcytosing receptors are their ubiquitous expression in the peripheral vasculature and the inherent low levels of transcytosis mediated by such systems. In this review, approaches designed to increase the repertoire of transcytosing receptors which can be targeted for the purpose of drug delivery are discussed. In particular, combinatorial protein libraries can be screened on BBB cells in vitro or in vivo to isolate targeting peptides or antibodies that can trigger transcytosis. Once these targeting reagents are discovered, the cognate BBB transcytosis system can be identified using techniques such as expression cloning or immunoprecipitation coupled with mass spectrometry. Continued technological advances in BBB genomics and proteomics, membrane protein manipulation, and in vitro BBB technology promise to further advance the capability to identify and optimize peptides and antibodies capable of mediating drug transport across the BBB. PMID:23789958
NASA Astrophysics Data System (ADS)
Zadeh, S. M.; Powers, D. M. W.; Sammut, K.; Yazdani, A. M.
2016-12-01
Autonomous Underwater Vehicles (AUVs) are capable of spending long periods of time carrying out various underwater missions and marine tasks. In this paper, a novel conflict-free motion planning framework is introduced to enhance an underwater vehicle's mission performance by completing the maximum number of highest-priority tasks in a limited time across a large-scale, waypoint-cluttered operating field, while ensuring safe deployment during the mission. The proposed combinatorial route-path planner model takes advantage of the Biogeography-Based Optimization (BBO) algorithm to satisfy the objectives of both the higher- and lower-level motion planners and guarantees maximization of mission productivity for a single-vehicle operation. The performance of the model is investigated under different scenarios, including particular cost constraints in time-varying operating fields. To show the reliability of the proposed model, the performance of each motion planner is assessed separately, and a statistical analysis is then undertaken to evaluate the total performance of the entire model. The simulation results indicate the stability of the contributed model and its feasibility for real experiments.
Fuentes, Paulina; Zhou, Fei; Erban, Alexander; Karcher, Daniel; Kopka, Joachim; Bock, Ralph
2016-01-01
Artemisinin-based therapies are the only effective treatment for malaria, the most devastating disease in human history. To meet the growing demand for artemisinin and make it accessible to the poorest, an inexpensive and rapidly scalable production platform is urgently needed. Here we have developed a new synthetic biology approach, combinatorial supertransformation of transplastomic recipient lines (COSTREL), and applied it to introduce the complete pathway for artemisinic acid, the precursor of artemisinin, into the high-biomass crop tobacco. We first introduced the core pathway of artemisinic acid biosynthesis into the chloroplast genome. The transplastomic plants were then combinatorially supertransformed with cassettes for all additional enzymes known to affect flux through the artemisinin pathway. By screening large populations of COSTREL lines, we isolated plants that produce more than 120 milligram artemisinic acid per kilogram biomass. Our work provides an efficient strategy for engineering complex biochemical pathways into plants and optimizing the metabolic output. DOI: http://dx.doi.org/10.7554/eLife.13664.001 PMID:27296645
Abeylath, Sampath C.; Amiji, Mansoor
2011-01-01
With the non-specific toxicity of anticancer drugs to healthy tissues upon systemic administration, formulations capable of enhanced selectivity in delivery to the tumor mass and cells are highly desirable. Based on the diversity of the drug payloads, we have investigated a combinatorial-designed strategy where the nano-sized formulations are tailored based on the physicochemical properties of the drug and the delivery needs. Individually functionalized C2 to C12 lipid-, thiol-, and poly(ethylene glycol) (PEG)-modified dextran derivatives were synthesized via “click” chemistry from O-pentynyl dextran and relevant azides. These functionalized dextrans in combination with anticancer drugs form nanoparticles by self-assembling in aqueous medium having PEG surface functionalization and intermolecular disulfide bonds. Using anticancer drugs with logP values ranging from −0.5 to 3.0, the optimized nanoparticles formulations were evaluated for preliminary cellular delivery and cytotoxic effects in SKOV3 human ovarian adenocarcinoma cells. The results show that with the appropriate selection of lipid-modified dextran, one can effectively tailor the self-assembled nano-formulation for intended therapeutic payload. PMID:21978947
Preparation of cherry-picked combinatorial libraries by string synthesis.
Furka, Arpád; Dibó, Gábor; Gombosuren, Naran
2005-03-01
String synthesis [1-3] is an efficient and cheap manual method for the preparation of combinatorial libraries using macroscopic solid support units. Sorting the units between two synthetic steps is an important operation of the procedure. The software developed to guide sorting can be used only when complete combinatorial libraries are prepared. Since very often only selected components of the full libraries are needed, new software was constructed that guides sorting in the preparation of non-complete combinatorial libraries. Application of the software is described in detail.
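The sorting step can be illustrated with a small sketch that assumes each support unit is dedicated to one selected target product and, before each coupling step, is routed to the vessel of the building block that product requires at that position. This is an illustrative assumption about the bookkeeping, not the authors' software.

from collections import defaultdict

def sorting_plan(targets):
    """For a cherry-picked (non-complete) library, compute where every support unit
    must go before each coupling step.  Each unit is dedicated to one target product,
    written as a tuple of building blocks; at step i it is routed to the reaction
    vessel of the building block that its target needs at position i."""
    n_steps = len(targets[0])
    plan = []
    for step in range(n_steps):
        vessels = defaultdict(list)
        for unit, product in enumerate(targets):
            vessels[product[step]].append(unit)
        plan.append(dict(vessels))
    return plan

if __name__ == "__main__":
    # Six selected tripeptides (a subset of a full 3x3x3 library).
    targets = [("A", "K", "G"), ("A", "K", "S"), ("F", "K", "G"),
               ("F", "R", "S"), ("L", "R", "G"), ("L", "K", "S")]
    for i, step in enumerate(sorting_plan(targets), start=1):
        print(f"coupling step {i}:", step)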
Automated sequence-specific protein NMR assignment using the memetic algorithm MATCH.
Volk, Jochen; Herrmann, Torsten; Wüthrich, Kurt
2008-07-01
MATCH (Memetic Algorithm and Combinatorial Optimization Heuristics) is a new memetic algorithm for automated sequence-specific polypeptide backbone NMR assignment of proteins. MATCH employs local optimization for tracing partial sequence-specific assignments within a global, population-based search environment, where the simultaneous application of local and global optimization heuristics guarantees high efficiency and robustness. MATCH thus makes combined use of the two predominant concepts in use for automated NMR assignment of proteins. Dynamic transition and inherent mutation are new techniques that enable automatic adaptation to variable quality of the experimental input data. The concept of dynamic transition is incorporated in all major building blocks of the algorithm, where it enables switching between local and global optimization heuristics at any time during the assignment process. Inherent mutation restricts the intrinsically required randomness of the evolutionary algorithm to those regions of the conformation space that are compatible with the experimental input data. Using intact and artificially deteriorated APSY-NMR input data of proteins, MATCH performed sequence-specific resonance assignment with high efficiency and robustness.
Effect of crospovidone and hydroxypropyl cellulose on carbamazepine in high-dose tablet formulation.
Flicker, Felicia; Betz, Gabriele
2012-06-01
The aim of this study was to develop a high-dose tablet formulation of the poorly soluble carbamazepine (CBZ) with sufficient tablet hardness and immediate drug release. A further aim was to investigate the influence of various commercial CBZ raw materials on the optimized tablet formulation. Hydroxypropyl cellulose (HPC-SL) was selected as a dry binder and crospovidone (CrosPVP) as a superdisintegrant. A direct compacted tablet formulation of 70% CBZ was optimized by a 3² full factorial design with two input variables, HPC (0-10%) and CrosPVP (0-5%). Response variables included disintegration time, amount of drug released at 15 and 60 min, and tablet hardness, all analyzed according to USP 31. Increasing HPC-SL together with CrosPVP not only increased tablet hardness but also reduced disintegration time. Optimal condition was achieved in the range of 5-9% HPC and 3-5% CrosPVP, where tablet properties were at least 70 N tablet hardness, less than 1 min disintegration, and within the USP requirements for drug release. Testing the optimized formulation with four different commercial CBZ samples, their variability was still observed. Nonetheless, all formulations conformed to the USP specifications. With the excipients CrosPVP and HPC-SL an immediate release tablet formulation was successfully formulated for high-dose CBZ of various commercial sources.
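A 3² full factorial design simply enumerates every combination of the two factors at three levels each; a minimal sketch of generating such a design matrix is shown below (the level values are illustrative).

from itertools import product

def full_factorial(levels):
    """Generate a full factorial design: one run for every combination of factor levels."""
    names = list(levels)
    return [dict(zip(names, combo)) for combo in product(*levels.values())]

if __name__ == "__main__":
    # 3^2 design with two input variables (level values are illustrative percentages).
    design = full_factorial({"HPC_%": [0, 5, 10], "CrosPVP_%": [0, 2.5, 5]})
    for run in design:
        print(run)
    print(len(design), "runs")   # 9 runs for a 3x3 full factorial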
Li, Desheng
2014-01-01
This paper proposes a novel variant of cooperative quantum-behaved particle swarm optimization (CQPSO) algorithm with two mechanisms to reduce the search space and avoid the stagnation, called CQPSO-DVSA-LFD. One mechanism is called Dynamic Varying Search Area (DVSA), which takes charge of limiting the ranges of particles' activity into a reduced area. On the other hand, in order to escape the local optima, Lévy flights are used to generate the stochastic disturbance in the movement of particles. To test the performance of CQPSO-DVSA-LFD, numerical experiments are conducted to compare the proposed algorithm with different variants of PSO. According to the experimental results, the proposed method performs better than other variants of PSO on both benchmark test functions and the combinatorial optimization issue, that is, the job-shop scheduling problem. PMID:24851085
Evolutionary computation applied to the reconstruction of 3-D surface topography in the SEM.
Kodama, Tetsuji; Li, Xiaoyuan; Nakahira, Kenji; Ito, Dai
2005-10-01
A genetic algorithm has been applied to the line profile reconstruction from the signals of the standard secondary electron (SE) and/or backscattered electron detectors in a scanning electron microscope. This method solves the topographical surface reconstruction problem as one of combinatorial optimization. To extend this optimization approach for three-dimensional (3-D) surface topography, this paper considers the use of a string coding where a 3-D surface topography is represented by a set of coordinates of vertices. We introduce the Delaunay triangulation, which attains the minimum roughness for any set of height data to capture the fundamental features of the surface being probed by an electron beam. With this coding, the strings are processed with a class of hybrid optimization algorithms that combine genetic algorithms and simulated annealing algorithms. Experimental results on SE images are presented.
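The role of the Delaunay triangulation can be illustrated with a small sketch that triangulates scattered vertex positions (using scipy.spatial.Delaunay) and scores a candidate surface by a simple edge-based roughness measure; the roughness definition and test data are illustrative assumptions, not the paper's encoding or objective.

import numpy as np
from scipy.spatial import Delaunay

def edge_roughness(tri, heights):
    """Roughness of a triangulated surface: sum of squared height differences
    over all edges of the Delaunay triangulation (smaller = smoother)."""
    edges = set()
    for simplex in tri.simplices:
        for i in range(3):
            a, b = sorted((int(simplex[i]), int(simplex[(i + 1) % 3])))
            edges.add((a, b))
    return sum((heights[a] - heights[b]) ** 2 for a, b in edges)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    xy = rng.uniform(0, 1, size=(50, 2))           # scattered vertex positions in the image plane
    tri = Delaunay(xy)                             # connectivity used to evaluate candidate surfaces
    h_smooth = xy[:, 0] + xy[:, 1]                 # a smooth ramp surface
    h_noisy = h_smooth + rng.normal(0, 0.3, 50)    # the same ramp with added noise
    print(round(edge_roughness(tri, h_smooth), 3), round(edge_roughness(tri, h_noisy), 3))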
NASA Astrophysics Data System (ADS)
Menshikh, V.; Samorokovskiy, A.; Avsentev, O.
2018-03-01
A mathematical model is developed for optimizing the allocation of resources to reduce the time needed for management decisions, together with algorithms for solving the general resource-allocation problem. The optimization problem of choosing resources in organizational systems so as to reduce the total execution time of a job is solved. This is a complex three-level combinatorial problem whose solution requires solving several specific subproblems: estimating the duration of each action depending on the number of performers in the group that performs it; estimating the total execution time of all actions depending on the quantitative composition of the groups of performers; and finding a distribution of the available pool of performers among groups that minimizes the total execution time of all actions. In addition, algorithms for solving the general resource-allocation problem are proposed.
Annealing Ant Colony Optimization with Mutation Operator for Solving TSP.
Mohsen, Abdulqader M
2016-01-01
Ant Colony Optimization (ACO) has been successfully applied to a wide range of combinatorial optimization problems such as the minimum spanning tree, the traveling salesman problem, and the quadratic assignment problem. Basic ACO suffers from trapping in local minima and a low convergence rate. Simulated annealing (SA) and a mutation operator provide the ability to jump out of local minima and converge globally, while local search can speed up convergence. Therefore, this paper proposes a hybrid ACO algorithm that integrates the advantages of ACO, SA, a mutation operator, and a local search procedure to solve the traveling salesman problem. The core of the algorithm is based on ACO; SA and the mutation operator are used to increase the population diversity of the ants from time to time, and the local search is used to exploit the current search area efficiently. Comparative experiments using 24 TSP instances from TSPLIB show that the proposed algorithm outperforms some well-known algorithms in the literature in terms of solution quality.
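The hybrid idea can be sketched compactly: an ant-colony tour constructor with pheromone update, a 2-opt local search, and an occasional random segment reversal accepted with a simulated-annealing rule. This is a hedged sketch rather than the paper's algorithm; the parameters, the tiny random instance, and the exact way SA and mutation are combined are assumptions.

```python
# Hedged sketch of a hybrid ACO + SA + local search for TSP. Illustrative only.
import math
import random

def tour_length(tour, d):
    return sum(d[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def construct_tour(d, tau, alpha=1.0, beta=2.0):
    n = len(d)
    start = random.randrange(n)
    tour, unvisited = [start], set(range(n)) - {start}
    while unvisited:
        i = tour[-1]
        weights = [(j, (tau[i][j] ** alpha) * ((1.0 / d[i][j]) ** beta)) for j in unvisited]
        total = sum(w for _, w in weights)
        r, acc = random.uniform(0, total), 0.0
        for j, w in weights:
            acc += w
            if acc >= r:
                break
        tour.append(j)
        unvisited.remove(j)
    return tour

def two_opt(tour, d):
    n, improved = len(tour), True
    while improved:
        improved = False
        for i in range(1, n - 1):
            for k in range(i + 1, n):
                new = tour[:i] + tour[i:k + 1][::-1] + tour[k + 1:]
                if tour_length(new, d) < tour_length(tour, d):
                    tour, improved = new, True
    return tour

def hybrid_aco(d, n_ants=10, iters=50, rho=0.5, T0=10.0):
    n = len(d)
    tau = [[1.0] * n for _ in range(n)]
    best, T = list(range(n)), T0
    for _ in range(iters):
        for _ in range(n_ants):
            tour = two_opt(construct_tour(d, tau), d)
            # SA-style mutation: reverse a random segment, accept if not much worse
            i, k = sorted(random.sample(range(n), 2))
            mutant = tour[:i] + tour[i:k + 1][::-1] + tour[k + 1:]
            delta = tour_length(mutant, d) - tour_length(tour, d)
            if delta < 0 or random.random() < math.exp(-delta / T):
                tour = mutant
            if tour_length(tour, d) < tour_length(best, d):
                best = tour
        # evaporate pheromone, then reinforce along the best tour found so far
        tau = [[(1 - rho) * t for t in row] for row in tau]
        for i in range(n):
            a, b = best[i], best[(i + 1) % n]
            tau[a][b] += 1.0 / tour_length(best, d)
            tau[b][a] = tau[a][b]
        T *= 0.95
    return best, tour_length(best, d)

if __name__ == "__main__":
    pts = [(random.random(), random.random()) for _ in range(12)]
    d = [[math.hypot(px - qx, py - qy) or 1e-9 for qx, qy in pts] for px, py in pts]
    print(hybrid_aco(d))
```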
Validation of an Instrument and Testing Protocol for Measuring the Combinatorial Analysis Schema.
ERIC Educational Resources Information Center
Staver, John R.; Harty, Harold
1979-01-01
Designs a testing situation to examine the presence of combinatorial analysis, to establish construct validity in the use of an instrument, Combinatorial Analysis Behavior Observation Scheme (CABOS), and to investigate the presence of the schema in young adolescents. (Author/GA)
Molecular biomimetics: nanotechnology through biology.
Sarikaya, Mehmet; Tamerler, Candan; Jen, Alex K-Y; Schulten, Klaus; Baneyx, François
2003-09-01
Proteins, through their unique and specific interactions with other macromolecules and inorganics, control structures and functions of all biological hard and soft tissues in organisms. Molecular biomimetics is an emerging field in which hybrid technologies are developed by using the tools of molecular biology and nanotechnology. Taking lessons from biology, polypeptides can now be genetically engineered to specifically bind to selected inorganic compounds for applications in nano- and biotechnology. This review discusses combinatorial biological protocols, that is, bacterial cell surface and phage-display technologies, in the selection of short sequences that have affinity to (noble) metals, semiconducting oxides and other technological compounds. These genetically engineered proteins for inorganics (GEPIs) can be used in the assembly of functional nanostructures. Based on the three fundamental principles of molecular recognition, self-assembly and DNA manipulation, we highlight successful uses of GEPI in nanotechnology.
Some unsolved problems in discrete mathematics and mathematical cybernetics
NASA Astrophysics Data System (ADS)
Korshunov, Aleksei D.
2009-10-01
There are many unsolved problems in discrete mathematics and mathematical cybernetics. Writing a comprehensive survey of such problems involves great difficulties. First, such problems are rather numerous and varied. Second, they differ greatly from each other in the degree of completeness of their solution. Therefore, even a comprehensive survey should not attempt to cover the whole variety of such problems; only the most important and significant problems should be reviewed. An objective choice of which problems to include is quite hard. This paper includes 13 unsolved problems related to combinatorial mathematics and computational complexity theory. The problems selected reflect the author's studies over the past 50 years; for this reason, the choice of the problems reviewed here is, to some extent, subjective. At the same time, these problems are very difficult and quite important for discrete mathematics and mathematical cybernetics. Bibliography: 74 items.
Herrgård, Markus J.
2014-01-01
High-cell-density fermentation for industrial production of chemicals can impose numerous stresses on cells due to high substrate, product, and by-product concentrations; high osmolarity; reactive oxygen species; and elevated temperatures. There is a need to develop platform strains of industrial microorganisms that are more tolerant toward these typical processing conditions. In this study, the growth of six industrially relevant strains of Escherichia coli was characterized under eight stress conditions representative of fed-batch fermentation, and strains W and BL21(DE3) were selected as platforms for transposon (Tn) mutagenesis due to favorable resistance characteristics. Selection experiments, followed by either targeted or genome-wide next-generation-sequencing-based Tn insertion site determination, were performed to identify mutants with improved growth properties under a subset of three stress conditions and two combinations of individual stresses. A subset of the identified loss-of-function mutants were selected for a combinatorial approach, where strains with combinations of two and three gene deletions were systematically constructed and tested for single and multistress resistance. These approaches allowed identification of (i) strain-background-specific stress resistance phenotypes, (ii) novel gene deletion mutants in E. coli that confer single and multistress resistance in a strain-background-dependent manner, and (iii) synergistic effects of multiple gene deletions that confer improved resistance over single deletions. The results of this study underscore the suboptimality and strain-specific variability of the genetic network regulating growth under stressful conditions and suggest that further exploration of the combinatorial gene deletion space in multiple strain backgrounds is needed for optimizing strains for microbial bioprocessing applications. PMID:25085490
NASA Astrophysics Data System (ADS)
Zittersteijn, Michiel; Schildknecht, Thomas; Vananti, Alessandro; Dolado Perez, Juan Carlos; Martinot, Vincent
2016-07-01
Currently several thousand objects are being tracked in the MEO and GEO regions through optical means. With the advent of improved sensors and a heightened interest in the problem of space debris, the number of tracked objects is expected to grow by an order of magnitude in the near future. This research aims to provide a method that treats the correlation and orbit determination problems simultaneously and can efficiently process large data sets with minimal manual intervention. This problem is also known as the Multiple Target Tracking (MTT) problem. The complexity of the MTT problem is defined by its dimension S. Current research tends to focus on the S = 2 MTT problem, because for S = 2 the problem can be solved in polynomial time. However, with S = 2 the decision to associate a set of observations is based on the minimum amount of information, and in ambiguous situations (e.g. satellite clusters) this leads to incorrect associations. The S > 2 MTT problem is an NP-hard combinatorial optimization problem. In previous work an Elitist Genetic Algorithm (EGA) was proposed as a method to approximately solve this problem; it was shown that the EGA is able to find a good approximate solution with polynomial time complexity. The EGA relies on solving the Lambert problem in order to perform the necessary orbit determinations, which means the algorithm is restricted to orbits described by Keplerian motion. The work presented in this paper focuses on the impact that this restriction has on the algorithm performance.
Simulated annealing algorithm for solving chambering student-case assignment problem
NASA Astrophysics Data System (ADS)
Ghazali, Saadiah; Abdul-Rahman, Syariza
2015-12-01
The project assignment problem is a popular practical problem that arises in many settings. Solving it becomes more challenging as the complexity of preferences, the existence of real-world constraints, and the problem size increase. This study focuses on solving a chambering student-case assignment problem, which is classified as a project assignment problem, using a simulated annealing algorithm. The project assignment problem is a hard combinatorial optimization problem, and solving it with a metaheuristic approach is advantageous because a good solution can be returned in a reasonable time. The problem of assigning chambering students to cases has never been addressed in the literature before. In the proposed problem, law graduates must complete their chambering (pupillage) before they are qualified to become legal counsel, so assigning chambering students to cases is critically needed, especially when many preferences are involved. Hence, this study presents a preliminary study of the proposed project assignment problem. The objective is to minimize the total completion time for all students in solving the given cases. A minimum-cost greedy heuristic is employed to construct a feasible initial solution, and the search then proceeds with a simulated annealing algorithm to further improve solution quality. Analysis of the results shows that the proposed simulated annealing algorithm greatly improves the solution constructed by the minimum-cost greedy heuristic. This research thus demonstrates the advantages of solving the project assignment problem with metaheuristic techniques.
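The greedy-start-plus-annealing pipeline described above can be sketched for a generic assignment setting. This is a hedged illustration, not the study's implementation: the cost matrix, swap neighborhood, and cooling schedule are assumptions.

```python
# Hedged sketch: greedy construction followed by simulated annealing for a
# generic student-to-case assignment. All data and parameters are illustrative.
import math
import random

def greedy_start(cost):
    """Minimum-cost greedy construction: each student takes the cheapest free case."""
    n = len(cost)
    free, assign = set(range(n)), [None] * n
    for s in range(n):
        c = min(free, key=lambda j: cost[s][j])
        assign[s] = c
        free.remove(c)
    return assign

def total_cost(assign, cost):
    return sum(cost[s][c] for s, c in enumerate(assign))

def anneal(cost, T0=10.0, cooling=0.995, iters=20000):
    assign = greedy_start(cost)
    best, T = assign[:], T0
    for _ in range(iters):
        s1, s2 = random.sample(range(len(assign)), 2)   # swap two students' cases
        cand = assign[:]
        cand[s1], cand[s2] = cand[s2], cand[s1]
        delta = total_cost(cand, cost) - total_cost(assign, cost)
        if delta < 0 or random.random() < math.exp(-delta / T):
            assign = cand
            if total_cost(assign, cost) < total_cost(best, cost):
                best = assign[:]
        T *= cooling
    return best, total_cost(best, cost)

if __name__ == "__main__":
    random.seed(1)
    cost = [[random.randint(1, 20) for _ in range(8)] for _ in range(8)]
    print(anneal(cost))
```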
Radiation hardness of Ga0.5In0.5P/GaAs tandem solar cells
NASA Technical Reports Server (NTRS)
Kurtz, Sarah R.; Olson, J. M.; Bertness, K. A.; Friedman, D. J.; Kibbler, A.; Cavicchi, B. T.; Krut, D. D.
1991-01-01
The radiation hardness of a two-junction monolithic Ga0.5In0.5P/GaAs cell with tunnel junction interconnect was investigated. Related single junction cells were also studied to identify the origins of the radiation losses. The optimal design of the cell is discussed. The air mass efficiency of an optimized tandem cell after irradiation with 10^15 cm^-2 1 MeV electrons is estimated to be 20 percent using currently available technology.
Combinatorial enzyme technology for the conversion of agricultural fibers to functional properties
USDA-ARS?s Scientific Manuscript database
The concept of combinatorial chemistry has received little attention in agriculture and food research, although its applications in this area were described more than fifteen years ago (1, 2). More recently, interest in the use of combinatorial chemistry in agrochemical discovery has been revitalize...
An Investigation into Post-Secondary Students' Understanding of Combinatorial Questions
ERIC Educational Resources Information Center
Bulone, Vincent William
2017-01-01
The purpose of this dissertation was to study aspects of how post-secondary students understand combinatorial problems. Within this dissertation, I considered understanding through two different lenses: i) student connections to previous problems; and ii) common combinatorial distinctions such as ordered versus unordered and repetitive versus…
ERIC Educational Resources Information Center
Harrison, Don K.; Brown, Dorothy R.
The hard-to-employ, both urban and rural, share common characteristics of inadequate income, slum housing, inferior education, no medical attention, and lack of real job opportunities. The deficiencies dove-tail, and families are often afflicted with all. The picture may seem bleak, but there is optimism in reclamation of the so-called…
PROBABILISTIC CROSS-IDENTIFICATION IN CROWDED FIELDS AS AN ASSIGNMENT PROBLEM
DOE Office of Scientific and Technical Information (OSTI.GOV)
Budavári, Tamás; Basu, Amitabh, E-mail: budavari@jhu.edu, E-mail: basu.amitabh@jhu.edu
2016-10-01
One of the outstanding challenges of cross-identification is multiplicity: detections in crowded regions of the sky are often linked to more than one candidate association of similar likelihood. We map the resulting maximum likelihood partitioning to the fundamental assignment problem of discrete mathematics and efficiently solve the two-way catalog-level matching in the realm of combinatorial optimization using the so-called Hungarian algorithm. We introduce the method, demonstrate its performance in a mock universe where the true associations are known, and discuss the applicability of the new procedure to large surveys.
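The Hungarian-algorithm step can be illustrated with a small sketch. The assignment solver here is scipy's linear_sum_assignment (an assumed implementation choice, not necessarily the authors'), and the toy catalogs and the squared-separation cost standing in for a negative log-likelihood are illustrative.

```python
# Hedged sketch: two-way catalog matching as an assignment problem solved with
# the Hungarian algorithm via scipy. Toy data and cost model are illustrative.
import numpy as np
from scipy.optimize import linear_sum_assignment

rng = np.random.default_rng(0)
cat1 = rng.uniform(0, 1, size=(6, 2))            # (RA, Dec)-like coordinates
cat2 = cat1 + rng.normal(0, 0.01, size=(6, 2))   # noisy re-detections, shuffled
cat2 = cat2[rng.permutation(6)]

# Cost: squared separation stands in for a negative log-likelihood of association.
cost = ((cat1[:, None, :] - cat2[None, :, :]) ** 2).sum(axis=2)

rows, cols = linear_sum_assignment(cost)          # optimal one-to-one matching
for i, j in zip(rows, cols):
    print(f"catalog-1 source {i} <-> catalog-2 source {j}  cost {cost[i, j]:.2e}")
```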
NASA Astrophysics Data System (ADS)
1981-04-01
The main topics discussed were related to nonparametric statistics, plane and antiplane states in finite elasticity, free-boundary-variational inequalities, the numerical solution of free boundary-value problems, discrete and combinatorial optimization, mathematical modelling in fluid mechanics, a survey and comparison regarding thermodynamic theories, invariant and almost invariant subspaces in linear systems with applications to disturbance isolation, nonlinear acoustics, and methods of function theory in the case of partial differential equations, giving particular attention to elliptic problems in the plane.
Experimental realization of a highly secure chaos communication under strong channel noise
NASA Astrophysics Data System (ADS)
Ye, Weiping; Dai, Qionglin; Wang, Shihong; Lu, Huaping; Kuang, Jinyu; Zhao, Zhenfeng; Zhu, Xiangqing; Tang, Guoning; Huang, Ronghuai; Hu, Gang
2004-09-01
A one-way coupled spatiotemporally chaotic map lattice is used to construct a cryptosystem. With the combined application of chaotic computations and conventional algebraic operations, our system has cryptographic properties much better than those obtained by applying known chaotic and conventional methods separately. We have carried out experiments on duplex secure voice communication over a realistic wired Public Switched Telephone Network, applying our chaotic system and the Advanced Encryption Standard (AES), respectively, for cryptography. Our system works stably against strong channel noise where AES fails to work.
Integrated Artificial Intelligence Approaches for Disease Diagnostics.
Vashistha, Rajat; Chhabra, Deepak; Shukla, Pratyoosh
2018-06-01
Mechanocomputational techniques in conjunction with artificial intelligence (AI) are revolutionizing the interpretation of crucial information from medical data and converting it into optimized and organized information for diagnostics. This is possible due to major advances in artificial intelligence, computer-aided diagnostics, virtual assistants, robotic surgery, augmented reality and (AI-based) genome editing technologies. Such techniques serve as products for diagnosing emerging microbial and non-microbial diseases. This article presents a combinatory approach that uses these techniques to provide therapeutic solutions for disease diagnostics.
Xu, Huayong; Yu, Hui; Tu, Kang; Shi, Qianqian; Wei, Chaochun; Li, Yuan-Yuan; Li, Yi-Xue
2013-01-01
We are witnessing rapid progress in the development of methodologies for building combinatorial gene regulatory networks involving both TFs (transcription factors) and miRNAs (microRNAs). A few tools are available to do these jobs, but most of them are not easy to use and are not accessible online. A web server is especially needed to allow users to upload experimental expression datasets and build combinatorial regulatory networks corresponding to their particular contexts. In this work, we compiled putative TF-gene, miRNA-gene and TF-miRNA regulatory relationships from forward-engineering pipelines and curated them as built-in data libraries. We streamlined the R code of our two separate forward- and reverse-engineering algorithms for combinatorial gene regulatory network construction and formalized them as two major functional modules. As a result, we released the cGRNB (combinatorial Gene Regulatory Networks Builder): a web server for constructing combinatorial gene regulatory networks through integrated engineering of seed-matching sequence information and gene expression datasets. The cGRNB enables two major network-building modules, one for MPGE (miRNA-perturbed gene expression) datasets and the other for parallel miRNA/mRNA expression datasets. A miRNA-centered two-layer combinatorial regulatory cascade is the output of the first module, and a comprehensive genome-wide network involving all three types of combinatorial regulation (TF-gene, TF-miRNA, and miRNA-gene) is the output of the second module. Since parallel miRNA/mRNA expression datasets are rapidly accumulating with the advance of next-generation sequencing techniques, cGRNB will be a very useful tool for researchers building combinatorial gene regulatory networks from expression datasets. The cGRNB web server is free and available online at http://www.scbit.org/cgrnb.
Combinatorial effects on clumped isotopes and their significance in biogeochemistry
NASA Astrophysics Data System (ADS)
Yeung, Laurence Y.
2016-01-01
The arrangement of isotopes within a collection of molecules records their physical and chemical histories. Clumped-isotope analysis interrogates these arrangements, i.e., how often rare isotopes are bound together, which in many cases can be explained by equilibrium and/or kinetic isotope fractionation. However, purely combinatorial effects, rooted in the statistics of pairing atoms in a closed system, are also relevant, and not well understood. Here, I show that combinatorial isotope effects are most important when two identical atoms are neighbors on the same molecule (e.g., O2, N2, and D-D clumping in CH4). When the two halves of an atom pair are either assembled with different isotopic preferences or drawn from different reservoirs, combinatorial effects cause depletions in clumped-isotope abundance that are most likely between zero and -1‰, although they could potentially be -10‰ or larger for D-D pairs. These depletions are of similar magnitude, but of opposite sign, to low-temperature equilibrium clumped-isotope effects for many small molecules. Enzymatic isotope-pairing reactions, which can have site-specific isotopic fractionation factors and atom reservoirs, should express this class of combinatorial isotope effect, although it is not limited to biological reactions. Chemical-kinetic isotope effects, which are related to a bond-forming transition state, arise independently and express second-order combinatorial effects related to the abundance of the rare isotope. Heteronuclear moieties (e.g., C–O and C–H) are insensitive to direct combinatorial influences, but secondary combinatorial influences are evident. In general, both combinatorial and chemical-kinetic factors are important for calculating and interpreting clumped-isotope signatures of kinetically controlled reactions. I apply this analytical framework to isotope-pairing reactions relevant to geochemical oxygen, carbon, and nitrogen cycling that may be influenced by combinatorial clumped-isotope effects. These isotopic signatures, manifest as either directly bound isotope 'clumps' or as features of a molecule's isotopic anatomy, are linked to molecular mechanisms and may eventually provide additional information about biogeochemical cycling on environmentally relevant spatial scales.
Optimization of Materials and Interfaces for Spintronic Devices
NASA Astrophysics Data System (ADS)
Clark, Billy
In recent years, spintronic devices have drawn a significant amount of research attention. This interest comes in large part from their ability to enable interesting new technology such as spin-torque-transfer random access memory, or to improve existing technology such as high-signal read heads for hard disk drives. For the former, we worked on improving magnetic tunnel junctions by optimizing their thermal stability using Ta insertion layers in the free layer. We further tried to simplify the design of the MTJ stack by attempting to replace the Co/Pd multilayer with a CoPd alloy. In this dissertation, we detail its development and examine the switching characteristics. Lastly, we look at a highly spin-polarized material, Fe2MnGe, for optimizing hard disk drive read heads.
Zen, Nur Izzati Mohamad; Abd Gani, Siti Salwa; Shamsudin, Rosnah; Masoumi, Hamid Reza Fard
2015-01-01
The usage of soy is increasing year by year, which raises economic concerns because soybean supplies are limited. Therefore, the production of oral tablets containing the nutritious leftover of soymilk production, called okara, as the main ingredient was investigated. The okara tablets were produced using the direct compression method. The percentages of okara, guar gum, microcrystalline cellulose (Avicel PH-101), and maltodextrin influenced the tablets' hardness and friability, which were analyzed using a D-optimal mixture design. The composition of Avicel PH-101 had positive effects in both the hardness and friability tests of the tablets. The maltodextrin and okara compositions had a significant positive effect on tablet hardness, but not on the percentage friability of the tablets. However, guar gum had a negative effect in both physical tests. The optimum tablet formulation was obtained: 47.0% okara, 2.0% guar gum, 35.0% Avicel PH-101, and 14.0% maltodextrin.
The construction of combinatorial manifolds with prescribed sets of links of vertices
NASA Astrophysics Data System (ADS)
Gaifullin, A. A.
2008-10-01
To every oriented closed combinatorial manifold we assign the set (with repetitions) of isomorphism classes of links of its vertices. The resulting transformation \\mathcal{L} is the main object of study in this paper. We pose an inversion problem for \\mathcal{L} and show that this problem is closely related to Steenrod's problem on the realization of cycles and to the Rokhlin-Schwartz-Thom construction of combinatorial Pontryagin classes. We obtain a necessary condition for a set of isomorphism classes of combinatorial spheres to belong to the image of \\mathcal{L}. (Sets satisfying this condition are said to be balanced.) We give an explicit construction showing that every balanced set of isomorphism classes of combinatorial spheres falls into the image of \\mathcal{L} after passing to a multiple set and adding several pairs of the form (Z,-Z), where -Z is the sphere Z with the orientation reversed. Given any singular simplicial cycle \\xi of a space X, this construction enables us to find explicitly a combinatorial manifold M and a map \\varphi\\colon M\\to X such that \\varphi_* \\lbrack M \\rbrack =r[\\xi] for some positive integer r. The construction is based on resolving singularities of \\xi. We give applications of the main construction to cobordisms of manifolds with singularities and cobordisms of simple cells. In particular, we prove that every rational additive invariant of cobordisms of manifolds with singularities admits a local formula. Another application is the construction of explicit (though inefficient) local combinatorial formulae for polynomials in the rational Pontryagin classes of combinatorial manifolds.
ERIC Educational Resources Information Center
Barratt, Barnaby B.
1975-01-01
This study investigated the emergence of combinatorial competence in early adolescence and the effectiveness of a programmed discovery training procedure. Significant increases in combinatorial skill with age were shown; it was found that the expression of this skill was significantly facilitated if problems involved concrete material of low…
Invention as a combinatorial process: evidence from US patents
Youn, Hyejin; Strumsky, Deborah; Bettencourt, Luis M. A.; Lobo, José
2015-01-01
Invention has been commonly conceptualized as a search over a space of combinatorial possibilities. Despite the existence of a rich literature, spanning a variety of disciplines, elaborating on the recombinant nature of invention, we lack a formal and quantitative characterization of the combinatorial process underpinning inventive activity. Here, we use US patent records dating from 1790 to 2010 to formally characterize invention as a combinatorial process. To do this, we treat patented inventions as carriers of technologies and avail ourselves of the elaborate system of technology codes used by the United States Patent and Trademark Office to classify the technologies responsible for an invention's novelty. We find that the combinatorial inventive process exhibits an invariant rate of ‘exploitation’ (refinements of existing combinations of technologies) and ‘exploration’ (the development of new technological combinations). This combinatorial dynamic contrasts sharply with the creation of new technological capabilities—the building blocks to be combined—that has significantly slowed down. We also find that, notwithstanding the very reduced rate at which new technologies are introduced, the generation of novel technological combinations engenders a practically infinite space of technological configurations. PMID:25904530
Seo, Hyung-Min; Jeon, Jong-Min; Lee, Ju Hee; Song, Hun-Suk; Joo, Han-Byul; Park, Sung-Hee; Choi, Kwon-Young; Kim, Yong Hyun; Park, Kyungmoon; Ahn, Jungoh; Lee, Hongweon; Yang, Yung-Hun
2016-01-01
Furfural is a toxic by-product formed during the pretreatment of lignocellulosic biomass. In order to utilize lignocellulosic biomass for isobutanol production, the inhibitory effect of furfural on isobutanol production was investigated, and the combinatorial application of two oxidoreductases, FucO and YqhD, is suggested as an alternative strategy. Furfural decreased cell growth and isobutanol production when only YqhD or FucO was employed as the isobutyraldehyde oxidoreductase. However, combinatorial overexpression of FucO and YqhD could overcome the inhibitory effect of furfural, giving isobutanol production 110% higher than with overexpression of YqhD alone. The combinatorial oxidoreductases increased the furfural detoxification rate 2.1-fold and also accelerated glucose consumption 1.4-fold. Compared with another known system for increasing furfural tolerance, the membrane-bound transhydrogenase (pntAB), the combinatorial aldehyde oxidoreductases performed better in terms of cell growth and production. Thus, controlling oxidoreductases is important for producing isobutanol from furfural-containing biomass, and the combinatorial overexpression of FucO and YqhD can be an alternative strategy.
Tumor-targeting peptides from combinatorial libraries*
Liu, Ruiwu; Li, Xiaocen; Xiao, Wenwu; Lam, Kit S.
2018-01-01
Cancer is one of the major and leading causes of death worldwide. Two of the greatest challenges in fighting cancer are early detection and effective treatments with no or minimum side effects. Widespread use of targeted therapies and molecular imaging in clinics requires high affinity, tumor-specific agents as effective targeting vehicles to deliver therapeutics and imaging probes to the primary or metastatic tumor sites. Combinatorial libraries such as phage-display and one-bead one-compound (OBOC) peptide libraries are powerful approaches in discovering tumor-targeting peptides. This review gives an overview of different combinatorial library technologies that have been used for the discovery of tumor-targeting peptides. Examples of tumor-targeting peptides identified from each combinatorial library method will be discussed. Published tumor-targeting peptide ligands and their applications will also be summarized by the combinatorial library methods and their corresponding binding receptors. PMID:27210583
Nonparametric Combinatorial Sequence Models
NASA Astrophysics Data System (ADS)
Wauthier, Fabian L.; Jordan, Michael I.; Jojic, Nebojsa
This work considers biological sequences that exhibit combinatorial structures in their composition: groups of positions of the aligned sequences are "linked" and covary as one unit across sequences. If multiple such groups exist, complex interactions can emerge between them. Sequences of this kind arise frequently in biology but methodologies for analyzing them are still being developed. This paper presents a nonparametric prior on sequences which allows combinatorial structures to emerge and which induces a posterior distribution over factorized sequence representations. We carry out experiments on three sequence datasets which indicate that combinatorial structures are indeed present and that combinatorial sequence models can more succinctly describe them than simpler mixture models. We conclude with an application to MHC binding prediction which highlights the utility of the posterior distribution induced by the prior. By integrating out the posterior our method compares favorably to leading binding predictors.
Dynamic combinatorial libraries: from exploring molecular recognition to systems chemistry.
Li, Jianwei; Nowak, Piotr; Otto, Sijbren
2013-06-26
Dynamic combinatorial chemistry (DCC) is a subset of combinatorial chemistry where the library members interconvert continuously by exchanging building blocks with each other. Dynamic combinatorial libraries (DCLs) are powerful tools for discovering the unexpected and have given rise to many fascinating molecules, ranging from interlocked structures to self-replicators. Furthermore, dynamic combinatorial molecular networks can produce emergent properties at systems level, which provide exciting new opportunities in systems chemistry. In this perspective we will highlight some new methodologies in this field and analyze selected examples of DCLs that are under thermodynamic control, leading to synthetic receptors, catalytic systems, and complex self-assembled supramolecular architectures. Also reviewed are extensions of the principles of DCC to systems that are not at equilibrium and may therefore harbor richer functional behavior. Examples include self-replication and molecular machines.
Joint optimization of maintenance, buffers and machines in manufacturing lines
NASA Astrophysics Data System (ADS)
Nahas, Nabil; Nourelfath, Mustapha
2018-01-01
This article considers a series manufacturing line composed of several machines separated by intermediate buffers of finite capacity. The goal is to find the optimal number of preventive maintenance actions performed on each machine, the optimal selection of machines and the optimal buffer allocation plan that minimize the total system cost, while providing the desired system throughput level. The mean times between failures of all machines are assumed to increase when applying periodic preventive maintenance. To estimate the production line throughput, a decomposition method is used. The decision variables in the formulated optimal design problem are buffer levels, types of machines and times between preventive maintenance actions. Three heuristic approaches are developed to solve the formulated combinatorial optimization problem. The first heuristic consists of a genetic algorithm, the second is based on the nonlinear threshold accepting metaheuristic and the third is an ant colony system. The proposed heuristics are compared and their efficiency is shown through several numerical examples. It is found that the nonlinear threshold accepting algorithm outperforms the genetic algorithm and ant colony system, while the genetic algorithm provides better results than the ant colony system for longer manufacturing lines.
Li, Jin; Tran, Maggie; Siwabessy, Justy
2016-01-01
Spatially continuous predictions of seabed hardness are important baseline environmental information for sustainable management of Australia’s marine jurisdiction. Seabed hardness is often inferred from multibeam backscatter data with unknown accuracy and can be inferred from underwater video footage at limited locations. In this study, we classified the seabed into four classes based on two new seabed hardness classification schemes (i.e., hard90 and hard70). We developed optimal predictive models to predict seabed hardness using random forest (RF) based on the point data of hardness classes and spatially continuous multibeam data. Five feature selection (FS) methods that are variable importance (VI), averaged variable importance (AVI), knowledge informed AVI (KIAVI), Boruta and regularized RF (RRF) were tested based on predictive accuracy. Effects of highly correlated, important and unimportant predictors on the accuracy of RF predictive models were examined. Finally, spatial predictions generated using the most accurate models were visually examined and analysed. This study confirmed that: 1) hard90 and hard70 are effective seabed hardness classification schemes; 2) seabed hardness of four classes can be predicted with a high degree of accuracy; 3) the typical approach used to pre-select predictive variables by excluding highly correlated variables needs to be re-examined; 4) the identification of the important and unimportant predictors provides useful guidelines for further improving predictive models; 5) FS methods select the most accurate predictive model(s) instead of the most parsimonious ones, and AVI and Boruta are recommended for future studies; and 6) RF is an effective modelling method with high predictive accuracy for multi-level categorical data and can be applied to ‘small p and large n’ problems in environmental sciences. Additionally, automated computational programs for AVI need to be developed to increase its computational efficiency and caution should be taken when applying filter FS methods in selecting predictive models. PMID:26890307
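A minimal sketch of the modelling approach follows: a random forest classifier for a four-class hardness label with a simple importance-based feature selection step. The synthetic backscatter-like features, the importance threshold, and the use of scikit-learn are assumptions; they stand in for the multibeam predictors and the AVI/Boruta-style selection used in the paper.

```python
# Hedged sketch: random forest classification of a 4-class hardness label with
# crude importance-based feature selection. Synthetic data, illustrative only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 400
X = rng.normal(size=(n, 8))                      # 8 candidate predictors
y = np.digitize(X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.5, n),
                bins=[-1.0, 0.0, 1.0])           # four hardness classes 0..3

rf = RandomForestClassifier(n_estimators=500, random_state=0)
print("all predictors:", cross_val_score(rf, X, y, cv=5).mean())

# Keep only predictors whose importance exceeds a simple uniform threshold.
rf.fit(X, y)
keep = rf.feature_importances_ > (1.0 / X.shape[1])
print("selected predictors:", np.where(keep)[0])
print("selected only:", cross_val_score(rf, X[:, keep], y, cv=5).mean())
```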
Zhou, Qian-Mei; Chen, Qi-Long; Du, Jia; Wang, Xiu-Feng; Lu, Yi-Yu; Zhang, Hui; Su, Shi-Bing
2014-01-01
In order to explore the synergistic mechanisms of combinatorial treatment using curcumin and mitomycin C (MMC) for breast cancer, MCF-7 breast cancer xenograft experiments were conducted to observe the synergistic effect of combinatorial treatment with curcumin and MMC at various dosages. The synergistic mechanisms of combinatorial treatment using curcumin and MMC on the inhibition of tumor growth were explored by differential gene expression profiling, gene ontology (GO), ingenuity pathway analysis (IPA) and Signal–Net network analysis. The expression levels of selected genes identified by cDNA microarray expression profiling were validated by quantitative RT-PCR (qRT-PCR) and Western blot analysis. The effect of combinatorial treatment on the inhibition of cell growth was observed by MTT assay. Apoptosis was detected by flow cytometric analysis and Hoechst 33258 staining. The combinatorial treatment of 100 mg/kg curcumin and 1.5 mg/kg MMC revealed synergistic inhibition of tumor growth. Among 1501 differentially expressed genes, the expression of 25 genes exhibited an obvious change, and a significant difference in 27 signal pathways was observed (p < 0.05). In addition, Mapk1 (ERK) and Mapk14 (MAPK p38) had more cross-interactions with other genes and showed increases in expression of 8.14- and 11.84-fold, respectively, during the combinatorial treatment with curcumin and MMC compared with the control. Moreover, curcumin can synergistically improve the tumoricidal effect of MMC in another human breast cancer cell line, MDA-MB-231. Apoptosis was significantly induced by the combinatorial treatment (p < 0.05) and significantly inhibited by the ERK inhibitor PD98059 in MCF-7 cells (p < 0.05). The synergistic effect of combinatorial treatment with curcumin and MMC on the induction of apoptosis in breast cancer cells may be via the ERK pathway. PMID:25226537
Combinatorial theory of Macdonald polynomials I: proof of Haglund's formula.
Haglund, J; Haiman, M; Loehr, N
2005-02-22
Haglund recently proposed a combinatorial interpretation of the modified Macdonald polynomials H(mu). We give a combinatorial proof of this conjecture, which establishes the existence and integrality of H(mu). As corollaries, we obtain the cocharge formula of Lascoux and Schutzenberger for Hall-Littlewood polynomials, a formula of Sahi and Knop for Jack's symmetric functions, a generalization of this result to the integral Macdonald polynomials J(mu), a formula for H(mu) in terms of Lascoux-Leclerc-Thibon polynomials, and combinatorial expressions for the Kostka-Macdonald coefficients K(lambda,mu) when mu is a two-column shape.
Learning With Mixed Hard/Soft Pointwise Constraints.
Gnecco, Giorgio; Gori, Marco; Melacci, Stefano; Sanguineti, Marcello
2015-09-01
A learning paradigm is proposed and investigated, in which the classical framework of learning from examples is enhanced by the introduction of hard pointwise constraints, i.e., constraints imposed on a finite set of examples that cannot be violated. Such constraints arise, e.g., when requiring coherent decisions of classifiers acting on different views of the same pattern. The classical examples of supervised learning, which can be violated at the cost of some penalization (quantified by the choice of a suitable loss function) play the role of soft pointwise constraints. Constrained variational calculus is exploited to derive a representer theorem that provides a description of the functional structure of the optimal solution to the proposed learning paradigm. It is shown that such an optimal solution can be represented in terms of a set of support constraints, which generalize the concept of support vectors and open the doors to a novel learning paradigm, called support constraint machines. The general theory is applied to derive the representation of the optimal solution to the problem of learning from hard linear pointwise constraints combined with soft pointwise constraints induced by supervised examples. In some cases, closed-form optimal solutions are obtained.
Signal dimensionality and the emergence of combinatorial structure.
Little, Hannah; Eryılmaz, Kerem; de Boer, Bart
2017-11-01
In language, a small number of meaningless building blocks can be combined into an unlimited set of meaningful utterances. This is known as combinatorial structure. One hypothesis for the initial emergence of combinatorial structure in language is that recombining elements of signals solves the problem of overcrowding in a signal space. Another hypothesis is that iconicity may impede the emergence of combinatorial structure. However, how these two hypotheses relate to each other is not often discussed. In this paper, we explore how signal space dimensionality relates to both overcrowding in the signal space and iconicity. We use an artificial signalling experiment to test whether a signal space and a meaning space having similar topologies will generate an iconic system and whether, when the topologies differ, the emergence of combinatorially structured signals is facilitated. In our experiments, signals are created from participants' hand movements, which are measured using an infrared sensor. We found that participants take advantage of iconic signal-meaning mappings where possible. Further, we use trajectory predictability, measures of variance, and Hidden Markov Models to measure the use of structure within the signals produced and found that when topologies do not match, then there is more evidence of combinatorial structure. The results from these experiments are interpreted in the context of the differences between the emergence of combinatorial structure in different linguistic modalities (speech and sign). Copyright © 2017 Elsevier B.V. All rights reserved.
New Hardness Results for Diophantine Approximation
NASA Astrophysics Data System (ADS)
Eisenbrand, Friedrich; Rothvoß, Thomas
We revisit simultaneous Diophantine approximation, a classical problem from the geometry of numbers which has many applications in algorithms and complexity. The input to the decision version of this problem consists of a rational vector α ∈ ℚ^n, an error bound ɛ and a denominator bound N ∈ ℕ+. One has to decide whether there exists an integer, called the denominator, Q with 1 ≤ Q ≤ N such that the distance of each number Q·α_i to its nearest integer is bounded by ɛ. Lagarias has shown that this problem is NP-complete and optimization versions have been shown to be hard to approximate within a factor n^(c/log log n) for some constant c > 0. We strengthen the existing hardness results and show that the optimization problem of finding the smallest denominator Q ∈ ℕ+ such that the distances of Q·α_i to the nearest integer are bounded by ɛ is hard to approximate within a factor 2^n unless P = NP.
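The problem statement (not the hardness result) can be made concrete with a brute-force illustration, feasible only for small N: search for a denominator Q ≤ N that brings every Q·α_i within ɛ of an integer. The instance below is illustrative.

```python
# Hedged illustration of the simultaneous Diophantine approximation problem:
# exhaustive search over denominators Q <= N using exact rational arithmetic.
from fractions import Fraction

def nearest_int_dist(x: Fraction) -> Fraction:
    q = round(x)                       # Fraction supports rounding to the nearest integer
    return abs(x - q)

def smallest_denominator(alphas, eps, N):
    for Q in range(1, N + 1):
        if all(nearest_int_dist(Q * a) <= eps for a in alphas):
            return Q                   # smallest feasible denominator
    return None                        # no denominator up to N works

alphas = [Fraction(5, 17), Fraction(9, 23)]
print(smallest_denominator(alphas, eps=Fraction(1, 20), N=400))
```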
ERIC Educational Resources Information Center
Stevens, Victoria
2014-01-01
The author considers combinatory play as an intersection between creativity, play, and neuroaesthetics. She discusses combinatory play as vital to the creative process in art and science, particularly with regard to the incubation of new ideas. She reviews findings from current neurobiological research and outlines the way that the brain activates…
Kim, Hyo Jin; Turner, Timothy Lee; Jin, Yong-Su
2013-11-01
Recent advances in metabolic engineering have enabled microbial factories to compete with conventional processes for producing fuels and chemicals. Both rational and combinatorial approaches coupled with synthetic and systematic tools play central roles in metabolic engineering to create and improve a selected microbial phenotype. Compared to knowledge-based rational approaches, combinatorial approaches exploiting biological diversity and high-throughput screening have been demonstrated as more effective tools for improving various phenotypes of interest. In particular, identification of unprecedented targets to rewire metabolic circuits for maximizing yield and productivity of a target chemical has been made possible. This review highlights general principles and the features of the combinatorial approaches using various libraries to implement desired phenotypes for strain improvement. In addition, recent applications that harnessed the combinatorial approaches to produce biofuels and biochemicals will be discussed. Copyright © 2013 Elsevier Inc. All rights reserved.
Tumor-targeting peptides from combinatorial libraries.
Liu, Ruiwu; Li, Xiaocen; Xiao, Wenwu; Lam, Kit S
2017-02-01
Cancer is one of the major and leading causes of death worldwide. Two of the greatest challenges in fighting cancer are early detection and effective treatments with no or minimum side effects. Widespread use of targeted therapies and molecular imaging in clinics requires high affinity, tumor-specific agents as effective targeting vehicles to deliver therapeutics and imaging probes to the primary or metastatic tumor sites. Combinatorial libraries such as phage-display and one-bead one-compound (OBOC) peptide libraries are powerful approaches in discovering tumor-targeting peptides. This review gives an overview of different combinatorial library technologies that have been used for the discovery of tumor-targeting peptides. Examples of tumor-targeting peptides identified from each combinatorial library method will be discussed. Published tumor-targeting peptide ligands and their applications will also be summarized by the combinatorial library methods and their corresponding binding receptors. Copyright © 2017. Published by Elsevier B.V.
CTLA-4 blockade plus adoptive T cell transfer promotes optimal melanoma immunity in mice
Mahvi, David A.; Meyers, Justin V.; Tatar, Andrew J.; Contreras, Amanda; Suresh, M.; Leverson, Glen E.; Sen, Siddhartha; Cho, Clifford S.
2014-01-01
Immunotherapeutic approaches to the treatment of advanced melanoma have relied on strategies that augment the responsiveness of endogenous tumor-specific T cell populations (e.g., CTLA-4 blockade-mediated checkpoint inhibition) or introduce exogenously-prepared tumor-specific T cell populations (e.g., adoptive cell transfer). Although both approaches have shown considerable promise, response rates to these therapies remain suboptimal. We hypothesized that a combinatorial approach to immunotherapy using both CTLA-4 blockade and non-lymphodepletional adoptive cell transfer could offer additive therapeutic benefit. C57BL/6 mice were inoculated with syngeneic B16F10 melanoma tumors transfected to express low levels of the lymphocytic choriomeningitis virus peptide GP33 (B16GP33), and treated with no immunotherapy, CTLA-4 blockade, adoptive cell transfer, or combination immunotherapy of CTLA-4 blockade with adoptive cell transfer. Combination immunotherapy resulted in optimal control of B16GP33 melanoma tumors. Combination immunotherapy promoted a stronger local immune response reflected by enhanced tumor-infiltrating lymphocyte populations, as well as stronger systemic immune responses reflected by more potent tumor antigen-specific T cell activity in splenocytes. In addition, whereas both CTLA-4 blockade and combination immunotherapy were able to promote long-term immunity against B16GP33 tumors, only combination immunotherapy was capable of promoting immunity against parental B16F10 tumors as well. Our findings suggest that a combinatorial approach using CTLA-4 blockade with non-lymphodepletional adoptive cell transfer may promote additive endogenous and exogenous T cell activities that enable greater therapeutic efficacy in the treatment of melanoma. PMID:25658614
Advanced fitness landscape analysis and the performance of memetic algorithms.
Merz, Peter
2004-01-01
Memetic algorithms (MAs) have proved to be very effective in combinatorial optimization. This paper offers explanations as to why this is so by investigating the performance of MAs in terms of efficiency and effectiveness. A special class of MAs is used to discuss efficiency and effectiveness for local search and evolutionary meta-search. It is shown that the efficiency of MAs can be increased drastically with the use of domain knowledge. However, effectiveness highly depends on the structure of the problem. As is well-known, identifying this structure is made easier with the notion of fitness landscapes: the local properties of the fitness landscape strongly influence the effectiveness of the local search while the global properties strongly influence the effectiveness of the evolutionary meta-search. This paper also introduces new techniques for analyzing the fitness landscapes of combinatorial problems; these techniques focus on the investigation of random walks in the fitness landscape starting at locally optimal solutions as well as on the escape from the basins of attractions of current local optima. It is shown for NK-landscapes and landscapes of the unconstrained binary quadratic programming problem (BQP) that a random walk to another local optimum can be used to explain the efficiency of recombination in comparison to mutation. Moreover, the paper shows that other aspects like the size of the basins of attractions of local optima are important for the efficiency of MAs and a local search escape analysis is proposed. These simple analysis techniques have several advantages over previously proposed statistical measures and provide valuable insight into the behaviour of MAs on different kinds of landscapes.
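One of the analysis ideas mentioned above (a random walk starting at a local optimum, followed by renewed local search) can be sketched on a small NK-landscape. This is a hedged illustration: the NK construction, the bit-flip hill climber, and the walk length are assumptions and not the paper's exact setup.

```python
# Hedged sketch: escape analysis on a toy NK-landscape. Start at a local
# optimum, random-walk a few bit flips away, re-run local search, and measure
# the Hamming distance between the two local optima. Illustrative only.
import random

def make_nk(n=16, k=3, seed=0):
    rng = random.Random(seed)
    neighbors = [rng.sample([j for j in range(n) if j != i], k) for i in range(n)]
    tables = [{} for _ in range(n)]
    def fitness(x):
        total = 0.0
        for i in range(n):
            key = (x[i],) + tuple(x[j] for j in neighbors[i])
            if key not in tables[i]:
                tables[i][key] = rng.random()   # lazily filled, then memoized
            total += tables[i][key]
        return total / n
    return fitness, n

def hill_climb(x, fitness):
    improved = True
    while improved:
        improved = False
        for i in range(len(x)):
            y = x[:]; y[i] ^= 1
            if fitness(y) > fitness(x):
                x, improved = y, True
    return x

def hamming(a, b):
    return sum(u != v for u, v in zip(a, b))

fitness, n = make_nk()
start = hill_climb([random.randint(0, 1) for _ in range(n)], fitness)
walk = start[:]
for _ in range(4):                      # short random walk away from the optimum
    walk[random.randrange(n)] ^= 1
end = hill_climb(walk, fitness)
print("walk distance:", hamming(start, walk), "-> distance between optima:", hamming(start, end))
```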
NASA Astrophysics Data System (ADS)
Fazli Shahri, Hamid Reza; Mahdavinejad, Ramezanali
2018-02-01
Thermal-based processes with a Gaussian heat source often produce excessive temperatures which can leave thermally affected layers in specimens. Therefore, the temperature distribution and the Heat Affected Zone (HAZ) of materials are two critical factors influenced by different process parameters. Measuring the HAZ thickness and temperature distribution within such processes is both difficult and expensive. This research aims at gaining valuable knowledge of these factors by predicting the process through a novel combinatory model. In this study, an integrated Artificial Neural Network (ANN) and genetic algorithm (GA) was used to predict the HAZ and temperature distribution of the specimens. To this end, a series of experiments based on a full factorial design was first conducted by applying a Gaussian heat flux to Ti-6Al-4V, and the temperature of the specimen was measured by infrared thermography. The HAZ width of each sample was investigated by measuring the microhardness. Secondly, the experimental data were used to create a GA-ANN model. The efficiency of the GA in designing and optimizing the architecture of the ANN was investigated: the GA was used to determine the optimal number of neurons in the hidden layer and the learning rate and momentum coefficients of both the output and hidden layers of the ANN. Finally, the reliability of the models was assessed against the experimental results and statistical indicators. The results demonstrated that the combinatory model predicted the HAZ and temperature more effectively than a trial-and-error ANN model.
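The GA-over-ANN-hyperparameters idea can be sketched as follows. This is a hedged illustration, not the authors' model: scikit-learn's MLPRegressor stands in for their network, the synthetic data stands in for the thermography/HAZ measurements, and the GA operators and ranges are assumptions.

```python
# Hedged sketch: a tiny GA searches over hidden-layer size, learning rate, and
# momentum of an MLP regressor, scored by cross-validation. Illustrative only.
import random
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 3))            # e.g. heat-flux parameters (synthetic)
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + 0.1 * rng.normal(size=200)

def fitness(genome):
    hidden, lr, momentum = genome
    model = MLPRegressor(hidden_layer_sizes=(hidden,), solver="sgd",
                         learning_rate_init=lr, momentum=momentum,
                         max_iter=500, random_state=0)
    return cross_val_score(model, X, y, cv=3, scoring="r2").mean()

def random_genome():
    return (random.randint(2, 30), 10 ** random.uniform(-3, -1), random.uniform(0.1, 0.95))

random.seed(0)
pop = [random_genome() for _ in range(8)]
for _ in range(10):                               # a few GA generations
    pop.sort(key=fitness, reverse=True)
    parents = pop[:4]
    children = []
    for _ in range(4):
        a, b = random.sample(parents, 2)
        child = tuple(random.choice(pair) for pair in zip(a, b))   # uniform crossover
        if random.random() < 0.3:                                  # mutate the size gene
            child = (random.randint(2, 30), child[1], child[2])
        children.append(child)
    pop = parents + children
best = max(pop, key=fitness)
print("best (hidden, lr, momentum):", best, "R2:", round(fitness(best), 3))
```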
A high-level language for rule-based modelling.
Pedersen, Michael; Phillips, Andrew; Plotkin, Gordon D
2015-01-01
Rule-based languages such as Kappa excel in their support for handling the combinatorial complexities prevalent in many biological systems, including signalling pathways. But Kappa provides little structure for organising rules, and large models can therefore be hard to read and maintain. This paper introduces a high-level, modular extension of Kappa called LBS-κ. We demonstrate the constructs of the language through examples and three case studies: a chemotaxis switch ring, a MAPK cascade, and an insulin signalling pathway. We then provide a formal definition of LBS-κ through an abstract syntax and a translation to plain Kappa. The translation is implemented in a compiler tool which is available as a web application. We finally demonstrate how to increase the expressivity of LBS-κ through embedded scripts in a general-purpose programming language, a technique which we view as generally applicable to other domain specific languages.
Hard X-ray Imaging for Measuring Laser Absorption Spatial Profiles on the National Ignition Facility
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dewald, E L; Jones, O S; Landen, O L
2006-04-25
Hard x-ray ("thin wall") imaging will be employed on the National Ignition Facility (NIF) to spatially locate laser beam energy deposition regions on the hohlraum walls in indirect drive Inertial Confinement Fusion (ICF) experiments, relevant for ICF symmetry tuning. Based on time resolved imaging of the hard x-ray emission of the laser spots, this method will be used to infer hohlraum wall motion due to x-ray and laser ablation and any beam refraction caused by plasma density gradients. In optimizing this measurement, issues that have to be addressed are hard x-ray visibility during the entire ignition laser pulse with intensities ranging from 10^13 to 10^15 W/cm^2, as well as simultaneous visibility of the inner and the outer laser drive cones. In this work we will compare the hard x-ray emission calculated by LASNEX and analytical modeling with thin wall imaging data recorded previously on Omega and during the first hohlraum experiments on NIF. Based on these calculations and comparisons the thin wall imaging will be optimized for ICF/NIF experiments.
The application of artificial intelligence in the optimal design of mechanical systems
NASA Astrophysics Data System (ADS)
Poteralski, A.; Szczepanik, M.
2016-11-01
The paper is devoted to new computational techniques in mechanical optimization, where one tries to study, model, analyze and optimize very complex phenomena for which the more precise scientific tools of the past were incapable of giving a low-cost and complete solution. Soft computing methods differ from conventional (hard) computing in that they are tolerant of imprecision, uncertainty, partial truth and approximation. The paper deals with the application of bio-inspired methods, such as evolutionary algorithms (EA), artificial immune systems (AIS) and particle swarm optimizers (PSO), to optimization problems. The structures considered in this work are analyzed by the finite element method (FEM), the boundary element method (BEM) and the method of fundamental solutions (MFS). The bio-inspired methods are applied to optimize the shape, topology and material properties of 2D, 3D and coupled 2D/3D structures, to optimize thermomechanical structures, to optimize parameters of composite structures modeled by the FEM, to optimize elastic vibrating systems, to identify the material constants of piezoelectric materials modeled by the BEM, and to identify parameters in an acoustics problem modeled by the MFS.
Morphological Constraints on Cerebellar Granule Cell Combinatorial Diversity.
Gilmer, Jesse I; Person, Abigail L
2017-12-13
Combinatorial expansion by the cerebellar granule cell layer (GCL) is fundamental to theories of cerebellar contributions to motor control and learning. Granule cells (GrCs) sample approximately four mossy fiber inputs and are thought to form a combinatorial code useful for pattern separation and learning. We constructed a spatially realistic model of the cerebellar GCL and examined how GCL architecture contributes to GrC combinatorial diversity. We found that GrC combinatorial diversity saturates quickly as mossy fiber input diversity increases, and that this saturation is in part a consequence of short dendrites, which limit access to diverse inputs and favor dense sampling of local inputs. This local sampling also produced GrCs that were combinatorially redundant, even when input diversity was extremely high. In addition, we found that mossy fiber clustering, which is a common anatomical pattern, also led to increased redundancy of GrC input combinations. We related this redundancy to hypothesized roles of temporal expansion of GrC information encoding in service of learned timing, and we show that GCL architecture produces GrC populations that support both temporal and combinatorial expansion. Finally, we used novel anatomical measurements from mice of either sex to inform modeling of sparse and filopodia-bearing mossy fibers, finding that these circuit features uniquely contribute to enhancing GrC diversification and redundancy. Our results complement information theoretic studies of granule layer structure and provide insight into the contributions of granule layer anatomical features to afferent mixing. SIGNIFICANCE STATEMENT Cerebellar granule cells are among the simplest neurons, with tiny somata and, on average, just four dendrites. These characteristics, along with their dense organization, inspired influential theoretical work on the granule cell layer as a combinatorial expander, where each granule cell represents a unique combination of inputs. Despite the centrality of these theories to cerebellar physiology, the degree of expansion supported by anatomically realistic patterns of inputs is unknown. Using modeling and anatomy, we show that realistic input patterns constrain combinatorial diversity by producing redundant combinations, which nevertheless could support temporal diversification of like combinations, suitable for learned timing. Our study suggests a neural substrate for producing high levels of both combinatorial and temporal diversity in the granule cell layer. Copyright © 2017 the authors 0270-6474/17/3712153-14$15.00/0.
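The saturation-and-redundancy argument lends itself to a toy calculation in the spirit of the model described here. This is a sketch only: the 2-D geometry, cell counts and dendrite radii below are arbitrary illustrative values rather than the paper's anatomically constrained parameters. Short dendrites restrict each granule cell to a handful of nearby rosettes, so many cells end up with identical four-input combinations; letting cells sample the whole field removes most of that redundancy.

# Toy model: count unique 4-input combinations under local vs. global sampling.
import numpy as np

rng = np.random.default_rng(1)
n_mf, n_grc, dendrites = 300, 5000, 4
mf_xy  = rng.uniform(0, 100, size=(n_mf, 2))    # mossy fiber rosette positions
grc_xy = rng.uniform(0, 100, size=(n_grc, 2))   # granule cell positions

def unique_combos(radius):
    combos = set()
    for g in grc_xy:
        d = np.linalg.norm(mf_xy - g, axis=1)
        candidates = np.flatnonzero(d < radius)
        if len(candidates) < dendrites:
            candidates = np.argsort(d)[:dendrites]          # fall back to nearest fibers
        pick = rng.choice(candidates, size=dendrites, replace=False)
        combos.add(tuple(sorted(pick.tolist())))
    return len(combos)

for r in (8, 15, 30, 150):                                  # 150 ~ sampling the whole field
    print(f"dendritic reach {r:>3}: {unique_combos(r)} unique combinations "
          f"across {n_grc} granule cells")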
Incorporation of β-glucans in meat emulsions through an optimal mixture modeling systems.
Vasquez Mejia, Sandra M; de Francisco, Alicia; Manique Barreto, Pedro L; Damian, César; Zibetti, Andre Wüst; Mahecha, Hector Suárez; Bohrer, Benjamin M
2018-09-01
The effects of β-glucans (βG) in beef emulsions with carrageenan and starch were evaluated using an optimal mixture modeling system. The best mathematical models to describe the cooking loss, color, and textural profile analysis (TPA) were selected and optimized. The cubic models were better at describing the cooking loss, color, and TPA parameters, with the exception of springiness. Emulsions with greater levels of βG and starch had less cooking loss (<1%), intermediate L* (>54 and <62), and greater hardness, cohesiveness and springiness values. Subsequently, during the optimization phase, the use of carrageenan was eliminated. The optimized emulsion contained 3.13 ± 0.11% βG, which could cover the recommended daily intake of βG. However, the hardness of the optimized emulsion was greater (60,224 ± 1025 N) than expected. The optimized emulsion had a homogeneous structure and normal thermal behavior by DSC and allowed for the manufacture of products with high amounts of βG and the desired functional attributes. Copyright © 2018 Elsevier Ltd. All rights reserved.
Fuzzy multiobjective models for optimal operation of a hydropower system
NASA Astrophysics Data System (ADS)
Teegavarapu, Ramesh S. V.; Ferreira, André R.; Simonovic, Slobodan P.
2013-06-01
Optimal operation models for a hydropower system using new fuzzy multiobjective mathematical programming models are developed and evaluated in this study. The models (i) use mixed integer nonlinear programming (MINLP) with binary variables and (ii) integrate a new turbine unit commitment formulation along with water quality constraints used to evaluate reservoir downstream impairment. The Reardon method, used in the solution of genetic algorithm optimization problems, forms the basis for the development of a new fuzzy multiobjective hydropower system optimization model through the creation of Reardon-type fuzzy membership functions. The models are applied to a real-life hydropower reservoir system in Brazil. Genetic Algorithms (GAs) are used to (i) solve the optimization formulations while avoiding the computational intractability and combinatorial problems associated with binary variables in unit commitment, (ii) efficiently address the Reardon method formulations, and (iii) deal with the locally optimal solutions obtained from traditional gradient-based solvers. The decision maker's preferences are incorporated within the fuzzy mathematical programming formulations to obtain compromise operating rules for a multiobjective reservoir operation problem dominated by the conflicting goals of energy production, water quality and conservation releases. The results provide insight into the compromise operation rules obtained using the new Reardon fuzzy multiobjective optimization framework and confirm its applicability to a variety of multiobjective water resources problems.
2D photonic crystal complete band gap search using a cyclic cellular automaton refination
NASA Astrophysics Data System (ADS)
González-García, R.; Castañón, G.; Hernández-Figueroa, H. E.
2014-11-01
We present a refination method based on a cyclic cellular automaton (CCA) that simulates a crystallization-like process, aided with a heuristic evolutionary method called differential evolution (DE) used to perform an ordered search of full photonic band gaps (FPBGs) in a 2D photonic crystal (PC). The solution is proposed as a combinatorial optimization of the elements in a binary array. These elements represent the existence or absence of a dielectric material surrounded by air, thus representing a general geometry whose search space is defined by the number of elements in such array. A block-iterative frequency-domain method was used to compute the FPBGs on a PC, when present. DE has proved to be useful in combinatorial problems and we also present an implementation feature that takes advantage of the periodic nature of PCs to enhance the convergence of this algorithm. Finally, we used this methodology to find a PC structure with a 19% bandgap-to-midgap ratio without requiring previous information of suboptimal configurations and we made a statistical study of how it is affected by disorder in the borders of the structure compared with a previous work that uses a genetic algorithm.
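The evolutionary part of such a search can be sketched as standard differential evolution over a binarized pixel array. Assumptions: a toy objective replaces the block-iterative band-structure computation (which in the actual workflow would return the full-photonic-band-gap figure of merit), the DE/rand/1/bin settings are generic rather than the paper's tuned values, and the CCA refinement stage is omitted.

# Differential evolution over a binary dielectric pattern (toy objective).
import numpy as np

rng = np.random.default_rng(2)
n_pix = 16 * 16                                         # 16x16 unit-cell pixels: 1 = dielectric, 0 = air
target = (rng.uniform(size=n_pix) > 0.5).astype(int)    # toy "optimal" pattern

def objective(bits):
    # Placeholder for the band-gap figure of merit computed by a solver.
    return -np.abs(bits - target).sum()

def binarize(x):
    return (x > 0.5).astype(int)

NP, F, CR, GENS = 40, 0.7, 0.9, 300
pop = rng.uniform(size=(NP, n_pix))
fit = np.array([objective(binarize(x)) for x in pop])

for _ in range(GENS):
    for i in range(NP):
        a, b, c = rng.choice([j for j in range(NP) if j != i], size=3, replace=False)
        mutant = pop[a] + F * (pop[b] - pop[c])          # DE/rand/1 mutation
        cross = rng.uniform(size=n_pix) < CR
        cross[rng.integers(n_pix)] = True                # guarantee at least one gene
        trial = np.where(cross, mutant, pop[i])
        f_trial = objective(binarize(trial))
        if f_trial >= fit[i]:                            # greedy selection
            pop[i], fit[i] = trial, f_trial

print("best figure of merit found:", fit.max())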
A combinatorial approach to protein docking with flexible side chains.
Althaus, Ernst; Kohlbacher, Oliver; Lenhof, Hans-Peter; Müller, Peter
2002-01-01
Rigid-body docking approaches are not sufficient to predict the structure of a protein complex from the unbound (native) structures of the two proteins. Accounting for side chain flexibility is an important step towards fully flexible protein docking. This work describes an approach that allows conformational flexibility for the side chains while keeping the protein backbone rigid. Starting from candidates created by a rigid-docking algorithm, we demangle the side chains of the docking site, thus creating reasonable approximations of the true complex structure. These structures are ranked with respect to the binding free energy. We present two new techniques for side chain demangling. Both approaches are based on a discrete representation of the side chain conformational space by the use of a rotamer library. This leads to a combinatorial optimization problem. For the solution of this problem, we propose a fast heuristic approach and an exact, albeit slower, method that uses branch-and-cut techniques. As a test set, we use the unbound structures of three proteases and the corresponding protein inhibitors. For each of the examples, the highest-ranking conformation produced was a good approximation of the true complex structure.
A ripple-spreading genetic algorithm for the aircraft sequencing problem.
Hu, Xiao-Bing; Di Paolo, Ezequiel A
2011-01-01
When genetic algorithms (GAs) are applied to combinatorial problems, permutation representations are usually adopted. As a result, such GAs are often confronted with feasibility and memory-efficiency problems. With the aircraft sequencing problem (ASP) as a study case, this paper reports on a novel binary-representation-based GA scheme for combinatorial problems. Unlike existing GAs for the ASP, which typically use permutation representations based on aircraft landing order, the new GA introduces a novel ripple-spreading model which transforms the original landing-order-based ASP solutions into value-based ones. In the new scheme, arriving aircraft are projected as points into an artificial space. A deterministic method inspired by the natural phenomenon of ripple-spreading on liquid surfaces is developed, which uses a few parameters as input to connect points on this space to form a landing sequence. A traditional GA, free of feasibility and memory-efficiency problems, can then be used to evolve the ripple-spreading related parameters in order to find an optimal sequence. Since the ripple-spreading model is the centerpiece of the new algorithm, it is called the ripple-spreading GA (RSGA). The advantages of the proposed RSGA are illustrated by extensive comparative studies for the case of the ASP.
Zeng, Jianyang; Zhou, Pei; Donald, Bruce Randall
2011-01-01
One bottleneck in NMR structure determination lies in the laborious and time-consuming process of side-chain resonance and NOE assignments. Compared to the well-studied backbone resonance assignment problem, automated side-chain resonance and NOE assignments are relatively less explored. Most NOE assignment algorithms require nearly complete side-chain resonance assignments from a series of through-bond experiments such as HCCH-TOCSY or HCCCONH. Unfortunately, these TOCSY experiments perform poorly on large proteins. To overcome this deficiency, we present a novel algorithm, called NASCA (NOE Assignment and Side-Chain Assignment), to automate both side-chain resonance and NOE assignments and to perform high-resolution protein structure determination in the absence of any explicit through-bond experiment to facilitate side-chain resonance assignment, such as HCCH-TOCSY. After casting the assignment problem into a Markov Random Field (MRF), NASCA extends and applies combinatorial protein design algorithms to compute optimal assignments that best interpret the NMR data. The MRF captures the contact map information of the protein derived from NOESY spectra, exploits the backbone structural information determined by RDCs, and considers all possible side-chain rotamers. The complexity of the combinatorial search is reduced by using a dead-end elimination (DEE) algorithm, which prunes side-chain resonance assignments that are provably not part of the optimal solution. Then an A* search algorithm is employed to find a set of optimal side-chain resonance assignments that best fit the NMR data. These side-chain resonance assignments are then used to resolve the NOE assignment ambiguity and compute high-resolution protein structures. Tests on five proteins show that NASCA assigns resonances for more than 90% of side-chain protons, and achieves about 80% correct assignments. The final structures computed using the NOE distance restraints assigned by NASCA have backbone RMSD 0.8 – 1.5 Å from the reference structures determined by traditional NMR approaches. PMID:21706248
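The dead-end elimination step mentioned in this abstract is easiest to see in its classic single-rotamer form, shown here in standard rotamer notation as background; NASCA applies the same idea to side-chain resonance assignments scored against the NMR data, so the energies below should be read as illustrative pseudo-energies rather than the paper's exact terms. A rotamer $i_r$ at residue $i$ can be pruned whenever some competing rotamer $i_t$ satisfies

\[
E(i_r) + \sum_{j \neq i} \min_{s} E(i_r, j_s) \; > \; E(i_t) + \sum_{j \neq i} \max_{s} E(i_t, j_s),
\]

since any complete assignment containing $i_r$ could then be strictly improved by substituting $i_t$, so $i_r$ provably cannot appear in the global optimum; the surviving candidates are what the subsequent A* search enumerates.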
DOE Office of Scientific and Technical Information (OSTI.GOV)
Siol, Sebastian; Dhakal, Tara P.; Gudavalli, Ganesh S.
High-throughput computational and experimental techniques have been used in the past to accelerate the discovery of promising new solar cell materials. An important part of the development of novel thin film solar cell technologies, which is still considered a bottleneck for both theory and experiment, is the search for alternative interfacial contact (buffer) layers. The research and development of contact materials is difficult due to the inherent complexity that arises from their interactions at the interface with the absorber. A promising alternative to the commonly used CdS buffer layer in thin film solar cells that contain absorbers with lower electron affinity can be found in β-In2S3. However, the synthesis conditions for the sputter deposition of this material are not well established. Here, In2S3 is investigated as a solar cell contact material utilizing a high-throughput combinatorial screening of the temperature-flux parameter space, followed by a number of spatially resolved characterization techniques. It is demonstrated that, by tuning the sulfur partial pressure, phase-pure β-In2S3 could be deposited over a broad range of substrate temperatures between 500 °C and ambient temperature. Combinatorial photovoltaic device libraries with an Al/ZnO/In2S3/Cu2ZnSnS4/Mo/SiO2 structure were built at optimal processing conditions to investigate the feasibility of the sputtered In2S3 buffer layers and of an accelerated optimization of the device structure. The performance of the resulting In2S3/Cu2ZnSnS4 photovoltaic devices is on par with CdS/Cu2ZnSnS4 reference solar cells, with similar values for short circuit currents and open circuit voltages, despite the overall quite low efficiency of the devices (~2%). Overall, these results demonstrate how a high-throughput experimental approach can be used to accelerate the development of contact materials and facilitate the optimization of thin film solar cell devices.
NASA Astrophysics Data System (ADS)
Burello, E.; Bologa, C.; Frecer, V.; Miertus, S.
Combinatorial chemistry and technologies have been developed to a stage where synthetic schemes are available for the generation of a large variety of organic molecules. The innovative concept of combinatorial design assumes that screening a large and diverse library of compounds will increase the probability of finding an active analogue among the compounds tested. Since the rate at which libraries are screened for activity currently constitutes a limitation to the use of combinatorial technologies, it is important to be selective about the number of compounds to be synthesized. Early experience with combinatorial chemistry indicated that chemical diversity alone did not result in a significant increase in the number of generated lead compounds. Emphasis has therefore been increasingly put on the use of computer-assisted combinatorial chemical techniques. Computational methods are valuable in the design of virtual libraries of molecular models. Selection strategies based on computed physicochemical properties of the models or of a target compound are introduced to reduce the time and costs of library synthesis and screening. In addition, computational structure-based library focusing methods can be used to perform in silico screening of the activity of compounds against a target receptor by docking the ligands into the receptor model. Three case studies are discussed dealing with the design of targeted combinatorial libraries of inhibitors of HIV-1 protease, P. falciparum plasmepsin and human urokinase as potential antiviral, antimalarial and anticancer drugs. These illustrate library focusing strategies.
Huang, Xiaoqiang; Han, Kehang; Zhu, Yushan
2013-01-01
A systematic optimization model for binding sequence selection in computational enzyme design was developed based on the transition state theory of enzyme catalysis and graph-theoretical modeling. The saddle point on the free energy surface of the reaction system was represented by catalytic geometrical constraints, and the binding energy between the active site and transition state was minimized to reduce the activation energy barrier. The resulting hyperscale combinatorial optimization problem was tackled using a novel heuristic global optimization algorithm, which was inspired and tested by the protein core sequence selection problem. The sequence recapitulation tests on native active sites for two enzyme catalyzed hydrolytic reactions were applied to evaluate the predictive power of the design methodology. The results of the calculation show that most of the native binding sites can be successfully identified if the catalytic geometrical constraints and the structural motifs of the substrate are taken into account. Reliably predicting active site sequences may have significant implications for the creation of novel enzymes that are capable of catalyzing targeted chemical reactions. PMID:23649589
Optimal de novo design of MRM experiments for rapid assay development in targeted proteomics.
Bertsch, Andreas; Jung, Stephan; Zerck, Alexandra; Pfeifer, Nico; Nahnsen, Sven; Henneges, Carsten; Nordheim, Alfred; Kohlbacher, Oliver
2010-05-07
Targeted proteomic approaches such as multiple reaction monitoring (MRM) overcome problems associated with classical shotgun mass spectrometry experiments. Developing MRM quantitation assays can be time consuming, because relevant peptide representatives of the proteins must be found and their retention time and the product ions must be determined. Given the transitions, hundreds to thousands of them can be scheduled into one experiment run. However, it is difficult to select which of the transitions should be included into a measurement. We present a novel algorithm that allows the construction of MRM assays from the sequence of the targeted proteins alone. This enables the rapid development of targeted MRM experiments without large libraries of transitions or peptide spectra. The approach relies on combinatorial optimization in combination with machine learning techniques to predict proteotypicity, retention time, and fragmentation of peptides. The resulting potential transitions are scheduled optimally by solving an integer linear program. We demonstrate that fully automated construction of MRM experiments from protein sequences alone is possible and over 80% coverage of the targeted proteins can be achieved without further optimization of the assay.
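To make the final scheduling step concrete, here is a deliberately tiny integer linear program of the same flavor, written with the PuLP modeling library. The peptide names, scores, budget and coverage constraint are invented for illustration, and the formulation is far simpler than the published one.

# Toy ILP: pick high-scoring peptides under a budget while covering every protein.
import pulp

proteins = {"P1": ["P1_pep1", "P1_pep2", "P1_pep3"],
            "P2": ["P2_pep1", "P2_pep2"]}
score = {"P1_pep1": 0.9, "P1_pep2": 0.4, "P1_pep3": 0.7,
         "P2_pep1": 0.8, "P2_pep2": 0.6}                 # e.g. predicted proteotypicity
max_total = 4                                            # instrument time budget
min_per_protein = 1

prob = pulp.LpProblem("mrm_assay_design", pulp.LpMaximize)
x = {p: pulp.LpVariable(p, cat="Binary") for p in score}

prob += pulp.lpSum(score[p] * x[p] for p in score)       # maximize total score
prob += pulp.lpSum(x.values()) <= max_total              # scheduling budget
for prot, peps in proteins.items():
    prob += pulp.lpSum(x[p] for p in peps) >= min_per_protein   # protein coverage

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print("selected peptides:", [p for p in score if x[p].value() == 1])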
NASA Astrophysics Data System (ADS)
Moghaddam, Kamran S.; Usher, John S.
2011-07-01
In this article, a new multi-objective optimization model is developed to determine the optimal preventive maintenance and replacement schedules in a repairable and maintainable multi-component system. In this model, the planning horizon is divided into discrete and equally sized periods in which three possible actions must be planned for each component, namely maintenance, replacement, or do nothing. The objective is to determine a plan of actions for each component in the system while simultaneously minimizing the total cost and maximizing the overall system reliability over the planning horizon. Because of the complex, combinatorial and highly nonlinear structure of the mathematical model, two metaheuristic solution methods, a generational genetic algorithm and simulated annealing, are applied to tackle the problem. The Pareto optimal solutions that provide good tradeoffs between the total cost and the overall reliability of the system can be obtained by the solution approach. Such a modeling approach should be useful for maintenance planners and engineers tasked with the problem of developing recommended maintenance plans for complex systems of components.
XY vs X Mixer in Quantum Alternating Operator Ansatz for Optimization Problems with Constraints
NASA Technical Reports Server (NTRS)
Wang, Zhihui; Rubin, Nicholas; Rieffel, Eleanor G.
2018-01-01
Quantum Approximate Optimization Algorithm, further generalized as Quantum Alternating Operator Ansatz (QAOA), is a family of algorithms for combinatorial optimization problems. It is a leading candidate to run on emerging universal quantum computers to gain insight into quantum heuristics. In constrained optimization, penalties are often introduced so that the ground state of the cost Hamiltonian encodes the solution (a standard practice in quantum annealing). An alternative is to choose a mixing Hamiltonian such that the constraint corresponds to a constant of motion and the quantum evolution stays in the feasible subspace. Better performance of the algorithm is speculated due to a much smaller search space. We consider problems with a constant Hamming weight as the constraint. We also compare different methods of generating the generalized W-state, which serves as a natural initial state for the Hamming-weight constraint. Using graph-coloring as an example, we compare the performance of using XY model as a mixer that preserves the Hamming weight with the performance of adding a penalty term in the cost Hamiltonian.
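A compact way to state the feasibility-preservation argument above (written here in generic QAOA notation; the coupling graph $E$ and mixing angle $\beta$ are not tied to the paper's specific constructions) is that the XY mixer

\[
H_{XY} \;=\; \tfrac{1}{2} \sum_{(i,j) \in E} \left( X_i X_j + Y_i Y_j \right)
\]

commutes with the total magnetization $M = \sum_i Z_i$, so evolution under $e^{-i \beta H_{XY}}$ conserves the Hamming weight of computational basis states and the dynamics never leaves the feasible subspace fixed by the constraint. The standard transverse-field mixer $H_X = \sum_i X_i$ does not commute with $M$ and mixes feasibility sectors, which is why the penalty-term approach must instead discourage infeasible states energetically in the cost Hamiltonian.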
A multipopulation PSO based memetic algorithm for permutation flow shop scheduling.
Liu, Ruochen; Ma, Chenlin; Ma, Wenping; Li, Yangyang
2013-01-01
The permutation flow shop scheduling problem (PFSSP) is a production scheduling problem that belongs to the class of hardest combinatorial optimization problems. In this paper, a multipopulation particle swarm optimization (PSO) based memetic algorithm (MPSOMA) is proposed. In the proposed algorithm, the whole particle swarm population is divided into three subpopulations in which each particle evolves by the standard PSO; each subpopulation is then updated by using different local search schemes such as variable neighborhood search (VNS) and an individual improvement scheme (IIS). Then, the best particle of each subpopulation is selected to construct a probabilistic model by using an estimation of distribution algorithm (EDA), and three particles are sampled from the probabilistic model to update the worst individual in each subpopulation. The best particle in the entire particle swarm is used to update the global optimal solution. The proposed MPSOMA is compared with two recently proposed algorithms, namely the PSO based memetic algorithm (PSOMA) and hybrid particle swarm optimization with estimation of distribution algorithm (PSOEDA), on 29 well-known PFSSP instances taken from the OR-Library, and the experimental results show that it is an effective approach for the PFSSP.
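For readers less familiar with the PSO building block used inside each subpopulation, the particles follow the standard textbook update (generic form; the VNS, IIS and EDA components described above are layered on top of it and are specific to the paper):

\[
v_i^{t+1} = w\, v_i^{t} + c_1 r_1 \left( p_i^{\mathrm{best}} - x_i^{t} \right) + c_2 r_2 \left( g^{\mathrm{best}} - x_i^{t} \right), \qquad x_i^{t+1} = x_i^{t} + v_i^{t+1},
\]

where $w$ is the inertia weight, $c_1, c_2$ are acceleration coefficients and $r_1, r_2 \sim U(0,1)$; for a permutation problem such as the PFSSP, the continuous positions are commonly mapped to job sequences with a smallest-position-value (ranked-order) rule.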
AI techniques for a space application scheduling problem
NASA Technical Reports Server (NTRS)
Thalman, N.; Sparn, T.; Jaffres, L.; Gablehouse, D.; Judd, D.; Russell, C.
1991-01-01
Scheduling is a very complex optimization problem which can be categorized as NP-complete. NP-complete problems are quite diverse, as are the algorithms used in searching for an optimal solution. In most cases, the best solutions that can be derived for these combinatorially explosive problems are near-optimal solutions. Due to the complexity of the scheduling problem, artificial intelligence (AI) can aid in solving these types of problems. This work examines some of the factors which make space application scheduling problems difficult and presents a fairly new AI-based technique called tabu search as applied to a real scheduling application. The specific problem is concerned with scheduling solar and stellar observations for the SOLar-STellar Irradiance Comparison Experiment (SOLSTICE) instrument in a constrained environment, producing minimum impact on the other instruments and maximizing target observation times. The SOLSTICE instrument will fly on board the Upper Atmosphere Research Satellite (UARS) in 1991, and a similar instrument will fly on the Earth Observing System (EOS).
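A generic tabu-search skeleton of the kind referred to here looks like the following. This is a sketch under strong simplifications: the toy objective and random swap neighborhood stand in for the SOLSTICE pointing constraints, instrument-impact terms and observation-time rewards.

# Generic tabu search over an observation ordering (toy objective).
import random
from collections import deque

random.seed(0)
n_slots = 20
targets = list(range(n_slots))                       # one candidate observation per slot
value = [random.uniform(0, 1) for _ in targets]

def score(order):
    # Toy objective: reward placing high-value targets early in the timeline.
    return sum(value[t] / (pos + 1) for pos, t in enumerate(order))

current = targets[:]
best_score = score(current)
tabu = deque(maxlen=15)                              # recently used swap moves

for _ in range(2000):
    # Sample candidate swap moves and keep the best admissible one.
    candidates = []
    for _ in range(60):
        i, j = random.sample(range(n_slots), 2)
        move = (min(i, j), max(i, j))
        neighbor = current[:]
        neighbor[i], neighbor[j] = neighbor[j], neighbor[i]
        s = score(neighbor)
        if move not in tabu or s > best_score:       # aspiration criterion
            candidates.append((s, move, neighbor))
    if not candidates:
        continue
    s, move, current = max(candidates)
    tabu.append(move)
    best_score = max(best_score, s)

print("best schedule value found:", round(best_score, 3))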
A New Model for a Carpool Matching Service
Xia, Jizhe; Curtin, Kevin M.; Li, Weihong; Zhao, Yonglong
2015-01-01
Carpooling is an effective means of reducing traffic. A carpool team shares a vehicle for their commute, which reduces the number of vehicles on the road during rush hour periods. Carpooling is officially sanctioned by most governments, and is supported by the construction of high-occupancy vehicle lanes. A number of carpooling services have been designed in order to match commuters into carpool teams, but it is known that the determination of optimal carpool teams is a combinatorially complex problem, and therefore technological solutions are difficult to achieve. In this paper, a model for carpool matching services is proposed, and both optimal and heuristic approaches are tested to find solutions for that model. The results show that different solution approaches are preferred over different ranges of problem instances. Most importantly, it is demonstrated that a new formulation and associated solution procedures can permit the determination of optimal carpool teams and routes. An instantiation of the model is presented (using the street network of Guangzhou city, China) to demonstrate how carpool teams can be determined. PMID:26125552
Pantazes, Robert J; Saraf, Manish C; Maranas, Costas D
2007-08-01
In this paper, we introduce and test two new sequence-based protein scoring systems (i.e. S1, S2) for assessing the likelihood that a given protein hybrid will be functional. By binning together amino acids with similar properties (i.e. volume, hydrophobicity and charge) the scoring systems S1 and S2 allow for the quantification of the severity of mismatched interactions in the hybrids. The S2 scoring system is found to be able to significantly functionally enrich a cytochrome P450 library over other scoring methods. Given this scoring base, we subsequently constructed two separate optimization formulations (i.e. OPTCOMB and OPTOLIGO) for optimally designing protein combinatorial libraries involving recombination or mutations, respectively. Notably, two separate versions of OPTCOMB are generated (i.e. model M1, M2) with the latter allowing for position-dependent parental fragment skipping. Computational benchmarking results demonstrate the efficacy of models OPTCOMB and OPTOLIGO to generate high scoring libraries of a prespecified size.
NASA Technical Reports Server (NTRS)
1971-01-01
The optimal allocation of resources to the national space program over an extended time period requires the solution of a large combinatorial problem in which the program elements are interdependent. The computer model uses an accelerated search technique to solve this problem. The model contains a large number of options selectable by the user to provide flexible input and a broad range of output for use in sensitivity analyses of all entering elements. Examples of these options are budget smoothing under varied appropriation levels, entry of inflation and discount effects, and probabilistic output which provides quantified degrees of certainty that program costs will remain within planned budget. Criteria and related analytic procedures were established for identifying potential new space program directions. Used in combination with the optimal resource allocation model, new space applications can be analyzed in realistic perspective, including the advantage gain from existing space program plant and on-going programs such as the space transportation system.
Annealing Ant Colony Optimization with Mutation Operator for Solving TSP
2016-01-01
Ant Colony Optimization (ACO) has been successfully applied to solve a wide range of combinatorial optimization problems such as the minimum spanning tree, the traveling salesman problem, and the quadratic assignment problem. Basic ACO has the drawbacks of becoming trapped in local minima and a low convergence rate. Simulated annealing (SA) and a mutation operator provide jumping ability and global convergence, while local search can speed up convergence. Therefore, this paper proposes a hybrid ACO algorithm integrating the advantages of ACO, SA, a mutation operator, and a local search procedure to solve the traveling salesman problem. The core of the algorithm is based on ACO. SA and the mutation operator are used to increase the ant population diversity from time to time, and the local search is used to exploit the current search area efficiently. Comparative experiments, using 24 TSP instances from TSPLIB, show that the proposed algorithm outperformed some well-known algorithms in the literature in terms of solution quality. PMID:27999590
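The hybrid described above combines standard ingredients, written here in their generic textbook forms (the paper's exact parameter schedule and operators are not reproduced): the ACO state-transition probability and pheromone update

\[
p_{ij}^{k} = \frac{\tau_{ij}^{\alpha}\, \eta_{ij}^{\beta}}{\sum_{l \in \mathcal{N}_i^{k}} \tau_{il}^{\alpha}\, \eta_{il}^{\beta}}, \qquad \tau_{ij} \leftarrow (1-\rho)\, \tau_{ij} + \sum_{k} \Delta\tau_{ij}^{k},
\]

with visibility $\eta_{ij} = 1/d_{ij}$ and evaporation rate $\rho$, and a simulated-annealing acceptance rule under which a perturbed (e.g. mutated or locally improved) tour of length $L'$ replaces the current tour of length $L$ with probability

\[
P_{\mathrm{accept}} = \min\!\left\{ 1, \exp\!\left( -\frac{L' - L}{T} \right) \right\},
\]

where the temperature $T$ is gradually lowered, so uphill moves that diversify the population become rarer as the search converges.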
Bifurcation analysis of eight coupled degenerate optical parametric oscillators
NASA Astrophysics Data System (ADS)
Ito, Daisuke; Ueta, Tetsushi; Aihara, Kazuyuki
2018-06-01
A degenerate optical parametric oscillator (DOPO) network realized as a coherent Ising machine can be used to solve combinatorial optimization problems. Both theoretical and experimental investigations into the performance of DOPO networks have been presented previously. However a problem remains, namely that the dynamics of the DOPO network itself can lower the search success rates of globally optimal solutions for Ising problems. This paper shows that the problem is caused by pitchfork bifurcations due to the symmetry structure of coupled DOPOs. Some two-parameter bifurcation diagrams of equilibrium points express the performance deterioration. It is shown that the emergence of non-ground states regarding local minima hampers the system from reaching the ground states corresponding to the global minimum. We then describe a parametric strategy for leading a system to the ground state by actively utilizing the bifurcation phenomena. By adjusting the parameters to break particular symmetry, we find appropriate parameter sets that allow the coherent Ising machine to obtain the globally optimal solution alone.
Shelf life modelling for first-expired-first-out warehouse management
Hertog, Maarten L. A. T. M.; Uysal, Ismail; McCarthy, Ultan; Verlinden, Bert M.; Nicolaï, Bart M.
2014-01-01
In the supply chain of perishable food products, large losses are incurred between farm and fork. Given the limited land resources and an ever-growing population, the food supply chain is faced with the challenge of increasing its handling efficiency and minimizing post-harvest food losses. Huge value can be added by optimizing warehouse management systems, taking into account the estimated remaining shelf life of the product, and matching it to the requirements of the subsequent part of the handling chain. This contribution focuses on how model approaches estimating quality changes and remaining shelf life can be combined in optimizing first-expired-first-out cold chain management strategies for perishable products. To this end, shelf-life-related performance indicators are used to introduce remaining shelf life and product quality in the cost function when optimizing the supply chain. A combinatorial exhaustive-search algorithm is shown to be feasible as the complexity of the optimization problem is sufficiently low for the size and properties of a typical commercial cold chain. The estimated shelf life distances for a particular batch can thus be taken as a guide to optimize logistics. PMID:24797134
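The exhaustive-search idea is simple enough to show in a few lines. Illustrative only: the shelf-life values, transport durations and expiry penalty below are invented, and a real chain would score assignments with the shelf-life-related performance indicators discussed in the paper.

# Exhaustive search over pallet-to-order assignments (first-expired-first-out spirit).
from itertools import permutations

remaining_shelf_life = [3, 6, 9, 12]        # days left for four pallets in the warehouse
route_duration = [2, 5, 8, 11]              # transport days for four outgoing orders

def cost(assignment):
    # Penalize pallets that would expire in transit; otherwise count wasted slack.
    total = 0
    for pallet, order in enumerate(assignment):
        slack = remaining_shelf_life[pallet] - route_duration[order]
        total += 100 if slack < 0 else slack
    return total

best = min(permutations(range(4)), key=cost)
print("pallet -> order assignment:", best, "cost:", cost(best))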
Medial-based deformable models in nonconvex shape-spaces for medical image segmentation.
McIntosh, Chris; Hamarneh, Ghassan
2012-01-01
We explore the application of genetic algorithms (GA) to deformable models through the proposition of a novel method for medical image segmentation that combines GA with nonconvex, localized, medial-based shape statistics. We replace the more typical gradient descent optimizer used in deformable models with GA, and the convex, implicit, global shape statistics with nonconvex, explicit, localized ones. Specifically, we propose GA to reduce typical deformable model weaknesses pertaining to model initialization, pose estimation and local minima, through the simultaneous evolution of a large number of models. Furthermore, we constrain the evolution, and thus reduce the size of the search-space, by using statistically-based deformable models whose deformations are intuitive (stretch, bulge, bend) and are driven in terms of localized principal modes of variation, instead of modes of variation across the entire shape that often fail to capture localized shape changes. Although GA are not guaranteed to achieve the global optima, our method compares favorably to the prevalent optimization techniques, convex/nonconvex gradient-based optimizers and to globally optimal graph-theoretic combinatorial optimization techniques, when applied to the task of corpus callosum segmentation in 50 mid-sagittal brain magnetic resonance images.
Garrido, Mariano; Larrechi, Maria Soledad; Rius, F Xavier; Mercado, Luis Adolfo; Galià, Marina
2007-02-05
Soft- and hard-modelling strategy was applied to near-infrared spectroscopy data obtained from monitoring the reaction between glycidyloxydimethylphenyl silane, a silicon-based epoxy monomer, and aniline. On the basis of the pure soft-modelling approach and previous chemical knowledge, a kinetic model for the reaction was proposed. Then, multivariate curve resolution-alternating least squares optimization was carried out under a hard constraint, that compels the concentration profiles to fulfil the proposed kinetic model at each iteration of the optimization process. In this way, the concentration profiles of each species and the corresponding kinetic rate constants of the reaction, unpublished until now, were obtained. The results obtained were contrasted with 13C NMR. The joint interval test of slope and intercept for detecting bias was not significant (alpha=5%).
Dibó, Gábor
2012-02-01
Combinatorial chemistry was introduced in the 1980s. It provided the possibility of producing new compounds in practically unlimited numbers. New strategies and technologies have also been developed that made it possible to screen very large numbers of compounds and to identify useful components in mixtures containing millions of different substances. This dramatically changed the drug discovery process and the way of thinking of synthetic chemists. In addition, combinatorial strategies became useful in areas such as pharmaceutical research, agrochemistry, catalyst design, and materials research. Prof. Árpád Furka is one of the pioneers of combinatorial chemistry.
Liao, Chenzhong; Liu, Bing; Shi, Leming; Zhou, Jiaju; Lu, Xian-Ping
2005-07-01
Based on the structural characteristics of PPAR modulators, a virtual combinatorial library containing 1,226,625 compounds was constructed using SMILES strings. Selected ADME filters were employed to remove compounds with poor drug-like properties from this library. The library was converted to sdf and mol2 files by CONCORD 4.0 and was then docked to PPARgamma by DOCK 4.0 to identify new chemical entities that may be potential drug leads against type 2 diabetes and other metabolic diseases. The method of constructing virtual combinatorial libraries using SMILES strings was further visualized with Visual Basic.net, which can facilitate the generation of other types of virtual combinatorial libraries.
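For illustration, enumerating a virtual library directly as SMILES strings can be sketched as follows. Assumptions: the scaffold and substituent fragments are hypothetical, plain string substitution stands in for proper R-group enumeration in a cheminformatics toolkit, and the ADME filtering and docking steps are not shown.

# Enumerate a tiny combinatorial library by substituting R-groups into a scaffold SMILES.
from itertools import product

scaffold = "c1cc([R1])cc([R2])c1"             # hypothetical disubstituted benzene core
r1_groups = ["C", "CC", "OC", "N", "Cl"]      # hypothetical substituent SMILES fragments
r2_groups = ["C(=O)O", "C#N", "F", "O"]

library = [scaffold.replace("[R1]", r1).replace("[R2]", r2)
           for r1, r2 in product(r1_groups, r2_groups)]

print(len(library), "enumerated structures, e.g.", library[0])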
Analysis of tasks for dynamic man/machine load balancing in advanced helicopters
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jorgensen, C.C.
1987-10-01
This report considers task allocation requirements imposed by advanced helicopter designs incorporating mixes of human pilots and intelligent machines. Specifically, it develops an analogy between load balancing using distributed non-homogeneous multiprocessors and human team functions. A taxonomy is presented which can be used to identify task combinations likely to cause overload for dynamic scheduling and process allocation mechanisms. Designer criteria are given for function decomposition, separation of control from data, and communication handling for dynamic tasks. Possible effects of NP-complete scheduling problems are noted, and a class of combinatorial optimization methods is examined.
Current state and future prospects of immunotherapy for glioma.
Kamran, Neha; Alghamri, Mahmoud S; Nunez, Felipe J; Shah, Diana; Asad, Antonela S; Candolfi, Marianela; Altshuler, David; Lowenstein, Pedro R; Castro, Maria G
2018-02-01
There is a large unmet need for effective therapeutic approaches for glioma, the most malignant brain tumor. Clinical and preclinical studies have enormously expanded our knowledge about the molecular aspects of this deadly disease and its interaction with the host immune system. In this review we highlight the wide array of immunotherapeutic interventions that are currently being tested in glioma patients. Given the molecular heterogeneity, tumor immunoediting and the profound immunosuppression that characterize glioma, it has become clear that combinatorial approaches targeting multiple pathways tailored to the genetic signature of the tumor will be required in order to achieve optimal therapeutic efficacy.
Combinatorial Optimization by Amoeba-Based Neurocomputer with Chaotic Dynamics
NASA Astrophysics Data System (ADS)
Aono, Masashi; Hirata, Yoshito; Hara, Masahiko; Aihara, Kazuyuki
We demonstrate a computing system based on an amoeba of a true slime mold Physarum capable of producing rich spatiotemporal oscillatory behavior. Our system operates as a neurocomputer because an optical feedback control in accordance with a recurrent neural network algorithm leads the amoeba's photosensitive branches to search for a stable configuration concurrently. We show our system's capability of solving the traveling salesman problem. Furthermore, we apply various types of nonlinear time series analysis to the amoeba's oscillatory behavior in the problem-solving process. The results suggest that an individual amoeba might be characterized as a set of coupled chaotic oscillators.
Improved artificial bee colony algorithm for vehicle routing problem with time windows
Yan, Qianqian; Zhang, Mengjie; Yang, Yunong
2017-01-01
This paper investigates a well-known complex combinatorial problem known as the vehicle routing problem with time windows (VRPTW). Unlike the standard vehicle routing problem, each customer in the VRPTW is served within a given time constraint. This paper solves the VRPTW using an improved artificial bee colony (IABC) algorithm. The performance of this algorithm is improved by a local optimization based on a crossover operation and a scanning strategy. Finally, the effectiveness of the IABC is evaluated on some well-known benchmarks. The results demonstrate the power of IABC algorithm in solving the VRPTW. PMID:28961252