Hernando, Leticia; Mendiburu, Alexander; Lozano, Jose A
2013-01-01
The solution of many combinatorial optimization problems is carried out by metaheuristics, which generally make use of local search algorithms. These algorithms use some kind of neighborhood structure over the search space. The performance of the algorithms strongly depends on the properties that the neighborhood imposes on the search space. One of these properties is the number of local optima. Given an instance of a combinatorial optimization problem and a neighborhood, the estimation of the number of local optima can help not only to measure the complexity of the instance, but also to choose the most convenient neighborhood to solve it. In this paper we review and evaluate several methods to estimate the number of local optima in combinatorial optimization problems. The methods reviewed not only come from the combinatorial optimization literature, but also from the statistical literature. A thorough evaluation in synthetic as well as real problems is given. We conclude by providing recommendations of methods for several scenarios.
ERIC Educational Resources Information Center
Brusco, Michael J.; Kohn, Hans-Friedrich; Stahl, Stephanie
2008-01-01
Dynamic programming methods for matrix permutation problems in combinatorial data analysis can produce globally-optimal solutions for matrices up to size 30x30, but are computationally infeasible for larger matrices because of enormous computer memory requirements. Branch-and-bound methods also guarantee globally-optimal solutions, but computation…
NASA Astrophysics Data System (ADS)
Hartmann, Alexander K.; Weigt, Martin
2005-10-01
A concise, comprehensive introduction to the topic of statistical physics of combinatorial optimization, bringing together theoretical concepts and algorithms from computer science with analytical methods from physics. The result bridges the gap between statistical physics and combinatorial optimization, investigating problems taken from theoretical computing, such as the vertex-cover problem, with the concepts and methods of theoretical physics. The authors cover rapid developments and analytical methods that are both extremely complex and spread by word-of-mouth, providing all the necessary basics in required detail. Throughout, the algorithms are shown with examples and calculations, while the proofs are given in a way suitable for graduate students, post-docs, and researchers. Ideal for newcomers to this young, multidisciplinary field.
Combinatorial Optimization in Project Selection Using Genetic Algorithm
NASA Astrophysics Data System (ADS)
Dewi, Sari; Sawaluddin
2018-01-01
This paper discusses the problem of project selection in the presence of two objective functions, maximizing profit and minimizing cost, together with limitations on resource availability and available time, so that resources must be allocated to each project. These resources are human resources, machine resources, and raw material resources, and the allocation must not exceed the predetermined budget. The problem can therefore be formulated mathematically as a multi-objective function with constraints to be satisfied. To assist the project selection process, a multi-objective combinatorial optimization approach is used to obtain an optimal solution for selecting the right projects. A multi-objective genetic algorithm is then described as one such combinatorial optimization approach to simplify the project selection process at large scale.
Nonlinear Multidimensional Assignment Problems Efficient Conic Optimization Methods and Applications
2015-06-24
Performing organization: Arizona State University, School of Mathematical & Statistical Sciences, 901 S… Abstract: The major goals of this project were completed: the exact solution of previously unsolved challenging combinatorial optimization… A combinatorial optimization problem, the Directional Sensor Problem, was solved in two ways: first, heuristically in an engineering fashion, and second, exactly
Distributed Combinatorial Optimization Using Privacy on Mobile Phones
NASA Astrophysics Data System (ADS)
Ono, Satoshi; Katayama, Kimihiro; Nakayama, Shigeru
This paper proposes a method for distributed combinatorial optimization that uses mobile phones as computers. In the proposed method, an ordinary computer generates solution candidates and mobile phones evaluate them by referring to private information and preferences. Users therefore do not have to send their private data to any other computers and do not have to refrain from entering their preferences, and can thus obtain satisfactory solutions. Experimental results showed that the proposed method solved room assignment problems without sending users' private information to a server.
Structure-based design of combinatorial mutagenesis libraries
Verma, Deeptak; Grigoryan, Gevorg; Bailey-Kellogg, Chris
2015-01-01
The development of protein variants with improved properties (thermostability, binding affinity, catalytic activity, etc.) has greatly benefited from the application of high-throughput screens evaluating large, diverse combinatorial libraries. At the same time, since only a very limited portion of sequence space can be experimentally constructed and tested, an attractive possibility is to use computational protein design to focus libraries on a productive portion of the space. We present a general-purpose method, called “Structure-based Optimization of Combinatorial Mutagenesis” (SOCoM), which can optimize arbitrarily large combinatorial mutagenesis libraries directly based on structural energies of their constituents. SOCoM chooses both positions and substitutions, employing a combinatorial optimization framework based on library-averaged energy potentials in order to avoid explicitly modeling every variant in every possible library. In case study applications to green fluorescent protein, β-lactamase, and lipase A, SOCoM optimizes relatively small, focused libraries whose variants achieve energies comparable to or better than previous library design efforts, as well as larger libraries (previously not designable by structure-based methods) whose variants cover greater diversity while still maintaining substantially better energies than would be achieved by representative random library approaches. By allowing the creation of large-scale combinatorial libraries based on structural calculations, SOCoM promises to increase the scope of applicability of computational protein design and improve the hit rate of discovering beneficial variants. While designs presented here focus on variant stability (predicted by total energy), SOCoM can readily incorporate other structure-based assessments, such as the energy gap between alternative conformational or bound states. PMID:25611189
Structure-based design of combinatorial mutagenesis libraries.
Verma, Deeptak; Grigoryan, Gevorg; Bailey-Kellogg, Chris
2015-05-01
The development of protein variants with improved properties (thermostability, binding affinity, catalytic activity, etc.) has greatly benefited from the application of high-throughput screens evaluating large, diverse combinatorial libraries. At the same time, since only a very limited portion of sequence space can be experimentally constructed and tested, an attractive possibility is to use computational protein design to focus libraries on a productive portion of the space. We present a general-purpose method, called "Structure-based Optimization of Combinatorial Mutagenesis" (SOCoM), which can optimize arbitrarily large combinatorial mutagenesis libraries directly based on structural energies of their constituents. SOCoM chooses both positions and substitutions, employing a combinatorial optimization framework based on library-averaged energy potentials in order to avoid explicitly modeling every variant in every possible library. In case study applications to green fluorescent protein, β-lactamase, and lipase A, SOCoM optimizes relatively small, focused libraries whose variants achieve energies comparable to or better than previous library design efforts, as well as larger libraries (previously not designable by structure-based methods) whose variants cover greater diversity while still maintaining substantially better energies than would be achieved by representative random library approaches. By allowing the creation of large-scale combinatorial libraries based on structural calculations, SOCoM promises to increase the scope of applicability of computational protein design and improve the hit rate of discovering beneficial variants. While designs presented here focus on variant stability (predicted by total energy), SOCoM can readily incorporate other structure-based assessments, such as the energy gap between alternative conformational or bound states. © 2015 The Protein Society.
A methodology to find the elementary landscape decomposition of combinatorial optimization problems.
Chicano, Francisco; Whitley, L Darrell; Alba, Enrique
2011-01-01
A small number of combinatorial optimization problems have search spaces that correspond to elementary landscapes, where the objective function f is an eigenfunction of the Laplacian that describes the neighborhood structure of the search space. Many problems are not elementary; however, the objective function of a combinatorial optimization problem can always be expressed as a superposition of multiple elementary landscapes if the underlying neighborhood used is symmetric. This paper presents theoretical results that provide the foundation for algebraic methods that can be used to decompose the objective function of an arbitrary combinatorial optimization problem into a sum of subfunctions, where each subfunction is an elementary landscape. Many steps of this process can be automated, and indeed a software tool could be developed that assists the researcher in finding a landscape decomposition. This methodology is then used to show that the subset sum problem is a superposition of two elementary landscapes, and to show that the quadratic assignment problem is a superposition of three elementary landscapes.
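For readers outside this subfield, the standard definition behind the terminology of this abstract can be written compactly; the notation below (neighborhood graph Laplacian, eigenvalue λ) is the usual one from the landscape literature, not quoted from the paper itself.

```latex
% Let N(x) be a regular neighborhood of size d and A its adjacency matrix,
% with Laplacian \Delta = A - d\,I. A landscape (X, N, f) is elementary iff
% f - \bar{f} is an eigenfunction of \Delta for some eigenvalue -\lambda:
\Delta\,(f - \bar{f}) = -\lambda\,(f - \bar{f}),
% which is equivalent to Grover's wave equation on neighborhood averages:
\frac{1}{d} \sum_{y \in N(x)} f(y) = f(x) + \frac{\lambda}{d}\bigl(\bar{f} - f(x)\bigr).
% A general objective then decomposes as a superposition (sum) of such
% elementary components, as the paper establishes for subset sum and QAP.
```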
Optimal weighted combinatorial forecasting model of QT dispersion of ECGs in Chinese adults.
Wen, Zhang; Miao, Ge; Xinlei, Liu; Minyi, Cen
2016-07-01
This study aims to provide a scientific basis for unifying the reference value standard of QT dispersion of ECGs in Chinese adults. Three predictive models, a regression model, a principal component model, and an artificial neural network model, are combined to establish an optimal weighted combination model. The optimal weighted combination model and the single models are verified and compared. The optimal weighted combinatorial model reduces the prediction risk of any single model and improves prediction precision. The geographical distribution of the reference value of QT dispersion in Chinese adults was mapped precisely using kriging methods. When the geographical factors of a particular area are known, the reference value of QT dispersion of Chinese adults in that area can be estimated using the optimal weighted combinatorial model, and the reference value anywhere in China can be obtained from the geographical distribution map.
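As a rough sketch of the weighted-combination idea only (not the authors' procedure, and with invented numbers in place of the real QT-dispersion data), the snippet below fits non-negative weights that sum to one so as to minimize the squared error of the combined forecast of three single models.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical predictions of three single models (regression, principal
# component, neural network) and the corresponding observed values.
preds = np.array([
    [41.2, 43.5, 39.8, 44.1, 40.7],   # model 1
    [40.8, 44.0, 40.3, 43.2, 41.5],   # model 2
    [42.0, 42.9, 39.5, 44.8, 40.1],   # model 3
])
observed = np.array([41.5, 43.3, 40.0, 44.0, 40.9])

def combined_sse(w):
    """Sum of squared errors of the weighted combination forecast."""
    return float(np.sum((w @ preds - observed) ** 2))

# Weights are constrained to be non-negative and to sum to one.
result = minimize(
    combined_sse,
    x0=np.full(3, 1 / 3),
    method="SLSQP",
    bounds=[(0.0, 1.0)] * 3,
    constraints=[{"type": "eq", "fun": lambda w: np.sum(w) - 1.0}],
)
print("optimal weights:", np.round(result.x, 3))
print("combined SSE:", round(result.fun, 4))
```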
Combinatorial Methods for Exploring Complex Materials
NASA Astrophysics Data System (ADS)
Amis, Eric J.
2004-03-01
Combinatorial and high-throughput methods have changed the paradigm of pharmaceutical synthesis and have begun to have a similar impact on materials science research. Already there are examples of combinatorial methods used for inorganic materials, catalysts, and polymer synthesis. For many investigations the primary goal has been discovery of new material compositions that optimize properties such as phosphorescence or catalytic activity. In the midst of the excitement generated to "make things", another opportunity arises for materials science to "understand things" by using the efficiency of combinatorial methods. We have shown that combinatorial methods hold potential for rapid and systematic generation of experimental data over the multi-parameter space typical of investigations in polymer physics. We have applied the combinatorial approach to studies of polymer thin films, biomaterials, polymer blends, filled polymers, and semicrystalline polymers. By combining library fabrication, high-throughput measurements, informatics, and modeling we can demonstrate validation of the methodology, new observations, and developments toward predictive models. This talk will present some of our latest work with applications to coating stability, multi-component formulations, and nanostructure assembly.
Bifurcation-based approach reveals synergism and optimal combinatorial perturbation.
Liu, Yanwei; Li, Shanshan; Liu, Zengrong; Wang, Ruiqi
2016-06-01
Cells accomplish the process of fate decisions and form terminal lineages through a series of binary choices in which cells switch stable states from one branch to another as the interacting strengths of regulatory factors continuously vary. Various combinatorial effects may occur because almost all regulatory processes are managed in a combinatorial fashion. Combinatorial regulation is crucial for cell fate decisions because it may effectively integrate many different signaling pathways to meet the higher regulation demand during cell development. However, whether combinatorial regulation contributes more to the state transition than a single regulation does and, if so, what the optimal combination strategy is, are significant issues from the point of view of both biology and mathematics. Using the approaches of combinatorial perturbations and bifurcation analysis, we provide a general framework for the quantitative analysis of synergism in molecular networks. Different from the known methods, the bifurcation-based approach depends only on stable state responses to stimuli because the state transition induced by combinatorial perturbations occurs between stable states. More importantly, an optimal combinatorial perturbation strategy can be determined by investigating the relationship between the bifurcation curve of a synergistic perturbation pair and the level set of a specific objective function. The approach is applied to two models, i.e., a theoretical multistable decision model and a biologically realistic CREB model, to show its validity, although the approach holds for a general class of biological systems.
Combinatorial optimization in foundry practice
NASA Astrophysics Data System (ADS)
Antamoshkin, A. N.; Masich, I. S.
2016-04-01
A multicriteria mathematical model of foundry production capacity planning is suggested in the paper. The model is formulated in terms of pseudo-Boolean optimization theory. Different search optimization methods were used to solve the obtained problem.
A Discriminative Sentence Compression Method as Combinatorial Optimization Problem
NASA Astrophysics Data System (ADS)
Hirao, Tsutomu; Suzuki, Jun; Isozaki, Hideki
In the study of automatic summarization, the main research topic used to be `important sentence extraction', but nowadays `sentence compression' is a hot research topic. Conventional sentence compression methods usually transform a given sentence into a parse tree or a dependency tree and modify it to obtain a shorter sentence. However, such methods are sometimes too rigid. In this paper, we regard sentence compression as a combinatorial optimization problem that extracts an optimal subsequence of words. Hori et al. also proposed a similar method, but they used only a small number of features and their weights were tuned by hand. We introduce a large number of features such as part-of-speech bigrams and word position in the sentence. Furthermore, we train the system by discriminative learning. According to our experiments, our method obtained better scores than other methods, with statistical significance.
Chang, Yuchao; Tang, Hongying; Cheng, Yongbo; Zhao, Qin; Li, Baoqing; Yuan, Xiaobing
2017-07-19
Routing protocols based on topology control are significantly important for improving network longevity in wireless sensor networks (WSNs). Traditionally, some WSN routing protocols distribute uneven network traffic load to sensor nodes, which is not optimal for improving network longevity. Unlike conventional WSN routing protocols, we propose a dynamic hierarchical protocol based on combinatorial optimization (DHCO) to balance the energy consumption of sensor nodes and to improve WSN longevity. For each sensor node, the DHCO algorithm obtains the optimal route by establishing a feasible routing set instead of selecting the cluster head or the next hop node. The process of obtaining the optimal route can be formulated as a combinatorial optimization problem. Specifically, the DHCO algorithm is carried out by the following procedures. It employs a hierarchy-based connection mechanism to construct a hierarchical network structure in which each sensor node is assigned to a special hierarchical subset; it utilizes combinatorial optimization theory to establish the feasible routing set for each sensor node, and takes advantage of the maximum-minimum criterion to obtain their optimal routes to the base station. Various results of simulation experiments show the effectiveness and superiority of the DHCO algorithm in comparison with state-of-the-art WSN routing algorithms, including low-energy adaptive clustering hierarchy (LEACH), hybrid energy-efficient distributed clustering (HEED), genetic protocol-based self-organizing network clustering (GASONeC), and double cost function-based routing (DCFR) algorithms.
Hypergraph-Based Combinatorial Optimization of Matrix-Vector Multiplication
ERIC Educational Resources Information Center
Wolf, Michael Maclean
2009-01-01
Combinatorial scientific computing plays an important enabling role in computational science, particularly in high performance scientific computing. In this thesis, we will describe our work on optimizing matrix-vector multiplication using combinatorial techniques. Our research has focused on two different problems in combinatorial scientific…
Smooth Constrained Heuristic Optimization of a Combinatorial Chemical Space
2015-05-01
ARL-TR-7294, May 2015. US Army Research Laboratory. Smooth Constrained Heuristic Optimization of a Combinatorial Chemical Space, by Berend Christopher…
Neural Meta-Memes Framework for Combinatorial Optimization
NASA Astrophysics Data System (ADS)
Song, Li Qin; Lim, Meng Hiot; Ong, Yew Soon
In this paper, we present a Neural Meta-Memes Framework (NMMF) for combinatorial optimization. NMMF is a framework which models basic optimization algorithms as memes and manages them dynamically when solving combinatorial problems. NMMF encompasses neural networks which serve as the overall planner/coordinator to balance the workload between memes. We show the efficacy of the proposed NMMF through an empirical study on a class of combinatorial problems, the quadratic assignment problem (QAP).
A gradient system solution to Potts mean field equations and its electronic implementation.
Urahama, K; Ueno, S
1993-03-01
A gradient system solution method is presented for solving Potts mean field equations for combinatorial optimization problems subject to winner-take-all constraints. In the proposed solution method the optimum solution is searched by using gradient descent differential equations whose trajectory is confined within the feasible solution space of optimization problems. This gradient system is proven theoretically to always produce a legal local optimum solution of combinatorial optimization problems. An elementary analog electronic circuit implementing the presented method is designed on the basis of current-mode subthreshold MOS technologies. The core constituent of the circuit is the winner-take-all circuit developed by Lazzaro et al. Correct functioning of the presented circuit is exemplified with simulations of the circuits implementing the scheme for solving the shortest path problems.
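As a purely software-level illustration (not the analog circuit or the gradient flow described above), the sketch below iterates the Potts mean-field (softmax) equations with a winner-take-all constraint per row on a small, randomly generated assignment-type cost matrix; the penalty weight and annealing schedule are arbitrary assumptions. The softmax step keeps each row on the probability simplex, a discrete analogue of confining the trajectory to the feasible region mentioned in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative 5x5 assignment-style cost matrix (rows = items, columns = labels).
cost = rng.uniform(0.0, 1.0, size=(5, 5))
n = cost.shape[0]
gamma = 2.0                      # strength of the column-conflict penalty

# Mean-field Potts variables: each row lies on the probability simplex, which
# softly enforces the winner-take-all constraint within that row.
v = np.full((n, n), 1.0 / n)

for T in np.geomspace(1.0, 0.01, 200):           # annealing schedule
    for i in range(n):
        # Local field: derivative of (linear cost + quadratic column penalty).
        field = cost[i] + gamma * (v.sum(axis=0) - 1.0)
        logits = -field / T
        logits -= logits.max()                   # numerical stability
        v[i] = np.exp(logits) / np.exp(logits).sum()

assignment = v.argmax(axis=1)                    # harden the relaxed solution
print("row-to-column assignment:", assignment)
print("total cost:", round(float(cost[np.arange(n), assignment].sum()), 3))
```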
Statistical Mechanics of Combinatorial Auctions
NASA Astrophysics Data System (ADS)
Galla, Tobias; Leone, Michele; Marsili, Matteo; Sellitto, Mauro; Weigt, Martin; Zecchina, Riccardo
2006-09-01
Combinatorial auctions are formulated as frustrated lattice gases on sparse random graphs, allowing the determination of the optimal revenue by methods of statistical physics. Transitions between computationally easy and hard regimes are found and interpreted in terms of the geometric structure of the space of solutions. We introduce an iterative algorithm to solve intermediate and large instances, and discuss competing states of optimal revenue and maximal number of satisfied bidders. The algorithm can be generalized to the hard phase and to more sophisticated auction protocols.
Tang, Hongying; Cheng, Yongbo; Zhao, Qin; Li, Baoqing; Yuan, Xiaobing
2017-01-01
Routing protocols based on topology control are significantly important for improving network longevity in wireless sensor networks (WSNs). Traditionally, some WSN routing protocols distribute uneven network traffic load to sensor nodes, which is not optimal for improving network longevity. Unlike conventional WSN routing protocols, we propose a dynamic hierarchical protocol based on combinatorial optimization (DHCO) to balance the energy consumption of sensor nodes and to improve WSN longevity. For each sensor node, the DHCO algorithm obtains the optimal route by establishing a feasible routing set instead of selecting the cluster head or the next hop node. The process of obtaining the optimal route can be formulated as a combinatorial optimization problem. Specifically, the DHCO algorithm is carried out by the following procedures. It employs a hierarchy-based connection mechanism to construct a hierarchical network structure in which each sensor node is assigned to a special hierarchical subset; it utilizes combinatorial optimization theory to establish the feasible routing set for each sensor node, and takes advantage of the maximum–minimum criterion to obtain their optimal routes to the base station. Various results of simulation experiments show the effectiveness and superiority of the DHCO algorithm in comparison with state-of-the-art WSN routing algorithms, including low-energy adaptive clustering hierarchy (LEACH), hybrid energy-efficient distributed clustering (HEED), genetic protocol-based self-organizing network clustering (GASONeC), and double cost function-based routing (DCFR) algorithms. PMID:28753962
Fuel management optimization using genetic algorithms and code independence
DOE Office of Scientific and Technical Information (OSTI.GOV)
DeChaine, M.D.; Feltus, M.A.
1994-12-31
Fuel management optimization is a hard problem for traditional optimization techniques. Loading pattern optimization is a large combinatorial problem without analytical derivative information. Therefore, methods designed for continuous functions, such as linear programming, do not always work well. Genetic algorithms (GAs) address these problems and, therefore, appear ideal for fuel management optimization. They do not require derivative information and work well with combinatorial functions. GAs are stochastic methods based on concepts from biological genetics. They take a group of candidate solutions, called the population, and use selection, crossover, and mutation operators to create the next generation of better solutions. The selection operator is a "survival-of-the-fittest" operation and chooses the solutions for the next generation. The crossover operator is analogous to biological mating, where children inherit a mixture of traits from their parents, and the mutation operator makes small random changes to the solutions.
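The selection, crossover, and mutation operators described here map directly onto a few lines of code; the sketch below is a generic bit-string GA with a placeholder fitness function, since a real loading-pattern evaluation would come from a reactor-physics code rather than anything shown here.

```python
import random

random.seed(1)
N_BITS, POP, GENS = 40, 60, 100

def fitness(bits):
    """Placeholder objective: reward alternating bits (a stand-in for a
    loading-pattern evaluation normally done by a core-physics code)."""
    return sum(1 for i in range(1, N_BITS) if bits[i] != bits[i - 1])

def tournament(pop):
    """'Survival of the fittest' selection: best of three random candidates."""
    return max(random.sample(pop, 3), key=fitness)

def crossover(a, b):
    """One-point crossover: the child inherits a mix of both parents."""
    cut = random.randrange(1, N_BITS)
    return a[:cut] + b[cut:]

def mutate(bits, rate=0.02):
    """Small random changes to a solution."""
    return [1 - b if random.random() < rate else b for b in bits]

pop = [[random.randint(0, 1) for _ in range(N_BITS)] for _ in range(POP)]
for _ in range(GENS):
    pop = [mutate(crossover(tournament(pop), tournament(pop))) for _ in range(POP)]

best = max(pop, key=fitness)
print("best fitness:", fitness(best))
```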
OPTIMIZING THROUGH CO-EVOLUTIONARY AVALANCHES
DOE Office of Scientific and Technical Information (OSTI.GOV)
S. BOETTCHER; A. PERCUS
2000-08-01
We explore a new general-purpose heuristic for finding high-quality solutions to hard optimization problems. The method, called extremal optimization, is inspired by "self-organized criticality," a concept introduced to describe emergent complexity in many physical systems. In contrast to genetic algorithms, which operate on an entire "gene pool" of possible solutions, extremal optimization successively replaces extremely undesirable elements of a sub-optimal solution with new, random ones. Large fluctuations, called "avalanches," ensue that efficiently explore many local optima. Drawing upon models used to simulate far-from-equilibrium dynamics, extremal optimization complements approximation methods inspired by equilibrium statistical physics, such as simulated annealing. With only one adjustable parameter, its performance has proved competitive with more elaborate methods, especially near phase transitions. Those phase transitions are found in the parameter space of most optimization problems, and have recently been conjectured to be the origin of some of the hardest instances in computational complexity. We will demonstrate how extremal optimization can be implemented for a variety of combinatorial optimization problems. We believe that extremal optimization will be a useful tool in the investigation of phase transitions in combinatorial optimization problems, hence valuable in elucidating the origin of computational complexity.
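A minimal sketch of the extremal optimization idea (the τ-EO variant with its single adjustable parameter) on a toy spin-glass instance; the instance, parameters, and per-variable fitness are illustrative choices, not taken from this report.

```python
import numpy as np

rng = np.random.default_rng(0)
n, tau, steps = 60, 1.4, 20000

# Random symmetric +/-1 couplings: a small spin-glass instance as test problem.
J = np.triu(rng.choice([-1, 1], size=(n, n)), 1)
J = J + J.T

def energy(s):
    return -0.5 * s @ J @ s

s = rng.choice([-1, 1], size=n)
best_s, best_e = s.copy(), energy(s)

ranks = np.arange(1, n + 1)
p = ranks ** (-tau)                # power-law rank selection, single parameter tau
p /= p.sum()

for _ in range(steps):
    lam = s * (J @ s)              # per-variable fitness: local "satisfaction"
    order = np.argsort(lam)        # most undesirable variables first
    k = rng.choice(n, p=p)         # pick a rank, biased toward the worst
    i = order[k]
    s[i] = -s[i]                   # replace the chosen element with a new state
    e = energy(s)
    if e < best_e:
        best_s, best_e = s.copy(), e

print("best energy found:", best_e)
```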
Wieberger, Florian; Kolb, Tristan; Neuber, Christian; Ober, Christopher K; Schmidt, Hans-Werner
2013-04-08
In this article we present several developed and improved combinatorial techniques to optimize the processing conditions and material properties of organic thin films. The combinatorial approach allows investigation of multi-variable dependencies and is the perfect tool to investigate organic thin films with regard to their high-performance applications. In this context we develop and establish the reliable preparation of gradients of material composition, temperature, exposure, and immersion time. Furthermore, we demonstrate the smart application of combinations of composition and processing gradients to create combinatorial libraries. First, a binary combinatorial library is created by applying two gradients perpendicular to each other. A third gradient is carried out in very small areas and arranged matrix-like over the entire binary combinatorial library, resulting in a ternary combinatorial library. Ternary combinatorial libraries allow identifying precise trends for the optimization of multi-variable-dependent processes, which is demonstrated on the lithographic patterning process. Here we verify conclusively the strong interaction and thus the interdependency of variables in the preparation and properties of complex organic thin film systems. The established gradient preparation techniques are not limited to lithographic patterning. It is possible to utilize and transfer the reported combinatorial techniques to other multi-variable-dependent processes and to investigate and optimize thin film layers and devices for optical, electro-optical, and electronic applications.
Coelho, V N; Coelho, I M; Souza, M J F; Oliveira, T A; Cota, L P; Haddad, M N; Mladenovic, N; Silva, R C P; Guimarães, F G
2016-01-01
This article presents an Evolution Strategy (ES)-based algorithm, designed to self-adapt its mutation operators, guiding the search into the solution space using a Self-Adaptive Reduced Variable Neighborhood Search procedure. In view of the specific local search operators for each individual, the proposed population-based approach also fits into the context of the Memetic Algorithms. The proposed variant uses the Greedy Randomized Adaptive Search Procedure with different greedy parameters for generating its initial population, providing an interesting exploration-exploitation balance. To validate the proposal, this framework is applied to solve three different NP-hard combinatorial optimization problems: an Open-Pit-Mining Operational Planning Problem with dynamic allocation of trucks, an Unrelated Parallel Machine Scheduling Problem with Setup Times, and the calibration of a hybrid fuzzy model for Short-Term Load Forecasting. Computational results point out the convergence of the proposed model and highlight its ability in combining the application of move operations from distinct neighborhood structures along the optimization. The results gathered and reported in this article represent a collective evidence of the performance of the method in challenging combinatorial optimization problems from different application domains. The proposed evolution strategy demonstrates an ability of adapting the strength of the mutation disturbance during the generations of its evolution process. The effectiveness of the proposal motivates the application of this novel evolutionary framework for solving other combinatorial optimization problems.
Statistical physics of hard combinatorial optimization: Vertex cover problem
NASA Astrophysics Data System (ADS)
Zhao, Jin-Hua; Zhou, Hai-Jun
2014-07-01
Typical-case computation complexity is a research topic at the boundary of computer science, applied mathematics, and statistical physics. In the last twenty years, the replica-symmetry-breaking mean field theory of spin glasses and the associated message-passing algorithms have greatly deepened our understanding of typical-case computation complexity. In this paper, we use the vertex cover problem, a basic nondeterministic-polynomial (NP)-complete combinatorial optimization problem of wide application, as an example to introduce the statistical physical methods and algorithms. We do not go into the technical details but emphasize mainly the intuitive physical meanings of the message-passing equations. An unfamiliar reader should be able to understand, to a large extent, the physics behind the mean field approaches and to adjust the mean field methods in solving other optimization problems.
Xiang, X D
Combinatorial materials synthesis methods and high-throughput evaluation techniques have been developed to accelerate the process of materials discovery and optimization and phase-diagram mapping. Analogous to integrated circuit chips, integrated materials chips containing thousands of discrete different compositions or continuous phase diagrams, often in the form of high-quality epitaxial thin films, can be fabricated and screened for interesting properties. Microspot x-ray method, various optical measurement techniques, and a novel evanescent microwave microscope have been used to characterize the structural, optical, magnetic, and electrical properties of samples on the materials chips. These techniques are routinely used to discover/optimize and map phase diagrams of ferroelectric, dielectric, optical, magnetic, and superconducting materials.
Parallel tempering for the traveling salesman problem
DOE Office of Scientific and Technical Information (OSTI.GOV)
Percus, Allon; Wang, Richard; Hyman, Jeffrey
We explore the potential of parallel tempering as a combinatorial optimization method, applying it to the traveling salesman problem. We compare simulation results of parallel tempering with a benchmark implementation of simulated annealing, and study how different choices of parameters affect the relative performance of the two methods. We find that a straightforward implementation of parallel tempering can outperform simulated annealing in several crucial respects. When parameters are chosen appropriately, both methods yield close approximations to the actual minimum distance for an instance with 200 nodes. However, parallel tempering yields more consistently accurate results when a series of independent simulations are performed. Our results suggest that parallel tempering might offer a simple but powerful alternative to simulated annealing for combinatorial optimization problems.
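To make the comparison concrete, the following is a compact, hedged sketch of parallel tempering for a random TSP instance: each replica performs Metropolis 2-opt moves at its own temperature, and neighboring replicas occasionally attempt to exchange configurations. The temperature ladder, instance size, and move choice are illustrative, not the parameters studied in the abstract.

```python
import math, random

random.seed(0)
N_CITIES, SWEEPS = 60, 2000
cities = [(random.random(), random.random()) for _ in range(N_CITIES)]

def dist(a, b):
    return math.hypot(cities[a][0] - cities[b][0], cities[a][1] - cities[b][1])

def tour_length(t):
    return sum(dist(t[i], t[(i + 1) % N_CITIES]) for i in range(N_CITIES))

def two_opt(t):
    """Propose a 2-opt move: reverse a random segment of the tour."""
    i, j = sorted(random.sample(range(N_CITIES), 2))
    return t[:i] + t[i:j][::-1] + t[j:]

# Replicas of the same instance held at a ladder of temperatures.
temps = [0.01 * (1.6 ** k) for k in range(8)]
replicas = [random.sample(range(N_CITIES), N_CITIES) for _ in temps]
lengths = [tour_length(t) for t in replicas]

for sweep in range(SWEEPS):
    # Metropolis 2-opt moves within each replica.
    for r, T in enumerate(temps):
        cand = two_opt(replicas[r])
        d = tour_length(cand) - lengths[r]
        if d < 0 or random.random() < math.exp(-d / T):
            replicas[r], lengths[r] = cand, lengths[r] + d
    # Occasionally attempt to swap configurations between neighboring temperatures.
    if sweep % 10 == 0:
        for r in range(len(temps) - 1):
            delta = (1 / temps[r] - 1 / temps[r + 1]) * (lengths[r] - lengths[r + 1])
            if delta > 0 or random.random() < math.exp(delta):
                replicas[r], replicas[r + 1] = replicas[r + 1], replicas[r]
                lengths[r], lengths[r + 1] = lengths[r + 1], lengths[r]

print("shortest tour found:", round(min(lengths), 3))
```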
NASA Astrophysics Data System (ADS)
Doerr, Timothy; Alves, Gelio; Yu, Yi-Kuo
2006-03-01
Typical combinatorial optimizations are NP-hard; however, for a particular class of cost functions the corresponding combinatorial optimizations can be solved in polynomial time. This suggests a way to efficiently find approximate solutions: find a transformation that makes the cost function as similar as possible to that of the solvable class. After keeping many high-ranking solutions using the approximate cost function, one may then re-assess these solutions with the full cost function to find the best approximate solution. Under this approach, it is important to be able to assess the quality of the solutions obtained, e.g., by finding the true ranking of the kth best approximate solution when all possible solutions are considered exhaustively. To tackle this statistical issue, we provide a systematic method starting with a scaling function generated from the finite number of high-ranking solutions, followed by a convergent iterative mapping. This method, useful in a variant of the directed paths in random media problem proposed here, can also provide a statistical significance assessment for one of the most important proteomic tasks: peptide sequencing using tandem mass spectrometry data.
Puthiyedth, Nisha; Riveros, Carlos; Berretta, Regina; Moscato, Pablo
2015-01-01
Background: The joint study of multiple datasets has become a common technique for increasing statistical power in detecting biomarkers obtained from smaller studies. The approach generally followed is based on the fact that as the total number of samples increases, we expect to have greater power to detect associations of interest. This methodology has been applied to genome-wide association and transcriptomic studies due to the availability of datasets in the public domain. While this approach is well established in biostatistics, the introduction of new combinatorial optimization models to address this issue has not been explored in depth. In this study, we introduce a new model for the integration of multiple datasets and we show its application in transcriptomics. Methods: We propose a new combinatorial optimization problem that addresses the core issue of biomarker detection in integrated datasets. Optimal solutions for this model deliver a feature selection from a panel of prospective biomarkers. The model we propose is a generalised version of the (α,β)-k-Feature Set problem. We illustrate the performance of this new methodology via a challenging meta-analysis task involving six prostate cancer microarray datasets. The results are then compared to the popular RankProd meta-analysis tool and to what can be obtained by analysing the individual datasets by statistical and combinatorial methods alone. Results: Application of the integrated method resulted in a more informative signature than the rank-based meta-analysis or individual dataset results, and overcomes problems arising from real world datasets. The set of genes identified is highly significant in the context of prostate cancer. The method used does not rely on homogenisation or transformation of values to a common scale, and at the same time is able to capture markers associated with subgroups of the disease. PMID:26106884
Combining local search with co-evolution in a remarkably simple way
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boettcher, S.; Percus, A.
2000-05-01
The authors explore a new general-purpose heuristic for finding high-quality solutions to hard optimization problems. The method, called extremal optimization, is inspired by self-organized criticality, a concept introduced to describe emergent complexity in physical systems. In contrast to genetic algorithms, which operate on an entire gene pool of possible solutions, extremal optimization successively replaces extremely undesirable elements of a single sub-optimal solution with new, random ones. Large fluctuations, or avalanches, ensue that efficiently explore many local optima. Drawing upon models used to simulate far-from-equilibrium dynamics, extremal optimization complements heuristics inspired by equilibrium statistical physics, such as simulated annealing. With only one adjustable parameter, its performance has proved competitive with more elaborate methods, especially near phase transitions. Phase transitions are found in many combinatorial optimization problems, and have been conjectured to occur in the region of parameter space containing the hardest instances. We demonstrate how extremal optimization can be implemented for a variety of hard optimization problems. We believe that this will be a useful tool in the investigation of phase transitions in combinatorial optimization, thereby helping to elucidate the origin of computational complexity.
Exploiting Quantum Resonance to Solve Combinatorial Problems
NASA Technical Reports Server (NTRS)
Zak, Michail; Fijany, Amir
2006-01-01
Quantum resonance would be exploited in a proposed quantum-computing approach to the solution of combinatorial optimization problems. In quantum computing in general, one takes advantage of the fact that an algorithm cannot be decoupled from the physical effects available to implement it. Prior approaches to quantum computing have involved exploitation of only a subset of known quantum physical effects, notably including parallelism and entanglement, but not including resonance. In the proposed approach, one would utilize the combinatorial properties of tensor-product decomposability of unitary evolution of many-particle quantum systems for physically simulating solutions to NP-complete problems (a class of problems that are intractable with respect to classical methods of computation). In this approach, reinforcement and selection of a desired solution would be executed by means of quantum resonance. Classes of NP-complete problems that are important in practice and could be solved by the proposed approach include planning, scheduling, search, and optimal design.
Automatic Summarization as a Combinatorial Optimization Problem
NASA Astrophysics Data System (ADS)
Hirao, Tsutomu; Suzuki, Jun; Isozaki, Hideki
We derived the oracle summary with the highest ROUGE score that can be achieved by integrating sentence extraction with sentence compression from the reference abstract. The analysis of the oracle revealed that summarization systems have to assign an appropriate compression rate to each sentence in the document. In accordance with this observation, this paper proposes a summarization method formulated as a combinatorial optimization: selecting, subject to length constraints, the set of sentences that maximizes the sum of sentence scores from a pool consisting of the sentences at various compression rates. The score of a sentence is defined by its compression rate, content words, and positional information. The parameters for the compression rates and positional information are optimized by minimizing the loss between the scores of the oracles and those of the candidates. The results obtained from the TSC-2 corpus showed that our method outperformed the previous systems with statistical significance.
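The selection step described here is essentially a knapsack-type problem; a minimal dynamic-programming sketch with invented scores and lengths is given below. For brevity it omits the constraint that at most one compression variant of each source sentence may be chosen, which a full system would add.

```python
# Toy pool of candidates: (score, length_in_words), where several entries could
# correspond to the same source sentence at different compression rates.
# Scores and lengths are made-up placeholders, not from the paper.
pool = [(7.2, 18), (5.9, 11), (4.1, 7), (6.5, 14), (3.3, 6), (8.0, 21)]
LIMIT = 40  # summary length constraint in words

# 0/1 knapsack over total length: best[l] = max total score using <= l words.
best = [0.0] * (LIMIT + 1)
choice = [[] for _ in range(LIMIT + 1)]
for idx, (score, length) in enumerate(pool):
    for l in range(LIMIT, length - 1, -1):
        if best[l - length] + score > best[l]:
            best[l] = best[l - length] + score
            choice[l] = choice[l - length] + [idx]

print("selected candidate indices:", choice[LIMIT])
print("total score:", round(best[LIMIT], 2))
```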
A preliminary study to metaheuristic approach in multilayer radiation shielding optimization
NASA Astrophysics Data System (ADS)
Arif Sazali, Muhammad; Rashid, Nahrul Khair Alang Md; Hamzah, Khaidzir
2018-01-01
Metaheuristics are high-level algorithmic concepts that can be used to develop heuristic optimization algorithms. One of their applications is to find optimal or near-optimal solutions to combinatorial optimization problems (COPs) such as scheduling, vehicle routing, and timetabling. Combinatorial optimization deals with finding optimal combinations or permutations in a given set of problem components when exhaustive search is not feasible. A radiation shield made of several layers of different materials can be regarded as a COP. The time taken to optimize the shield may be too high when several parameters are involved, such as the number of materials, the thickness of layers, and the arrangement of materials. Metaheuristics can be applied to reduce the optimization time, trading guaranteed optimal solutions for near-optimal solutions found in a comparably short amount of time. Applications of metaheuristics to radiation shield optimization are, however, lacking. In this paper, we present a review of the suitability of metaheuristics for multilayer shielding design, specifically the genetic algorithm and the ant colony optimization (ACO) algorithm, and we propose an optimization model based on the ACO method.
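As a hedged illustration of the ACO mechanics proposed above (and not a shielding calculation), the sketch below lets artificial ants pick one material per layer guided by pheromone trails; the per-layer costs are random placeholders, and because they are separable this toy instance is trivial, so the example only demonstrates the algorithmic loop.

```python
import random

random.seed(2)
N_LAYERS, N_MATERIALS = 5, 4
ANTS, ITERS, ALPHA, RHO = 20, 100, 1.0, 0.1

# Toy per-layer cost of placing material m in layer l (a stand-in for a real
# radiation-transport or attenuation evaluation).
cost = [[random.uniform(1.0, 5.0) for _ in range(N_MATERIALS)] for _ in range(N_LAYERS)]

pheromone = [[1.0] * N_MATERIALS for _ in range(N_LAYERS)]

def build_solution():
    """Each ant picks one material per layer, biased by pheromone levels."""
    sol = []
    for l in range(N_LAYERS):
        weights = [pheromone[l][m] ** ALPHA for m in range(N_MATERIALS)]
        sol.append(random.choices(range(N_MATERIALS), weights=weights)[0])
    return sol

def total_cost(sol):
    return sum(cost[l][m] for l, m in enumerate(sol))

best_sol, best_cost = None, float("inf")
for _ in range(ITERS):
    ants = [build_solution() for _ in range(ANTS)]
    # Evaporation.
    for l in range(N_LAYERS):
        for m in range(N_MATERIALS):
            pheromone[l][m] *= (1 - RHO)
    # Deposit pheromone proportional to solution quality.
    for sol in ants:
        c = total_cost(sol)
        if c < best_cost:
            best_sol, best_cost = sol, c
        for l, m in enumerate(sol):
            pheromone[l][m] += 1.0 / c

print("best arrangement:", best_sol, "cost:", round(best_cost, 3))
```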
Directed differentiation of embryonic stem cells using a bead-based combinatorial screening method.
Tarunina, Marina; Hernandez, Diana; Johnson, Christopher J; Rybtsov, Stanislav; Ramathas, Vidya; Jeyakumar, Mylvaganam; Watson, Thomas; Hook, Lilian; Medvinsky, Alexander; Mason, Chris; Choo, Yen
2014-01-01
We have developed a rapid, bead-based combinatorial screening method to determine optimal combinations of variables that direct stem cell differentiation to produce known or novel cell types having pre-determined characteristics. Here we describe three experiments comprising stepwise exposure of mouse or human embryonic cells to 10,000 combinations of serum-free differentiation media, through which we discovered multiple novel, efficient and robust protocols to generate a number of specific hematopoietic and neural lineages. We further demonstrate that the technology can be used to optimize existing protocols in order to substitute costly growth factors with bioactive small molecules and/or increase cell yield, and to identify in vitro conditions for the production of rare developmental intermediates such as an embryonic lymphoid progenitor cell that has not previously been reported.
Lexicographic goal programming and assessment tools for a combinatorial production problem.
DOT National Transportation Integrated Search
2008-01-01
NP-complete combinatorial problems often necessitate the use of near-optimal solution techniques including heuristics and metaheuristics. The addition of multiple optimization criteria can further complicate comparison of these solution technique…
Microbatteries for Combinatorial Studies of Conventional Lithium-Ion Batteries
NASA Technical Reports Server (NTRS)
West, William; Whitacre, Jay; Bugga, Ratnakumar
2003-01-01
Integrated arrays of microscopic solid-state batteries have been demonstrated in a continuing effort to develop microscopic sources of power and of voltage reference circuits to be incorporated into low-power integrated circuits. Perhaps even more importantly, arrays of microscopic batteries can be fabricated and tested in combinatorial experiments directed toward optimization and discovery of battery materials. The value of the combinatorial approach to optimization and discovery has been proven in the optoelectronic, pharmaceutical, and bioengineering industries. Depending on the specific application, the combinatorial approach can involve the investigation of hundreds or even thousands of different combinations; hence, it is time-consuming and expensive to attempt to implement the combinatorial approach by building and testing full-size, discrete cells and batteries. The conception of microbattery arrays makes it practical to bring the advantages of the combinatorial approach to the development of batteries.
Accurate multiple sequence-structure alignment of RNA sequences using combinatorial optimization.
Bauer, Markus; Klau, Gunnar W; Reinert, Knut
2007-07-27
The discovery of functional non-coding RNA sequences has led to an increasing interest in algorithms related to RNA analysis. Traditional sequence alignment algorithms, however, fail at computing reliable alignments of low-homology RNA sequences. The spatial conformation of RNA sequences largely determines their function, and therefore RNA alignment algorithms have to take structural information into account. We present a graph-based representation for sequence-structure alignments, which we model as an integer linear program (ILP). We sketch how we compute an optimal or near-optimal solution to the ILP using methods from combinatorial optimization, and present results on a recently published benchmark set for RNA alignments. The implementation of our algorithm yields better alignments in terms of two published scores than the other programs that we tested: This is especially the case with an increasing number of input sequences. Our program LARA is freely available for academic purposes from http://www.planet-lisa.net.
A Simple Combinatorial Codon Mutagenesis Method for Targeted Protein Engineering.
Belsare, Ketaki D; Andorfer, Mary C; Cardenas, Frida S; Chael, Julia R; Park, Hyun June; Lewis, Jared C
2017-03-17
Directed evolution is a powerful tool for optimizing enzymes, and mutagenesis methods that improve enzyme library quality can significantly expedite the evolution process. Here, we report a simple method for targeted combinatorial codon mutagenesis (CCM). To demonstrate the utility of this method for protein engineering, CCM libraries were constructed for cytochrome P450 BM3, pfu prolyl oligopeptidase, and the flavin-dependent halogenase RebH; 10-26 sites were targeted for codon mutagenesis in each of these enzymes, and libraries with a tunable average of 1-7 codon mutations per gene were generated. Each of these libraries provided improved enzymes for their respective transformations, which highlights the generality, simplicity, and tunability of CCM for targeted protein engineering.
NASA Astrophysics Data System (ADS)
Xue, Wei; Wang, Qi; Wang, Tianyu
2018-04-01
This paper presents an improved parallel combinatory spread spectrum (PC/SS) communication system using a double information matching (DIM) method. Compared with the conventional PC/SS system, the new model inherits the advantages of high transmission speed, large information capacity, and high security. However, the conventional system suffers from a high bit error rate (BER) because of its data-sequence mapping algorithm. By optimizing the mapping algorithm, the presented model achieves a lower BER and higher efficiency.
Directed Differentiation of Embryonic Stem Cells Using a Bead-Based Combinatorial Screening Method
Tarunina, Marina; Hernandez, Diana; Johnson, Christopher J.; Rybtsov, Stanislav; Ramathas, Vidya; Jeyakumar, Mylvaganam; Watson, Thomas; Hook, Lilian; Medvinsky, Alexander; Mason, Chris; Choo, Yen
2014-01-01
We have developed a rapid, bead-based combinatorial screening method to determine optimal combinations of variables that direct stem cell differentiation to produce known or novel cell types having pre-determined characteristics. Here we describe three experiments comprising stepwise exposure of mouse or human embryonic cells to 10,000 combinations of serum-free differentiation media, through which we discovered multiple novel, efficient and robust protocols to generate a number of specific hematopoietic and neural lineages. We further demonstrate that the technology can be used to optimize existing protocols in order to substitute costly growth factors with bioactive small molecules and/or increase cell yield, and to identify in vitro conditions for the production of rare developmental intermediates such as an embryonic lymphoid progenitor cell that has not previously been reported. PMID:25251366
Turkett, Jeremy A; Bicker, Kevin L
2017-04-10
Growing prevalence of antibiotic resistant bacterial infections necessitates novel antimicrobials, which could be rapidly identified from combinatorial libraries. We report the use of the peptoid library agar diffusion (PLAD) assay to screen peptoid libraries against the ESKAPE pathogens, including the optimization of assay conditions for each pathogen. Work presented here focuses on the tailoring of combinatorial peptoid library design through a detailed study of how peptoid lipophilicity relates to antibacterial potency and mammalian cell toxicity. The information gleaned from this optimization was then applied using the aforementioned screening method to examine the relative potency of peptoid libraries against Staphylococcus aureus, Acinetobacter baumannii, and Enterococcus faecalis prior to and following functionalization with long alkyl tails. The data indicate that overall peptoid hydrophobicity and not simply alkyl tail length is strongly correlated with mammalian cell toxicity. Furthermore, this work demonstrates the utility of the PLAD assay in rapidly evaluating the effect of molecular property changes in similar libraries.
Solving multi-objective optimization problems in conservation with the reference point method
Dujardin, Yann; Chadès, Iadine
2018-01-01
Managing the biodiversity extinction crisis requires wise decision-making processes able to account for the limited resources available. In most decision problems in conservation biology, several conflicting objectives have to be taken into account. Most methods used in conservation either provide suboptimal solutions or use strong assumptions about the decision-maker’s preferences. Our paper reviews some of the existing approaches to solve multi-objective decision problems and presents new multi-objective linear programming formulations of two multi-objective optimization problems in conservation, allowing the use of a reference point approach. Reference point approaches solve multi-objective optimization problems by interactively representing the preferences of the decision-maker with a point in the criteria (objectives) space, called the reference point. We modelled and solved the following two problems in conservation: a dynamic multi-species management problem under uncertainty and a spatial allocation resource management problem. Results show that the reference point method outperforms classic methods while illustrating the use of an interactive methodology for solving combinatorial problems with multiple objectives. The method is general and can be adapted to a wide range of ecological combinatorial problems. PMID:29293650
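For context, reference point approaches typically optimize an achievement scalarizing function of the Wierzbicki type; the generic form below is standard background, not a formula copied from this paper.

```latex
% Maximization objectives f_1,\dots,f_k, reference (aspiration) point \bar{z},
% positive scaling weights w_i, and a small augmentation coefficient \rho > 0:
\min_{x \in X} \; \max_{i=1,\dots,k} \; w_i \bigl(\bar{z}_i - f_i(x)\bigr)
  \;+\; \rho \sum_{i=1}^{k} w_i \bigl(\bar{z}_i - f_i(x)\bigr).
% Minimizing this augmented Chebyshev distance to \bar{z} returns a Pareto-optimal
% solution close to the decision-maker's aspiration levels; moving \bar{z}
% interactively explores the Pareto front.
```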
Multiobjective optimization of combinatorial libraries.
Agrafiotis, D K
2002-01-01
Combinatorial chemistry and high-throughput screening have caused a fundamental shift in the way chemists contemplate experiments. Designing a combinatorial library is a controversial art that involves a heterogeneous mix of chemistry, mathematics, economics, experience, and intuition. Although there seems to be little agreement as to what constitutes an ideal library, one thing is certain: only one property or measure seldom defines the quality of the design. In most real-world applications, a good experiment requires the simultaneous optimization of several, often conflicting, design objectives, some of which may be vague and uncertain. In this paper, we discuss a class of algorithms for subset selection rooted in the principles of multiobjective optimization. Our approach is to employ an objective function that encodes all of the desired selection criteria, and then use a simulated annealing or evolutionary approach to identify the optimal (or a nearly optimal) subset from among the vast number of possibilities. Many design criteria can be accommodated, including diversity, similarity to known actives, predicted activity and/or selectivity determined by quantitative structure-activity relationship (QSAR) models or receptor binding models, enforcement of certain property distributions, reagent cost and availability, and many others. The method is robust, convergent, and extensible, offers the user full control over the relative significance of the various objectives in the final design, and permits the simultaneous selection of compounds from multiple libraries in full- or sparse-array format.
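A minimal sketch of the annealing-based subset selection described above, with a made-up two-term objective (a desirability score minus a cost penalty) standing in for the many design criteria the paper combines; pool size, weights, and cooling schedule are arbitrary assumptions.

```python
import math, random

random.seed(3)
N_POOL, K, STEPS = 200, 20, 5000

# Made-up per-compound properties: a "desirability" score and a reagent cost.
score = [random.gauss(0, 1) for _ in range(N_POOL)]
cost = [random.uniform(1, 10) for _ in range(N_POOL)]

def objective(subset):
    """Weighted multi-objective score to maximize: desirability minus cost."""
    return sum(score[i] for i in subset) - 0.1 * sum(cost[i] for i in subset)

current = set(random.sample(range(N_POOL), K))
best, best_val = set(current), objective(current)
T = 1.0
for step in range(STEPS):
    # Neighborhood move: swap one selected compound for one unselected compound.
    out = random.choice(sorted(current))
    inn = random.choice([i for i in range(N_POOL) if i not in current])
    cand = (current - {out}) | {inn}
    delta = objective(cand) - objective(current)
    if delta > 0 or random.random() < math.exp(delta / T):
        current = cand
        if objective(current) > best_val:
            best, best_val = set(current), objective(current)
    T *= 0.999  # geometric cooling

print("best objective:", round(best_val, 3))
```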
Gobin, Oliver C; Schüth, Ferdi
2008-01-01
Genetic algorithms are widely used to solve and optimize combinatorial problems and are more often applied for library design in combinatorial chemistry. Because of their flexibility, however, their implementation can be challenging. In this study, the influence of the representation of solid catalysts on the performance of genetic algorithms was systematically investigated on the basis of a new, constrained, multiobjective, combinatorial test problem with properties common to problems in combinatorial materials science. Constraints were satisfied by penalty functions, repair algorithms, or special representations. The tests were performed using three state-of-the-art evolutionary multiobjective algorithms by performing 100 optimization runs for each algorithm and test case. Experimental data obtained during the optimization of a noble metal-free solid catalyst system active in the selective catalytic reduction of nitric oxide with propene was used to build up a predictive model to validate the results of the theoretical test problem. A significant influence of the representation on the optimization performance was observed. Binary encodings were found to be the preferred encoding in most of the cases, and depending on the experimental test unit, repair algorithms or penalty functions performed best.
NASA Astrophysics Data System (ADS)
Gen, Mitsuo; Lin, Lin
Many combinatorial optimization problems arising in real-world industrial engineering and operations research are very complex in nature and quite hard to solve by conventional techniques. Since the 1960s, there has been an increasing interest in imitating living beings to solve such hard combinatorial optimization problems. Simulating the natural evolutionary process of human beings results in stochastic optimization techniques called evolutionary algorithms (EAs), which can often outperform conventional optimization methods when applied to difficult real-world problems. In this paper, we provide a comprehensive survey of the current state of the art in the use of EAs in manufacturing and logistics systems. To demonstrate that EAs are powerful and broadly applicable stochastic search and optimization techniques, we deal with the following engineering design problems: transportation planning models, layout design models, and two-stage logistics models in logistics systems; and job-shop scheduling and resource-constrained project scheduling in manufacturing systems.
Xu, Jiuping; Feng, Cuiying
2014-01-01
This paper presents an extension of the multimode resource-constrained project scheduling problem for a large scale construction project where multiple parallel projects and a fuzzy random environment are considered. By taking into account the most typical goals in project management, a cost/weighted makespan/quality trade-off optimization model is constructed. To deal with the uncertainties, a hybrid crisp approach is used to transform the fuzzy random parameters into fuzzy variables that are subsequently defuzzified using an expected value operator with an optimistic-pessimistic index. Then a combinatorial-priority-based hybrid particle swarm optimization algorithm is developed to solve the proposed model, where the combinatorial particle swarm optimization and priority-based particle swarm optimization are designed to assign modes to activities and to schedule activities, respectively. Finally, the results and analysis of a practical example at a large scale hydropower construction project are presented to demonstrate the practicality and efficiency of the proposed model and optimization method.
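As a rough illustration of the priority-based particle swarm idea only (not the paper's hybrid algorithm, fuzzy random model, or hydropower data), the sketch below evolves continuous priority vectors and decodes each particle into a job sequence by sorting, evaluating a simple weighted completion time objective.

```python
import numpy as np

rng = np.random.default_rng(4)
N_JOBS, N_PARTICLES, ITERS = 12, 30, 200

# Toy scheduling data: processing times and weights (stand-ins for real
# project-activity data, which the paper draws from its case study).
proc = rng.uniform(1, 10, N_JOBS)
weight = rng.uniform(1, 5, N_JOBS)

def decode_cost(position):
    """Priority-based decoding: sort jobs by priority value, then evaluate
    the total weighted completion time of that sequence."""
    order = np.argsort(-position)
    completion = np.cumsum(proc[order])
    return float(np.sum(weight[order] * completion))

x = rng.uniform(0, 1, (N_PARTICLES, N_JOBS))      # particle positions = priorities
v = np.zeros_like(x)
pbest = x.copy()
pbest_cost = np.array([decode_cost(p) for p in x])
g = pbest[pbest_cost.argmin()].copy()

for _ in range(ITERS):
    r1, r2 = rng.uniform(size=x.shape), rng.uniform(size=x.shape)
    v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (g - x)
    x = x + v
    costs = np.array([decode_cost(p) for p in x])
    improved = costs < pbest_cost
    pbest[improved], pbest_cost[improved] = x[improved], costs[improved]
    g = pbest[pbest_cost.argmin()].copy()

print("best weighted completion time:", round(float(pbest_cost.min()), 2))
```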
Xu, Jiuping
2014-01-01
This paper presents an extension of the multimode resource-constrained project scheduling problem for a large scale construction project where multiple parallel projects and a fuzzy random environment are considered. By taking into account the most typical goals in project management, a cost/weighted makespan/quality trade-off optimization model is constructed. To deal with the uncertainties, a hybrid crisp approach is used to transform the fuzzy random parameters into fuzzy variables that are subsequently defuzzified using an expected value operator with an optimistic-pessimistic index. Then a combinatorial-priority-based hybrid particle swarm optimization algorithm is developed to solve the proposed model, where the combinatorial particle swarm optimization and priority-based particle swarm optimization are designed to assign modes to activities and to schedule activities, respectively. Finally, the results and analysis of a practical example at a large scale hydropower construction project are presented to demonstrate the practicality and efficiency of the proposed model and optimization method. PMID:24550708
Quantum Resonance Approach to Combinatorial Optimization
NASA Technical Reports Server (NTRS)
Zak, Michail
1997-01-01
It is shown that quantum resonance can be used for combinatorial optimization. The advantage of the approach is in independence of the computing time upon the dimensionality of the problem. As an example, the solution to a constraint satisfaction problem of exponential complexity is demonstrated.
Global gene expression analysis by combinatorial optimization.
Ameur, Adam; Aurell, Erik; Carlsson, Mats; Westholm, Jakub Orzechowski
2004-01-01
Generally, there is a trade-off between methods of gene expression analysis that are precise but labor-intensive, e.g. RT-PCR, and methods that scale up to global coverage but are not quite as quantitative, e.g. microarrays. In the present paper, we show how a known method of gene expression profiling (K. Kato, Nucleic Acids Res. 23, 3685-3690 (1995)), which relies on a fairly small number of steps, can be turned into a global gene expression measurement by advanced data post-processing, with potentially little loss of accuracy. Post-processing here entails solving an ancillary combinatorial optimization problem. Validation is performed on in silico experiments generated from the FANTOM database of full-length mouse cDNA. We present two variants of the method. One uses state-of-the-art commercial software for solving problems of this kind; the other uses a code developed by us specifically for this purpose, released in the public domain under the GPL license.
Optimization Strategies for Sensor and Actuator Placement
NASA Technical Reports Server (NTRS)
Padula, Sharon L.; Kincaid, Rex K.
1999-01-01
This paper provides a survey of actuator and sensor placement problems from a wide range of engineering disciplines and a variety of applications. Combinatorial optimization methods are recommended as a means for identifying sets of actuators and sensors that maximize performance. Several sample applications from NASA Langley Research Center, such as active structural acoustic control, are covered in detail. Laboratory and flight tests of these applications indicate that actuator and sensor placement methods are effective and important. Lessons learned in solving these optimization problems can guide future research.
Carbon Nanotubes by CVD and Applications
NASA Technical Reports Server (NTRS)
Cassell, Alan; Delzeit, Lance; Nguyen, Cattien; Stevens, Ramsey; Han, Jie; Meyyappan, M.; Arnold, James O. (Technical Monitor)
2001-01-01
Carbon nanotube (CNT) exhibits extraordinary mechanical and unique electronic properties and offers significant potential for structural, sensor, and nanoelectronics applications. An overview of CNT, growth methods, properties and applications is provided. Single-wall, and multi-wall CNTs have been grown by chemical vapor deposition. Catalyst development and optimization has been accomplished using combinatorial optimization methods. CNT has also been grown from the tips of silicon cantilevers for use in atomic force microscopy.
Combinatorial Strategies for the Development of Bulk Metallic Glasses
NASA Astrophysics Data System (ADS)
Ding, Shiyan
The systematic identification of multi-component alloys out of the vast composition space is still a daunting task, especially in the development of bulk metallic glasses that are typically based on three or more elements. In order to address this challenge, combinatorial approaches have been proposed. However, previous attempts have not successfully coupled the synthesis of combinatorial libraries with high-throughput characterization methods. The goal of my dissertation is to develop efficient high-throughput characterization methods, optimized to identify glass formers systematically. Here, two innovative approaches have been invented. One is to measure the nucleation temperature in parallel for up-to 800 compositions. The composition with the lowest nucleation temperature has a reasonable agreement with the best-known glass forming composition. In addition, the thermoplastic formability of a metallic glass forming system is determined through blow molding a compositional library. Our results reveal that the composition with the largest thermoplastic deformation correlates well with the best-known formability composition. I have demonstrated both methods as powerful tools to develop new bulk metallic glasses.
Creating IRT-Based Parallel Test Forms Using the Genetic Algorithm Method
ERIC Educational Resources Information Center
Sun, Koun-Tem; Chen, Yu-Jen; Tsai, Shu-Yen; Cheng, Chien-Fen
2008-01-01
In educational measurement, the construction of parallel test forms is often a combinatorial optimization problem that involves the time-consuming selection of items to construct tests having approximately the same test information functions (TIFs) and constraints. This article proposes a novel method, genetic algorithm (GA), to construct parallel…
Awwal, Abdul; Diaz-Ramirez, Victor H.; Cuevas, Andres; ...
2014-10-23
Composite correlation filters are used for solving a wide variety of pattern recognition problems. These filters are given by a combination of several training templates chosen by a designer in an ad hoc manner. In this work, we present a new approach for the design of composite filters based on multi-objective combinatorial optimization. Given a vast search space of training templates, an iterative algorithm is used to synthesize a filter with an optimized performance in terms of several competing criteria. Furthermore, by employing a suggested binary-search procedure a filter bank with a minimum number of filters can be constructed for a prespecified trade-off of performance metrics. Computer simulation results obtained with the proposed method in recognizing geometrically distorted versions of a target in cluttered and noisy scenes are discussed and compared in terms of recognition performance and complexity with existing state-of-the-art filters.
Rationally reduced libraries for combinatorial pathway optimization minimizing experimental effort.
Jeschek, Markus; Gerngross, Daniel; Panke, Sven
2016-03-31
Rational flux design in metabolic engineering approaches remains difficult since important pathway information is frequently not available. Therefore empirical methods are applied that randomly change absolute and relative pathway enzyme levels and subsequently screen for variants with improved performance. However, screening is often limited on the analytical side, generating a strong incentive to construct small but smart libraries. Here we introduce RedLibs (Reduced Libraries), an algorithm that allows for the rational design of smart combinatorial libraries for pathway optimization thereby minimizing the use of experimental resources. We demonstrate the utility of RedLibs for the design of ribosome-binding site libraries by in silico and in vivo screening with fluorescent proteins and perform a simple two-step optimization of the product selectivity in the branched multistep pathway for violacein biosynthesis, indicating a general applicability for the algorithm and the proposed heuristics. We expect that RedLibs will substantially simplify the refactoring of synthetic metabolic pathways.
TARCMO: Theory and Algorithms for Robust, Combinatorial, Multicriteria Optimization
2016-11-28
The report addresses the recoverable robust traveling salesman problem and a bicriteria approach to robust optimization. The traveling salesman problem (TSP) is a well-known combinatorial optimization problem, and the report describes an iterative procedure that yields an optimal solution to the robust TSP.
Combinatorial Multiobjective Optimization Using Genetic Algorithms
NASA Technical Reports Server (NTRS)
Crossley, William A.; Martin, Eric T.
2002-01-01
The research proposed in this document investigated multiobjective optimization approaches based upon the Genetic Algorithm (GA). Several versions of the GA have been adopted for multiobjective design, but, prior to this research, there had not been significant comparisons of the most popular strategies. The research effort first generalized the two-branch tournament genetic algorithm into an N-branch genetic algorithm; the N-branch GA was then compared with a version of the popular Multi-Objective Genetic Algorithm (MOGA). Because the genetic algorithm is well suited to combinatorial (mixed discrete/continuous) optimization problems, the GA can be used in the conceptual phase of design to combine selection (discrete variable) and sizing (continuous variable) tasks. Using a multiobjective formulation for the design of a 50-passenger aircraft to meet the competing objectives of minimizing takeoff gross weight and minimizing trip time, the GA generated a range of tradeoff designs that illustrate which aircraft features change from a low-weight, slow trip-time aircraft design to a heavy-weight, short trip-time aircraft design. Given the objective formulation and analysis methods used, the results of this study identify where turboprop-powered aircraft and turbofan-powered aircraft become more desirable for the 50-seat passenger application. This aircraft design application also begins to suggest how a combinatorial multiobjective optimization technique could be used to assist in the design of morphing aircraft.
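As an illustration of the branch-per-objective idea behind the N-branch tournament, the minimal Python sketch below fills the mating pool by cycling through the objectives and holding a binary tournament on each one. The branch sizing, tie-breaking, and diversity mechanisms of the actual operator are not reproduced here, and all names and the toy design data are illustrative assumptions only.

```python
import random

def n_branch_tournament(population, objectives, n_select):
    """Minimal sketch of branch-per-objective tournament selection:
    each branch uses one objective (all objectives are minimized)."""
    n_branch = len(objectives)
    selected = []
    for k in range(n_select):
        f = objectives[k % n_branch]          # branches take turns
        a, b = random.sample(population, 2)   # binary tournament inside the branch
        selected.append(a if f(a) <= f(b) else b)
    return selected

# hypothetical two-objective example (takeoff weight vs. trip time of candidate designs)
designs = [(60.0, 3.2), (55.0, 3.9), (70.0, 2.8), (65.0, 3.0)]
parents = n_branch_tournament(designs, [lambda d: d[0], lambda d: d[1]], 4)
```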
Human Performance on the Traveling Salesman and Related Problems: A Review
ERIC Educational Resources Information Center
MacGregor, James N.; Chu, Yun
2011-01-01
The article provides a review of recent research on human performance on the traveling salesman problem (TSP) and related combinatorial optimization problems. We discuss what combinatorial optimization problems are, why they are important, and why they may be of interest to cognitive scientists. We next describe the main characteristics of human…
Osaba, E; Carballedo, R; Diaz, F; Onieva, E; de la Iglesia, I; Perallos, A
2014-01-01
Since their first formulation, genetic algorithms (GAs) have been one of the most widely used techniques to solve combinatorial optimization problems. The basic structure of the GAs is known by the scientific community, and thanks to their easy application and good performance, GAs are the focus of a lot of research works annually. Although throughout history there have been many studies analyzing various concepts of GAs, in the literature there are few studies that analyze objectively the influence of using blind crossover operators for combinatorial optimization problems. For this reason, in this paper a deep study on the influence of using them is conducted. The study is based on a comparison of nine techniques applied to four well-known combinatorial optimization problems. Six of the techniques are GAs with different configurations, and the remaining three are evolutionary algorithms that focus exclusively on the mutation process. Finally, to perform a reliable comparison of these results, a statistical study of them is made, performing the normal distribution z-test.
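For concreteness, one common example of a "blind" crossover for permutation-encoded combinatorial problems is an order-based crossover, which recombines parents using no problem-specific information. The sketch below is a generic variant assumed for illustration; it is not necessarily one of the nine techniques compared in the paper.

```python
import random

def order_based_crossover(p1, p2):
    """A generic blind crossover for permutation encodings: a slice of
    parent 1 is kept, and the remaining genes are filled in the relative
    order in which they appear in parent 2. No problem knowledge is used."""
    n = len(p1)
    a, b = sorted(random.sample(range(n), 2))
    child = [None] * n
    child[a:b + 1] = p1[a:b + 1]
    fill = [g for g in p2 if g not in child]
    free = [i for i in range(n) if child[i] is None]
    for i, g in zip(free, fill):
        child[i] = g
    return child

# example on a permutation of 8 cities
print(order_based_crossover(list(range(8)), [7, 6, 5, 4, 3, 2, 1, 0]))
```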
Wang, Lipo; Li, Sa; Tian, Fuyu; Fu, Xiuju
2004-10-01
Recently Chen and Aihara have demonstrated both experimentally and mathematically that their chaotic simulated annealing (CSA) has better search ability for solving combinatorial optimization problems compared to both the Hopfield-Tank approach and stochastic simulated annealing (SSA). However, CSA may not find a globally optimal solution no matter how slowly annealing is carried out, because the chaotic dynamics are completely deterministic. In contrast, SSA tends to settle down to a global optimum if the temperature is reduced sufficiently slowly. Here we combine the best features of both SSA and CSA, thereby proposing a new approach for solving optimization problems, i.e., stochastic chaotic simulated annealing, by using a noisy chaotic neural network. We show the effectiveness of this new approach with two difficult combinatorial optimization problems, i.e., a traveling salesman problem and a channel assignment problem for cellular mobile communications.
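The classical stochastic simulated annealing baseline referred to above can be sketched for the traveling salesman problem as follows; the noisy chaotic neural network itself is not reproduced here, and the cooling parameters are arbitrary illustrative choices.

```python
import math
import random

def ssa_tsp(dist, T0=10.0, cooling=0.9995, steps=100000):
    """Stochastic simulated annealing for a TSP given a distance matrix:
    2-opt moves accepted by the Metropolis rule under geometric cooling."""
    n = len(dist)
    tour = list(range(n))
    random.shuffle(tour)
    tour_len = lambda t: sum(dist[t[i]][t[(i + 1) % n]] for i in range(n))
    cur = tour_len(tour)
    best, best_len, T = tour[:], cur, T0
    for _ in range(steps):
        i, j = sorted(random.sample(range(n), 2))
        cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]   # reverse a segment
        delta = tour_len(cand) - cur
        if delta < 0 or random.random() < math.exp(-delta / T):
            tour, cur = cand, cur + delta
        if cur < best_len:
            best, best_len = tour[:], cur
        T *= cooling                                           # slow cooling
    return best, best_len
```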
Aerospace applications of integer and combinatorial optimization
NASA Technical Reports Server (NTRS)
Padula, S. L.; Kincaid, R. K.
1995-01-01
Research supported by NASA Langley Research Center includes many applications of aerospace design optimization and is conducted by teams of applied mathematicians and aerospace engineers. This paper investigates the benefits from this combined expertise in solving combinatorial optimization problems. Applications range from the design of large space antennas to interior noise control. A typical problem, for example, seeks the optimal locations for vibration-damping devices on a large space structure and is expressed as a mixed/integer linear programming problem with more than 1500 design variables.
Cruz-Monteagudo, Maykel; Borges, Fernanda; Cordeiro, M Natália D S; Cagide Fajin, J Luis; Morell, Carlos; Ruiz, Reinaldo Molina; Cañizares-Carmenate, Yudith; Dominguez, Elena Rosa
2008-01-01
Up to now, very few applications of multiobjective optimization (MOOP) techniques to quantitative structure-activity relationship (QSAR) studies have been reported in the literature. However, none of them report the optimization of objectives related directly to the final pharmaceutical profile of a drug. In this paper, a MOOP method based on Derringer's desirability function that allows conducting global QSAR studies, simultaneously considering the potency, bioavailability, and safety of a set of drug candidates, is introduced. The results of the desirability-based MOOP (the levels of the predictor variables concurrently producing the best possible compromise between the properties determining an optimal drug candidate) are used for the implementation of a ranking method that is also based on the application of desirability functions. This method allows ranking drug candidates with unknown pharmaceutical properties from combinatorial libraries according to the degree of similarity with the previously determined optimal candidate. Application of this method will make it possible to filter the most promising drug candidates of a library (the best-ranked candidates), which should have the best pharmaceutical profile (the best compromise between potency, safety and bioavailability). In addition, a validation method of the ranking process, as well as a quantitative measure of the quality of a ranking, the ranking quality index (Psi), is proposed. The usefulness of the desirability-based methods of MOOP and ranking is demonstrated by its application to a library of 95 fluoroquinolones, reporting their gram-negative antibacterial activity and mammalian cell cytotoxicity. Finally, the combined use of the desirability-based methods of MOOP and ranking proposed here seems to be a valuable tool for rational drug discovery and development.
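As a sketch of the desirability machinery (not the authors' exact transforms, whose bounds, weights and one- or two-sided forms are application-specific), a Derringer-type one-sided desirability and its geometric-mean aggregation can be written as:

```python
def desirability_larger_is_better(y, low, target, s=1.0):
    """Derringer-type one-sided desirability: 0 at or below `low`,
    1 at or above `target`, and a power-law ramp in between."""
    if y <= low:
        return 0.0
    if y >= target:
        return 1.0
    return ((y - low) / (target - low)) ** s

def overall_desirability(values, bounds):
    """Geometric mean of the individual desirabilities; `bounds` holds
    (low, target) pairs for each property (e.g. potency, safety)."""
    prod = 1.0
    for y, (low, target) in zip(values, bounds):
        prod *= desirability_larger_is_better(y, low, target)
    return prod ** (1.0 / len(values))

# hypothetical candidate with potency, bioavailability and safety scores
print(overall_desirability([0.8, 0.5, 0.9], [(0.0, 1.0)] * 3))
```

Candidates can then be ranked by their overall desirability, mirroring the desirability-based ranking step described in the abstract.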
Jézéquel, Laetitia; Loeper, Jacqueline; Pompon, Denis
2008-11-01
Combinatorial libraries coding for mosaic enzymes with predefined crossover points constitute useful tools to address and model structure-function relationships and for functional optimization of enzymes based on multivariate statistics. The presented method, called sequence-independent generation of a chimera-ordered library (SIGNAL), allows easy shuffling of any predefined amino acid segment between two or more proteins. This method is particularly well adapted to the exchange of protein structural modules. The procedure could also be well suited to generate ordered combinatorial libraries independent of sequence similarities in a robotized manner. Sequence segments to be recombined are first extracted by PCR from a single-stranded template coding for an enzyme of interest using a biotin-avidin-based method. This technique allows the reduction of parental template contamination in the final library. Specific PCR primers allow amplification of two complementary mosaic DNA fragments, overlapping in the region to be exchanged. Fragments are finally reassembled using a fusion PCR. The process is illustrated via the construction of a set of mosaic CYP2B enzymes using this highly modular approach.
Gooding, Owen W
2004-06-01
The use of parallel synthesis techniques with statistical design of experiment (DoE) methods is a powerful combination for the optimization of chemical processes. Advances in parallel synthesis equipment and easy to use software for statistical DoE have fueled a growing acceptance of these techniques in the pharmaceutical industry. As drug candidate structures become more complex at the same time that development timelines are compressed, these enabling technologies promise to become more important in the future.
Podlewska, Sabina; Czarnecki, Wojciech M; Kafel, Rafał; Bojarski, Andrzej J
2017-02-27
The growing computational abilities of various tools that are applied in the broadly understood field of computer-aided drug design have led to the extreme popularity of virtual screening in the search for new biologically active compounds. Most often, the source of such molecules consists of commercially available compound databases, but they can also be searched for within the libraries of structures generated in silico from existing ligands. Various computational combinatorial approaches are based solely on the chemical structure of compounds, using different types of substitutions for new molecule formation. In this study, the starting point for combinatorial library generation was the fingerprint referring to the optimal substructural composition in terms of the activity toward a considered target, which was obtained using a machine learning-based optimization procedure. The systematic enumeration of all possible connections between preferred substructures resulted in the formation of target-focused libraries of new potential ligands. The compounds were initially assessed by machine learning methods using a hashed fingerprint to represent molecules; the distribution of their physicochemical properties was also investigated, as well as their synthetic accessibility. The examination of various fingerprints and machine learning algorithms indicated that the Klekota-Roth fingerprint and support vector machine were an optimal combination for such experiments. This study was performed for 8 protein targets, and the obtained compound sets and their characterization are publicly available at http://skandal.if-pan.krakow.pl/comb_lib/ .
Aerospace Applications of Integer and Combinatorial Optimization
NASA Technical Reports Server (NTRS)
Padula, S. L.; Kincaid, R. K.
1995-01-01
Research supported by NASA Langley Research Center includes many applications of aerospace design optimization and is conducted by teams of applied mathematicians and aerospace engineers. This paper investigates the benefits from this combined expertise in formulating and solving integer and combinatorial optimization problems. Applications range from the design of large space antennas to interior noise control. A typical problem, for example, seeks the optimal locations for vibration-damping devices on an orbiting platform and is expressed as a mixed/integer linear programming problem with more than 1500 design variables.
Dynamical analysis of continuous higher-order hopfield networks for combinatorial optimization.
Atencia, Miguel; Joya, Gonzalo; Sandoval, Francisco
2005-08-01
In this letter, the ability of higher-order Hopfield networks to solve combinatorial optimization problems is assessed by means of a rigorous analysis of their properties. The stability of the continuous network is almost completely clarified: (1) hyperbolic interior equilibria, which are unfeasible, are unstable; (2) the state cannot escape from the unitary hypercube; and (3) a Lyapunov function exists. Numerical methods used to implement the continuous equation on a computer should be designed with the aim of preserving these favorable properties. The case of nonhyperbolic fixed points, which occur when the Hessian of the target function is the null matrix, requires further study. We prove that these nonhyperbolic interior fixed points are unstable in networks with three neurons and order two. The conjecture that interior equilibria are unstable in the general case is left open.
Synthesis and characterization of catalysts and electrocatalysts using combinatorial methods
NASA Astrophysics Data System (ADS)
Ramanathan, Ramnarayanan
This thesis documents attempts at solving three problems. Bead-based parallel synthesis and screening methods based on matrix algorithms were developed. The method was applied to search for new heterogeneous catalysts for the dehydrogenation of methylcyclohexane. The most powerful use of the method to date was to optimize metal adsorption and to evaluate catalysts as a function of incident energy, which is likely to become important should energy availability be an optimization parameter. This work also highlighted the importance of the order of addition of metal salts on catalytic activity, and a portion of this work resulted in a patent with UOP LLC, Des Plaines, Illinois. Combinatorial methods were also investigated as a tool to search for carbon-monoxide-tolerant anode electrocatalysts and methanol-tolerant cathode electrocatalysts, but no new electrocatalysts were discovered. A physically intuitive scaling criterion was developed to analyze all experiments on electrocatalysts, providing insight for future experiments. We also attempted to solve the CO poisoning problem in polymer electrolyte fuel cells using carbon molecular sieves as a separator. This approach was unsuccessful, possibly due to the tendency of the carbon molecular sieves to concentrate CO and CO2 in the pore walls.
NASA Astrophysics Data System (ADS)
Masuda, Kazuaki; Aiyoshi, Eitaro
We propose a method for solving optimal price decision problems for simultaneous multi-article auctions. The auction problem, originally formulated as a combinatorial problem, determines both whether each seller sells his/her article and which article(s) each buyer buys, so that the total utility of buyers and sellers is maximized. Using duality theory, we transform it equivalently into a dual problem in which the Lagrange multipliers are interpreted as the articles' transaction prices. As the dual problem is a continuous optimization problem with respect to the multipliers (i.e., the transaction prices), we propose a numerical method to solve it by applying heuristic global search methods. In this paper, Particle Swarm Optimization (PSO) is used to solve the dual problem, and experimental results are presented to show the validity of the proposed method.
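A minimal particle swarm optimizer over the continuous multiplier (price) space might look like the sketch below. The dual objective `dual_value` is a placeholder assumption, since evaluating the true dual requires solving the buyers' and sellers' inner problems for a given price vector; the swarm parameters are generic textbook values.

```python
import random

def pso_minimize(f, dim, n_particles=30, iters=200,
                 w=0.7, c1=1.5, c2=1.5, lo=0.0, hi=10.0):
    """Generic particle swarm optimizer for a continuous objective f;
    lo/hi only bound the initial positions."""
    X = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    V = [[0.0] * dim for _ in range(n_particles)]
    pbest = [x[:] for x in X]
    pbest_val = [f(x) for x in X]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                V[i][d] = (w * V[i][d]
                           + c1 * r1 * (pbest[i][d] - X[i][d])
                           + c2 * r2 * (gbest[d] - X[i][d]))
                X[i][d] += V[i][d]
            val = f(X[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = X[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = X[i][:], val
    return gbest, gbest_val

# placeholder stand-in for the dual objective over three article prices
dual_value = lambda prices: sum((p - 5.0) ** 2 for p in prices)
prices, value = pso_minimize(dual_value, dim=3)
```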
NASA Astrophysics Data System (ADS)
Yan, Zongkai; Zhang, Xiaokun; Li, Guang; Cui, Yuxing; Jiang, Zhaolian; Liu, Wen; Peng, Zhi; Xiang, Yong
2018-01-01
Conventional wet-process methods for designing and preparing thin films remain challenging because they are time-consuming and inefficient, which hinders the development of novel materials. Herein, we present a high-throughput combinatorial technique for continuous thin film preparation based on chemical bath deposition (CBD). The method is ideally suited to preparing high-throughput combinatorial material libraries with low decomposition temperatures and high water- or oxygen-sensitivity at relatively high temperature. To validate this system, a Cu(In, Ga)Se (CIGS) thin film library doped with 0-19.04 at.% of antimony (Sb) was taken as an example to systematically evaluate the effect of varying Sb doping concentration on the grain growth, structure, morphology and electrical properties of CIGS thin films. Combined with the Energy Dispersive Spectrometer (EDS), X-ray Photoelectron Spectroscopy (XPS), automated X-ray Diffraction (XRD) for rapid screening and Localized Electrochemical Impedance Spectroscopy (LEIS), it was confirmed that this combinatorial high-throughput system can systematically identify the composition with the optimal grain orientation growth, microstructure and electrical properties, through accurately monitoring the doping content and material composition. Based on the characterization results, a Sb2Se3 quasi-liquid-phase-promoted CIGS film-growth model is put forward. In addition to the CIGS thin film reported here, the combinatorial CBD could also be applied to the high-throughput screening of other sulfide thin film material systems.
Combinatorial therapy discovery using mixed integer linear programming.
Pang, Kaifang; Wan, Ying-Wooi; Choi, William T; Donehower, Lawrence A; Sun, Jingchun; Pant, Dhruv; Liu, Zhandong
2014-05-15
Combinatorial therapies play increasingly important roles in combating complex diseases. Owing to the huge cost associated with experimental methods in identifying optimal drug combinations, computational approaches can provide a guide to limit the search space and reduce cost. However, few computational approaches have been developed for this purpose, and thus there is a great need of new algorithms for drug combination prediction. Here we proposed to formulate the optimal combinatorial therapy problem into two complementary mathematical algorithms, Balanced Target Set Cover (BTSC) and Minimum Off-Target Set Cover (MOTSC). Given a disease gene set, BTSC seeks a balanced solution that maximizes the coverage on the disease genes and minimizes the off-target hits at the same time. MOTSC seeks a full coverage on the disease gene set while minimizing the off-target set. Through simulation, both BTSC and MOTSC demonstrated a much faster running time over exhaustive search with the same accuracy. When applied to real disease gene sets, our algorithms not only identified known drug combinations, but also predicted novel drug combinations that are worth further testing. In addition, we developed a web-based tool to allow users to iteratively search for optimal drug combinations given a user-defined gene set. Our tool is freely available for noncommercial use at http://www.drug.liuzlab.org/. zhandong.liu@bcm.edu Supplementary data are available at Bioinformatics online.
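The set-cover framing can be illustrated with the standard greedy heuristic below. This is not the BTSC or MOTSC algorithm itself (those balance disease-gene coverage against off-target hits via mixed integer linear programming); it is only a sketch of the underlying combinatorial structure, with hypothetical drug and gene names.

```python
def greedy_set_cover(disease_genes, drug_targets):
    """Greedy heuristic: repeatedly pick the drug covering the most
    still-uncovered disease genes. drug_targets maps drug -> set of genes."""
    uncovered = set(disease_genes)
    chosen = []
    while uncovered:
        best = max(drug_targets, key=lambda d: len(drug_targets[d] & uncovered))
        if not drug_targets[best] & uncovered:
            break                               # remaining genes are hit by no drug
        chosen.append(best)
        uncovered -= drug_targets[best]
    return chosen

# hypothetical example
drugs = {"drugA": {"g1", "g2"}, "drugB": {"g2", "g3"}, "drugC": {"g4"}}
print(greedy_set_cover({"g1", "g2", "g3", "g4"}, drugs))
```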
Liu, Zhi-Hua; Xie, Shangxian; Lin, Furong; Jin, Mingjie; Yuan, Joshua S
2018-01-01
Lignin valorization has recently been considered to be an essential process for sustainable and cost-effective biorefineries. Lignin represents a potential new feedstock for value-added products. Oleaginous bacteria such as Rhodococcus opacus can produce intracellular lipids from biodegradation of aromatic substrates. These lipids can be used for biofuel production, which can potentially replace petroleum-derived chemicals. However, the low reactivity of lignin produced from pretreatment and the underdeveloped fermentation technology hindered lignin bioconversion to lipids. In this study, combinatorial pretreatment with an optimized fermentation strategy was evaluated to improve lignin valorization into lipids using R. opacus PD630. As opposed to single pretreatment, combinatorial pretreatment produced a 12.8-75.6% higher lipid concentration in fermentation using lignin as the carbon source. Gas chromatography-mass spectrometry analysis showed that combinatorial pretreatment released more aromatic monomers, which could be more readily utilized by lignin-degrading strains. Three detoxification strategies were used to remove potential inhibitors produced from pretreatment. After heating detoxification of the lignin stream, the lipid concentration further increased by 2.9-9.7%. Different fermentation strategies were evaluated in scale-up lipid fermentation using a 2.0-l fermenter. With laccase treatment of the lignin stream produced from combinatorial pretreatment, the highest cell dry weight and lipid concentration were 10.1 and 1.83 g/l, respectively, in fed-batch fermentation, with a total soluble substrate concentration of 40 g/l. The improvement of the lipid fermentation performance may have resulted from lignin depolymerization by the combinatorial pretreatment and laccase treatment, reduced inhibition effects by fed-batch fermentation, adequate oxygen supply, and an accurate pH control in the fermenter. Overall, these results demonstrate that combinatorial pretreatment, together with fermentation optimization, favorably improves lipid production using lignin as the carbon source. Combinatorial pretreatment integrated with fed-batch fermentation was an effective strategy to improve the bioconversion of lignin into lipids, thus facilitating lignin valorization in biorefineries.
Azimi, Sayyed M; Sheridan, Steven D; Ghannad-Rezaie, Mostafa; Eimon, Peter M; Yanik, Mehmet Fatih
2018-05-01
Identification of optimal transcription-factor expression patterns to direct cellular differentiation along a desired pathway presents significant challenges. We demonstrate massively combinatorial screening of temporally varying mRNA transcription factors to direct differentiation of neural progenitor cells using a dynamically reconfigurable, magnetically guided spotting technology for localizing mRNA, enabling experiments on millimetre-size spots. In addition, we present a time-interleaved delivery method that dramatically reduces fluctuations in the delivered transcription-factor copy numbers per cell. We screened combinatorial and temporal delivery of a pool of midbrain-specific transcription factors to augment the generation of dopaminergic neurons. We show that the combinatorial delivery of LMX1A, FOXA2 and PITX3 is highly effective in generating dopaminergic neurons from midbrain progenitors. We show that LMX1A significantly increases TH-expression levels when delivered to neural progenitor cells either during proliferation or after induction of neural differentiation, while FOXA2 and PITX3 increase expression only when delivered prior to induction, demonstrating the temporal dependence of factor addition. © 2018, Azimi et al.
NASA Astrophysics Data System (ADS)
Rahman, P. A.
2018-05-01
This scientific paper deals with a model of the knapsack optimization problem and a method for solving it based on directed combinatorial search in the boolean space. The author's specialized mathematical model for decomposing the search zone into separate search spheres, and the algorithm for distributing the search spheres across the cores of a multi-core processor, are also discussed. The paper also provides an example of decomposing the search zone into several search spheres and distributing the search spheres to the different cores of a quad-core processor. Finally, a formula proposed by the author for estimating the theoretical maximum computational speedup, which can be achieved by parallelizing the search zone into search spheres over an unlimited number of processor cores, is also given.
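One standard form of such a directed combinatorial search over boolean take/skip decisions is a depth-first branch and bound with a fractional-relaxation bound, sketched below for the 0/1 knapsack problem. The search-sphere decomposition and multi-core distribution described in the abstract are not reproduced in this single-threaded sketch, and the example data are arbitrary.

```python
def knapsack_branch_and_bound(values, weights, capacity):
    """Depth-first branch and bound over boolean take/skip decisions,
    pruned with the fractional-relaxation upper bound."""
    order = sorted(range(len(values)),
                   key=lambda i: values[i] / weights[i], reverse=True)
    v = [values[i] for i in order]
    w = [weights[i] for i in order]
    n, best = len(v), 0

    def bound(i, cap, val):
        # optimistic value: greedily add items, allowing one fractional item
        while i < n and w[i] <= cap:
            cap -= w[i]; val += v[i]; i += 1
        return val + (v[i] * cap / w[i] if i < n else 0.0)

    def dfs(i, cap, val):
        nonlocal best
        best = max(best, val)
        if i == n or bound(i, cap, val) <= best:
            return
        if w[i] <= cap:
            dfs(i + 1, cap - w[i], val + v[i])   # take item i
        dfs(i + 1, cap, val)                     # skip item i

    dfs(0, capacity, 0)
    return best

print(knapsack_branch_and_bound([60, 100, 120], [10, 20, 30], 50))  # prints 220
```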
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bernal, Andrés; Patiny, Luc; Castillo, Andrés M.
2015-02-21
Nuclear magnetic resonance (NMR) assignment of small molecules is presented as a typical example of a combinatorial optimization problem in chemical physics. Three strategies that help improve the efficiency of solution search by the branch and bound method are presented: 1. reduction of the size of the solution space by resort to a condensed structure formula, wherein symmetric nuclei are grouped together; 2. partitioning of the solution space based on symmetry, which becomes the basis for an efficient branching procedure; and 3. a criterion for the selection of input restrictions that leads to increased gaps between branches and thus faster pruning of non-viable solutions. Although the examples chosen to illustrate this work focus on small-molecule NMR assignment, the results are generic and might help in solving other combinatorial optimization problems.
Hydrogel design of experiments methodology to optimize hydrogel for iPSC-NPC culture.
Lam, Jonathan; Carmichael, S Thomas; Lowry, William E; Segura, Tatiana
2015-03-11
Bioactive signals can be incorporated in hydrogels to direct encapsulated cell behavior. Design of experiments methodology systematically varies the signals to determine the individual and combinatorial effects of each factor on cell activity. Using this approach enables the optimization of three ligand concentrations (RGD, YIGSR, IKVAV) for the survival and differentiation of neural progenitor cells. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Antolini, Ermete
2017-02-13
Combinatorial chemistry and high-throughput screening represent an innovative and rapid tool to prepare and evaluate a large number of new materials, saving time and expense for research and development. Considering that the activity and selectivity of catalysts depend on complex kinetic phenomena, making their development largely empirical in practice, they are prime candidates for combinatorial discovery and optimization. This review presents an overview of recent results of combinatorial screening of low-temperature fuel cell electrocatalysts for methanol oxidation. Optimum catalyst compositions obtained by combinatorial screening were compared with those of bulk catalysts, and the effect of the library geometry on the screening of catalyst composition is highlighted.
Stochastic dynamics and combinatorial optimization
NASA Astrophysics Data System (ADS)
Ovchinnikov, Igor V.; Wang, Kang L.
2017-11-01
Natural dynamics is often dominated by sudden nonlinear processes such as neuroavalanches, gamma-ray bursts, solar flares, etc., that exhibit scale-free statistics much in the spirit of the logarithmic Richter scale for earthquake magnitudes. On phase diagrams, stochastic dynamical systems (DSs) exhibiting this type of dynamics belong to the finite-width phase (N-phase for brevity) that precedes ordinary chaotic behavior and that is known under such names as noise-induced chaos, self-organized criticality, dynamical complexity, etc. Within the recently proposed supersymmetric theory of stochastic dynamics, the N-phase can be roughly interpreted as the noise-induced “overlap” between integrable and chaotic deterministic dynamics. As a result, the N-phase dynamics inherits the properties of both. Here, we analyze this unique set of properties and conclude that N-phase DSs must naturally be the most efficient optimizers: on one hand, N-phase DSs have integrable flows with well-defined attractors that can be associated with candidate solutions and, on the other hand, the noise-induced attractor-to-attractor dynamics in the N-phase is effectively chaotic or aperiodic, so that a DS must avoid revisiting solutions/attractors, thus accelerating the search for the best solution. Based on this understanding, we propose a method for stochastic dynamical optimization using N-phase DSs. This method can be viewed as a hybrid of the simulated and chaotic annealing methods. Our proposition can result in a new generation of hardware devices for efficient solution of various search and/or combinatorial optimization problems.
Optimization of Highway Work Zone Decisions Considering Short-Term and Long-Term Impacts
2010-01-01
The study seeks the combination of lane closure and traffic control strategies that minimizes the one-time work zone cost. Considering the complex and combinatorial nature of this optimization problem, a heuristic approach is used. (Notation from the report: NV, the number of vehicle classes; NPV, net present value; p'(t), adjusted traffic diversion rate at time t; p(t), natural diversion rate.)
One step DNA assembly for combinatorial metabolic engineering.
Coussement, Pieter; Maertens, Jo; Beauprez, Joeri; Van Bellegem, Wouter; De Mey, Marjan
2014-05-01
The rapid and efficient assembly of multi-step metabolic pathways for generating microbial strains with desirable phenotypes is a critical procedure for metabolic engineering, and remains a significant challenge in synthetic biology. Although several DNA assembly methods have been developed and applied for metabolic pathway engineering, many of them are limited by their suitability for combinatorial pathway assembly. The introduction of transcriptional (promoters), translational (ribosome binding site (RBS)) and enzyme (mutant genes) variability to modulate pathway expression levels is essential for generating balanced metabolic pathways and maximizing the productivity of a strain. We report a novel, highly reliable and rapid single strand assembly (SSA) method for pathway engineering. The method was successfully optimized and applied to create constructs containing promoter, RBS and/or mutant enzyme libraries. To demonstrate its efficiency and reliability, the method was applied to fine-tune multi-gene pathways. Two promoter libraries were simultaneously introduced in front of two target genes, enabling orthogonal expression as demonstrated by principal component analysis. This shows that SSA will increase our ability to tune multi-gene pathways at all control levels for the biotechnological production of complex metabolites, achievable through the combinatorial modulation of transcription, translation and enzyme activity. Copyright © 2014 International Metabolic Engineering Society. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Doerr, Timothy P.; Alves, Gelio; Yu, Yi-Kuo
2005-08-01
Typical combinatorial optimizations are NP-hard; however, for a particular class of cost functions the corresponding combinatorial optimizations can be solved in polynomial time using the transfer matrix technique or, equivalently, the dynamic programming approach. This suggests a way to efficiently find approximate solutions: find a transformation that makes the cost function as similar as possible to that of the solvable class. After keeping many high-ranking solutions using the approximate cost function, one may then re-assess these solutions with the full cost function to find the best approximate solution. Under this approach, it is important to be able to assess the quality of the solutions obtained, e.g., by finding the true ranking of the kth best approximate solution when all possible solutions are considered exhaustively. To tackle this statistical issue, we provide a systematic method starting with a scaling function generated from the finite number of high-ranking solutions followed by a convergent iterative mapping. This method, useful in a variant of the directed paths in random media problem proposed here, can also provide a statistical significance assessment for one of the most important proteomic tasks: peptide sequencing using tandem mass spectrometry data. For directed paths in random media, the scaling function depends on the particular realization of randomness; in the mass spectrometry case, the scaling function is spectrum-specific.
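For the solvable class mentioned above, the transfer-matrix/dynamic-programming idea can be sketched for a directed path on a lattice with site energies, as below. Extending the sweep to keep the k best partial sums at each site (rather than only the minimum) would yield the lists of high-ranking solutions used in the statistical assessment; the grid data here are randomly generated for illustration.

```python
import random

def min_energy_path(grid):
    """Transfer-matrix / dynamic-programming sweep for a directed path that
    starts at the top-left site and moves only right or down, minimizing the
    sum of site energies."""
    H, W = len(grid), len(grid[0])
    E = [[0.0] * W for _ in range(H)]
    for i in range(H):
        for j in range(W):
            if i == 0 and j == 0:
                prev = 0.0
            elif i == 0:
                prev = E[i][j - 1]
            elif j == 0:
                prev = E[i - 1][j]
            else:
                prev = min(E[i - 1][j], E[i][j - 1])
            E[i][j] = grid[i][j] + prev
    return E[-1][-1]

random.seed(0)
medium = [[random.random() for _ in range(6)] for _ in range(6)]  # random site energies
print(min_energy_path(medium))
```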
Feng, Qiang; Chen, Yiran; Sun, Bo; Li, Songjie
2014-01-01
An optimization method for condition based maintenance (CBM) of an aircraft fleet considering prognostics uncertainty is proposed. The CBM and dispatch process of the aircraft fleet is analyzed first, and the alternative strategy sets for a single aircraft are given. Then, the optimization problem of fleet CBM with lower maintenance cost and dispatch risk is translated into the combinatorial optimization problem of single aircraft strategies. The remaining useful life (RUL) distribution of the key line replaceable module (LRM) is transformed into the failure probability of the aircraft, and the fleet health status matrix is established. The calculation method for the costs and risks of a mission, based on the health status matrix and the maintenance matrix, is then given. Further, an optimization method for fleet dispatch and CBM under acceptable risk is proposed based on an improved genetic algorithm. Finally, a fleet of 10 aircraft is studied to verify the proposed method. The results show that the method can realize optimization and control of the aircraft fleet oriented to mission success.
Jiménez-Moreno, Ester; Montalvillo-Jiménez, Laura; Santana, Andrés G; Gómez, Ana M; Jiménez-Osés, Gonzalo; Corzana, Francisco; Bastida, Agatha; Jiménez-Barbero, Jesús; Cañada, Francisco Javier; Gómez-Pinto, Irene; González, Carlos; Asensio, Juan Luis
2016-05-25
Development of strong and selective binders from promiscuous lead compounds represents one of the most expensive and time-consuming tasks in drug discovery. We herein present a novel fragment-based combinatorial strategy for the optimization of multivalent polyamine scaffolds as DNA/RNA ligands. Our protocol provides a quick access to a large variety of regioisomer libraries that can be tested for selective recognition by combining microdialysis assays with simple isotope labeling and NMR experiments. To illustrate our approach, 20 small libraries comprising 100 novel kanamycin-B derivatives have been prepared and evaluated for selective binding to the ribosomal decoding A-Site sequence. Contrary to the common view of NMR as a low-throughput technique, we demonstrate that our NMR methodology represents a valuable alternative for the detection and quantification of complex mixtures, even integrated by highly similar or structurally related derivatives, a common situation in the context of a lead optimization process. Furthermore, this study provides valuable clues about the structural requirements for selective A-site recognition.
Design of focused and restrained subsets from extremely large virtual libraries.
Jamois, Eric A; Lin, Chien T; Waldman, Marvin
2003-11-01
With the current and ever-growing offering of reagents along with the vast palette of organic reactions, virtual libraries accessible to combinatorial chemists can reach sizes of billions of compounds or more. Extracting practical-size subsets for experimentation has remained an essential step in the design of combinatorial libraries. A typical approach to computational library design involves enumeration of structures and properties for the entire virtual library, which may be impractical for such large libraries. This study describes a new approach termed on-the-fly optimization (OTFO), where descriptors are computed as needed within the subset optimization cycle and without intermediate enumeration of structures. Results reported herein highlight the advantages of coupling an ultra-fast descriptor calculation engine to subset optimization capabilities. We also show that enumeration of properties for the entire virtual library may not only be impractical but also wasteful. Successful design of focused and restrained subsets can be achieved while sampling only a small fraction of the virtual library. We also investigate the stability of the method and compare results obtained from simulated annealing (SA) and genetic algorithms (GA).
Tabu Search enhances network robustness under targeted attacks
NASA Astrophysics Data System (ADS)
Sun, Shi-wen; Ma, Yi-lin; Li, Rui-qi; Wang, Li; Xia, Cheng-yi
2016-03-01
We focus on the optimization of network robustness with respect to intentional attacks on high-degree nodes. Given an existing network, this problem can be considered as a typical single-objective combinatorial optimization problem. Based on the heuristic Tabu Search optimization algorithm, a link-rewiring method is applied to reconstruct the network while keeping the degree of every node unchanged. Through numerical simulations, BA scale-free network and two real-world networks are investigated to verify the effectiveness of the proposed optimization method. Meanwhile, we analyze how the optimization affects other topological properties of the networks, including natural connectivity, clustering coefficient and degree-degree correlation. The current results can help to improve the robustness of existing complex real-world systems, as well as to provide some insights into the design of robust networks.
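A much-simplified, single-neighbor version of the degree-preserving search could look like the sketch below. The `robustness` function is a user-supplied placeholder for the attack-resistance measure, and the tabu and acceptance rules here are illustrative rather than the exact scheme of the paper.

```python
import random
from collections import deque

def tabu_rewire(edges, robustness, iters=2000, tabu_len=100):
    """Degree-preserving rewiring guided by a simplified tabu scheme:
    each step proposes one double-edge swap, recently removed links are
    tabu so the walk cannot immediately undo itself, and the best
    configuration seen is recorded. `edges` is a set of frozenset({u, v})."""
    current = set(edges)
    best, best_r = set(current), robustness(current)
    tabu = deque(maxlen=tabu_len)                     # recently removed edges
    for _ in range(iters):
        e1, e2 = random.sample(list(current), 2)
        (a, b), (c, d) = tuple(e1), tuple(e2)
        if a in (c, d) or b in (c, d):                # would create a self-loop
            continue
        new1, new2 = frozenset({a, d}), frozenset({c, b})
        if new1 in current or new2 in current:        # would create a multi-edge
            continue
        if new1 in tabu or new2 in tabu:              # re-adding a tabu link
            continue
        current = (current - {e1, e2}) | {new1, new2} # node degrees unchanged
        tabu.extend([e1, e2])
        r = robustness(current)
        if r > best_r:
            best, best_r = set(current), r
    return best, best_r
```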
NASA Astrophysics Data System (ADS)
Chandra, Rishabh
Partial differential equation-constrained combinatorial optimization (PDECCO) problems are a mixture of continuous and discrete optimization problems. PDECCO problems have discrete controls, but since the partial differential equations (PDE) are continuous, the optimization space is continuous as well. Such problems have several applications, such as gas/water network optimization, traffic optimization, micro-chip cooling optimization, etc. Currently, no efficient classical algorithm which guarantees a global minimum for PDECCO problems exists. A new mapping has been developed that transforms PDECCO problem, which only have linear PDEs as constraints, into quadratic unconstrained binary optimization (QUBO) problems that can be solved using an adiabatic quantum optimizer (AQO). The mapping is efficient, it scales polynomially with the size of the PDECCO problem, requires only one PDE solve to form the QUBO problem, and if the QUBO problem is solved correctly and efficiently on an AQO, guarantees a global optimal solution for the original PDECCO problem.
Zhang, H H; Gao, S; Chen, W; Shi, L; D'Souza, W D; Meyer, R R
2013-03-21
An important element of radiation treatment planning for cancer therapy is the selection of beam angles (out of all possible coplanar and non-coplanar angles in relation to the patient) in order to maximize the delivery of radiation to the tumor site and minimize radiation damage to nearby organs-at-risk. This category of combinatorial optimization problem is particularly difficult because direct evaluation of the quality of treatment corresponding to any proposed selection of beams requires the solution of a large-scale dose optimization problem involving many thousands of variables that represent doses delivered to volume elements (voxels) in the patient. However, if the quality of angle sets can be accurately estimated without expensive computation, a large number of angle sets can be considered, increasing the likelihood of identifying a very high quality set. Using a computationally efficient surrogate beam set evaluation procedure based on single-beam data extracted from plans employing equally-spaced beams (eplans), we have developed a global search metaheuristic process based on the nested partitions framework for this combinatorial optimization problem. The surrogate scoring mechanism allows us to assess thousands of beam set samples within a clinically acceptable time frame. Tests on difficult clinical cases demonstrate that the beam sets obtained via our method are of superior quality.
System and method for bullet tracking and shooter localization
Roberts, Randy S [Livermore, CA; Breitfeller, Eric F [Dublin, CA
2011-06-21
A system and method of processing infrared imagery to determine projectile trajectories and the locations of shooters with a high degree of accuracy. The method includes image processing infrared image data to reduce noise and identify streak-shaped image features, using a Kalman filter to estimate optimal projectile trajectories, updating the Kalman filter with new image data, determining projectile source locations by solving a combinatorial least-squares solution for all optimal projectile trajectories, and displaying all of the projectile source locations. Such a shooter-localization system is of great interest for military and law enforcement applications to determine sniper locations, especially in urban combat scenarios.
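A single predict/update cycle of a linear Kalman filter, of the kind used to refine streak-derived trajectory estimates, can be written with NumPy as below. The state, measurement and noise models are generic placeholders assumed for illustration, not the patented system's actual projectile models.

```python
import numpy as np

def kalman_step(x, P, z, F, H, Q, R):
    """One predict/update cycle of a linear Kalman filter."""
    x_pred = F @ x                       # predict state
    P_pred = F @ P @ F.T + Q             # predict covariance
    y = z - H @ x_pred                   # innovation
    S = H @ P_pred @ H.T + R             # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)  # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

# illustrative 2D constant-velocity model: state = [px, py, vx, vy]
dt = 1.0
F = np.array([[1, 0, dt, 0], [0, 1, 0, dt], [0, 0, 1, 0], [0, 0, 0, 1]], float)
H = np.array([[1, 0, 0, 0], [0, 1, 0, 0]], float)   # only position is observed
Q, R = 0.01 * np.eye(4), 0.5 * np.eye(2)
x, P = np.zeros(4), np.eye(4)
x, P = kalman_step(x, P, np.array([1.0, 2.0]), F, H, Q, R)
```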
Optimizing an Actuator Array for the Control of Multi-Frequency Noise in Aircraft Interiors
NASA Technical Reports Server (NTRS)
Palumbo, D. L.; Padula, S. L.
1997-01-01
Techniques developed for selecting an optimized actuator array for interior noise reduction at a single frequency are extended to the multi-frequency case. Transfer functions for 64 actuators were obtained at 5 frequencies from ground testing the rear section of a fully trimmed DC-9 fuselage. A single loudspeaker facing the left side of the aircraft was the primary source. A combinatorial search procedure (tabu search) was employed to find optimum actuator subsets of from 2 to 16 actuators. Noise reduction predictions derived from the transfer functions were used as a basis for evaluating actuator subsets during optimization. Results indicate that it is necessary to constrain actuator forces during optimization. Unconstrained optimizations selected actuators which require unrealistically large forces. Two methods of constraint are evaluated. It is shown that a fast, but approximate, method yields results equivalent to an accurate, but computationally expensive, method.
Ito, Yoichiro; Yamanishi, Mamoru; Ikeuchi, Akinori; Imamura, Chie; Matsuyama, Takashi
2015-01-01
Combinatorial screening used together with a broad library of gene expression cassettes is expected to produce a powerful tool for the optimization of the simultaneous expression of multiple enzymes. Recently, we proposed a highly tunable protein expression system that utilized multiple genome-integrated target genes to fine-tune enzyme expression in yeast cells. This tunable system included a library of expression cassettes each composed of three gene-expression control elements that in different combinations produced a wide range of protein expression levels. In this study, four gene expression cassettes with graded protein expression levels were applied to the expression of three cellulases: cellobiohydrolase 1, cellobiohydrolase 2, and endoglucanase 2. After combinatorial screening for transgenic yeasts simultaneously secreting these three cellulases, we obtained strains with higher cellulase expressions than a strain harboring three cellulase-expression constructs within one high-performance gene expression cassette. These results show that our method will be of broad use throughout the field of metabolic engineering. PMID:26692026
Feature selection methods for big data bioinformatics: A survey from the search perspective.
Wang, Lipo; Wang, Yaoli; Chang, Qing
2016-12-01
This paper surveys main principles of feature selection and their recent applications in big data bioinformatics. Instead of the commonly used categorization into filter, wrapper, and embedded approaches to feature selection, we formulate feature selection as a combinatorial optimization or search problem and categorize feature selection methods into exhaustive search, heuristic search, and hybrid methods, where heuristic search methods may further be categorized into those with or without data-distilled feature ranking measures. Copyright © 2016 Elsevier Inc. All rights reserved.
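As a sketch of the heuristic-search category, a greedy forward (wrapper-style) selection loop is shown below; `score` is a user-supplied evaluator, e.g. cross-validated accuracy of a classifier trained on the candidate subset, and the toy scorer here is purely illustrative.

```python
def greedy_forward_selection(features, score, k):
    """Wrapper-style heuristic search: greedily add the feature that most
    improves `score(subset)` until k features are chosen or no candidate
    improves the current subset."""
    selected, remaining = [], list(features)
    while remaining and len(selected) < k:
        best_f = max(remaining, key=lambda f: score(selected + [f]))
        if selected and score(selected + [best_f]) <= score(selected):
            break                                    # no further improvement
        selected.append(best_f)
        remaining.remove(best_f)
    return selected

# hypothetical scorer: rewards low-index features, penalizes subset size
toy_score = lambda subset: sum(1.0 / (1 + f) for f in subset) - 0.1 * len(subset)
print(greedy_forward_selection(range(10), toy_score, k=3))
```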
δ-Similar Elimination to Enhance Search Performance of Multiobjective Evolutionary Algorithms
NASA Astrophysics Data System (ADS)
Aguirre, Hernán; Sato, Masahiko; Tanaka, Kiyoshi
In this paper, we propose δ-similar elimination to improve the search performance of multiobjective evolutionary algorithms in combinatorial optimization problems. This method eliminates similar individuals in objective space to fairly distribute selection among the different regions of the instantaneous Pareto front. We investigate four eliminating methods analyzing their effects using NSGA-II. In addition, we compare the search performance of NSGA-II enhanced by our method and NSGA-II enhanced by controlled elitism.
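One simple interpretation of δ-similar elimination (the paper investigates four eliminating variants, whose exact rules are not reproduced here) is to keep at most one individual per δ-neighbourhood in objective space:

```python
def delta_similar_elimination(population, objectives, delta):
    """Keep at most one individual per delta-neighbourhood in objective
    space (Chebyshev distance); later, similar individuals are discarded."""
    kept, kept_objs = [], []
    for ind, f in zip(population, objectives):
        similar = any(max(abs(a - b) for a, b in zip(f, g)) < delta
                      for g in kept_objs)
        if not similar:
            kept.append(ind)
            kept_objs.append(f)
    return kept

# hypothetical bi-objective population
objs = [(1.0, 2.0), (1.02, 2.01), (3.0, 0.5)]
print(delta_similar_elimination(["a", "b", "c"], objs, delta=0.05))  # ['a', 'c']
```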
Two is better than one; toward a rational design of combinatorial therapy.
Chen, Sheng-Hong; Lahav, Galit
2016-12-01
Drug combination is an appealing strategy for combating the heterogeneity of tumors and evolution of drug resistance. However, the rationale underlying combinatorial therapy is often not well established due to lack of understandings of the specific pathways responding to the drugs, and their temporal dynamics following each treatment. Here we present several emerging trends in harnessing properties of biological systems for the optimal design of drug combinations, including the type of drugs, specific concentration, sequence of addition and the temporal schedule of treatments. We highlight recent studies showing different approaches for efficient design of drug combinations including single-cell signaling dynamics, adaption and pathway crosstalk. Finally, we discuss novel and feasible approaches that can facilitate the optimal design of combinatorial therapy. Copyright © 2016 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Martin, Brian
Combinatorial approaches have proven useful for rapid alloy fabrication and optimization. A new method of producing controlled isothermal gradients using the Gleeble thermomechanical simulator has been developed and demonstrated on the metastable beta-Ti alloy beta-21S, achieving a thermal gradient of 525-700 °C. This thermal gradient method has subsequently been coupled with existing combinatorial methods of producing composition gradients using the LENS(TM) additive manufacturing system, through the use of elemental blended powders. This has been demonstrated with a binary Ti-(0-15) wt% Cr build, which has subsequently been characterized with optical and electron microscopy, with special attention to the precipitation of TiCr2 Laves phases. The TiCr2 phase has been explored for its high-temperature mechanical properties in a new oxidation-resistant beta-Ti alloy, which serves as a demonstration of the new bicombinatorial methods applied to a multicomponent alloy system.
NASA Technical Reports Server (NTRS)
Rash, James L.
2010-01-01
NASA's space data-communications infrastructure, the Space Network and the Ground Network, provide scheduled (as well as some limited types of unscheduled) data-communications services to user spacecraft via orbiting relay satellites and ground stations. An implementation of the methods and algorithms disclosed herein will be a system that produces globally optimized schedules with not only optimized service delivery by the space data-communications infrastructure but also optimized satisfaction of all user requirements and prescribed constraints, including radio frequency interference (RFI) constraints. Evolutionary search, a class of probabilistic strategies for searching large solution spaces, constitutes the essential technology in this disclosure. Also disclosed are methods and algorithms for optimizing the execution efficiency of the schedule-generation algorithm itself. The scheduling methods and algorithms as presented are adaptable to accommodate the complexity of scheduling the civilian and/or military data-communications infrastructure. Finally, the problem itself, and the methods and algorithms, are generalized and specified formally, with applicability to a very broad class of combinatorial optimization problems.
Combinatorial Optimization of Heterogeneous Catalysts Used in the Growth of Carbon Nanotubes
NASA Technical Reports Server (NTRS)
Cassell, Alan M.; Verma, Sunita; Delzeit, Lance; Meyyappan, M.; Han, Jie
2000-01-01
Libraries of liquid-phase catalyst precursor solutions were printed onto iridium-coated silicon substrates and evaluated for their effectiveness in catalyzing the growth of multi-walled carbon nanotubes (MWNTs) by chemical vapor deposition (CVD). The catalyst precursor solutions were composed of inorganic salts and a removable tri-block copolymer (EO)20(PO)70(EO)20 (EO = ethylene oxide, PO = propylene oxide) structure-directing agent (SDA), dissolved in ethanol/methanol mixtures. Sample libraries were quickly assayed using scanning electron microscopy after CVD growth to identify active catalysts and CVD conditions. Composition libraries and focus libraries were then constructed around the active spots identified in the discovery libraries to understand how catalyst precursor composition affects the yield, density, and quality of the nanotubes. Successful implementation of combinatorial optimization methods in the development of highly active, carbon nanotube catalysts is demonstrated, as well as the identification of catalyst formulations that lead to varying densities and shapes of aligned nanotube towers.
A New Approach for Proving or Generating Combinatorial Identities
ERIC Educational Resources Information Center
Gonzalez, Luis
2010-01-01
A new method for proving, in an immediate way, many combinatorial identities is presented. The method is based on a simple recursive combinatorial formula involving n + 1 arbitrary real parameters. Moreover, this formula enables one not only to prove, but also generate many different combinatorial identities (not being required to know them "a…
NASA Astrophysics Data System (ADS)
Vallet, B.; Soheilian, B.; Brédif, M.
2014-08-01
The 3D reconstruction of similar 3D objects detected in 2D faces a major issue when it comes to grouping the 2D detections into clusters to be used to reconstruct the individual 3D objects. Simple clustering heuristics fail as soon as similar objects are close. This paper formulates a framework to use the geometric quality of the reconstruction as a hint to do a proper clustering. We present a methodology to solve the resulting combinatorial optimization problem with some simplifications and approximations in order to make it tractable. The proposed method is applied to the reconstruction of 3D traffic signs from their 2D detections to demonstrate its capacity to solve ambiguities.
Zhou, Yikang; Li, Gang; Dong, Junkai; Xing, Xin-Hui; Dai, Junbiao; Zhang, Chong
2018-05-01
With the rapidly growing ability to construct combinatorial metabolic pathways, searching for the metabolic sweet spot has become the rate-limiting step. Here we report an efficient machine-learning workflow in conjunction with the YeastFab Assembly strategy (MiYA) for combinatorially optimizing the large biosynthetic genotypic space of heterologous metabolic pathways in Saccharomyces cerevisiae. Using the β-carotene biosynthetic pathway as an example, we first demonstrated that MiYA has the power to search only a small fraction (2-5%) of the combinatorial space to precisely tune the expression level of each gene, using a machine-learning algorithm based on an artificial neural network (ANN) ensemble to avoid over-fitting when dealing with a small number of training samples. We then applied MiYA to improve the biosynthesis of violacein. Fed with initial data from a colorimetric plate-based, pre-screened pool of 24 violacein-producing strains, MiYA successfully predicted, and verified experimentally, the existence of a strain that showed a 2.42-fold titer improvement in violacein production among 3125 possible designs. Furthermore, MiYA was able to largely avoid the branch pathway of violacein biosynthesis that makes deoxyviolacein, and produces very pure violacein. Together, MiYA combines the advantages of standardized building blocks and machine learning to accelerate the Design-Build-Test-Learn (DBTL) cycle for combinatorial optimization of metabolic pathways, which could significantly accelerate the development of microbial cell factories. Copyright © 2018 International Metabolic Engineering Society. Published by Elsevier Inc. All rights reserved.
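The following sketch conveys the general idea of such a workflow, not MiYA itself: designs are encoded by their graded cassette levels, a small ANN ensemble is trained on an assayed subset, and averaged predictions over the full combinatorial space rank the untested designs. All data here are synthetic and the model settings are illustrative.

```python
import itertools
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
levels, n_genes = 4, 5                      # 4 graded cassettes per gene, 5 pathway genes
space = np.array(list(itertools.product(range(levels), repeat=n_genes)), dtype=float)

# Pretend we assayed a small random fraction of the design space (titers are synthetic here).
idx = rng.choice(len(space), size=60, replace=False)
titer = np.sin(space[idx] @ rng.normal(size=n_genes)) + 0.05 * rng.normal(size=60)

# Ensemble of small neural nets; averaging reduces over-fitting on few training samples.
ensemble = [MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=s).fit(space[idx], titer)
            for s in range(10)]
pred = np.mean([m.predict(space) for m in ensemble], axis=0)

untested = np.setdiff1d(np.arange(len(space)), idx)
best = untested[np.argsort(pred[untested])[::-1][:5]]
print("next designs to build:", space[best].astype(int).tolist())
```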
Combinatorial chemical bath deposition of CdS contacts for chalcogenide photovoltaics
Mokurala, Krishnaiah; Baranowski, Lauryn L.; de Souza Lucas, Francisco W.; ...
2016-08-01
Contact layers play an important role in thin film solar cells, but new material development and optimization of their thickness is usually a long and tedious process. A high-throughput experimental approach has been used to accelerate the rate of research in photovoltaic (PV) light absorbers and transparent conductive electrodes; however, combinatorial research on contact layers is less common. Here, we report on the chemical bath deposition (CBD) of CdS thin films by a combinatorial dip coating technique and apply these contact layers to Cu(In,Ga)Se2 (CIGSe) and Cu2ZnSnSe4 (CZTSe) light absorbers in PV devices. Combinatorial thickness steps of CdS thin films were achieved by removal of the substrate from the chemical bath at regular intervals of time and in equal distance increments. The trends in the photoconversion efficiency and in the spectral response of the PV devices as a function of thickness of the CdS contacts were explained with the help of optical and morphological characterization of the CdS thin films. The maximum PV efficiency achieved for the combinatorial dip-coating CBD was similar to that for the PV devices processed using conventional CBD. Finally, the results of this study lead to the conclusion that combinatorial dip-coating can be used to accelerate the optimization of PV device performance of CdS and other candidate contact layers for a wide range of emerging absorbers.
Identification of combinatorial drug regimens for treatment of Huntington's disease using Drosophila
NASA Astrophysics Data System (ADS)
Agrawal, Namita; Pallos, Judit; Slepko, Natalia; Apostol, Barbara L.; Bodai, Laszlo; Chang, Ling-Wen; Chiang, Ann-Shyn; Michels Thompson, Leslie; Marsh, J. Lawrence
2005-03-01
We explore the hypothesis that pathology of Huntington's disease involves multiple cellular mechanisms whose contributions to disease are incrementally additive or synergistic. We provide evidence that the photoreceptor neuron degeneration seen in flies expressing mutant human huntingtin correlates with widespread degenerative events in the Drosophila CNS. We use a Drosophila Huntington's disease model to establish dose regimens and protocols to assess the effectiveness of drug combinations used at low threshold concentrations. These proof of principle studies identify at least two potential combinatorial treatment options and illustrate a rapid and cost-effective paradigm for testing and optimizing combinatorial drug therapies while reducing side effects for patients with neurodegenerative disease. The potential for using prescreening in Drosophila to inform combinatorial therapies that are most likely to be effective for testing in mammals is discussed.
It looks easy! Heuristics for combinatorial optimization problems.
Chronicle, Edward P; MacGregor, James N; Ormerod, Thomas C; Burr, Alistair
2006-04-01
Human performance on instances of computationally intractable optimization problems, such as the travelling salesperson problem (TSP), can be excellent. We have proposed a boundary-following heuristic to account for this finding. We report three experiments with TSPs where the capacity to employ this heuristic was varied. In Experiment 1, participants free to use the heuristic produced solutions significantly closer to optimal than did those prevented from doing so. Experiments 2 and 3 together replicated this finding in larger problems and demonstrated that a potential confound had no effect. In all three experiments, performance was closely matched by a boundary-following model. The results implicate global rather than purely local processes. Humans may have access to simple, perceptually based, heuristics that are suited to some combinatorial optimization tasks.
ERIC Educational Resources Information Center
Kittredge, Kevin W.; Marine, Susan S.; Taylor, Richard T.
2004-01-01
A molecule possessing other functional groups that could be hydrogenated is examined, where a variety of metal catalysts are evaluated under similar reaction conditions. Optimizing organic reactions is both time and labor intensive, and the use of a combinatorial parallel synthesis reactor proved to be a great time-saving device.
Control Coordination of Multiple Agents Through Decision Theoretic and Economic Methods
2003-02-01
…investigated the design of test data for benchmarking such optimization algorithms. Our other research on combinatorial auctions included I…average combination rule. We exemplified these theoretical results with experiments on stock market data, demonstrating how ensembles of classifiers can …
Teixidó, Meritxell; Belda, Ignasi; Zurita, Esther; Llorà, Xavier; Fabre, Myriam; Vilaró, Senén; Albericio, Fernando; Giralt, Ernest
2005-12-01
The use of high-throughput methods in drug discovery allows the generation and testing of a large number of compounds, but at the price of providing redundant information. Evolutionary combinatorial chemistry combines the selection and synthesis of biologically active compounds with artificial intelligence optimization methods, such as genetic algorithms (GA). Drug candidates for the treatment of central nervous system (CNS) disorders must overcome the blood-brain barrier (BBB). This paper reports a new genetic algorithm that searches for the optimal physicochemical properties for peptide transport across the blood-brain barrier. A first generation of peptides has been generated and synthesized. Owing to the high content of N-methyl amino acids present in most of these peptides, their syntheses were especially challenging because of over-incorporations, deletions and DKP formation. Distinct fragmentation patterns during peptide cleavage have been identified. The first generation of peptides has been studied by evaluation techniques such as immobilized artificial membrane chromatography (IAMC), a cell-based assay, log P(octanol/water) calculations, etc. Finally, a second generation has been proposed. (c) 2005 European Peptide Society and John Wiley & Sons, Ltd.
NASA Astrophysics Data System (ADS)
Vatutin, Eduard
2017-12-01
The article analyzes the effectiveness of heuristic methods based on limited depth-first search for obtaining solutions to the test problem of finding the shortest path in a graph. It briefly describes the group of methods based on limiting the number of branches of the combinatorial search tree and limiting the depth of the analyzed subtree. The methodology for comparing experimental data to estimate solution quality is based on computational experiments, run on the BOINC platform, with samples of pseudo-random graphs of selected vertex and arc counts. The experimental results identify the areas of preferable usage of the selected subset of heuristic methods, depending on the size of the problem and the strength of the constraints. It is shown that the considered pair of methods is ineffective on the selected problem and yields solutions significantly inferior in quality to those provided by the ant colony optimization method and its modification with combinatorial returns.
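A rough sketch of the kind of heuristic the article evaluates is given below: a depth-first search that expands only a limited number of the cheapest branches at each node and never descends past a fixed depth. The graph, the limits and the tie-breaking rules are illustrative assumptions.

```python
import random
random.seed(0)

def limited_dfs(graph, start, goal, max_depth, max_branches):
    """Depth-first search that expands at most `max_branches` of the cheapest
    outgoing arcs at each node and never descends past `max_depth` arcs.
    Returns the best (cost, path) found, which in general is only approximate."""
    best = [float("inf"), None]

    def visit(node, path, cost, depth):
        if cost >= best[0] or depth > max_depth:
            return
        if node == goal:
            best[:] = [cost, path]
            return
        for nxt, w in sorted(graph.get(node, {}).items(), key=lambda kv: kv[1])[:max_branches]:
            if nxt not in path:
                visit(nxt, path + [nxt], cost + w, depth + 1)

    visit(start, [start], 0, 0)
    return best

# A small grid with random arc weights stands in for the article's pseudo-random test graphs.
n = 4
graph = {(i, j): {nb: random.randint(1, 9)
                  for nb in [(i + 1, j), (i, j + 1)] if nb[0] < n and nb[1] < n}
         for i in range(n) for j in range(n)}
for b in (1, 2):   # b = 1 is pure greedy; b = 2 explores the full monotone-path tree here
    print("max_branches =", b, "->", limited_dfs(graph, (0, 0), (n - 1, n - 1), max_depth=8, max_branches=b))
```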
Lim, Wee Loon; Wibowo, Antoni; Desa, Mohammad Ishak; Haron, Habibollah
2016-01-01
The quadratic assignment problem (QAP) is an NP-hard combinatorial optimization problem with a wide variety of applications. Biogeography-based optimization (BBO), a relatively new optimization technique based on the biogeography concept, uses the idea of the migration strategy of species to derive algorithms for solving optimization problems. It has been shown that BBO provides performance on a par with other optimization methods. A classical BBO algorithm employs the mutation operator as its diversification strategy. However, this process will often ruin the quality of solutions in QAP. In this paper, we propose a hybrid technique to overcome the weakness of the classical BBO algorithm for QAP, by replacing the mutation operator with a tabu search procedure. Our experiments using the benchmark instances from QAPLIB show that the proposed hybrid method is able to find good solutions for them within reasonable computational times. Out of 61 benchmark instances tested, the proposed method is able to obtain the best known solutions for 57 of them. PMID:26819585
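The tabu-search ingredient of such a hybrid can be sketched as follows on a small random QAP instance; the biogeography-based migration machinery and QAPLIB benchmarking are omitted, and all parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 12
F = rng.integers(0, 10, (n, n))          # flow matrix
D = rng.integers(1, 10, (n, n))          # distance matrix

def qap_cost(perm):
    """sum over i, j of F[i, j] * D[perm[i], perm[j]]"""
    return int((F * D[np.ix_(perm, perm)]).sum())

def tabu_search(perm, iters=200, tenure=8):
    perm = list(perm)
    best, best_cost = perm[:], qap_cost(perm)
    tabu = {}
    for it in range(iters):
        move, move_cost = None, None
        for i in range(n):
            for j in range(i + 1, n):
                cand = perm[:]
                cand[i], cand[j] = cand[j], cand[i]
                c = qap_cost(cand)
                allowed = tabu.get((i, j), -1) < it or c < best_cost   # aspiration criterion
                if allowed and (move_cost is None or c < move_cost):
                    move, move_cost = (i, j), c
        i, j = move
        perm[i], perm[j] = perm[j], perm[i]
        tabu[(i, j)] = it + tenure           # forbid re-swapping this pair for a while
        if move_cost < best_cost:
            best, best_cost = perm[:], move_cost
    return best, best_cost

start = list(rng.permutation(n))
print("start cost:", qap_cost(start), "-> after tabu search:", tabu_search(start)[1])
```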
Charleston, M A
1995-01-01
This article introduces a coherent language base for describing and working with characteristics of combinatorial optimization problems, which is at once general enough to be used in all such problems and precise enough to allow subtle concepts in this field to be discussed unambiguously. An example is provided of how this nomenclature is applied to an instance of the phylogeny problem. Also noted is the beneficial effect, on the landscape of the solution space, of transforming the observed data to account for multiple changes of character state.
Synthesizing optimal waste blends
DOE Office of Scientific and Technical Information (OSTI.GOV)
Narayan, V.; Diwekar, W.M.; Hoza, M.
Vitrification of tank wastes to form glass is a technique that will be used for the disposal of high-level waste at Hanford. Process and storage economics show that minimizing the total number of glass logs produced is the key to keeping cost as low as possible. The amount of glass produced can be reduced by blending the wastes. The optimal way to combine the tanks to minimize the volume of glass can be determined from a discrete blend calculation. However, this problem results in a combinatorial explosion as the number of tanks increases. Moreover, the property constraints make this problem highly nonconvex, and many algorithms get trapped in local minima. In this paper the authors examine the use of different combinatorial optimization approaches to solve this problem. A two-stage approach using a combination of simulated annealing and nonlinear programming (NLP) is developed. The results of different methods, such as the heuristics approach based on human knowledge and judgment, the mixed integer nonlinear programming (MINLP) approach with GAMS, and branch and bound with a lower bound derived from the structure of the given blending problem, are compared with this coupled simulated annealing and NLP approach.
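A toy version of the simulated-annealing stage is sketched below; the cost function is a synthetic stand-in for the glass-property model, and the blend-assignment move, cooling schedule and thresholds are illustrative assumptions rather than the authors' formulation.

```python
import math
import random

random.seed(0)
n_tanks, n_blends = 20, 4
# Synthetic per-tank composition vectors (fractions of a few limiting components).
waste = [[random.random() for _ in range(3)] for _ in range(n_tanks)]

def glass_volume(assignment):
    """Toy surrogate for the glass-property model: each blend needs enough frit to
    dilute its worst limiting component below a fixed threshold."""
    total = 0.0
    for b in range(n_blends):
        comps = [waste[t] for t in range(n_tanks) if assignment[t] == b]
        if not comps:
            continue
        worst = max(sum(c[k] for c in comps) for k in range(3))
        total += max(len(comps), worst / 0.25)   # frit addition driven by the binding constraint
    return total

def anneal(steps=20000, t0=2.0, alpha=0.9995):
    x = [random.randrange(n_blends) for _ in range(n_tanks)]
    cost, temp = glass_volume(x), t0
    best, best_cost = x[:], cost
    for _ in range(steps):
        t = random.randrange(n_tanks)
        old, x[t] = x[t], random.randrange(n_blends)   # move: reassign one tank to another blend
        new_cost = glass_volume(x)
        if new_cost <= cost or random.random() < math.exp((cost - new_cost) / temp):
            cost = new_cost
            if cost < best_cost:
                best, best_cost = x[:], cost
        else:
            x[t] = old                                  # reject the move
        temp *= alpha
    return best, best_cost

print("best surrogate glass volume:", round(anneal()[1], 2))
```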
Concept of combinatorial de novo design of drug-like molecules by particle swarm optimization.
Hartenfeller, Markus; Proschak, Ewgenij; Schüller, Andreas; Schneider, Gisbert
2008-07-01
We present a fast stochastic optimization algorithm for fragment-based molecular de novo design (COLIBREE, Combinatorial Library Breeding). The search strategy is based on a discrete version of particle swarm optimization. Molecules are represented by a scaffold, which remains constant during optimization, and variable linkers and side chains. Different linkers represent virtual chemical reactions. Side-chain building blocks were obtained from pseudo-retrosynthetic dissection of large compound databases. Here, ligand-based design was performed using chemically advanced template search (CATS) topological pharmacophore similarity to reference ligands as fitness function. A weighting scheme was included for particle swarm optimization-based molecular design, which permits the use of many reference ligands and allows for positive and negative design to be performed simultaneously. In a case study, the approach was applied to the de novo design of potential peroxisome proliferator-activated receptor subtype-selective agonists. The results demonstrate the ability of the technique to cope with large combinatorial chemistry spaces and its applicability to focused library design. The technique was able to perform exploitation of a known scheme and at the same time explorative search for novel ligands within the framework of a given molecular core structure. It thereby represents a practical solution for compound screening in the early hit and lead finding phase of a drug discovery project.
Concepts and applications of "natural computing" techniques in de novo drug and peptide design.
Hiss, Jan A; Hartenfeller, Markus; Schneider, Gisbert
2010-05-01
Evolutionary algorithms, particle swarm optimization, and ant colony optimization have emerged as robust optimization methods for molecular modeling and peptide design. Such algorithms mimic combinatorial molecule assembly by using molecular fragments as building-blocks for compound construction, and relying on adaptation and emergence of desired pharmacological properties in a population of virtual molecules. Nature-inspired algorithms might be particularly suited for bioisosteric replacement or scaffold-hopping from complex natural products to synthetically more easily accessible compounds that are amenable to optimization by medicinal chemistry. The theory and applications of selected nature-inspired algorithms for drug design are reviewed, together with practical applications and a discussion of their advantages and limitations.
A Combinatorial Platform for the Optimization of Peptidomimetic Methyl-Lysine Reader Antagonists
NASA Astrophysics Data System (ADS)
Barnash, Kimberly D.
Post-translational modification of histone N-terminal tails mediates chromatin compaction and, consequently, DNA replication, transcription, and repair. While numerous post-translational modifications decorate histone tails, lysine methylation is an abundant mark important for both gene activation and repression. Methyl-lysine (Kme) readers function through binding mono-, di-, or trimethyl-lysine. Chemical intervention of Kme readers faces numerous challenges due to the broad surface-groove interactions between readers and their cognate histone peptides; yet, the increasing interest in understanding chromatin-modifying complexes suggests tractable lead compounds for Kme readers are critical for elucidating the mechanisms of chromatin dysregulation in disease states and validating the druggability of these domains and complexes. The successful discovery of a peptide-derived chemical probe, UNC3866, for the Polycomb repressive complex 1 (PRC1) chromodomain Kme readers has proven the potential for selective peptidomimetic inhibition of reader function. Unfortunately, the systematic modification of peptides-to-peptidomimetics is a costly and inefficient strategy for target-class hit discovery against Kme readers. Through the exploration of biased chemical space via combinatorial on-bead libraries, we have developed two concurrent methodologies for Kme reader chemical probe discovery. We employ biased peptide combinatorial libraries as a hit discovery strategy with subsequent optimization via iterative targeted libraries. Peptide-to-peptidomimetic optimization through targeted library design was applied based on structure-guided library design around the interaction of the endogenous peptide ligand with three target Kme readers. Efforts targeting the WD40 reader EED led to the discovery of the 3-mer peptidomimetic ligand UNC5115 while combinatorial repurposing of UNC3866 for off-target chromodomains resulted in the discovery of UNC4991, a CDYL/2-selective ligand, and UNC4848, a MPP8 and CDYL/2 ligand. Ultimately, our efforts demonstrate the generalizability of a peptidomimetic combinatorial platform for the optimization of Kme reader ligands in a target class manner.
An Adaptive Niching Genetic Algorithm using a niche size equalization mechanism
NASA Astrophysics Data System (ADS)
Nagata, Yuichi
Niching GAs have been widely investigated to apply genetic algorithms (GAs) to multimodal function optimization problems. In this paper, we suggest a new niching GA that attempts to form niches, each consisting of an equal number of individuals. The proposed GA can be applied also to combinatorial optimization problems by defining a distance metric in the search space. We apply the proposed GA to the job-shop scheduling problem (JSP) and demonstrate that the proposed niching method enhances the ability to maintain niches and improve the performance of GAs.
NASA Astrophysics Data System (ADS)
Tahernezhad-Javazm, Farajollah; Azimirad, Vahid; Shoaran, Maryam
2018-04-01
Objective. Considering the importance and the near-future development of noninvasive brain-machine interface (BMI) systems, this paper presents a comprehensive theoretical-experimental survey on the classification and evolutionary methods for BMI-based systems in which EEG signals are used. Approach. The paper is divided into two main parts. In the first part, a wide range of different types of the base and combinatorial classifiers including boosting and bagging classifiers and evolutionary algorithms are reviewed and investigated. In the second part, these classifiers and evolutionary algorithms are assessed and compared based on two types of relatively widely used BMI systems, sensory motor rhythm-BMI and event-related potentials-BMI. Moreover, in the second part, some of the improved evolutionary algorithms as well as bi-objective algorithms are experimentally assessed and compared. Main results. In this study two databases are used, and cross-validation accuracy (CVA) and stability to data volume (SDV) are considered as the evaluation criteria for the classifiers. According to the experimental results on both databases, regarding the base classifiers, linear discriminant analysis and support vector machines with respect to CVA evaluation metric, and naive Bayes with respect to SDV demonstrated the best performances. Among the combinatorial classifiers, four classifiers, Bagg-DT (bagging decision tree), LogitBoost, and GentleBoost with respect to CVA, and Bagging-LR (bagging logistic regression) and AdaBoost (adaptive boosting) with respect to SDV had the best performances. Finally, regarding the evolutionary algorithms, single-objective invasive weed optimization (IWO) and bi-objective nondominated sorting IWO algorithms demonstrated the best performances. Significance. We present a general survey on the base and the combinatorial classification methods for EEG signals (sensory motor rhythm and event-related potentials) as well as their optimization methods through the evolutionary algorithms. In addition, experimental and statistical significance tests are carried out to study the applicability and effectiveness of the reviewed methods.
Exact solution of large asymmetric traveling salesman problems.
Miller, D L; Pekny, J F
1991-02-15
The traveling salesman problem is one of a class of difficult problems in combinatorial optimization that is representative of a large number of important scientific and engineering problems. A survey is given of recent applications and methods for solving large problems. In addition, an algorithm for the exact solution of the asymmetric traveling salesman problem is presented along with computational results for several classes of problems. The results show that the algorithm performs remarkably well for some classes of problems, determining an optimal solution even for problems with large numbers of cities, yet for other classes, even small problems thwart determination of a provably optimal solution.
Towards a theory of automated elliptic mesh generation
NASA Technical Reports Server (NTRS)
Cordova, J. Q.
1992-01-01
The theory of elliptic mesh generation is reviewed and the fundamental problem of constructing computational space is discussed. It is argued that the construction of computational space is an NP-Complete problem and therefore requires a nonstandard approach for its solution. This leads to the development of graph-theoretic, combinatorial optimization and integer programming algorithms. Methods for the construction of two dimensional computational space are presented.
Developing recombinant antibodies for biomarker detection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baird, Cheryl L.; Fischer, Christopher J.; Pefaur, Noah B.
2010-10-01
Monoclonal antibodies (mAbs) have an essential role in biomarker validation and diagnostic assays. A barrier to pursuing these applications is the reliance on immunization and hybridomas to produce mAbs, which is time-consuming and may not yield the desired mAb. We recommend a process flow for affinity reagent production that utilizes combinatorial protein display systems (eg, yeast surface display or phage display) rather than hybridomas. These systems link a selectable phenotype-binding conferred by an antibody fragment-with a means for recovering the encoding gene. Recombinant libraries obtained from immunizations can produce high-affinity antibodies (<10 nM) more quickly than other methods. Non-immune libraries provide an alternate route when immunizations are not possible, or when suitable mAbs are not recovered from an immune library. Directed molecular evolution (DME) is an integral part of optimizing mAbs obtained from combinatorial protein display, but can also be used on hybridoma-derived mAbs. Variants can easily be obtained and screened to increase the affinity of the parent mAb (affinity maturation). We discuss examples where DME has been used to tailor affinity reagents to specific applications.
Zhang, Guozhu; Xie, Changsheng; Zhang, Shunping; Zhao, Jianwei; Lei, Tao; Zeng, Dawen
2014-09-08
A combinatorial high-throughput temperature-programmed method to obtain the optimal operating temperature (OOT) of gas sensor materials is demonstrated here for the first time. A material library consisting of SnO2, ZnO, WO3, and In2O3 sensor films was fabricated by screen printing. Temperature-dependent conductivity curves were obtained by scanning this gas sensor library from 300 to 700 K in different atmospheres (dry air, formaldehyde, carbon monoxide, nitrogen dioxide, toluene and ammonia), giving the OOT of each sensor formulation as a function of the carrier and analyte gases. A comparative study of the temperature-programmed method and a conventional method showed good agreement in measured OOT.
cDREM: inferring dynamic combinatorial gene regulation.
Wise, Aaron; Bar-Joseph, Ziv
2015-04-01
Genes are often combinatorially regulated by multiple transcription factors (TFs). Such combinatorial regulation plays an important role in development and facilitates the ability of cells to respond to different stresses. While a number of approaches have utilized sequence and ChIP-based datasets to study combinatorial regulation, these have often ignored the combinatorial logic and the dynamics associated with such regulation. Here we present cDREM, a new method for reconstructing dynamic models of combinatorial regulation. cDREM integrates time series gene expression data with (static) protein interaction data. The method is based on a hidden Markov model and utilizes the sparse group Lasso to identify small subsets of combinatorially active TFs, their time of activation, and the logical function they implement. We tested cDREM on yeast and human data sets. Using yeast we show that the predicted combinatorial sets agree with other high throughput genomic datasets and improve upon prior methods developed to infer combinatorial regulation. Applying cDREM to study the human response to flu, we were able to identify several combinatorial TF sets, some of which were known to regulate immune response while others represent novel combinations of important TFs.
Su, Zhangli
2016-01-01
Combinatorial patterns of histone modifications are key indicators of different chromatin states. Most of the current approaches rely on the usage of antibodies to analyze combinatorial histone modifications. Here we detail an antibody-free method named MARCC (Matrix-Assisted Reader Chromatin Capture) to enrich combinatorial histone modifications. The combinatorial patterns are enriched on native nucleosomes extracted from cultured mammalian cells and prepared by micrococcal nuclease digestion. Such enrichment is achieved by recombinant chromatin-interacting protein modules, or so-called reader domains, which can bind in a combinatorial modification-dependent manner. The enriched chromatin can be quantified by western blotting or mass spectrometry for the co-existence of histone modifications, while the associated DNA content can be analyzed by qPCR or next-generation sequencing. Altogether, MARCC provides a reproducible, efficient and customizable solution to enrich and analyze combinatorial histone modifications. PMID:26131849
Hybridization of decomposition and local search for multiobjective optimization.
Ke, Liangjun; Zhang, Qingfu; Battiti, Roberto
2014-10-01
Combining ideas from evolutionary algorithms, decomposition approaches, and Pareto local search, this paper suggests a simple yet efficient memetic algorithm for combinatorial multiobjective optimization problems: memetic algorithm based on decomposition (MOMAD). It decomposes a combinatorial multiobjective problem into a number of single objective optimization problems using an aggregation method. MOMAD evolves three populations: 1) population P(L) for recording the current solution to each subproblem; 2) population P(P) for storing starting solutions for Pareto local search; and 3) an external population P(E) for maintaining all the nondominated solutions found so far during the search. A problem-specific single objective heuristic can be applied to these subproblems to initialize the three populations. At each generation, a Pareto local search method is first applied to search a neighborhood of each solution in P(P) to update P(L) and P(E). Then a single objective local search is applied to each perturbed solution in P(L) for improving P(L) and P(E), and reinitializing P(P). The procedure is repeated until a stopping condition is met. MOMAD provides a generic hybrid multiobjective algorithmic framework in which problem specific knowledge, well developed single objective local search and heuristics and Pareto local search methods can be hybridized. It is a population based iterative method and thus an anytime algorithm. Extensive experiments have been conducted in this paper to study MOMAD and compare it with some other state-of-the-art algorithms on the multiobjective traveling salesman problem and the multiobjective knapsack problem. The experimental results show that our proposed algorithm outperforms or performs similarly to the best so far heuristics on these two problems.
NASA Technical Reports Server (NTRS)
Phillips, K.
1976-01-01
A mathematical model for job scheduling in a specified context is presented. The model uses both linear programming and combinatorial methods. While designed with a view toward optimization of scheduling of facility and plant operations at the Deep Space Communications Complex, the context is sufficiently general to be widely applicable. The general scheduling problem including options for scheduling objectives is discussed and fundamental parameters identified. Mathematical algorithms for partitioning problems germane to scheduling are presented.
Solving optimization problems by the public goods game
NASA Astrophysics Data System (ADS)
Javarone, Marco Alberto
2017-09-01
We introduce a method based on the Public Goods Game for solving optimization tasks. In particular, we focus on the Traveling Salesman Problem, i.e., an NP-hard problem whose search space grows exponentially with the number of cities. The proposed method considers a population whose agents are provided with a random solution to the given problem. In doing so, agents interact by playing the Public Goods Game, using the fitness of their solution as the currency of the game. Notably, agents with better solutions provide higher contributions, while those with lower ones tend to imitate the solutions of richer agents in order to increase their fitness. Numerical simulations show that the proposed method allows exact solutions, as well as suboptimal ones, to be computed in the considered search spaces. As a result, beyond proposing a new heuristic for combinatorial optimization problems, our work aims to highlight the potential of evolutionary game theory beyond its current horizons.
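A loose sketch of the imitation dynamic described in the abstract is given below for a small random TSP instance; it keeps only the imitate-the-richer-agent ingredient (with a 2-opt style perturbation) and omits the contribution and redistribution mechanics of the actual Public Goods Game, so it should be read as an illustration rather than the authors' algorithm.

```python
import random

random.seed(0)
n_cities, n_agents, rounds, group_size = 25, 40, 400, 5
cities = [(random.random(), random.random()) for _ in range(n_cities)]

def length(tour):
    return sum(((cities[tour[i]][0] - cities[tour[i - 1]][0]) ** 2 +
                (cities[tour[i]][1] - cities[tour[i - 1]][1]) ** 2) ** 0.5
               for i in range(n_cities))

def perturb(tour):
    """Reverse a random segment (a 2-opt style move) so imitation is not a pure copy."""
    i, j = sorted(random.sample(range(n_cities), 2))
    return tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]

agents = [random.sample(range(n_cities), n_cities) for _ in range(n_agents)]
for _ in range(rounds):
    group = random.sample(range(n_agents), group_size)
    # Fitness (inverse tour length) plays the role of the agents' wealth in the game;
    # the poorest member of the group imitates a perturbed copy of the richest member's tour.
    ranked = sorted(group, key=lambda a: length(agents[a]))
    agents[ranked[-1]] = perturb(agents[ranked[0]])

print("best tour length:", round(min(length(t) for t in agents), 3))
```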
Automated Lead Optimization of MMP-12 Inhibitors Using a Genetic Algorithm.
Pickett, Stephen D; Green, Darren V S; Hunt, David L; Pardoe, David A; Hughes, Ian
2011-01-13
Traditional lead optimization projects involve long synthesis and testing cycles, favoring extensive structure-activity relationship (SAR) analysis and molecular design steps, in an attempt to limit the number of cycles that a project must run to optimize a development candidate. Microfluidic-based chemistry and biology platforms, with cycle times of minutes rather than weeks, lend themselves to unattended autonomous operation. The bottleneck in the lead optimization process is therefore shifted from synthesis or test to SAR analysis and design. As such, the way is open to an algorithm-directed process, without the need for detailed user data analysis. Here, we present results of two synthesis and screening experiments, undertaken using traditional methodology, to validate a genetic algorithm optimization process for future application to a microfluidic system. The algorithm has several novel features that are important for the intended application. For example, it is robust to missing data and can suggest compounds for retest to ensure reliability of optimization. The algorithm is first validated on a retrospective analysis of an in-house library embedded in a larger virtual array of presumed inactive compounds. In a second, prospective experiment with MMP-12 as the target protein, 140 compounds are submitted for synthesis over 10 cycles of optimization. Comparison is made to the results from the full combinatorial library that was synthesized manually and tested independently. The results show that compounds selected by the algorithm are heavily biased toward the more active regions of the library, while the algorithm is robust to both missing data (compounds where synthesis failed) and inactive compounds. This publication places the full combinatorial library and biological data into the public domain with the intention of advancing research into algorithm-directed lead optimization methods.
Zunder, Eli R.; Finck, Rachel; Behbehani, Gregory K.; Amir, El-ad D.; Krishnaswamy, Smita; Gonzalez, Veronica D.; Lorang, Cynthia G.; Bjornson, Zach; Spitzer, Matthew H.; Bodenmiller, Bernd; Fantl, Wendy J.; Pe’er, Dana; Nolan, Garry P.
2015-01-01
Mass-tag cell barcoding (MCB) labels individual cell samples with unique combinatorial barcodes, after which they are pooled for processing and measurement as a single multiplexed sample. The MCB method eliminates variability between samples in antibody staining and instrument sensitivity, reduces antibody consumption, and shortens instrument measurement time. Here, we present an optimized MCB protocol with several improvements over previously described methods. The use of palladium-based labeling reagents expands the number of measurement channels available for mass cytometry and reduces interference with lanthanide-based antibody measurement. An error-detecting combinatorial barcoding scheme allows cell doublets to be identified and removed from the analysis. A debarcoding algorithm that is single cell-based rather than population-based improves the accuracy and efficiency of sample deconvolution. This debarcoding algorithm has been packaged into software that allows rapid and unbiased sample deconvolution. The MCB procedure takes 3–4 h, not including sample acquisition time of ~1 h per million cells. PMID:25612231
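A minimal sketch of an error-detecting k-of-n barcoding scheme of the kind described (for example, 3 of 6 metal channels) and a simple top-k debarcoding rule is shown below; the intensity model and separation threshold are illustrative assumptions, not the published debarcoding algorithm.

```python
from itertools import combinations
import random

n_channels, k = 6, 3                      # label each sample with exactly k of n mass-tag channels
barcodes = list(combinations(range(n_channels), k))     # C(6, 3) = 20 distinct sample barcodes
print(len(barcodes), "barcodes")

# Any doublet of two different samples is positive in more than k channels, so it is detectable.
assert all(len(set(a) | set(b)) > k for a, b in combinations(barcodes, 2))

def debarcode(intensity, min_separation=1.0):
    """Single-cell style deconvolution: the k brightest channels define the barcode,
    and the gap between the k-th and (k+1)-th brightest must exceed a threshold,
    otherwise the event is flagged (e.g. as a doublet or a poorly stained cell)."""
    order = sorted(range(n_channels), key=lambda c: intensity[c], reverse=True)
    if intensity[order[k - 1]] - intensity[order[k]] < min_separation:
        return None
    code = tuple(sorted(order[:k]))
    return code if code in barcodes else None

random.seed(0)
sample = barcodes[7]
cell = [5.0 + random.random() if c in sample else random.random() for c in range(n_channels)]
doublet = [5.0 + random.random() if c in set(barcodes[7]) | set(barcodes[12]) else random.random()
           for c in range(n_channels)]
print(debarcode(cell), debarcode(doublet))   # the doublet fails the separation check
```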
Combinatorics of least-squares trees.
Mihaescu, Radu; Pachter, Lior
2008-09-09
A recurring theme in the least-squares approach to phylogenetics has been the discovery of elegant combinatorial formulas for the least-squares estimates of edge lengths. These formulas have proved useful for the development of efficient algorithms, and have also been important for understanding connections among popular phylogeny algorithms. For example, the selection criterion of the neighbor-joining algorithm is now understood in terms of the combinatorial formulas of Pauplin for estimating tree length. We highlight a phylogenetically desirable property that weighted least-squares methods should satisfy, and provide a complete characterization of methods that satisfy the property. The necessary and sufficient condition is a multiplicative four-point condition that the variance matrix needs to satisfy. The proof is based on the observation that the Lagrange multipliers in the proof of the Gauss-Markov theorem are tree-additive. Our results generalize and complete previous work on ordinary least squares, balanced minimum evolution, and the taxon-weighted variance model. They also provide a time-optimal algorithm for computation.
NASA Astrophysics Data System (ADS)
Ushijima, Timothy T.; Yeh, William W.-G.
2013-10-01
An optimal experimental design algorithm is developed to select locations for a network of observation wells that provide maximum information about unknown groundwater pumping in a confined, anisotropic aquifer. The design uses a maximal information criterion that chooses, among competing designs, the design that maximizes the sum of squared sensitivities while conforming to specified design constraints. The formulated optimization problem is non-convex and contains integer variables necessitating a combinatorial search. Given a realistic large-scale model, the size of the combinatorial search required can make the problem difficult, if not impossible, to solve using traditional mathematical programming techniques. Genetic algorithms (GAs) can be used to perform the global search; however, because a GA requires a large number of calls to a groundwater model, the formulated optimization problem still may be infeasible to solve. As a result, proper orthogonal decomposition (POD) is applied to the groundwater model to reduce its dimensionality. Then, the information matrix in the full model space can be searched without solving the full model. Results from a small-scale test case show identical optimal solutions among the GA, integer programming, and exhaustive search methods. This demonstrates the GA's ability to determine the optimal solution. In addition, the results show that a GA with POD model reduction is several orders of magnitude faster in finding the optimal solution than a GA using the full model. The proposed experimental design algorithm is applied to a realistic, two-dimensional, large-scale groundwater problem. The GA converged to a solution for this large-scale problem.
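The design criterion itself is easy to state in code. The sketch below evaluates the sum of squared sensitivities for small candidate designs by exhaustive search, with a made-up per-zone constraint; the sensitivity matrix here is synthetic, whereas in the paper it comes from the groundwater model or its POD-reduced surrogate, and the GA replaces the exhaustive search at realistic scales.

```python
from itertools import combinations
import numpy as np

rng = np.random.default_rng(0)
n_candidates, n_params, n_wells = 12, 3, 4
# J[i, k] = sensitivity of the head at candidate well i to unknown pumping parameter k.
J = rng.normal(size=(n_candidates, n_params))
zones = rng.integers(0, 3, n_candidates)        # illustrative design constraint: at most 2 wells per zone

def information(design):
    """Maximal information criterion used here: sum of squared sensitivities of the design."""
    return float((J[list(design)] ** 2).sum())

def feasible(design):
    return np.bincount(zones[list(design)], minlength=3).max() <= 2

best = max((d for d in combinations(range(n_candidates), n_wells) if feasible(d)), key=information)
print("optimal wells:", best, "criterion:", round(information(best), 3))
```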
USDA-ARS's Scientific Manuscript database
Ant Colony Optimization (ACO) refers to the family of algorithms inspired by the behavior of real ants and used to solve combinatorial problems such as the Traveling Salesman Problem (TSP).Optimal Foraging Theory (OFT) is an evolutionary principle wherein foraging organisms or insect parasites seek ...
Genetic algorithms for the vehicle routing problem
NASA Astrophysics Data System (ADS)
Volna, Eva
2016-06-01
The Vehicle Routing Problem (VRP) is one of the most challenging combinatorial optimization tasks. This problem consists in designing the optimal set of routes for fleet of vehicles in order to serve a given set of customers. Evolutionary algorithms are general iterative algorithms for combinatorial optimization. These algorithms have been found to be very effective and robust in solving numerous problems from a wide range of application domains. This problem is known to be NP-hard; hence many heuristic procedures for its solution have been suggested. For such problems it is often desirable to obtain approximate solutions, so they can be found fast enough and are sufficiently accurate for the purpose. In this paper we have performed an experimental study that indicates the suitable use of genetic algorithms for the vehicle routing problem.
Luo, Li; Luo, Le; Zhang, Xinli; He, Xiaoli
2017-07-10
Accurate forecasting of hospital outpatient visits is beneficial for the reasonable planning and allocation of healthcare resources to meet medical demand. Given the multiple attributes of daily outpatient visits, such as randomness, cyclicity and trend, time series methods such as ARIMA can be a good choice for outpatient visit forecasting. On the other hand, hospital outpatient visits are also affected by the doctors' scheduling, and these effects are not purely random. To account for this, this paper presents a new forecasting model that takes cyclicity and the day-of-the-week effect into consideration. We formulate a seasonal ARIMA (SARIMA) model on the daily time series and a single exponential smoothing (SES) model on the day-of-the-week time series, and finally establish a combinatorial model by modifying them. The models are applied to one year of daily visit data for urban outpatients in two internal medicine departments of a large hospital in Chengdu, to forecast daily outpatient visits about one week ahead. The proposed model is applied to forecast the cross-sectional data for 7 consecutive days of daily outpatient visits over an 8-week period, based on 43 weeks of observation data collected during one year. The results show that the two single traditional models and the combinatorial model are simple to implement and computationally light, while being appropriate for short-term forecast horizons. Furthermore, the combinatorial model captures the comprehensive features of the time series data better. The combinatorial model achieves better prediction performance than either single model, with lower residual variance and a small mean residual error, and remains to be optimized further in future work.
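A rough sketch of such a combination on synthetic data is shown below, using statsmodels: a weekly-seasonal SARIMA on the daily series plus one SES model per weekday, merged here by a simple average. The exact modification and combination used by the authors is not reproduced; the series, model orders and weights are illustrative.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX
from statsmodels.tsa.holtwinters import SimpleExpSmoothing

rng = np.random.default_rng(0)
days = pd.date_range("2023-01-02", periods=43 * 7, freq="D")          # 43 weeks of observations
weekday_effect = np.array([120, 110, 105, 100, 95, 40, 30])           # synthetic day-of-week pattern
visits = pd.Series(weekday_effect[days.dayofweek] + rng.normal(0, 8, len(days)), index=days)

# Component 1: weekly-seasonal ARIMA fitted to the full daily series.
sarima = SARIMAX(visits, order=(1, 0, 1), seasonal_order=(1, 1, 1, 7)).fit(disp=False)
f_sarima = sarima.forecast(steps=7)

# Component 2: one SES forecast per day of the week, fitted to that weekday's own history.
ses = np.array([float(SimpleExpSmoothing(visits[visits.index.dayofweek == d].to_numpy())
                      .fit().forecast(1)[0]) for d in range(7)])
f_ses = ses[f_sarima.index.dayofweek]

combined = 0.5 * f_sarima.to_numpy() + 0.5 * f_ses                    # simple average as the combination
print(np.round(combined, 1))
```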
Heterogeneous Catalysis: Understanding for Designing, and Designing for Applications.
Corma, Avelino
2016-05-17
"… Despite the introduction of high-throughput and combinatorial methods that certainly can be useful in the process of catalysts optimization, it is recognized that the generation of fundamental knowledge at the molecular level is key for the development of new concepts and for reaching the final objective of solid catalysts by design …" Read more in the Editorial by Avelino Corma. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Optimal mapping of irregular finite element domains to parallel processors
NASA Technical Reports Server (NTRS)
Flower, J.; Otto, S.; Salama, M.
1987-01-01
Mapping the solution domain of n finite elements into N subdomains that may be processed in parallel by N processors is optimal if the subdomain decomposition results in a well-balanced workload distribution among the processors. The problem is discussed in the context of irregular finite element domains as an important aspect of the efficient utilization of the capabilities of emerging multiprocessor computers. Finding the optimal mapping is an intractable combinatorial optimization problem, for which a satisfactory approximate solution is obtained here by analogy to a method used in statistical mechanics for simulating the annealing process in solids. The simulated annealing analogy and algorithm are described, and numerical results are given for mapping an irregular two-dimensional finite element domain containing a singularity onto the Hypercube computer.
'Extremotaxis': computing with a bacterial-inspired algorithm.
Nicolau, Dan V; Burrage, Kevin; Nicolau, Dan V; Maini, Philip K
2008-01-01
We present a general-purpose optimization algorithm inspired by "run-and-tumble", the biased random walk chemotactic swimming strategy used by the bacterium Escherichia coli to locate regions of high nutrient concentration. The method uses particles (corresponding to bacteria) that swim through the variable space (corresponding to the attractant concentration profile). By constantly performing temporal comparisons, the particles drift towards the minimum or maximum of the function of interest. We illustrate the use of our method with four examples. We also present a discrete version of the algorithm. The new algorithm is expected to be useful in combinatorial optimization problems involving many variables, where the functional landscape is apparently stochastic and has local minima, but preserves some derivative structure at intermediate scales.
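A minimal sketch of a run-and-tumble style optimizer, assuming a continuous test function and illustrative step-size and particle-count settings (the discrete variant mentioned in the abstract is not shown):

```python
import numpy as np

def run_and_tumble(f, dim, n_particles=30, steps=400, step_size=0.1, bounds=(-5, 5)):
    """Each particle keeps swimming in its current random direction while the
    objective keeps improving (a 'run') and picks a fresh random direction when
    it stops improving (a 'tumble'), mimicking E. coli chemotaxis."""
    rng = np.random.default_rng(0)
    x = rng.uniform(*bounds, size=(n_particles, dim))
    d = rng.normal(size=(n_particles, dim))
    d /= np.linalg.norm(d, axis=1, keepdims=True)
    fx = np.array([f(p) for p in x])
    for _ in range(steps):
        x_new = np.clip(x + step_size * d, *bounds)
        f_new = np.array([f(p) for p in x_new])
        improved = f_new < fx
        x[improved], fx[improved] = x_new[improved], f_new[improved]
        # Tumble: particles that did not improve choose a new random direction.
        n_bad = int((~improved).sum())
        new_d = rng.normal(size=(n_bad, dim))
        d[~improved] = new_d / np.linalg.norm(new_d, axis=1, keepdims=True)
    return x[np.argmin(fx)], fx.min()

sphere = lambda p: float(np.sum(p ** 2))
best_x, best_f = run_and_tumble(sphere, dim=5)
print(round(best_f, 4))
```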
Optimized Reaction Conditions for Amide Bond Formation in DNA-Encoded Combinatorial Libraries.
Li, Yizhou; Gabriele, Elena; Samain, Florent; Favalli, Nicholas; Sladojevich, Filippo; Scheuermann, Jörg; Neri, Dario
2016-08-08
DNA-encoded combinatorial libraries are increasingly being used as tools for the discovery of small organic binding molecules to proteins of biological or pharmaceutical interest. In the majority of cases, synthetic procedures for the formation of DNA-encoded combinatorial libraries incorporate at least one step of amide bond formation between amino-modified DNA and a carboxylic acid. We investigated reaction conditions and established a methodology by using 1-ethyl-3-(3-(dimethylamino)propyl)carbodiimide, 1-hydroxy-7-azabenzotriazole and N,N'-diisopropylethylamine (EDC/HOAt/DIPEA) in combination, which provided conversions greater than 75% for 423/543 (78%) of the carboxylic acids tested. These reaction conditions were efficient with a variety of primary and secondary amines, as well as with various types of amino-modified oligonucleotides. The reaction conditions, which also worked efficiently over a broad range of DNA concentrations and reaction scales, should facilitate the synthesis of novel DNA-encoded combinatorial libraries.
FOREWORD: Focus on Combinatorial Materials Science Focus on Combinatorial Materials Science
NASA Astrophysics Data System (ADS)
Chikyo, Toyohiro
2011-10-01
About 15 years have passed since the introduction of modern combinatorial synthesis and high-throughput techniques for the development of novel inorganic materials; however, similar methods existed before. The most famous was reported in 1970 by Hanak who prepared composition-spread films of metal alloys by sputtering mixed-material targets. Although this method was innovative, it was rarely used because of the large amount of data to be processed. This problem is solved in the modern combinatorial material research, which is strongly related to computer data analysis and robotics. This field is still at the developing stage and may be enriched by new methods. Nevertheless, given the progress in measurement equipment and procedures, we believe the combinatorial approach will become a major and standard tool of materials screening and development. The first article of this journal, published in 2000, was titled 'Combinatorial solid state materials science and technology', and this focus issue aims to reintroduce this topic to the Science and Technology of Advanced Materials audience. It covers recent progress in combinatorial materials research describing new results in catalysis, phosphors, polymers and metal alloys for shape memory materials. Sophisticated high-throughput characterization schemes and innovative synthesis tools are also presented, such as spray deposition using nanoparticles or ion plating. On a technical note, data handling systems are introduced to familiarize researchers with the combinatorial methodology. We hope that through this focus issue a wide audience of materials scientists can learn about recent and future trends in combinatorial materials science and high-throughput experimentation.
Chan, Ting-Shan; Liu, Yao-Min; Liu, Ru-Shi
2008-01-01
The present investigation aims at the synthesis of KSr(1-x-y)PO4:Tb(3+)x,Eu(2+)y phosphors using the combinatorial chemistry method. We have developed square-type arrays consisting of 121 compositions to investigate the optimum composition and luminescence properties of the KSrPO4 host matrix under 365 nm ultraviolet (UV) light. The optimized compositions of the phosphors were found to be KSr0.93PO4:Tb(3+)0.07 (green) and KSr0.995PO4:Eu(2+)0.005 (blue). These phosphors showed thermal luminescence stability better than that of commercially available YAG:Ce at temperatures above 200 degrees C. The result indicates that KSr(1-x-y)PO4:Tb(3+)x,Eu(2+)y can be potentially useful as a UV radiation-converting phosphor for light-emitting diodes.
Martínez-Ceron, María C; Giudicessi, Silvana L; Marani, Mariela M; Albericio, Fernando; Cascone, Osvaldo; Erra-Balsells, Rosa; Camperi, Silvia A
2010-05-15
Optimization of bead analysis by matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS) after the screening of one-bead-one-peptide combinatorial libraries was achieved by fine-tuning the whole process. Guanidine was replaced by acetonitrile (MeCN)/acetic acid (AcOH)/water (H(2)O), improving matrix crystallization. Peptide-bead cleavage with NH(4)OH was cheaper and safer than, yet as efficient as, NH(3)/tetrahydrofuran (THF). Peptide elution in microtubes instead of placing the beads on the sample plate yielded more sample aliquots. Sample preparation by successive deposition of dry layers was better than the dried-droplet method. Among the matrices analyzed, alpha-cyano-4-hydroxycinnamic acid gave the best peptide ion yield. Cluster formation was minimized by the addition of additives to the matrix. Copyright 2010 Elsevier Inc. All rights reserved.
Automatic design of synthetic gene circuits through mixed integer non-linear programming.
Huynh, Linh; Kececioglu, John; Köppe, Matthias; Tagkopoulos, Ilias
2012-01-01
Automatic design of synthetic gene circuits poses a significant challenge to synthetic biology, primarily due to the complexity of biological systems, and the lack of rigorous optimization methods that can cope with the combinatorial explosion as the number of biological parts increases. Current optimization methods for synthetic gene design rely on heuristic algorithms that are usually not deterministic, deliver sub-optimal solutions, and provide no guarantees on convergence or error bounds. Here, we introduce an optimization framework for the problem of part selection in synthetic gene circuits that is based on mixed integer non-linear programming (MINLP), which is a deterministic method that finds the globally optimal solution and guarantees convergence in finite time. Given a synthetic gene circuit, a library of characterized parts, and user-defined constraints, our method can find the optimal selection of parts that satisfies the constraints and best approximates the objective function given by the user. We evaluated the proposed method in the design of three synthetic circuits (a toggle switch, a transcriptional cascade, and a band detector), with both experimentally constructed and synthetic promoter libraries. Scalability and robustness analysis shows that the proposed framework scales well with the library size and the solution space. The work described here is a step towards a unifying, realistic framework for the automated design of biological circuits.
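To give a feel for the combinatorial explosion in part selection that motivates the MINLP formulation above, here is a minimal brute-force sketch in Python: it enumerates promoter/RBS choices for a toy two-gene circuit and scores each selection against a target expression profile. The part names, strengths, target levels and burden constraint are all hypothetical, and exhaustive enumeration stands in for, rather than reproduces, the paper's deterministic optimizer.

```python
from itertools import product

# Hypothetical characterized part libraries: name -> relative strength.
promoters = {"pLow": 0.2, "pMed": 1.0, "pHigh": 5.0}
rbss      = {"rWeak": 0.1, "rMid": 1.0, "rStrong": 4.0}

# Toy circuit: predicted expression of a gene = promoter strength * RBS strength.
target = {"geneA": 2.0, "geneB": 0.5}      # desired expression levels (assumed)
max_total = 6.0                            # made-up resource/burden constraint

def objective(selection):
    """Sum of squared deviations from the target expression profile."""
    return sum((promoters[p] * rbss[r] - target[g]) ** 2
               for g, (p, r) in selection.items())

best = None
for parts in product(product(promoters, rbss), repeat=len(target)):
    selection = dict(zip(target, parts))
    total = sum(promoters[p] * rbss[r] for p, r in parts)
    if total > max_total:                  # discard constraint-violating designs
        continue
    score = objective(selection)
    if best is None or score < best[0]:
        best = (score, selection)

print("best selection:", best)
# The candidate count grows as (|promoters| * |RBSs|) ** n_genes, which is why
# exhaustive enumeration breaks down for realistic libraries.
```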
Galaxy Redshifts from Discrete Optimization of Correlation Functions
NASA Astrophysics Data System (ADS)
Lee, Benjamin C. G.; Budavári, Tamás; Basu, Amitabh; Rahman, Mubdi
2016-12-01
We propose a new method of constraining the redshifts of individual extragalactic sources based on celestial coordinates and their ensemble statistics. Techniques from integer linear programming (ILP) are utilized to optimize simultaneously for the angular two-point cross- and autocorrelation functions. Our novel formalism introduced here not only transforms the otherwise hopelessly expensive, brute-force combinatorial search into a linear system with integer constraints but also is readily implementable in off-the-shelf solvers. We adopt Gurobi, a commercial optimization solver, and use Python to build the cost function dynamically. The preliminary results on simulated data show potential for future applications to sky surveys by complementing and enhancing photometric redshift estimators. Our approach is the first application of ILP to astronomical analysis.
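As a rough illustration of the workflow mentioned above (building an ILP in Python and handing it to Gurobi), the sketch below casts a toy source-to-bin assignment as an integer program in gurobipy. The per-source, per-bin costs are random placeholders, the model omits the pairwise correlation-function terms that make the real formulation interesting, and it assumes a local Gurobi installation and license.

```python
import numpy as np
import gurobipy as gp
from gurobipy import GRB

rng = np.random.default_rng(0)
n_src, n_bin = 50, 8
cost = rng.random((n_src, n_bin))          # placeholder per-source/bin costs

m = gp.Model("toy_redshift_assignment")
x = m.addVars(n_src, n_bin, vtype=GRB.BINARY, name="x")

# each source is assigned to exactly one redshift bin
m.addConstrs((x.sum(i, "*") == 1 for i in range(n_src)), name="one_bin")

m.setObjective(gp.quicksum(cost[i, k] * x[i, k]
                           for i in range(n_src) for k in range(n_bin)),
               GRB.MINIMIZE)
m.optimize()

assignment = [k for i in range(n_src) for k in range(n_bin) if x[i, k].X > 0.5]
print(assignment[:10])
```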
Programmable synaptic devices for electronic neural nets
NASA Technical Reports Server (NTRS)
Moopenn, A.; Thakoor, A. P.
1990-01-01
The architecture, design, and operational characteristics of custom VLSI and thin film synaptic devices are described. The devices include CMOS-based synaptic chips containing 1024 reprogrammable synapses with a 6-bit dynamic range, and nonvolatile, write-once, binary synaptic arrays based on memory switching in hydrogenated amorphous silicon films. Their suitability for embodiment of fully parallel and analog neural hardware is discussed. Specifically, a neural network solution to an assignment problem of combinatorial global optimization, implemented in fully parallel hardware using the synaptic chips, is described. The network's ability to provide optimal and near-optimal solutions over a time scale of a few neuron time constants has been demonstrated and suggests a speedup of several orders of magnitude over conventional search methods.
Lin, Jingjing; Jing, Honglei
2016-01-01
The artificial immune system is a recently introduced computational intelligence method inspired by the biological immune system. Most immune-system-inspired algorithms are based on the clonal selection principle and are known as clonal selection algorithms (CSAs). When coping with complex optimization problems characterized by multimodality, high dimension, rotation, and composition, traditional CSAs often suffer from premature convergence and unsatisfactory accuracy. To address these issues, a recombination operator inspired by biological combinatorial recombination is proposed first. The recombination operator generates promising candidate solutions to enhance the search ability of the CSA by fusing the information from randomly chosen parents. Furthermore, a modified hypermutation operator is introduced to construct more promising and efficient candidate solutions. A set of 16 commonly used benchmark functions is adopted to test the effectiveness and efficiency of the recombination and hypermutation operators. The comparisons with the classic CSA, the CSA with the recombination operator (RCSA), and the CSA with the recombination and modified hypermutation operators (RHCSA) demonstrate that the proposed algorithm significantly improves the performance of the classic CSA. Moreover, comparison with state-of-the-art algorithms shows that the proposed algorithm is quite competitive. PMID:27698662
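A minimal sketch of the kind of recombination operator described above, in which an offspring fuses information from two randomly chosen parents of the clone population; the arithmetic-blend/exchange rule and its parameters are illustrative assumptions, not the paper's exact operator.

```python
import random

def recombine(population, blend=0.5):
    """Create one candidate by fusing two randomly chosen parent vectors."""
    p1, p2 = random.sample(population, 2)
    child = []
    for a, b in zip(p1, p2):
        if random.random() < blend:
            child.append(a + random.random() * (b - a))  # arithmetic blend
        else:
            child.append(random.choice((a, b)))          # discrete exchange
    return child

# usage on a toy real-valued population
pop = [[random.uniform(-5, 5) for _ in range(10)] for _ in range(20)]
offspring = recombine(pop)
print(offspring)
```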
Orbit Clustering Based on Transfer Cost
NASA Technical Reports Server (NTRS)
Gustafson, Eric D.; Arrieta-Camacho, Juan J.; Petropoulos, Anastassios E.
2013-01-01
We propose using cluster analysis to perform quick screening for combinatorial global optimization problems. The key missing component currently preventing cluster analysis from use in this context is the lack of a usable metric function defining the cost of transferring between two orbits. We study several proposed metrics and clustering algorithms, including k-means and the expectation maximization algorithm. We also show that proven heuristic methods such as the Q-law can be modified to work with cluster analysis.
An Integrated Method Based on PSO and EDA for the Max-Cut Problem.
Lin, Geng; Guan, Jian
2016-01-01
The max-cut problem is an NP-hard combinatorial optimization problem with many real-world applications. In this paper, we propose an integrated method based on particle swarm optimization and estimation of distribution algorithm (PSO-EDA) for solving the max-cut problem. The integrated algorithm overcomes the shortcomings of particle swarm optimization and the estimation of distribution algorithm. To enhance the performance of PSO-EDA, a fast local search procedure is applied. In addition, a path-relinking procedure is developed to intensify the search. To evaluate the performance of PSO-EDA, extensive experiments were carried out on two sets of benchmark instances with 800 to 20,000 vertices from the literature. Computational results and comparisons show that PSO-EDA significantly outperforms the existing PSO-based and EDA-based algorithms for the max-cut problem. Compared with other best-performing algorithms, PSO-EDA finds very competitive results in terms of solution quality.
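To make the objective and the fast local search concrete, here is a small sketch of the max-cut value function and a single-vertex-flip descent of the kind such a procedure typically refines; the toy instance and the flip-acceptance rule are assumptions for illustration.

```python
import random

def cut_value(edges, side):
    """Total weight of edges crossing the partition side[v] in {0, 1}."""
    return sum(w for u, v, w in edges if side[u] != side[v])

def flip_local_search(n, edges, side):
    """Keep flipping single vertices while any flip improves the cut."""
    best = cut_value(edges, side)
    improved = True
    while improved:
        improved = False
        for v in range(n):
            side[v] ^= 1                       # tentative flip
            val = cut_value(edges, side)
            if val > best:
                best, improved = val, True     # keep the flip
            else:
                side[v] ^= 1                   # undo
    return side, best

# toy weighted graph: (u, v, weight)
edges = [(0, 1, 1), (1, 2, 2), (2, 3, 1), (3, 0, 2), (0, 2, 3)]
start = [random.randint(0, 1) for _ in range(4)]
print(flip_local_search(4, edges, list(start)))
```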
NASA Astrophysics Data System (ADS)
Green, Martin L.; Takeuchi, Ichiro; Hattrick-Simpers, Jason R.
2013-06-01
High throughput (combinatorial) materials science methodology is a relatively new research paradigm that offers the promise of rapid and efficient materials screening, optimization, and discovery. The paradigm started in the pharmaceutical industry but was rapidly adopted to accelerate materials research in a wide variety of areas. High throughput experiments are characterized by synthesis of a "library" sample that contains the materials variation of interest (typically composition), and rapid and localized measurement schemes that result in massive data sets. Because the data are collected at the same time on the same "library" sample, they can be highly uniform with respect to fixed processing parameters. This article critically reviews the literature pertaining to applications of combinatorial materials science for electronic, magnetic, optical, and energy-related materials. It is expected that high throughput methodologies will facilitate commercialization of novel materials for these critically important applications. Despite the overwhelming evidence presented in this paper that high throughput studies can effectively inform commercial practice, in our perception, it remains an underutilized research and development tool. Part of this perception may be due to the inaccessibility of proprietary industrial research and development practices, but clearly the initial cost and availability of high throughput laboratory equipment plays a role. Combinatorial materials science has traditionally been focused on materials discovery, screening, and optimization to combat the extremely high cost and long development times for new materials and their introduction into commerce. Going forward, combinatorial materials science will also be driven by other needs such as materials substitution and experimental verification of materials properties predicted by modeling and simulation, which have recently received much attention with the advent of the Materials Genome Initiative. Thus, the challenge for combinatorial methodology will be the effective coupling of synthesis, characterization and theory, and the ability to rapidly manage large amounts of data in a variety of formats.
Focusing on the golden ball metaheuristic: an extended study on a wider set of problems.
Osaba, E; Diaz, F; Carballedo, R; Onieva, E; Perallos, A
2014-01-01
Nowadays, the development of new metaheuristics for solving optimization problems is a topic of interest in the scientific community, and a large number of techniques of this kind can be found in the literature. Many techniques have been proposed recently, such as the artificial bee colony and the imperialist competitive algorithm. This paper focuses on one recently published technique, Golden Ball (GB). The GB is a multiple-population metaheuristic based on soccer concepts. Although it was designed to solve combinatorial optimization problems, until now it has only been tested with two simple routing problems: the traveling salesman problem and the capacitated vehicle routing problem. In this paper, the GB is applied to four different combinatorial optimization problems. Two of them are routing problems that are more complex than the previously used ones: the asymmetric traveling salesman problem and the vehicle routing problem with backhauls. Additionally, one constraint satisfaction problem (the n-queens problem) and one combinatorial design problem (the one-dimensional bin packing problem) have also been used. The outcomes obtained by GB are compared with those obtained by two different genetic algorithms and two distributed genetic algorithms. Additionally, two statistical tests are conducted to compare these results.
Optimizing Associative Experimental Design for Protein Crystallization Screening
Dinç, Imren; Pusey, Marc L.; Aygün, Ramazan S.
2016-01-01
The goal of protein crystallization screening is to determine the main factors of importance for crystallizing the protein under investigation. One of the major issues in determining these factors is that screening is often expanded to many hundreds or thousands of conditions to maximize coverage of the combinatorial chemical space and thus the chances of a successful (crystalline) outcome. In this paper, we propose an experimental design method called “Associative Experimental Design (AED)” and an optimization method that includes eliminating prohibited combinations and prioritizing reagents based on AED analysis of results from protein crystallization experiments. AED generates candidate cocktails based on these initial screening results, which are analyzed to determine the screening factors in chemical space that are most likely to lead to higher-scoring outcomes, i.e., crystals. We tested AED on three proteins derived from the hyperthermophile Thermococcus thioreducens, and we applied the optimization method to these proteins. Our AED method generated novel cocktails (count given in parentheses) leading to crystals for three proteins as follows: nucleoside diphosphate kinase (4), HAD superfamily hydrolase (2), nucleoside kinase (1). After obtaining these promising results, we tested our optimization method on four different proteins. The AED method with optimization yielded 4, 3, and 20 crystalline conditions for holo human transferrin, an archaeal exosome protein, and nucleoside diphosphate kinase, respectively. PMID:26955046
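The following toy sketch captures the flavor of the associative step: tally which reagent pairs co-occur in high-scoring wells of an initial screen, then propose new cocktails from the prioritized factors while skipping a prohibited combination. The data layout, hit threshold and prohibition rule are invented for illustration and do not reproduce the published AED procedure.

```python
from collections import Counter
from itertools import combinations

# Hypothetical initial screen: (set of reagents in a well, crystallization score 0-9).
screen = [
    ({"PEG4000", "NaCl", "pH6.5"}, 7),
    ({"PEG4000", "MgCl2", "pH7.5"}, 5),
    ({"(NH4)2SO4", "NaCl", "pH6.5"}, 2),
    ({"PEG4000", "NaCl", "pH7.5"}, 8),
]
prohibited = {frozenset({"(NH4)2SO4", "MgCl2"})}   # assumed incompatible pair

# Associate reagent pairs with high-scoring outcomes.
pair_scores = Counter()
for reagents, score in screen:
    if score >= 5:                                  # assumed "hit" threshold
        for pair in combinations(sorted(reagents), 2):
            pair_scores[pair] += score

top_factors = {r for pair, _ in pair_scores.most_common(3) for r in pair}

# Propose candidate cocktails from the prioritized factors, dropping prohibited mixes.
candidates = [set(c) for c in combinations(sorted(top_factors), 3)
              if not any(p <= frozenset(c) for p in prohibited)]
print(candidates)
```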
Optical Sensor/Actuator Locations for Active Structural Acoustic Control
NASA Technical Reports Server (NTRS)
Padula, Sharon L.; Palumbo, Daniel L.; Kincaid, Rex K.
1998-01-01
Researchers at NASA Langley Research Center have extensive experience using active structural acoustic control (ASAC) for aircraft interior noise reduction. One aspect of ASAC involves the selection of optimum locations for microphone sensors and force actuators. This paper explains the importance of sensor/actuator selection, reviews optimization techniques, and summarizes experimental and numerical results. Three combinatorial optimization problems are described. Two involve the determination of the number and position of piezoelectric actuators, and the other involves the determination of the number and location of the sensors. For each case, a solution method is suggested, and typical results are examined. The first case, a simplified problem with simulated data, is used to illustrate the method. The second and third cases are more representative of the potential of the method and use measured data. The three case studies and laboratory test results establish the usefulness of the numerical methods.
Fast globally optimal segmentation of 3D prostate MRI with axial symmetry prior.
Qiu, Wu; Yuan, Jing; Ukwatta, Eranga; Sun, Yue; Rajchl, Martin; Fenster, Aaron
2013-01-01
We propose a novel global optimization approach to segmenting a given 3D prostate T2w magnetic resonance (MR) image, which enforces the inherent axial symmetry of the prostate shape and simultaneously performs a sequence of 2D axial slice-wise segmentations with a global 3D coherence prior. We show that the proposed challenging combinatorial optimization problem can be solved globally and exactly by means of convex relaxation. In this regard, we introduce a novel coupled continuous max-flow model, which is dual to the studied convex relaxed optimization formulation and leads to an efficient multiplier-augmented algorithm based on modern convex optimization theory. Moreover, the new continuous max-flow based algorithm was implemented on GPUs to achieve a substantial improvement in computation. Experimental results using public and in-house datasets demonstrate great advantages of the proposed method in terms of both accuracy and efficiency.
Kwok, T; Smith, K A
2000-09-01
The aim of this paper is to study both the theoretical and experimental properties of chaotic neural network (CNN) models for solving combinatorial optimization problems. Previously we have proposed a unifying framework which encompasses the three main model types, namely, Chen and Aihara's chaotic simulated annealing (CSA) with decaying self-coupling, Wang and Smith's CSA with decaying timestep, and the Hopfield network with chaotic noise. Each of these models can be represented as a special case under the framework for certain conditions. This paper combines the framework with experimental results to provide new insights into the effect of the chaotic neurodynamics of each model. By solving the N-queen problem of various sizes with computer simulations, the CNN models are compared in different parameter spaces, with optimization performance measured in terms of feasibility, efficiency, robustness and scalability. Furthermore, characteristic chaotic neurodynamics crucial to effective optimization are identified, together with a guide to choosing the corresponding model parameters.
Rapid NMR Assignments of Proteins by Using Optimized Combinatorial Selective Unlabeling.
Dubey, Abhinav; Kadumuri, Rajashekar Varma; Jaipuria, Garima; Vadrevu, Ramakrishna; Atreya, Hanudatta S
2016-02-15
A new approach for rapid resonance assignments in proteins based on amino acid selective unlabeling is presented. The method involves choosing a set of multiple amino acid types for selective unlabeling and identifying specific tripeptides surrounding the labeled residues from specific 2D NMR spectra in a combinatorial manner. The methodology directly yields sequence-specific assignments, without requiring a contiguous stretch of amino acid residues to be linked, and is applicable to deuterated proteins. We show that a 2D [15N,1H] HSQC spectrum together with two 2D spectra can result in ∼50% assignments. The methodology was applied to two proteins: an intrinsically disordered protein (12 kDa) and the 29 kDa (268-residue) α-subunit of Escherichia coli tryptophan synthase, which presents a challenging case with spectral overlaps and missing peaks. The method can augment existing approaches and will be useful for applications such as identifying active-site residues involved in ligand binding, phosphorylation, or protein-protein interactions, even prior to complete resonance assignments. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Optimization of cell seeding in a 2D bio-scaffold system using computational models.
Ho, Nicholas; Chua, Matthew; Chui, Chee-Kong
2017-05-01
The cell expansion process is a crucial part of generating cells on a large scale in a bioreactor system. Hence, it is important to set operating conditions (e.g. initial cell seeding distribution, culture medium flow rate) to an optimal level. Often, the initial cell seeding distribution factor is neglected and/or overlooked in the design of a bioreactor using conventional seeding distribution methods. This paper proposes a novel seeding distribution method that aims to maximize cell growth and minimize production time/cost. The proposed method utilizes two computational models; the first model represents cell growth patterns, whereas the second model determines optimal initial cell seeding positions for adherent cell expansions. Cell growth simulation from the first model demonstrates that the model can represent various cell types with known probabilities. The second model involves a combination of combinatorial optimization, Monte Carlo methods and concepts of the first model, and is used to design a multi-layer 2D bio-scaffold system that increases cell production efficiency in bioreactor applications. Simulation results show that the recommended input configurations obtained from the proposed optimization method are optimal, illustrating the effectiveness of the method. The potential of the proposed seeding distribution method as a useful tool to optimize the cell expansion process in modern bioreactor system applications is highlighted. Copyright © 2017 Elsevier Ltd. All rights reserved.
Cryptographic Combinatorial Securities Exchanges
NASA Astrophysics Data System (ADS)
Thorpe, Christopher; Parkes, David C.
We present a useful new mechanism that facilitates the atomic exchange of many large baskets of securities in a combinatorial exchange. Cryptography prevents information about the securities in the baskets from being exploited, enhancing trust. Our exchange offers institutions who wish to trade large positions a new alternative to existing methods of block trading: they can reduce transaction costs by taking advantage of other institutions’ available liquidity, while third party liquidity providers guarantee execution—preserving their desired portfolio composition at all times. In our exchange, institutions submit encrypted orders which are crossed, leaving a “remainder”. The exchange proves facts about the portfolio risk of this remainder to third party liquidity providers without revealing the securities in the remainder, the knowledge of which could also be exploited. The third parties learn either (depending on the setting) the portfolio risk parameters of the remainder itself, or how their own portfolio risk would change if they were to incorporate the remainder into a portfolio they submit. In one setting, these third parties submit bids on the commission, and the winner supplies necessary liquidity for the entire exchange to clear. This guaranteed clearing, coupled with external price discovery from the primary markets for the securities, sidesteps difficult combinatorial optimization problems. This latter method of proving how taking on the remainder would change risk parameters of one’s own portfolio, without revealing the remainder’s contents or its own risk parameters, is a useful protocol of independent interest.
Optimizing Perioperative Decision Making: Improved Information for Clinical Workflow Planning
Doebbeling, Bradley N.; Burton, Matthew M.; Wiebke, Eric A.; Miller, Spencer; Baxter, Laurence; Miller, Donald; Alvarez, Jorge; Pekny, Joseph
2012-01-01
Perioperative care is complex and involves multiple interconnected subsystems. Delayed starts, prolonged cases and overtime are common. Surgical procedures account for 40–70% of hospital revenues and 30–40% of total costs. Most planning and scheduling in healthcare is done without modern planning tools, which have potential for improving access by assisting in operations planning support. We identified key planning scenarios of interest to perioperative leaders, in order to examine the feasibility of applying combinatorial optimization software solving some of those planning issues in the operative setting. Perioperative leaders desire a broad range of tools for planning and assessing alternate solutions. Our modeled solutions generated feasible solutions that varied as expected, based on resource and policy assumptions and found better utilization of scarce resources. Combinatorial optimization modeling can effectively evaluate alternatives to support key decisions for planning clinical workflow and improving care efficiency and satisfaction. PMID:23304284
Combinatorial optimization games
DOE Office of Scientific and Technical Information (OSTI.GOV)
Deng, X.; Ibaraki, Toshihide; Nagamochi, Hiroshi
1997-06-01
We introduce a general integer programming formulation for a class of combinatorial optimization games, which immediately allows us to improve the algorithmic result for finding imputations in the core (an important solution concept in cooperative game theory) of the network flow game on simple networks by Kalai and Zemel. An interesting result is a general theorem that the core for this class of games is nonempty if and only if a related linear program has an integer optimal solution. We study the properties for this mathematical condition to hold for several interesting problems, and apply them to resolve algorithmic and complexity issues for their cores along the following lines: decide whether the core is empty; if the core is not empty, find an imputation in the core; given an imputation x, test whether x is in the core. We also explore the properties of totally balanced games in this succinct formulation of cooperative games.
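As a concrete rendering of the last question in the list above, the sketch below brute-force tests whether an imputation lies in the core of a small cooperative game; the three-player characteristic function is made up, and the exponential enumeration over coalitions is exactly what the succinct integer-programming formulation is designed to avoid.

```python
from itertools import combinations

def in_core(x, v, players):
    """x is in the core iff it is efficient and no coalition S can improve on it."""
    if abs(sum(x[i] for i in players) - v(frozenset(players))) > 1e-9:
        return False                                   # not efficient
    for r in range(1, len(players)):
        for S in combinations(players, r):
            if sum(x[i] for i in S) < v(frozenset(S)) - 1e-9:
                return False                           # coalition S blocks x
    return True

# toy 3-player characteristic function (assumed for illustration)
def v(S):
    return {frozenset(): 0, frozenset("A"): 0, frozenset("B"): 0, frozenset("C"): 0,
            frozenset("AB"): 4, frozenset("AC"): 4, frozenset("BC"): 4,
            frozenset("ABC"): 6}[S]

players = ("A", "B", "C")
print(in_core({"A": 2, "B": 2, "C": 2}, v, players))   # True: the unique core point
print(in_core({"A": 4, "B": 1, "C": 1}, v, players))   # False: coalition {B, C} blocks
```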
2D photonic crystal complete band gap search using a cyclic cellular automaton refination
NASA Astrophysics Data System (ADS)
González-García, R.; Castañón, G.; Hernández-Figueroa, H. E.
2014-11-01
We present a refinement method based on a cyclic cellular automaton (CCA) that simulates a crystallization-like process, aided by a heuristic evolutionary method called differential evolution (DE) to perform an ordered search of full photonic band gaps (FPBGs) in a 2D photonic crystal (PC). The solution is posed as a combinatorial optimization over the elements of a binary array. These elements represent the existence or absence of a dielectric material surrounded by air, thus representing a general geometry whose search space is defined by the number of elements in the array. A block-iterative frequency-domain method was used to compute the FPBGs of a PC, when present. DE has proved to be useful in combinatorial problems, and we also present an implementation feature that takes advantage of the periodic nature of PCs to enhance the convergence of the algorithm. Finally, we used this methodology to find a PC structure with a 19% bandgap-to-midgap ratio without requiring prior information about suboptimal configurations, and we performed a statistical study of how the structure is affected by disorder at its borders, in comparison with a previous work that uses a genetic algorithm.
Tumor-targeting peptides from combinatorial libraries*
Liu, Ruiwu; Li, Xiaocen; Xiao, Wenwu; Lam, Kit S.
2018-01-01
Cancer is one of the major and leading causes of death worldwide. Two of the greatest challenges in fighting cancer are early detection and effective treatments with no or minimum side effects. Widespread use of targeted therapies and molecular imaging in clinics requires high affinity, tumor-specific agents as effective targeting vehicles to deliver therapeutics and imaging probes to the primary or metastatic tumor sites. Combinatorial libraries such as phage-display and one-bead one-compound (OBOC) peptide libraries are powerful approaches in discovering tumor-targeting peptides. This review gives an overview of different combinatorial library technologies that have been used for the discovery of tumor-targeting peptides. Examples of tumor-targeting peptides identified from each combinatorial library method will be discussed. Published tumor-targeting peptide ligands and their applications will also be summarized by the combinatorial library methods and their corresponding binding receptors. PMID:27210583
Discovering time-lagged rules from microarray data using gene profile classifiers
2011-01-01
Background Gene regulatory networks have an essential role in every process of life. In this regard, the amount of genome-wide time series data is becoming increasingly available, providing the opportunity to discover the time-delayed gene regulatory networks that govern the majority of these molecular processes. Results This paper aims at reconstructing gene regulatory networks from multiple genome-wide microarray time series datasets. In this sense, a new model-free algorithm called GRNCOP2 (Gene Regulatory Network inference by Combinatorial OPtimization 2), which is a significant evolution of the GRNCOP algorithm, was developed using combinatorial optimization of gene profile classifiers. The method is capable of inferring potential time-delay relationships with any span of time between genes from various time series datasets given as input. The proposed algorithm was applied to time series data composed of twenty yeast genes that are highly relevant for the cell-cycle study, and the results were compared against several related approaches. The outcomes have shown that GRNCOP2 outperforms the contrasted methods in terms of the proposed metrics, and that the results are consistent with previous biological knowledge. Additionally, a genome-wide study on multiple publicly available time series data was performed. In this case, the experimentation has exhibited the soundness and scalability of the new method which inferred highly-related statistically-significant gene associations. Conclusions A novel method for inferring time-delayed gene regulatory networks from genome-wide time series datasets is proposed in this paper. The method was carefully validated with several publicly available data sets. The results have demonstrated that the algorithm constitutes a usable model-free approach capable of predicting meaningful relationships between genes, revealing the time-trends of gene regulation. PMID:21524308
Chatterjee, Kaushik; Lin-Gibson, Sheng; Wallace, William E.; Parekh, Sapun H.; Lee, Young J.; Cicerone, Marcus T.; Young, Marian F.; Simon, Carl G.
2011-01-01
Cells are known to sense and respond to the physical properties of their environment and those of tissue scaffolds. Optimizing these cell-material interactions is critical in tissue engineering. In this work, a simple and inexpensive combinatorial platform was developed to rapidly screen three-dimensional (3D) tissue scaffolds and was applied to screen the effect of scaffold properties for tissue engineering of bone. Differentiation of osteoblasts was examined in poly(ethylene glycol) hydrogel gradients spanning a 30-fold range in compressive modulus (≈ 10 kPa to ≈ 300 kPa). Results demonstrate that material properties (gel stiffness) of scaffolds can be leveraged to induce cell differentiation in 3D culture as an alternative to biochemical cues such as soluble supplements, immobilized biomolecules and vectors, which are often expensive, labile and potentially carcinogenic. Gel moduli of ≈ 225 kPa and higher enhanced osteogenesis. Furthermore, it is proposed that material-induced cell differentiation can be modulated to engineer seamless tissue interfaces between mineralized bone tissue and softer tissues such as ligaments and tendons. This work presents a combinatorial method to screen biological response to 3D hydrogel scaffolds that more closely mimics the 3D environment experienced by cells in vivo. PMID:20378163
NASA Astrophysics Data System (ADS)
Potyrailo, Radislav A.; Chisholm, Bret J.; Olson, Daniel R.; Brennan, Michael J.; Molaison, Chris A.
2002-02-01
Design, validation, and implementation of an optical spectroscopic system for high-throughput analysis of combinatorially developed protective organic coatings are reported. Our approach replaces labor-intensive coating evaluation steps with an automated system that rapidly analyzes 8x6 arrays of coating elements that are deposited on a plastic substrate. Each coating element of the library is 10 mm in diameter and 2 to 5 micrometers thick. Performance of coatings is evaluated with respect to their resistance to wear abrasion because this parameter is one of the primary considerations in end-use applications. Upon testing, the organic coatings undergo changes that are impossible to quantitatively predict using existing knowledge. Coatings are abraded using industry-accepted abrasion test methods at single- or multiple-abrasion conditions, followed by high-throughput analysis of abrasion-induced light scatter. The developed automated system is optimized for the analysis of diffusively scattered light that corresponds to 0 to 30% haze. System precision of 0.1 to 2.5% relative standard deviation provides capability for the reliable ranking of coating performance. While the system was implemented for high-throughput screening of combinatorially developed organic protective coatings for automotive applications, it can be applied to a variety of other applications where materials ranking can be achieved using optical spectroscopic tools.
Investigations of quantum heuristics for optimization
NASA Astrophysics Data System (ADS)
Rieffel, Eleanor; Hadfield, Stuart; Jiang, Zhang; Mandra, Salvatore; Venturelli, Davide; Wang, Zhihui
We explore the design of quantum heuristics for optimization, focusing on the quantum approximate optimization algorithm, a metaheuristic developed by Farhi, Goldstone, and Gutmann. We develop specific instantiations of the quantum approximate optimization algorithm for a variety of challenging combinatorial optimization problems. Through theoretical analyses and numerical investigations of select problems, we provide insight into parameter setting and Hamiltonian design for quantum approximate optimization algorithms and related quantum heuristics, and into their implementation on hardware realizable in the near term.
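For readers unfamiliar with the algorithm, the following self-contained numpy sketch simulates a depth-1 quantum approximate optimization algorithm statevector for a four-node MaxCut instance and grid-searches the (gamma, beta) angles; the instance, depth and grid are arbitrary illustrative choices, unrelated to the specific problems studied above.

```python
import numpy as np
from itertools import product

# toy MaxCut instance (assumed for illustration)
edges = [(0, 1), (1, 2), (2, 0), (2, 3)]
n = 4
dim = 2 ** n

# diagonal cost C(z) = number of cut edges for each computational basis state z
z_bits = list(product([0, 1], repeat=n))          # MSB-first, matches the kron order below
cost = np.array([sum(z[u] != z[v] for u, v in edges) for z in z_bits], float)

# mixer B = sum_i X_i built as a dense matrix (fine at toy sizes)
X, I = np.array([[0., 1.], [1., 0.]]), np.eye(2)
def kron_chain(ops):
    out = np.array([[1.0]])
    for op in ops:
        out = np.kron(out, op)
    return out
B = sum(kron_chain([X if j == i else I for j in range(n)]) for i in range(n))
w, V = np.linalg.eigh(B)                           # eigendecomposition for exp(-i*beta*B)

def qaoa_expectation(gamma, beta):
    state = np.full(dim, 1 / np.sqrt(dim), dtype=complex)      # |+>^n
    state = np.exp(-1j * gamma * cost) * state                  # phase-separator layer
    state = V @ (np.exp(-1j * beta * w) * (V.T @ state))        # mixer layer
    return float(np.real(np.vdot(state, cost * state)))         # <C>

# crude grid search over the depth-1 angles
best = max(((g, b, qaoa_expectation(g, b))
            for g in np.linspace(0, np.pi, 25)
            for b in np.linspace(0, np.pi, 25)), key=lambda t: t[2])
print("best (gamma, beta, <C>):", tuple(round(v, 3) for v in best))
```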
Silva, Aleidy; Lee, Bai-Yu; Clemens, Daniel L; Kee, Theodore; Ding, Xianting; Ho, Chih-Ming; Horwitz, Marcus A
2016-04-12
Tuberculosis (TB) remains a major global public health problem, and improved treatments are needed to shorten duration of therapy, decrease disease burden, improve compliance, and combat emergence of drug resistance. Ideally, the most effective regimen would be identified by a systematic and comprehensive combinatorial search of large numbers of TB drugs. However, optimization of regimens by standard methods is challenging, especially as the number of drugs increases, because of the extremely large number of drug-dose combinations requiring testing. Herein, we used an optimization platform, feedback system control (FSC) methodology, to identify improved drug-dose combinations for TB treatment using a fluorescence-based human macrophage cell culture model of TB, in which macrophages are infected with isopropyl β-D-1-thiogalactopyranoside (IPTG)-inducible green fluorescent protein (GFP)-expressing Mycobacterium tuberculosis (Mtb). On the basis of only a single screening test and three iterations, we identified highly efficacious three- and four-drug combinations. To verify the efficacy of these combinations, we further evaluated them using a methodologically independent assay for intramacrophage killing of Mtb; the optimized combinations showed greater efficacy than the current standard TB drug regimen. Surprisingly, all top three- and four-drug optimized regimens included the third-line drug clofazimine, and none included the first-line drugs isoniazid and rifampin, which had insignificant or antagonistic impacts on efficacy. Because top regimens also did not include a fluoroquinolone or aminoglycoside, they are potentially of use for treating many cases of multidrug- and extensively drug-resistant TB. Our study shows the power of an FSC platform to identify promising previously unidentified drug-dose combinations for treatment of TB.
Hybrid Nested Partitions and Math Programming Framework for Large-scale Combinatorial Optimization
2010-03-31
optimization problems: 1) exact algorithms and 2) metaheuristic algorithms. This project will integrate concepts from these two technologies to develop... optimal solutions within an acceptable amount of computation time, and 2) metaheuristic algorithms such as genetic algorithms, tabu search, and the... integer programming decomposition approaches, such as Dantzig-Wolfe decomposition and Lagrangian relaxation, and metaheuristics such as the Nested
Statistical mechanics of budget-constrained auctions
NASA Astrophysics Data System (ADS)
Altarelli, F.; Braunstein, A.; Realpe-Gomez, J.; Zecchina, R.
2009-07-01
Finding the optimal assignment in budget-constrained auctions is a combinatorial optimization problem with many important applications, a notable example being in the sale of advertisement space by search engines (in this context the problem is often referred to as the off-line AdWords problem). On the basis of the cavity method of statistical mechanics, we introduce a message-passing algorithm that is capable of solving efficiently random instances of the problem extracted from a natural distribution, and we derive from its properties the phase diagram of the problem. As the control parameter (average value of the budgets) is varied, we find two phase transitions delimiting a region in which long-range correlations arise.
Tumor-targeting peptides from combinatorial libraries.
Liu, Ruiwu; Li, Xiaocen; Xiao, Wenwu; Lam, Kit S
2017-02-01
Cancer is one of the major and leading causes of death worldwide. Two of the greatest challenges in fighting cancer are early detection and effective treatments with no or minimum side effects. Widespread use of targeted therapies and molecular imaging in clinics requires high affinity, tumor-specific agents as effective targeting vehicles to deliver therapeutics and imaging probes to the primary or metastatic tumor sites. Combinatorial libraries such as phage-display and one-bead one-compound (OBOC) peptide libraries are powerful approaches in discovering tumor-targeting peptides. This review gives an overview of different combinatorial library technologies that have been used for the discovery of tumor-targeting peptides. Examples of tumor-targeting peptides identified from each combinatorial library method will be discussed. Published tumor-targeting peptide ligands and their applications will also be summarized by the combinatorial library methods and their corresponding binding receptors. Copyright © 2017. Published by Elsevier B.V.
Development of the PEBL Traveling Salesman Problem Computerized Testbed
ERIC Educational Resources Information Center
Mueller, Shane T.; Perelman, Brandon S.; Tan, Yin Yin; Thanasuan, Kejkaew
2015-01-01
The traveling salesman problem (TSP) is a combinatorial optimization problem that requires finding the shortest path through a set of points ("cities") that returns to the starting point. Because humans provide heuristic near-optimal solutions to Euclidean versions of the problem, it has sometimes been used to investigate human visual…
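As a point of comparison for the human solvers discussed above, here is a sketch of the classic nearest-neighbor construction heuristic on a random Euclidean instance; like human tours it is fast and usually close to, but not at, the optimum. The instance size and the closed-tour length measure are arbitrary choices.

```python
import math
import random

def nearest_neighbor_tour(cities, start=0):
    """Greedy TSP heuristic: always visit the closest unvisited city next."""
    unvisited = set(range(len(cities))) - {start}
    tour = [start]
    while unvisited:
        last = cities[tour[-1]]
        nxt = min(unvisited, key=lambda j: math.dist(last, cities[j]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

def tour_length(cities, tour):
    """Length of the closed tour returning to the starting city."""
    return sum(math.dist(cities[tour[i]], cities[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

cities = [(random.random(), random.random()) for _ in range(30)]
tour = nearest_neighbor_tour(cities)
print(round(tour_length(cities, tour), 3))
```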
Scoring of Side-Chain Packings: An Analysis of Weight Factors and Molecular Dynamics Structures.
Colbes, Jose; Aguila, Sergio A; Brizuela, Carlos A
2018-02-26
The protein side-chain packing problem (PSCPP) is a central task in computational protein design. The problem is usually modeled as a combinatorial optimization problem, which consists of searching for a set of rotamers, from a given rotamer library, that minimizes a scoring function (SF). The SF is a weighted sum of terms that can be decomposed into physics-based and knowledge-based terms. Although there are many methods to obtain approximate solutions for this problem, all of them have similar performance, and there has not been a significant improvement in recent years. Studies on protein structure prediction and protein design have revealed the limitations of current SFs for achieving further improvements on these two problems. Along the same lines, a recent work reported a similar result for the PSCPP. In this work, we ask whether or not this negative result regarding further improvements in performance is due to (i) an incorrect weighting of the SF terms or (ii) the constrained conformation resulting from the protein crystallization process. To analyze these questions, we (i) model the PSCPP as a bi-objective combinatorial optimization problem, optimizing at the same time the two most important terms of the SFs of two state-of-the-art algorithms, and (ii) perform a preprocessing relaxation of the crystal structure through molecular dynamics to simulate the protein in solvent and evaluate the performance of these two state-of-the-art SFs under these conditions. Our results indicate that (i) no matter what combination of weight factors we use, the current SFs will not lead to better performance, and (ii) the evaluated SFs are not able to improve performance on relaxed structures. Furthermore, the experiments revealed that the SFs and the methods are biased toward crystallized structures.
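To make the bi-objective setup concrete, the snippet below reduces a set of candidate packings, each scored by two energy terms, to its Pareto-optimal (non-dominated) subset; the candidate values are placeholders, since the abstract gives no numeric ranges for the terms.

```python
def pareto_front(points):
    """Return the non-dominated points when both objectives are minimized."""
    front = []
    for p in points:
        dominated = any(q[0] <= p[0] and q[1] <= p[1] and q != p for q in points)
        if not dominated:
            front.append(p)
    return front

# hypothetical (physics-based term, knowledge-based term) scores for candidate packings
candidates = [(1.2, 4.0), (0.8, 5.1), (1.5, 3.2), (0.9, 4.9), (2.0, 3.0), (1.6, 4.5)]
print(pareto_front(candidates))   # (1.6, 4.5) is dominated by (1.2, 4.0) and is dropped
```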
Construction of a scFv Library with Synthetic, Non-combinatorial CDR Diversity.
Bai, Xuelian; Shim, Hyunbo
2017-01-01
Many large synthetic antibody libraries have been designed and constructed, and have successfully generated high-quality antibodies suitable for various demanding applications. While synthetic antibody libraries have many advantages, such as optimized framework sequences and a broader sequence landscape than natural antibodies, their sequence diversity is typically generated by random combinatorial synthetic processes, which cause the incorporation of many undesired CDR sequences. Here, we describe the construction of a synthetic scFv library using oligonucleotide mixtures that contain predefined, non-combinatorially synthesized CDR sequences. Each CDR is first inserted into a master scFv framework sequence, and the resulting single-CDR libraries are subjected to a round of proofread panning. The proofread CDR sequences are then assembled to produce the final scFv library with six diversified CDRs.
Preparation of cherry-picked combinatorial libraries by string synthesis.
Furka, Arpád; Dibó, Gábor; Gombosuren, Naran
2005-03-01
String synthesis [1-3] is an efficient and inexpensive manual method for the preparation of combinatorial libraries using macroscopic solid support units. Sorting the units between two synthetic steps is an important operation of the procedure. The software developed to guide sorting can be used only when complete combinatorial libraries are prepared. Since very often only selected components of the full libraries are needed, new software was developed that guides sorting in the preparation of non-complete combinatorial libraries. Application of the software is described in detail.
Improved Modeling of Side-Chain–Base Interactions and Plasticity in Protein–DNA Interface Design
Thyme, Summer B.; Baker, David; Bradley, Philip
2012-01-01
Combinatorial sequence optimization for protein design requires libraries of discrete side-chain conformations. The discreteness of these libraries is problematic, particularly for long, polar side chains, since favorable interactions can be missed. Previously, an approach to loop remodeling where protein backbone movement is directed by side-chain rotamers predicted to form interactions previously observed in native complexes (termed “motifs”) was described. Here, we show how such motif libraries can be incorporated into combinatorial sequence optimization protocols and improve native complex recapitulation. Guided by the motif rotamer searches, we made improvements to the underlying energy function, increasing recapitulation of native interactions. To further test the methods, we carried out a comprehensive experimental scan of amino acid preferences in the I-AniI protein–DNA interface and found that many positions tolerated multiple amino acids. This sequence plasticity is not observed in the computational results because of the fixed-backbone approximation of the model. We improved modeling of this diversity by introducing DNA flexibility and reducing the convergence of the simulated annealing algorithm that drives the design process. In addition to serving as a benchmark, this extensive experimental data set provides insight into the types of interactions essential to maintain the function of this potential gene therapy reagent. PMID:22426128
Wafer-scale growth of VO2 thin films using a combinatorial approach
Zhang, Hai-Tian; Zhang, Lei; Mukherjee, Debangshu; Zheng, Yuan-Xia; Haislmaier, Ryan C.; Alem, Nasim; Engel-Herbert, Roman
2015-01-01
Transition metal oxides offer functional properties beyond conventional semiconductors. Bridging the gap between the fundamental research frontier in oxide electronics and their realization in commercial devices demands a wafer-scale growth approach for high-quality transition metal oxide thin films. Such a method requires excellent control over the transition metal valence state to avoid performance deterioration, which has been proved challenging. Here we present a scalable growth approach that enables a precise valence state control. By creating an oxygen activity gradient across the wafer, a continuous valence state library is established to directly identify the optimal growth condition. Single-crystalline VO2 thin films have been grown on wafer scale, exhibiting more than four orders of magnitude change in resistivity across the metal-to-insulator transition. It is demonstrated that 'electronic grade' transition metal oxide films can be realized on a large scale using a combinatorial growth approach, which can be extended to other multivalent oxide systems. PMID:26450653
Anatomy of the Attraction Basins: Breaking with the Intuition.
Hernando, Leticia; Mendiburu, Alexander; Lozano, Jose A
2018-05-22
Solving combinatorial optimization problems efficiently requires the development of algorithms that consider the specific properties of the problems. In this sense, local search algorithms are designed over a neighborhood structure that partially accounts for these properties. Considering a neighborhood, the space is usually interpreted as a natural landscape, with valleys and mountains. Under this perception, it is commonly believed that, if maximizing, the solutions located on the slopes of the same mountain belong to the same attraction basin, with the peaks of the mountains being the local optima. Unfortunately, this is a widespread but erroneous visualization of a combinatorial landscape. Thus, our aim is to clarify this aspect by providing a detailed analysis of, first, the existence of plateaus in which the local optima are involved, and second, the properties that define the topology of the attraction basins, yielding a reliable visualization of these landscapes. Some of the features explored in this paper have never been examined before. Hence, new findings about the structure of the attraction basins are presented. The study focuses on instances of permutation-based combinatorial optimization problems considering the 2-exchange and insert neighborhoods. As a consequence of this work, we break away from the widespread belief about the anatomy of attraction basins.
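The sketch below makes these notions tangible on a tiny instance: it runs a best-improvement hill climber under the 2-exchange (swap) neighborhood from every permutation of a toy quadratic-assignment-style objective and tallies which local optimum attracts each start. The instance is made up and ties are broken arbitrarily, which is precisely where the plateau subtleties discussed in the paper come in.

```python
import random
from itertools import permutations, combinations

random.seed(1)
n = 5
F = [[random.randint(0, 9) for _ in range(n)] for _ in range(n)]   # toy flow matrix
D = [[random.randint(0, 9) for _ in range(n)] for _ in range(n)]   # toy distance matrix

def cost(p):
    """QAP-style objective to be minimized."""
    return sum(F[i][j] * D[p[i]][p[j]] for i in range(n) for j in range(n))

def swap_neighbors(p):
    for i, j in combinations(range(n), 2):
        q = list(p); q[i], q[j] = q[j], q[i]
        yield tuple(q)

def hill_climb(p):
    """Best-improvement descent under the 2-exchange neighborhood."""
    while True:
        best = min(swap_neighbors(p), key=cost)
        if cost(best) >= cost(p):
            return p              # p is a local optimum (or sits on a plateau edge)
        p = best

basins = {}
for p in permutations(range(n)):
    basins.setdefault(hill_climb(p), []).append(p)

for opt, basin in sorted(basins.items(), key=lambda kv: cost(kv[0])):
    print(f"local optimum {opt}  cost={cost(opt)}  basin size={len(basin)}")
```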
Fast and Efficient Discrimination of Traveling Salesperson Problem Stimulus Difficulty
ERIC Educational Resources Information Center
Dry, Matthew J.; Fontaine, Elizabeth L.
2014-01-01
The Traveling Salesperson Problem (TSP) is a computationally difficult combinatorial optimization problem. In spite of its relative difficulty, human solvers are able to generate close-to-optimal solutions in a close-to-linear time frame, and it has been suggested that this is due to the visual system's inherent sensitivity to certain geometric…
NASA Astrophysics Data System (ADS)
Zheng, Genrang; Lin, ZhengChun
The problem of winner determination in combinatorial auctions is a hot topic in electronic business and an NP-hard problem. A Hybrid Artificial Fish Swarm Algorithm (HAFSA), which combines a First Suite Heuristic Algorithm (FSHA) with the Artificial Fish Swarm Algorithm (AFSA), is proposed to solve the problem, building on the theory of AFSA. Experimental results show that HAFSA is a fast and efficient algorithm for winner determination. Compared with an ant colony optimization algorithm, it performs well and has broad application prospects.
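Since the abstract does not spell the problem out, here is a brute-force baseline for contrast with the hybrid heuristic: winner determination selects a revenue-maximizing set of item-disjoint bids, and exhaustive enumeration over bid subsets only works at toy sizes. The bids and prices are invented.

```python
from itertools import combinations

# hypothetical bids: (set of items requested, offered price)
bids = [({"a", "b"}, 8), ({"b", "c"}, 6), ({"c"}, 3), ({"a"}, 4), ({"d"}, 2)]

def best_allocation(bids):
    """Exhaustively search for the highest-value set of item-disjoint bids."""
    best_value, best_set = 0, ()
    for r in range(1, len(bids) + 1):
        for chosen in combinations(range(len(bids)), r):
            items = [bids[i][0] for i in chosen]
            if sum(len(s) for s in items) != len(set().union(*items)):
                continue                       # two chosen bids share an item
            value = sum(bids[i][1] for i in chosen)
            if value > best_value:
                best_value, best_set = value, chosen
    return best_value, best_set

print(best_allocation(bids))   # expect bids {a,b}, {c}, {d} with total value 13
```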
Limpoco, F Ted; Bailey, Ryan C
2011-09-28
We directly monitor in parallel and in real time the temporal profiles of polymer brushes simultaneously grown via multiple ATRP reaction conditions on a single substrate using arrays of silicon photonic microring resonators. In addition to probing relative polymerization rates, we show the ability to evaluate the dynamic properties of the in situ grown polymers. This presents a powerful new platform for studying modified interfaces that may allow for the combinatorial optimization of surface-initiated polymerization conditions.
Solving the Container Stowage Problem (CSP) using Particle Swarm Optimization (PSO)
NASA Astrophysics Data System (ADS)
Matsaini; Santosa, Budi
2018-04-01
The Container Stowage Problem (CSP) is the problem of arranging containers on ships while considering rules such as total weight, weight of each stack, destination, equilibrium, and placement of containers on the vessel. The container stowage problem is a combinatorial problem that is hard to solve with enumeration techniques; it is NP-hard. Therefore, metaheuristics are preferred to find a solution. The objective is to minimize the amount of shifting so that the unloading time is minimized. Particle Swarm Optimization (PSO) is proposed to solve the problem. The implementation of PSO is combined with several rules: stack position change rules, stack changes based on destination, and stack changes based on the weight class of the stacks (light, medium, and heavy). The proposed method was applied to five different cases, and the results were compared to Bee Swarm Optimization (BSO) and a heuristic method. PSO yielded a mean gap of 0.87% and a time gap of 60 seconds, while BSO yielded a mean gap of 2.98% and a time gap of 459.6 seconds relative to the heuristics.
Li, Desheng
2014-01-01
This paper proposes a novel variant of the cooperative quantum-behaved particle swarm optimization (CQPSO) algorithm, called CQPSO-DVSA-LFD, with two mechanisms to reduce the search space and avoid stagnation. One mechanism, called Dynamic Varying Search Area (DVSA), restricts the range of the particles' activity to a reduced area. On the other hand, in order to escape local optima, Lévy flights are used to generate a stochastic disturbance in the movement of the particles. To test the performance of CQPSO-DVSA-LFD, numerical experiments are conducted to compare the proposed algorithm with different variants of PSO. According to the experimental results, the proposed method performs better than other variants of PSO on both benchmark test functions and a combinatorial optimization problem, namely the job-shop scheduling problem.
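For the Lévy-flight disturbance, one common way to draw heavy-tailed steps is Mantegna's algorithm, sketched below; the beta value and step-size factor are typical choices from the metaheuristics literature rather than the paper's exact settings.

```python
import math
import random

def levy_step(beta=1.5):
    """Draw one heavy-tailed step via Mantegna's algorithm."""
    sigma_u = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2) /
               (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = random.gauss(0.0, sigma_u)
    v = random.gauss(0.0, 1.0)
    return u / abs(v) ** (1 / beta)

# perturb a particle position component-wise with mostly small, occasionally large jumps
position = [0.0] * 5
scale = 0.01                                   # assumed step-size factor
position = [x + scale * levy_step() for x in position]
print(position)
```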
Efficient search, mapping, and optimization of multi-protein genetic systems in diverse bacteria
Farasat, Iman; Kushwaha, Manish; Collens, Jason; Easterbrook, Michael; Guido, Matthew; Salis, Howard M
2014-01-01
Developing predictive models of multi-protein genetic systems to understand and optimize their behavior remains a combinatorial challenge, particularly when measurement throughput is limited. We developed a computational approach to build predictive models and identify optimal sequences and expression levels, while circumventing combinatorial explosion. Maximally informative genetic system variants were first designed by the RBS Library Calculator, an algorithm to design sequences for efficiently searching a multi-protein expression space across a > 10,000-fold range with tailored search parameters and well-predicted translation rates. We validated the algorithm's predictions by characterizing 646 genetic system variants, encoded in plasmids and genomes, expressed in six gram-positive and gram-negative bacterial hosts. We then combined the search algorithm with system-level kinetic modeling, requiring the construction and characterization of 73 variants to build a sequence-expression-activity map (SEAMAP) for a biosynthesis pathway. Using model predictions, we designed and characterized 47 additional pathway variants to navigate its activity space, find optimal expression regions with desired activity response curves, and relieve rate-limiting steps in metabolism. Creating sequence-expression-activity maps accelerates the optimization of many protein systems and allows previous measurements to quantitatively inform future designs. PMID:24952589
A sampling and classification item selection approach with content balancing.
Chen, Pei-Hua
2015-03-01
Existing automated test assembly methods typically employ constrained combinatorial optimization. Constructing forms sequentially based on an optimization approach usually results in nonparallel forms and requires heuristic modifications. Methods based on a random search approach have the major advantage of producing parallel forms sequentially without further adjustment. This study incorporated a flexible content-balancing element into the statistical perspective item selection method of the cell-only method (Chen et al. in Educational and Psychological Measurement, 72(6), 933-953, 2012). The new method was compared with a sequential interitem distance weighted deviation model (IID WDM) (Swanson & Stocking in Applied Psychological Measurement, 17(2), 151-166, 1993), a simultaneous IID WDM, and a big-shadow-test mixed integer programming (BST MIP) method to construct multiple parallel forms based on matching a reference form item-by-item. The results showed that the cell-only method with content balancing and the sequential and simultaneous versions of IID WDM yielded results comparable to those obtained using the BST MIP method. The cell-only method with content balancing is computationally less intensive than the sequential and simultaneous versions of IID WDM.
A method for aircraft concept exploration using multicriteria interactive genetic algorithms
NASA Astrophysics Data System (ADS)
Buonanno, Michael Alexander
2005-08-01
The problem of aircraft concept selection has become increasingly difficult in recent years due to changes in the primary evaluation criteria of concepts. In the past, performance was often the primary discriminator, whereas modern programs have placed increased emphasis on factors such as environmental impact, economics, supportability, aesthetics, and other metrics. The revolutionary nature of the vehicles required to simultaneously meet these conflicting requirements has prompted a shift from design using historical data regression techniques for metric prediction to the use of sophisticated physics-based analysis tools that are capable of analyzing designs outside of the historical database. The use of optimization methods with these physics-based tools, however, has proven difficult because of the tendency of optimizers to exploit assumptions present in the models and drive the design towards a solution which, while promising to the computer, may be infeasible due to factors not considered by the computer codes. In addition to this difficulty, the number of discrete options available at this stage may be unmanageable due to the combinatorial nature of the concept selection problem, leading the analyst to select a sub-optimum baseline vehicle. Some extremely important concept decisions, such as the type of control surface arrangement to use, are frequently made without sufficient understanding of their impact on the important system metrics due to a lack of historical guidance, computational resources, or analysis tools. This thesis discusses the difficulties associated with revolutionary system design, and introduces several new techniques designed to remedy them. First, an interactive design method has been developed that allows the designer to provide feedback to a numerical optimization algorithm during runtime, thereby preventing the optimizer from exploiting weaknesses in the analytical model. This method can be used to account for subjective criteria, or as a crude measure of un-modeled quantitative criteria. Other contributions of the work include a modified Structured Genetic Algorithm that enables the efficient search of large combinatorial design hierarchies and an improved multi-objective optimization procedure that can effectively optimize several objectives simultaneously. A new conceptual design method has been created by drawing upon each of these new capabilities and aspects of more traditional design methods. The ability of this new technique to assist in the design of revolutionary vehicles has been demonstrated using a problem of contemporary interest: the concept exploration of a supersonic business jet. This problem was found to be a good demonstration case because of its novelty and unique requirements, and the results of this proof of concept exercise indicate that the new method is effective at providing additional insight into the relationship between a vehicle's requirements and its favorable attributes.
Mondal, Milon; Radeva, Nedyalka; Fanlo‐Virgós, Hugo; Otto, Sijbren; Klebe, Gerhard
2016-01-01
Fragment‐based drug design (FBDD) affords active compounds for biological targets. While there are numerous reports on FBDD by fragment growing/optimization, fragment linking has rarely been reported. Dynamic combinatorial chemistry (DCC) has become a powerful hit‐identification strategy for biological targets. We report the synergistic combination of fragment linking and DCC to identify inhibitors of the aspartic protease endothiapepsin. Based on X‐ray crystal structures of endothiapepsin in complex with fragments, we designed a library of bis‐acylhydrazones and used DCC to identify potent inhibitors. The most potent inhibitor exhibits an IC50 value of 54 nM, which represents a 240‐fold improvement in potency compared to the parent hits. Subsequent X‐ray crystallography validated the predicted binding mode, thus demonstrating the efficiency of the combination of fragment linking and DCC as a hit‐identification strategy. This approach could be applied to a range of biological targets, and holds the potential to facilitate hit‐to‐lead optimization. PMID:27400756
Methods of Combinatorial Optimization to Reveal Factors Affecting Gene Length
Bolshoy, Alexander; Tatarinova, Tatiana
2012-01-01
In this paper we present a novel method for genome ranking according to gene lengths. The main outcomes described in this paper are the following: the formulation of the genome ranking problem, presentation of relevant approaches to solve it, and the demonstration of preliminary results from prokaryotic genomes ordering. Using a subset of prokaryotic genomes, we attempted to uncover factors affecting gene length. We have demonstrated that hyperthermophilic species have shorter genes as compared with mesophilic organisms, which probably means that environmental factors affect gene length. Moreover, these preliminary results show that environmental factors group together in ranking evolutionary distant species. PMID:23300345
Directed Bee Colony Optimization Algorithm to Solve the Nurse Rostering Problem.
Rajeswari, M; Amudhavel, J; Pothula, Sujatha; Dhavachelvan, P
2017-01-01
The Nurse Rostering Problem (NRP) is an NP-hard combinatorial optimization and scheduling problem in which a set of nurses is assigned to shifts per day subject to both hard and soft constraints. Solving the NRP calls for a novel metaheuristic technique. This work proposes a metaheuristic called the Directed Bee Colony Optimization Algorithm, which uses the Modified Nelder-Mead Method, for solving the NRP. The authors formulate the NRP as a multiobjective mathematical programming model and propose a Multiobjective Directed Bee Colony Optimization (MODBCO) methodology, which is used successfully to solve this multiobjective scheduling optimization problem. MODBCO integrates deterministic local search, a multiagent particle system environment, and the honey bee decision-making process. The performance of the algorithm is assessed using the standard INRC2010 dataset, which reflects many real-world cases that vary in size and complexity. The experimental analysis uses statistical tools to show the uniqueness of the algorithm on the assessment criteria.
Directed Bee Colony Optimization Algorithm to Solve the Nurse Rostering Problem
Amudhavel, J.; Pothula, Sujatha; Dhavachelvan, P.
2017-01-01
The Nurse Rostering Problem (NRP) is an NP-hard combinatorial optimization and scheduling problem in which a set of nurses is assigned to shifts per day subject to both hard and soft constraints. Solving the NRP calls for a novel metaheuristic technique. This work proposes a metaheuristic called the Directed Bee Colony Optimization Algorithm, which uses the Modified Nelder-Mead Method, for solving the NRP. The authors formulate the NRP as a multiobjective mathematical programming model and propose a Multiobjective Directed Bee Colony Optimization (MODBCO) methodology, which is used successfully to solve this multiobjective scheduling optimization problem. MODBCO integrates deterministic local search, a multiagent particle system environment, and the honey bee decision-making process. The performance of the algorithm is assessed using the standard INRC2010 dataset, which reflects many real-world cases that vary in size and complexity. The experimental analysis uses statistical tools to show the uniqueness of the algorithm on the assessment criteria. PMID:28473849
PROBABILISTIC CROSS-IDENTIFICATION IN CROWDED FIELDS AS AN ASSIGNMENT PROBLEM
DOE Office of Scientific and Technical Information (OSTI.GOV)
Budavári, Tamás; Basu, Amitabh, E-mail: budavari@jhu.edu, E-mail: basu.amitabh@jhu.edu
2016-10-01
One of the outstanding challenges of cross-identification is multiplicity: detections in crowded regions of the sky are often linked to more than one candidate associations of similar likelihoods. We map the resulting maximum likelihood partitioning to the fundamental assignment problem of discrete mathematics and efficiently solve the two-way catalog-level matching in the realm of combinatorial optimization using the so-called Hungarian algorithm. We introduce the method, demonstrate its performance in a mock universe where the true associations are known, and discuss the applicability of the new procedure to large surveys.
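As an illustration of the assignment step described above, the following minimal sketch (not the authors' code) poses a toy two-catalog matching problem and solves it with SciPy's linear_sum_assignment, a Hungarian-type solver; the Gaussian likelihood model, the astrometric error sigma, and the catalog sizes are assumptions made for the example.

```python
# Minimal sketch (not the authors' code): two-way catalog matching as an
# assignment problem. Likelihoods are proxied here by Gaussian kernels of the
# angular separation; the pairing that maximizes the total (log-)likelihood is
# found with SciPy's linear_sum_assignment, a Hungarian-type solver.
import numpy as np
from scipy.optimize import linear_sum_assignment

rng = np.random.default_rng(0)
cat_a = rng.uniform(0, 1.0, size=(5, 2))          # toy detections (deg), catalog A
cat_b = cat_a + rng.normal(0, 0.01, size=(5, 2))  # perturbed counterparts, catalog B
rng.shuffle(cat_b)                                 # unknown true association

sigma = 0.01                                       # assumed astrometric error (deg)
sep2 = ((cat_a[:, None, :] - cat_b[None, :, :]) ** 2).sum(axis=2)
neg_log_like = sep2 / (2 * sigma**2)               # cost = -log likelihood (up to constants)

rows, cols = linear_sum_assignment(neg_log_like)   # maximum-likelihood partition
for i, j in zip(rows, cols):
    print(f"A[{i}] <-> B[{j}]  separation = {np.sqrt(sep2[i, j]):.4f} deg")
```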
NASA Astrophysics Data System (ADS)
1981-04-01
The main topics discussed were related to nonparametric statistics, plane and antiplane states in finite elasticity, free-boundary-variational inequalities, the numerical solution of free boundary-value problems, discrete and combinatorial optimization, mathematical modelling in fluid mechanics, a survey and comparison regarding thermodynamic theories, invariant and almost invariant subspaces in linear systems with applications to disturbance isolation, nonlinear acoustics, and methods of function theory in the case of partial differential equations, giving particular attention to elliptic problems in the plane.
Probabilistic Cross-identification in Crowded Fields as an Assignment Problem
NASA Astrophysics Data System (ADS)
Budavári, Tamás; Basu, Amitabh
2016-10-01
One of the outstanding challenges of cross-identification is multiplicity: detections in crowded regions of the sky are often linked to more than one candidate associations of similar likelihoods. We map the resulting maximum likelihood partitioning to the fundamental assignment problem of discrete mathematics and efficiently solve the two-way catalog-level matching in the realm of combinatorial optimization using the so-called Hungarian algorithm. We introduce the method, demonstrate its performance in a mock universe where the true associations are known, and discuss the applicability of the new procedure to large surveys.
Experimental realization of a highly secure chaos communication under strong channel noise
NASA Astrophysics Data System (ADS)
Ye, Weiping; Dai, Qionglin; Wang, Shihong; Lu, Huaping; Kuang, Jinyu; Zhao, Zhenfeng; Zhu, Xiangqing; Tang, Guoning; Huang, Ronghuai; Hu, Gang
2004-09-01
A one-way coupled spatiotemporally chaotic map lattice is used to construct a cryptosystem. By combining chaotic computations with conventional algebraic operations, our system achieves cryptographic properties much better than those obtained when known chaotic and conventional methods are applied separately. We have carried out experiments on duplex secure voice communication over a realistic wired Public Switched Telephone Network, applying our chaotic system and the Advanced Encryption Standard (AES), respectively, for cryptography. Our system works stably against strong channel noise under conditions in which AES fails to work.
A combinatorial filtering method for magnetotelluric time-series based on Hilbert-Huang transform
NASA Astrophysics Data System (ADS)
Cai, Jianhua
2014-11-01
Magnetotelluric (MT) time-series are often contaminated with noise from natural or man-made processes. A substantial improvement is possible when the time-series are presented as clean as possible for further processing. A combinatorial method is described for filtering of MT time-series based on the Hilbert-Huang transform that requires a minimum of human intervention and leaves good data sections unchanged. Good data sections are preserved because after empirical mode decomposition the data are analysed through hierarchies, morphological filtering, adaptive thresholding and multi-point smoothing, allowing separation of noise from signals. The combinatorial method can be carried out without any assumption about the data distribution. Simulated data and real measured MT time-series from three different regions, with noise caused by baseline drift, high-frequency noise and power-line contribution, are processed to demonstrate the application of the proposed method. Results highlight the ability of the combinatorial method to pick out useful signals; the noise is greatly suppressed so that its deleterious influence on the MT transfer function estimation is eliminated.
Aghamohammadi, Hossein; Saadi Mesgari, Mohammad; Molaei, Damoon; Aghamohammadi, Hasan
2013-01-01
Location-allocation is a combinatorial optimization problem classified as NP-hard (non-deterministic polynomial-time hard). Because of this complexity, the solution approach must shift from exact to heuristic or metaheuristic methods. Locating medical centers and allocating earthquake casualties to them is highly important in earthquake disaster management, since a proper method reduces the time of relief operations and consequently decreases the number of fatalities. This paper presents a heuristic method based on two nested genetic algorithms that optimizes this location-allocation problem using the capabilities of a Geographic Information System (GIS). In the proposed method, the outer genetic algorithm is applied to the location part of the problem and the inner genetic algorithm optimizes the resource allocation. The final outcome of the implemented method includes the spatial locations of the new medical centers required. The method also calculates how many of the casualties at each demand point should be taken to each of the existing and new medical centers. The results showed the high performance of the designed structure in solving a capacitated location-allocation problem that may arise in a disaster situation, when injured people have to be taken to medical centers within a reasonable time.
Expected Fitness Gains of Randomized Search Heuristics for the Traveling Salesperson Problem.
Nallaperuma, Samadhi; Neumann, Frank; Sudholt, Dirk
2017-01-01
Randomized search heuristics are frequently applied to NP-hard combinatorial optimization problems. The runtime analysis of randomized search heuristics has contributed tremendously to our theoretical understanding. Recently, randomized search heuristics have been examined regarding their achievable progress within a fixed-time budget. We follow this approach and present a fixed-budget analysis for an NP-hard combinatorial optimization problem. We consider the well-known Traveling Salesperson Problem (TSP) and analyze the fitness increase that randomized search heuristics are able to achieve within a given fixed-time budget. In particular, we analyze Manhattan and Euclidean TSP instances and Randomized Local Search (RLS), (1+1) EA and (1+[Formula: see text]) EA algorithms for the TSP in a smoothed complexity setting, and derive the lower bounds of the expected fitness gain for a specified number of generations.
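To make the fixed-budget setting concrete, the following is a minimal sketch of randomized local search (RLS) with 2-opt moves on a random Euclidean TSP instance, reporting the fitness reached after fixed numbers of evaluations; the instance size, budget and acceptance rule are illustrative assumptions, not the paper's analysis.

```python
# Minimal sketch (illustration only, not the paper's analysis): randomized local
# search (RLS) on a Euclidean TSP instance with 2-opt moves, run for a fixed
# budget of iterations so the fitness (tour-length) gain over time can be observed.
import random
import math

random.seed(1)
cities = [(random.random(), random.random()) for _ in range(30)]

def tour_length(tour):
    return sum(math.dist(cities[tour[i]], cities[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

tour = list(range(len(cities)))
random.shuffle(tour)
best = tour_length(tour)

budget = 20000                      # fixed time budget (number of evaluations)
for t in range(budget):
    i, j = sorted(random.sample(range(len(tour)), 2))
    cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]   # 2-opt: reverse a segment
    cand_len = tour_length(cand)
    if cand_len <= best:            # RLS accepts non-worsening moves only
        tour, best = cand, cand_len
    if (t + 1) % 5000 == 0:
        print(f"after {t + 1:5d} evaluations: tour length = {best:.3f}")
```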
Combinatorial optimization problem solution based on improved genetic algorithm
NASA Astrophysics Data System (ADS)
Zhang, Peng
2017-08-01
The traveling salesman problem (TSP) is a classic combinatorial optimization problem and a simplified form of many complex problems. Study of the genetic algorithm shows that the parameters affecting its performance mainly include the quality of the initial population, the population size, and the crossover and mutation probabilities. Accordingly, an improved genetic algorithm for solving the TSP is put forward. The population is graded according to individual similarity, and different operations are applied to individuals at different levels. In addition, an elitist retention strategy is adopted at each level, and the crossover and mutation operators are improved. Several experiments are designed to verify the feasibility of the algorithm. Analysis of the experimental results shows that the improved algorithm increases the accuracy and efficiency of the solution.
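The following is a minimal genetic-algorithm skeleton for the TSP that keeps the elitist-retention idea from the abstract; the similarity-based population grading and the paper's specific operator improvements are omitted, and the population size, elite size and operators are illustrative assumptions.

```python
# Minimal GA skeleton for the TSP (illustration only; it keeps the elitist
# retention idea from the abstract but omits the paper's similarity-based
# population grading). Uses order crossover (OX) and inversion mutation.
import random, math

random.seed(2)
cities = [(random.random(), random.random()) for _ in range(25)]
N = len(cities)

def length(tour):
    return sum(math.dist(cities[tour[i]], cities[tour[(i + 1) % N]]) for i in range(N))

def order_crossover(p1, p2):
    a, b = sorted(random.sample(range(N), 2))
    child = [None] * N
    child[a:b] = p1[a:b]
    fill = [c for c in p2 if c not in child]
    for i in range(N):
        if child[i] is None:
            child[i] = fill.pop(0)
    return child

def mutate(tour, rate=0.2):
    if random.random() < rate:
        i, j = sorted(random.sample(range(N), 2))
        tour[i:j] = reversed(tour[i:j])          # inversion mutation
    return tour

pop = [random.sample(range(N), N) for _ in range(60)]
for gen in range(200):
    pop.sort(key=length)
    elite = pop[:6]                              # elitist retention
    offspring = []
    while len(offspring) < len(pop) - len(elite):
        p1, p2 = random.sample(pop[:30], 2)      # mating restricted to the better half
        offspring.append(mutate(order_crossover(p1, p2)))
    pop = elite + offspring
print("best tour length:", round(length(min(pop, key=length)), 3))
```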
Fuzzy multiobjective models for optimal operation of a hydropower system
NASA Astrophysics Data System (ADS)
Teegavarapu, Ramesh S. V.; Ferreira, André R.; Simonovic, Slobodan P.
2013-06-01
Optimal operation models for a hydropower system using new fuzzy multiobjective mathematical programming models are developed and evaluated in this study. The models (i) use mixed integer nonlinear programming (MINLP) with binary variables and (ii) integrate a new turbine unit commitment formulation along with water quality constraints used to evaluate reservoir downstream impairment. The Reardon method, used in the solution of genetic algorithm optimization problems, forms the basis for the development of a new fuzzy multiobjective hydropower system optimization model through the creation of Reardon-type fuzzy membership functions. The models are applied to a real-life hydropower reservoir system in Brazil. Genetic algorithms (GAs) are used to (i) solve the optimization formulations, avoiding the computational intractability and combinatorial problems associated with binary variables in unit commitment, (ii) efficiently address the Reardon method formulations, and (iii) deal with the locally optimal solutions obtained from traditional gradient-based solvers. The decision maker's preferences are incorporated within the fuzzy mathematical programming formulations to obtain compromise operating rules for a multiobjective reservoir operation problem dominated by the conflicting goals of energy production, water quality and conservation releases. Results provide insight into the compromise operation rules obtained using the new Reardon fuzzy multiobjective optimization framework and confirm its applicability to a variety of multiobjective water resources problems.
Combinatorial Effects of Arginine and Fluoride on Oral Bacteria
Zheng, X.; Cheng, X.; Wang, L.; Qiu, W.; Wang, S.; Zhou, Y.; Li, M.; Li, Y.; Cheng, L.; Li, J.; Zhou, X.
2015-01-01
Dental caries is closely associated with the microbial disequilibrium between acidogenic/aciduric pathogens and alkali-generating commensal residents within the dental plaque. Fluoride is a widely used anticaries agent, which promotes tooth hard-tissue remineralization and suppresses bacterial activities. Recent clinical trials have shown that oral hygiene products containing both fluoride and arginine possess a greater anticaries effect compared with those containing fluoride alone, indicating synergy between fluoride and arginine in caries management. Here, we hypothesize that arginine may augment the ecological benefit of fluoride by enriching alkali-generating bacteria in the plaque biofilm and thus synergizes with fluoride in controlling dental caries. Specifically, we assessed the combinatory effects of NaF/arginine on planktonic and biofilm cultures of Streptococcus mutans, Streptococcus sanguinis, and Porphyromonas gingivalis with checkerboard microdilution assays. The optimal NaF/arginine combinations were selected, and their combinatory effects on microbial composition were further examined in single-, dual-, and 3-species biofilm using bacterial species–specific fluorescence in situ hybridization and quantitative polymerase chain reaction. We found that arginine synergized with fluoride in suppressing acidogenic S. mutans in both planktonic and biofilm cultures. In addition, the NaF/arginine combination synergistically reduced S. mutans but enriched S. sanguinis within the multispecies biofilms. More importantly, the optimal combination of NaF/arginine maintained a “streptococcal pressure” against the potential growth of oral anaerobe P. gingivalis within the alkalized biofilm. Taken together, we conclude that the combinatory application of fluoride and arginine has a potential synergistic effect in maintaining a healthy oral microbial equilibrium and thus represents a promising ecological approach to caries management. PMID:25477312
Medial-based deformable models in nonconvex shape-spaces for medical image segmentation.
McIntosh, Chris; Hamarneh, Ghassan
2012-01-01
We explore the application of genetic algorithms (GA) to deformable models through the proposition of a novel method for medical image segmentation that combines GA with nonconvex, localized, medial-based shape statistics. We replace the more typical gradient descent optimizer used in deformable models with GA, and the convex, implicit, global shape statistics with nonconvex, explicit, localized ones. Specifically, we propose GA to reduce typical deformable model weaknesses pertaining to model initialization, pose estimation and local minima, through the simultaneous evolution of a large number of models. Furthermore, we constrain the evolution, and thus reduce the size of the search-space, by using statistically-based deformable models whose deformations are intuitive (stretch, bulge, bend) and are driven in terms of localized principal modes of variation, instead of modes of variation across the entire shape that often fail to capture localized shape changes. Although GA are not guaranteed to achieve the global optima, our method compares favorably to the prevalent optimization techniques, convex/nonconvex gradient-based optimizers and to globally optimal graph-theoretic combinatorial optimization techniques, when applied to the task of corpus callosum segmentation in 50 mid-sagittal brain magnetic resonance images.
Sabar, Nasser R; Ayob, Masri; Kendall, Graham; Qu, Rong
2015-02-01
Hyper-heuristics are search methodologies that aim to provide high-quality solutions across a wide variety of problem domains, rather than developing tailor-made methodologies for each problem instance/domain. A traditional hyper-heuristic framework has two levels, namely, the high level strategy (heuristic selection mechanism and the acceptance criterion) and low level heuristics (a set of problem specific heuristics). Due to the different landscape structures of different problem instances, the high level strategy plays an important role in the design of a hyper-heuristic framework. In this paper, we propose a new high level strategy for a hyper-heuristic framework. The proposed high-level strategy utilizes a dynamic multiarmed bandit-extreme value-based reward as an online heuristic selection mechanism to select the appropriate heuristic to be applied at each iteration. In addition, we propose a gene expression programming framework to automatically generate the acceptance criterion for each problem instance, instead of using human-designed criteria. Two well-known, and very different, combinatorial optimization problems, one static (exam timetabling) and one dynamic (dynamic vehicle routing) are used to demonstrate the generality of the proposed framework. Compared with state-of-the-art hyper-heuristics and other bespoke methods, empirical results demonstrate that the proposed framework is able to generalize well across both domains. We obtain competitive, if not better results, when compared to the best known results obtained from other methods that have been presented in the scientific literature. We also compare our approach against the recently released hyper-heuristic competition test suite. We again demonstrate the generality of our approach when we compare against other methods that have utilized the same six benchmark datasets from this test suite.
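A minimal sketch of bandit-based heuristic selection is given below; it uses a plain UCB1 rule with an improvement-based reward on a toy OneMax objective, whereas the paper's high-level strategy uses a dynamic multi-armed bandit with an extreme-value reward and a gene-expression-programming-generated acceptance criterion, so all names and parameters here are assumptions for illustration.

```python
# Minimal sketch of bandit-based low-level heuristic selection (illustration
# only; the paper uses a dynamic multiarmed bandit with an extreme-value reward
# and an evolved acceptance criterion, which is more elaborate than this UCB rule).
import math, random

random.seed(3)

def heuristic_flip1(x):            # low-level heuristic: flip one random bit
    i = random.randrange(len(x)); x = x[:]; x[i] ^= 1; return x

def heuristic_flip2(x):            # low-level heuristic: flip two random bits
    x = x[:]
    for i in random.sample(range(len(x)), 2): x[i] ^= 1
    return x

heuristics = [heuristic_flip1, heuristic_flip2]
fitness = lambda x: sum(x)         # toy objective: OneMax

current = [0] * 50
counts, rewards = [0] * len(heuristics), [0.0] * len(heuristics)
for t in range(1, 2001):
    # UCB1 selection of the next low-level heuristic
    ucb = [(rewards[h] / counts[h]) + math.sqrt(2 * math.log(t) / counts[h])
           if counts[h] else float("inf") for h in range(len(heuristics))]
    h = ucb.index(max(ucb))
    candidate = heuristics[h](current)
    gain = fitness(candidate) - fitness(current)
    counts[h] += 1
    rewards[h] += max(gain, 0)     # reward = positive improvement only
    if gain >= 0:                  # simple (non-evolved) acceptance criterion
        current = candidate
print("final fitness:", fitness(current), "selections per heuristic:", counts)
```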
NASA Technical Reports Server (NTRS)
Rash, James
2014-01-01
NASA's space data-communications infrastructure-the Space Network and the Ground Network-provides scheduled (as well as some limited types of unscheduled) data-communications services to user spacecraft. The Space Network operates several orbiting geostationary platforms (the Tracking and Data Relay Satellite System (TDRSS)), each with its own service-delivery antennas onboard. The Ground Network operates service-delivery antennas at ground stations located around the world. Together, these networks enable data transfer between user spacecraft and their mission control centers on Earth. Scheduling data-communications events for spacecraft that use the NASA communications infrastructure-the relay satellites and the ground stations-can be accomplished today with software having an operational heritage dating from the 1980s or earlier. An implementation of the scheduling methods and algorithms disclosed and formally specified herein will produce globally optimized schedules with not only optimized service delivery by the space data-communications infrastructure but also optimized satisfaction of all user requirements and prescribed constraints, including radio frequency interference (RFI) constraints. Evolutionary algorithms, a class of probabilistic strategies for searching large solution spaces, are the essential technology invoked and exploited in this disclosure. Also disclosed are secondary methods and algorithms for optimizing the execution efficiency of the schedule-generation algorithms themselves. The scheduling methods and algorithms as presented are adaptable to accommodate the complexity of scheduling the civilian and/or military data-communications infrastructure within the expected range of future users and space- or ground-based service-delivery assets. Finally, the problem itself, and the methods and algorithms, are generalized and specified formally. The generalized methods and algorithms are applicable to a very broad class of combinatorial-optimization problems that encompasses, among many others, the problem of generating optimal space-data communications schedules.
Evolution, learning, and cognition
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Y.C.
1988-01-01
The book comprises more than fifteen articles in the areas of neural networks and connectionist systems, classifier systems, adaptive network systems, genetic algorithm, cellular automata, artificial immune systems, evolutionary genetics, cognitive science, optical computing, combinatorial optimization, and cybernetics.
Measuring and Specifying Combinatorial Coverage of Test Input Configurations
Kuhn, D. Richard; Kacker, Raghu N.; Lei, Yu
2015-01-01
A key issue in testing is how many tests are needed for a required level of coverage or fault detection. Estimates are often based on error rates in initial testing, or on code coverage. For example, tests may be run until a desired level of statement or branch coverage is achieved. Combinatorial methods present an opportunity for a different approach to estimating required test set size, using characteristics of the test set. This paper describes methods for estimating the coverage of, and ability to detect, t-way interaction faults of a test set based on a covering array. We also develop a connection between (static) combinatorial coverage and (dynamic) code coverage, such that if a specific condition is satisfied, 100% branch coverage is assured. Using these results, we propose practical recommendations for using combinatorial coverage in specifying test requirements. PMID:28133442
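As a concrete example of the simplest case, the sketch below measures 2-way (pairwise) combinatorial coverage of a small test set, i.e. the fraction of parameter-value pairs covered by at least one test; the parameters, domains and tests are invented for illustration, and the paper's definitions extend this to general t and to covering arrays.

```python
# Minimal sketch of measuring 2-way (pairwise) combinatorial coverage of a test
# set: the fraction of all parameter-value pairs covered by at least one test.
# (Illustration only; the paper treats general t-way coverage and covering arrays.)
from itertools import combinations, product

# Toy test set: each row assigns values to parameters (OS, browser, protocol)
domains = {"os": ["linux", "win"], "browser": ["ff", "chrome"], "proto": ["http", "https"]}
tests = [
    {"os": "linux", "browser": "ff",     "proto": "http"},
    {"os": "win",   "browser": "chrome", "proto": "https"},
    {"os": "linux", "browser": "chrome", "proto": "https"},
]

covered, total = 0, 0
for p1, p2 in combinations(domains, 2):          # every pair of parameters
    for v1, v2 in product(domains[p1], domains[p2]):
        total += 1
        if any(t[p1] == v1 and t[p2] == v2 for t in tests):
            covered += 1
print(f"pairwise coverage: {covered}/{total} = {covered / total:.2%}")
```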
Ligand design by a combinatorial approach based on modeling and experiment: application to HLA-DR4
NASA Astrophysics Data System (ADS)
Evensen, Erik; Joseph-McCarthy, Diane; Weiss, Gregory A.; Schreiber, Stuart L.; Karplus, Martin
2007-07-01
Combinatorial synthesis and large-scale screening methods are being used increasingly in drug discovery, particularly for finding novel lead compounds. Although these "random" methods sample larger areas of chemical space than traditional synthetic approaches, only a relatively small percentage of all possible compounds are practically accessible. It is therefore helpful to select regions of chemical space that have greater likelihood of yielding useful leads. When three-dimensional structural data are available for the target molecule, this can be achieved by applying structure-based computational design methods to focus the combinatorial library. This is advantageous over the standard usage of computational methods to design a small number of specific novel ligands, because here computation is employed as part of the combinatorial design process and so is required only to determine a propensity for binding of certain chemical moieties in regions of the target molecule. This paper describes the application of the Multiple Copy Simultaneous Search (MCSS) method, an active site mapping and de novo structure-based design tool, to design a focused combinatorial library for the class II MHC protein HLA-DR4. Methods for synthesizing and screening the computationally designed library are presented; evidence is provided to show that binding was achieved. Although the structure of the protein-ligand complex could not be determined, experimental results include cross-exclusion of a known HLA-DR4 peptide ligand (HA) by a compound from the library, and computational model building suggests that at least one of the ligands designed and identified by the methods described binds in a mode similar to that of native peptides.
Exact and Metaheuristic Approaches for a Bi-Objective School Bus Scheduling Problem.
Chen, Xiaopan; Kong, Yunfeng; Dang, Lanxue; Hou, Yane; Ye, Xinyue
2015-01-01
As a class of hard combinatorial optimization problems, the school bus routing problem has received considerable attention in the last decades. For a multi-school system, given the bus trips for each school, the school bus scheduling problem aims at optimizing bus schedules to serve all the trips within the school time windows. In this paper, we propose two approaches for solving the bi-objective school bus scheduling problem: an exact method of mixed integer programming (MIP) and a metaheuristic method which combines simulated annealing with local search. We develop MIP formulations for homogenous and heterogeneous fleet problems respectively and solve the models by MIP solver CPLEX. The bus type-based formulation for heterogeneous fleet problem reduces the model complexity in terms of the number of decision variables and constraints. The metaheuristic method is a two-stage framework for minimizing the number of buses to be used as well as the total travel distance of buses. We evaluate the proposed MIP and the metaheuristic method on two benchmark datasets, showing that on both instances, our metaheuristic method significantly outperforms the respective state-of-the-art methods.
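The following is a minimal simulated-annealing skeleton with a swap neighborhood and Metropolis acceptance, shown only to illustrate the metaheuristic component; the toy objective, cooling schedule and move operator are assumptions and do not reproduce the paper's two-stage method or its MIP models.

```python
# Minimal simulated-annealing skeleton (illustration only; the paper combines SA
# with local search in a two-stage framework and also provides MIP models).
# Here the "schedule" is a permutation of trips and the objective is a toy proxy
# for total travel distance.
import math, random

random.seed(4)
trips = [random.uniform(1, 10) for _ in range(20)]      # toy trip "distances"

def cost(order):
    # toy objective: position-weighted sum, standing in for bus travel distance
    return sum((i + 1) * trips[j] for i, j in enumerate(order))

state = list(range(len(trips)))
best, best_cost = state[:], cost(state)
T = 50.0
while T > 1e-3:
    i, j = random.sample(range(len(state)), 2)
    cand = state[:]; cand[i], cand[j] = cand[j], cand[i]        # swap move
    delta = cost(cand) - cost(state)
    if delta < 0 or random.random() < math.exp(-delta / T):     # Metropolis acceptance
        state = cand
        if cost(state) < best_cost:
            best, best_cost = state[:], cost(state)
    T *= 0.995                                                  # geometric cooling
print("best toy cost:", round(best_cost, 2))
```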
Development of New Sensing Materials Using Combinatorial and High-Throughput Experimentation
NASA Astrophysics Data System (ADS)
Potyrailo, Radislav A.; Mirsky, Vladimir M.
New sensors with improved performance characteristics are needed for applications as diverse as bedside continuous monitoring, tracking of environmental pollutants, monitoring of food and water quality, monitoring of chemical processes, and safety in industrial, consumer, and automotive settings. Typical requirements in sensor improvement are selectivity, long-term stability, sensitivity, response time, reversibility, and reproducibility. Design of new sensing materials is the important cornerstone in the effort to develop new sensors. Often, sensing materials are too complex to predict their performance quantitatively in the design stage. Thus, combinatorial and high-throughput experimentation methodologies provide an opportunity to generate new required data to discover new sensing materials and/or to optimize existing material compositions. The goal of this chapter is to provide an overview of the key concepts of experimental development of sensing materials using combinatorial and high-throughput experimentation tools, and to promote additional fruitful interactions between computational scientists and experimentalists.
NASA Astrophysics Data System (ADS)
Burello, E.; Bologa, C.; Frecer, V.; Miertus, S.
Combinatorial chemistry and technologies have been developed to a stage where synthetic schemes are available for generation of a large variety of organic molecules. The innovative concept of combinatorial design assumes that screening of a large and diverse library of compounds will increase the probability of finding an active analogue among the compounds tested. Since the rate at which libraries are screened for activity currently constitutes a limitation to the use of combinatorial technologies, it is important to be selective about the number of compounds to be synthesized. Early experience with combinatorial chemistry indicated that chemical diversity alone did not result in a significant increase in the number of generated lead compounds. Emphasis has therefore been increasingly put on the use of computer-assisted combinatorial chemical techniques. Computational methods are valuable in the design of virtual libraries of molecular models. Selection strategies based on computed physicochemical properties of the models or of a target compound are introduced to reduce the time and costs of library synthesis and screening. In addition, computational structure-based library focusing methods can be used to perform in silico screening of the activity of compounds against a target receptor by docking the ligands into the receptor model. Three case studies are discussed dealing with the design of targeted combinatorial libraries of inhibitors of HIV-1 protease, P. falciparum plasmepsin and human urokinase as potential antiviral, antimalarial and anticancer drugs. These illustrate library focusing strategies.
Li, Desheng
2014-01-01
This paper proposes a novel variant of the cooperative quantum-behaved particle swarm optimization (CQPSO) algorithm, called CQPSO-DVSA-LFD, with two mechanisms to reduce the search space and avoid stagnation. The first mechanism, Dynamic Varying Search Area (DVSA), limits the range of particles' activity to a reduced area. The second uses Lévy flights to generate stochastic disturbances in the movement of particles so that they can escape local optima. To test the performance of CQPSO-DVSA-LFD, numerical experiments are conducted to compare the proposed algorithm with different variants of PSO. According to the experimental results, the proposed method performs better than other PSO variants on both benchmark test functions and a combinatorial optimization problem, the job-shop scheduling problem. PMID:24851085
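To illustrate the Lévy-flight disturbance mentioned in the abstract, the sketch below adds Mantegna-style Lévy steps to a basic PSO update on the sphere benchmark; the cooperative quantum-behaved update and the dynamic varying search area of CQPSO-DVSA-LFD are not reproduced, and all coefficients are illustrative assumptions.

```python
# Minimal sketch of a Levy-flight disturbance added to a basic PSO update
# (illustration only; the paper's CQPSO-DVSA-LFD additionally uses cooperative
# quantum-behaved particles and a dynamically varying search area).
import numpy as np

rng = np.random.default_rng(5)

def levy_step(size, beta=1.5):
    # Mantegna's algorithm for approximately Levy-stable step lengths
    from math import gamma, sin, pi
    sigma = (gamma(1 + beta) * sin(pi * beta / 2) /
             (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0, sigma, size)
    v = rng.normal(0, 1, size)
    return u / np.abs(v) ** (1 / beta)

def sphere(x):                      # benchmark objective
    return float(np.sum(x ** 2))

dim, n = 10, 20
x = rng.uniform(-5, 5, (n, dim)); v = np.zeros((n, dim))
pbest = x.copy(); pbest_f = np.array([sphere(p) for p in x])
gbest = pbest[pbest_f.argmin()].copy()

for it in range(300):
    r1, r2 = rng.random((n, dim)), rng.random((n, dim))
    v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
    x = x + v + 0.01 * levy_step((n, dim))        # Levy-flight stochastic disturbance
    f = np.array([sphere(p) for p in x])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = x[improved], f[improved]
    gbest = pbest[pbest_f.argmin()].copy()
print("best sphere value:", round(float(pbest_f.min()), 6))
```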
Discrete Optimization Model for Vehicle Routing Problem with Scheduling Side Constraints
NASA Astrophysics Data System (ADS)
Juliandri, Dedy; Mawengkang, Herman; Bu'ulolo, F.
2018-01-01
The Vehicle Routing Problem (VRP) is an important element of many logistics systems that involve routing and scheduling of vehicles from a depot to a set of customer nodes. It is a hard combinatorial optimization problem whose objective is to find an optimal set of routes used by a fleet of vehicles to serve the demands of a set of customers; the vehicles are required to return to the depot after serving the customers' demand. The problem incorporates time windows, fleet and driver scheduling, and pick-up and delivery within the planning horizon. The goal is to determine the fleet and driver scheduling and the routing policies of the vehicles, with the objective of minimizing the overall cost of all routes over the planning horizon. We model the problem as a linear mixed integer program and develop a combination of heuristics and an exact method for solving the model.
Evolutionary computation applied to the reconstruction of 3-D surface topography in the SEM.
Kodama, Tetsuji; Li, Xiaoyuan; Nakahira, Kenji; Ito, Dai
2005-10-01
A genetic algorithm has been applied to the line profile reconstruction from the signals of the standard secondary electron (SE) and/or backscattered electron detectors in a scanning electron microscope. This method solves the topographical surface reconstruction problem as one of combinatorial optimization. To extend this optimization approach for three-dimensional (3-D) surface topography, this paper considers the use of a string coding where a 3-D surface topography is represented by a set of coordinates of vertices. We introduce the Delaunay triangulation, which attains the minimum roughness for any set of height data to capture the fundamental features of the surface being probed by an electron beam. With this coding, the strings are processed with a class of hybrid optimization algorithms that combine genetic algorithms and simulated annealing algorithms. Experimental results on SE images are presented.
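The roughness criterion attached to the Delaunay triangulation can be illustrated as follows: a minimal sketch that builds the triangulation over (x, y) vertex positions with scipy.spatial.Delaunay and sums squared height differences along its edges as a simple roughness measure; the point set, heights and the exact roughness definition are assumptions, not the paper's reconstruction procedure.

```python
# Minimal sketch (not the paper's GA/SA reconstruction): build a Delaunay
# triangulation over (x, y) vertex positions with SciPy and evaluate a simple
# roughness measure -- the sum of squared height differences along triangulation
# edges -- which a surface-reconstruction search would try to keep small.
import numpy as np
from scipy.spatial import Delaunay

rng = np.random.default_rng(8)
xy = rng.uniform(0, 1, (40, 2))                          # vertex positions in the image plane
z = np.sin(3 * xy[:, 0]) + 0.05 * rng.normal(size=40)    # candidate surface heights

tri = Delaunay(xy)
edges = set()
for a, b, c in tri.simplices:                            # collect unique triangle edges
    for u, v in ((a, b), (b, c), (a, c)):
        edges.add((min(u, v), max(u, v)))

roughness = sum((z[u] - z[v]) ** 2 for u, v in edges)
print(f"{len(edges)} edges, roughness = {roughness:.4f}")
```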
A combinatorial approach to the design of vaccines.
Martínez, Luis; Milanič, Martin; Legarreta, Leire; Medvedev, Paul; Malaina, Iker; de la Fuente, Ildefonso M
2015-05-01
We present two new problems of combinatorial optimization and discuss their applications to the computational design of vaccines. In the shortest λ-superstring problem, given a family S_1, ..., S_k of strings over a finite alphabet, a set T of "target" strings over that alphabet, and an integer λ, the task is to find a string of minimum length containing, for each i, at least λ of the target strings that are substrings of S_i. In the shortest λ-cover superstring problem, given a collection X_1, ..., X_n of finite sets of strings over a finite alphabet and an integer λ, the task is to find a string of minimum length containing, for each i, at least λ elements of X_i as substrings. The two problems are polynomially equivalent, and the shortest λ-cover superstring problem is a common generalization of two well-known combinatorial optimization problems, the shortest common superstring problem and the set cover problem. We present two approaches to obtaining exact or approximate solutions to the shortest λ-superstring and λ-cover superstring problems: one based on integer programming, and a hill-climbing algorithm. An application is given to the computational design of vaccines, and the algorithms are applied to experimental data taken from patients infected by H5N1 and HIV-1.
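The classical shortest common superstring problem mentioned above is the simplest special case (λ = 1, a single collection); the sketch below implements the standard greedy-merge heuristic for that case, with toy epitope strings as input, and is not the paper's integer-programming or hill-climbing algorithm.

```python
# Minimal greedy-merge heuristic for the classical shortest common superstring
# problem, the simplest special case of the lambda-superstring problems above
# (illustration only; the paper uses integer programming and hill climbing).
def overlap(a, b):
    # length of the longest suffix of a that is a prefix of b
    for k in range(min(len(a), len(b)), 0, -1):
        if a.endswith(b[:k]):
            return k
    return 0

def greedy_superstring(strings):
    strings = list(dict.fromkeys(strings))          # drop duplicates, keep order
    while len(strings) > 1:
        best = (-1, None, None)
        for i, a in enumerate(strings):
            for j, b in enumerate(strings):
                if i != j:
                    k = overlap(a, b)
                    if k > best[0]:
                        best = (k, i, j)
        k, i, j = best
        merged = strings[i] + strings[j][k:]        # merge the best-overlapping pair
        strings = [s for idx, s in enumerate(strings) if idx not in (i, j)] + [merged]
    return strings[0]

epitopes = ["GILGFVFTL", "LGFVFTLTV", "FTLTVTTTA"]   # toy target strings
s = greedy_superstring(epitopes)
print(s, len(s), all(e in s for e in epitopes))
```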
Lv, Xiaomei; Gu, Jiali; Wang, Fan; Xie, Wenping; Liu, Min; Ye, Lidan; Yu, Hongwei
2016-12-01
Metabolic engineering of microorganisms for heterologous biosynthesis is a promising route to sustainable chemical production which attracts increasing research and industrial interest. However, the efficiency of microbial biosynthesis is often restricted by insufficient activity of pathway enzymes and unbalanced utilization of metabolic intermediates. This work presents a combinatorial strategy integrating modification of multiple rate-limiting enzymes and modular pathway engineering to simultaneously improve intra- and inter-pathway balance, which might be applicable for a range of products, using isoprene as an example product. For intra-module engineering within the methylerythritol-phosphate (MEP) pathway, directed co-evolution of DXS/DXR/IDI was performed adopting a lycopene-indicated high-throughput screening method developed herein, leading to 60% improvement of isoprene production. In addition, inter-module engineering between the upstream MEP pathway and the downstream isoprene-forming pathway was conducted via promoter manipulation, which further increased isoprene production by 2.94-fold compared to the recombinant strain with solely protein engineering and 4.7-fold compared to the control strain containing wild-type enzymes. These results demonstrated the potential of pathway optimization in isoprene overproduction as well as the effectiveness of combining metabolic regulation and protein engineering in improvement of microbial biosynthesis. Biotechnol. Bioeng. 2016;113: 2661-2669. © 2016 Wiley Periodicals, Inc.
1997-01-01
... create a dependency tree containing an optimum set of n-1 first-order dependencies. To do this, first, we select an arbitrary bit Xroot to place at the root. For all other bits Xi, set bestMatchingBitInTree[Xi] to Xroot. While not all bits have been ...
Interference Aware Routing Using Spatial Reuse in Wireless Sensor Networks
2013-12-01
In practice there is no optimal STDMA algorithm due to the computational complexity of the STDMA implementation; therefore, the common approach is to ... Naval Postgraduate School, Monterey, California; thesis approved for public release, distribution unlimited.
j5 DNA assembly design automation.
Hillson, Nathan J
2014-01-01
Modern standardized methodologies, described in detail in the previous chapters of this book, have enabled the software-automated design of optimized DNA construction protocols. This chapter describes how to design (combinatorial) scar-less DNA assembly protocols using the web-based software j5. j5 assists biomedical and biotechnological researchers construct DNA by automating the design of optimized protocols for flanking homology sequence as well as type IIS endonuclease-mediated DNA assembly methodologies. Unlike any other software tool available today, j5 designs scar-less combinatorial DNA assembly protocols, performs a cost-benefit analysis to identify which portions of an assembly process would be less expensive to outsource to a DNA synthesis service provider, and designs hierarchical DNA assembly strategies to mitigate anticipated poor assembly junction sequence performance. Software integrated with j5 add significant value to the j5 design process through graphical user-interface enhancement and downstream liquid-handling robotic laboratory automation.
Balancing focused combinatorial libraries based on multiple GPCR ligands
NASA Astrophysics Data System (ADS)
Soltanshahi, Farhad; Mansley, Tamsin E.; Choi, Sun; Clark, Robert D.
2006-08-01
G-Protein coupled receptors (GPCRs) are important targets for drug discovery, and combinatorial chemistry is an important tool for pharmaceutical development. The absence of detailed structural information, however, limits the kinds of combinatorial design techniques that can be applied to GPCR targets. This is particularly problematic given the current emphasis on focused combinatorial libraries. By linking an incremental construction method (OptDesign) to the very fast shape-matching capability of ChemSpace, we have created an efficient method for designing targeted sublibraries that are topomerically similar to known actives. Multi-objective scoring allows consideration of multiple queries (actives) simultaneously. This can lead to a distribution of products skewed towards one particular query structure, however, particularly when the ligands of interest are quite dissimilar to one another. A novel pivoting technique is described which makes it possible to generate promising designs even under those circumstances. The approach is illustrated by application to some serotonergic agonists and chemokine antagonists.
Chang, Yi-Pin; Chu, Yen-Ho
2014-05-16
The design, synthesis and screening of diversity-oriented peptide libraries using a "libraries from libraries" strategy for the development of inhibitors of α1-antitrypsin deficiency are described. The mainstay of the biochemical approach presented here is the use of the well-established solid-phase split-and-mix method for the generation of mixture-based libraries. The combinatorial technique of iterative deconvolution was employed for library screening. While molecular diversity is the general consideration in combinatorial libraries, careful design combined with systematic screening of small individual libraries is a prerequisite for effective library screening and can avoid potential problems in some cases. This review also illustrates how large peptide libraries were designed, as well as how a conformation-sensitive assay was developed based on the mechanism of the conformational disease. Finally, the combinatorially selected peptide inhibitor capable of blocking abnormal protein aggregation is characterized by biophysical, cellular and computational methods.
Combinatorial effects of arginine and fluoride on oral bacteria.
Zheng, X; Cheng, X; Wang, L; Qiu, W; Wang, S; Zhou, Y; Li, M; Li, Y; Cheng, L; Li, J; Zhou, X; Xu, X
2015-02-01
Dental caries is closely associated with the microbial disequilibrium between acidogenic/aciduric pathogens and alkali-generating commensal residents within the dental plaque. Fluoride is a widely used anticaries agent, which promotes tooth hard-tissue remineralization and suppresses bacterial activities. Recent clinical trials have shown that oral hygiene products containing both fluoride and arginine possess a greater anticaries effect compared with those containing fluoride alone, indicating synergy between fluoride and arginine in caries management. Here, we hypothesize that arginine may augment the ecological benefit of fluoride by enriching alkali-generating bacteria in the plaque biofilm and thus synergizes with fluoride in controlling dental caries. Specifically, we assessed the combinatory effects of NaF/arginine on planktonic and biofilm cultures of Streptococcus mutans, Streptococcus sanguinis, and Porphyromonas gingivalis with checkerboard microdilution assays. The optimal NaF/arginine combinations were selected, and their combinatory effects on microbial composition were further examined in single-, dual-, and 3-species biofilm using bacterial species-specific fluorescence in situ hybridization and quantitative polymerase chain reaction. We found that arginine synergized with fluoride in suppressing acidogenic S. mutans in both planktonic and biofilm cultures. In addition, the NaF/arginine combination synergistically reduced S. mutans but enriched S. sanguinis within the multispecies biofilms. More importantly, the optimal combination of NaF/arginine maintained a "streptococcal pressure" against the potential growth of oral anaerobe P. gingivalis within the alkalized biofilm. Taken together, we conclude that the combinatory application of fluoride and arginine has a potential synergistic effect in maintaining a healthy oral microbial equilibrium and thus represents a promising ecological approach to caries management. © International & American Associations for Dental Research 2014.
Optimizing Irregular Applications for Energy and Performance on the Tilera Many-core Architecture
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chavarría-Miranda, Daniel; Panyala, Ajay R.; Halappanavar, Mahantesh
Optimizing applications simultaneously for energy and performance is a complex problem. High performance, parallel, irregular applications are notoriously hard to optimize due to their data-dependent memory accesses, lack of structured locality and complex data structures and code patterns. Irregular kernels are growing in importance in applications such as machine learning, graph analytics and combinatorial scientific computing. Performance- and energy-efficient implementation of these kernels on modern, energy efficient, multicore and many-core platforms is therefore an important and challenging problem. We present results from optimizing two irregular applications, the Louvain method for community detection (Grappolo) and high-performance conjugate gradient (HPCCG), on the Tilera many-core system. We have significantly extended MIT's OpenTuner auto-tuning framework to conduct a detailed study of platform-independent and platform-specific optimizations to improve performance as well as reduce total energy consumption. We explore the optimization design space along three dimensions: memory layout schemes, compiler-based code transformations, and optimization of parallel loop schedules. Using auto-tuning, we demonstrate whole node energy savings of up to 41% relative to a baseline instantiation, and up to 31% relative to manually optimized variants.
Mondal, Milon; Radeva, Nedyalka; Fanlo-Virgós, Hugo; Otto, Sijbren; Klebe, Gerhard; Hirsch, Anna K H
2016-08-01
Fragment-based drug design (FBDD) affords active compounds for biological targets. While there are numerous reports on FBDD by fragment growing/optimization, fragment linking has rarely been reported. Dynamic combinatorial chemistry (DCC) has become a powerful hit-identification strategy for biological targets. We report the synergistic combination of fragment linking and DCC to identify inhibitors of the aspartic protease endothiapepsin. Based on X-ray crystal structures of endothiapepsin in complex with fragments, we designed a library of bis-acylhydrazones and used DCC to identify potent inhibitors. The most potent inhibitor exhibits an IC50 value of 54 nM, which represents a 240-fold improvement in potency compared to the parent hits. Subsequent X-ray crystallography validated the predicted binding mode, thus demonstrating the efficiency of the combination of fragment linking and DCC as a hit-identification strategy. This approach could be applied to a range of biological targets, and holds the potential to facilitate hit-to-lead optimization. © 2016 The Authors. Published by Wiley-VCH Verlag GmbH & Co. KGaA.
Smolensky, Paul; Goldrick, Matthew; Mathis, Donald
2014-08-01
Mental representations have continuous as well as discrete, combinatorial properties. For example, while predominantly discrete, phonological representations also vary continuously; this is reflected by gradient effects in instrumental studies of speech production. Can an integrated theoretical framework address both aspects of structure? The framework we introduce here, Gradient Symbol Processing, characterizes the emergence of grammatical macrostructure from the Parallel Distributed Processing microstructure (McClelland, Rumelhart, & The PDP Research Group, 1986) of language processing. The mental representations that emerge, Distributed Symbol Systems, have both combinatorial and gradient structure. They are processed through Subsymbolic Optimization-Quantization, in which an optimization process favoring representations that satisfy well-formedness constraints operates in parallel with a distributed quantization process favoring discrete symbolic structures. We apply a particular instantiation of this framework, λ-Diffusion Theory, to phonological production. Simulations of the resulting model suggest that Gradient Symbol Processing offers a way to unify accounts of grammatical competence with both discrete and continuous patterns in language performance. Copyright © 2013 Cognitive Science Society, Inc.
A quantum annealing approach for fault detection and diagnosis of graph-based systems
NASA Astrophysics Data System (ADS)
Perdomo-Ortiz, A.; Fluegemann, J.; Narasimhan, S.; Biswas, R.; Smelyanskiy, V. N.
2015-02-01
Diagnosing the minimal set of faults capable of explaining a set of given observations, e.g., from sensor readouts, is a hard combinatorial optimization problem usually tackled with artificial intelligence techniques. We present the mapping of this combinatorial problem to quadratic unconstrained binary optimization (QUBO), and the experimental results of instances embedded onto a quantum annealing device with 509 quantum bits. Besides being the first time a quantum approach has been proposed for problems in the advanced diagnostics community, to the best of our knowledge this work is also the first research utilizing the route Problem → QUBO → Direct embedding into quantum hardware, where we are able to implement and tackle problem instances with sizes that go beyond previously reported toy-model proof-of-principle quantum annealing implementations; this is a significant leap in the solution of problems via direct-embedding adiabatic quantum optimization. We discuss some of the programmability challenges in the current generation of the quantum device as well as a few possible ways to extend this work to more complex arbitrary network graphs.
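To make the QUBO mapping concrete, the sketch below encodes a toy fault-diagnosis instance (each symptom explained by at least one of two candidate faults, smallest fault set preferred) as a QUBO matrix and solves it by exhaustive enumeration in place of the quantum annealer; the instance, penalty weight and encoding details are assumptions for illustration.

```python
# Minimal sketch: a toy fault-diagnosis instance cast as a QUBO and solved by
# brute force (standing in for the quantum annealer). Each observed symptom must
# be explained by at least one of its two candidate faults; the objective prefers
# the smallest fault set. The penalty P*(1-x_a)*(1-x_b) is quadratic, so it fits QUBO.
import itertools
import numpy as np

n_faults = 4
symptoms = [(0, 1), (1, 2), (2, 3)]     # each symptom's two candidate faults
P = 10.0                                # penalty weight for unexplained symptoms

Q = np.zeros((n_faults, n_faults))
for i in range(n_faults):
    Q[i, i] += 1.0                      # cost of declaring fault i active
for a, b in symptoms:
    # P*(1 - x_a)*(1 - x_b) = P - P*x_a - P*x_b + P*x_a*x_b (constant P dropped)
    Q[a, a] -= P
    Q[b, b] -= P
    Q[a, b] += P

def qubo_energy(x):
    x = np.array(x)
    return float(x @ Q @ x)

best = min(itertools.product([0, 1], repeat=n_faults), key=qubo_energy)
print("minimal diagnosis (active faults):", [i for i, v in enumerate(best) if v])
```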
Nonparametric Combinatorial Sequence Models
NASA Astrophysics Data System (ADS)
Wauthier, Fabian L.; Jordan, Michael I.; Jojic, Nebojsa
This work considers biological sequences that exhibit combinatorial structures in their composition: groups of positions of the aligned sequences are "linked" and covary as one unit across sequences. If multiple such groups exist, complex interactions can emerge between them. Sequences of this kind arise frequently in biology but methodologies for analyzing them are still being developed. This paper presents a nonparametric prior on sequences which allows combinatorial structures to emerge and which induces a posterior distribution over factorized sequence representations. We carry out experiments on three sequence datasets which indicate that combinatorial structures are indeed present and that combinatorial sequence models can more succinctly describe them than simpler mixture models. We conclude with an application to MHC binding prediction which highlights the utility of the posterior distribution induced by the prior. By integrating out the posterior our method compares favorably to leading binding predictors.
Analysis of tasks for dynamic man/machine load balancing in advanced helicopters
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jorgensen, C.C.
1987-10-01
This report considers task allocation requirements imposed by advanced helicopter designs incorporating mixes of human pilots and intelligent machines. Specifically, it develops an analogy between load balancing using distributed non-homogeneous multiprocessors and human team functions. A taxonomy is presented which can be used to identify task combinations likely to cause overload for dynamic scheduling and process allocation mechanisms. Designer criteria are given for function decomposition, separation of control from data, and communication handling for dynamic tasks. Possible effects of NP-complete scheduling problems are noted, and a class of combinatorial optimization methods is examined.
A path following algorithm for the graph matching problem.
Zaslavskiy, Mikhail; Bach, Francis; Vert, Jean-Philippe
2009-12-01
We propose a convex-concave programming approach for the labeled weighted graph matching problem. The convex-concave programming formulation is obtained by rewriting the weighted graph matching problem as a least-squares problem on the set of permutation matrices and relaxing it to two different optimization problems: a quadratic convex and a quadratic concave optimization problem on the set of doubly stochastic matrices. The concave relaxation has the same global minimum as the initial graph matching problem, but the search for its global minimum is also a hard combinatorial problem. We therefore construct an approximation of the concave problem solution by following a solution path of a convex-concave problem obtained by linear interpolation of the convex and concave formulations, starting from the convex relaxation. This method makes it easy to integrate information on graph label similarities into the optimization problem, and therefore to perform labeled weighted graph matching. The algorithm is compared with some of the best performing graph matching methods on four data sets: simulated graphs, QAPLib, retina vessel images, and handwritten Chinese characters. In all cases, the results are competitive with the state of the art.
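In symbols, the path-following idea can be summarized as below (a sketch consistent with the abstract; the exact form of the concave relaxation and the step size δ are left unspecified):

```latex
% Sketch of the path-following objective (notation assumed, not taken verbatim
% from the paper): F_cvx is the convex (least-squares) relaxation and F_ccv the
% concave relaxation over the set D of doubly stochastic matrices.
\begin{aligned}
  F_{\lambda}(P) &= (1-\lambda)\,F_{\mathrm{cvx}}(P) + \lambda\,F_{\mathrm{ccv}}(P),
      \qquad P \in \mathcal{D},\ \lambda \in [0,1],\\
  F_{\mathrm{cvx}}(P) &= \lVert AP - PB \rVert_F^{2},\\
  P^{*}_{\lambda+\delta} &= \text{local minimum of } F_{\lambda+\delta}
      \text{ found by initializing at } P^{*}_{\lambda}.
\end{aligned}
```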
Combinatorial chemistry on solid support in the search for central nervous system agents.
Zajdel, Paweł; Pawłowski, Maciej; Martinez, Jean; Subra, Gilles
2009-08-01
The advent of combinatorial chemistry was one of the most important developments that have significantly contributed to the drug discovery process. Within just a few years, its initial concept, aimed at the production of libraries containing huge numbers of compounds (thousands to millions), so-called screening libraries, has shifted towards the preparation of small and medium-sized rationally designed libraries. When applicable, the use of solid supports for the generation of libraries has been a real breakthrough in enhancing productivity. With a limited amount of resin and simple manual workups, the split/mix procedure may generate thousands of bead-tethered compounds. Beads can be chemically or physically encoded to facilitate the identification of a hit after the biological assay. Compartmentalization of solid supports using small reactors like teabags and kans, or pellicular discrete supports like Lanterns, resulted in powerful sort-and-combine technologies relying on codes 'written' on the reactor, thus reducing the need for automation and increasing the number of compounds synthesized. These methods of solid-phase combinatorial chemistry have recently been supported by the introduction of solid-supported reagents and scavenger resins. The first part of this review discusses the general premises of combinatorial chemistry and some methods used in the design of primary and focused combinatorial libraries. The aim of the second part is to present combinatorial chemistry methodologies aimed at discovering bioactive compounds acting on diverse GPCRs involved in central nervous system disorders.
QAPgrid: A Two Level QAP-Based Approach for Large-Scale Data Analysis and Visualization
Inostroza-Ponta, Mario; Berretta, Regina; Moscato, Pablo
2011-01-01
Background The visualization of large volumes of data is a computationally challenging task that often promises rewarding new insights. There is great potential in the application of new algorithms and models from combinatorial optimisation. Datasets often contain "hidden regularities" and a combined identification and visualization method should reveal these structures and present them in a way that helps analysis. While several methodologies exist, including those that use non-linear optimization algorithms, severe limitations exist even when working with only a few hundred objects. Methodology/Principal Findings We present a new data visualization approach (QAPgrid) that reveals patterns of similarities and differences in large datasets of objects for which a similarity measure can be computed. Objects are assigned to positions on an underlying square grid in a two-dimensional space. We use the Quadratic Assignment Problem (QAP) as a mathematical model to provide an objective function for the assignment of objects to positions on the grid. We employ a Memetic Algorithm (a powerful metaheuristic) to tackle the large instances of this NP-hard combinatorial optimization problem, and we show its performance on the visualization of real data sets. Conclusions/Significance Overall, the results show that the QAPgrid algorithm is able to produce a layout that represents the relationships between objects in the data set. Furthermore, it also represents the relationships between the clusters that are fed into the algorithm. We apply QAPgrid to the 84 Indo-European languages instance, producing a near-optimal layout. Next, we produce a layout of 470 world universities with a high observed degree of correlation with the score used by the Academic Ranking of World Universities compiled by Shanghai Jiao Tong University, without the need for an ad hoc weighting of attributes. Finally, our Gene Ontology-based study on Saccharomyces cerevisiae fully demonstrates the scalability and precision of our method as a novel alternative tool for functional genomics. PMID:21267077
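For readers unfamiliar with the QAP formulation used by QAPgrid, the sketch below (a generic illustration with made-up data, not the paper's memetic algorithm) shows the objective of assigning objects to grid positions and a simple pairwise-swap local search over it.

```python
# Generic QAP layout sketch: objects with pairwise similarities F are assigned
# to grid positions with pairwise distances D; we minimize sum F[i,j]*D[p(i),p(j)]
# so that similar objects land close together. Data below are synthetic.
import numpy as np

def qap_cost(F, D, perm):
    # perm[i] = grid position assigned to object i
    n = len(perm)
    return sum(F[i, j] * D[perm[i], perm[j]] for i in range(n) for j in range(n))

def two_swap_local_search(F, D, rng):
    n = F.shape[0]
    perm = list(rng.permutation(n))
    best = qap_cost(F, D, perm)
    improved = True
    while improved:
        improved = False
        for i in range(n):
            for j in range(i + 1, n):
                perm[i], perm[j] = perm[j], perm[i]      # try swapping two objects
                cost = qap_cost(F, D, perm)
                if cost < best:
                    best, improved = cost, True
                else:
                    perm[i], perm[j] = perm[j], perm[i]  # undo a non-improving swap
    return perm, best

rng = np.random.default_rng(0)
n = 9                                                    # 9 objects on a 3x3 grid
pos = [(r, c) for r in range(3) for c in range(3)]
D = np.array([[abs(a[0] - b[0]) + abs(a[1] - b[1]) for b in pos] for a in pos], dtype=float)
F = rng.random((n, n)); F = (F + F.T) / 2; np.fill_diagonal(F, 0)   # toy similarities
print(two_swap_local_search(F, D, rng))
```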
Optimal placement of tuning masses on truss structures by genetic algorithms
NASA Technical Reports Server (NTRS)
Ponslet, Eric; Haftka, Raphael T.; Cudney, Harley H.
1993-01-01
Optimal placement of tuning masses, actuators and other peripherals on large space structures is a combinatorial optimization problem. This paper surveys several techniques for solving this problem. The genetic algorithm approach to the solution of the placement problem is described in detail. An example of minimizing the difference between the two lowest frequencies of a laboratory truss by adding tuning masses is used for demonstrating some of the advantages of genetic algorithms. The relative efficiencies of different codings are compared using the results of a large number of optimization runs.
Simulating the component counts of combinatorial structures.
Arratia, Richard; Barbour, A D; Ewens, W J; Tavaré, Simon
2018-02-09
This article describes and compares methods for simulating the component counts of random logarithmic combinatorial structures such as permutations and mappings. We exploit the Feller coupling for simulating permutations to provide a very fast method for simulating logarithmic assemblies more generally. For logarithmic multisets and selections, this approach is replaced by an acceptance/rejection method based on a particular conditioning relationship that represents the distribution of the combinatorial structure as that of independent random variables conditioned on a weighted sum. We show how to improve its acceptance rate. We illustrate the method by estimating the probability that a random mapping has no repeated component sizes, and establish the asymptotic distribution of the difference between the number of components and the number of distinct component sizes for a very general class of logarithmic structures. Copyright © 2018. Published by Elsevier Inc.
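One concrete consequence of the Feller coupling mentioned above can be checked in a few lines: the total number of cycles of a uniform random permutation of n is distributed as a sum of independent Bernoulli(1/i) variables, i = 1..n. The Monte Carlo sketch below (ours, purely illustrative) compares the two.

```python
# Illustrative sketch only (not the authors' code): compare the cycle count of
# uniform random permutations with a sum of independent Bernoulli(1/i) variables.
import numpy as np

rng = np.random.default_rng(1)

def cycle_count(perm):
    seen, cycles = set(), 0
    for start in range(len(perm)):
        if start not in seen:
            cycles += 1
            j = start
            while j not in seen:
                seen.add(j)
                j = perm[j]
    return cycles

n, trials = 50, 20000
direct = [cycle_count(rng.permutation(n)) for _ in range(trials)]
coupled = [(rng.random(n) < 1.0 / np.arange(1, n + 1)).sum() for _ in range(trials)]
print(np.mean(direct), np.mean(coupled))   # both close to the harmonic number H_50 ~ 4.5
```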
Performance evaluation of coherent Ising machines against classical neural networks
NASA Astrophysics Data System (ADS)
Haribara, Yoshitaka; Ishikawa, Hitoshi; Utsunomiya, Shoko; Aihara, Kazuyuki; Yamamoto, Yoshihisa
2017-12-01
The coherent Ising machine is expected to find a near-optimal solution in various combinatorial optimization problems, which has been experimentally confirmed with optical parametric oscillators and a field programmable gate array circuit. Similar mathematical models were proposed three decades ago by Hopfield et al. in the context of classical neural networks. In this article, we compare the computational performance of both models.
Defect-free atomic array formation using the Hungarian matching algorithm
NASA Astrophysics Data System (ADS)
Lee, Woojun; Kim, Hyosub; Ahn, Jaewook
2017-05-01
Deterministic loading of single atoms onto arbitrary two-dimensional lattice points has recently been demonstrated, where by dynamically controlling the optical-dipole potential, atoms from a probabilistically loaded lattice were relocated to target lattice points to form a zero-entropy atomic lattice. In this atom rearrangement, how to pair atoms with the target sites is a combinatorial optimization problem: brute-force methods search all possible combinations so the process is slow, while heuristic methods are time efficient but optimal solutions are not guaranteed. Here, we use the Hungarian matching algorithm as a fast and rigorous alternative to this problem of defect-free atomic lattice formation. Our approach utilizes an optimization cost function that restricts collision-free guiding paths so that atom loss due to collision is minimized during rearrangement. Experiments were performed with cold rubidium atoms that were trapped and guided with holographically controlled optical-dipole traps. The result of atom relocation from a partially filled 7 × 7 lattice to a 3 × 3 target lattice strongly agrees with the theoretical analysis: using the Hungarian algorithm minimizes the collisional and trespassing paths and results in improved performance, with over 50% higher success probability than the heuristic shortest-move method.
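The assignment step itself can be reproduced with an off-the-shelf Hungarian-style solver; the sketch below (with invented coordinates, and without the paper's collision-aware cost shaping) pairs loaded atoms with target sites so that the total travel distance is minimal.

```python
# Sketch of the assignment step only: pair loaded atoms with target lattice
# sites minimizing total travel distance, using SciPy's Hungarian-style solver.
import numpy as np
from scipy.optimize import linear_sum_assignment
from scipy.spatial.distance import cdist

rng = np.random.default_rng(0)

# atoms loaded at random sites of a 7x7 lattice (about half-filled)
lattice = np.array([(r, c) for r in range(7) for c in range(7)], dtype=float)
atoms = lattice[rng.choice(len(lattice), size=25, replace=False)]

# 3x3 target block in the center of the lattice
targets = np.array([(r, c) for r in range(2, 5) for c in range(2, 5)], dtype=float)

cost = cdist(targets, atoms)                 # travel distance target <- atom
rows, cols = linear_sum_assignment(cost)     # optimal pairing
print("total move distance:", cost[rows, cols].sum())
for t, a in zip(rows, cols):
    print(f"target {targets[t]} <- atom {atoms[a]}")
```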
The Combinatorial Trace Method in Action
ERIC Educational Resources Information Center
Krebs, Mike; Martinez, Natalie C.
2013-01-01
On any finite graph, the number of closed walks of length k is equal to the sum of the kth powers of the eigenvalues of any adjacency matrix. This simple observation is the basis for the combinatorial trace method, wherein we attempt to count (or bound) the number of closed walks of a given length so as to obtain information about the graph's…
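The identity underlying the trace method is easy to verify numerically; the snippet below (an arbitrary small graph, not from the article) checks that trace(A^k) equals the sum of the k-th powers of the adjacency eigenvalues.

```python
# Quick numerical check of the identity behind the combinatorial trace method:
# closed walks of length k = trace(A^k) = sum of k-th powers of the eigenvalues.
import numpy as np

A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 1],
              [1, 1, 0, 1],
              [0, 1, 1, 0]], dtype=float)   # a small undirected graph

k = 4
closed_walks = np.trace(np.linalg.matrix_power(A, k))
eig_sum = np.sum(np.linalg.eigvalsh(A) ** k)
print(closed_walks, eig_sum)                # the two numbers agree
```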
Lessel, Uta; Wellenzohn, Bernd; Fischer, J Robert; Rarey, Matthias
2012-02-27
A case study is presented illustrating the design of a focused CDK2 library. The scaffold of the library was detected by a feature trees search in a fragment space based on reactions from combinatorial chemistry. For the design the software LoFT (Library optimizer using Feature Trees) was used. The special feature called FTMatch was applied to restrict the parts of the queries where the reagents are permitted to match. This way a 3D scoring function could be simulated. Results were compared with alternative designs by GOLD docking and ROCS 3D alignments.
Combinatorial invariants and covariants as tools for conical intersections.
Ryb, Itai; Baer, Roi
2004-12-01
The combinatorial invariant and covariant are introduced as practical tools for the analysis of conical intersections in molecules. The combinatorial invariant is a quantity depending on adiabatic electronic states taken at discrete nuclear configuration points. It is invariant to the phase choice (gauge) of these states. In the limit that the points trace a loop in nuclear configuration space, the value of the invariant approaches the corresponding Berry phase factor. The Berry phase indicates the presence of an odd or even number of conical intersections on surfaces bounded by these loops. Based on the combinatorial invariant, we develop a computationally simple and efficient method for locating conical intersections. The method is robust due to its gauge-invariant nature. It does not rely on the landscape of intersecting potential energy surfaces nor does it require the computation of nonadiabatic couplings. We generalize the concept to open paths and combinatorial covariants for higher dimensions, obtaining a technique for the construction of the gauge-covariant adiabatic-diabatic transformation matrix. This too does not make use of nonadiabatic couplings. The importance of using gauge-covariant expressions is underlined throughout. These techniques can be readily implemented by standard quantum chemistry codes. (c) 2004 American Institute of Physics.
Liao, Chenzhong; Liu, Bing; Shi, Leming; Zhou, Jiaju; Lu, Xian-Ping
2005-07-01
Based on the structural characteristics of PPAR modulators, a virtual combinatorial library containing 1,226,625 compounds was constructed using SMILES strings. Selected ADME filters were employed to expel compounds having poor drug-like properties from this library. The library was converted to sdf and mol2 files by CONCORD 4.0, and was then docked to PPARgamma by DOCK 4.0 to identify new chemical entities that may be potential drug leads against type 2 diabetes and other metabolic diseases. The method of constructing virtual combinatorial libraries using SMILES strings was further implemented as a Visual Basic .NET tool, which can facilitate the generation of other types of virtual combinatorial libraries.
Study of optimal laser parameters for cutting QFN packages by Taguchi's matrix method
NASA Astrophysics Data System (ADS)
Li, Chen-Hao; Tsai, Ming-Jong; Yang, Ciann-Dong
2007-06-01
This paper reports the study of optimal laser parameters for cutting QFN (Quad Flat No-lead) packages by using a diode-pumped solid-state laser system (DPSSL). The QFN cutting path includes two different materials, which are the encapsulated epoxy and a copper lead frame substrate. Taguchi's experimental method with an L9(3^4) orthogonal array is employed to obtain optimal combinatorial parameters. A quantified mechanism was proposed for examining the laser cutting quality of a QFN package. The influences of the various factors such as laser current, laser frequency, and cutting speed on the laser cutting quality are also examined. From the experimental results, the factors, in order of decreasing significance for cutting quality, are found to be (a) laser frequency, (b) cutting speed, and (c) laser driving current. The optimal parameters were obtained at a laser frequency of 2 kHz, a cutting speed of 2 mm/s, and a driving current of 29 A. Besides identifying this sequence of dominance, the matrix experiment also determines the best level for each control factor. The verification experiment confirms that the application of laser cutting technology to QFN is very successful when using the optimal laser parameters predicted from the matrix experiments.
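A minimal sketch of the Taguchi-style main-effect analysis is given below; the L9(3^4) array is standard, but the response values are invented placeholders rather than the paper's measurements, and the assignment of factors to columns is only an assumption.

```python
# Sketch of a Taguchi main-effect analysis with an L9(3^4) orthogonal array.
# Response values are dummy data; in the study the three factors were laser
# frequency, cutting speed and driving current (the fourth column presumably
# unused or assigned to error).
import numpy as np

L9 = np.array([[0, 0, 0, 0],
               [0, 1, 1, 1],
               [0, 2, 2, 2],
               [1, 0, 1, 2],
               [1, 1, 2, 0],
               [1, 2, 0, 1],
               [2, 0, 2, 1],
               [2, 1, 0, 2],
               [2, 2, 1, 0]])               # standard L9 orthogonal array

response = np.array([3.1, 4.0, 4.6, 3.8, 4.9, 3.5, 4.2, 3.3, 4.7])  # dummy quality scores

for factor in range(3):                      # main effect of each used factor
    means = [response[L9[:, factor] == lvl].mean() for lvl in range(3)]
    print(f"factor {factor}: level means = {np.round(means, 2)}, "
          f"level with highest mean = {int(np.argmax(means))}")
```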
A mixed analog/digital chaotic neuro-computer system for quadratic assignment problems.
Horio, Yoshihiko; Ikeguchi, Tohru; Aihara, Kazuyuki
2005-01-01
We construct a mixed analog/digital chaotic neuro-computer prototype system for quadratic assignment problems (QAPs). The QAP is one of the most difficult NP-hard problems and has several real-world applications. Chaotic neural networks have been used to solve combinatorial optimization problems through chaotic search dynamics, which efficiently search for optimal or near-optimal solutions. However, preliminary experiments have shown that, although it obtained good feasible solutions, the Hopfield-type chaotic neuro-computer hardware system could not obtain the optimal solution of the QAP. Therefore, in the present study, we improve the system performance by adopting a solution construction method, which constructs a feasible solution using the analog internal state values of the chaotic neurons at each iteration. In order to include the construction method in our hardware, we install a multi-channel analog-to-digital conversion system to observe the internal states of the chaotic neurons. We show experimentally that a great improvement in system performance over the original Hopfield-type chaotic neuro-computer is obtained. That is, we obtain the optimal solution for the size-10 QAP in fewer than 1000 iterations. In addition, we propose a guideline for parameter tuning of the chaotic neuro-computer system based on the observation of the internal states of several chaotic neurons in the network.
An Introduction to Simulated Annealing
ERIC Educational Resources Information Center
Albright, Brian
2007-01-01
An attempt to model the physical process of annealing led to the development of a type of combinatorial optimization algorithm that takes on the problem of getting trapped in a local minimum. The author presents a Microsoft Excel spreadsheet that illustrates how this works.
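A bare-bones version of the algorithm described above might look as follows; the objective function and cooling schedule are arbitrary choices for illustration, not the author's spreadsheet model.

```python
# Minimal simulated annealing sketch: occasional uphill moves, accepted with
# probability exp(-delta/T), let the search escape local minima as T cools.
import math
import random

def f(x):                                    # a 1-D function with local minima
    return x * x + 10 * math.sin(3 * x)

random.seed(0)
x, T = 4.0, 5.0                              # initial state and temperature
while T > 1e-3:
    candidate = x + random.uniform(-0.5, 0.5)
    delta = f(candidate) - f(x)
    if delta < 0 or random.random() < math.exp(-delta / T):
        x = candidate                        # accept downhill, sometimes uphill
    T *= 0.995                               # geometric cooling
print(round(x, 3), round(f(x), 3))
```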
Chemical Compound Design Using Nuclear Charge Distributions
2012-03-01
Finding optimal solutions to design problems in chemistry is hampered by the combinatorially large search space. We develop a general theoretical ... framework for finding chemical compounds with prescribed properties using nuclear charge distributions. The key is the reformulation of the design
Romero, Jennifer V; Smith, Jock W H; Sullivan, Braden M; Croll, Lisa M; Dahn, J R
2012-01-09
Ternary libraries of 64 ZnO/CuO/CuCl(2) impregnated activated carbon samples were prepared on untreated or HNO(3)-treated carbon and evaluated for their SO(2) and NH(3) gas adsorption properties gravimetrically using a combinatorial method. CuCl(2) is shown to be a viable substitute for HNO(3) and some compositions of ternary ZnO/CuO/CuCl(2) impregnated carbon samples prepared on untreated carbon provided comparable SO(2) and NH(3) gas removal capacities to the materials prepared on HNO(3)-treated carbon. Through combinatorial methods, it was determined that the use of HNO(3) in this multigas adsorbent formulation can be avoided.
Library fingerprints: a novel approach to the screening of virtual libraries.
Klon, Anthony E; Diller, David J
2007-01-01
We propose a novel method to prioritize libraries for combinatorial synthesis and high-throughput screening that assesses the viability of a particular library on the basis of the aggregate physical-chemical properties of the compounds using a naïve Bayesian classifier. This approach prioritizes collections of related compounds according to the aggregate values of their physical-chemical parameters in contrast to single-compound screening. The method is also shown to be useful in screening existing noncombinatorial libraries when the compounds in these libraries have been previously clustered according to their molecular graphs. We show that the method used here is comparable or superior to the single-compound virtual screening of combinatorial libraries and noncombinatorial libraries and is superior to the pairwise Tanimoto similarity searching of a collection of combinatorial libraries.
Advances in metaheuristics for gene selection and classification of microarray data.
Duval, Béatrice; Hao, Jin-Kao
2010-01-01
Gene selection aims at identifying a (small) subset of informative genes from the initial data in order to obtain high predictive accuracy for classification. Gene selection can be considered as a combinatorial search problem and thus be conveniently handled with optimization methods. In this article, we summarize some recent developments of using metaheuristic-based methods within an embedded approach for gene selection. In particular, we put forward the importance and usefulness of integrating problem-specific knowledge into the search operators of such a method. To illustrate the point, we explain how ranking coefficients of a linear classifier such as support vector machine (SVM) can be profitably used to reinforce the search efficiency of Local Search and Evolutionary Search metaheuristic algorithms for gene selection and classification.
Random vs. Combinatorial Methods for Discrete Event Simulation of a Grid Computer Network
NASA Technical Reports Server (NTRS)
Kuhn, D. Richard; Kacker, Raghu; Lei, Yu
2010-01-01
This study compared random and t-way combinatorial inputs of a network simulator, to determine if these two approaches produce significantly different deadlock detection for varying network configurations. Modeling deadlock detection is important for analyzing configuration changes that could inadvertently degrade network operations, or to determine modifications that could be made by attackers to deliberately induce deadlock. Discrete event simulation of a network may be conducted using random generation of inputs. In this study, we compare random with combinatorial generation of inputs. Combinatorial (or t-way) testing requires every combination of any t parameter values to be covered by at least one test. Combinatorial methods can be highly effective because empirical data suggest that nearly all failures involve the interaction of a small number of parameters (1 to 6). Thus, for example, if all deadlocks involve at most 5-way interactions between n parameters, then exhaustive testing of all n-way interactions adds no additional information that would not be obtained by testing all 5-way interactions. While the maximum degree of interaction between parameters involved in the deadlocks clearly cannot be known in advance, covering all t-way interactions may be more efficient than using random generation of inputs. In this study we tested this hypothesis for t = 2, 3, and 4 for deadlock detection in a network simulation. Achieving the same degree of coverage provided by 4-way tests would have required approximately 3.2 times as many random tests; thus combinatorial methods were more efficient for detecting deadlocks involving a higher degree of interactions. The paper reviews explanations for these results and implications for modeling and simulation.
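The coverage comparison can be illustrated on a toy configuration (parameter counts made up, not the simulator's): the sketch below measures what fraction of all 2-way value combinations a batch of random tests covers; a pairwise covering array of comparable size would cover them all.

```python
# Rough illustration of random vs. pairwise (2-way) coverage on an invented
# configuration of 6 parameters with 3 values each.
import itertools
import random

random.seed(0)
n_params, n_values = 6, 3
all_pairs = {(i, vi, j, vj)
             for i, j in itertools.combinations(range(n_params), 2)
             for vi in range(n_values) for vj in range(n_values)}

def covered(tests):
    got = {(i, t[i], j, t[j])
           for t in tests
           for i, j in itertools.combinations(range(n_params), 2)}
    return len(got) / len(all_pairs)

random_tests = [[random.randrange(n_values) for _ in range(n_params)]
                for _ in range(15)]
print("2-way coverage of 15 random tests:", round(covered(random_tests), 3))
# A pairwise covering array covers every pair with far fewer than the
# exhaustive 3**6 = 729 tests; random inputs typically need more tests
# than such an array to reach full 2-way coverage.
```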
Latimer, Luke N; Dueber, John E
2017-06-01
A common challenge in metabolic engineering is rapidly identifying rate-controlling enzymes in heterologous pathways for subsequent production improvement. We demonstrate a workflow to address this challenge and apply it to improving xylose utilization in Saccharomyces cerevisiae. For eight reactions required for conversion of xylose to ethanol, we screened enzymes for functional expression in S. cerevisiae, followed by a combinatorial expression analysis to achieve pathway flux balancing and identification of limiting enzymatic activities. In the next round of strain engineering, we increased the copy number of these limiting enzymes and again tested the eight-enzyme combinatorial expression library in this new background. This workflow yielded a strain that has a ∼70% increase in biomass yield and ∼240% increase in xylose utilization. Finally, we chromosomally integrated the expression library. This library enriched for strains with multiple integrations of the pathway, which likely were the result of tandem integrations mediated by promoter homology. Biotechnol. Bioeng. 2017;114: 1301-1309. © 2017 Wiley Periodicals, Inc.
Bioengineering Strategies for Designing Targeted Cancer Therapies
Wen, Xuejun
2014-01-01
The goals of bioengineering strategies for targeted cancer therapies are (1) to deliver a high dose of an anticancer drug directly to a cancer tumor, (2) to enhance drug uptake by malignant cells, and (3) to minimize drug uptake by nonmalignant cells. Effective cancer-targeting therapies will require both passive- and active targeting strategies and a thorough understanding of physiologic barriers to targeted drug delivery. Designing a targeted therapy includes the selection and optimization of a nanoparticle delivery vehicle for passive accumulation in tumors, a targeting moiety for active receptor-mediated uptake, and stimuli-responsive polymers for control of drug release. The future direction of cancer targeting is a combinatorial approach, in which targeting therapies are designed to use multiple targeting strategies. The combinatorial approach will enable combination therapy for delivery of multiple drugs and dual ligand targeting to improve targeting specificity. Targeted cancer treatments in development and the new combinatorial approaches show promise for improving targeted anticancer drug delivery and improving treatment outcomes. PMID:23768509
Besalú, Emili
2016-01-01
The Superposing Significant Interaction Rules (SSIR) method is described. It is a general combinatorial and symbolic procedure able to rank compounds belonging to combinatorial analogue series. The procedure generates structure-activity relationship (SAR) models and also serves as an inverse SAR tool. The method is fast and can deal with large databases. SSIR operates from statistical significances calculated from the available library of compounds and according to the previously attached molecular labels of interest or non-interest. The required symbolic codification allows dealing with almost any combinatorial data set, even in a confidential manner, if desired. The application example categorizes molecules as binding or non-binding, and consensus ranking SAR models are generated from training and two distinct cross-validation methods: leave-one-out and balanced leave-two-out (BL2O), the latter being suited for the treatment of binary properties. PMID:27240346
NASA Technical Reports Server (NTRS)
Lee, Jonathan A.
2005-01-01
High-throughput measurement techniques are reviewed for solid phase transformation in materials produced by combinatorial methods, which are highly efficient concepts for fabricating a large variety of material libraries with different compositional gradients on a single wafer. Combinatorial methods hold high potential for reducing the time and costs associated with the development of new materials, as compared to time-consuming and labor-intensive conventional methods that test large batches of material, one composition at a time. These high-throughput techniques can be automated to rapidly capture and analyze data, using the entire material library on a single wafer, thereby accelerating the pace of materials discovery and knowledge generation for solid phase transformations. The review covers experimental techniques that are applicable to inorganic materials such as shape memory alloys, graded materials, metal hydrides, ferric materials, semiconductors and industrial alloys.
Yu, Xue; Chen, Wei-Neng; Gu, Tianlong; Zhang, Huaxiang; Yuan, Huaqiang; Kwong, Sam; Zhang, Jun
2018-07-01
This paper studies a specific class of multiobjective combinatorial optimization problems (MOCOPs), namely the permutation-based MOCOPs. Many commonly seen MOCOPs, e.g., the multiobjective traveling salesman problem (MOTSP) and the multiobjective project scheduling problem (MOPSP), belong to this problem class, and they can be very different. However, as the permutation-based MOCOPs share the inherent similarity that the structure of their search space is usually in the shape of a permutation tree, this paper proposes a generic multiobjective set-based particle swarm optimization methodology based on decomposition, termed MS-PSO/D. In order to coordinate with the properties of permutation-based MOCOPs, MS-PSO/D utilizes an element-based representation and a constructive approach. Through this, feasible solutions under constraints can be generated step by step following the permutation-tree-shaped structure. Problem-related heuristic information is introduced into the constructive approach for efficiency. In order to address the multiobjective optimization issues, the decomposition strategy is employed, in which the problem is converted into multiple single-objective subproblems according to a set of weight vectors. Besides, a flexible mechanism for diversity control is provided in MS-PSO/D. Extensive experiments have been conducted to study MS-PSO/D on two permutation-based MOCOPs, namely the MOTSP and the MOPSP. Experimental results validate that the proposed methodology is promising.
NASA Astrophysics Data System (ADS)
Youl Jung, Kyeong
2010-08-01
Conventional solution-based combinatorial chemistry was combined with spray pyrolysis and applied to optimize the luminescence properties of (Yx,Gdy,Alz)BO3:Eu3+ red phosphor under vacuum ultraviolet (VUV) excitation. For the Y-Gd-Al ternary system, a compositional library was established to seek the optimal composition at which the highest luminescence under VUV (147 nm) excitation could be achieved. The Al content was found to mainly control the relative peak ratio (R/O) of red and orange colors due to the 5D0→7F2 and 5D0→7F1 transitions of Eu3+. The substitution of Gd atoms in place of Y sites did not contribute to changing the R/O ratio, but was helpful in enhancing the emission intensity. As a result, the 613 nm emission peak due to the 5D0→7F2 transition of Eu3+ was intensified by increasing the Al/Gd ratio at a fixed Y content, resulting in improvement of the color coordinate. Finally, the optimized host composition was (Y0.11,Gd0.10,Al0.79)BO3 in terms of the emission intensity at 613 nm and the color coordinate.
Legrand, Yves-Marie; van der Lee, Arie; Barboiu, Mihail
2007-11-12
In this paper we report an extended series of 2,6-(iminoarene)pyridine-type ZnII complexes [(Lii)2Zn]II, which were surveyed for their ability to self-exchange both their ligands and their aromatic arms and to form different homoduplex and heteroduplex complexes in solution. The self-sorting of heteroduplex complexes is likely to be the result of geometric constraints. Whereas the imine-exchange process occurs quantitatively in 1:1 mixtures of [(Lii)2Zn]II complexes, the octahedral coordination process around the metal ion defines spatial-frustrated exchanges that involve the selective formation of heterocomplexes of two, by two different substituents; the bulkiest ones (pyrene in principle) specifically interact with the pseudoterpyridine core, sterically hindering the least bulky ones, which are intermolecularly stacked with similar ligands of neighboring molecules. Such a self-sorting process defined by the specific self-constitution of the ligands exchanging their aromatic substituents is self-optimized by a specific control over their spatial orientation around a metal center within the complex. They ultimately show an improved charge-transfer energy function by virtue of the dynamic amplification of self-optimized heteroduplex architectures. These systems therefore illustrate the convergence of the combinatorial self-sorting of the dynamic combinatorial libraries (DCLs) strategy and the constitutional self-optimized function.
Exact and Metaheuristic Approaches for a Bi-Objective School Bus Scheduling Problem
Chen, Xiaopan; Kong, Yunfeng; Dang, Lanxue; Hou, Yane; Ye, Xinyue
2015-01-01
As a class of hard combinatorial optimization problems, the school bus routing problem has received considerable attention in the last decades. For a multi-school system, given the bus trips for each school, the school bus scheduling problem aims at optimizing bus schedules to serve all the trips within the school time windows. In this paper, we propose two approaches for solving the bi-objective school bus scheduling problem: an exact method of mixed integer programming (MIP) and a metaheuristic method which combines simulated annealing with local search. We develop MIP formulations for homogenous and heterogeneous fleet problems respectively and solve the models by MIP solver CPLEX. The bus type-based formulation for heterogeneous fleet problem reduces the model complexity in terms of the number of decision variables and constraints. The metaheuristic method is a two-stage framework for minimizing the number of buses to be used as well as the total travel distance of buses. We evaluate the proposed MIP and the metaheuristic method on two benchmark datasets, showing that on both instances, our metaheuristic method significantly outperforms the respective state-of-the-art methods. PMID:26176764
Measurement configuration optimization for dynamic metrology using Stokes polarimetry
NASA Astrophysics Data System (ADS)
Liu, Jiamin; Zhang, Chuanwei; Zhong, Zhicheng; Gu, Honggang; Chen, Xiuguo; Jiang, Hao; Liu, Shiyuan
2018-05-01
As dynamic loading experiments such as shock compression tests are usually characterized by short duration, unrepeatability and high costs, high temporal resolution and precise accuracy of the measurements are required. In this paper, a Stokes polarimeter with six parallel channels and a temporal resolution on the ten-nanosecond scale has been developed to capture such instantaneous changes in optical properties. Since the measurement accuracy heavily depends on the configuration of the probing beam incident angle and the polarizer azimuth angle, it is important to select an optimal combination from the numerous options. In this paper, a systematic error-propagation-based measurement configuration optimization method corresponding to the Stokes polarimeter is proposed. The maximal Frobenius norm of the combinatorial matrix of the configuration error propagating matrix and the intrinsic error propagating matrix is introduced to assess the measurement accuracy. The optimal configuration for thickness measurement of a SiO2 thin film deposited on a Si substrate has been achieved by minimizing the merit function. Simulation and experimental results show good agreement between the optimal measurement configuration achieved experimentally using the polarimeter and the theoretical prediction. In particular, the experimental results show that the relative error in the thickness measurement can be reduced from 6% to 1% by using the optimal polarizer azimuth angle when the incident angle is 45°. Furthermore, the optimal configuration for the dynamic metrology of a nickel foil under quasi-dynamic loading is investigated using the proposed optimization method.
Software-supported USER cloning strategies for site-directed mutagenesis and DNA assembly.
Genee, Hans Jasper; Bonde, Mads Tvillinggaard; Bagger, Frederik Otzen; Jespersen, Jakob Berg; Sommer, Morten O A; Wernersson, Rasmus; Olsen, Lars Rønn
2015-03-20
USER cloning is a fast and versatile method for the engineering of plasmid DNA. We have developed a user-friendly Web server tool that automates the design of optimal PCR primers for several distinct USER cloning-based applications. Our Web server, named AMUSER (Automated DNA Modifications with USER cloning), facilitates DNA assembly and the introduction of virtually any type of site-directed mutagenesis by designing optimal PCR primers for the desired genetic changes. To demonstrate the utility, we designed primers for a simultaneous two-position site-directed mutagenesis of green fluorescent protein (GFP) to yellow fluorescent protein (YFP), which in a single-step reaction resulted in a 94% cloning efficiency. AMUSER also supports degenerate nucleotide primers, single-insert combinatorial assembly, and flexible parameters for PCR amplification. AMUSER is freely available online at http://www.cbs.dtu.dk/services/AMUSER/.
Joint optimization of maintenance, buffers and machines in manufacturing lines
NASA Astrophysics Data System (ADS)
Nahas, Nabil; Nourelfath, Mustapha
2018-01-01
This article considers a series manufacturing line composed of several machines separated by intermediate buffers of finite capacity. The goal is to find the optimal number of preventive maintenance actions performed on each machine, the optimal selection of machines and the optimal buffer allocation plan that minimize the total system cost, while providing the desired system throughput level. The mean times between failures of all machines are assumed to increase when applying periodic preventive maintenance. To estimate the production line throughput, a decomposition method is used. The decision variables in the formulated optimal design problem are buffer levels, types of machines and times between preventive maintenance actions. Three heuristic approaches are developed to solve the formulated combinatorial optimization problem. The first heuristic consists of a genetic algorithm, the second is based on the nonlinear threshold accepting metaheuristic and the third is an ant colony system. The proposed heuristics are compared and their efficiency is shown through several numerical examples. It is found that the nonlinear threshold accepting algorithm outperforms the genetic algorithm and ant colony system, while the genetic algorithm provides better results than the ant colony system for longer manufacturing lines.
A survey about methods dedicated to epistasis detection.
Niel, Clément; Sinoquet, Christine; Dina, Christian; Rocheleau, Ghislain
2015-01-01
During the past decade, findings of genome-wide association studies (GWAS) improved our knowledge and understanding of disease genetics. To date, thousands of SNPs have been associated with diseases and other complex traits. Statistical analysis typically looks for association between a phenotype and a SNP taken individually via single-locus tests. However, geneticists admit this is an oversimplified approach to tackle the complexity of underlying biological mechanisms. Interaction between SNPs, namely epistasis, must be considered. Unfortunately, epistasis detection gives rise to analytic challenges since analyzing every SNP combination is at present impractical at a genome-wide scale. In this review, we will present the main strategies recently proposed to detect epistatic interactions, along with their operating principle. Some of these methods are exhaustive, such as multifactor dimensionality reduction, likelihood ratio-based tests or receiver operating characteristic curve analysis; some are non-exhaustive, such as machine learning techniques (random forests, Bayesian networks) or combinatorial optimization approaches (ant colony optimization, computational evolution system).
A traveling salesman approach for predicting protein functions.
Johnson, Olin; Liu, Jing
2006-10-12
Protein-protein interaction information can be used to predict unknown protein functions and to help study biological pathways. Here we present a new approach utilizing the classic Traveling Salesman Problem to study protein-protein interactions and to predict protein functions in the budding yeast Saccharomyces cerevisiae. We apply the global optimization tool from combinatorial optimization algorithms to cluster the yeast proteins based on the global protein interaction information. We then use this clustering information to help us predict protein functions. We use our algorithm together with the direct neighbor algorithm [1] on characterized proteins and compare the prediction accuracy of the two methods. We show our algorithm can produce better predictions than the direct neighbor algorithm, which only considers the immediate neighbors of the query protein. Our method is a promising one to be used as a general tool to predict functions of uncharacterized proteins and a successful example of using computer science knowledge and algorithms to study biological problems.
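As a rough illustration of the idea (not the authors' pipeline), the sketch below builds a nearest-neighbour tour over a made-up protein distance matrix; proteins adjacent on a good tour tend to be similar, so cutting the tour at its longest edges suggests cluster boundaries.

```python
# Hedged sketch: nearest-neighbour TSP tour over a synthetic distance matrix,
# used as a cheap proxy for tour-based clustering of interaction data.
import numpy as np

rng = np.random.default_rng(2)
n = 8
D = rng.random((n, n)); D = (D + D.T) / 2; np.fill_diagonal(D, 0)   # toy distances

def nearest_neighbour_tour(D, start=0):
    unvisited = set(range(len(D))) - {start}
    tour = [start]
    while unvisited:
        nxt = min(unvisited, key=lambda j: D[tour[-1], j])
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

tour = nearest_neighbour_tour(D)
edges = [(tour[i], tour[i + 1], D[tour[i], tour[i + 1]]) for i in range(n - 1)]
print("tour:", tour)
print("longest edge (a natural cluster boundary):", max(edges, key=lambda e: e[2]))
```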
Sentence Processing in an Artificial Language: Learning and Using Combinatorial Constraints
ERIC Educational Resources Information Center
Amato, Michael S.; MacDonald, Maryellen C.
2010-01-01
A study combining artificial grammar and sentence comprehension methods investigated the learning and online use of probabilistic, nonadjacent combinatorial constraints. Participants learned a small artificial language describing cartoon monsters acting on objects. Self-paced reading of sentences in the artificial language revealed comprehenders'…
Kasperkiewicz, Paulina; Poreba, Marcin; Snipas, Scott J.; Parker, Heather; Winterbourn, Christine C.; Salvesen, Guy S.; Drag, Marcin
2014-01-01
The exploration of protease substrate specificity is generally restricted to naturally occurring amino acids, limiting the degree of conformational space that can be surveyed. We substantially enhanced this by incorporating 102 unnatural amino acids to explore the S1–S4 pockets of human neutrophil elastase. This approach provides hybrid natural and unnatural amino acid sequences, and thus we termed it the Hybrid Combinatorial Substrate Library. Library results were validated by the synthesis of individual tetrapeptide substrates, with the optimal substrate demonstrating more than three orders of magnitude higher catalytic efficiency than commonly used substrates of elastase. This optimal substrate was converted to an activity-based probe that demonstrated high selectivity and revealed the specific presence of active elastase during the process of neutrophil extracellular trap formation. We propose that this approach can be successfully used for any type of endopeptidase to deliver high activity and selectivity in substrates and probes. PMID:24550277
Optimizing Sensor and Actuator Arrays for ASAC Noise Control
NASA Technical Reports Server (NTRS)
Palumbo, Dan; Cabell, Ran
2000-01-01
This paper summarizes the development of an approach to optimizing the locations for arrays of sensors and actuators in active noise control systems. A type of directed combinatorial search, called Tabu Search, is used to select an optimal configuration from a much larger set of candidate locations. The benefit of using an optimized set is demonstrated. The importance of limiting actuator forces to realistic levels when evaluating the cost function is discussed. Results of flight testing an optimized system are presented. Although the technique has been applied primarily to Active Structural Acoustic Control systems, it can be adapted for use in other active noise control implementations.
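A generic tabu-search sketch for this kind of subset-selection problem is shown below; the candidate set, cost function and tabu tenure are placeholders for illustration, not the flight-tested ASAC system.

```python
# Generic tabu search for picking k locations out of a candidate set: swap one
# selected location for an unselected one each step, keeping a tabu list of
# recently swapped locations to avoid cycling. Objective is a placeholder.
import random

random.seed(0)
candidates = list(range(30))                 # 30 candidate locations
k, steps, tabu_len = 6, 200, 7
weights = [random.random() for _ in candidates]

def cost(selection):                         # placeholder objective to minimize
    return abs(sum(weights[i] for i in selection) - 2.5)

current = set(random.sample(candidates, k))
best, best_cost = set(current), cost(current)
tabu = []
for _ in range(steps):
    moves = [(i, j) for i in current for j in candidates
             if j not in current and i not in tabu and j not in tabu]
    i, j = min(moves, key=lambda m: cost(current - {m[0]} | {m[1]}))
    current = current - {i} | {j}
    tabu = (tabu + [i, j])[-tabu_len:]       # keep only the most recent moves
    if cost(current) < best_cost:
        best, best_cost = set(current), cost(current)
print(sorted(best), round(best_cost, 4))
```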
Pantazes, Robert J; Saraf, Manish C; Maranas, Costas D
2007-08-01
In this paper, we introduce and test two new sequence-based protein scoring systems (i.e. S1, S2) for assessing the likelihood that a given protein hybrid will be functional. By binning together amino acids with similar properties (i.e. volume, hydrophobicity and charge) the scoring systems S1 and S2 allow for the quantification of the severity of mismatched interactions in the hybrids. The S2 scoring system is found to be able to significantly functionally enrich a cytochrome P450 library over other scoring methods. Given this scoring base, we subsequently constructed two separate optimization formulations (i.e. OPTCOMB and OPTOLIGO) for optimally designing protein combinatorial libraries involving recombination or mutations, respectively. Notably, two separate versions of OPTCOMB are generated (i.e. model M1, M2) with the latter allowing for position-dependent parental fragment skipping. Computational benchmarking results demonstrate the efficacy of models OPTCOMB and OPTOLIGO to generate high scoring libraries of a prespecified size.
Combinatorial Algorithms for Portfolio Optimization Problems - Case of Risk Moderate Investor
NASA Astrophysics Data System (ADS)
Juarna, A.
2017-03-01
The portfolio optimization problem is the problem of finding the optimal combination of n stocks from N ≥ n available stocks that gives maximal aggregate return and minimal aggregate risk. In this paper we are given N = 43 stocks from the IDX (Indonesia Stock Exchange) group of the 45 most-traded stocks, known as the LQ45, with p = 24 monthly returns for each stock, spanning the interval 2013-2014. This is in fact a combinatorial problem, and the algorithm is constructed based on two considerations: a risk-moderate type of investor and the maximum allowed correlation coefficient between every two eligible stocks. The main output from the implementation of the algorithm is a set of curves of three portfolio attributes, namely the size, the ratio of return to risk, and the percentage of negative correlation coefficients between chosen stocks, as functions of the maximum allowed correlation coefficient between each two stocks. The output curves show that the portfolio contains three stocks, with a return-to-risk ratio of 14.57, if the maximum allowed correlation coefficient between every two eligible stocks is negative, and contains 19 stocks, with a maximum return-to-risk ratio of 25.48, when the maximum allowed correlation coefficient is 0.17.
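The combinatorial screening idea can be sketched with synthetic returns (not the LQ45 data): enumerate stock subsets whose pairwise correlations stay below a cap and report the equally weighted portfolio with the best return-to-risk ratio.

```python
# Sketch of the combinatorial portfolio screen with made-up monthly returns.
import itertools
import numpy as np

rng = np.random.default_rng(3)
n_stocks, n_months = 10, 24
returns = rng.normal(0.01, 0.05, size=(n_months, n_stocks))   # synthetic monthly returns
corr = np.corrcoef(returns, rowvar=False)

def best_portfolio(max_corr, size):
    best = None
    for combo in itertools.combinations(range(n_stocks), size):
        if any(corr[a, b] > max_corr for a, b in itertools.combinations(combo, 2)):
            continue                                           # violates the correlation cap
        port = returns[:, list(combo)].mean(axis=1)            # equal weights
        ratio = port.mean() / port.std()
        if best is None or ratio > best[0]:
            best = (ratio, combo)
    return best

print(best_portfolio(max_corr=0.2, size=3))
```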
Bruhn, Peter; Geyer-Schulz, Andreas
2002-01-01
In this paper, we introduce genetic programming over context-free languages with linear constraints for combinatorial optimization, apply this method to several variants of the multidimensional knapsack problem, and discuss its performance relative to Michalewicz's genetic algorithm with penalty functions. With respect to Michalewicz's approach, we demonstrate that genetic programming over context-free languages with linear constraints improves convergence. A final result is that genetic programming over context-free languages with linear constraints is ideally suited to modeling complementarities between items in a knapsack problem: The more complementarities in the problem, the stronger the performance in comparison to its competitors.
Non-rigid image registration using graph-cuts.
Tang, Tommy W H; Chung, Albert C S
2007-01-01
Non-rigid image registration is an ill-posed yet challenging problem due to its extremely high number of degrees of freedom and the inherent requirement of smoothness. The graph-cuts method is a powerful combinatorial optimization tool which has been successfully applied to image segmentation and stereo matching. Under some specific constraints, the graph-cuts method yields either a global minimum or a local minimum in a strong sense. Thus, it is interesting to see the effects of using graph-cuts in non-rigid image registration. In this paper, we formulate non-rigid image registration as a discrete labeling problem. Each pixel in the source image is assigned a displacement label (which is a vector) indicating the position in the floating image to which it spatially corresponds. A smoothness constraint based on the first derivative is used to penalize sharp changes in displacement labels across pixels. The whole system can be optimized by using the graph-cuts method via alpha-expansions. We compare 2D and 3D registration results of our method with two state-of-the-art approaches. Our method is found to be more robust across different challenging non-rigid registration cases, with higher registration accuracy.
NASA Astrophysics Data System (ADS)
Moghaddam, Kamran S.; Usher, John S.
2011-07-01
In this article, a new multi-objective optimization model is developed to determine the optimal preventive maintenance and replacement schedules in a repairable and maintainable multi-component system. In this model, the planning horizon is divided into discrete and equally sized periods in which three possible actions must be planned for each component, namely maintenance, replacement, or do nothing. The objective is to determine a plan of actions for each component in the system that minimizes the total cost and maximizes overall system reliability simultaneously over the planning horizon. Because of the complex, combinatorial and highly nonlinear structure of the mathematical model, two metaheuristic solution methods, a generational genetic algorithm and simulated annealing, are applied to tackle the problem. The Pareto optimal solutions that provide good trade-offs between the total cost and the overall reliability of the system can be obtained by the solution approach. Such a modeling approach should be useful for maintenance planners and engineers tasked with the problem of developing recommended maintenance plans for complex systems of components.
XY vs X Mixer in Quantum Alternating Operator Ansatz for Optimization Problems with Constraints
NASA Technical Reports Server (NTRS)
Wang, Zhihui; Rubin, Nicholas; Rieffel, Eleanor G.
2018-01-01
The Quantum Approximate Optimization Algorithm, further generalized as the Quantum Alternating Operator Ansatz (QAOA), is a family of algorithms for combinatorial optimization problems. It is a leading candidate to run on emerging universal quantum computers to gain insight into quantum heuristics. In constrained optimization, penalties are often introduced so that the ground state of the cost Hamiltonian encodes the solution (a standard practice in quantum annealing). An alternative is to choose a mixing Hamiltonian such that the constraint corresponds to a constant of motion and the quantum evolution stays in the feasible subspace. Better performance of the algorithm is expected because of the much smaller search space. We consider problems with a constant Hamming weight as the constraint. We also compare different methods of generating the generalized W-state, which serves as a natural initial state for the Hamming-weight constraint. Using graph coloring as an example, we compare the performance of the XY model as a mixer that preserves the Hamming weight with the performance of adding a penalty term to the cost Hamiltonian.
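The key property exploited above, that an XY mixer preserves Hamming weight, can be checked numerically on a few qubits; the toy sketch below builds a 3-qubit ring-XY mixer and verifies that it commutes with the total-Z operator, which labels the Hamming-weight sectors.

```python
# Toy check (3 qubits): the ring-XY mixing Hamiltonian commutes with total Z,
# so evolution under it stays inside a fixed Hamming-weight subspace.
import numpy as np

I = np.eye(2)
X = np.array([[0, 1], [1, 0]])
Y = np.array([[0, -1j], [1j, 0]])
Z = np.array([[1, 0], [0, -1]])

def op(single, site, n=3):
    # tensor-product operator acting with `single` on `site`, identity elsewhere
    out = np.array([[1.0]])
    for q in range(n):
        out = np.kron(out, single if q == site else I)
    return out

n = 3
H_xy = sum(op(X, i) @ op(X, (i + 1) % n) + op(Y, i) @ op(Y, (i + 1) % n)
           for i in range(n)) / 2
total_Z = sum(op(Z, i) for i in range(n))           # fixes Hamming-weight sectors
comm = H_xy @ total_Z - total_Z @ H_xy
print(np.allclose(comm, 0))                          # True: weight is conserved
```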
2012-01-01
Scenario-based training exemplifies the learning-by-doing approach to human performance improvement. In this paper, we enumerate ... through a narrative, mission, quest, or scenario. In this paper we argue for a combinatorial optimization search approach to selecting and ordering ... the role of an expert for the purposes of practicing skills and knowledge in realistic situations in a learning-by-doing approach to performance
Investigation and Implementation of Matrix Permanent Algorithms for Identity Resolution
2014-12-01
calculation of the permanent of a matrix whose dimension is a function of target count [21]. However, the optimal approach for computing the permanent is ... presently unclear. The primary objective of this project was to determine the optimal computing strategy(-ies) for the matrix permanent in tactical and ... solving various combinatorial problems (see [16] for details and applications to a wide variety of problems) and thus can be applied to compute a
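One standard exact option for the permanent is Ryser's inclusion-exclusion formula; the sketch below (ours, not necessarily the strategy selected in the report) implements it and checks it against the brute-force definition on a small random matrix.

```python
# Ryser's formula: perm(A) = (-1)^n * sum over subsets S of columns of
# (-1)^|S| * prod_i sum_{j in S} A[i, j]; verified against the definition.
import itertools
import numpy as np

def permanent_ryser(A):
    n = A.shape[0]
    total = 0.0
    for r in range(1, n + 1):
        for cols in itertools.combinations(range(n), r):
            total += (-1) ** r * np.prod(A[:, list(cols)].sum(axis=1))
    return (-1) ** n * total

def permanent_bruteforce(A):
    n = A.shape[0]
    return sum(np.prod(A[np.arange(n), list(p)])
               for p in itertools.permutations(range(n)))

A = np.random.default_rng(0).random((5, 5))
print(permanent_ryser(A), permanent_bruteforce(A))   # the two values agree
```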
Landscape Encodings Enhance Optimization
Klemm, Konstantin; Mehta, Anita; Stadler, Peter F.
2012-01-01
Hard combinatorial optimization problems deal with the search for the minimum cost solutions (ground states) of discrete systems under strong constraints. A transformation of state variables may enhance computational tractability. It has been argued that these state encodings are to be chosen invertible to retain the original size of the state space. Here we show how redundant non-invertible encodings enhance optimization by enriching the density of low-energy states. In addition, smooth landscapes may be established on encoded state spaces to guide local search dynamics towards the ground state. PMID:22496860
Optimal placement of excitations and sensors for verification of large dynamical systems
NASA Technical Reports Server (NTRS)
Salama, M.; Rose, T.; Garba, J.
1987-01-01
The computationally difficult problem of the optimal placement of excitations and sensors to maximize the observed measurements is studied within the framework of combinatorial optimization, and is solved numerically using a variation of the simulated annealing heuristic algorithm. Results of numerical experiments including a square plate and a 960 degrees-of-freedom Control of Flexible Structure (COFS) truss structure, are presented. Though the algorithm produces suboptimal solutions, its generality and simplicity allow the treatment of complex dynamical systems which would otherwise be difficult to handle.
NASA Astrophysics Data System (ADS)
Cai, Xiaohui; Liu, Yang; Ren, Zhiming
2018-06-01
Reverse-time migration (RTM) is a powerful tool for imaging geologically complex structures such as steep dips and subsalt. However, its implementation is quite computationally expensive. Recently, as a low-cost solution, the graphics processing unit (GPU) was introduced to improve the efficiency of RTM. In this paper, we develop three strategies to improve the implementation of RTM on a GPU card. First, given the high accuracy and efficiency of the adaptive optimal finite-difference (FD) method based on least squares (LS) on the central processing unit (CPU), we study the optimal LS-based FD method on the GPU. Second, we extend the CPU-based hybrid absorbing boundary condition (ABC) to a GPU-based one by addressing two issues that arise when the former is ported to the GPU card: it is time-consuming and its threads are chaotic. Third, for large-scale data, a combinatorial strategy of optimal checkpointing and efficient boundary storage is introduced to trade off memory against recomputation. To save the time of communication between host and disk, a portable operating system interface (POSIX) thread is utilized to employ another CPU core at the checkpoints. Applications of the three strategies in RTM on the GPU with the compute unified device architecture (CUDA) programming language demonstrate their efficiency and validity.
A combinatorial approach to protein docking with flexible side chains.
Althaus, Ernst; Kohlbacher, Oliver; Lenhof, Hans-Peter; Müller, Peter
2002-01-01
Rigid-body docking approaches are not sufficient to predict the structure of a protein complex from the unbound (native) structures of the two proteins. Accounting for side chain flexibility is an important step towards fully flexible protein docking. This work describes an approach that allows conformational flexibility for the side chains while keeping the protein backbone rigid. Starting from candidates created by a rigid-docking algorithm, we demangle the side chains of the docking site, thus creating reasonable approximations of the true complex structure. These structures are ranked with respect to the binding free energy. We present two new techniques for side chain demangling. Both approaches are based on a discrete representation of the side chain conformational space by the use of a rotamer library. This leads to a combinatorial optimization problem. For the solution of this problem, we propose a fast heuristic approach and an exact, albeit slower, method that uses branch-and-cut techniques. As a test set, we use the unbound structures of three proteases and the corresponding protein inhibitors. For each of the examples, the highest-ranking conformation produced was a good approximation of the true complex structure.
A ripple-spreading genetic algorithm for the aircraft sequencing problem.
Hu, Xiao-Bing; Di Paolo, Ezequiel A
2011-01-01
When genetic algorithms (GAs) are applied to combinatorial problems, permutation representations are usually adopted. As a result, such GAs are often confronted with feasibility and memory-efficiency problems. With the aircraft sequencing problem (ASP) as a study case, this paper reports on a novel binary-representation-based GA scheme for combinatorial problems. Unlike existing GAs for the ASP, which typically use permutation representations based on aircraft landing order, the new GA introduces a novel ripple-spreading model which transforms the original landing-order-based ASP solutions into value-based ones. In the new scheme, arriving aircraft are projected as points into an artificial space. A deterministic method inspired by the natural phenomenon of ripple-spreading on liquid surfaces is developed, which uses a few parameters as input to connect points on this space to form a landing sequence. A traditional GA, free of feasibility and memory-efficiency problems, can then be used to evolve the ripple-spreading related parameters in order to find an optimal sequence. Since the ripple-spreading model is the centerpiece of the new algorithm, it is called the ripple-spreading GA (RSGA). The advantages of the proposed RSGA are illustrated by extensive comparative studies for the case of the ASP.
"One-sample concept" micro-combinatory for high throughput TEM of binary films.
Sáfrán, György
2018-04-01
Phases of thin films may differ remarkably from those of the bulk. Unlike the comprehensive data files of Binary Phase Diagrams [1] available for bulk materials, complete phase maps for thin binary layers do not exist. This is due to both the diverse metastable, non-equilibrium or unstable phases feasible in thin films and the volume of characterization work required with analytical techniques like TEM, SAED and EDS. The aim of the present work was to develop a method that remarkably facilitates the TEM study of the diverse binary phases of thin films, and the creation of phase maps. A micro-combinatorial method was worked out that enables both preparation and study of a gradient two-component film within a single TEM specimen. For a demonstration of the technique, thin MnxAl1-x binary samples with concentration evolving from x = 0 to x = 1 have been prepared so that the transition from pure Mn to pure Al covers a 1.5 mm long track within the 3 mm diameter TEM grid. The proposed method enables the preparation and study of thin combinatorial samples including all feasible phases as a function of composition or other deposition parameters. Contrary to known "combinatorial chemistry", in which a series of different samples is deposited in one run and investigated one at a time, the present micro-combinatorial method produces a single specimen condensing a complete library of a binary system that can be studied efficiently within a single TEM session. This provides extremely high throughput for TEM characterization of composition-dependent phases, exploration of new materials, or the construction of phase diagrams of binary films. Copyright © 2018 Elsevier B.V. All rights reserved.
A comparison of approaches for finding minimum identifying codes on graphs
NASA Astrophysics Data System (ADS)
Horan, Victoria; Adachi, Steve; Bak, Stanley
2016-05-01
In order to formulate mathematical conjectures likely to be true, a number of base cases must be determined. However, many combinatorial problems are NP-hard, and their computational complexity makes this research approach difficult using standard brute force on a typical computer. One sample problem explored is that of finding a minimum identifying code. To work around the computational issues, a variety of methods are explored: a parallel computing approach using MATLAB, an adiabatic quantum optimization approach using a D-Wave quantum annealing processor, and satisfiability modulo theories (SMT) with corresponding SMT solvers. Each of these methods requires the problem to be formulated in a unique manner. In this paper, we address the challenges of computing solutions to this NP-hard problem with respect to each of these methods.
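For context, an identifying code of a graph is a vertex subset C such that every vertex's closed neighborhood intersects C in a non-empty set that is unique to that vertex. A brute-force baseline, the kind of computation that quickly becomes infeasible and motivates the parallel, quantum-annealing, and SMT approaches above, might look like the following sketch (a generic illustration, not the authors' code):

import itertools

def minimum_identifying_code(adj):
    # adj: dict mapping each vertex to an iterable of its neighbours
    closed = {v: frozenset(adj[v]) | {v} for v in adj}

    def identifies(code):
        seen = set()
        for v in adj:
            sig = closed[v] & code
            if not sig or sig in seen:        # empty or non-unique signature
                return False
            seen.add(sig)
        return True

    vertices = list(adj)
    for r in range(1, len(vertices) + 1):     # try the smallest subsets first
        for subset in itertools.combinations(vertices, r):
            if identifies(frozenset(subset)):
                return set(subset)
    return None   # e.g. graphs with "twin" vertices admit no identifying code

# toy example: the 4-cycle 0-1-2-3 has a minimum identifying code of size 3
print(minimum_identifying_code({0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}))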
Adapting an Ant Colony Metaphor for Multi-Robot Chemical Plume Tracing
Meng, Qing-Hao; Yang, Wei-Xing; Wang, Yang; Li, Fei; Zeng, Ming
2012-01-01
We consider chemical plume tracing (CPT) in time-varying airflow environments using multiple mobile robots. The purpose of CPT is to approach a gas source with a previously unknown location in a given area. Therefore, the CPT could be considered as a dynamic optimization problem in continuous domains. The traditional ant colony optimization (ACO) algorithm has been successfully used for combinatorial optimization problems in discrete domains. To adapt the ant colony metaphor to the multi-robot CPT problem, the two-dimensional continuous search area is discretized into grids and the virtual pheromone is updated according to both the gas concentration and wind information. To prevent the adapted ACO algorithm from being prematurely trapped in a local optimum, the upwind surge behavior is adopted by the robots with relatively higher gas concentration in order to explore more areas. The spiral surge (SS) algorithm is also examined for comparison. Experimental results using multiple real robots in two indoor naturally ventilated airflow environments show that the proposed CPT method performs better than the SS algorithm. The simulation results for large-scale advection-diffusion plume environments show that the proposed method could also work in outdoor meandering plume environments. PMID:22666056
Particle-Based Microarrays of Oligonucleotides and Oligopeptides.
Nesterov-Mueller, Alexander; Maerkle, Frieder; Hahn, Lothar; Foertsch, Tobias; Schillo, Sebastian; Bykovskaya, Valentina; Sedlmayr, Martyna; Weber, Laura K; Ridder, Barbara; Soehindrijo, Miriam; Muenster, Bastian; Striffler, Jakob; Bischoff, F Ralf; Breitling, Frank; Loeffler, Felix F
2014-10-28
In this review, we describe different methods of microarray fabrication based on the use of micro-particles/-beads and point out future tendencies in the development of particle-based arrays. First, we consider oligonucleotide bead arrays, where each bead is a carrier of one specific sequence of oligonucleotides. This bead-based array approach, appearing in the late 1990s, enabled high-throughput oligonucleotide analysis and had a large impact on genome research. Furthermore, we consider particle-based peptide array fabrication using combinatorial chemistry. In this approach, particles can directly participate in both the synthesis and the transfer of synthesized combinatorial molecules to a substrate. Subsequently, we describe in more detail the synthesis of peptide arrays with amino acid polymer particles, which imbed the amino acids inside their polymer matrix. By heating these particles, the polymer matrix is transformed into a highly viscous gel, and thereby, imbedded monomers are allowed to participate in the coupling reaction. Finally, we focus on combinatorial laser fusing of particles for the synthesis of high-density peptide arrays. This method combines the advantages of particles and combinatorial lithographic approaches.
BioPartsBuilder: a synthetic biology tool for combinatorial assembly of biological parts.
Yang, Kun; Stracquadanio, Giovanni; Luo, Jingchuan; Boeke, Jef D; Bader, Joel S
2016-03-15
Combinatorial assembly of DNA elements is an efficient method for building large-scale synthetic pathways from standardized, reusable components. These methods are particularly useful because they enable assembly of multiple DNA fragments in one reaction, at the cost of requiring that each fragment satisfies design constraints. We developed BioPartsBuilder as a biologist-friendly web tool to design biological parts that are compatible with DNA combinatorial assembly methods, such as Golden Gate and related methods. It retrieves biological sequences, enforces compliance with assembly design standards and provides a fabrication plan for each fragment. BioPartsBuilder is accessible at http://public.biopartsbuilder.org and an Amazon Web Services image is available from the AWS Market Place (AMI ID: ami-508acf38). Source code is released under the MIT license, and available for download at https://github.com/baderzone/biopartsbuilder. Contact: joel.bader@jhu.edu. Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press.
AFLOW: An Automatic Framework for High-throughput Materials Discovery
2011-11-14
Computational materials high-throughput (HT) applications include combinatorial discovery of superconductors [1] and Pareto-optimal search for alloys and catalysts [14, 15].
Combinatorial Pharmacophore-Based 3D-QSAR Analysis and Virtual Screening of FGFR1 Inhibitors
Zhou, Nannan; Xu, Yuan; Liu, Xian; Wang, Yulan; Peng, Jianlong; Luo, Xiaomin; Zheng, Mingyue; Chen, Kaixian; Jiang, Hualiang
2015-01-01
The fibroblast growth factor/fibroblast growth factor receptor (FGF/FGFR) signaling pathway plays crucial roles in cell proliferation, angiogenesis, migration, and survival. Aberration in FGFRs correlates with several malignancies and disorders. FGFRs have proved to be attractive targets for therapeutic intervention in cancer, and it is of high interest to find FGFR inhibitors with novel scaffolds. In this study, a combinatorial three-dimensional quantitative structure-activity relationship (3D-QSAR) model was developed based on previously reported FGFR1 inhibitors with diverse structural skeletons. This model was evaluated for its prediction performance on a diverse test set containing 232 FGFR inhibitors, and it yielded an SD of 0.75 pIC50 units from the measured inhibition affinities and a Pearson correlation coefficient (R2) of 0.53. This result suggests that the combinatorial 3D-QSAR model could be used to search for new FGFR1 hit structures and predict their potential activity. To further evaluate the performance of the model, a decoy set validation was used to measure the efficiency of the model by calculating the EF (enrichment factor). Based on the combinatorial pharmacophore model, a virtual screening against the SPECS database was performed. Nineteen novel active compounds were successfully identified, which provide new chemical starting points for further structural optimization of FGFR1 inhibitors. PMID:26110383
Cascaded Optimization for a Persistent Data Ferrying Unmanned Aircraft
NASA Astrophysics Data System (ADS)
Carfang, Anthony
This dissertation develops and assesses a cascaded method for designing optimal periodic trajectories and link schedules for an unmanned aircraft to ferry data between stationary ground nodes. This results in a fast solution method without the need to artificially constrain system dynamics. Focusing on a fundamental ferrying problem that involves one source and one destination, but includes complex vehicle and Radio-Frequency (RF) dynamics, a cascaded structure to the system dynamics is uncovered. This structure is exploited by reformulating the nonlinear optimization problem into one that reduces the independent control to the vehicle's motion, while the link scheduling control is folded into the objective function and implemented as an optimal policy that depends on candidate motion control. This formulation is proven to maintain optimality while reducing computation time in comparison to traditional ferry optimization methods. The discrete link scheduling problem takes the form of a combinatorial optimization problem that is known to be NP-Hard. A derived necessary condition for optimality guides the development of several heuristic algorithms, specifically the Most-Data-First Algorithm and the Knapsack Adaptation. These heuristics are extended to larger ferrying scenarios, and assessed analytically and through Monte Carlo simulation, showing better throughput performance in the same order of magnitude of computation time in comparison to other common link scheduling policies. The cascaded optimization method is implemented with a novel embedded software system on a small, unmanned aircraft to validate the simulation results with field experiments. To address the sensitivity of results on trajectory tracking performance, a system that combines motion and link control with waypoint-based navigation is developed and assessed through field experiments. The data ferrying algorithms are further extended by incorporating a Gaussian process to opportunistically learn the RF environment. By continuously improving RF models, the cascaded planner can continually improve the ferrying system's overall performance.
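To illustrate the greedy flavor of link-scheduling heuristics such as Most-Data-First, the sketch below steps through discretized time slots of a periodic ferry trajectory and, in each slot, activates whichever single link (receive from the source or deliver to the destination) moves the most data given the current onboard buffer. This is a simplified stand-in under assumed rate inputs, not the dissertation's exact algorithms.

def most_data_first_schedule(uplink_rate, downlink_rate, slot_len=1.0, buffer_cap=float("inf")):
    # uplink_rate[t]   : achievable source -> ferry rate in slot t
    # downlink_rate[t] : achievable ferry -> destination rate in slot t
    buffered, delivered, schedule = 0.0, 0.0, []
    for up, down in zip(uplink_rate, downlink_rate):
        recv = min(up * slot_len, buffer_cap - buffered)   # data we could take on board
        send = min(down * slot_len, buffered)              # data we could deliver
        if send >= recv:                                   # favor the action moving more data
            buffered -= send
            delivered += send
            schedule.append(("deliver", send))
        else:
            buffered += recv
            schedule.append(("receive", recv))
    return delivered, schedule

# hypothetical usage with per-slot rates sampled along one trajectory period:
# total, plan = most_data_first_schedule([5, 4, 1, 0], [0, 1, 4, 5])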
Experimental Design for Combinatorial and High Throughput Materials Development
NASA Astrophysics Data System (ADS)
Cawse, James N.
2002-12-01
In the past decade, combinatorial and high throughput experimental methods have revolutionized the pharmaceutical industry, allowing researchers to conduct more experiments in a week than was previously possible in a year. Now high throughput experimentation is rapidly spreading from its origins in the pharmaceutical world to larger industrial research establishments such as GE and DuPont, and even to smaller companies and universities. Consequently, researchers need to know the kinds of problems, desired outcomes, and appropriate patterns for these new strategies. Editor James Cawse's far-reaching study identifies and applies, with specific examples, these important new principles and techniques. Experimental Design for Combinatorial and High Throughput Materials Development progresses from methods that are now standard, such as gradient arrays, to mathematical developments that are breaking new ground. The former will be particularly useful to researchers entering the field, while the latter should inspire and challenge advanced practitioners. The book's contents are contributed by leading researchers in their respective fields. Chapters include: High Throughput Synthetic Approaches for the Investigation of Inorganic Phase Space; Combinatorial Mapping of Polymer Blends Phase Behavior; Split-Plot Designs; Artificial Neural Networks in Catalyst Development; and The Monte Carlo Approach to Library Design and Redesign. This book also contains over 200 useful charts and drawings. Industrial chemists, chemical engineers, materials scientists, and physicists working in combinatorial and high throughput chemistry will find James Cawse's study to be an invaluable resource.
Napolitano, Roberta; Soesbe, Todd C; De León-Rodríguez, Luis M; Sherry, A Dean; Udugamasooriya, D Gomika
2011-08-24
The sensitivity of magnetic resonance imaging (MRI) contrast agents is highly dependent on the rate of water exchange between the inner sphere of a paramagnetic ion and bulk water. Normally, identifying a paramagnetic complex that has optimal water exchange kinetics is done by synthesizing and testing one compound at a time. We report here a rapid, economical on-bead combinatorial synthesis of a library of imaging agents. Eighty different 1,4,7,10-tetraazacyclododecan-1,4,7,10-tetraacetic acid (DOTA)-tetraamide peptoid derivatives were prepared on beads using a variety of charged, uncharged but polar, hydrophobic, and variably sized primary amines. A single chemical exchange saturation transfer image of the on-bead library easily distinguished those compounds having the most favorable water exchange kinetics. This combinatorial approach will allow rapid screening of libraries of imaging agents to identify the chemical characteristics of a ligand that yield the most sensitive imaging agents. This technique could be automated and readily adapted to other types of MRI or magnetic resonance/positron emission tomography agents as well.
Kajiwara, Shota; Yamada, Ryosuke; Ogino, Hiroyasu
2018-04-10
Simple and cost-effective lipase expression host microorganisms are highly desirable. A combinatorial library strategy is used to improve the secretory expression of lipase from Bacillus thermocatenulatus (BTL2) in the culture supernatant of Saccharomyces cerevisiae. A plasmid library including expression cassettes composed of sequences encoding one of each of 15 promoters, 15 secretion signals, and 15 terminators derived from the yeast species S. cerevisiae, Pichia pastoris, and Hansenula polymorpha is constructed. The S. cerevisiae transformant YPH499/D4, comprising the H. polymorpha GAP promoter, S. cerevisiae SAG1 secretion signal, and P. pastoris AOX1 terminator, is selected by high-throughput screening. This transformant expresses BTL2 extracellularly at a level 130-fold higher than that of the control strain, comprising the S. cerevisiae PGK1 promoter, S. cerevisiae α-factor secretion signal, and S. cerevisiae PGK1 terminator, after cultivation for 72 h. This combinatorial library strategy holds promising potential for application in the optimization of the secretory expression of proteins in yeast. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Ant algorithms for discrete optimization.
Dorigo, M; Di Caro, G; Gambardella, L M
1999-01-01
This article presents an overview of recent work on ant algorithms, that is, algorithms for discrete optimization that took inspiration from the observation of ant colonies' foraging behavior, and introduces the ant colony optimization (ACO) metaheuristic. In the first part of the article the basic biological findings on real ants are reviewed and their artificial counterparts as well as the ACO metaheuristic are defined. In the second part of the article a number of applications of ACO algorithms to combinatorial optimization and routing in communications networks are described. We conclude with a discussion of related work and of some of the most important aspects of the ACO metaheuristic.
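A minimal ant colony optimization sketch for a symmetric traveling salesman instance, with pheromone-biased tour construction followed by evaporation and deposit, illustrates the combinatorial-optimization side of the metaheuristic. The constants and update rule below are generic textbook choices, not the specific ACO variants surveyed in the article.

import random

def aco_tsp(dist, n_ants=20, n_iters=100, alpha=1.0, beta=3.0, rho=0.5, q=1.0, seed=0):
    # dist: full symmetric distance matrix (list of lists) with dist[i][i] = 0
    rng = random.Random(seed)
    n = len(dist)
    tau = [[1.0] * n for _ in range(n)]                   # pheromone trails
    eta = [[0.0 if i == j else 1.0 / dist[i][j] for j in range(n)] for i in range(n)]
    best_tour, best_len = None, float("inf")
    for _ in range(n_iters):
        tours = []
        for _ in range(n_ants):
            start = rng.randrange(n)
            tour, unvisited = [start], set(range(n)) - {start}
            while unvisited:                              # probabilistic tour construction
                i = tour[-1]
                weights = [(j, (tau[i][j] ** alpha) * (eta[i][j] ** beta)) for j in unvisited]
                total = sum(w for _, w in weights)
                r, acc = rng.uniform(0.0, total), 0.0
                for j, w in weights:
                    acc += w
                    if acc >= r:
                        tour.append(j)
                        unvisited.remove(j)
                        break
            length = sum(dist[tour[k]][tour[(k + 1) % n]] for k in range(n))
            tours.append((tour, length))
            if length < best_len:
                best_tour, best_len = tour, length
        for i in range(n):                                # evaporation
            for j in range(n):
                tau[i][j] *= (1.0 - rho)
        for tour, length in tours:                        # deposit, weighted by tour quality
            for k in range(n):
                i, j = tour[k], tour[(k + 1) % n]
                tau[i][j] += q / length
                tau[j][i] += q / length
    return best_tour, best_len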
Sahib, Mouayad A.; Gambardella, Luca M.; Afzal, Wasif; Zamli, Kamal Z.
2016-01-01
Combinatorial test design is a test plan that aims to reduce the number of test cases systematically by choosing a subset of the test cases based on the combination of input variables. The subset covers all possible combinations of a given strength and hence tries to match the effectiveness of the exhaustive set. This mechanism of reduction has been used successfully in software testing research with t-way testing (where t indicates the interaction strength of combinations). Other systems may exhibit many similarities with this approach; hence, it could form an emerging application in different areas of research due to its usefulness. To this end, it has recently been applied successfully in a few research areas. In this paper, we explore the applicability of the combinatorial test design technique to the parameter design of a Fractional-Order Proportional-Integral-Derivative (FOPID) controller for an automatic voltage regulator (AVR) system. Throughout the paper, we justify this new application theoretically and practically through simulations. In addition, we report on first experiments indicating its practical use in this field. We design different algorithms and adapt other strategies to cover all the combinations with an optimum and effective test set. Our findings indicate that combinatorial test design can find the combinations that lead to an optimum design. Besides this, we also found that by increasing the strength of combination, we can approach the optimum design, to the point that with only a 4-way combinatorial set we can obtain the effectiveness of an exhaustive test set. This significantly reduces the number of tests needed and thus leads to an approach that optimizes the design of parameters quickly. PMID:27829025
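The reduction mechanism at the heart of t-way combinatorial test design can be made concrete for t = 2 with a simple greedy covering-array generator: every pair of values of every pair of parameters must appear in at least one test. The sketch below is a generic greedy construction under assumed parameter domains, not the specific strategies designed in the paper.

import itertools, random

def pairwise_tests(domains, candidates_per_round=50, seed=0):
    # domains: list of value lists, one per parameter
    rng = random.Random(seed)
    k = len(domains)
    uncovered = {((i, a), (j, b))
                 for i, j in itertools.combinations(range(k), 2)
                 for a in domains[i] for b in domains[j]}
    tests = []
    while uncovered:
        (i0, a0), (j0, b0) = next(iter(uncovered))     # force progress on this pair
        best, best_gain = None, -1
        for _ in range(candidates_per_round):
            cand = [rng.choice(d) for d in domains]
            cand[i0], cand[j0] = a0, b0
            gain = sum(1 for (i, a), (j, b) in uncovered
                       if cand[i] == a and cand[j] == b)
            if gain > best_gain:
                best, best_gain = tuple(cand), gain
        tests.append(best)
        uncovered = {p for p in uncovered
                     if not (best[p[0][0]] == p[0][1] and best[p[1][0]] == p[1][1])}
    return tests

# e.g. 4 parameters with 3 values each: 81 exhaustive tests vs. roughly 9-12 pairwise tests
suite = pairwise_tests([[0, 1, 2]] * 4)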
Comprehensive human transcription factor binding site map for combinatory binding motifs discovery.
Müller-Molina, Arnoldo J; Schöler, Hans R; Araúzo-Bravo, Marcos J
2012-01-01
To know the map between transcription factors (TFs) and their binding sites is essential to reverse engineer the regulation process. Only about 10%-20% of the transcription factor binding motifs (TFBMs) have been reported. This lack of data hinders understanding gene regulation. To address this drawback, we propose a computational method that exploits previously unused TF properties to discover the missing TFBMs and their sites in all human gene promoters. The method starts by predicting a dictionary of regulatory "DNA words." From this dictionary, it distills 4098 novel predictions. To disclose the crosstalk between motifs, an additional algorithm extracts TF combinatorial binding patterns, creating a collection of TF regulatory syntactic rules. Using these rules, we narrowed down a list of 504 novel motifs that appear frequently in syntax patterns. We tested the predictions against 509 known motifs, confirming that our system can reliably predict ab initio motifs with an accuracy of 81%, far higher than previous approaches. We found that on average, 90% of the discovered combinatorial binding patterns target at least 10 genes, suggesting that to control smaller gene sets in an independent manner, supplementary regulatory mechanisms are required. Additionally, we discovered that the new TFBMs and their combinatorial patterns convey biological meaning, targeting TFs and genes related to developmental functions. Thus, among all the possible available targets in the genome, the TFs tend to regulate other TFs and genes involved in developmental functions. We provide a comprehensive resource for regulation analysis that includes a dictionary of "DNA words," newly predicted motifs and their corresponding combinatorial patterns. Combinatorial patterns are a useful filter to discover TFBMs that play a major role in orchestrating other factors and thus are likely to lock/unlock cellular functional clusters.
Discovery of Cationic Polymers for Non-viral Gene Delivery using Combinatorial Approaches
Barua, Sutapa; Ramos, James; Potta, Thrimoorthy; Taylor, David; Huang, Huang-Chiao; Montanez, Gabriela; Rege, Kaushal
2015-01-01
Gene therapy is an attractive treatment option for diseases of genetic origin, including several cancers and cardiovascular diseases. While viruses are effective vectors for delivering exogenous genes to cells, concerns related to insertional mutagenesis, immunogenicity, lack of tropism, decay and high production costs necessitate the discovery of non-viral methods. Significant efforts have been focused on cationic polymers as non-viral alternatives for gene delivery. Recent studies have employed combinatorial syntheses and parallel screening methods for enhancing the efficacy of gene delivery, biocompatibility of the delivery vehicle, and overcoming cellular level barriers as they relate to polymer-mediated transgene uptake, transport, transcription, and expression. This review summarizes and discusses recent advances in combinatorial syntheses and parallel screening of cationic polymer libraries for the discovery of efficient and safe gene delivery systems. PMID:21843141
Mapping protein-protein interactions with phage-displayed combinatorial peptide libraries.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kay, B. K.; Castagnoli, L.; Biosciences Division
This unit describes the process and analysis of affinity selecting bacteriophage M13 from libraries displaying combinatorial peptides fused to either a minor or major capsid protein. Direct affinity selection uses target protein bound to a microtiter plate followed by purification of selected phage by ELISA. Alternatively, there is a bead-based affinity selection method. These methods allow one to readily isolate peptide ligands that bind to a protein target of interest and use the consensus sequence to search proteomic databases for putative interacting proteins.
NASA Astrophysics Data System (ADS)
Kang, Angray S.; Barbas, Carlos F.; Janda, Kim D.; Benkovic, Stephen J.; Lerner, Richard A.
1991-05-01
We describe a method based on a phagemid vector with helper phage rescue for the construction and rapid analysis of combinatorial antibody Fab libraries. This approach should allow the generation and selection of many monoclonal antibodies. Antibody genes are expressed in concert with phage morphogenesis, thereby allowing incorporation of functional Fab molecules along the surface of filamentous phage. The power of the method depends upon the linkage of recognition and replication functions and is not limited to antibody molecules.
Cooperative combinatorial optimization: evolutionary computation case study.
Burgin, Mark; Eberbach, Eugene
2008-01-01
This paper presents a formalization of the notion of cooperation and competition of multiple systems that work toward a common optimization goal of the population using evolutionary computation techniques. It is proved that evolutionary algorithms are more expressive than conventional recursive algorithms, such as Turing machines. Three classes of evolutionary computations are introduced and studied: bounded finite, unbounded finite, and infinite computations. Universal evolutionary algorithms are constructed. Such properties of evolutionary algorithms as completeness, optimality, and search decidability are examined. A natural extension of evolutionary Turing machine (ETM) model is proposed to properly reflect phenomena of cooperation and competition in the whole population.
Swarm Intelligence Optimization and Its Applications
NASA Astrophysics Data System (ADS)
Ding, Caichang; Lu, Lu; Liu, Yuanchao; Peng, Wenxiu
Swarm Intelligence is a computational and behavioral metaphor for solving distributed problems, inspired by biological examples provided by social insects such as ants, termites, bees, and wasps and by swarm, herd, flock, and shoal phenomena in vertebrates such as fish shoals and bird flocks. An example of a successful research direction in Swarm Intelligence is ant colony optimization (ACO), which focuses on combinatorial optimization problems. Ant algorithms can be viewed as multi-agent systems (ant colonies), where agents (individual ants) solve required tasks through cooperation, in the same way that ants create complex social behavior from the combined efforts of individuals.
Biosynthesis and Heterologous Production of Epothilones
NASA Astrophysics Data System (ADS)
Müller, Rolf
Although a variety of chemical syntheses for the epothilones and various derivatives have been described, modifying the backbone of those natural products remains a major challenge. One alternative to chemical alteration is the elucidation and subsequent manipulation of the biosynthetic pathway via genetic engineering in the producing organism. This type of approach is known as “combinatorial biosynthesis” and holds great promise, especially in conjunction with semi-synthesis methods to alter the structure of the natural product. In parallel, production can be optimized in the natural producer if the regulatory mechanisms governing the biosynthesis are understood. Alternatively, the entire gene cluster can be transferred into a heterologous host, more amenable both to genetic alteration and overexpression.
Optimisation by hierarchical search
NASA Astrophysics Data System (ADS)
Zintchenko, Ilia; Hastings, Matthew; Troyer, Matthias
2015-03-01
Finding optimal values for a set of variables relative to a cost function gives rise to some of the hardest problems in physics, computer science and applied mathematics. Although often very simple in their formulation, these problems have a complex cost function landscape which prevents currently known algorithms from efficiently finding the global optimum. Countless techniques have been proposed to partially circumvent this problem, but an efficient method is yet to be found. We present a heuristic, general purpose approach to potentially improve the performance of conventional algorithms or special purpose hardware devices by optimising groups of variables in a hierarchical way. We apply this approach to problems in combinatorial optimisation, machine learning and other fields.
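A minimal block-wise sketch of the general idea, repeatedly optimizing one group of variables while the others stay frozen, is shown below; the grouping, the inner local search, and the stopping rule are placeholders rather than the authors' hierarchical scheme.

def hierarchical_improve(x0, cost, groups, local_search, sweeps=10):
    # groups: list of index lists; local_search(x, idx) must return a copy of x
    # that only changes the positions listed in idx
    best, best_cost = list(x0), cost(x0)
    for _ in range(sweeps):
        improved = False
        for idx in groups:
            candidate = local_search(best, idx)
            c = cost(candidate)
            if c < best_cost:
                best, best_cost, improved = list(candidate), c, True
        if not improved:      # no group improved during a full sweep
            break
    return best, best_cost

# hypothetical usage: groups could be spin clusters, feature blocks, or variable
# subsets, and local_search could be exhaustive search within the group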
Application of evolutionary computation in ECAD problems
NASA Astrophysics Data System (ADS)
Lee, Dae-Hyun; Hwang, Seung H.
1998-10-01
Design of modern electronic system is a complicated task which demands the use of computer- aided design (CAD) tools. Since a lot of problems in ECAD are combinatorial optimization problems, evolutionary computations such as genetic algorithms and evolutionary programming have been widely employed to solve those problems. We have applied evolutionary computation techniques to solve ECAD problems such as technology mapping, microcode-bit optimization, data path ordering and peak power estimation, where their benefits are well observed. This paper presents experiences and discusses issues in those applications.
Optimal Iterative Task Scheduling for Parallel Simulations.
1991-03-01
ERIC Educational Resources Information Center
Smolensky, Paul; Goldrick, Matthew; Mathis, Donald
2014-01-01
Mental representations have continuous as well as discrete, combinatorial properties. For example, while predominantly discrete, phonological representations also vary continuously; this is reflected by gradient effects in instrumental studies of speech production. Can an integrated theoretical framework address both aspects of structure? The…
Solforosi, Laura; Mancini, Nicasio; Canducci, Filippo; Clementi, Nicola; Sautto, Giuseppe Andrea; Diotti, Roberta Antonia; Clementi, Massimo; Burioni, Roberto
2012-07-01
A novel phagemid vector, named pCM, was optimized for the cloning and display of antibody fragment (Fab) libraries on the surface of filamentous phage. This vector contains two long DNA "stuffer" fragments for easier differentiation of the correctly cut forms of the vector. Moreover, in pCM the fragment at the heavy-chain cloning site contains an acid phosphatase-encoding gene allowing an easy distinction of the Escherichia coli cells containing the unmodified form of the phagemid versus the heavy-chain fragment coding cDNA. In pCM transcription of heavy-chain Fd/gene III and light chain is driven by a single lacZ promoter. The light chain is directed to the periplasm by the ompA signal peptide, whereas the heavy-chain Fd/coat protein III is trafficked by the pelB signal peptide. The phagemid pCM was used to generate a human combinatorial phage display antibody library that allowed the selection of a monoclonal Fab fragment antibody directed against the nucleoprotein (NP) of Influenza A virus.
NASA Astrophysics Data System (ADS)
Potyrailo, Radislav A.; Hassib, Lamyaa
2005-06-01
Multicomponent polymer-based formulations of optical sensor materials are difficult and time consuming to optimize using conventional approaches. To address these challenges, our long-term goal is to determine relationships between sensor formulation and sensor response parameters using new scientific methodologies. As the first step, we have designed and implemented an automated analytical instrumentation infrastructure for combinatorial and high-throughput development of polymeric sensor materials for optical sensors. Our approach is based on the fabrication and performance screening of discrete and gradient sensor arrays. Simultaneous formation of multiple sensor coatings into discrete 4×6, 6×8, and 8×12 element arrays (3-15 μL volume per element) and their screening not only provides a well-recognized acceleration in the screening rate, but also considerably reduces or even eliminates sources of variability that randomly affect sensor response during conventional one-at-a-time sensor coating evaluation. The application of gradient sensor arrays provides additional capabilities for rapidly finding the optimal formulation parameters.
He, Jiankang; Du, Yanan; Guo, Yuqi; Hancock, Matthew J.; Wang, Ben; Shin, Hyeongho; Wu, Jinhui; Li, Dichen; Khademhosseini, Ali
2010-01-01
Combinatorial material synthesis is a powerful approach for creating composite material libraries for the high-throughput screening of cell–material interactions. Although current combinatorial screening platforms have been tremendously successful in identifying target (termed “hit”) materials from composite material libraries, new material synthesis approaches are needed to further optimize the concentrations and blending ratios of the component materials. Here we employed a microfluidic platform to rapidly synthesize composite materials containing cross-gradients of gelatin and chitosan for investigating cell–biomaterial interactions. The microfluidic synthesis of the cross-gradient was optimized experimentally and theoretically to produce quantitatively controllable variations in the concentrations and blending ratios of the two components. The anisotropic chemical compositions of the gelatin/chitosan cross-gradients were characterized by Fourier transform infrared spectrometry and X-ray photoelectron spectrometry. The three-dimensional (3D) porous gelatin/chitosan cross-gradient materials were shown to regulate the cellular morphology and proliferation of smooth muscle cells (SMCs) in a gradient-dependent manner. We envision that our microfluidic cross-gradient platform may accelerate the material development processes involved in a wide range of biomedical applications. PMID:20721897
2011-01-01
Background: The combinatorial library strategy of using multiple candidate ligands in mixtures as library members is ideal in terms of cost and efficiency, but needs special screening methods to estimate the affinities of candidate ligands in such mixtures. Herein, a new method to screen candidate ligands present in unknown molar quantities in mixtures was investigated. Results: The proposed method involves preparing a processed-mixture-for-screening (PMFS) with each mixture sample and an exogenous reference ligand, initiating competitive binding among ligands from the PMFS to a target immobilized on magnetic particles, recovering target-ligand complexes in equilibrium by magnetic force, extracting and concentrating bound ligands, and analyzing ligands in the PMFS and the concentrated extract by chromatography. The relative affinity of each candidate ligand to its reference ligand is estimated via an approximation equation assuming (a) the candidate ligand and its reference ligand bind to the same site(s) on the target, (b) their chromatographic peak areas are over five times their intercepts of linear response but within their linear ranges, (c) their binding ratios are below 10%. These prerequisites are met by optimizing primarily the quantity of the target used and the PMFS composition ratio. The new method was tested using the competitive binding of biotin derivatives from mixtures to streptavidin immobilized on magnetic particles as a model. Each mixture sample, containing a limited number of candidate biotin derivatives with moderate differences in their molar quantities, was prepared via parallel-combinatorial-synthesis (PCS) without purification, or via the pooling of individual compounds. Some purified biotin derivatives were used as reference ligands. This method showed resistance to variations in chromatographic quantification sensitivity and concentration ratios; optimized conditions to validate the approximation equation could be applied to different mixture samples. Relative affinities of candidate biotin derivatives with unknown molar quantities in each mixture sample were consistent with those estimated by a homogenous method using their purified counterparts as samples. Conclusions: This new method is robust and effective for each mixture possessing a limited number of candidate ligands whose molar quantities have moderate differences, and its integration with PCS has promise to routinely practice the mixture-based library strategy. PMID:21545719
Gaytán, Paul; Yáñez, Jorge; Sánchez, Filiberto; Soberón, Xavier
2001-01-01
We describe here a method to generate combinatorial libraries of oligonucleotides mutated at the codon-level, with control of the mutagenesis rate so as to create predictable binomial distributions of mutants. The method allows enrichment of the libraries with single, double or larger multiplicity of amino acid replacements by appropriate choice of the mutagenesis rate, depending on the concentration of synthetic precursors. The method makes use of two sets of deoxynucleoside-phosphoramidites bearing orthogonal protecting groups [4,4′-dimethoxytrityl (DMT) and 9-fluorenylmethoxycarbonyl (Fmoc)] in the 5′ hydroxyl. These phosphoramidites are divergently combined during automated synthesis in such a way that wild-type codons are assembled with commercial DMT-deoxynucleoside-methyl-phosphoramidites while mutant codons are assembled with Fmoc-deoxynucleoside-methyl-phosphoramidites in an NNG/C fashion in a single synthesis column. This method is easily automated and suitable for low mutagenesis rates and large windows, such as those required for directed evolution and alanine scanning. Through the assembly of three oligonucleotide libraries at different mutagenesis rates, followed by cloning at the polylinker region of plasmid pUC18 and sequencing of 129 clones, we concluded that the method performs essentially as intended. PMID:11160911
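The "predictable binomial distribution of mutants" follows from treating each codon in the mutagenesis window as independently replaced by a mutant (Fmoc-built NNG/C) codon with probability equal to the mutagenesis rate. A short calculation of that profile, with hypothetical numbers, looks like:

from math import comb

def mutant_multiplicity_profile(n_codons, rate):
    # P(exactly k codons in the window carry a mutant codon), for k = 0..n_codons
    return [comb(n_codons, k) * rate**k * (1 - rate)**(n_codons - k)
            for k in range(n_codons + 1)]

# hypothetical 20-codon window at a 10% per-codon mutagenesis rate:
profile = mutant_multiplicity_profile(20, 0.10)
print(profile[0], profile[1], profile[2])   # fractions of wild-type, single and double mutants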
Alves, Andreia; Vanermen, Guido; Covaci, Adrian; Voorspoels, Stefan
2016-09-01
A new, fast, and environmentally friendly method based on ultrasound assisted extraction combined with dispersive liquid-liquid microextraction (US-DLLME) was developed and optimized for assessing the levels of seven phthalate metabolites (including mono(ethyl hexyl) phthalate (MEHP), mono(2-ethyl-5-hydroxyhexyl) phthalate (5-OH-MEHP), mono(2-ethyl-5-oxohexyl) phthalate (5-oxo-MEHP), mono-n-butyl phthalate (MnBP), mono-isobutyl phthalate (MiBP), monoethyl phthalate (MEP), and mono-benzyl phthalate (MBzP)) in human nails by UPLC-MS/MS. The optimization of the US-DLLME method was performed using a Taguchi combinatorial design (L9 array). Several parameters such as extraction solvent, solvent volume, extraction time, acid, acid concentration, and vortex time were studied. The optimal extraction conditions achieved were 180 μL of trichloroethylene (extraction solvent), 2 mL trifluoroacetic acid in methanol (2 M), 2 h extraction and 3 min vortex time. The optimized method had good precision (6-17 %). The accuracy ranged from 79 to 108 % and the limit of method quantification (LOQm) was below 14 ng/g for all compounds. The developed US-DLLME method was applied to determine the target metabolites in 10 Belgian individuals. Levels of the analytes measured in nails ranged between <12 and 7982 ng/g. The MEHP, MBP isomers, and MEP were the major metabolites and were detected in every sample. Miniaturization (low volumes of organic solvents used), low costs, speed, and simplicity are the main advantages of this US-DLLME based method. Graphical abstract: Extraction and phase separation of the US-DLLME procedure.
Exact model reduction of combinatorial reaction networks
Conzelmann, Holger; Fey, Dirk; Gilles, Ernst D
2008-01-01
Background: Receptors and scaffold proteins usually possess a high number of distinct binding domains inducing the formation of large multiprotein signaling complexes. Due to combinatorial reasons the number of distinguishable species grows exponentially with the number of binding domains and can easily reach several millions. Even by including only a limited number of components and binding domains the resulting models are very large and hardly manageable. A novel model reduction technique allows the significant reduction and modularization of these models. Results: We introduce methods that extend and complete the already introduced approach. For instance, we provide techniques to handle the formation of multi-scaffold complexes as well as receptor dimerization. Furthermore, we discuss a new modeling approach that allows the direct generation of exactly reduced model structures. The developed methods are used to reduce a model of EGF and insulin receptor crosstalk comprising 5,182 ordinary differential equations (ODEs) to a model with 87 ODEs. Conclusion: The methods, presented in this contribution, significantly enhance the available methods to exactly reduce models of combinatorial reaction networks. PMID:18755034
Optimal Control Surface Layout for an Aeroservoelastic Wingbox
NASA Technical Reports Server (NTRS)
Stanford, Bret K.
2017-01-01
This paper demonstrates a technique for locating the optimal control surface layout of an aeroservoelastic Common Research Model wingbox, in the context of maneuver load alleviation and active flutter suppression. The combinatorial actuator layout design is solved using ideas borrowed from topology optimization, where the effectiveness of a given control surface is tied to a layout design variable, which varies from zero (the actuator is removed) to one (the actuator is retained). These layout design variables are optimized concurrently with a large number of structural wingbox sizing variables and control surface actuation variables, in order to minimize the sum of structural weight and actuator weight. Results are presented that demonstrate interdependencies between structural sizing patterns and optimal control surface layouts, for both static and dynamic aeroelastic physics.
Efficient 3D multi-region prostate MRI segmentation using dual optimization.
Qiu, Wu; Yuan, Jing; Ukwatta, Eranga; Sun, Yue; Rajchl, Martin; Fenster, Aaron
2013-01-01
Efficient and accurate extraction of the prostate, and in particular its clinically meaningful sub-regions, from 3D MR images is of great interest in image-guided prostate interventions and the diagnosis of prostate cancer. In this work, we propose a novel multi-region segmentation approach to simultaneously locating the boundaries of the prostate and its two major sub-regions: the central gland and the peripheral zone. The proposed method utilizes the prior knowledge of spatial region consistency and employs a customized prostate appearance model to simultaneously segment multiple clinically meaningful regions. We solve the resulting challenging combinatorial optimization problem by means of convex relaxation, for which we introduce a novel spatially continuous flow-maximization model and demonstrate its duality to the investigated convex relaxed optimization problem with the region consistency constraint. Moreover, the proposed continuous max-flow model naturally leads to a new and efficient continuous max-flow based algorithm, which enjoys great advantages in numerics and can be readily implemented on GPUs. Experiments using 15 T2-weighted 3D prostate MR images, evaluated for inter- and intra-operator variability, demonstrate the promising performance of the proposed approach.
Harańczyk, Maciej; Gutowski, Maciej
2007-01-01
We describe a procedure for finding low-energy tautomers of a molecule. The procedure consists of (i) combinatorial generation of a library of tautomers, (ii) screening based on the results of geometry optimization of the initial structures performed at the density functional level of theory, and (iii) final refinement of geometry for the top hits at the second-order Møller-Plesset level of theory followed by single-point energy calculations at the coupled cluster level of theory with single, double, and perturbative triple excitations. The library of initial structures of various tautomers is generated with TauTGen, a tautomer generator program. The procedure proved to be successful for those molecular systems for which common chemical knowledge had not been sufficient to predict the most stable structures.
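Step (i) of the procedure, combinatorial generation of candidate tautomers, amounts to enumerating the distinct placements of the mobile protons over the candidate acceptor sites. A toy generator in that spirit (not the actual TauTGen implementation) is:

from itertools import combinations

def proton_placements(sites, n_protons):
    # every way of distributing the mobile protons over candidate heteroatom
    # sites defines one initial structure to feed into DFT geometry optimization
    for occupied in combinations(sites, n_protons):
        yield frozenset(occupied)

# hypothetical example: 2 mobile protons over 4 candidate N/O sites -> 6 starting structures
library = list(proton_placements(["N1", "N3", "O2", "O4"], 2))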
Fast Optimization of LiMgMnOx/La2O3 Catalysts for the Oxidative Coupling of Methane.
Li, Zhinian; He, Lei; Wang, Shenliang; Yi, Wuzhong; Zou, Shihui; Xiao, Liping; Fan, Jie
2017-01-09
The development of an efficient catalyst for the oxidative coupling of methane (OCM) reaction represents a grand challenge in the direct conversion of methane into other useful products. Here, we report that a newly developed combinatorial approach can be used for ultrafast optimization of La2O3-based multicomponent metal oxide catalysts in the OCM reaction. This new approach, which integrates inkjet-printing-assisted synthesis (IJP-A) with a multidimensional group testing strategy (m-GT), tactfully takes the place of the conventional high-throughput synthesis-and-screen experiment. Within just a week, 2048 formulated LiMgMnOx-La2O3 catalysts in a 64 × 8 × 8 × 8 × 8 = 262,144 compositional space were fabricated by IJP-A in a four-round synthesis-and-screen process, and an optimized formulation was successfully identified through only 4 × 8 = 32 tests via the m-GT screening strategy. The screening process identifies the most promising ternary composition region as Li(0-0.48)Mg(0-6.54)Mn(0-0.62)-La100Ox, with an external C2 yield of 10.87% at 700 °C. This C2 yield is twice that of pure nano-La2O3. The good performance of the optimized catalyst formulation has been validated by manual preparation, which further proves the effectiveness of the new combinatorial methodology in the fast discovery of heterogeneous catalysts.
Programming cells by multiplex genome engineering and accelerated evolution.
Wang, Harris H; Isaacs, Farren J; Carr, Peter A; Sun, Zachary Z; Xu, George; Forest, Craig R; Church, George M
2009-08-13
The breadth of genomic diversity found among organisms in nature allows populations to adapt to diverse environments. However, genomic diversity is difficult to generate in the laboratory and new phenotypes do not easily arise on practical timescales. Although in vitro and directed evolution methods have created genetic variants with usefully altered phenotypes, these methods are limited to laborious and serial manipulation of single genes and are not used for parallel and continuous directed evolution of gene networks or genomes. Here, we describe multiplex automated genome engineering (MAGE) for large-scale programming and evolution of cells. MAGE simultaneously targets many locations on the chromosome for modification in a single cell or across a population of cells, thus producing combinatorial genomic diversity. Because the process is cyclical and scalable, we constructed prototype devices that automate the MAGE technology to facilitate rapid and continuous generation of a diverse set of genetic changes (mismatches, insertions, deletions). We applied MAGE to optimize the 1-deoxy-D-xylulose-5-phosphate (DXP) biosynthesis pathway in Escherichia coli to overproduce the industrially important isoprenoid lycopene. Twenty-four genetic components in the DXP pathway were modified simultaneously using a complex pool of synthetic DNA, creating over 4.3 billion combinatorial genomic variants per day. We isolated variants with more than fivefold increase in lycopene production within 3 days, a significant improvement over existing metabolic engineering techniques. Our multiplex approach embraces engineering in the context of evolution by expediting the design and evolution of organisms with new and improved properties.
Hamper, Bruce C; Kesselring, Allen S; Chott, Robert C; Yang, Shengtian
2009-01-01
A solid-phase organic synthesis method has been developed for the preparation of trisubstituted pyrimidin-6-one carboxylic acids 12, which allows elaboration to a 3-dimensional combinatorial library. Three substituents are introduced by initial Knoevenagel condensation of an aldehyde and malonate ester resin 7 to give resin bound 1. Cyclization of 1 with an N-substituted amidine 10, oxidation, and cleavage afforded pyrimidinone 12. The initial solid-phase reaction sequence was followed by gel-phase (19)F NMR and direct-cleavage (1)H NMR of intermediate resins to determine the optimal conditions. The scope of the method for library production was determined by investigation of a 3 x 4 pilot library of twelve compounds. Cyclocondensation of N-methylamidines and 7 followed by CAN oxidation gave mixtures of the resin bound pyrimidin-6-one 11 and the regioisomeric pyrimidin-4-one 15, which after cleavage from the resin afforded a nearly 1:1 mixture of pyrimidin-6-one and pyrimidin-4-one carboxylic acids 12 and 16, respectively. The regiochemical assignment was confirmed by ROESY1D and gHMBC NMR experiments. A library was prepared using 8 aldehydes, 3 nitriles, and 4 amines to give a full combinatorial set of 96 pyrimidinones 12. Confirmation of structural identity and purity was carried out by LCMS using coupled ELS detection and by high-throughput flow (1)H NMR.
Kim, Kyung Lock; Park, Kyeng Min; Murray, James; Kim, Kimoon; Ryu, Sung Ho
2018-05-23
Combinatorial post-translational modifications (PTMs), which can serve as dynamic "molecular barcodes", have been proposed to regulate distinct protein functions. However, studies of combinatorial PTMs on single protein molecules have been hindered by a lack of suitable analytical methods. Here, we describe erasable single-molecule blotting (eSiMBlot) for combinatorial PTM profiling. This assay is performed in a highly multiplexed manner and leverages the benefits of covalent protein immobilization, cyclic probing with different antibodies, and single molecule fluorescence imaging. In particular, facile and efficient covalent immobilization on a surface using Cu-free click chemistry permits multiple rounds (>10) of antibody erasing/reprobing without loss of antigenicity. Moreover, cumulative detection of coregistered multiple data sets for immobilized single-epitope molecules, such as HA peptide, can be used to increase the antibody detection rate. Finally, eSiMBlot enables direct visualization and quantitative profiling of combinatorial PTM codes at the single-molecule level, as we demonstrate by revealing the novel phospho-codes of ligand-induced epidermal growth factor receptor. Thus, eSiMBlot provides an unprecedentedly simple, rapid, and versatile platform for analyzing the vast number of combinatorial PTMs in biological pathways.
Design of diversity and focused combinatorial libraries in drug discovery.
Young, S Stanley; Ge, Nanxiang
2004-05-01
Using well-characterized chemical reactions and readily available monomers, chemists are able to create sets of compounds, termed libraries, which are useful in drug discovery processes. The design of combinatorial chemical libraries can be complex and there has been much information recently published offering suggestions on how the design process can be carried out. This review focuses on literature with the goal of organizing current thinking. At this point in time, it is clear that benchmarking of current suggested methods is required as opposed to further new methods.
Genetic algorithm parameters tuning for resource-constrained project scheduling problem
NASA Astrophysics Data System (ADS)
Tian, Xingke; Yuan, Shengrui
2018-04-01
The Resource-Constrained Project Scheduling Problem (RCPSP) is an important class of scheduling problem. To achieve a given optimization goal, such as the shortest duration, the smallest cost, or resource balance, the start and finish of all tasks must be arranged while satisfying the project's timing constraints and resource constraints. In theory, the problem is NP-hard, and many model variants exist. Many combinatorial optimization problems are special cases of the RCPSP, such as job shop scheduling and flow shop scheduling. At present, the genetic algorithm (GA) has been used to deal with the classical RCPSP and has achieved remarkable results. Many scholars have also studied improved genetic algorithms for the RCPSP, which solve it more efficiently and accurately. However, these studies do not optimize the main parameters of the genetic algorithm; parameters are generally chosen empirically, which cannot guarantee optimal settings. In this paper, we address this blind selection of parameters in solving the RCPSP: we carry out a sampling analysis, build a proxy (surrogate) model, and ultimately obtain optimal parameter settings.
2014-03-27
As of 01 September 2013, the USAF is tracking 12,571 individual Class VII assets valued at $213.5 million for final disposition.
Reutlinger, Michael; Rodrigues, Tiago; Schneider, Petra; Schneider, Gisbert
2014-01-07
Using the example of the Ugi three-component reaction we report a fast and efficient microfluidic-assisted entry into the imidazopyridine scaffold, where building block prioritization was coupled to a new computational method for predicting ligand-target associations. We identified an innovative GPCR-modulating combinatorial chemotype featuring ligand-efficient adenosine A1/2B and adrenergic α1A/B receptor antagonists. Our results suggest the tight integration of microfluidics-assisted synthesis with computer-based target prediction as a viable approach to rapidly generate bioactivity-focused combinatorial compound libraries with high success rates. Copyright © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Predictive Array Design. A method for sampling combinatorial chemistry library space.
Lipkin, M J; Rose, V S; Wood, J
2002-01-01
A method, Predictive Array Design, is presented for sampling combinatorial chemistry space and selecting a subarray for synthesis based on the experimental design method of Latin Squares. The method is appropriate for libraries with three sites of variation. Libraries with four sites of variation can be designed using the Graeco-Latin Square. Simulated annealing is used to optimise the physicochemical property profile of the sub-array. The sub-array can be used to predict the activity of compounds in the all-combinations array if we assume that each monomer makes a relatively constant contribution to activity and that the activity of a compound is the sum of the activities of its constituent monomers.
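A minimal sketch of the Latin-square sub-array idea for a three-site library follows; the cyclic square and the monomer indices are illustrative assumptions, and the published method additionally optimizes the physicochemical property profile of the chosen sub-array with simulated annealing.

```python
import itertools

def latin_square_subarray(n):
    """Select n**2 of the n**3 possible (A, B, C) monomer combinations
    using a cyclic Latin square: compound (i, j, k) is made when
    k == (i + j) mod n, so every A/B, A/C and B/C pair occurs exactly once."""
    return [(i, j, (i + j) % n) for i, j in itertools.product(range(n), repeat=2)]

subarray = latin_square_subarray(8)   # 64 compounds instead of 512
print(len(subarray), subarray[:5])
```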
Selection of stably folded proteins by phage-display with proteolysis.
Bai, Yawen; Feng, Hanqiao
2004-05-01
To facilitate the process of protein design and learn the basic rules that control the structure and stability of proteins, combinatorial methods have been developed to select or screen proteins with desired properties from libraries of mutants. One such method uses phage-display and proteolysis to select stably folded proteins. This method does not rely on specific properties of proteins for selection. Therefore, in principle it can be applied to any protein. Since its first demonstration in 1998, the method has been used to create hyperthermophilic proteins, to evolve novel folded domains from a library generated by combinatorial shuffling of polypeptide segments and to convert a partially unfolded structure to a fully folded protein.
Method and apparatus for combinatorial chemistry
Foote, Robert S.
2007-02-20
A method and apparatus are provided for performing light-directed reactions in spatially addressable channels within a plurality of channels. One aspect of the invention employs photoactivatable reagents in solutions disposed into spatially addressable flow streams to control the parallel synthesis of molecules immobilized within the channels. The reagents may be photoactivated within a subset of channels at the site of immobilized substrate molecules or at a light-addressable site upstream from the substrate molecules. The method and apparatus of the invention find particular utility in the synthesis of biopolymer arrays, e.g., oligonucleotides, peptides and carbohydrates, and in the combinatorial synthesis of small molecule arrays for drug discovery.
Method and apparatus for combinatorial chemistry
Foote, Robert S [Oak Ridge, TN
2012-06-05
A method and apparatus are provided for performing light-directed reactions in spatially addressable channels within a plurality of channels. One aspect of the invention employs photoactivatable reagents in solutions disposed into spatially addressable flow streams to control the parallel synthesis of molecules immobilized within the channels. The reagents may be photoactivated within a subset of channels at the site of immobilized substrate molecules or at a light-addressable site upstream from the substrate molecules. The method and apparatus of the invention find particular utility in the synthesis of biopolymer arrays, e.g., oligonucleotides, peptides and carbohydrates, and in the combinatorial synthesis of small molecule arrays for drug discovery.
Improved mine blast algorithm for optimal cost design of water distribution systems
NASA Astrophysics Data System (ADS)
Sadollah, Ali; Guen Yoo, Do; Kim, Joong Hoon
2015-12-01
The design of water distribution systems is a large class of combinatorial, nonlinear optimization problems with complex constraints such as conservation of mass and energy equations. Since feasible solutions are often extremely complex, traditional optimization techniques are insufficient. Recently, metaheuristic algorithms have been applied to this class of problems because they are highly efficient. In this article, a recently developed optimizer called the mine blast algorithm (MBA) is considered. The MBA is improved and coupled with the hydraulic simulator EPANET to find the optimal cost design for water distribution systems. The performance of the improved mine blast algorithm (IMBA) is demonstrated using the well-known Hanoi, New York tunnels and Balerma benchmark networks. Optimization results obtained using IMBA are compared to those using MBA and other optimizers in terms of their minimum construction costs and convergence rates. For the complex Balerma network, IMBA offers the cheapest network design compared to other optimization algorithms.
Exhaustive Search for Sparse Variable Selection in Linear Regression
NASA Astrophysics Data System (ADS)
Igarashi, Yasuhiko; Takenaka, Hikaru; Nakanishi-Ohno, Yoshinori; Uemura, Makoto; Ikeda, Shiro; Okada, Masato
2018-04-01
We propose a K-sparse exhaustive search (ES-K) method and a K-sparse approximate exhaustive search method (AES-K) for selecting variables in linear regression. With these methods, K-sparse combinations of variables are tested exhaustively, assuming that the optimal combination of explanatory variables is K-sparse. By collecting the results of exhaustively computing ES-K, various approximate methods for selecting sparse variables can be summarized as a density of states. With this density of states, we can compare different methods for selecting sparse variables, such as relaxation and sampling. For large problems, where the combinatorial explosion of explanatory variables is crucial, the AES-K method enables the density of states to be effectively reconstructed by using the replica-exchange Monte Carlo method and the multiple histogram method. Applying the ES-K and AES-K methods to type Ia supernova data, we confirmed the conventional understanding in astronomy when an appropriate K is given beforehand. However, we found it difficult to determine K from the data. Using virtual measurement and analysis, we argue that this is caused by data shortage.
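The ES-K idea itself is compact enough to sketch. The fragment below exhaustively scores every K-sparse combination of predictors by its least-squares residual; the synthetic data and the residual criterion are illustrative assumptions, and the published methods additionally use cross-validation, density-of-states summaries, and replica-exchange Monte Carlo for large problems.

```python
import itertools
import numpy as np

def es_k(X, y, K):
    """Exhaustively score every K-sparse combination of columns of X
    by its least-squares residual on (X, y); returns combinations
    sorted from best to worst."""
    n_features = X.shape[1]
    results = []
    for combo in itertools.combinations(range(n_features), K):
        beta, residual, *_ = np.linalg.lstsq(X[:, combo], y, rcond=None)
        err = residual[0] if residual.size else float(np.sum((X[:, combo] @ beta - y) ** 2))
        results.append((float(err), combo))
    return sorted(results)

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))
y = 2.0 * X[:, 3] - 1.5 * X[:, 7] + 0.1 * rng.normal(size=100)
print(es_k(X, y, K=2)[0])   # the best pair should be (3, 7)
```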
A modular approach to intensity-modulated arc therapy optimization with noncoplanar trajectories
NASA Astrophysics Data System (ADS)
Papp, Dávid; Bortfeld, Thomas; Unkelbach, Jan
2015-07-01
Utilizing noncoplanar beam angles in volumetric modulated arc therapy (VMAT) has the potential to combine the benefits of arc therapy, such as short treatment times, with the benefits of noncoplanar intensity modulated radiotherapy (IMRT) plans, such as improved organ sparing. Recently, vendors introduced treatment machines that allow for simultaneous couch and gantry motion during beam delivery to make noncoplanar VMAT treatments possible. Our aim is to provide a reliable optimization method for noncoplanar isocentric arc therapy plan optimization. The proposed solution is modular in the sense that it can incorporate different existing beam angle selection and coplanar arc therapy optimization methods. Treatment planning is performed in three steps. First, a number of promising noncoplanar beam directions are selected using an iterative beam selection heuristic; these beams serve as anchor points of the arc therapy trajectory. In the second step, continuous gantry/couch angle trajectories are optimized using a simple combinatorial optimization model to define a beam trajectory that efficiently visits each of the anchor points. Treatment time is controlled by limiting the time the beam needs to trace the prescribed trajectory. In the third and final step, an optimal arc therapy plan is found along the prescribed beam trajectory. In principle any existing arc therapy optimization method could be incorporated into this step; for this work we use a sliding window VMAT algorithm. The approach is demonstrated using two particularly challenging cases. The first one is a lung SBRT patient whose planning goals could not be satisfied with fewer than nine noncoplanar IMRT fields when the patient was treated in the clinic. The second one is a brain tumor patient, where the target volume overlaps with the optic nerves and the chiasm and it is directly adjacent to the brainstem. Both cases illustrate that the large number of angles utilized by isocentric noncoplanar VMAT plans can help improve dose conformity, homogeneity, and organ sparing simultaneously using the same beam trajectory length and delivery time as a coplanar VMAT plan.
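The second step, ordering the selected anchor directions into an efficient gantry/couch trajectory, is a small combinatorial problem. The sketch below brute-forces the shortest open path through a handful of hypothetical (gantry, couch) anchor angles under a simple travel-time proxy; the distance model and the anchor values are assumptions for illustration, not the paper's delivery-time model.

```python
import itertools

def angular_distance(a, b):
    """Travel-time proxy between two (gantry, couch) angle pairs, assuming
    both axes move simultaneously so the slower axis dominates."""
    dg = abs(a[0] - b[0])
    dc = abs(a[1] - b[1])
    return max(min(dg, 360 - dg), min(dc, 360 - dc))

def best_visit_order(anchors):
    """Brute-force the shortest open path through all anchor directions.
    Fine for the handful of anchors used in practice; a dynamic program or
    heuristic would be needed for larger sets."""
    best = None
    for perm in itertools.permutations(range(1, len(anchors))):
        order = (0,) + perm
        cost = sum(angular_distance(anchors[order[i]], anchors[order[i + 1]])
                   for i in range(len(order) - 1))
        if best is None or cost < best[0]:
            best = (cost, order)
    return best

anchors = [(180, 0), (30, 20), (330, -20), (90, 45), (270, -45)]
print(best_visit_order(anchors))
```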
DOE Office of Scientific and Technical Information (OSTI.GOV)
Papp, D; Unkelbach, J
2014-06-01
Purpose: Non-uniform fractionation, i.e. delivering distinct dose distributions in two subsequent fractions, can potentially improve outcomes by increasing biological dose to the target without increasing dose to healthy tissues. This is possible if both fractions deliver a similar dose to normal tissues (exploit the fractionation effect) but high single fraction doses to subvolumes of the target (hypofractionation). Optimization of such treatment plans can be formulated using biological equivalent dose (BED), but leads to intractable nonconvex optimization problems. We introduce a novel optimization approach to address this challenge. Methods: We first optimize a reference IMPT plan using standard techniques that delivers a homogeneous target dose in both fractions. The method then divides the pencil beams into two sets, which are assigned to either fraction one or fraction two. The total intensity of each pencil beam, and therefore the physical dose, remains unchanged compared to the reference plan. The objectives are to maximize the mean BED in the target and to minimize the mean BED in normal tissues, which is a quadratic function of the pencil beam weights. The optimal reassignment of pencil beams to one of the two fractions is formulated as a binary quadratic optimization problem. A near-optimal solution to this problem can be obtained by convex relaxation and randomized rounding. Results: The method is demonstrated for a large arteriovenous malformation (AVM) case treated in two fractions. The algorithm yields a treatment plan, which delivers a high dose to parts of the AVM in one of the fractions, but similar doses in both fractions to the normal brain tissue adjacent to the AVM. Using the approach, the mean BED in the target was increased by approximately 10% compared to what would have been possible with a uniform reference plan for the same normal tissue mean BED.
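To illustrate the relax-and-round step, the sketch below approximately minimizes a binary quadratic objective by projected gradient descent on its box relaxation followed by randomized rounding. The toy matrices and the box relaxation are assumptions made for brevity; the published method uses a tighter convex relaxation of the BED objective.

```python
import numpy as np

def relax_and_round(Q, c, n_iters=300, n_samples=100, step=0.01, seed=0):
    """Approximately minimize x^T Q x + c^T x over x in {0,1}^n.
    Step 1: projected gradient descent on the box relaxation [0,1]^n.
    Step 2: randomized rounding, treating each relaxed x_i as the
    probability that pencil beam i is assigned to fraction two."""
    rng = np.random.default_rng(seed)
    n = len(c)
    x = np.full(n, 0.5)
    for _ in range(n_iters):
        grad = (Q + Q.T) @ x + c
        x = np.clip(x - step * grad, 0.0, 1.0)
    value = lambda z: float(z @ Q @ z + c @ z)
    samples = [(rng.random(n) < x).astype(float) for _ in range(n_samples)]
    best = min(samples, key=value)
    return best, value(best)

# toy example with 8 pencil beams
rng = np.random.default_rng(1)
Q = rng.normal(size=(8, 8)) * 0.1
c = rng.normal(size=8)
print(relax_and_round(Q, c))
```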
Webb, Thomas R; Jiang, Luyong; Sviridov, Sergey; Venegas, Ruben E; Vlaskina, Anna V; McGrath, Douglas; Tucker, John; Wang, Jian; Deschenes, Alain; Li, Rongshi
2007-01-01
We report the further application of a novel approach to template and ligand design by the synthesis of agonists of the melanocortin receptor. This design method uses the conserved structural data from the three-dimensional conformations of beta-turn peptides to design rigid nonpeptide templates that mimic the orientation of the main chain C-alpha atoms in a peptide beta-turn. We report details on a new synthesis of derivatives of template 1 that are useful for the synthesis of exploratory libraries. The utility of this technique is further exemplified by several iterative rounds of high-throughput synthesis and screening, which result in new partially optimized nonpeptide agonists for several melanocortin receptors.
DNA Assembly Techniques for Next Generation Combinatorial Biosynthesis of Natural Products
Cobb, Ryan E.; Ning, Jonathan C.; Zhao, Huimin
2013-01-01
Natural product scaffolds remain important leads for pharmaceutical development. However, transforming a natural product into a drug entity often requires derivatization to enhance the compound’s therapeutic properties. A powerful method by which to perform this derivatization is combinatorial biosynthesis, the manipulation of the genes in the corresponding pathway to divert synthesis towards novel derivatives. While these manipulations have traditionally been carried out via restriction digestion/ligation-based cloning, the shortcomings of such techniques limit their throughput and thus the scope of corresponding combinatorial biosynthesis experiments. In the burgeoning field of synthetic biology, the demand for facile DNA assembly techniques has promoted the development of a host of novel DNA assembly strategies. Here we describe the advantages of these recently-developed tools for rapid, efficient synthesis of large DNA constructs. We also discuss their potential to facilitate the simultaneous assembly of complete libraries of natural product biosynthetic pathways, ushering in the next generation of combinatorial biosynthesis. PMID:24127070
Combinatorial study of degree assortativity in networks.
Estrada, Ernesto
2011-10-01
Why are some networks degree-degree correlated (assortative), while most of the real-world ones are anticorrelated (disassortative)? Here, we prove, by combinatorial methods, that the assortativity of a network depends only on three structural factors: transitivity (clustering coefficient), intermodular connectivity, and branching. Then, a network is assortative if the contributions of the first two factors are larger than that of the third. Highly branched networks are likely to be disassortative.
Potyrailo, Radislav A; Chisholm, Bret J; Morris, William G; Cawse, James N; Flanagan, William P; Hassib, Lamyaa; Molaison, Chris A; Ezbiansky, Karin; Medford, George; Reitz, Hariklia
2003-01-01
Coupling of combinatorial chemistry methods with high-throughput (HT) performance testing and measurements of resulting properties has provided a powerful set of tools for the 10-fold accelerated discovery of new high-performance coating materials for automotive applications. Our approach replaces labor-intensive steps with automated systems for evaluation of adhesion of 8 x 6 arrays of coating elements that are discretely deposited on a single 9 x 12 cm plastic substrate. Performance of coatings is evaluated with respect to their resistance to adhesion loss, because this parameter is one of the primary considerations in end-use automotive applications. Our HT adhesion evaluation provides previously unavailable capabilities of high speed and reproducibility of testing by using a robotic automation, an expanded range of types of tested coatings by using the coating tagging strategy, and an improved quantitation by using high signal-to-noise automatic imaging. Upon testing, the coatings undergo changes that are impossible to quantitatively predict using existing knowledge. Using our HT methodology, we have developed several coatings leads. These HT screening results for the best coating compositions have been validated on the traditional scales of coating formulation and adhesion loss testing. These validation results have confirmed the superb performance of combinatorially developed coatings over conventional coatings on the traditional scale.
A combinatorial perspective of the protein inference problem.
Yang, Chao; He, Zengyou; Yu, Weichuan
2013-01-01
In a shotgun proteomics experiment, proteins are the most biologically meaningful output. The success of proteomics studies depends on the ability to accurately and efficiently identify proteins. Many methods have been proposed to facilitate the identification of proteins from peptide identification results. However, the relationship between protein identification and peptide identification has not been thoroughly explained before. In this paper, we devote ourselves to a combinatorial perspective of the protein inference problem. We employ combinatorial mathematics to calculate the conditional protein probabilities (protein probability means the probability that a protein is correctly identified) under three assumptions, which lead to a lower bound, an upper bound, and an empirical estimation of protein probabilities, respectively. The combinatorial perspective enables us to obtain an analytical expression for protein inference. Our method achieves comparable results with ProteinProphet in a more efficient manner in experiments on two data sets of standard protein mixtures and two data sets of real samples. Based on our model, we study the impact of unique peptides and degenerate peptides (degenerate peptides are peptides shared by at least two proteins) on protein probabilities. Meanwhile, we also study the relationship between our model and ProteinProphet. We name our program ProteinInfer. Its Java source code, our supplementary document and experimental results are available at: http://bioinformatics.ust.hk/proteininfer.
García-Pedrajas, Nicolás; Ortiz-Boyer, Domingo; Hervás-Martínez, César
2006-05-01
In this work we present a new approach to the crossover operator in the genetic evolution of neural networks. The most widely used evolutionary computation paradigm for neural network evolution is evolutionary programming. This paradigm is usually preferred due to the problems caused by the application of crossover to neural network evolution. However, crossover is the most innovative operator within the field of evolutionary computation. One of the most notorious problems with the application of crossover to neural networks is known as the permutation problem. This problem occurs because the same network can be represented in a genetic coding by many different codifications. Our approach modifies the standard crossover operator taking into account the special features of the individuals to be mated. We present a new model for mating individuals that considers the structure of the hidden layer and redefines the crossover operator. As each hidden node represents a non-linear projection of the input variables, we approach the crossover as a combinatorial optimization problem: the extraction of a subset of near-optimal projections to create the hidden layer of the new network. This new approach is compared to a classical crossover in 25 real-world problems with excellent performance. Moreover, the networks obtained are much smaller than those obtained with the classical crossover operator.
Effect of the Implicit Combinatorial Model on Combinatorial Reasoning in Secondary School Pupils.
ERIC Educational Resources Information Center
Batanero, Carmen; And Others
1997-01-01
Elementary combinatorial problems may be classified into three different combinatorial models: (1) selection; (2) partition; and (3) distribution. The main goal of this research was to determine the effect of the implicit combinatorial model on pupils' combinatorial reasoning before and after instruction. Gives an analysis of variance of the…
Cankorur-Cetinkaya, Ayca; Dias, Joao M L; Kludas, Jana; Slater, Nigel K H; Rousu, Juho; Oliver, Stephen G; Dikicioglu, Duygu
2017-06-01
Multiple interacting factors affect the performance of engineered biological systems in synthetic biology projects. The complexity of these biological systems means that experimental design should often be treated as a multiparametric optimization problem. However, the available methodologies are either impractical, due to a combinatorial explosion in the number of experiments to be performed, or are inaccessible to most experimentalists due to the lack of publicly available, user-friendly software. Although evolutionary algorithms may be employed as alternative approaches to optimize experimental design, the lack of simple-to-use software again restricts their use to specialist practitioners. In addition, the lack of subsidiary approaches to further investigate critical factors and their interactions prevents the full analysis and exploitation of the biotechnological system. We have addressed these problems and, here, provide a simple-to-use and freely available graphical user interface to empower a broad range of experimental biologists to employ complex evolutionary algorithms to optimize their experimental designs. Our approach exploits a Genetic Algorithm to discover the subspace containing the optimal combination of parameters, and Symbolic Regression to construct a model to evaluate the sensitivity of the experiment to each parameter under investigation. We demonstrate the utility of this method using an example in which the culture conditions for the microbial production of a bioactive human protein are optimized. CamOptimus is available through: (https://doi.org/10.17863/CAM.10257).
Solution for a bipartite Euclidean traveling-salesman problem in one dimension
NASA Astrophysics Data System (ADS)
Caracciolo, Sergio; Di Gioacchino, Andrea; Gherardi, Marco; Malatesta, Enrico M.
2018-05-01
The traveling-salesman problem is one of the most studied combinatorial optimization problems, because of the simplicity in its statement and the difficulty in its solution. We characterize the optimal cycle for every convex and increasing cost function when the points are thrown independently and with an identical probability distribution in a compact interval. We compute the average optimal cost for every number of points when the distance function is the square of the Euclidean distance. We also show that the average optimal cost is not a self-averaging quantity by explicitly computing the variance of its distribution in the thermodynamic limit. Moreover, we prove that the cost of the optimal cycle is not smaller than twice the cost of the optimal assignment of the same set of points. Interestingly, this bound is saturated in the thermodynamic limit.
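The assignment lower bound used above is easy to evaluate numerically in one dimension: for a convex, increasing cost of the interpoint distance, matching the sorted red points to the sorted blue points is optimal. The sketch below computes this optimal assignment cost for the squared-distance case; the point counts and uniform distribution are illustrative choices.

```python
import numpy as np

def optimal_assignment_cost(red, blue, p=2):
    """Optimal bipartite assignment cost in one dimension for the cost
    |x - y|**p with p > 1 (convex, increasing): sorting both point sets
    and matching them in order is optimal."""
    red, blue = np.sort(red), np.sort(blue)
    return float(np.sum(np.abs(red - blue) ** p))

rng = np.random.default_rng(0)
n = 1000
red, blue = rng.random(n), rng.random(n)
print(optimal_assignment_cost(red, blue) / n)  # average assignment cost per point
```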
Solution for a bipartite Euclidean traveling-salesman problem in one dimension.
Caracciolo, Sergio; Di Gioacchino, Andrea; Gherardi, Marco; Malatesta, Enrico M
2018-05-01
The traveling-salesman problem is one of the most studied combinatorial optimization problems, because of the simplicity in its statement and the difficulty in its solution. We characterize the optimal cycle for every convex and increasing cost function when the points are thrown independently and with an identical probability distribution in a compact interval. We compute the average optimal cost for every number of points when the distance function is the square of the Euclidean distance. We also show that the average optimal cost is not a self-averaging quantity by explicitly computing the variance of its distribution in the thermodynamic limit. Moreover, we prove that the cost of the optimal cycle is not smaller than twice the cost of the optimal assignment of the same set of points. Interestingly, this bound is saturated in the thermodynamic limit.
Physical Principle for Generation of Randomness
NASA Technical Reports Server (NTRS)
Zak, Michail
2009-01-01
A physical principle (more precisely, a principle that incorporates mathematical models used in physics) has been conceived as the basis of a method of generating randomness in Monte Carlo simulations. The principle eliminates the need for conventional random-number generators. The Monte Carlo simulation method is among the most powerful computational methods for solving high-dimensional problems in physics, chemistry, economics, and information processing. The Monte Carlo simulation method is especially effective for solving problems in which computational complexity increases exponentially with dimensionality. The main advantage of the Monte Carlo simulation method over other methods is that the demand on computational resources becomes independent of dimensionality. As augmented by the present principle, the Monte Carlo simulation method becomes an even more powerful computational method that is especially useful for solving problems associated with dynamics of fluids, planning, scheduling, and combinatorial optimization. The present principle is based on coupling of dynamical equations with the corresponding Liouville equation. The randomness is generated by non-Lipschitz instability of dynamics triggered and controlled by feedback from the Liouville equation. (In non-Lipschitz dynamics, the derivatives of solutions of the dynamical equations are not required to be bounded.)
Algorithms for optimizing cross-overs in DNA shuffling.
He, Lu; Friedman, Alan M; Bailey-Kellogg, Chris
2012-03-21
DNA shuffling generates combinatorial libraries of chimeric genes by stochastically recombining parent genes. The resulting libraries are subjected to large-scale genetic selection or screening to identify those chimeras with favorable properties (e.g., enhanced stability or enzymatic activity). While DNA shuffling has been applied quite successfully, it is limited by its homology-dependent, stochastic nature. Consequently, it is used only with parents of sufficient overall sequence identity, and provides no control over the resulting chimeric library. This paper presents efficient methods to extend the scope of DNA shuffling to handle significantly more diverse parents and to generate more predictable, optimized libraries. Our CODNS (cross-over optimization for DNA shuffling) approach employs polynomial-time dynamic programming algorithms to select codons for the parental amino acids, allowing for zero or a fixed number of conservative substitutions. We first present efficient algorithms to optimize the local sequence identity or the nearest-neighbor approximation of the change in free energy upon annealing, objectives that were previously optimized by computationally-expensive integer programming methods. We then present efficient algorithms for more powerful objectives that seek to localize and enhance the frequency of recombination by producing "runs" of common nucleotides either overall or according to the sequence diversity of the resulting chimeras. We demonstrate the effectiveness of CODNS in choosing codons and allocating substitutions to promote recombination between parents targeted in earlier studies: two GAR transformylases (41% amino acid sequence identity), two very distantly related DNA polymerases, Pol X and β (15%), and beta-lactamases of varying identity (26-47%). Our methods provide the protein engineer with a new approach to DNA shuffling that supports substantially more diverse parents, is more deterministic, and generates more predictable and more diverse chimeric libraries.
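As a rough illustration of the codon-selection objective, the sketch below greedily picks, position by position, the codon pair that maximizes nucleotide identity between two aligned parents. The codon table shown is deliberately partial and the greedy rule is a simplification; CODNS instead uses dynamic programming over whole sequences so that long runs of identical nucleotides, which drive recombination, are rewarded.

```python
# Partial codon table, for illustration only.
CODONS = {
    "G": ["GGT", "GGC", "GGA", "GGG"],
    "A": ["GCT", "GCC", "GCA", "GCG"],
    "K": ["AAA", "AAG"],
    "R": ["CGT", "CGC", "CGA", "CGG", "AGA", "AGG"],
}

def identity(c1, c2):
    """Number of matching nucleotides between two codons."""
    return sum(a == b for a, b in zip(c1, c2))

def choose_codons(parent1_aa, parent2_aa):
    """Greedy, position-by-position codon choice maximizing nucleotide
    identity between two aligned parent amino-acid sequences."""
    out = []
    for a1, a2 in zip(parent1_aa, parent2_aa):
        best = max((identity(c1, c2), c1, c2)
                   for c1 in CODONS[a1] for c2 in CODONS[a2])
        out.append((best[1], best[2]))
    return out

print(choose_codons("GAKR", "GARK"))
```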
Ebalunode, Jerry O; Zheng, Weifan; Tropsha, Alexander
2011-01-01
Optimization of chemical library composition affords more efficient identification of hits from biological screening experiments. The optimization could be achieved through rational selection of reagents used in combinatorial library synthesis. However, with a rapid advent of parallel synthesis methods and availability of millions of compounds synthesized by many vendors, it may be more efficient to design targeted libraries by means of virtual screening of commercial compound collections. This chapter reviews the application of advanced cheminformatics approaches such as quantitative structure-activity relationships (QSAR) and pharmacophore modeling (both ligand and structure based) for virtual screening. Both approaches rely on empirical SAR data to build models; thus, the emphasis is placed on achieving models of the highest rigor and external predictive power. We present several examples of successful applications of both approaches for virtual screening to illustrate their utility. We suggest that the expert use of both QSAR and pharmacophore models, either independently or in combination, enables users to achieve targeted libraries enriched with experimentally confirmed hit compounds.
Tug-Of-War Model for Two-Bandit Problem
NASA Astrophysics Data System (ADS)
Kim, Song-Ju; Aono, Masashi; Hara, Masahiko
The amoeba of the true slime mold Physarum polycephalum shows high computational capabilities. In so-called amoeba-based computing, some computing tasks, including combinatorial optimization, are performed by the amoeba instead of a digital computer. We expect that there must be problems living organisms are good at solving. The “multi-armed bandit problem” would be one such problem. Consider a number of slot machines. Each of the machines has an arm which gives a player a reward with a certain probability when pulled. The problem is to determine the optimal strategy for maximizing the total reward sum after a certain number of trials. To maximize the total reward sum, it is necessary to judge correctly and quickly which machine has the highest reward probability. Therefore, the player should explore many machines to gather much knowledge on which machine is the best, but should not fail to exploit the reward from the known best machine. We consider that living organisms follow some efficient method to solve this problem.
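For comparison with the tug-of-war model, a standard epsilon-greedy baseline for the same bandit setting can be sketched in a few lines; the reward probabilities and the exploration rate below are arbitrary illustrative values, and this is not the amoeba-inspired algorithm itself.

```python
import random

def epsilon_greedy(true_probs, n_trials=1000, eps=0.1, seed=0):
    """Epsilon-greedy strategy: with probability eps explore a random
    machine, otherwise exploit the machine with the best empirical
    reward rate so far. Returns the total reward collected."""
    rng = random.Random(seed)
    counts = [0] * len(true_probs)
    rewards = [0] * len(true_probs)
    total = 0
    for _ in range(n_trials):
        if rng.random() < eps or sum(counts) == 0:
            arm = rng.randrange(len(true_probs))
        else:
            arm = max(range(len(true_probs)),
                      key=lambda i: rewards[i] / counts[i] if counts[i] else 0.0)
        r = 1 if rng.random() < true_probs[arm] else 0
        counts[arm] += 1
        rewards[arm] += r
        total += r
    return total

print(epsilon_greedy([0.3, 0.5, 0.7]))
```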
Coordinated Platoon Routing in a Metropolitan Network
DOE Office of Scientific and Technical Information (OSTI.GOV)
Larson, Jeffrey; Munson, Todd; Sokolov, Vadim
2016-10-10
Platooning vehicles—connected and automated vehicles traveling with small intervehicle distances—use less fuel because of reduced aerodynamic drag. Given a network defined by vertex and edge sets and a set of vehicles with origin/destination nodes/times, we model and solve the combinatorial optimization problem of coordinated routing of vehicles in a manner that routes them to their destination on time while using the least amount of fuel. Common approaches decompose the platoon coordination and vehicle routing into separate problems. Our model addresses both problems simultaneously to obtain the best solution. We use modern modeling techniques and constraints implied from analyzing the platoon routing problem to address larger numbers of vehicles and larger networks than previously considered. While the numerical method used is unable to certify optimality for candidate solutions to all networks and parameters considered, we obtain excellent solutions in approximately one minute for much larger networks and vehicle sets than previously considered in the literature.
CAMELOT: Computational-Analytical Multi-fidElity Low-thrust Optimisation Toolbox
NASA Astrophysics Data System (ADS)
Di Carlo, Marilena; Romero Martin, Juan Manuel; Vasile, Massimiliano
2018-03-01
Computational-Analytical Multi-fidElity Low-thrust Optimisation Toolbox (CAMELOT) is a toolbox for the fast preliminary design and optimisation of low-thrust trajectories. It solves highly complex combinatorial problems to plan multi-target missions characterised by long spirals including different perturbations. To do so, CAMELOT implements a novel multi-fidelity approach combining analytical surrogate modelling and accurate computational estimations of the mission cost. Decisions are then made using two optimisation engines included in the toolbox, a single-objective global optimiser, and a combinatorial optimisation algorithm. CAMELOT has been applied to a variety of case studies: from the design of interplanetary trajectories to the optimal de-orbiting of space debris and from the deployment of constellations to on-orbit servicing. In this paper, the main elements of CAMELOT are described and two examples, solved using the toolbox, are presented.
Masmoudi, Fatma; Ben Khedher, Saoussen; Kamoun, Amel; Zouari, Nabil; Tounsi, Slim; Trigui, Mohamed
2017-04-01
This work is directed towards metabolite production by Bacillus amyloliquefaciens strain BLB371 for the biocontrol of fungal phytopathogens. In order to maximise antifungal metabolite production by this strain, two approaches were combined: random mutagenesis and medium component optimization. After three rounds of mutagenesis, a hyperactive mutant, named M3-7, was obtained. It produces 7-fold more antifungal metabolites (1800 AU/mL) than the wild strain in MC medium. A hybrid design was applied to optimise a new medium to enhance antifungal metabolite production by M3-7. The new optimized medium (35 g/L of peptone, 32.5 g/L of sucrose, 10.5 g/L of yeast extract, 2.4 g/L of KH2PO4, 1.3 g/L of MgSO4 and 23 mg/L of MnSO4) achieved a 1.62-fold enhancement in antifungal compound production (3000 AU/mL) by this mutant, compared to that achieved in MC medium. Therefore, the combined effect of these two approaches (mutagenesis and medium component optimization) allowed a 12-fold improvement in antifungal activity (from 250 AU/mL to 3000 AU/mL). This improvement was confirmed against several phytopathogenic fungi, with an increase in MIC and MFC of more than 50%. More interestingly, a total eradication of gray mold was obtained on tomato fruits infected by Botrytis cinerea and treated with M3-7, compared to those treated with BLB371. From a practical point of view, combining random mutagenesis and medium optimization can be considered an excellent tool for obtaining promising biological products useful against phytopathogenic fungi. Copyright © 2017 Elsevier GmbH. All rights reserved.
Building synthetic gene circuits from combinatorial libraries: screening and selection strategies.
Schaerli, Yolanda; Isalan, Mark
2013-07-01
The promise of wide-ranging biotechnology applications inspires synthetic biologists to design novel genetic circuits. However, building such circuits rationally is still not straightforward and often involves painstaking trial-and-error. Mimicking the process of natural selection can help us to bridge the gap between our incomplete understanding of nature's design rules and our desire to build functional networks. By adopting the powerful method of directed evolution, which is usually applied to protein engineering, functional networks can be obtained through screening or selecting from randomised combinatorial libraries. This review first highlights the practical options to introduce combinatorial diversity into gene circuits and then examines strategies for identifying the potentially rare library members with desired functions, either by screening or selection.
Xiang, Xiao-Dong; Sun, Xiaodong; Schultz, Peter G.
2000-01-01
This invention relates to new phosphor materials and to combinatorial methods of synthesizing and detecting the same. In addition, methods of using phosphors to generate luminescence are also disclosed.
ERIC Educational Resources Information Center
MacGregor, James N.; Chronicle, Edward P.; Ormerod, Thomas C.
2006-01-01
We compared the performance of three heuristics with that of subjects on variants of a well-known combinatorial optimization task, the Traveling Salesperson Problem (TSP). The present task consisted of finding the shortest path through an array of points from one side of the array to the other. Like the standard TSP, the task is computationally…
Navigation Solution for a Multiple Satellite and Multiple Ground Architecture
2014-09-14
Primer Vector Theory; The Traveling Salesman Problem ... the Traveling Salesman problem [42]. It is framed as a nonlinear programming, complete combinatorial optimization where the orbital debris pieces relate ... impulsive maneuvers and applies his findings to a Hohmann transfer with the addition of mid-course burns and wait times.
Liu, Chun; Kroll, Andreas
2016-01-01
Multi-robot task allocation determines the task sequence and distribution for a group of robots in multi-robot systems; it is a constrained combinatorial optimization problem and becomes more complex in the case of cooperative tasks, because these introduce additional spatial and temporal constraints. To solve multi-robot task allocation problems with cooperative tasks efficiently, a subpopulation-based genetic algorithm, a crossover-free genetic algorithm employing mutation operators and elitism selection in each subpopulation, is developed in this paper. Moreover, the impact of mutation operators (swap, insertion, inversion, displacement, and their various combinations) is analyzed when solving several industrial plant inspection problems. The experimental results show that: (1) the proposed genetic algorithm can obtain better solutions than the tested binary tournament genetic algorithm with partially mapped crossover; (2) inversion mutation performs better than the other tested mutation operators when solving problems without cooperative tasks, and the swap-inversion combination performs better than the other tested mutation operators/combinations when solving problems with cooperative tasks. As it is difficult to produce all desired effects with a single mutation operator, using multiple mutation operators (including both inversion and swap) is suggested when solving similar combinatorial optimization problems.
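The four mutation operators compared in the study act on a task-sequence chromosome and are simple to sketch; the plain list representation below is an assumption for illustration, since the actual chromosomes also encode robot assignments and cooperative-task constraints.

```python
import random

# The four permutation mutation operators named above, sketched for a
# task-sequence chromosome represented as a Python list (mutated in place).
def swap(seq, rng=random):
    i, j = rng.sample(range(len(seq)), 2)
    seq[i], seq[j] = seq[j], seq[i]

def insertion(seq, rng=random):
    i, j = rng.sample(range(len(seq)), 2)
    seq.insert(j, seq.pop(i))

def inversion(seq, rng=random):
    i, j = sorted(rng.sample(range(len(seq)), 2))
    seq[i:j + 1] = reversed(seq[i:j + 1])

def displacement(seq, rng=random):
    i, j = sorted(rng.sample(range(len(seq)), 2))
    block = [seq.pop(i) for _ in range(j - i + 1)]
    k = rng.randrange(len(seq) + 1)
    seq[k:k] = block

tasks = list(range(10))
inversion(tasks)
print(tasks)
```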
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mardirossian, Narbe; Head-Gordon, Martin
2016-06-07
A combinatorially optimized, range-separated hybrid, meta-GGA density functional with VV10 nonlocal correlation is presented in this paper. The final 12-parameter functional form is selected from approximately 10 × 10^9 candidate fits that are trained on a training set of 870 data points and tested on a primary test set of 2964 data points. The resulting density functional, ωB97M-V, is further tested for transferability on a secondary test set of 1152 data points. For comparison, ωB97M-V is benchmarked against 11 leading density functionals including M06-2X, ωB97X-D, M08-HX, M11, ωM05-D, ωB97X-V, and MN15. Encouragingly, the overall performance of ωB97M-V on nearly 5000 data points clearly surpasses that of all of the tested density functionals. Finally, in order to facilitate the use of ωB97M-V, its basis set dependence and integration grid sensitivity are thoroughly assessed, and recommendations that take into account both efficiency and accuracy are provided.
Discovery of Peptidomimetic Ligands of EED as Allosteric Inhibitors of PRC2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barnash, Kimberly D.; The, Juliana; Norris-Drouin, Jacqueline L.
The function of EED within polycomb repressive complex 2 (PRC2) is mediated by a complex network of protein–protein interactions. Allosteric activation of PRC2 by binding of methylated proteins to the embryonic ectoderm development (EED) aromatic cage is essential for full catalytic activity, but details of this regulation are not fully understood. EED's recognition of the product of PRC2 activity, histone H3 lysine 27 trimethylation (H3K27me3), stimulates PRC2 methyltransferase activity at adjacent nucleosomes, leading to H3K27me3 propagation and, ultimately, gene repression. By coupling combinatorial chemistry and structure-based design, we optimized a low-affinity methylated jumonji, AT-rich interactive domain 2 (Jarid2) peptide to a smaller, more potent peptidomimetic ligand (Kd = 1.14 ± 0.14 μM) of the aromatic cage of EED. Our strategy illustrates the effectiveness of applying combinatorial chemistry to achieve both ligand potency and property optimization. Furthermore, the resulting ligands, UNC5114 and UNC5115, demonstrate that targeted disruption of EED's reader function can lead to allosteric inhibition of PRC2 catalytic activity.
Brasil, Christiane Regina Soares; Delbem, Alexandre Claudio Botazzo; da Silva, Fernando Luís Barroso
2013-07-30
This article focuses on the development of an approach for ab initio protein structure prediction (PSP) that does not use any prior knowledge from similar protein structures, such as fragment-based statistics or inferred secondary structures. Such an approach is called purely ab initio prediction. The article shows that well-designed multiobjective evolutionary algorithms can predict relevant protein structures in a purely ab initio way. One challenge for purely ab initio PSP is the prediction of structures containing β-sheets. To work with such proteins, this research has also developed procedures to efficiently estimate hydrogen bond and solvation contribution energies. Considering van der Waals, electrostatic, hydrogen bond, and solvation contribution energies, PSP is a problem with four energetic terms to be minimized. Each interaction energy term can be considered an objective of an optimization method. Combinatorial problems with four objectives have been considered too complex for the available multiobjective optimization (MOO) methods. The proposed approach, called "Multiobjective evolutionary algorithms with many tables" (MEAMT), can efficiently deal with four objectives through their combination, performing a more adequate sampling of the objective space. Therefore, this method can better map the promising regions in this space, predicting structures in a purely ab initio way. In other words, MEAMT is an efficient optimization method for MOO, which simultaneously explores the search space and the objective space. MEAMT can predict structures with one or two domains with RMSDs comparable to values obtained by recently developed ab initio methods (GAPFCG, I-PAES, and Quark) that use different levels of prior knowledge. Copyright © 2013 Wiley Periodicals, Inc.
Breast cancer prognosis by combinatorial analysis of gene expression data.
Alexe, Gabriela; Alexe, Sorin; Axelrod, David E; Bonates, Tibérius O; Lozina, Irina I; Reiss, Michael; Hammer, Peter L
2006-01-01
The potential of applying data analysis tools to microarray data for diagnosis and prognosis is illustrated on the recent breast cancer dataset of van 't Veer and coworkers. We re-examine that dataset using the novel technique of logical analysis of data (LAD), with the double objective of discovering patterns characteristic for cases with good or poor outcome, using them for accurate and justifiable predictions; and deriving novel information about the role of genes, the existence of special classes of cases, and other factors. Data were analyzed using the combinatorics and optimization-based method of LAD, recently shown to provide highly accurate diagnostic and prognostic systems in cardiology, cancer proteomics, hematology, pulmonology, and other disciplines. LAD identified a subset of 17 of the 25,000 genes, capable of fully distinguishing between patients with poor, respectively good prognoses. An extensive list of 'patterns' or 'combinatorial biomarkers' (that is, combinations of genes and limitations on their expression levels) was generated, and 40 patterns were used to create a prognostic system, shown to have 100% and 92.9% weighted accuracy on the training and test sets, respectively. The prognostic system uses fewer genes than other methods, and has similar or better accuracy than those reported in other studies. Out of the 17 genes identified by LAD, three (respectively, five) were shown to play a significant role in determining poor (respectively, good) prognosis. Two new classes of patients (described by similar sets of covering patterns, gene expression ranges, and clinical features) were discovered. As a by-product of the study, it is shown that the training and the test sets of van 't Veer have differing characteristics. The study shows that LAD provides an accurate and fully explanatory prognostic system for breast cancer using genomic data (that is, a system that, in addition to predicting good or poor prognosis, provides an individualized explanation of the reasons for that prognosis for each patient). Moreover, the LAD model provides valuable insights into the roles of individual and combinatorial biomarkers, allows the discovery of new classes of patients, and generates a vast library of biomedical research hypotheses.
NASA Astrophysics Data System (ADS)
Evans, Garrett Nolan
In this work, I present two projects that both contribute to the aim of discovering how intelligence manifests in the brain. The first project is a method for analyzing recorded neural signals, which takes the form of a convolution-based metric on neural membrane potential recordings. Relying only on integral and algebraic operations, the metric compares the timing and number of spikes within recordings as well as the recordings' subthreshold features: summarizing differences in these with a single "distance" between the recordings. Like van Rossum's (2001) metric for spike trains, the metric is based on a convolution operation that it performs on the input data. The kernel used for the convolution is carefully chosen such that it produces a desirable frequency space response and, unlike van Rossum's kernel, causes the metric to be first order both in differences between nearby spike times and in differences between same-time membrane potential values: an important trait. The second project is a combinatorial syntax method for connectionist semantic network encoding. Combinatorial syntax has been a point on which those who support a symbol-processing view of intelligent processing and those who favor a connectionist view have had difficulty seeing eye-to-eye. Symbol-processing theorists have persuasively argued that combinatorial syntax is necessary for certain intelligent mental operations, such as reasoning by analogy. Connectionists have focused on the versatility and adaptability offered by self-organizing networks of simple processing units. With this project, I show that there is a way to reconcile the two perspectives and to ascribe a combinatorial syntax to a connectionist network. The critical principle is to interpret nodes, or units, in the connectionist network as bound integrations of the interpretations for nodes that they share links with. Nodes need not correspond exactly to neurons and may correspond instead to distributed sets, or assemblies, of neurons.
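A van Rossum-style distance, the starting point for the metric described above, can be sketched as follows; the exponential kernel, sampling step, and noise model are illustrative assumptions, whereas the metric in this work uses a different, carefully chosen kernel so that spike-time and subthreshold differences both enter at first order.

```python
import numpy as np

def convolution_distance(v1, v2, dt=1e-4, tau=5e-3):
    """Van Rossum-style distance between two equally sampled membrane
    potential recordings: convolve each with a causal exponential kernel
    and take the L2 norm of the difference."""
    t = np.arange(0, 10 * tau, dt)
    kernel = np.exp(-t / tau)
    f1 = np.convolve(v1, kernel)[: len(v1)] * dt
    f2 = np.convolve(v2, kernel)[: len(v2)] * dt
    return float(np.sqrt(np.sum((f1 - f2) ** 2) * dt))

rng = np.random.default_rng(0)
v1 = rng.normal(-65.0, 1.0, size=5000)       # toy recording (mV)
v2 = v1 + rng.normal(0.0, 0.5, size=5000)    # perturbed copy
print(convolution_distance(v1, v2))
```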
Sin(x)**2 + cos(x)**2 = 1. [programming identities using comparative combinatorial substitutions]
NASA Technical Reports Server (NTRS)
Stoutemyer, D. R.
1977-01-01
Attempts to achieve tasteful automatic employment of the identities sin^2(x) + cos^2(x) = 1 and cosh^2(x) - sinh^2(x) = 1 in a manner which truly minimizes the complexity of the resulting expression are described. The disappointments of trigonometric reduction, trigonometric expansion, pattern matching, Poisson series, and Demoivre's theorem are related. The advantages of using the method of comparative combinatorial substitutions are illustrated.
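For context, a modern computer algebra system applies these identities during general-purpose simplification; the short SymPy session below only illustrates the target behaviour and is unrelated to the comparative combinatorial substitution method investigated in the paper.

```python
import sympy as sp

x = sp.symbols('x')
print(sp.simplify(sp.sin(x)**2 + sp.cos(x)**2))                   # 1
print(sp.simplify(sp.cosh(x)**2 - sp.sinh(x)**2))                 # 1
print(sp.simplify(3*sp.sin(x)**2 + 3*sp.cos(x)**2 + sp.exp(x)))   # expected: exp(x) + 3
```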
Villagra, David; Goethe, John; Schwartz, Harold I; Szarek, Bonnie; Kocherla, Mohan; Gorowski, Krystyna; Windemuth, Andreas; Ruaño, Gualberto
2011-01-01
Aims We aim to demonstrate clinical relevance and utility of four novel drug-metabolism indices derived from a combinatory (multigene) approach to CYP2C9, CYP2C19 and CYP2D6 allele scoring. Each index considers all three genes as complementary components of a liver enzyme drug metabolism system and uniquely benchmarks innate hepatic drug metabolism reserve or alteration through CYP450 combinatory genotype scores. Methods A total of 1199 psychiatric referrals were genotyped for polymorphisms in the CYP2C9, CYP2C19 and CYP2D6 gene loci and were scored on each of the four indices. The data were used to create distributions and rankings of innate drug metabolism capacity to which individuals can be compared. Drug-specific indices are a combination of the drug metabolism indices with substrate-specific coefficients. Results The combinatory drug metabolism indices proved useful in positioning individuals relative to a population with regard to innate drug metabolism capacity prior to pharmacotherapy. Drug-specific indices generate pharmacogenetic guidance of immediate clinical relevance, and can be further modified to incorporate covariates in particular clinical cases. Conclusions We believe that this combinatory approach represents an improvement over the current gene-by-gene reporting by providing greater scope while still allowing for the resolution of a single-gene index when needed. This method will result in novel clinical and research applications, facilitating the translation from pharmacogenomics to personalized medicine, particularly in psychiatry where many drugs are metabolized or activated by multiple CYP450 isoenzymes. PMID:21861665
MDTS: automatic complex materials design using Monte Carlo tree search.
M Dieb, Thaer; Ju, Shenghong; Yoshizoe, Kazuki; Hou, Zhufeng; Shiomi, Junichiro; Tsuda, Koji
2017-01-01
Complex materials design is often represented as a black-box combinatorial optimization problem. In this paper, we present a novel python library called MDTS (Materials Design using Tree Search). Our algorithm employs a Monte Carlo tree search approach, which has shown exceptional performance in computer Go game. Unlike evolutionary algorithms that require user intervention to set parameters appropriately, MDTS has no tuning parameters and works autonomously in various problems. In comparison to a Bayesian optimization package, our algorithm showed competitive search efficiency and superior scalability. We succeeded in designing large Silicon-Germanium (Si-Ge) alloy structures that Bayesian optimization could not deal with due to excessive computational cost. MDTS is available at https://github.com/tsudalab/MDTS.
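A minimal Monte Carlo tree search over binary site assignments, in the spirit of the approach described above, can be sketched as follows; the black_box_score stand-in, the UCB1 selection rule, and the random rollout policy are illustrative assumptions and not the MDTS implementation.

```python
import math
import random

def black_box_score(structure):
    """Stand-in for the expensive property evaluation (e.g., a physics
    simulation of a Si-Ge structure); here just a toy function."""
    return -abs(sum(structure) - len(structure) // 2)

class Node:
    def __init__(self, prefix):
        self.prefix = prefix        # partial assignment of Si(0)/Ge(1) sites
        self.children = {}
        self.visits = 0
        self.value = 0.0

def mcts(n_sites, n_iters=2000, c=1.4, seed=0):
    """Minimal Monte Carlo tree search over binary site assignments using
    UCB1 selection, single-child expansion and random rollouts."""
    rng = random.Random(seed)
    root = Node(())
    best = (float("-inf"), None)
    for _ in range(n_iters):
        node, path = root, [root]
        # selection: descend while the current node is fully expanded
        while len(node.prefix) < n_sites and len(node.children) == 2:
            node = max(node.children.values(),
                       key=lambda ch: ch.value / ch.visits
                       + c * math.sqrt(math.log(node.visits) / ch.visits))
            path.append(node)
        # expansion: add one untried child if the node is not terminal
        if len(node.prefix) < n_sites:
            choice = rng.choice([b for b in (0, 1) if b not in node.children])
            child = Node(node.prefix + (choice,))
            node.children[choice] = child
            node = child
            path.append(node)
        # rollout: complete the assignment at random and score it
        structure = list(node.prefix) + [rng.randint(0, 1)
                                         for _ in range(n_sites - len(node.prefix))]
        score = black_box_score(structure)
        best = max(best, (score, tuple(structure)))
        # backup: propagate the score along the visited path
        for n in path:
            n.visits += 1
            n.value += score
    return best

print(mcts(n_sites=16))
```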
MDTS: automatic complex materials design using Monte Carlo tree search
NASA Astrophysics Data System (ADS)
Dieb, Thaer M.; Ju, Shenghong; Yoshizoe, Kazuki; Hou, Zhufeng; Shiomi, Junichiro; Tsuda, Koji
2017-12-01
Complex materials design is often represented as a black-box combinatorial optimization problem. In this paper, we present a novel python library called MDTS (Materials Design using Tree Search). Our algorithm employs a Monte Carlo tree search approach, which has shown exceptional performance in computer Go game. Unlike evolutionary algorithms that require user intervention to set parameters appropriately, MDTS has no tuning parameters and works autonomously in various problems. In comparison to a Bayesian optimization package, our algorithm showed competitive search efficiency and superior scalability. We succeeded in designing large Silicon-Germanium (Si-Ge) alloy structures that Bayesian optimization could not deal with due to excessive computational cost. MDTS is available at https://github.com/tsudalab/MDTS.
A theoretical comparison of evolutionary algorithms and simulated annealing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hart, W.E.
1995-08-28
This paper theoretically compares the performance of simulated annealing and evolutionary algorithms. Our main result is that, under mild conditions, a wide variety of evolutionary algorithms can be shown to have greater performance than simulated annealing after a sufficiently large number of function evaluations. This class of EAs includes variants of evolution strategies and evolutionary programming, the canonical genetic algorithm, as well as a variety of genetic algorithms that have been applied to combinatorial optimization problems. The proof of this result is based on a performance analysis of a very general class of stochastic optimization algorithms, which has implications for the performance of a variety of other optimization algorithms.
Plasma Enhanced Growth of Carbon Nanotubes For Ultrasensitive Biosensors
NASA Technical Reports Server (NTRS)
Cassell, Alan M.; Meyyappan, M.
2004-01-01
The multitude of considerations facing nanostructure growth and integration lends itself to combinatorial optimization approaches. Rapid optimization becomes even more important with wafer-scale growth and integration processes. Here we discuss methodology for developing plasma enhanced CVD growth techniques for achieving individual, vertically aligned carbon nanostructures that show excellent properties as ultrasensitive electrodes for nucleic acid detection. We utilize high throughput strategies for optimizing the upstream and downstream processing and integration of carbon nanotube electrodes as functional elements in various device types. An overview of ultrasensitive carbon nanotube based sensor arrays for electrochemical bio-sensing applications and the high throughput methodology utilized to combine novel electrode technology with conventional MEMS processing will be presented.
Plasma Enhanced Growth of Carbon Nanotubes For Ultrasensitive Biosensors
NASA Technical Reports Server (NTRS)
Cassell, Alan M.; Li, J.; Ye, Q.; Koehne, J.; Chen, H.; Meyyappan, M.
2004-01-01
The multitude of considerations facing nanostructure growth and integration lends itself to combinatorial optimization approaches. Rapid optimization becomes even more important with wafer-scale growth and integration processes. Here we discuss methodology for developing plasma enhanced CVD growth techniques for achieving individual, vertically aligned carbon nanostructures that show excellent properties as ultrasensitive electrodes for nucleic acid detection. We utilize high throughput strategies for optimizing the upstream and downstream processing and integration of carbon nanotube electrodes as functional elements in various device types. An overview of ultrasensitive carbon nanotube based sensor arrays for electrochemical biosensing applications and the high throughput methodology utilized to combine novel electrode technology with conventional MEMS processing will be presented.
Modern drug discovery technologies: opportunities and challenges in lead discovery.
Guido, Rafael V C; Oliva, Glaucius; Andricopulo, Adriano D
2011-12-01
The identification of promising hits and the generation of high quality leads are crucial steps in the early stages of drug discovery projects. The definition and assessment of both chemical and biological space have revitalized the screening process model and emphasized the importance of exploring the intrinsic complementary nature of classical and modern methods in drug research. In this context, the widespread use of combinatorial chemistry and sophisticated screening methods for the discovery of lead compounds has created a large demand for small organic molecules that act on specific drug targets. Modern drug discovery involves the employment of a wide variety of technologies and expertise in multidisciplinary research teams. The synergistic effects between experimental and computational approaches on the selection and optimization of bioactive compounds emphasize the importance of the integration of advanced technologies in drug discovery programs. These technologies (VS, HTS, SBDD, LBDD, QSAR, and so on) are complementary in the sense that they have mutual goals, thereby the combination of both empirical and in silico efforts is feasible at many different levels of lead optimization and new chemical entity (NCE) discovery. This paper provides a brief perspective on the evolution and use of key drug design technologies, highlighting opportunities and challenges.
Heidema, A Geert; Boer, Jolanda M A; Nagelkerke, Nico; Mariman, Edwin C M; van der A, Daphne L; Feskens, Edith J M
2006-04-21
Genetic epidemiologists have taken the challenge to identify genetic polymorphisms involved in the development of diseases. Many have collected data on large numbers of genetic markers but are not familiar with available methods to assess their association with complex diseases. Statistical methods have been developed for analyzing the relation between large numbers of genetic and environmental predictors to disease or disease-related variables in genetic association studies. In this commentary we discuss logistic regression analysis, neural networks, including the parameter decreasing method (PDM) and genetic programming optimized neural networks (GPNN) and several non-parametric methods, which include the set association approach, combinatorial partitioning method (CPM), restricted partitioning method (RPM), multifactor dimensionality reduction (MDR) method and the random forests approach. The relative strengths and weaknesses of these methods are highlighted. Logistic regression and neural networks can handle only a limited number of predictor variables, depending on the number of observations in the dataset. Therefore, they are less useful than the non-parametric methods to approach association studies with large numbers of predictor variables. GPNN on the other hand may be a useful approach to select and model important predictors, but its performance to select the important effects in the presence of large numbers of predictors needs to be examined. Both the set association approach and random forests approach are able to handle a large number of predictors and are useful in reducing these predictors to a subset of predictors with an important contribution to disease. The combinatorial methods give more insight in combination patterns for sets of genetic and/or environmental predictor variables that may be related to the outcome variable. As the non-parametric methods have different strengths and weaknesses we conclude that to approach genetic association studies using the case-control design, the application of a combination of several methods, including the set association approach, MDR and the random forests approach, will likely be a useful strategy to find the important genes and interaction patterns involved in complex diseases.
Bootstrapping on Undirected Binary Networks Via Statistical Mechanics
NASA Astrophysics Data System (ADS)
Fushing, Hsieh; Chen, Chen; Liu, Shan-Yu; Koehl, Patrice
2014-09-01
We propose a new method inspired from statistical mechanics for extracting geometric information from undirected binary networks and generating random networks that conform to this geometry. In this method an undirected binary network is perceived as a thermodynamic system with a collection of permuted adjacency matrices as its states. The task of extracting information from the network is then reformulated as a discrete combinatorial optimization problem of searching for its ground state. To solve this problem, we apply multiple ensembles of temperature regulated Markov chains to establish an ultrametric geometry on the network. This geometry is equipped with a tree hierarchy that captures the multiscale community structure of the network. We translate this geometry into a Parisi adjacency matrix, which has a relative low energy level and is in the vicinity of the ground state. The Parisi adjacency matrix is then further optimized by making block permutations subject to the ultrametric geometry. The optimal matrix corresponds to the macrostate of the original network. An ensemble of random networks is then generated such that each of these networks conforms to this macrostate; the corresponding algorithm also provides an estimate of the size of this ensemble. By repeating this procedure at different scales of the ultrametric geometry of the network, it is possible to compute its evolution entropy, i.e. to estimate the evolution of its complexity as we move from a coarse to a fine description of its geometric structure. We demonstrate the performance of this method on simulated as well as real data networks.
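The core idea of treating permuted adjacency matrices as thermodynamic states can be illustrated with a much simpler stand-in: simulated annealing over node orderings of a small planted two-community network, with a "keep linked nodes close" energy. This is not the authors' algorithm (their method uses ensembles of temperature-regulated chains and an ultrametric construction); the energy, graph and cooling schedule below are assumptions for illustration only.

```python
# A minimal sketch, not the paper's method: simulated annealing over node
# permutations of a binary adjacency matrix with a toy "bandwidth" energy.
import numpy as np

rng = np.random.default_rng(1)
n = 30
A = np.zeros((n, n), int)                 # planted two-community network (hypothetical)
for i in range(n):
    for j in range(i + 1, n):
        p = 0.6 if (i < n // 2) == (j < n // 2) else 0.05
        A[i, j] = A[j, i] = rng.random() < p

dist = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))

def energy(perm):
    # low energy when connected nodes sit close together in the ordering
    return float((A[np.ix_(perm, perm)] * dist).sum())

perm = rng.permutation(n)
E, T = energy(perm), 50.0
for step in range(20000):
    i, j = rng.integers(n, size=2)
    cand = perm.copy()
    cand[i], cand[j] = cand[j], cand[i]   # propose swapping two positions
    dE = energy(cand) - E
    if dE < 0 or rng.random() < np.exp(-dE / T):
        perm, E = cand, E + dE            # Metropolis acceptance
    T *= 0.9997                           # geometric cooling schedule
print("final energy:", E)                 # low energy -> block (community) ordering
```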
The disadvantage of combinatorial communication.
Lachmann, Michael; Bergstrom, Carl T.
2004-01-01
Combinatorial communication allows rapid and efficient transfer of detailed information, yet combinatorial communication is used by few, if any, non-human species. To complement recent studies illustrating the advantages of combinatorial communication, we highlight a critical disadvantage. We use the concept of information value to show that deception poses a greater and qualitatively different threat to combinatorial signalling than to non-combinatorial systems. This additional potential for deception may represent a strategic barrier that has prevented widespread evolution of combinatorial communication. Our approach has the additional benefit of drawing clear distinctions among several types of deception that can occur in communication systems. PMID:15556886
The disadvantage of combinatorial communication.
Lachmann, Michael; Bergstrom, Carl T
2004-11-22
Combinatorial communication allows rapid and efficient transfer of detailed information, yet combinatorial communication is used by few, if any, non-human species. To complement recent studies illustrating the advantages of combinatorial communication, we highlight a critical disadvantage. We use the concept of information value to show that deception poses a greater and qualitatively different threat to combinatorial signalling than to non-combinatorial systems. This additional potential for deception may represent a strategic barrier that has prevented widespread evolution of combinatorial communication. Our approach has the additional benefit of drawing clear distinctions among several types of deception that can occur in communication systems.
Combinatorial influence of environmental parameters on transcription factor activity.
Knijnenburg, T A; Wessels, L F A; Reinders, M J T
2008-07-01
Cells receive a wide variety of environmental signals, which are often processed combinatorially to generate specific genetic responses. Changes in transcript levels, as observed across different environmental conditions, can, to a large extent, be attributed to changes in the activity of transcription factors (TFs). However, in unraveling these transcription regulation networks, the actual environmental signals are often not incorporated into the model, simply because they have not been measured. The unquantified heterogeneity of the environmental parameters across microarray experiments frustrates regulatory network inference. We propose an inference algorithm that models the influence of environmental parameters on gene expression. The approach is based on a yeast microarray compendium of chemostat steady-state experiments. Chemostat cultivation enables the accurate control and measurement of many of the key cultivation parameters, such as nutrient concentrations, growth rate and temperature. The observed transcript levels are explained by inferring the activity of TFs in response to combinations of cultivation parameters. The interplay between activated enhancers and repressors that bind a gene promoter determines the possible up- or downregulation of the gene. The model is translated into a linear integer optimization problem. The resulting regulatory network identifies the combinatorial effects of environmental parameters on TF activity and gene expression. The Matlab code is available from the authors upon request. Supplementary data are available at Bioinformatics online.
Causal gene identification using combinatorial V-structure search.
Cai, Ruichu; Zhang, Zhenjie; Hao, Zhifeng
2013-07-01
With the advances in biomedical techniques in the last decade, the costs of human genomic sequencing and genomic activity monitoring are coming down rapidly. To support the huge genome-based business in the near future, researchers are eager to find killer applications based on human genome information. Causal gene identification is one of the most promising applications, which may help potential patients estimate the risk of certain genetic diseases and locate the target gene for further genetic therapy. Unfortunately, existing pattern recognition techniques, such as Bayesian networks, cannot be directly applied to find the accurate causal relationship between genes and diseases. This is mainly due to the insufficient number of samples and the extremely high dimensionality of the gene space. In this paper, we present the first practical solution to causal gene identification, utilizing a new combinatorial formulation over V-Structures commonly used in conventional Bayesian networks, by exploring the combinations of significant V-Structures. We prove the NP-hardness of the combinatorial search problem under general settings of the significance measure on the V-Structures, and present a greedy algorithm to find sub-optimal results. Extensive experiments show that our proposal is both scalable and effective, particularly with interesting findings on the causal genes over real human genome data. Copyright © 2013 Elsevier Ltd. All rights reserved.
High-throughput screening for combinatorial thin-film library of thermoelectric materials.
Watanabe, Masaki; Kita, Takuji; Fukumura, Tomoteru; Ohtomo, Akira; Ueno, Kazunori; Kawasaki, Masashi
2008-01-01
A high-throughput method has been developed to evaluate the Seebeck coefficient and electrical resistivity of combinatorial thin-film libraries of thermoelectric materials from room temperature to 673 K. Thin-film samples several millimeters in size were deposited on an integrated Al2O3 substrate with embedded lead wires and local heaters for measurement of the thermopower under a controlled temperature gradient. An infrared camera was used for real-time observation of the temperature difference Delta T between two electrical contacts on the sample to obtain the Seebeck coefficient. The Seebeck coefficient and electrical resistivity of constantan thin films were shown to be almost identical to standard data for bulk constantan. High-throughput screening was demonstrated for a thermoelectric Mg-Si-Ge combinatorial library.
Combinatorial and Algorithmic Rigidity: Beyond Two Dimensions
2012-12-01
[Fragmented report text.] The project, funded in 2008 under the DARPA solicitation "Mathematical Challenges" (BAA 07-68), addressed Mathematical Challenge Ten: Algorithmic Origami and ..., produced a number of optimal algorithms, and provided critical complexity analysis. Cited work includes G. Panina and I. Streinu, "Flattening single-vertex origami: the non-expansive case," Computational Geometry: Theory and ...
Data-Driven Online and Real-Time Combinatorial Optimization
2013-10-30
[Fragmented report text.] The work treats the online Traveling Salesman Problem and variations of the online Quota Hamiltonian Path Problem, including an algorithm with the lowest competitive ratio among all algorithms of its kind, randomized algorithms for the Online Traveling Salesman Problem, and the matroid secretary problem on a partition matroid. Cited work includes Jaillet, P. and X. Lu, "Online Traveling Salesman Problems with Rejection Options", submitted.
Uher, Vojtěch; Gajdoš, Petr; Radecký, Michal; Snášel, Václav
2016-01-01
The Differential Evolution (DE) is a widely used bioinspired optimization algorithm developed by Storn and Price. It is popular for its simplicity and robustness. This algorithm was primarily designed for real-valued problems and continuous functions, but several modified versions optimizing both integer and discrete-valued problems have been developed. The discrete-coded DE has been mostly used for combinatorial problems in a set of enumerative variants. However, the DE has a great potential in the spatial data analysis and pattern recognition. This paper formulates the problem as a search of a combination of distinct vertices which meet the specified conditions. It proposes a novel approach called the Multidimensional Discrete Differential Evolution (MDDE) applying the principle of the discrete-coded DE in discrete point clouds (PCs). The paper examines the local searching abilities of the MDDE and its convergence to the global optimum in the PCs. The multidimensional discrete vertices cannot be simply ordered to get a convenient course of the discrete data, which is crucial for good convergence of a population. A novel mutation operator utilizing linear ordering of spatial data based on the space filling curves is introduced. The algorithm is tested on several spatial datasets and optimization problems. The experiments show that the MDDE is an efficient and fast method for discrete optimizations in the multidimensional point clouds.
Utilization of the Discrete Differential Evolution for Optimization in Multidimensional Point Clouds
Radecký, Michal; Snášel, Václav
2016-01-01
The Differential Evolution (DE) is a widely used bioinspired optimization algorithm developed by Storn and Price. It is popular for its simplicity and robustness. This algorithm was primarily designed for real-valued problems and continuous functions, but several modified versions optimizing both integer and discrete-valued problems have been developed. The discrete-coded DE has been mostly used for combinatorial problems in a set of enumerative variants. However, the DE has a great potential in the spatial data analysis and pattern recognition. This paper formulates the problem as a search of a combination of distinct vertices which meet the specified conditions. It proposes a novel approach called the Multidimensional Discrete Differential Evolution (MDDE) applying the principle of the discrete-coded DE in discrete point clouds (PCs). The paper examines the local searching abilities of the MDDE and its convergence to the global optimum in the PCs. The multidimensional discrete vertices cannot be simply ordered to get a convenient course of the discrete data, which is crucial for good convergence of a population. A novel mutation operator utilizing linear ordering of spatial data based on the space filling curves is introduced. The algorithm is tested on several spatial datasets and optimization problems. The experiments show that the MDDE is an efficient and fast method for discrete optimizations in the multidimensional point clouds. PMID:27974884
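The space-filling-curve idea behind the MDDE mutation operator can be sketched as follows: order the points along a Z-order (Morton) curve, then run a discrete-coded DE whose individuals are index tuples into that ordering. The toy objective (find three distinct points forming a large triangle), the control parameters and the duplicate-repair step are assumptions of this sketch, not the paper's code.

```python
# A rough sketch of the idea, with a hypothetical objective: discrete DE over
# indices into a Morton-ordered 2-D point cloud.
import numpy as np

rng = np.random.default_rng(2)
pts = rng.random((200, 2))

def morton(p, bits=16):
    x = int(p[0] * (2**bits - 1))
    y = int(p[1] * (2**bits - 1))
    code = 0
    for b in range(bits):                       # interleave the bits of x and y
        code |= ((x >> b) & 1) << (2 * b)
        code |= ((y >> b) & 1) << (2 * b + 1)
    return code

pts = pts[np.argsort([morton(p) for p in pts])]  # space-filling-curve ordering
n, K, NP, F, CR = len(pts), 3, 40, 0.7, 0.9

def area(idx):                                   # fitness: area of the triangle
    (ax, ay), (bx, by), (cx, cy) = pts[idx]
    return 0.5 * abs((bx - ax) * (cy - ay) - (by - ay) * (cx - ax))

def repair(idx):                                 # keep the K indices distinct
    while len(set(idx)) < K:
        idx[rng.integers(K)] = rng.integers(n)
    return idx

pop = [repair(rng.integers(n, size=K)) for _ in range(NP)]
fit = [area(x) for x in pop]
for gen in range(200):
    for i in range(NP):
        a, b, c = (pop[j] for j in rng.choice(NP, 3, replace=False))
        mutant = np.clip(np.round(a + F * (b - c)).astype(int), 0, n - 1)
        trial = repair(np.where(rng.random(K) < CR, mutant, pop[i]).copy())
        if area(trial) > fit[i]:                 # greedy selection
            pop[i], fit[i] = trial, area(trial)
print("best triangle area:", max(fit))
```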
Siol, Sebastian; Dhakal, Tara P; Gudavalli, Ganesh S; Rajbhandari, Pravakar P; DeHart, Clay; Baranowski, Lauryn L; Zakutayev, Andriy
2016-06-08
High-throughput computational and experimental techniques have been used in the past to accelerate the discovery of new promising solar cell materials. An important part of the development of novel thin film solar cell technologies, which is still considered a bottleneck for both theory and experiment, is the search for alternative interfacial contact (buffer) layers. The research and development of contact materials is difficult due to the inherent complexity that arises from their interactions at the interface with the absorber. A promising alternative to the commonly used CdS buffer layer in thin film solar cells that contain absorbers with lower electron affinity can be found in β-In2S3. However, the synthesis conditions for the sputter deposition of this material are not well-established. Here, In2S3 is investigated as a solar cell contact material utilizing a high-throughput combinatorial screening of the temperature-flux parameter space, followed by a number of spatially resolved characterization techniques. It is demonstrated that, by tuning the sulfur partial pressure, phase-pure β-In2S3 could be deposited using a broad range of substrate temperatures between 500 °C and ambient temperature. Combinatorial photovoltaic device libraries with Al/ZnO/In2S3/Cu2ZnSnS4/Mo/SiO2 structure were built at optimal processing conditions to investigate the feasibility of the sputtered In2S3 buffer layers and of an accelerated optimization of the device structure. The performance of the resulting In2S3/Cu2ZnSnS4 photovoltaic devices is on par with CdS/Cu2ZnSnS4 reference solar cells with similar values for short circuit currents and open circuit voltages, despite the overall quite low efficiency of the devices (∼2%). Overall, these results demonstrate how a high-throughput experimental approach can be used to accelerate the development of contact materials and facilitate the optimization of thin film solar cell devices.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hunt, H.B. III; Rosenkrantz, D.J.; Stearns, R.E.
We study both the complexity and approximability of various graph and combinatorial problems specified using two dimensional narrow periodic specifications (see [CM93, HW92, KMW67, KO91, Or84b, Wa93]). The following two general kinds of results are presented. (1) We prove that a number of natural graph and combinatorial problems are NEXPTIME- or EXPSPACE-complete when instances are so specified; (2) In contrast, we prove that the optimization versions of several of these NEXPTIME-, EXPSPACE-complete problems have polynomial time approximation algorithms with constant performance guarantees. Moreover, some of these problems even have polynomial time approximation schemes. We also sketch how our NEXPTIME-hardness results can be used to prove analogous NEXPTIME-hardness results for problems specified using other kinds of succinct specification languages. Our results provide the first natural problems for which there is a proven exponential (and possibly doubly exponential) gap between the complexities of finding exact and approximate solutions.
Combinatorial and high-throughput screening of materials libraries: review of state of the art.
Potyrailo, Radislav; Rajan, Krishna; Stoewe, Klaus; Takeuchi, Ichiro; Chisholm, Bret; Lam, Hubert
2011-11-14
Rational materials design based on prior knowledge is attractive because it promises to avoid time-consuming synthesis and testing of numerous materials candidates. However, as the complexity of materials increases, the capacity for rational materials design becomes progressively limited. As a result of this complexity, combinatorial and high-throughput (CHT) experimentation in materials science has been recognized as a new scientific approach to generate new knowledge. This review demonstrates the broad applicability of CHT experimentation technologies in the discovery and optimization of new materials. We discuss general principles of CHT materials screening, followed by a detailed discussion of high-throughput materials characterization approaches, advances in data analysis/mining, and new materials developments facilitated by CHT experimentation. We critically analyze results of materials development in the areas most impacted by the CHT approaches, such as catalysis, electronic and functional materials, polymer-based industrial coatings, sensing materials, and biomaterials.
Perspective: Stochastic magnetic devices for cognitive computing
NASA Astrophysics Data System (ADS)
Roy, Kaushik; Sengupta, Abhronil; Shim, Yong
2018-06-01
Stochastic switching of nanomagnets can potentially enable probabilistic cognitive hardware consisting of noisy neural and synaptic components. Furthermore, computational paradigms inspired by the Ising computing model require stochasticity for achieving near-optimality in solutions to various types of combinatorial optimization problems such as the Graph Coloring Problem or the Travelling Salesman Problem. Achieving optimal solutions to such problems is computationally exhaustive, and natural annealing is required to arrive at near-optimal solutions. Stochastic switching of devices also finds use in applications involving Deep Belief Networks and Bayesian Inference. In this article, we provide a multi-disciplinary perspective across the stack of devices, circuits, and algorithms to illustrate how the stochastic switching dynamics of spintronic devices in the presence of thermal noise can provide a direct mapping to the computational units of such probabilistic intelligent systems.
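The Ising-computing picture invoked above can be emulated classically: binary "spins" flip stochastically according to their local field and a temperature that is annealed down, which is exactly how such hardware approaches MAX-CUT-type problems. The sketch below models no device physics; the graph, the Gibbs update and the cooling schedule are illustrative assumptions.

```python
# Illustrative only: Gibbs sampling with annealing on +/-1 spins solving
# MAX-CUT on a small random graph, in the spirit of the Ising computing model.
import numpy as np

rng = np.random.default_rng(3)
n = 24
J = np.triu((rng.random((n, n)) < 0.3).astype(float), 1)  # random graph edges
J = J + J.T

def cut_value(s):
    return 0.25 * float((J * (1 - np.outer(s, s))).sum())

s = rng.choice([-1, 1], n)
best_cut, T = cut_value(s), 3.0
for sweep in range(400):
    for i in rng.permutation(n):
        h = J[i] @ s                      # local field seen by spin i
        p_up = 1.0 / (1.0 + np.exp(2.0 * h / T))
        s[i] = 1 if rng.random() < p_up else -1
    best_cut = max(best_cut, cut_value(s))
    T *= 0.985                            # annealing: reduce thermal noise
print("best cut value found:", best_cut)
```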
Three-dimensional unstructured grid generation via incremental insertion and local optimization
NASA Technical Reports Server (NTRS)
Barth, Timothy J.; Wiltberger, N. Lyn; Gandhi, Amar S.
1992-01-01
Algorithms for the generation of 3D unstructured surface and volume grids are discussed. These algorithms are based on incremental insertion and local optimization. The present algorithms are very general and permit local grid optimization based on various measures of grid quality. This is very important; unlike the 2D Delaunay triangulation, the 3D Delaunay triangulation appears not to have a lexicographic characterization of angularity. (The Delaunay triangulation is known to minimize the maximum containment sphere, but unfortunately this is not true lexicographically). Consequently, Delaunay triangulations in three-space can result in poorly shaped tetrahedral elements. Using the present algorithms, 3D meshes can be constructed which optimize a certain angle measure, albeit locally. We also discuss the combinatorial aspects of the algorithm as well as implementation details.
Ant Colony Optimization for Markowitz Mean-Variance Portfolio Model
NASA Astrophysics Data System (ADS)
Deng, Guang-Feng; Lin, Woo-Tsong
This work presents Ant Colony Optimization (ACO), which was initially developed to be a meta-heuristic for combinatorial optimization, for solving the cardinality-constrained Markowitz mean-variance portfolio model (a nonlinear mixed quadratic programming problem). To our knowledge, an efficient algorithmic solution for this problem has not been proposed until now. Using heuristic algorithms in this case is imperative. Numerical solutions are obtained for five analyses of weekly price data for the following indices for the period March 1992 to September 1997: Hang Seng 31 in Hong Kong, DAX 100 in Germany, FTSE 100 in UK, S&P 100 in USA and Nikkei 225 in Japan. The test results indicate that the ACO is much more robust and effective than particle swarm optimization (PSO), especially for low-risk investment portfolios.
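The flavour of the approach can be conveyed with a heavily simplified sketch: ants pick cardinality-K asset subsets with probability proportional to pheromone levels, the best subset reinforces the pheromone, and a mean-variance objective scores each subset. The data are synthetic, the weights within a subset are fixed equal, and the weight-optimization step of the actual algorithm is omitted; none of this reproduces the paper's implementation.

```python
# A toy ACO-style sketch for the cardinality-constrained mean-variance model
# (synthetic returns/covariance, equal weights within the chosen subset).
import numpy as np

rng = np.random.default_rng(4)
n_assets, K, n_ants, lam = 30, 5, 25, 0.7
mu = rng.normal(0.001, 0.002, n_assets)          # hypothetical expected returns
B = rng.normal(size=(n_assets, n_assets))
cov = 1e-4 * (B @ B.T / n_assets + np.eye(n_assets))

def objective(subset):                           # risk minus reward (lower is better)
    w = np.zeros(n_assets)
    w[list(subset)] = 1.0 / K
    return lam * (w @ cov @ w) - (1 - lam) * (mu @ w)

tau = np.ones(n_assets)                          # pheromone per asset
best, best_obj = None, np.inf
for it in range(100):
    for ant in range(n_ants):
        subset = rng.choice(n_assets, size=K, replace=False, p=tau / tau.sum())
        obj = objective(subset)
        if obj < best_obj:
            best, best_obj = subset, obj
    tau *= 0.9                                   # pheromone evaporation
    tau[best] += 1.0                             # reinforce the best subset so far
print("selected assets:", sorted(best), "objective:", best_obj)
```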
Optimization design of urban expressway ramp control
NASA Astrophysics Data System (ADS)
Xu, Hongke; Li, Peiqi; Zheng, Jinnan; Sun, Xiuzhen; Lin, Shan
2017-05-01
In this paper, various types of expressway systems are analyzed, and a variety of signal combinations are proposed to mitigate traffic congestion. These signal combinations are used to verify the effectiveness of the multi-signal combinatorial control strategy. The simulation software VISSIM was used to simulate the system. Based on the network model of 25 kinds of road length combinations and the simulation results, an optimization scheme suitable for the practical road model is summarized. The simulation results show that the controller can reduce the travel time by 25% under heavy traffic flow and improve the road capacity by about 20%.
Optimization of the computational load of a hypercube supercomputer onboard a mobile robot.
Barhen, J; Toomarian, N; Protopopescu, V
1987-12-01
A combinatorial optimization methodology is developed, which enables the efficient use of hypercube multiprocessors onboard mobile intelligent robots dedicated to time-critical missions. The methodology is implemented in terms of large-scale concurrent algorithms based either on fast simulated annealing, or on nonlinear asynchronous neural networks. In particular, analytic expressions are given for the effect of single-neuron perturbations on the systems' configuration energy. Compact neuromorphic data structures are used to model effects such as precedence constraints, processor idling times, and task-schedule overlaps. Results for a typical robot-dynamics benchmark are presented.
Nash Social Welfare in Multiagent Resource Allocation
NASA Astrophysics Data System (ADS)
Ramezani, Sara; Endriss, Ulle
We study different aspects of the multiagent resource allocation problem when the objective is to find an allocation that maximizes Nash social welfare, the product of the utilities of the individual agents. The Nash solution is an important welfare criterion that combines efficiency and fairness considerations. We show that the problem of finding an optimal outcome is NP-hard for a number of different languages for representing agent preferences; we establish new results regarding convergence to Nash-optimal outcomes in a distributed negotiation framework; and we design and test algorithms similar to those applied in combinatorial auctions for computing such an outcome directly.
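For a tiny instance, the Nash social welfare objective can be computed by brute force, which makes the criterion itself concrete even though the paper's contribution is hardness results and negotiation/auction-style algorithms rather than enumeration. The utilities below are made-up additive valuations.

```python
# A brute-force sketch of the objective only: maximize the product of agent
# utilities over all assignments of items to agents (tiny, hypothetical instance).
import itertools
import numpy as np

rng = np.random.default_rng(5)
n_agents, n_items = 3, 6
u = rng.integers(1, 10, size=(n_agents, n_items))   # additive utilities (hypothetical)

best_alloc, best_nsw = None, -1.0
for alloc in itertools.product(range(n_agents), repeat=n_items):
    utilities = [u[a, [i for i, owner in enumerate(alloc) if owner == a]].sum()
                 for a in range(n_agents)]
    nsw = float(np.prod(utilities))                 # Nash social welfare = product
    if nsw > best_nsw:
        best_alloc, best_nsw = alloc, nsw
print("item -> agent:", best_alloc, "NSW:", best_nsw)
```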
Optimization of the computational load of a hypercube supercomputer onboard a mobile robot
NASA Technical Reports Server (NTRS)
Barhen, Jacob; Toomarian, N.; Protopopescu, V.
1987-01-01
A combinatorial optimization methodology is developed, which enables the efficient use of hypercube multiprocessors onboard mobile intelligent robots dedicated to time-critical missions. The methodology is implemented in terms of large-scale concurrent algorithms based either on fast simulated annealing, or on nonlinear asynchronous neural networks. In particular, analytic expressions are given for the effect of single-neuron perturbations on the systems' configuration energy. Compact neuromorphic data structures are used to model effects such as precedence constraints, processor idling times, and task-schedule overlaps. Results for a typical robot-dynamics benchmark are presented.
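The "analytic expressions for single-neuron perturbations" mentioned above have a standard form for symmetric +/-1 networks: flipping one unit changes the configuration energy by 2 s_i ((Ws)_i + b_i). The sketch below checks this identity on random weights; it is generic Hopfield-style bookkeeping, not the paper's robot-scheduling energy model.

```python
# A minimal sketch: energy change from flipping one +/-1 neuron in a symmetric
# network, verified against direct recomputation (toy weights, not the paper's).
import numpy as np

rng = np.random.default_rng(6)
n = 12
W = rng.normal(size=(n, n))
W = (W + W.T) / 2
np.fill_diagonal(W, 0.0)
b = rng.normal(size=n)
s = rng.choice([-1.0, 1.0], n)

def energy(state):
    return -0.5 * state @ W @ state - b @ state

dE_analytic = 2.0 * s * (W @ s + b)        # cost of flipping each neuron i
for i in range(n):
    flipped = s.copy()
    flipped[i] *= -1
    assert np.isclose(energy(flipped) - energy(s), dE_analytic[i])
print("analytic single-flip energy changes:", np.round(dE_analytic, 3))
```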
[Hyperspectral remote sensing image classification based on SVM optimized by clonal selection].
Liu, Qing-Jie; Jing, Lin-Hai; Wang, Meng-Fei; Lin, Qi-Zhong
2013-03-01
Model selection for the support vector machine (SVM), which involves selecting the kernel and margin parameter values, is usually time-consuming and strongly affects both the training efficiency of the SVM model and the final classification accuracy of an SVM hyperspectral remote sensing image classifier. First, based on combinatorial optimization theory and the cross-validation method, an artificial immune clonal selection algorithm is introduced for the optimal selection of the SVM kernel parameter a and margin parameter C (CSSVM) to improve the training efficiency of the SVM model. An experiment classifying AVIRIS data from the Indian Pines site (USA) was then performed to test the novel CSSVM against a traditional SVM classifier tuned with the general grid-search cross-validation method (GSSVM). Evaluation indexes, including SVM model training time, overall classification accuracy (OA) and the Kappa index, were analyzed quantitatively for both CSSVM and GSSVM. The OA of CSSVM on the test samples and on the whole image is 85.1% and 81.58%, respectively, both within 0.08% of the GSSVM results; the Kappa indexes reach 0.8213 and 0.7728, both within 0.001 of GSSVM; and the model training time of CSSVM is between 1/6 and 1/10 of that of GSSVM. Therefore, CSSVM is a fast and accurate algorithm for hyperspectral image classification and is superior to GSSVM.
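A simplified clonal-selection loop for SVM hyperparameter tuning looks like the sketch below: antibodies are (log C, log gamma) pairs, the fittest are cloned with mutation inversely related to rank, and cross-validated accuracy is the affinity. The dataset, parameter ranges, population sizes and mutation scheme are illustrative assumptions, not the paper's settings.

```python
# A simplified clonal-selection sketch for tuning an RBF-SVM's (C, gamma) by
# cross-validation on a synthetic dataset.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(7)
X, y = make_classification(n_samples=300, n_features=20, random_state=0)

def fitness(ind):                       # ind = (log10 C, log10 gamma)
    clf = SVC(C=10 ** ind[0], gamma=10 ** ind[1])
    return cross_val_score(clf, X, y, cv=3).mean()

pop = rng.uniform([-2, -4], [3, 1], size=(8, 2))     # antibodies in log-space
for gen in range(8):
    fits = np.array([fitness(p) for p in pop])
    order = np.argsort(-fits)
    pop = pop[order]
    clones = []
    for rank, parent in enumerate(pop[:4]):          # clone the best antibodies
        sigma = 0.1 + 0.4 * rank                     # better ones mutate less
        clones += [parent + rng.normal(0, sigma, 2) for _ in range(4 - rank)]
    cand = np.vstack([pop[:4], clones, rng.uniform([-2, -4], [3, 1], (2, 2))])
    cand_fits = np.array([fitness(c) for c in cand])
    pop = cand[np.argsort(-cand_fits)][:8]
print("best (log10 C, log10 gamma):", pop[0], "CV accuracy:", round(float(fitness(pop[0])), 3))
```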
An evolutionary strategy based on partial imitation for solving optimization problems
NASA Astrophysics Data System (ADS)
Javarone, Marco Alberto
2016-12-01
In this work we introduce an evolutionary strategy to solve combinatorial optimization tasks, i.e. problems characterized by a discrete search space. In particular, we focus on the Traveling Salesman Problem (TSP), a well-known NP-hard problem whose search space grows exponentially with the number of cities. Solutions of the TSP can be encoded as arrays of cities and evaluated by a fitness, computed according to a cost function (e.g. the length of a path). Our method is based on the evolution of an agent population by means of an imitative mechanism that we call 'partial imitation'. In particular, agents receive a random solution and then, interacting among themselves, may imitate the solutions of agents with a higher fitness. Since the imitation mechanism is only partial, agents copy only one entry (randomly chosen) of another array (i.e. solution). In doing so, the population converges towards a shared solution, behaving like a spin system undergoing a cooling process, i.e. driven towards an ordered phase. We highlight that the adopted 'partial imitation' mechanism allows the population to generate solutions over time, before reaching the final equilibrium. Results of numerical simulations show that our method is able to find, in a finite time, both optimal and suboptimal solutions, depending on the size of the considered search space.
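A compact sketch of the partial-imitation dynamic on the TSP follows. To keep each tour a valid permutation, "copying one entry" is implemented here as a swap that places the copied city at the copied position, which is a repair choice of this sketch and not necessarily the paper's exact update; the instance is random.

```python
# A compact sketch of 'partial imitation' for the TSP on random city coordinates.
import numpy as np

rng = np.random.default_rng(8)
n_cities, n_agents = 25, 60
xy = rng.random((n_cities, 2))

def length(tour):
    return float(np.linalg.norm(xy[tour] - xy[np.roll(tour, 1)], axis=1).sum())

tours = [rng.permutation(n_cities) for _ in range(n_agents)]
for step in range(60000):
    a, b = rng.integers(n_agents, size=2)      # pick two interacting agents
    if length(tours[a]) > length(tours[b]):
        a, b = b, a                            # a is now the fitter agent
    pos = rng.integers(n_cities)
    city = tours[a][pos]
    t = tours[b]
    j = int(np.where(t == city)[0][0])
    t[pos], t[j] = t[j], t[pos]                # partial imitation: copy one entry
print("best tour length:", min(length(t) for t in tours))
```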
Merrick, C A; Wardrope, C; Paget, J E; Colloms, S D; Rosser, S J
2016-01-01
Metabolic pathway engineering in microbial hosts for heterologous biosynthesis of commodity compounds and fine chemicals offers a cheaper, greener, and more reliable method of production than does chemical synthesis. However, engineering metabolic pathways within a microbe is a complicated process: levels of gene expression, protein stability, enzyme activity, and metabolic flux must be balanced for high productivity without compromising host cell viability. A major rate-limiting step in engineering microbes for optimum biosynthesis of a target compound is DNA assembly, as current methods can be cumbersome and costly. Serine integrase recombinational assembly (SIRA) is a rapid DNA assembly method that utilizes serine integrases, and is particularly applicable to rapid optimization of engineered metabolic pathways. Using six pairs of orthogonal attP and attB sites with different central dinucleotide sequences that follow SIRA design principles, we have demonstrated that ΦC31 integrase can be used to (1) insert a single piece of DNA into a substrate plasmid; (2) assemble three, four, and five DNA parts encoding the enzymes for functional metabolic pathways in a one-pot reaction; (3) generate combinatorial libraries of metabolic pathway constructs with varied ribosome binding site strengths or gene orders in a one-pot reaction; and (4) replace and add DNA parts within a construct through targeted postassembly modification. We explain the mechanism of SIRA and the principles behind designing a SIRA reaction. We also provide protocols for making SIRA reaction components and practical methods for applying SIRA to rapid optimization of metabolic pathways. © 2016 Elsevier Inc. All rights reserved.
Cognitive foundations for model-based sensor fusion
NASA Astrophysics Data System (ADS)
Perlovsky, Leonid I.; Weijers, Bertus; Mutz, Chris W.
2003-08-01
Target detection, tracking, and sensor fusion are complicated problems, which usually are performed sequentially. First detecting targets, then tracking, then fusing multiple sensors reduces computations. This procedure however is inapplicable to difficult targets which cannot be reliably detected using individual sensors, on individual scans or frames. In such more complicated cases one has to perform functions of fusing, tracking, and detecting concurrently. This often has led to prohibitive combinatorial complexity and, as a consequence, to sub-optimal performance as compared to the information-theoretic content of all the available data. It is well appreciated that in this task the human mind is by far superior qualitatively to existing mathematical methods of sensor fusion; however, the human mind is limited in the amount of information and speed of computation it can cope with. Therefore, research efforts have been devoted toward incorporating "biological lessons" into smart algorithms, yet success has been limited. Why is this so, and how can existing limitations be overcome? The fundamental reasons for current limitations are analyzed and a potentially breakthrough research and development effort is outlined. We utilize the way our mind combines emotions and concepts in the thinking process and present the mathematical approach to accomplishing this in current-technology computers. The presentation will summarize the difficulties encountered by intelligent systems over the last 50 years related to combinatorial complexity, analyze the fundamental limitations of existing algorithms and neural networks, and relate them to the type of logic underlying the computational structure: formal, multivalued, and fuzzy logic. A new concept of dynamic logic will be introduced along with algorithms capable of pulling together all the available information from multiple sources. This new mathematical technique, like our brain, combines conceptual understanding with emotional evaluation and overcomes the combinatorial complexity of concurrent fusion, tracking, and detection. The presentation will discuss examples of performance, where computational speedups of many orders of magnitude were attained leading to performance improvements of up to 10 dB (and better).
Branch-pipe-routing approach for ships using improved genetic algorithm
NASA Astrophysics Data System (ADS)
Sui, Haiteng; Niu, Wentie
2016-09-01
Branch-pipe routing plays fundamental and critical roles in ship-pipe design. The branch-pipe-routing problem is a complex combinatorial optimization problem and is thus difficult to solve when depending only on human experts. A modified genetic-algorithm-based approach is proposed in this paper to solve this problem. The simplified layout space is first divided into three-dimensional (3D) grids to build its mathematical model. Branch pipes in layout space are regarded as a combination of several two-point pipes, and the pipe route between two connection points is generated using an improved maze algorithm. The coding of branch pipes is then defined, and the genetic operators are devised, especially the complete crossover strategy that greatly accelerates the convergence speed. Finally, simulation tests demonstrate the performance of the proposed method.
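The basic two-point routing step inside such an approach is a maze search on the 3D grid. The sketch below uses plain breadth-first search with random obstacles; the paper's algorithm uses an improved maze routine embedded in a GA, so this only illustrates the underlying idea.

```python
# A minimal sketch of two-point 'maze' routing: BFS on a 3-D grid with obstacles.
from collections import deque
import numpy as np

rng = np.random.default_rng(9)
shape = (12, 12, 12)
blocked = rng.random(shape) < 0.2            # hypothetical obstacle cells
start, goal = (0, 0, 0), (11, 11, 11)
blocked[start] = blocked[goal] = False

def route(start, goal):
    prev = {start: None}
    q = deque([start])
    while q:
        cur = q.popleft()
        if cur == goal:                      # reconstruct the path by backtracking
            path = []
            while cur is not None:
                path.append(cur)
                cur = prev[cur]
            return path[::-1]
        for dx, dy, dz in ((1,0,0), (-1,0,0), (0,1,0), (0,-1,0), (0,0,1), (0,0,-1)):
            nxt = (cur[0] + dx, cur[1] + dy, cur[2] + dz)
            if all(0 <= nxt[k] < shape[k] for k in range(3)) \
               and not blocked[nxt] and nxt not in prev:
                prev[nxt] = cur
                q.append(nxt)
    return None                              # no feasible route

path = route(start, goal)
print("route length:", None if path is None else len(path) - 1)
```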
Perspective. Extremely fine tuning of doping enabled by combinatorial molecular-beam epitaxy
Wu, J.; Bozovic, I.
2015-04-06
Chemical doping provides an effective method to control the electric properties of complex oxides. However, the state-of-the-art accuracy in controlling doping is limited to about 1%. This hampers elucidation of the precise doping dependences of physical properties and phenomena of interest, such as quantum phase transitions. Using combinatorial molecular beam epitaxy, we improve the accuracy in tuning the doping level by two orders of magnitude. We illustrate this novel method by two examples: a systematic investigation of the doping dependence of interface superconductivity, and a study of the competing ground states in the vicinity of the insulator-to-superconductor transition.
MGA trajectory planning with an ACO-inspired algorithm
NASA Astrophysics Data System (ADS)
Ceriotti, Matteo; Vasile, Massimiliano
2010-11-01
Given a set of celestial bodies, the problem of finding an optimal sequence of swing-bys, deep space manoeuvres (DSM) and transfer arcs connecting the elements of the set is combinatorial in nature. The number of possible paths grows exponentially with the number of celestial bodies. Therefore, the design of an optimal multiple gravity assist (MGA) trajectory is an NP-hard mixed combinatorial-continuous problem. Its automated solution would greatly improve the design of future space missions, allowing the assessment of a large number of alternative mission options in a short time. This work proposes to formulate the complete automated design of a multiple gravity assist trajectory as an autonomous planning and scheduling problem. The resulting scheduled plan will provide the optimal planetary sequence and a good estimation of the set of associated optimal trajectories. The trajectory model consists of a sequence of celestial bodies connected by two-dimensional transfer arcs containing one DSM. For each transfer arc, the position of the planet and the spacecraft, at the time of arrival, are matched by varying the pericentre of the preceding swing-by, or the magnitude of the launch excess velocity, for the first arc. For each departure date, this model generates a full tree of possible transfers from the departure to the destination planet. Each leaf of the tree represents a planetary encounter and a possible way to reach that planet. An algorithm inspired by ant colony optimization (ACO) is devised to explore the space of possible plans. The ants explore the tree from departure to destination adding one node at a time: every time an ant is at a node, a probability function is used to select a feasible direction. This approach to automatic trajectory planning is applied to the design of optimal transfers to Saturn and among the Galilean moons of Jupiter. Solutions are compared to those found through more traditional genetic-algorithm techniques.
Applications of Derandomization Theory in Coding
NASA Astrophysics Data System (ADS)
Cheraghchi, Mahdi
2011-07-01
Randomized techniques play a fundamental role in theoretical computer science and discrete mathematics, in particular for the design of efficient algorithms and construction of combinatorial objects. The basic goal in derandomization theory is to eliminate or reduce the need for randomness in such randomized constructions. In this thesis, we explore some applications of the fundamental notions in derandomization theory to problems outside the core of theoretical computer science, and in particular, certain problems related to coding theory. First, we consider the wiretap channel problem, which involves a communication system in which an intruder can eavesdrop on a limited portion of the transmissions, and construct efficient and information-theoretically optimal communication protocols for this model. Then we consider the combinatorial group testing problem. In this classical problem, one aims to determine a set of defective items within a large population by asking a number of queries, where each query reveals whether a defective item is present within a specified group of items. We use randomness condensers to explicitly construct optimal, or nearly optimal, group testing schemes for a setting where the query outcomes can be highly unreliable, as well as the threshold model where a query returns positive if the number of defectives passes a certain threshold. Finally, we design ensembles of error-correcting codes that achieve the information-theoretic capacity of a large class of communication channels, and then use the obtained ensembles for construction of explicit capacity-achieving codes. [This is a shortened version of the actual abstract in the thesis.]
Ezra, Elishai; Maor, Idan; Bavli, Danny; Shalom, Itai; Levy, Gahl; Prill, Sebastian; Jaeger, Magnus S; Nahmias, Yaakov
2015-08-01
Microfluidic applications range from combinatorial synthesis to high throughput screening, with platforms integrating analog perfusion components, digitally controlled micro-valves and a range of sensors that demand a variety of communication protocols. Currently, discrete control units are used to regulate and monitor each component, resulting in scattered control interfaces that limit data integration and synchronization. Here, we present a microprocessor-based control unit, utilizing the MS Gadgeteer open framework that integrates all aspects of microfluidics through a high-current electronic circuit that supports and synchronizes digital and analog signals for perfusion components, pressure elements, and arbitrary sensor communication protocols using a plug-and-play interface. The control unit supports an integrated touch screen and TCP/IP interface that provides local and remote control of flow and data acquisition. To establish the ability of our control unit to integrate and synchronize complex microfluidic circuits we developed an equi-pressure combinatorial mixer. We demonstrate the generation of complex perfusion sequences, allowing the automated sampling, washing, and calibrating of an electrochemical lactate sensor continuously monitoring hepatocyte viability following exposure to the pesticide rotenone. Importantly, integration of an optical sensor allowed us to implement automated optimization protocols that require different computational challenges including: prioritized data structures in a genetic algorithm, distributed computational efforts in multiple-hill climbing searches and real-time realization of probabilistic models in simulated annealing. Our system offers a comprehensive solution for establishing optimization protocols and perfusion sequences in complex microfluidic circuits.
MASM: a market architecture for sensor management in distributed sensor networks
NASA Astrophysics Data System (ADS)
Viswanath, Avasarala; Mullen, Tracy; Hall, David; Garga, Amulya
2005-03-01
Rapid developments in sensor technology and its applications have energized research efforts towards devising a firm theoretical foundation for sensor management. Ubiquitous sensing, wide bandwidth communications and distributed processing provide both opportunities and challenges for sensor and process control and optimization. Traditional optimization techniques do not have the ability to simultaneously consider the wildly non-commensurate measures involved in sensor management in a single optimization routine. Market-oriented programming provides a valuable and principled paradigm to designing systems to solve this dynamic and distributed resource allocation problem. We have modeled the sensor management scenario as a competitive market, wherein the sensor manager holds a combinatorial auction to sell the various items produced by the sensors and the communication channels. However, standard auction mechanisms have been found not to be directly applicable to the sensor management domain. For this purpose, we have developed a specialized market architecture MASM (Market architecture for Sensor Management). In MASM, the mission manager is responsible for deciding task allocations to the consumers and their corresponding budgets and the sensor manager is responsible for resource allocation to the various consumers. In addition to having a modified combinatorial winner determination algorithm, MASM has specialized sensor network modules that address commensurability issues between consumers and producers in the sensor network domain. A preliminary multi-sensor, multi-target simulation environment has been implemented to test the performance of the proposed system. MASM outperformed the information theoretic sensor manager in meeting the mission objectives in the simulation experiments.
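A core subroutine in a market architecture like the one described above is winner determination over bundle bids. The toy sketch below uses the common greedy approximation (rank bids by price per requested resource, then allocate conflict-free bundles); the bid data are invented, and MASM's actual winner-determination algorithm is further modified for the sensor domain.

```python
# A small sketch of greedy winner determination for bundle bids (hypothetical bids).
bids = [                          # (bidder, bundle of resources, offered price)
    ("track-1", {"radar_dwell", "comm_slot"}, 9.0),
    ("track-2", {"radar_dwell"}, 6.0),
    ("id-1",    {"eo_image", "comm_slot"}, 7.5),
    ("search",  {"eo_image"}, 3.0),
]

def greedy_winners(bids):
    # rank by price per unit of resource requested, then allocate conflict-free
    ranked = sorted(bids, key=lambda b: b[2] / len(b[1]), reverse=True)
    allocated, winners, revenue = set(), [], 0.0
    for bidder, bundle, price in ranked:
        if not bundle & allocated:
            winners.append(bidder)
            allocated |= bundle
            revenue += price
    return winners, revenue

print(greedy_winners(bids))       # e.g. (['track-2', 'id-1'], 13.5)
```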
Optimization of personalized therapies for anticancer treatment.
Vazquez, Alexei
2013-04-12
As of today, there are hundreds of targeted therapies for the treatment of cancer, many of which have companion biomarkers that are in use to inform treatment decisions. If we were to consider this whole arsenal of targeted therapies as a treatment option for every patient, we would very soon reach a scenario where each patient is positive for several markers suggesting their treatment with several targeted therapies. Given the documented side effects of anticancer drugs, it is clear that such a strategy is unfeasible. Here, we propose a strategy that optimizes the design of combinatorial therapies to achieve the best response rates with the minimal toxicity. In this methodology markers are assigned to drugs such that we achieve a high overall response rate while using personalized combinations of minimal size. We tested this methodology in an in silico cancer patient cohort, constructed from in vitro data for 714 cell lines and 138 drugs reported by the Sanger Institute. Our analysis indicates that, even in the context of personalized medicine, combinations of three or more drugs are required to achieve high response rates. Furthermore, patient-to-patient variations in pharmacokinetics have a significant impact on the overall response rate. A 10-fold increase in pharmacokinetic variation resulted in a significant drop in the overall response rate. The design of optimal combinatorial therapy for anticancer treatment requires a transition from the one-drug/one-biomarker approach to global strategies that simultaneously assign markers to a catalog of drugs. The methodology reported here provides a framework to achieve this transition.
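The trade-off described (high overall response rate versus small personalized combinations) can be illustrated with a toy greedy design: pick a small drug catalog that covers as many simulated patients as possible, and give each patient the subset of selected drugs predicted to work for them. The data are randomly generated and the greedy rule is a stand-in, not the paper's optimization method.

```python
# A toy greedy sketch of catalog design on simulated marker-predicted responses.
import numpy as np

rng = np.random.default_rng(10)
n_patients, n_drugs = 500, 40
responds = rng.random((n_patients, n_drugs)) < 0.08   # marker-predicted response

selected, covered = [], np.zeros(n_patients, bool)
while covered.mean() < 0.9 and len(selected) < n_drugs:
    gains = [(~covered & responds[:, d]).sum() for d in range(n_drugs)]
    d = int(np.argmax(gains))
    if gains[d] == 0:
        break                                          # no drug adds new responders
    selected.append(d)
    covered |= responds[:, d]

combo_sizes = responds[:, selected].sum(axis=1)
print(f"catalog size {len(selected)}, response rate {covered.mean():.2f}, "
      f"median personalized combination size {np.median(combo_sizes):.0f}")
```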
2014-01-01
All-oxide-based photovoltaics (PVs) encompass the potential for extremely low cost solar cells, provided they can obtain an order of magnitude improvement in their power conversion efficiencies. To achieve this goal, we perform a combinatorial materials study of metal oxide based light absorbers, charge transporters, junctions between them, and PV devices. Here we report the development of a combinatorial internal quantum efficiency (IQE) method. IQE measures the efficiency associated with the charge separation and collection processes, and thus is a proxy for PV activity of materials once placed into devices, discarding optical properties that cause uncontrolled light harvesting. The IQE is supported by high-throughput techniques for bandgap fitting, composition analysis, and thickness mapping, which are also crucial parameters for the combinatorial investigation cycle of photovoltaics. As a model system we use a library of 169 solar cells with a varying thickness of sprayed titanium dioxide (TiO2) as the window layer, and covarying thickness and composition of binary compounds of copper oxides (Cu–O) as the light absorber, fabricated by Pulsed Laser Deposition (PLD). The analysis on the combinatorial devices shows the correlation between compositions and bandgap, and their effect on PV activity within several device configurations. The analysis suggests that the presence of Cu4O3 plays a significant role in the PV activity of binary Cu–O compounds. PMID:24410367
Scott-Phillips, Thomas C; Blythe, Richard A
2013-11-06
In a combinatorial communication system, some signals consist of the combinations of other signals. Such systems are more efficient than equivalent, non-combinatorial systems, yet despite this they are rare in nature. Why? Previous explanations have focused on the adaptive limits of combinatorial communication, or on its purported cognitive difficulties, but neither of these explains the full distribution of combinatorial communication in the natural world. Here, we present a nonlinear dynamical model of the emergence of combinatorial communication that, unlike previous models, considers how initially non-communicative behaviour evolves to take on a communicative function. We derive three basic principles about the emergence of combinatorial communication. We hence show that the interdependence of signals and responses places significant constraints on the historical pathways by which combinatorial signals might emerge, to the extent that anything other than the most simple form of combinatorial communication is extremely unlikely. We also argue that these constraints can be bypassed if individuals have the socio-cognitive capacity to engage in ostensive communication. Humans, but probably no other species, have this ability. This may explain why language, which is massively combinatorial, is such an extreme exception to nature's general trend for non-combinatorial communication.
Materials Discovery | Materials Science | NREL
[Web page fragment.] Describes NREL materials discovery work on measurement methods and specialized analysis algorithms, and basic research projects applying high-throughput combinatorial research methods.
Cankorur-Cetinkaya, Ayca; Dias, Joao M. L.; Kludas, Jana; Slater, Nigel K. H.; Rousu, Juho; Dikicioglu, Duygu
2017-01-01
Multiple interacting factors affect the performance of engineered biological systems in synthetic biology projects. The complexity of these biological systems means that experimental design should often be treated as a multiparametric optimization problem. However, the available methodologies are either impractical, due to a combinatorial explosion in the number of experiments to be performed, or are inaccessible to most experimentalists due to the lack of publicly available, user-friendly software. Although evolutionary algorithms may be employed as alternative approaches to optimize experimental design, the lack of simple-to-use software again restricts their use to specialist practitioners. In addition, the lack of subsidiary approaches to further investigate critical factors and their interactions prevents the full analysis and exploitation of the biotechnological system. We have addressed these problems and, here, provide a simple‐to‐use and freely available graphical user interface to empower a broad range of experimental biologists to employ complex evolutionary algorithms to optimize their experimental designs. Our approach exploits a Genetic Algorithm to discover the subspace containing the optimal combination of parameters, and Symbolic Regression to construct a model to evaluate the sensitivity of the experiment to each parameter under investigation. We demonstrate the utility of this method using an example in which the culture conditions for the microbial production of a bioactive human protein are optimized. CamOptimus is available through: (https://doi.org/10.17863/CAM.10257). PMID:28635591
The hypergraph regularity method and its applications
Rödl, V.; Nagle, B.; Skokan, J.; Schacht, M.; Kohayakawa, Y.
2005-01-01
Szemerédi's regularity lemma asserts that every graph can be decomposed into relatively few random-like subgraphs. This random-like behavior enables one to find and enumerate subgraphs of a given isomorphism type, yielding the so-called counting lemma for graphs. The combined application of these two lemmas is known as the regularity method for graphs and has proved useful in graph theory, combinatorial geometry, combinatorial number theory, and theoretical computer science. Here, we report on recent advances in the regularity method for k-uniform hypergraphs, for arbitrary k ≥ 2. This method, purely combinatorial in nature, gives alternative proofs of density theorems originally due to E. Szemerédi, H. Furstenberg, and Y. Katznelson. Further results in extremal combinatorics also have been obtained with this approach. The two main components of the regularity method for k-uniform hypergraphs, the regularity lemma and the counting lemma, have been obtained recently: Rödl and Skokan (based on earlier work of Frankl and Rödl) generalized Szemerédi's regularity lemma to k-uniform hypergraphs, and Nagle, Rödl, and Schacht succeeded in proving a counting lemma accompanying the Rödl–Skokan hypergraph regularity lemma. The counting lemma is proved by reducing the counting problem to a simpler one previously investigated by Kohayakawa, Rödl, and Skokan. Similar results were obtained independently by W. T. Gowers, following a different approach. PMID:15919821
A Combinatorial Approach to Detecting Gene-Gene and Gene-Environment Interactions in Family Studies
Lou, Xiang-Yang; Chen, Guo-Bo; Yan, Lei; Ma, Jennie Z.; Mangold, Jamie E.; Zhu, Jun; Elston, Robert C.; Li, Ming D.
2008-01-01
Widespread multifactor interactions present a significant challenge in determining risk factors of complex diseases. Several combinatorial approaches, such as the multifactor dimensionality reduction (MDR) method, have emerged as a promising tool for better detecting gene-gene (G × G) and gene-environment (G × E) interactions. We recently developed a general combinatorial approach, namely the generalized multifactor dimensionality reduction (GMDR) method, which can entertain both qualitative and quantitative phenotypes and allows for both discrete and continuous covariates to detect G × G and G × E interactions in a sample of unrelated individuals. In this article, we report the development of an algorithm that can be used to study G × G and G × E interactions for family-based designs, called pedigree-based GMDR (PGMDR). Compared to the available method, our proposed method has several major improvements, including allowing for covariate adjustments and being applicable to arbitrary phenotypes, arbitrary pedigree structures, and arbitrary patterns of missing marker genotypes. Our Monte Carlo simulations provide evidence that the PGMDR method is superior in performance to identify epistatic loci compared to the MDR-pedigree disequilibrium test (PDT). Finally, we applied our proposed approach to a genetic data set on tobacco dependence and found a significant interaction between two taste receptor genes (i.e., TAS2R16 and TAS2R38) in affecting nicotine dependence. PMID:18834969
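The pooling step that MDR-style methods (including GMDR/PGMDR) build on can be shown in a few lines: each multilocus genotype cell is labelled high- or low-risk by its case:control ratio, collapsing the genotype table into one binary factor. The data below are synthetic with a planted two-locus interaction; the published methods add covariate adjustment, family structure and cross-validation, which are omitted here.

```python
# A stripped-down illustration of the MDR pooling rule on a synthetic case-control sample.
import numpy as np

rng = np.random.default_rng(11)
n = 2000
g1, g2 = rng.integers(0, 3, n), rng.integers(0, 3, n)       # genotypes 0/1/2
risk = 0.15 + 0.35 * ((g1 == 2) & (g2 == 2))                # planted interaction
case = rng.random(n) < risk

overall_ratio = case.sum() / (~case).sum()
high_risk = np.zeros((3, 3), bool)
for a in range(3):
    for b in range(3):
        cell = (g1 == a) & (g2 == b)
        cases, controls = (case & cell).sum(), (~case & cell).sum()
        high_risk[a, b] = cases / max(controls, 1) > overall_ratio   # MDR pooling rule

pred = high_risk[g1, g2]
balanced_acc = 0.5 * ((pred & case).sum() / case.sum()
                      + (~pred & ~case).sum() / (~case).sum())
print("high-risk cells:\n", high_risk, "\nbalanced accuracy:", round(float(balanced_acc), 3))
```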
Analytical validation of a psychiatric pharmacogenomic test.
Jablonski, Michael R; King, Nina; Wang, Yongbao; Winner, Joel G; Watterson, Lucas R; Gunselman, Sandra; Dechairo, Bryan M
2018-05-01
The aim of this study was to validate the analytical performance of a combinatorial pharmacogenomics test designed to aid in the appropriate medication selection for neuropsychiatric conditions. Genomic DNA was isolated from buccal swabs. Twelve genes (65 variants/alleles) associated with psychotropic medication metabolism, side effects, and mechanisms of actions were evaluated by bead array, MALDI-TOF mass spectrometry, and/or capillary electrophoresis methods (GeneSight Psychotropic, Assurex Health, Inc.). The combinatorial pharmacogenomics test has a dynamic range of 2.5-20 ng/μl of input genomic DNA, with comparable performance for all assays included in the test. Both the precision and accuracy of the test were >99.9%, with individual gene components between 99.4 and 100%. This study demonstrates that the combinatorial pharmacogenomics test is robust and reproducible, making it suitable for clinical use.
NASA Astrophysics Data System (ADS)
Orito, Yukiko; Yamamoto, Hisashi; Tsujimura, Yasuhiro; Kambayashi, Yasushi
The portfolio optimizations are to determine the proportion-weighted combination in the portfolio in order to achieve investment targets. This optimization is one of the multi-dimensional combinatorial optimizations and it is difficult for the portfolio constructed in the past period to keep its performance in the future period. In order to keep the good performances of portfolios, we propose the extended information ratio as an objective function, using the information ratio, beta, prime beta, or correlation coefficient in this paper. We apply the simulated annealing (SA) to optimize the portfolio employing the proposed ratio. For the SA, we make the neighbor by the operation that changes the structure of the weights in the portfolio. In the numerical experiments, we show that our portfolios keep the good performances when the market trend of the future period becomes different from that of the past period.
Combinatorial approaches to gene recognition.
Roytberg, M A; Astakhova, T V; Gelfand, M S
1997-01-01
Recognition of genes via exon assembly approaches leads naturally to the use of dynamic programming. We consider the general graph-theoretical formulation of the exon assembly problem and analyze in detail some specific variants: multicriterial optimization in the case of non-linear gene-scoring functions; context-dependent schemes for scoring exons and related procedures for exon filtering; and highly specific recognition of arbitrary gene segments, oligonucleotide probes and polymerase chain reaction (PCR) primers.
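The dynamic-programming core of exon assembly is chaining: choose a maximum-score set of non-overlapping candidate exons along the sequence. The sketch below uses invented coordinates and scores and ignores splice-site compatibility, frame and the richer scoring schemes the paper discusses; it only shows the DP recurrence and traceback.

```python
# A bare-bones exon-chaining sketch: maximum-score chain of non-overlapping exons.
candidate_exons = [            # (start, end, score) -- hypothetical values
    (10, 50, 4.0), (40, 90, 3.5), (60, 120, 5.0),
    (130, 180, 2.5), (150, 210, 4.5), (220, 260, 3.0),
]

exons = sorted(candidate_exons, key=lambda e: e[1])      # sort by end position
best = [0.0] * len(exons)
prev = [None] * len(exons)
for i, (s_i, e_i, sc_i) in enumerate(exons):
    best[i] = sc_i                                       # chain containing only exon i
    for j in range(i):
        if exons[j][1] < s_i and best[j] + sc_i > best[i]:
            best[i], prev[i] = best[j] + sc_i, j         # extend the best compatible chain

i = max(range(len(exons)), key=lambda k: best[k])        # trace back the optimal chain
chain = []
while i is not None:
    chain.append(exons[i])
    i = prev[i]
print("optimal chain:", chain[::-1], "score:", max(best))
```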
Combinatorial development of antibacterial Zr-Cu-Al-Ag thin film metallic glasses.
Liu, Yanhui; Padmanabhan, Jagannath; Cheung, Bettina; Liu, Jingbei; Chen, Zheng; Scanley, B Ellen; Wesolowski, Donna; Pressley, Mariyah; Broadbridge, Christine C; Altman, Sidney; Schwarz, Udo D; Kyriakides, Themis R; Schroers, Jan
2016-05-27
Metallic alloys are normally composed of multiple constituent elements in order to achieve integration of a plurality of properties required in technological applications. However, conventional alloy development paradigm, by sequential trial-and-error approach, requires completely unrelated strategies to optimize compositions out of a vast phase space, making alloy development time consuming and labor intensive. Here, we challenge the conventional paradigm by proposing a combinatorial strategy that enables parallel screening of a multitude of alloys. Utilizing a typical metallic glass forming alloy system Zr-Cu-Al-Ag as an example, we demonstrate how glass formation and antibacterial activity, two unrelated properties, can be simultaneously characterized and the optimal composition can be efficiently identified. We found that in the Zr-Cu-Al-Ag alloy system fully glassy phase can be obtained in a wide compositional range by co-sputtering, and antibacterial activity is strongly dependent on alloy compositions. Our results indicate that antibacterial activity is sensitive to Cu and Ag while essentially remains unchanged within a wide range of Zr and Al. The proposed strategy not only facilitates development of high-performing alloys, but also provides a tool to unveil the composition dependence of properties in a highly parallel fashion, which helps the development of new materials by design.
Combinatorial development of antibacterial Zr-Cu-Al-Ag thin film metallic glasses
NASA Astrophysics Data System (ADS)
Liu, Yanhui; Padmanabhan, Jagannath; Cheung, Bettina; Liu, Jingbei; Chen, Zheng; Scanley, B. Ellen; Wesolowski, Donna; Pressley, Mariyah; Broadbridge, Christine C.; Altman, Sidney; Schwarz, Udo D.; Kyriakides, Themis R.; Schroers, Jan
2016-05-01
Metallic alloys are normally composed of multiple constituent elements in order to achieve integration of a plurality of properties required in technological applications. However, conventional alloy development paradigm, by sequential trial-and-error approach, requires completely unrelated strategies to optimize compositions out of a vast phase space, making alloy development time consuming and labor intensive. Here, we challenge the conventional paradigm by proposing a combinatorial strategy that enables parallel screening of a multitude of alloys. Utilizing a typical metallic glass forming alloy system Zr-Cu-Al-Ag as an example, we demonstrate how glass formation and antibacterial activity, two unrelated properties, can be simultaneously characterized and the optimal composition can be efficiently identified. We found that in the Zr-Cu-Al-Ag alloy system fully glassy phase can be obtained in a wide compositional range by co-sputtering, and antibacterial activity is strongly dependent on alloy compositions. Our results indicate that antibacterial activity is sensitive to Cu and Ag while essentially remains unchanged within a wide range of Zr and Al. The proposed strategy not only facilitates development of high-performing alloys, but also provides a tool to unveil the composition dependence of properties in a highly parallel fashion, which helps the development of new materials by design.
Solving Connected Subgraph Problems in Wildlife Conservation
NASA Astrophysics Data System (ADS)
Dilkina, Bistra; Gomes, Carla P.
We investigate mathematical formulations and solution techniques for a variant of the Connected Subgraph Problem. Given a connected graph with costs and profits associated with the nodes, the goal is to find a connected subgraph that contains a subset of distinguished vertices. In this work we focus on the budget-constrained version, where we maximize the total profit of the nodes in the subgraph subject to a budget constraint on the total cost. We propose several mixed-integer formulations for enforcing the subgraph connectivity requirement, which plays a key role in the combinatorial structure of the problem. We show that a new formulation based on subtour elimination constraints is more effective at capturing the combinatorial structure of the problem, providing significant advantages over the previously considered encoding which was based on a single commodity flow. We test our formulations on synthetic instances as well as on real-world instances of an important problem in environmental conservation concerning the design of wildlife corridors. Our encoding results in a much tighter LP relaxation, and more importantly, it results in finding better integer feasible solutions as well as much better upper bounds on the objective (often proving optimality or within less than 1% of optimality), both when considering the synthetic instances as well as the real-world wildlife corridor instances.
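To make the problem setting concrete, the sketch below grows a connected subgraph greedily from a distinguished node, adding the most profitable-per-cost neighbour while the budget allows. This is only a toy heuristic on random data; the paper's contribution is the mixed-integer formulations (flow- and subtour-elimination-based), which are not reproduced here.

```python
# Not the paper's MIP: a tiny greedy heuristic for budget-constrained connected subgraphs.
import numpy as np

rng = np.random.default_rng(12)
n, budget = 40, 60.0
cost = rng.uniform(1, 10, n)
profit = rng.uniform(0, 5, n)
adj = {i: set() for i in range(n)}
for i in range(n):                               # random sparse graph (hypothetical)
    for j in rng.choice(n, 3, replace=False):
        if i != int(j):
            adj[i].add(int(j))
            adj[int(j)].add(i)

terminal = 0                                     # distinguished vertex to include
chosen, spent = {terminal}, cost[terminal]
while True:
    frontier = {v for u in chosen for v in adj[u]} - chosen
    candidates = [v for v in frontier if spent + cost[v] <= budget]
    if not candidates:
        break
    v = max(candidates, key=lambda v: profit[v] / cost[v])
    chosen.add(v)
    spent += cost[v]
print(f"nodes: {len(chosen)}, cost: {spent:.1f}, profit: {profit[list(chosen)].sum():.1f}")
```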
EcoFlex: A Multifunctional MoClo Kit for E. coli Synthetic Biology.
Moore, Simon J; Lai, Hung-En; Kelwick, Richard J R; Chee, Soo Mei; Bell, David J; Polizzi, Karen Marie; Freemont, Paul S
2016-10-21
Golden Gate cloning is a prominent DNA assembly tool in synthetic biology for the assembly of plasmid constructs often used in combinatorial pathway optimization, with a number of assembly kits developed specifically for yeast and plant-based expression. However, its use for synthetic biology in commonly used bacterial systems such as Escherichia coli has surprisingly been overlooked. Here, we introduce EcoFlex, a simplified modular package of DNA parts for a variety of applications in E. coli: cell-free protein synthesis, protein purification and hierarchical assembly of transcription units based on the MoClo assembly standard. The kit features a library of constitutive promoters, T7 expression, RBS strength variants, synthetic terminators, protein purification tags and fluorescent proteins. We validate EcoFlex by assembling a 31 kb plasmid containing 68 parts (20 genes), characterize the library parts in vivo and in vitro, and perform combinatorial pathway assembly using pooled libraries of either fluorescent proteins or the biosynthetic genes for the antimicrobial pigment violacein as a proof of concept. To minimize pathway screening, we also introduce a secondary module design site to simplify MoClo pathway optimization. In summary, EcoFlex provides a standardized and multifunctional kit for a variety of applications in E. coli synthetic biology.
Optimizing Variational Quantum Algorithms Using Pontryagin’s Minimum Principle
Yang, Zhi-Cheng; Rahmani, Armin; Shabani, Alireza; ...
2017-05-18
We use Pontryagin’s minimum principle to optimize variational quantum algorithms. We show that for a fixed computation time, the optimal evolution has a bang-bang (square pulse) form, both for closed and open quantum systems with Markovian decoherence. Our findings support the choice of evolution ansatz in the recently proposed quantum approximate optimization algorithm. Focusing on the Sherrington-Kirkpatrick spin glass as an example, we find a system-size independent distribution of the duration of pulses, with characteristic time scale set by the inverse of the coupling constants in the Hamiltonian. The optimality of the bang-bang protocols and the characteristic time scale of the pulses provide an efficient parametrization of the protocol and inform the search for effective hybrid (classical and quantum) schemes for tackling combinatorial optimization problems. Moreover, we find that the success rates of our optimal bang-bang protocols remain high even in the presence of weak external noise and coupling to a thermal bath.
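As a reminder of why square pulses appear (a textbook consequence of the minimum principle, stated here in generic notation rather than the paper's), when the control Hamiltonian is affine in the control, its pointwise minimization over a bounded interval is attained at the boundary:

\[
\mathcal{H}(x,\lambda,u) = f_0(x,\lambda) + u\,\Phi(x,\lambda), \quad u(t) \in [u_{\min}, u_{\max}]
\;\;\Longrightarrow\;\;
u^*(t) =
\begin{cases}
u_{\min}, & \Phi(x(t),\lambda(t)) > 0,\\
u_{\max}, & \Phi(x(t),\lambda(t)) < 0,
\end{cases}
\]
i.e. a bang-bang protocol, except on intervals where the switching function \(\Phi\) vanishes identically (singular arcs).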
ERIC Educational Resources Information Center
Duarte, Robert; Nielson, Janne T.; Dragojlovic, Veljko
2004-01-01
A group of techniques aimed at synthesizing a large number of structurally diverse compounds is called combinatorial synthesis. An experiment on the synthesis of chemiluminescent esters using parallel combinatorial synthesis and mix-and-split combinatorial synthesis is described.
Exploring metabolic pathways in genome-scale networks via generating flux modes.
Rezola, A; de Figueiredo, L F; Brock, M; Pey, J; Podhorski, A; Wittmann, C; Schuster, S; Bockmayr, A; Planes, F J
2011-02-15
The reconstruction of metabolic networks at the genome scale has allowed the analysis of metabolic pathways at an unprecedented level of complexity. Elementary flux modes (EFMs) are an appropriate concept for such analysis. However, their number grows in a combinatorial fashion as the size of the metabolic network increases, which renders the application of the EFM approach to large metabolic networks difficult. Novel methods are expected to deal with such complexity. In this article, we present a novel optimization-based method for determining a minimal generating set of EFMs, i.e., a convex basis. We show that a subset of elements of this convex basis can be effectively computed even in large metabolic networks. Our method was applied to examine the structure of pathways producing lysine in Escherichia coli. We obtained a more varied and informative set of pathways in comparison with existing methods. In addition, an alternative pathway to produce lysine was identified using a detour via propionyl-CoA, which shows the predictive power of our novel approach. The source code in C++ is available upon request.
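For reference, the object being generated can be written down compactly; the following is the standard definition of the steady-state flux cone and of a generating set, not the paper's specific algorithm.

\[
C = \{\, v \in \mathbb{R}^n : S\,v = 0,\ v_i \ge 0 \ \text{for all irreversible reactions } i \,\},
\]
where \(S\) is the stoichiometric matrix. If all reactions are (made) irreversible, \(C\) is a pointed polyhedral cone and a convex basis \(g_1,\dots,g_k\) is a minimal set of vectors such that every feasible flux distribution decomposes as \(v = \sum_j \lambda_j g_j\) with \(\lambda_j \ge 0\); since the extreme rays of such a cone are support-minimal, a convex basis is a (typically much smaller) subset of the elementary flux modes.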
ACOustic: A Nature-Inspired Exploration Indicator for Ant Colony Optimization.
Sagban, Rafid; Ku-Mahamud, Ku Ruhana; Abu Bakar, Muhamad Shahbani
2015-01-01
A statistical machine learning indicator, ACOustic, is proposed to evaluate the exploration behavior in the iterations of ant colony optimization algorithms. This idea is inspired by the behavior of some parasites in their mimicry of the queens' acoustics of their ant hosts. The parasites' reaction results from their ability to indicate the state of penetration. The proposed indicator solves the robustness problem caused by differences of magnitude in the distance matrix, especially when combinatorial optimization problems with rugged fitness landscapes are tackled. The performance of the proposed indicator is evaluated against the existing indicators in six variants of ant colony optimization algorithms. Instances of the travelling salesman problem and the quadratic assignment problem are used in the experimental evaluation. The analytical results showed that the proposed indicator is more informative and more robust.
Assessing for Structural Understanding in Children's Combinatorial Problem Solving.
ERIC Educational Resources Information Center
English, Lyn
1999-01-01
Assesses children's structural understanding of combinatorial problems when presented in a variety of task situations. Provides an explanatory model of students' combinatorial understandings that informs teaching and assessment. Addresses several components of children's structural understanding of elementary combinatorial problems. (Contains 50…
Genetic Networks and Anticipation of Gene Expression Patterns
NASA Astrophysics Data System (ADS)
Gebert, J.; Lätsch, M.; Pickl, S. W.; Radde, N.; Weber, G.-W.; Wünschiers, R.
2004-08-01
An interesting problem for computational biology is the analysis of time-series expression data. Here, the application of modern methods from dynamical systems, optimization theory, numerical algorithms and the utilization of implicit discrete information lead to a deeper understanding. In [1], we suggested representing the behavior of time-series gene expression patterns by a system of ordinary differential equations, which we analytically and algorithmically investigated under the parametrical aspect of stability or instability. Our algorithm strongly exploited combinatorial information. In this paper, we deepen, extend and exemplify this study from the viewpoint of the underlying mathematical modelling. This modelling consists in evaluating DNA-microarray measurements as the basis of anticipatory prediction, in the choice of a smooth model given by differential equations, in an approach of the right-hand side with parametric matrices, and in a discrete approximation which is a least squares optimization problem. We give a mathematical and biological discussion, and pay attention to the special case of a linear system, where the matrices do not depend on the state of expressions. Here, we present first numerical examples.
Lin, En-Chiang; Cole, Jesse J; Jacobs, Heiko O
2010-11-10
This article reports and applies a recently discovered programmable multimaterial deposition process to the formation and combinatorial improvement of 3D nanostructured devices. The gas-phase deposition process produces charged <5 nm particles of silver, tungsten, and platinum and uses externally biased electrodes to control the material flux and to turn deposition ON/OFF in selected domains. Domains host nanostructured dielectrics to define arrays of electrodynamic 10 × nanolenses to further control the flux to form <100 nm resolution deposits. The unique feature of the process is that material type, amount, and sequence can be altered from one domain to the next leading to different types of nanostructures including multimaterial bridges, interconnects, or nanowire arrays with 20 nm positional accuracy. These features enable combinatorial nanostructured materials and device discovery. As a first demonstration, we produce and identify in a combinatorial way 3D nanostructured electrode designs that improve light scattering, absorption, and minority carrier extraction of bulk heterojunction photovoltaic cells. Photovoltaic cells from domains with long and dense nanowire arrays improve the relative power conversion efficiency by 47% when compared to flat domains on the same substrate.
An application of different dioids in public key cryptography
DOE Office of Scientific and Technical Information (OSTI.GOV)
Durcheva, Mariana I., E-mail: mdurcheva66@gmail.com
2014-11-18
Dioids provide a natural framework for analyzing a broad class of discrete event dynamical systems such as the design and analysis of bus and railway timetables, scheduling of high-throughput industrial processes, solution of combinatorial optimization problems, the analysis and improvement of flow systems in communication networks. They have appeared in several branches of mathematics such as functional analysis, optimization, stochastic systems and dynamic programming, tropical geometry, fuzzy logic. In this paper we show how to involve dioids in public key cryptography. The main goal is to create key-exchange protocols based on dioids. Additionally the digital signature scheme is presented.
Combinatorial influence of environmental parameters on transcription factor activity
Knijnenburg, T.A.; Wessels, L.F.A.; Reinders, M.J.T.
2008-01-01
Motivation: Cells receive a wide variety of environmental signals, which are often processed combinatorially to generate specific genetic responses. Changes in transcript levels, as observed across different environmental conditions, can, to a large extent, be attributed to changes in the activity of transcription factors (TFs). However, in unraveling these transcription regulation networks, the actual environmental signals are often not incorporated into the model, simply because they have not been measured. The unquantified heterogeneity of the environmental parameters across microarray experiments frustrates regulatory network inference. Results: We propose an inference algorithm that models the influence of environmental parameters on gene expression. The approach is based on a yeast microarray compendium of chemostat steady-state experiments. Chemostat cultivation enables the accurate control and measurement of many of the key cultivation parameters, such as nutrient concentrations, growth rate and temperature. The observed transcript levels are explained by inferring the activity of TFs in response to combinations of cultivation parameters. The interplay between activated enhancers and repressors that bind a gene promoter determine the possible up- or downregulation of the gene. The model is translated into a linear integer optimization problem. The resulting regulatory network identifies the combinatorial effects of environmental parameters on TF activity and gene expression. Availability: The Matlab code is available from the authors upon request. Contact: t.a.knijnenburg@tudelft.nl Supplementary information: Supplementary data are available at Bioinformatics online. PMID:18586711
Chimeric Rhinoviruses Displaying MPER Epitopes Elicit Anti-HIV Neutralizing Responses
Yi, Guohua; Lapelosa, Mauro; Bradley, Rachel; Mariano, Thomas M.; Dietz, Denise Elsasser; Hughes, Scott; Wrin, Terri; Petropoulos, Chris; Gallicchio, Emilio; Levy, Ronald M.; Arnold, Eddy; Arnold, Gail Ferstandig
2013-01-01
Background The development of an effective AIDS vaccine has been a formidable task, but remains a critical necessity. The well conserved membrane-proximal external region (MPER) of the HIV-1 gp41 glycoprotein is one of the crucial targets for AIDS vaccine development, as it has the necessary attribute of being able to elicit antibodies capable of neutralizing diverse isolates of HIV. Methodology/Principal Findings Guided by X-ray crystallography, molecular modeling, combinatorial chemistry, and powerful selection techniques, we designed and produced six combinatorial libraries of chimeric human rhinoviruses (HRV) displaying the MPER epitopes corresponding to mAbs 2F5, 4E10, and/or Z13e1, connected to an immunogenic surface loop of HRV via linkers of varying lengths and sequences. Not all libraries led to viable chimeric viruses with the desired sequences, but the combinatorial approach allowed us to examine large numbers of MPER-displaying chimeras. Among the chimeras were five that elicited antibodies capable of significantly neutralizing HIV-1 pseudoviruses from at least three subtypes, in one case leading to neutralization of 10 pseudoviruses from all six subtypes tested. Conclusions Optimization of these chimeras or closely related chimeras could conceivably lead to useful components of an effective AIDS vaccine. While the MPER of HIV may not be immunodominant in natural infection by HIV-1, its presence in a vaccine cocktail could provide critical breadth of protection. PMID:24039745
Pesavento, James J; Garcia, Benjamin A; Streeky, James A; Kelleher, Neil L; Mizzen, Craig A
2007-09-01
Recent developments in top down mass spectrometry have enabled closely related histone variants and their modified forms to be identified and quantitated with unprecedented precision, facilitating efforts to better understand how histones contribute to the epigenetic regulation of gene transcription and other nuclear processes. It is therefore crucial that intact MS profiles accurately reflect the levels of variants and modified forms present in a given cell type or cell state for the full benefit of such efforts to be realized. Here we show that partial oxidation of Met and Cys residues in histone samples prepared by conventional methods, together with oxidation that can accrue during storage or during chip-based automated nanoflow electrospray ionization, confounds MS analysis by altering the intact MS profile as well as hindering posttranslational modification localization after MS/MS. We also describe an optimized performic acid oxidation procedure that circumvents these problems without catalyzing additional oxidations or altering the levels of posttranslational modifications common in histones. MS and MS/MS of HeLa cell core histones confirmed that Met and Cys were the only residues oxidized and that complete oxidation restored true intact abundance ratios and significantly enhanced MS/MS data quality. This allowed for the unequivocal detection, at the intact molecule level, of novel combinatorially modified forms of H4 that would have been missed otherwise. Oxidation also enhanced the separation of human core histones by reverse phase chromatography and decreased the levels of salt-adducted forms observed in ESI-FTMS. This method represents a simple and easily automated means for enhancing the accuracy and sensitivity of top down analyses of combinatorially modified forms of histones that may also be of benefit for top down or bottom up analyses of other proteins.
Combinatorial structures to modeling simple games and applications
NASA Astrophysics Data System (ADS)
Molinero, Xavier
2017-09-01
We connect three different topics: combinatorial structures, game theory and chemistry. In particular, we establish the bases to represent some simple games, defined as influence games, and molecules, defined from atoms, by using combinatorial structures. First, we characterize simple games as influence games using influence graphs. This lets us model simple games as combinatorial structures (from the viewpoint of structures or graphs). Second, we formally define molecules as combinations of atoms. This lets us model molecules as combinatorial structures (from the viewpoint of combinations). It remains open to generate such combinatorial structures using specific techniques such as genetic algorithms, (meta-)heuristic algorithms and parallel programming, among others.
Combinatorial studies of (1-x)Na0.5Bi0.5TiO3-xBaTiO3 thin-film chips
NASA Astrophysics Data System (ADS)
Cheng, Hong-Wei; Zhang, Xue-Jin; Zhang, Shan-Tao; Feng, Yan; Chen, Yan-Feng; Liu, Zhi-Guo; Cheng, Guang-Xi
2004-09-01
Applying a combinatorial methodology, (1-x)Na0.5Bi0.5TiO3-xBaTiO3 (NBT-BT) thin-film chips were fabricated on (001)-LaAlO3 substrates by pulsed laser deposition with a few quaternary masks. A series of NBT-BT libraries with BT compositions ranging from 0 to 44% was obtained with uniform composition and good crystallinity. The relation between the composition of NBT-BT and the structural and dielectric properties was investigated by x-ray diffraction (XRD), evanescent microwave probe, atomic force microscopy, and Raman spectroscopy. An obvious morphotropic phase boundary (MPB) was established at about 9% BT by XRD, Raman frequency shift, and dielectric anomaly, different from the well-known MPB of the materials. The result shows the high efficiency of the combinatorial method in searching for new relaxor ferroelectrics.
Hybrid Microgrid Configuration Optimization with Evolutionary Algorithms
NASA Astrophysics Data System (ADS)
Lopez, Nicolas
This dissertation explores the Renewable Energy Integration Problem and proposes a Genetic Algorithm embedded with a Monte Carlo simulation to solve large instances of the problem that are impractical to solve via full enumeration. The Renewable Energy Integration Problem is defined as finding the optimum set of components to supply the electric demand of a hybrid microgrid. The components considered are solar panels, wind turbines, diesel generators, electric batteries, connections to the power grid and converters, which can be inverters and/or rectifiers. The methodology developed is explained, as well as the combinatorial formulation. In addition, two case studies of a single-objective optimization version of the problem are presented, minimizing cost and minimizing global warming potential (GWP), followed by a multi-objective implementation of the proposed methodology utilizing a non-sorting Genetic Algorithm embedded with a Monte Carlo simulation. The method is validated by solving a small instance of the problem with known solution via a full enumeration algorithm developed by NREL in their software HOMER. The dissertation concludes that evolutionary algorithms embedded with Monte Carlo simulation, namely modified Genetic Algorithms, are an efficient way of solving the problem, finding approximate solutions in the case of single-objective optimization and approximating the true Pareto front in the case of multi-objective optimization of the Renewable Energy Integration Problem.
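The coupling of a genetic algorithm with Monte Carlo fitness evaluation can be sketched in a few dozen lines. The component costs, output statistics, demand model, and penalty below are invented placeholders, and the GA operators are deliberately minimal; this is an illustration of the scheme, not the dissertation's implementation.

import random

# Hypothetical component data (capital cost per unit, mean hourly output in kWh).
# All numbers are illustrative placeholders, not values from the dissertation.
COMPONENTS = {
    "solar":   {"cost": 1200.0, "mean_kwh": 0.8},
    "wind":    {"cost": 3000.0, "mean_kwh": 2.0},
    "diesel":  {"cost":  800.0, "mean_kwh": 5.0},
    "battery": {"cost":  500.0, "mean_kwh": 1.0},
}
KEYS = list(COMPONENTS)
MEAN_DEMAND_KWH = 10.0        # assumed mean hourly demand of the microgrid
PENALTY_PER_KWH = 50.0        # assumed penalty discouraging unmet load

def monte_carlo_fitness(counts, trials=200):
    """Expected cost = capital cost + Monte Carlo estimate of unmet-load penalty."""
    capital = sum(counts[k] * COMPONENTS[k]["cost"] for k in KEYS)
    penalty = 0.0
    for _ in range(trials):
        demand = random.gauss(MEAN_DEMAND_KWH, 2.0)
        supply = sum(counts[k] * random.gauss(COMPONENTS[k]["mean_kwh"], 0.3)
                     for k in KEYS)
        penalty += max(0.0, demand - supply) * PENALTY_PER_KWH
    return capital + penalty / trials

def random_individual(max_units=10):
    return {k: random.randint(0, max_units) for k in KEYS}

def crossover(a, b):
    return {k: random.choice((a[k], b[k])) for k in KEYS}

def mutate(ind, rate=0.2, max_units=10):
    return {k: (random.randint(0, max_units) if random.random() < rate else v)
            for k, v in ind.items()}

def genetic_algorithm(pop_size=30, generations=40):
    population = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        ranked = sorted(population, key=monte_carlo_fitness)
        parents = ranked[: pop_size // 2]            # truncation selection
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        population = parents + children
    return min(population, key=monte_carlo_fitness)

best = genetic_algorithm()
print("best configuration:", best)
print("estimated cost:", round(monte_carlo_fitness(best), 1))

Each fitness call averages the unmet-load penalty over many random demand and supply draws, which is exactly where the Monte Carlo simulation enters the evolutionary loop.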
Combinatorial synthesis of bimetallic complexes with three halogeno bridges.
Gauthier, Sébastien; Quebatte, Laurent; Scopelliti, Rosario; Severin, Kay
2004-06-07
Methods for the synthesis of bimetallic complexes in which two different metal fragments are connected by three chloro or bromo bridges are reported. The reactions are general, fast, and give rise to structurally defined products in quantitative yields. Therefore, they are ideally suited for generating a library of homo- and heterobimetallic complexes in a combinatorial fashion. This is of special interest for applications in homogeneous catalysis. Selected members of this library were synthesized and comprehensively characterized; single-crystal X-ray analyses were performed for 15 new bimetallic compounds.
The effects of variable biome distribution on global climate
DOE Office of Scientific and Technical Information (OSTI.GOV)
Noever, D.A.; Brittain, A.; Matsos, H.C.
1996-12-31
In projecting climatic adjustments to anthropogenically elevated atmospheric carbon dioxide, most global climate models fix biome distribution to current geographic conditions. The authors develop a model that examines the albedo-related effects of biome distribution on global temperature. The model was tested on historical biome changes since 1860 and the results fit both the observed trend and order of magnitude change in global temperature. Once backtested in this way on historical data, the model is then used to generate an optimized future biome distribution which minimizes projected greenhouse effects on global temperature. Because of the complexity of this combinatorial search, an artificial intelligence method, the genetic algorithm, was employed. The genetic algorithm assigns various biome distributions to the planet, then adjusts their percentage area and albedo effects to regulate or moderate temperature changes.
Logical analysis of diffuse large B-cell lymphomas.
Alexe, G; Alexe, S; Axelrod, D E; Hammer, P L; Weissmann, D
2005-07-01
The goal of this study is to re-examine the oligonucleotide microarray dataset of Shipp et al., which contains the intensity levels of 6817 genes of 58 patients with diffuse large B-cell lymphoma (DLBCL) and 19 with follicular lymphoma (FL), by means of the combinatorics, optimisation, and logic-based methodology of logical analysis of data (LAD). The motivations for this new analysis included the previously demonstrated capabilities of LAD and its expected potential (1) to identify different informative genes than those discovered by conventional statistical methods, (2) to identify combinations of gene expression levels capable of characterizing different types of lymphoma, and (3) to assemble collections of such combinations that if considered jointly are capable of accurately distinguishing different types of lymphoma. The central concept of LAD is a pattern or combinatorial biomarker, a concept that resembles a rule as used in decision tree methods. LAD is able to exhaustively generate the collection of all those patterns which satisfy certain quality constraints, through a systematic combinatorial process guided by clear optimization criteria. Then, based on a set covering approach, LAD aggregates the collection of patterns into classification models. In addition, LAD is able to use the information provided by large collections of patterns in order to extract subsets of variables, which collectively are able to distinguish between different types of disease. For the differential diagnosis of DLBCL versus FL, a model based on eight significant genes is constructed and shown to have a sensitivity of 94.7% and a specificity of 100% on the test set. For the prognosis of good versus poor outcome among the DLBCL patients, a model is constructed on another set consisting also of eight significant genes, and shown to have a sensitivity of 87.5% and a specificity of 90% on the test set. The genes selected by LAD also work well as a basis for other kinds of statistical analysis, indicating their robustness. These two models exhibit accuracies that compare favorably to those in the original study. In addition, the current study also provides a ranking by importance of the genes in the selected significant subsets as well as a library of dozens of combinatorial biomarkers (i.e. pairs or triplets of genes) that can serve as a source of mathematically generated, statistically significant research hypotheses in need of biological explanation.
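The set-covering step that aggregates patterns into a classification model can be illustrated with a few lines of greedy selection; the patterns and the samples they cover below are invented, and LAD's actual model construction uses optimization criteria beyond this simple greedy rule.

# Toy sketch of the set-covering step: given patterns (rules) and the positive
# samples each one covers, greedily pick patterns until every positive sample is
# covered.  Patterns and sample indices are invented; LAD's actual aggregation
# uses optimization criteria beyond this greedy rule.
patterns = {
    "g1 high AND g5 low":  {1, 2, 3, 7},
    "g2 low":              {2, 4, 5},
    "g8 high AND g3 high": {5, 6, 7, 8},
    "g4 high":             {1, 8},
}
to_cover = set(range(1, 9))
model = []
while to_cover:
    best = max(patterns, key=lambda p: len(patterns[p] & to_cover))
    if not patterns[best] & to_cover:
        break                     # remaining samples cannot be covered
    model.append(best)
    to_cover -= patterns[best]
print("patterns selected for the classification model:", model)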
Engineered Peptides for Applications in Cancer-Targeted Drug Delivery and Tumor Detection.
Soudy, R; Byeon, N; Raghuwanshi, Y; Ahmed, S; Lavasanifar, A; Kaur, K
2017-01-01
Cancer-targeting peptides as ligands for targeted delivery of anticancer drugs or drug carriers have the potential to significantly enhance the selectivity and the therapeutic benefit of current chemotherapeutic agents. Identification of tumor-specific biomarkers like integrins, aminopeptidase N, and epidermal growth factor receptor as well as the popularity of phage display techniques along with synthetic combinatorial methods used for peptide design and structure optimization have fueled the advancement and application of peptide ligands for targeted drug delivery and tumor detection in cancer treatment, detection and guided therapy. Although considerable preclinical data have shown remarkable success in the use of tumor targeting peptides, peptides generally suffer from poor pharmacokinetics, enzymatic instability, and weak receptor affinity, and they need further structural modification before successful translation to clinics is possible. The current review gives an overview of the different engineering strategies that have been developed for peptide structure optimization to confer selectivity and stability. We also provide an update on the methods used for peptide ligand identification, and peptide-receptor interactions. Additionally, some applications for the use of peptides in targeted delivery of chemotherapeutics and diagnostics over the past 5 years are summarized.
CombiROC: an interactive web tool for selecting accurate marker combinations of omics data.
Mazzara, Saveria; Rossi, Riccardo L; Grifantini, Renata; Donizetti, Simone; Abrignani, Sergio; Bombaci, Mauro
2017-03-30
Diagnostic accuracy can be improved considerably by combining multiple markers, whose performance in identifying diseased subjects is usually assessed via receiver operating characteristic (ROC) curves. The selection of multimarker signatures is a complicated process that requires integration of data signatures with sophisticated statistical methods. We developed a user-friendly tool, called CombiROC, to help researchers accurately determine optimal marker combinations from diverse omics methods. With CombiROC, data from different domains, such as proteomics and transcriptomics, can be analyzed using sensitivity/specificity filters: the number of candidate marker panels arising from combinatorial analysis is easily optimized, bypassing limitations imposed by the nature of different experimental approaches. Leaving the user full control over the initial selection stringency, CombiROC computes sensitivity and specificity for all marker combinations, the performance of the best combinations, and ROC curves for automatic comparisons, all visualized in a graphic interface. CombiROC was designed without hard-coded thresholds, allowing a custom fit to each specific dataset: this dramatically reduces the computational burden and lowers the false negative rates given by fixed thresholds. The application was validated with published data, confirming the marker combinations originally described or even finding new ones. CombiROC is a novel tool for the scientific community, freely available at http://CombiROC.eu.
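The combinatorial core, computing sensitivity and specificity for every marker panel and keeping those that pass user-chosen filters, is easy to sketch. The toy data, the thresholds, and the "positive if any marker exceeds its threshold" panel rule below are illustrative assumptions, not CombiROC's exact scoring.

from itertools import combinations

# Toy data: each sample is (label, {marker: value}); label 1 = diseased, 0 = healthy.
# Values and thresholds are invented for illustration only.
SAMPLES = [
    (1, {"A": 5.2, "B": 0.4, "C": 3.1}),
    (1, {"A": 4.8, "B": 2.9, "C": 0.2}),
    (1, {"A": 0.3, "B": 3.5, "C": 2.8}),
    (0, {"A": 0.2, "B": 0.5, "C": 0.1}),
    (0, {"A": 1.1, "B": 0.3, "C": 0.4}),
    (0, {"A": 0.4, "B": 0.2, "C": 2.5}),
]
THRESHOLDS = {"A": 2.0, "B": 2.0, "C": 2.0}

def panel_call(markers, panel):
    # Illustrative rule: a sample is called positive if ANY panel marker exceeds its threshold.
    return any(markers[m] > THRESHOLDS[m] for m in panel)

def sens_spec(panel):
    tp = sum(1 for y, m in SAMPLES if y == 1 and panel_call(m, panel))
    fn = sum(1 for y, m in SAMPLES if y == 1 and not panel_call(m, panel))
    tn = sum(1 for y, m in SAMPLES if y == 0 and not panel_call(m, panel))
    fp = sum(1 for y, m in SAMPLES if y == 0 and panel_call(m, panel))
    return tp / (tp + fn), tn / (tn + fp)

markers = list(THRESHOLDS)
for size in range(1, len(markers) + 1):
    for panel in combinations(markers, size):
        se, sp = sens_spec(panel)
        if se >= 0.9 and sp >= 0.6:          # user-chosen sensitivity/specificity filters
            print(panel, "sensitivity %.2f specificity %.2f" % (se, sp))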
High-Throughput Identification of Combinatorial Ligands for DNA Delivery in Cell Culture
NASA Astrophysics Data System (ADS)
Svahn, Mathias G.; Rabe, Kersten S.; Barger, Geoffrey; EL-Andaloussi, Samir; Simonson, Oscar E.; Didier, Boturyn; Olivier, Renaudet; Dumy, Pascal; Brandén, Lars J.; Niemeyer, Christof M.; Smith, C. I. Edvard
2008-10-01
Finding the optimal combinations of ligands for tissue-specific delivery is tedious even if only a few well-established compounds are tested. The cargo affects the receptor-ligand interaction, especially when it is charged like DNA. The ligand should therefore be evaluated together with its cargo. Several viruses have been shown to interact with more than one receptor, for efficient internalization. We here present a DNA oligonucleotide-based method for inexpensive and rapid screening of biotin labeled ligands for combinatorial effects on cellular binding and uptake. The oligonucleotide complex was designed as a 44 bp double-stranded DNA oligonucleotide with one central streptavidin molecule and a second streptavidin at the terminus. The use of a highly advanced robotic platform ensured stringent processing and execution of the experiments. The oligonucleotides were fluorescently labeled and used for detection and analysis of cell-bound, internalized and intra-cellular compartmentalized constructs by an automated line-scanning confocal microscope, IN Cell Analyzer 3000. All possible combinations of 22 ligands were explored in sets of 2 and tested on 6 different human cell lines in triplicates. In total, 10 000 transfections were performed on the automation platform. Cell-specific combinations of ligands were identified and their relative position on the scaffold oligonucleotide was found to be of importance. The ligands were found to be cargo dependent, carbohydrates were more potent for DNA delivery whereas cell penetrating peptides were more potent for delivery of less charged particles.
Boyen, Peter; Van Dyck, Dries; Neven, Frank; van Ham, Roeland C H J; van Dijk, Aalt D J
2011-01-01
Correlated motif mining (CMM) is the problem of finding overrepresented pairs of patterns, called motifs, in sequences of interacting proteins. Algorithmic solutions for CMM thereby provide a computational method for predicting binding sites for protein interaction. In this paper, we adopt a motif-driven approach where the support of candidate motif pairs is evaluated in the network. We experimentally establish the superiority of the Chi-square-based support measure over other support measures. Furthermore, we obtain that CMM is an NP-hard problem for a large class of support measures (including Chi-square) and reformulate the search for correlated motifs as a combinatorial optimization problem. We then present the generic metaheuristic SLIDER which uses steepest ascent with a neighborhood function based on sliding motifs and employs the Chi-square-based support measure. We show that SLIDER outperforms existing motif-driven CMM methods and scales to large protein-protein interaction networks. The SLIDER implementation and the data used in the experiments are available on http://bioinformatics.uhasselt.be.
Sequential bearings-only-tracking initiation with particle filtering method.
Liu, Bin; Hao, Chengpeng
2013-01-01
The tracking initiation problem is examined in the context of autonomous bearings-only tracking (BOT) of a single appearing/disappearing target in the presence of clutter measurements. In general, this problem suffers from a combinatorial explosion in the number of potential tracks resulting from the uncertainty in the linkage between the target and the measurement (a.k.a. the data association problem). In addition, the nonlinear measurements lead to a non-Gaussian posterior probability density function (pdf) in the optimal Bayesian sequential estimation framework. The consequence of this nonlinear/non-Gaussian context is the absence of a closed-form solution. This paper models the linkage uncertainty and the nonlinear/non-Gaussian estimation problem jointly with solid Bayesian formalism. A particle filtering (PF) algorithm is derived for estimating the model's parameters in a sequential manner. Numerical results show that the proposed solution provides a significant benefit over the most commonly used methods, IPDA and IMMPDA. The posterior Cramér-Rao bounds are also provided for performance evaluation.
Liu, Wang; Li, Yu-Long; Feng, Mu-Ting; Zhao, Yu-Wei; Ding, Xianting; He, Ben; Liu, Xuan
2018-01-01
Aim: Combined use of herbal medicines in patients undergoing dual antiplatelet therapy (DAPT) might cause bleeding or thrombosis because herbal medicines with anti-platelet activities may exhibit interactions with DAPT. In this study, we tried to use a feedback system control (FSC) optimization technique to optimize the dose strategy and clarify possible interactions in the combined use of DAPT and herbal medicines. Methods: Herbal medicines with reported anti-platelet activities were selected by searching related references in PubMed. Experimental anti-platelet activities of representative compounds originating from these herbal medicines were investigated using an in vitro assay, namely ADP-induced aggregation of rat platelet-rich plasma. The FSC scheme hybridized artificial intelligence calculation and bench experiments to iteratively optimize 4-drug and 2-drug combinations from these drug candidates. Results: In total, 68 herbal medicines were reported to have anti-platelet activities. In the present study, 7 representative compounds from these herbal medicines were selected to study combinatorial drug optimization together with DAPT, i.e., aspirin and ticagrelor. The FSC technique first down-selected the 9 drug candidates to the 5 most significant drugs. Then, FSC further secured 4 drugs in the optimal combination, including aspirin, ticagrelor, ferulic acid from DangGui, and forskolin from MaoHouQiaoRuiHua. Finally, FSC quantitatively estimated the possible interactions between aspirin:ticagrelor, aspirin:ferulic acid, ticagrelor:forskolin, and ferulic acid:forskolin. The estimation was further verified by experimentally determined Combination Index (CI) values. Conclusion: The results of the present study suggested that the FSC optimization technique could be used to optimize anti-platelet drug combinations and might be helpful in designing personalized anti-platelet therapy strategies. Furthermore, FSC analysis could also identify interactions between different drugs, which might provide useful information for research on signaling cascades in platelets. PMID:29780330
Fast Combinatorial Algorithm for the Solution of Linearly Constrained Least Squares Problems
Van Benthem, Mark H.; Keenan, Michael R.
2008-11-11
A fast combinatorial algorithm can significantly reduce the computational burden when solving general equality and inequality constrained least squares problems with large numbers of observation vectors. The combinatorial algorithm provides a mathematically rigorous solution and operates at great speed by reorganizing the calculations to take advantage of the combinatorial nature of the problems to be solved. The combinatorial algorithm exploits the structure that exists in large-scale problems in order to minimize the number of arithmetic operations required to obtain a solution.
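The speed-up comes from the combinatorial observation that many observation vectors end up with the same set of unconstrained ("passive") variables and can therefore share one unconstrained least-squares solve. The sketch below only illustrates that grouping idea on synthetic data; the actual algorithm interleaves the grouping with its active-set iterations rather than running a per-column NNLS first.

import numpy as np
from scipy.optimize import nnls
from collections import defaultdict

# Illustration of the grouping idea: columns of B whose non-negative solutions
# share the same passive set (strictly positive variables) can be re-solved
# together with a single unconstrained least-squares factorization.
rng = np.random.default_rng(0)
A = rng.random((20, 4))
X_true = np.where(rng.random((4, 50)) > 0.4, rng.random((4, 50)), 0.0)
B = A @ X_true

passive_sets = defaultdict(list)
for j in range(B.shape[1]):
    x, _ = nnls(A, B[:, j])                # column-by-column NNLS
    passive_sets[tuple(np.nonzero(x > 1e-12)[0])].append(j)

X = np.zeros((A.shape[1], B.shape[1]))
for passive, cols in passive_sets.items():
    if not passive:
        continue
    sub = list(passive)
    # one least-squares solve shared by every column in the group
    X[np.ix_(sub, cols)], *_ = np.linalg.lstsq(A[:, sub], B[:, cols], rcond=None)

print("passive-set groups:", len(passive_sets), "max residual:", np.abs(A @ X - B).max())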
Combinatorial discovery of enzymes with utility in biomass transformation
Fox, Brian G; Elsen, Nathaniel L
2015-02-03
Methods for the cell-free identification of polypeptide and polypeptide combinations with utility in biomass transformation, as well as specific novel polypeptides and cell-free systems containing polypeptide combinations discovered by such methods are disclosed.
Alignment of Tractograms As Graph Matching.
Olivetti, Emanuele; Sharmin, Nusrat; Avesani, Paolo
2016-01-01
The white matter pathways of the brain can be reconstructed as 3D polylines, called streamlines, through the analysis of diffusion magnetic resonance imaging (dMRI) data. The whole set of streamlines is called tractogram and represents the structural connectome of the brain. In multiple applications, like group-analysis, segmentation, or atlasing, tractograms of different subjects need to be aligned. Typically, this is done with registration methods, that transform the tractograms in order to increase their similarity. In contrast with transformation-based registration methods, in this work we propose the concept of tractogram correspondence, whose aim is to find which streamline of one tractogram corresponds to which streamline in another tractogram, i.e., a map from one tractogram to another. As a further contribution, we propose to use the relational information of each streamline, i.e., its distances from the other streamlines in its own tractogram, as the building block to define the optimal correspondence. We provide an operational procedure to find the optimal correspondence through a combinatorial optimization problem and we discuss its similarity to the graph matching problem. In this work, we propose to represent tractograms as graphs and we adopt a recent inexact sub-graph matching algorithm to approximate the solution of the tractogram correspondence problem. On tractograms generated from the Human Connectome Project dataset, we report experimental evidence that tractogram correspondence, implemented as graph matching, provides much better alignment than affine registration and comparable if not better results than non-linear registration of volumes.
Making it stick: chasing the optimal stem cells for cardiac regeneration
Quijada, Pearl; Sussman, Mark A
2014-01-01
Despite the increasing use of stem cells for regenerative-based cardiac therapy, the optimal stem cell population(s) remains in a cloud of uncertainty. In the past decade, the field has witnessed a surge of researchers discovering stem cell populations reported to directly and/or indirectly contribute to cardiac regeneration through processes of cardiomyogenic commitment and/or release of cardioprotective paracrine factors. This review centers upon defining basic biological characteristics of stem cells used for sustaining cardiac integrity during disease and maintenance of communication between the cardiac environment and stem cells. Given the limited successes achieved so far in regenerative therapy, the future requires development of unprecedented concepts involving combinatorial approaches to create and deliver the optimal stem cell(s) that will enhance myocardial healing. PMID:25340282
Optimal placement of actuators and sensors in control augmented structural optimization
NASA Technical Reports Server (NTRS)
Sepulveda, A. E.; Schmit, L. A., Jr.
1990-01-01
A control-augmented structural synthesis methodology is presented in which actuator and sensor placement is treated in terms of (0,1) variables. Structural member sizes and control variables are treated simultaneously as design variables. A multiobjective utopian approach is used to obtain a compromise solution for inherently conflicting objective functions such as structural mass, control effort and number of actuators. Constraints are imposed on transient displacements, natural frequencies, actuator forces and dynamic stability as well as controllability and observability of the system. The combinatorial aspects of the mixed-(0,1) continuous variable design optimization problem are made tractable by combining approximation concepts with branch and bound techniques. Some numerical results for example problems are presented to illustrate the efficacy of the design procedure set forth.
A coherent Ising machine for 2000-node optimization problems
NASA Astrophysics Data System (ADS)
Inagaki, Takahiro; Haribara, Yoshitaka; Igarashi, Koji; Sonobe, Tomohiro; Tamate, Shuhei; Honjo, Toshimori; Marandi, Alireza; McMahon, Peter L.; Umeki, Takeshi; Enbutsu, Koji; Tadanaga, Osamu; Takenouchi, Hirokazu; Aihara, Kazuyuki; Kawarabayashi, Ken-ichi; Inoue, Kyo; Utsunomiya, Shoko; Takesue, Hiroki
2016-11-01
The analysis and optimization of complex systems can be reduced to mathematical problems collectively known as combinatorial optimization. Many such problems can be mapped onto ground-state search problems of the Ising model, and various artificial spin systems are now emerging as promising approaches. However, physical Ising machines have suffered from limited numbers of spin-spin couplings because of implementations based on localized spins, resulting in severe scalability problems. We report a 2000-spin network with all-to-all spin-spin couplings. Using a measurement and feedback scheme, we coupled time-multiplexed degenerate optical parametric oscillators to implement maximum cut problems on arbitrary graph topologies with up to 2000 nodes. Our coherent Ising machine outperformed simulated annealing in terms of accuracy and computation time for a 2000-node complete graph.
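For readers who want to see the problem class in code, the sketch below writes MAX-CUT on a random graph as an Ising ground-state search and runs the plain simulated-annealing baseline mentioned above. It is a classical reference implementation only; the optical machine itself, its measurement-and-feedback coupling, and the 2000-node instances are not modeled.

import math
import random

# MAX-CUT on a small random graph as an Ising ground-state search, solved with
# simulated annealing as a classical baseline.
random.seed(1)
N = 60
EDGES = [(i, j) for i in range(N) for j in range(i + 1, N) if random.random() < 0.1]
NEIGHBORS = [[] for _ in range(N)]
for i, j in EDGES:
    NEIGHBORS[i].append(j)
    NEIGHBORS[j].append(i)

def cut_value(spins):
    # An edge contributes 1 to the cut when its endpoints carry opposite spins.
    return sum((1 - spins[i] * spins[j]) // 2 for i, j in EDGES)

def simulated_annealing(steps=20000, t_start=2.0, t_end=0.01):
    spins = [random.choice((-1, 1)) for _ in range(N)]
    best_cut = cut_value(spins)
    for step in range(steps):
        t = t_start * (t_end / t_start) ** (step / steps)   # geometric cooling
        i = random.randrange(N)
        # Flipping s_i changes the Ising energy H = sum_{(i,j) in E} s_i s_j by
        # -2 * s_i * sum_{j in N(i)} s_j; lower energy means a larger cut.
        delta_e = -2 * spins[i] * sum(spins[j] for j in NEIGHBORS[i])
        if delta_e <= 0 or random.random() < math.exp(-delta_e / t):
            spins[i] = -spins[i]
            best_cut = max(best_cut, cut_value(spins))
    return best_cut

print("edges:", len(EDGES), "best cut found:", simulated_annealing())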
Qiu, Wu; Yuan, Jing; Ukwatta, Eranga; Sun, Yue; Rajchl, Martin; Fenster, Aaron
2014-04-01
We propose a novel global optimization-based approach to segmentation of 3-D prostate transrectal ultrasound (TRUS) and T2 weighted magnetic resonance (MR) images, enforcing inherent axial symmetry of prostate shapes to simultaneously adjust a series of 2-D slice-wise segmentations in a "global" 3-D sense. We show that the introduced challenging combinatorial optimization problem can be solved globally and exactly by means of convex relaxation. In this regard, we propose a novel coherent continuous max-flow model (CCMFM), which derives a new and efficient duality-based algorithm, leading to a GPU-based implementation to achieve high computational speeds. Experiments with 25 3-D TRUS images and 30 3-D T2w MR images from our dataset, and 50 3-D T2w MR images from a public dataset, demonstrate that the proposed approach can segment a 3-D prostate TRUS/MR image within 5-6 s including 4-5 s for initialization, yielding a mean Dice similarity coefficient of 93.2%±2.0% for 3-D TRUS images and 88.5%±3.5% for 3-D MR images. The proposed method also yields relatively low intra- and inter-observer variability introduced by user manual initialization, suggesting a high reproducibility, independent of observers.
Generating subtour elimination constraints for the TSP from pure integer solutions.
Pferschy, Ulrich; Staněk, Rostislav
2017-01-01
The traveling salesman problem (TSP) is one of the most prominent combinatorial optimization problems. Given a complete graph G = (V, E) and non-negative distances d for every edge, the TSP asks for a shortest tour through all vertices with respect to the distances d. The method of choice for solving the TSP to optimality is a branch and cut approach. Usually the integrality constraints are relaxed first and all separation processes to identify violated inequalities are done on fractional solutions. In our approach we try to exploit the impressive performance of current ILP-solvers and work only with integer solutions without ever interfering with fractional solutions. We stick to a very simple ILP-model and relax the subtour elimination constraints only. The resulting problem is solved to integer optimality, violated constraints (which are trivial to find) are added and the process is repeated until a feasible solution is found. In order to speed up the algorithm we pursue several attempts to find as many relevant subtours as possible. These attempts are based on the clustering of vertices with additional insights gained from empirical observations and random graph theory. Computational results are performed on test instances taken from the TSPLIB95 and on random Euclidean graphs.
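The loop described above is short enough to sketch end to end with an off-the-shelf ILP solver. Here PuLP (with its bundled CBC backend) and networkx stand in for the authors' tooling, the instance is a small random Euclidean one rather than TSPLIB95, and the clustering-based cut-strengthening heuristics are omitted.

import random
import networkx as nx
import pulp

# Solve a degree-constrained ILP relaxation of the symmetric TSP, read the
# connected components of the integer solution, add one subtour elimination
# constraint per component, and repeat until the solution is a single tour.
random.seed(3)
pts = [(random.random(), random.random()) for _ in range(12)]
n = len(pts)
dist = {(i, j): ((pts[i][0] - pts[j][0]) ** 2 + (pts[i][1] - pts[j][1]) ** 2) ** 0.5
        for i in range(n) for j in range(i + 1, n)}

prob = pulp.LpProblem("tsp_sec_loop", pulp.LpMinimize)
x = {e: pulp.LpVariable("x_%d_%d" % e, cat="Binary") for e in dist}
prob += pulp.lpSum(dist[e] * x[e] for e in dist)          # tour length
for v in range(n):                                        # degree 2 at every city
    prob += pulp.lpSum(x[e] for e in dist if v in e) == 2

while True:
    prob.solve(pulp.PULP_CBC_CMD(msg=False))
    chosen = [e for e in dist if x[e].value() > 0.5]
    components = list(nx.connected_components(nx.Graph(chosen)))
    if len(components) == 1:
        break                                             # a single tour: done
    for S in components:                                  # violated SECs are trivial to read off
        prob += pulp.lpSum(x[e] for e in dist if e[0] in S and e[1] in S) <= len(S) - 1

print("tour length:", round(pulp.value(prob.objective), 4))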
NASA Astrophysics Data System (ADS)
Long, Kim Chenming
Real-world engineering optimization problems often require the consideration of multiple conflicting and noncommensurate objectives, subject to nonconvex constraint regions in a high-dimensional decision space. Further challenges occur for combinatorial multiobjective problems in which the decision variables are not continuous. Traditional multiobjective optimization methods of operations research, such as weighting and epsilon constraint methods, are ill-suited to solving these complex, multiobjective problems. This has given rise to the application of a wide range of metaheuristic optimization algorithms, such as evolutionary, particle swarm, simulated annealing, and ant colony methods, to multiobjective optimization. Several multiobjective evolutionary algorithms have been developed, including the strength Pareto evolutionary algorithm (SPEA) and the non-dominated sorting genetic algorithm (NSGA), for determining the Pareto-optimal set of non-dominated solutions. Although numerous researchers have developed a wide range of multiobjective optimization algorithms, there is a continuing need to construct computationally efficient algorithms with an improved ability to converge to globally non-dominated solutions along the Pareto-optimal front for complex, large-scale, multiobjective engineering optimization problems. This is particularly important when the multiple objective functions and constraints of the real-world system cannot be expressed in explicit mathematical representations. This research presents a novel metaheuristic evolutionary algorithm for complex multiobjective optimization problems, which combines the metaheuristic tabu search algorithm with the evolutionary algorithm (TSEA), as embodied in genetic algorithms. TSEA is successfully applied to bicriteria (i.e., structural reliability and retrofit cost) optimization of the aircraft tail structure fatigue life, which increases its reliability by prolonging fatigue life. A comparison for this application of the proposed algorithm, TSEA, with several state-of-the-art multiobjective optimization algorithms reveals that TSEA outperforms these algorithms by providing retrofit solutions with greater reliability for the same costs (i.e., closer to the Pareto-optimal front) after the algorithms are executed for the same number of generations. This research also demonstrates that TSEA competes with and, in some situations, outperforms state-of-the-art multiobjective optimization algorithms such as NSGA II and SPEA 2 when applied to classic bicriteria test problems in the technical literature and other complex, sizable real-world applications. The successful implementation of TSEA contributes to the safety of aeronautical structures by providing a systematic way to guide aircraft structural retrofitting efforts, as well as a potentially useful algorithm for a wide range of multiobjective optimization problems in engineering and other fields.
NASA Astrophysics Data System (ADS)
Strom, C. S.; Bennema, P.
1997-03-01
This work (Part II) explores the relation between units and morphology. It shows the equivalence in behaviour between the attachment energies and the results of Monte Carlo growth kinetics simulations. The energetically optimal combination of the F slices in 1 1 0, 0 1 1 and 1 1 1 in a monomolecular interpretation leads to unsatisfactory agreement with experimentally observed morphology. In a tetrameric (or octameric) interpretation, the unit cell must be subdivided self-consistently in terms of stable molecular clusters. Thus, the presence or absence of the 1 1 1 form functions as a direct experimental criterion for distinguishing between monomolecular growth layers, and tetrameric (or octameric) growth layers of the same composition, but subjected to the condition of combinatorial compatibility, as the F slices combine to produce the growth habit. When that condition is taken into account, the tetrameric (or octameric) theoretical morphology in the broken bond model is in good agreement with experiment over a wide range. Subject matter for future study is summarized.
Fuentes, Paulina; Zhou, Fei; Erban, Alexander; Karcher, Daniel; Kopka, Joachim; Bock, Ralph
2016-06-14
Artemisinin-based therapies are the only effective treatment for malaria, the most devastating disease in human history. To meet the growing demand for artemisinin and make it accessible to the poorest, an inexpensive and rapidly scalable production platform is urgently needed. Here we have developed a new synthetic biology approach, combinatorial supertransformation of transplastomic recipient lines (COSTREL), and applied it to introduce the complete pathway for artemisinic acid, the precursor of artemisinin, into the high-biomass crop tobacco. We first introduced the core pathway of artemisinic acid biosynthesis into the chloroplast genome. The transplastomic plants were then combinatorially supertransformed with cassettes for all additional enzymes known to affect flux through the artemisinin pathway. By screening large populations of COSTREL lines, we isolated plants that produce more than 120 milligram artemisinic acid per kilogram biomass. Our work provides an efficient strategy for engineering complex biochemical pathways into plants and optimizing the metabolic output.
NASA Astrophysics Data System (ADS)
Goto, Masahiro; Sasaki, Michiko; Xu, Yibin; Zhan, Tianzhuo; Isoda, Yukihiro; Shinohara, Yoshikazu
2017-06-01
p- and n-type bismuth telluride thin films have been synthesized by using a combinatorial sputter coating system (COSCOS). The crystal structure and crystal preferred orientation of the thin films were changed by controlling the coating condition of the radio frequency (RF) power during the sputter coating. As a result, the p- and n-type films and their dimensionless figure of merit (ZT) were optimized by the technique. The properties of the thin films such as the crystal structure, crystal preferred orientation, material composition and surface morphology were analyzed by X-ray diffraction, energy-dispersive X-ray spectroscopy and atomic force microscopy. Also, the thermoelectric properties of the Seebeck coefficient, electrical conductivity and thermal conductivity were measured. ZT for n- and p-type bismuth telluride thin films was found to be 0.27 and 0.40 at RF powers of 90 and 120 W, respectively. The proposed technology can be used to fabricate thermoelectric p-n modules of bismuth telluride without any doping process.
Fractional Programming for Communication Systems—Part II: Uplink Scheduling via Matching
NASA Astrophysics Data System (ADS)
Shen, Kaiming; Yu, Wei
2018-05-01
This two-part paper develops novel methodologies for using fractional programming (FP) techniques to design and optimize communication systems. Part I of this paper proposes a new quadratic transform for FP and treats its application for continuous optimization problems. In this Part II of the paper, we study discrete problems, such as those involving user scheduling, which are considerably more difficult to solve. Unlike the continuous problems, discrete or mixed discrete-continuous problems normally cannot be recast as convex problems. In contrast to the common heuristic of relaxing the discrete variables, this work reformulates the original problem in an FP form amenable to distributed combinatorial optimization. The paper illustrates this methodology by tackling the important and challenging problem of uplink coordinated multi-cell user scheduling in wireless cellular systems. Uplink scheduling is more challenging than downlink scheduling, because uplink user scheduling decisions significantly affect the interference pattern in nearby cells. Further, the discrete scheduling variable needs to be optimized jointly with continuous variables such as transmit power levels and beamformers. The main idea of the proposed FP approach is to decouple the interaction among the interfering links, thereby permitting a distributed and joint optimization of the discrete and continuous variables with provable convergence. The paper shows that the well-known weighted minimum mean-square-error (WMMSE) algorithm can also be derived from a particular use of FP; but our proposed FP-based method significantly outperforms WMMSE when discrete user scheduling variables are involved, both in terms of run-time efficiency and optimization results.
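For context, the Part I machinery referred to here is, as best stated from the FP literature in our own notation rather than quoted from the paper, the quadratic transform for sum-of-ratio maximization:

\[
\max_x \sum_i \frac{A_i(x)}{B_i(x)}
\;\;\longrightarrow\;\;
\max_{x,\,y} \sum_i \left( 2 y_i \sqrt{A_i(x)} - y_i^2\, B_i(x) \right),
\qquad
y_i^{\star} = \frac{\sqrt{A_i(x)}}{B_i(x)},
\]
where alternating the closed-form update of the auxiliary variables \(y_i\) with the update of \(x\) monotonically improves the objective. The contribution described in this abstract is, in essence, carrying that alternation over to settings where some of the variables, such as the scheduling decisions, are discrete.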
Zhang, Weizhe; Bai, Enci; He, Hui; Cheng, Albert M.K.
2015-01-01
Reducing energy consumption is becoming very important in order to preserve battery life and lower overall operational costs for heterogeneous real-time multiprocessor systems. In this paper, we first formulate this as a combinatorial optimization problem. Then, a successful meta-heuristic, called the Shuffled Frog Leaping Algorithm (SFLA), is proposed to reduce the energy consumption. Precocity remission and local optimum avoidance techniques are proposed to avoid precocity and improve the solution quality. Convergence acceleration significantly reduces the search time. Experimental results show that the SFLA-based energy-aware meta-heuristic uses 30% less energy than the Ant Colony Optimization (ACO) algorithm, and 60% less energy than the Genetic Algorithm (GA). Remarkably, the running time of the SFLA-based meta-heuristic is 20 and 200 times less than that of ACO and GA, respectively, for finding the optimal solution. PMID:26110406
An Analytical Framework for Runtime of a Class of Continuous Evolutionary Algorithms.
Zhang, Yushan; Hu, Guiwu
2015-01-01
Although there have been many studies on the runtime of evolutionary algorithms in discrete optimization, relatively few theoretical results have been proposed for continuous optimization, such as evolutionary programming (EP). This paper proposes an analysis of the runtime of two EP algorithms based on Gaussian and Cauchy mutations, using an absorbing Markov chain. Given a constant variation, we calculate the runtime upper bound of special Gaussian mutation EP and Cauchy mutation EP. Our analysis reveals that the upper bounds are affected by the number of individuals, the problem dimension n, the search range, and the Lebesgue measure of the optimal neighborhood. Furthermore, we provide conditions whereby the average runtime of the considered EP can be no more than a polynomial of n. The condition is that the Lebesgue measure of the optimal neighborhood is larger than a combinatorial calculation of an exponential and the given polynomial of n.
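The absorbing-Markov-chain machinery behind such bounds rests on a textbook identity, stated here generically rather than as the paper's specific bound: writing the transition matrix in canonical form,

\[
P = \begin{pmatrix} Q & R \\ 0 & I \end{pmatrix},
\qquad
t = (I - Q)^{-1} \mathbf{1},
\]
where \(Q\) collects transitions among transient (non-optimal) states, the \(i\)-th entry of \(t\) is the expected number of steps to absorption, i.e. the expected runtime when started from state \(i\); bounding \((I - Q)^{-1}\) is what yields runtime upper bounds of the kind described above.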
DOE Office of Scientific and Technical Information (OSTI.GOV)
Martin, Brian; Samimi, Peyman; Collins, Peter
2017-06-01
A novel method to systematically vary temperature and thus study the resulting microstructure of a material is presented. This new method has the potential to be used in a combinatorial fashion, allowing the rapid study of the effect of thermal holds on microstructures. This is demonstrated on a beta titanium alloy, where the thermal history has a strong effect on microstructure. It is informed by simulation and executed using the resistive heating capabilities of a Gleeble 3800 thermomechanical simulator. Spatially varying isothermal holds of 4 h were effected, where the temperature range of the multiple isothermal holds varied by ~175 °C.
Van den Bulcke, Marc; Lievens, Antoon; Barbau-Piednoir, Elodie; MbongoloMbella, Guillaume; Roosens, Nancy; Sneyers, Myriam; Casi, Amaya Leunda
2010-03-01
The detection of genetically modified (GM) materials in food and feed products is a complex multi-step analytical process invoking screening, identification, and often quantification of the genetically modified organisms (GMO) present in a sample. "Combinatory qPCR SYBRGreen screening" (CoSYPS) is a matrix-based approach for determining the presence of GM plant materials in products. The CoSYPS decision-support system (DSS) interprets the analytical results of SYBRGreen qPCR analysis based on four values: the Ct and Tm values and the LOD and LOQ for each method. A theoretical explanation of the different concepts applied in CoSYPS analysis is given (GMO Universe, "Prime number tracing", matrix/combinatory approach) and documented using the Roundup Ready soy GTS40-3-2 as an example. By applying a limited set of SYBRGreen qPCR methods and through application of a newly developed "prime number"-based algorithm, the nature of subsets of corresponding GMO in a sample can be determined. Together, these analyses provide guidance for semi-quantitative estimation of GMO presence in a food and feed product.
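A minimal sketch of the prime-number idea mentioned above: if each screening element is assigned a distinct prime, a set of positive qPCR results can be encoded as a product of primes, and a GMO signature is consistent with the screen exactly when its product divides that code. The element names and signatures below are hypothetical, and this is only an illustration of the tracing concept, not the CoSYPS decision-support system itself.

```python
from math import prod

# Hypothetical assignment of a distinct prime to each screening element.
PRIMES = {"p35S": 2, "tNOS": 3, "CTP2-CP4EPSPS": 5, "lectin": 7}

# Hypothetical GMO signatures: each GMO is the product of the primes of the
# screening elements its construct contains.
GMO_SIGNATURES = {
    "GTS40-3-2": 2 * 3 * 5,   # p35S + tNOS + CTP2-CP4EPSPS
    "EventX":    2 * 5,       # p35S + CTP2-CP4EPSPS
}

def detected_code(positive_elements):
    """Encode a set of positive screening results as a product of primes."""
    return prod(PRIMES[e] for e in positive_elements)

def consistent_gmos(positive_elements):
    """A GMO is consistent with the screen if its signature divides the detected
    code, i.e., all of its elements were detected."""
    code = detected_code(positive_elements)
    return [g for g, sig in GMO_SIGNATURES.items() if code % sig == 0]

print(consistent_gmos({"p35S", "tNOS", "CTP2-CP4EPSPS", "lectin"}))
```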
Searching Fragment Spaces with feature trees.
Lessel, Uta; Wellenzohn, Bernd; Lilienthal, Markus; Claussen, Holger
2009-02-01
Virtual combinatorial chemistry easily produces billions of compounds, for which conventional virtual screening cannot be performed even with the fastest methods available. An efficient solution for such a scenario is the generation of Fragment Spaces, which encode huge numbers of virtual compounds by their fragments/reagents and rules of how to combine them. Similarity-based searches can be performed in such spaces without ever fully enumerating all virtual products. Here we describe the generation of a huge Fragment Space encoding about 5 × 10^11 compounds based on established in-house synthesis protocols for combinatorial libraries, i.e., we encode practically evaluated combinatorial chemistry protocols in a machine readable form, rendering them accessible to in silico search methods. We show how such searches in this Fragment Space can be integrated as a first step in an overall workflow. It reduces the extremely huge number of virtual products by several orders of magnitude so that the resulting list of molecules becomes more manageable for further, more elaborate and time-consuming analysis steps. Results of a case study are presented and discussed, which lead to some general conclusions for an efficient expansion of the chemical space to be screened in pharmaceutical companies.
Modeling chemical reactions for drug design.
Gasteiger, Johann
2007-01-01
Chemical reactions are involved at many stages of the drug design process. This starts with the analysis of biochemical pathways that are controlled by enzymes that might be downregulated in certain diseases. In the lead discovery and lead optimization process compounds have to be synthesized in order to test them for their biological activity. And finally, the metabolism of a drug has to be established. A better understanding of chemical reactions could strongly help in making the drug design process more efficient. We have developed methods for quantifying the concepts an organic chemist is using in rationalizing reaction mechanisms. These methods allow a comprehensive modeling of chemical reactivity and thus are applicable to a wide variety of chemical reactions, from gas phase reactions to biochemical pathways. They are empirical in nature and therefore allow the rapid processing of large sets of structures and reactions. We will show here how methods have been developed for the prediction of acidity values and of the regioselectivity in organic reactions, for designing the synthesis of organic molecules and of combinatorial libraries, and for furthering our understanding of enzyme-catalyzed reactions and of the metabolism of drugs.
Dynamic combinatorial libraries: new opportunities in systems chemistry.
Hunt, Rosemary A R; Otto, Sijbren
2011-01-21
Combinatorial chemistry is a tool for selecting molecules with special properties. Dynamic combinatorial chemistry started off aiming to be just that. However, unlike ordinary combinatorial chemistry, the interconnectedness of dynamic libraries gives them an extra dimension. An understanding of these molecular networks at systems level is essential for their use as a selection tool and creates exciting new opportunities in systems chemistry. In this feature article we discuss selected examples and considerations related to the advanced exploitation of dynamic combinatorial libraries for their originally conceived purpose of identifying strong binding interactions. Also reviewed are examples illustrating a trend towards increasing complexity in terms of network behaviour and reversible chemistry. Finally, new applications of dynamic combinatorial chemistry in self-assembly, transport and self-replication are discussed.
NASA Astrophysics Data System (ADS)
Tajuddin, Wan Ahmad
1994-02-01
Ease in finding the configuration at the global energy minimum in a symmetric neural network is important for combinatorial optimization problems. We carry out a comprehensive survey of available strategies for seeking global minima by comparing their performances in the binary representation problem. We recall our previous comparison of steepest descent with analog dynamics, genetic hill-climbing, simulated diffusion, simulated annealing, threshold accepting and simulated tunneling. To this, we add comparisons to other strategies including taboo search and one with field-ordered updating.
Caracciolo, Sergio; Sicuro, Gabriele
2014-10-01
We discuss the equivalence relation between the Euclidean bipartite matching problem on the line and on the circumference and the Brownian bridge process on the same domains. The equivalence allows us to compute the correlation function and the optimal cost of the original combinatorial problem in the thermodynamic limit; moreover, we solve also the minimax problem on the line and on the circumference. The properties of the average cost and correlation functions are discussed.
An Evaluation of a Modified Simulated Annealing Algorithm for Various Formulations
1990-08-01
The formulation requires that a(L_k, c_k), the probability distribution of outcomes after L_k trials of the k-th Markov chain, be sufficiently close to q(c_k), the stationary distribution at control parameter c_k, i.e., |a(L_k, c_k) - q(c_k)| < epsilon.
Chen, Xin; Wu, Qiong; Sun, Ruimin; Zhang, Louxin
2012-01-01
The discovery of single-nucleotide polymorphisms (SNPs) has important implications in a variety of genetic studies on human diseases and biological functions. One valuable approach proposed for SNP discovery is based on base-specific cleavage and mass spectrometry. However, it is still very challenging to achieve the full potential of this SNP discovery approach. In this study, we formulate two new combinatorial optimization problems. While both problems are aimed at reconstructing the sample sequence that would attain the minimum number of SNPs, they search over different candidate sequence spaces. The first problem, denoted as SNP-MSP, limits its search to sequences whose in silico predicted mass spectra have all their signals contained in the measured mass spectra. In contrast, the second problem, denoted as SNP-MSQ, limits its search to sequences whose in silico predicted mass spectra instead contain all the signals of the measured mass spectra. We present an exact dynamic programming algorithm for solving the SNP-MSP problem and also show that the SNP-MSQ problem is NP-hard by a reduction from a restricted variation of the 3-partition problem. We believe that an efficient solution to either problem above could offer a seamless integration of information in four complementary base-specific cleavage reactions, thereby improving the capability of the underlying biotechnology for sensitive and accurate SNP discovery.
NASA Astrophysics Data System (ADS)
Umam, M. I. H.; Santosa, B.
2018-04-01
Combinatorial optimization is frequently used to solve problems in science, engineering, and commercial applications. One combinatorial problem in the field of transportation is to find the shortest travel route from an initial point of departure to a point of destination while minimizing travel cost and travel time. When the distance from one (initial) node to another (destination) node is the same as the distance to travel back from the destination to the initial node, this problem is known as the Traveling Salesman Problem (TSP); otherwise it is called an Asymmetric Traveling Salesman Problem (ATSP). One of the most recent optimization techniques is Symbiotic Organisms Search (SOS). This paper discusses how to hybridize the SOS algorithm with variable neighborhood search (SOS-VNS) so that it can be applied to solve the ATSP. The proposed mechanism adds variable neighborhood search as a local search to generate a better initial solution and modifies the parasitism phase by adapting a mutation mechanism. After modification, the performance of the SOS-VNS algorithm is evaluated on several data sets and the results are compared with the best known solutions and with other algorithms such as PSO and the original SOS. The SOS-VNS algorithm shows better results in terms of convergence, divergence, and computing time.
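The sketch below is not the authors' SOS-VNS; it only illustrates the variable-neighborhood-descent component on a small random asymmetric distance matrix, using swap and insertion neighborhoods. Instance size and costs are arbitrary.

```python
import random

def tour_cost(tour, d):
    return sum(d[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def swap_neighbors(tour):
    # exchange the positions of two cities
    n = len(tour)
    for i in range(n - 1):
        for j in range(i + 1, n):
            t = tour[:]
            t[i], t[j] = t[j], t[i]
            yield t

def insertion_neighbors(tour):
    # remove one city and reinsert it elsewhere (an or-opt style move)
    n = len(tour)
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            t = tour[:]
            city = t.pop(i)
            t.insert(j, city)
            yield t

def vnd(tour, d, neighborhoods):
    """Variable-neighborhood descent: restart from the first neighborhood
    whenever an improving move is found; stop when none improves."""
    best, best_cost = tour, tour_cost(tour, d)
    k = 0
    while k < len(neighborhoods):
        improved = False
        for cand in neighborhoods[k](best):
            c = tour_cost(cand, d)
            if c < best_cost:
                best, best_cost, improved = cand, c, True
                break
        k = 0 if improved else k + 1
    return best, best_cost

random.seed(0)
n = 12
d = [[0 if i == j else random.randint(1, 50) for j in range(n)] for i in range(n)]  # asymmetric
tour = list(range(n))
random.shuffle(tour)
print(vnd(tour, d, [swap_neighbors, insertion_neighbors])[1])
```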
Rabotyagov, Sergey; Campbell, Todd; Valcu, Adriana; Gassman, Philip; Jha, Manoj; Schilling, Keith; Wolter, Calvin; Kling, Catherine
2012-12-09
Finding the cost-efficient (i.e., lowest-cost) ways of targeting conservation practice investments for the achievement of specific water quality goals across the landscape is of primary importance in watershed management. Traditional economics methods of finding the lowest-cost solution in the watershed context (e.g.,(5,12,20)) assume that off-site impacts can be accurately described as a proportion of on-site pollution generated. Such approaches are unlikely to be representative of the actual pollution process in a watershed, where the impacts of polluting sources are often determined by complex biophysical processes. The use of modern physically-based, spatially distributed hydrologic simulation models allows for a greater degree of realism in terms of process representation but requires a development of a simulation-optimization framework where the model becomes an integral part of optimization. Evolutionary algorithms appear to be a particularly useful optimization tool, able to deal with the combinatorial nature of a watershed simulation-optimization problem and allowing the use of the full water quality model. Evolutionary algorithms treat a particular spatial allocation of conservation practices in a watershed as a candidate solution and utilize sets (populations) of candidate solutions iteratively applying stochastic operators of selection, recombination, and mutation to find improvements with respect to the optimization objectives. The optimization objectives in this case are to minimize nonpoint-source pollution in the watershed, simultaneously minimizing the cost of conservation practices. A recent and expanding set of research is attempting to use similar methods and integrates water quality models with broadly defined evolutionary optimization methods(3,4,9,10,13-15,17-19,22,23,25). In this application, we demonstrate a program which follows Rabotyagov et al.'s approach and integrates a modern and commonly used SWAT water quality model(7) with a multiobjective evolutionary algorithm SPEA2(26), and user-specified set of conservation practices and their costs to search for the complete tradeoff frontiers between costs of conservation practices and user-specified water quality objectives. The frontiers quantify the tradeoffs faced by the watershed managers by presenting the full range of costs associated with various water quality improvement goals. The program allows for a selection of watershed configurations achieving specified water quality improvement goals and a production of maps of optimized placement of conservation practices.
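As a crude, self-contained illustration of the cost/water-quality tradeoff frontier described above, the sketch below replaces the SWAT model with an invented surrogate and SPEA2 with simple nondominated filtering of randomly sampled practice allocations; all field costs and load reductions are placeholder values.

```python
import random

def evaluate(allocation, cost_per_field, load_reduction):
    """Invented surrogate for the watershed model: total practice cost and the
    pollutant load remaining after field-level reductions (baseline load 1.0 per field)."""
    cost = sum(c for a, c in zip(allocation, cost_per_field) if a)
    load = sum(1.0 - (r if a else 0.0) for a, r in zip(allocation, load_reduction))
    return cost, load

def nondominated(points):
    """Pareto frontier for two objectives that are both minimized."""
    front = []
    for i, p in enumerate(points):
        dominated = any(q[0] <= p[0] and q[1] <= p[1] and q != p
                        for j, q in enumerate(points) if j != i)
        if not dominated:
            front.append(p)
    return sorted(set(front))

random.seed(3)
n_fields = 12
cost_per_field = [random.uniform(1.0, 5.0) for _ in range(n_fields)]
load_reduction = [random.uniform(0.1, 0.6) for _ in range(n_fields)]
population = [[random.random() < 0.5 for _ in range(n_fields)] for _ in range(200)]
frontier = nondominated([evaluate(a, cost_per_field, load_reduction) for a in population])
for cost, load in frontier[:5]:
    print(f"cost {cost:6.2f}  remaining load {load:6.2f}")
```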
Number Partitioning via Quantum Adiabatic Computation
NASA Technical Reports Server (NTRS)
Smelyanskiy, Vadim N.; Toussaint, Udo
2002-01-01
We study both analytically and numerically the complexity of the adiabatic quantum evolution algorithm applied to random instances of combinatorial optimization problems. We use as an example the NP-complete set partition problem and obtain an asymptotic expression for the minimal gap separating the ground and excited states of a system during the execution of the algorithm. We show that for computationally hard problem instances the size of the minimal gap scales exponentially with the problem size. This result is in qualitative agreement with the direct numerical simulation of the algorithm for small instances of the set partition problem. We describe the statistical properties of the optimization problem that are responsible for the exponential behavior of the algorithm.
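A small numerical sketch of the quantity analyzed above: the minimum gap of the interpolating Hamiltonian H(s) = (1-s)H_B + sH_P for a toy set-partition instance, computed by exact diagonalization. The transverse-field driver, the squared-residual cost, and the pinning of one spin (to remove the trivial global spin-flip degeneracy) are standard but assumed conventions, not taken from the paper.

```python
import numpy as np

def sigma_x(i, n):
    """Pauli-x acting on qubit i of an n-qubit register."""
    sx = np.array([[0.0, 1.0], [1.0, 0.0]])
    op = np.array([[1.0]])
    for k in range(n):
        op = np.kron(op, sx if k == i else np.eye(2))
    return op

def adiabatic_min_gap(numbers, steps=101):
    """Minimum spectral gap of H(s) = (1-s) H_B + s H_P, where H_P encodes the
    set-partition residual (a_0 + sum_i a_i s_i)^2 with the first spin fixed to +1."""
    a0, rest = float(numbers[0]), np.array(numbers[1:], dtype=float)
    n = len(rest)
    dim = 2 ** n
    spins = 1 - 2 * ((np.arange(dim)[:, None] >> np.arange(n)) & 1)   # rows of +/-1
    h_p = np.diag((a0 + spins @ rest) ** 2)                            # diagonal cost
    h_b = -sum(sigma_x(i, n) for i in range(n))                        # transverse-field driver
    gaps = []
    for s in np.linspace(0.0, 1.0, steps):
        evals = np.linalg.eigvalsh((1 - s) * h_b + s * h_p)
        gaps.append(evals[1] - evals[0])
    return min(gaps)

print(adiabatic_min_gap([3, 5, 8, 13, 6]))   # toy instance with no perfect partition
```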
Combinatorial algorithms for design of DNA arrays.
Hannenhalli, Sridhar; Hubell, Earl; Lipshutz, Robert; Pevzner, Pavel A
2002-01-01
Optimal design of DNA arrays requires the development of algorithms with two-fold goals: reducing the effects caused by unintended illumination (border length minimization problem) and reducing the complexity of masks (mask decomposition problem). We describe algorithms that reduce the number of rectangles in mask decomposition by 20-30% as compared to a standard array design under the assumption that the arrangement of oligonucleotides on the array is fixed. This algorithm produces provably optimal solutions for all studied real instances of array design. We also address the difficult problem of finding an arrangement which minimizes the border length and come up with a new idea of threading that significantly reduces the border length as compared to standard designs.
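To make the border-length objective concrete, the sketch below computes, for a toy grid of random 25-mer probes, the total Hamming distance between adjacent probes (the border length under synchronous synthesis) for an arbitrary placement versus a sorted, snake-threaded placement. This only shows what is being minimized; it is not the authors' threading algorithm, and naive sorting gives only a modest reduction.

```python
import random
from itertools import product

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def border_length(grid):
    """Total border length under synchronous synthesis: the sum of Hamming
    distances over horizontally and vertically adjacent probes."""
    n = len(grid)
    total = 0
    for r, c in product(range(n), repeat=2):
        if c + 1 < n:
            total += hamming(grid[r][c], grid[r][c + 1])
        if r + 1 < n:
            total += hamming(grid[r][c], grid[r + 1][c])
    return total

def thread(items, n):
    """Thread a 1-D ordering onto an n x n grid in boustrophedon (snake) order."""
    rows = [items[i * n:(i + 1) * n] for i in range(n)]
    return [row if i % 2 == 0 else row[::-1] for i, row in enumerate(rows)]

random.seed(1)
n = 8
probes = ["".join(random.choice("ACGT") for _ in range(25)) for _ in range(n * n)]
print("arbitrary placement:", border_length(thread(probes, n)))
print("sorted + threaded:  ", border_length(thread(sorted(probes), n)))
```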
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heredia-Langner, Alejandro; Amidan, Brett G.; Matzner, Shari
We present results from the optimization of a re-identification process using two sets of biometric data obtained from the Civilian American and European Surface Anthropometry Resource Project (CAESAR) database. The datasets contain real measurements of features for 2378 individuals in a standing (43 features) and seated (16 features) position. A genetic algorithm (GA) was used to search a large combinatorial space where different features are available between the probe (seated) and gallery (standing) datasets. Results show that optimized model predictions obtained using less than half of the 43 gallery features and data from roughly 16% of the individuals available produce better re-identification rates than two other approaches that use all the information available.
Lee, M L; Schneider, G
2001-01-01
Natural products were analyzed to determine whether they contain appealing novel scaffold architectures for potential use in combinatorial chemistry. Ring systems were extracted and clustered on the basis of structural similarity. Several such potential scaffolds for combinatorial chemistry were identified that are not present in current trade drugs. For one of these scaffolds a virtual combinatorial library was generated. Pharmacophoric properties of natural products, trade drugs, and the virtual combinatorial library were assessed using a self-organizing map. Obviously, current trade drugs and natural products have several topological pharmacophore patterns in common. These features can be systematically explored with selected combinatorial libraries based on a combination of natural product-derived and synthetic molecular building blocks.
Integrating Multiple Data Sources for Combinatorial Marker Discovery: A Study in Tumorigenesis.
Bandyopadhyay, Sanghamitra; Mallik, Saurav
2018-01-01
Identification of combinatorial markers from multiple data sources is a challenging task in bioinformatics. Here, we propose a novel computational framework for identifying significant combinatorial markers using both gene expression and methylation data. The gene expression and methylation data are integrated into a single continuous data set as well as a (post-discretized) boolean data set based on their intrinsic (i.e., inverse) relationship. A novel combined score of methylation and expression data is introduced which is computed on the integrated continuous data for identifying an initial non-redundant set of genes. Thereafter, (maximal) frequent closed homogeneous genesets are identified using a well-known biclustering algorithm applied on the integrated boolean data of the determined non-redundant set of genes. A novel sample-based weighted support is then proposed that is consecutively calculated on the integrated boolean data of the determined non-redundant set of genes in order to identify the non-redundant significant genesets. The top few resulting genesets are identified as potential combinatorial markers. Since our proposed method generates a smaller number of significant non-redundant genesets than other popular methods, it is much faster than the others. Application of the proposed technique on an expression and a methylation data set for uterine tumor or prostate carcinoma produces a set of significant combinations of markers. We expect that such combinations of markers will produce fewer false positives than individual markers.
Use of combinatorial chemistry to speed drug discovery.
Rádl, S
1998-10-01
IBC's International Conference on Integrating Combinatorial Chemistry into the Discovery Pipeline was held September 14-15, 1998. The program started with a pre-conference workshop on High-Throughput Compound Characterization and Purification. The agenda of the main conference was divided into sessions of Synthesis, Automation and Unique Chemistries; Integrating Combinatorial Chemistry, Medicinal Chemistry and Screening; Combinatorial Chemistry Applications for Drug Discovery; and Information and Data Management. This meeting was an excellent opportunity to see how big pharma, biotech and service companies are addressing the current bottlenecks in combinatorial chemistry to speed drug discovery. (c) 1998 Prous Science. All rights reserved.
Royston, Kendra J.; Udayakumar, Neha; Lewis, Kayla; Tollefsbol, Trygve O.
2017-01-01
With cancer often classified as a disease that has an important epigenetic component, natural compounds that have the ability to regulate the epigenome become ideal candidates for study. Humans have a complex diet, which illustrates the need to elucidate the mechanisms of interaction between these bioactive compounds in combination. The natural compounds withaferin A (WA), from the Indian winter cherry, and sulforaphane (SFN), from cruciferous vegetables, have numerous anti-cancer effects and some report their ability to regulate epigenetic processes. Our study is the first to investigate the combinatorial effects of low physiologically achievable concentrations of WA and SFN on breast cancer cell proliferation, histone deacetylase1 (HDAC1) and DNA methyltransferases (DNMTs). No adverse effects were observed on control cells at optimal concentrations. There was synergistic inhibition of cellular viability in MCF-7 cells and a greater induction of apoptosis with the combinatorial approach than with either compound administered alone in both MDA-MB-231 and MCF-7 cells. HDAC expression was down-regulated at multiple levels. Lastly, we determined the combined effects of these bioactive compounds on the pro-apoptotic BAX and anti-apoptotic BCL-2 and found decreases in BCL-2 and increases in BAX. Taken together, our findings demonstrate the ability of low concentrations of combinatorial WA and SFN to promote cancer cell death and regulate key epigenetic modifiers in human breast cancer cells. PMID:28534825
NATURAL PRODUCTS: A CONTINUING SOURCE OF NOVEL DRUG LEADS
Cragg, Gordon M.; Newman, David J.
2013-01-01
1. Background: Nature has been a source of medicinal products for millennia, with many useful drugs developed from plant sources. Following the discovery of the penicillins, drug discovery from microbial sources occurred, and diving techniques in the 1970s opened the seas. Combinatorial chemistry (late 1980s) shifted the focus of drug discovery efforts from Nature to the laboratory bench. 2. Scope of Review: This review traces natural products drug discovery, outlining important drugs from natural sources that revolutionized treatment of serious diseases. It is clear Nature will continue to be a major source of new structural leads, and effective drug development depends on multidisciplinary collaborations. 3. Major Conclusions: The explosion of genetic information led not only to novel screens, but the genetic techniques permitted the implementation of combinatorial biosynthetic technology and genome mining. The knowledge gained has allowed unknown molecules to be identified. These novel bioactive structures can be optimized by using combinatorial chemistry, generating new drug candidates for many diseases. 4. General Significance: The advent of genetic techniques that permitted the isolation/expression of biosynthetic cassettes from microbes may well be the new frontier for natural products lead discovery. It is now apparent that biodiversity may be much greater in those organisms. The numbers of potential species involved in the microbial world are many orders of magnitude greater than those of plants and multi-celled animals. Coupling these numbers to the number of currently unexpressed biosynthetic clusters now identified (>10 per species), the potential of microbial diversity remains essentially untapped. PMID:23428572
Uncertainty management by relaxation of conflicting constraints in production process scheduling
NASA Technical Reports Server (NTRS)
Dorn, Juergen; Slany, Wolfgang; Stary, Christian
1992-01-01
Mathematical-analytical methods as used in Operations Research approaches are often insufficient for scheduling problems. This is due to three reasons: the combinatorial complexity of the search space, conflicting objectives for production optimization, and the uncertainty in the production process. Knowledge-based techniques, especially approximate reasoning and constraint relaxation, are promising ways to overcome these problems. A case study from an industrial CIM environment, namely high-grade steel production, is presented to demonstrate how knowledge-based scheduling with the desired capabilities could work. By using fuzzy set theory, the applied knowledge representation technique covers the uncertainty inherent in the problem domain. Based on this knowledge representation, a classification of jobs according to their importance is defined which is then used for the straightforward generation of a schedule. A control strategy which comprises organizational, spatial, temporal, and chemical constraints is introduced. The strategy supports the dynamic relaxation of conflicting constraints in order to improve tentative schedules.
Computational approaches for rational design of proteins with novel functionalities
Tiwari, Manish Kumar; Singh, Ranjitha; Singh, Raushan Kumar; Kim, In-Won; Lee, Jung-Kul
2012-01-01
Proteins are the most multifaceted macromolecules in living systems and have various important functions, including structural, catalytic, sensory, and regulatory functions. Rational design of enzymes is a great challenge to our understanding of protein structure and physical chemistry and has numerous potential applications. Protein design algorithms have been applied to design or engineer proteins that fold, fold faster, catalyze, catalyze faster, signal, and adopt preferred conformational states. The field of de novo protein design, although only a few decades old, is beginning to produce exciting results. Developments in this field are already having a significant impact on biotechnology and chemical biology. The application of powerful computational methods for functional protein designing has recently succeeded at engineering target activities. Here, we review recently reported de novo functional proteins that were developed using various protein design approaches, including rational design, computational optimization, and selection from combinatorial libraries, highlighting recent advances and successes. PMID:24688643
NASA Astrophysics Data System (ADS)
Tang, Nicholas C.; Chilkoti, Ashutosh
2016-04-01
Most genes are synthesized using seamless assembly methods that rely on the polymerase chain reaction (PCR). However, PCR of genes encoding repetitive proteins either fails or generates nonspecific products. Motivated by the need to efficiently generate new protein polymers through high-throughput gene synthesis, here we report a codon-scrambling algorithm that enables the PCR-based gene synthesis of repetitive proteins by exploiting the codon redundancy of amino acids and finding the least-repetitive synonymous gene sequence. We also show that the codon-scrambling problem is analogous to the well-known travelling salesman problem, and obtain an exact solution to it by using De Bruijn graphs and a modern mixed integer linear programme solver. As experimental proof of the utility of this approach, we use it to optimize the synthetic genes for 19 repetitive proteins, and show that the gene fragments are amenable to PCR-based gene assembly and recombinant expression.
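The paper solves codon scrambling exactly via De Bruijn graphs and a mixed integer linear programme; the sketch below is only a greedy stand-in that, for a toy elastin-like repeat, picks at each residue the synonymous codon introducing the most previously unseen k-mers. The codon subset, the choice of k, and the scoring rule are assumptions for illustration.

```python
import random

# Partial synonymous-codon table (standard code); only the residues of the toy
# repeat below are included.
CODONS = {
    "V": ["GTT", "GTC", "GTA", "GTG"],
    "P": ["CCT", "CCC", "CCA", "CCG"],
    "G": ["GGT", "GGC", "GGA", "GGG"],
}

def scramble(protein, k=9, seed=0):
    """Greedy heuristic: for each residue, pick the synonymous codon whose
    addition creates the most k-mers not seen earlier in the growing gene."""
    rng = random.Random(seed)
    dna, seen = "", set()
    for aa in protein:
        choices = CODONS[aa][:]
        rng.shuffle(choices)                       # random tie-breaking
        best_codon, best_new = choices[0], -1
        for codon in choices:
            cand = dna + codon
            start = max(0, len(cand) - k - 2)      # only windows touching the new codon
            kmers = {cand[i:i + k] for i in range(start, len(cand) - k + 1)}
            new = len(kmers - seen)
            if new > best_new:
                best_codon, best_new = codon, new
        dna += best_codon
        seen |= {dna[i:i + k] for i in range(len(dna) - k + 1)}
    return dna

gene = scramble("VPGVG" * 10)                      # toy elastin-like repetitive protein
kmers = [gene[i:i + 9] for i in range(len(gene) - 8)]
print(len(gene), "nt;", len(set(kmers)), "unique of", len(kmers), "9-mers")
```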
Wang, Xun; Sun, Beibei; Liu, Boyang; Fu, Yaping; Zheng, Pan
2017-01-01
Experimental design focuses on describing or explaining the multifactorial interactions that are hypothesized to reflect the variation. The design introduces conditions that may directly affect the variation, where particular conditions are purposely selected for observation. Combinatorial design theory deals with the existence, construction and properties of systems of finite sets whose arrangements satisfy generalized concepts of balance and/or symmetry. In this work, borrowing the concept of "balance" from combinatorial design theory, a novel method for designing multifactorial bio-chemical experiments is proposed, where balanced templates from combinatorial design are used to select the conditions for observation. Balanced experimental data that cover all the influencing factors of the experiments can be obtained for further processing, for example as a training set for machine learning models. Finally, software based on the proposed method is developed for designing experiments in which each influencing factor is covered a specified number of times.
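A minimal illustration of the "balance" notion borrowed from combinatorial design theory: in the classical (7, 3, 1) design used as a template below, every pair of the seven factors is observed together in exactly one block, so pairwise coverage is perfectly even. The factor indices are abstract placeholders, not the factors of any particular experiment.

```python
from collections import Counter
from itertools import combinations

# A classical balanced template: the (7, 3, 1) design (Fano plane), in which
# every pair of the 7 factors co-occurs in exactly one block of 3 conditions.
blocks = [
    (0, 1, 2), (0, 3, 4), (0, 5, 6),
    (1, 3, 5), (1, 4, 6), (2, 3, 6), (2, 4, 5),
]

def pair_balance(blocks):
    """Count how often each unordered pair of factors appears together in a block."""
    counts = Counter()
    for b in blocks:
        counts.update(combinations(sorted(b), 2))
    return counts

counts = pair_balance(blocks)
print(len(counts), "pairs covered; co-occurrence counts:", set(counts.values()))
```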
Hughes, I
1998-09-24
The direct analysis of selected components from combinatorial libraries by sensitive methods such as mass spectrometry is potentially more efficient than deconvolution and tagging strategies since additional steps of resynthesis or introduction of molecular tags are avoided. A substituent selection procedure is described that eliminates the mass degeneracy commonly observed in libraries prepared by "split-and-mix" methods, without recourse to high-resolution mass measurements. A set of simple rules guides the choice of substituents such that all components of the library have unique nominal masses. Additional rules extend the scope by ensuring that characteristic isotopic mass patterns distinguish isobaric components. The method is applicable to libraries having from two to four varying substituent groups and can encode from a few hundred to several thousand components. No restrictions are imposed on the manner in which the "self-coded" library is synthesized or screened.
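A small check of the "self-coding" condition described above, on hypothetical substituent nominal masses for a two-position library: the library is mass-degenerate when different substituent combinations sum to the same nominal mass, and self-coded when every combination is unique. The isotope-pattern rules of the paper are not modeled here.

```python
from collections import Counter
from itertools import product

def library_masses(core_mass, substituent_sets):
    """Nominal mass of every virtual library member: the core plus one
    substituent chosen at each variable position."""
    return [core_mass + sum(combo) for combo in product(*substituent_sets)]

def is_self_coded(core_mass, substituent_sets):
    """True if every library member has a unique nominal mass."""
    return all(c == 1 for c in Counter(library_masses(core_mass, substituent_sets)).values())

# Hypothetical substituent nominal masses for a two-position library.
r1 = [15, 29, 43, 57]              # arithmetic progressions at both positions:
r2 = [31, 45, 59, 73]              # many sums coincide, so the library is degenerate
print(is_self_coded(100, [r1, r2]))

r1 = [15, 29, 57, 113]             # mass differences at position 1 ...
r2 = [31, 36, 46, 71]              # ... never match those at position 2, so all sums differ
print(is_self_coded(100, [r1, r2]))
```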
Combinatorial alloying improves bismuth vanadate photoanodes via reduced monoclinic distortion
Newhouse, P. F.; Guevarra, D.; Umehara, M.; ...
2018-01-01
Energy technologies are enabled by materials innovations, requiring efficient methods to search high dimensional parameter spaces, such as multi-element alloying for enhancing solar fuels photoanodes.
Lai, Yiu Wai; Krause, Michael; Savan, Alan; Thienhaus, Sigurd; Koukourakis, Nektarios; Hofmann, Martin R; Ludwig, Alfred
2011-10-01
A high-throughput characterization technique based on digital holography for mapping film thickness in thin-film materials libraries was developed. Digital holographic microscopy is used for fully automatic measurements of the thickness of patterned films with nanometer resolution. The method has several significant advantages over conventional stylus profilometry: it is contactless and fast, substrate bending is compensated, and the experimental setup is simple. Patterned films prepared by different combinatorial thin-film approaches were characterized to investigate and demonstrate this method. The results show that this technique is valuable for the quick, reliable and high-throughput determination of the film thickness distribution in combinatorial materials research. Importantly, it can also be applied to thin films that have been structured by shadow masking.
Device for preparing combinatorial libraries in powder metallurgy.
Yang, Shoufeng; Evans, Julian R G
2004-01-01
This paper describes a powder-metering, -mixing, and -dispensing mechanism that can be used as a method for producing large numbers of samples for metallurgical evaluation or electrical or mechanical testing from multicomponent metal and cermet powder systems. It is designed to make use of the same commercial powders that are used in powder metallurgy and, therefore, to produce samples that are faithful to the microstructure of finished products. The particle assemblies produced by the device could be consolidated by die pressing, isostatic pressing, laser sintering, or direct melting. The powder metering valve provides both on/off and flow rate control of dry powders in open capillaries using acoustic vibration. The valve is simple and involves no relative movement, avoiding seizure with fine powders. An orchestra of such valves can be arranged on a building platform to prepare multicomponent combinatorial libraries. As with many combinatorial devices, identification and evaluation of sources of mixing error as a function of sample size is mandatory. Such an analysis is presented.
Loeffler, Felix F; Foertsch, Tobias C; Popov, Roman; Mattes, Daniela S; Schlageter, Martin; Sedlmayr, Martyna; Ridder, Barbara; Dang, Florian-Xuan; von Bojničić-Kninski, Clemens; Weber, Laura K; Fischer, Andrea; Greifenstein, Juliane; Bykovskaya, Valentina; Buliev, Ivan; Bischoff, F Ralf; Hahn, Lothar; Meier, Michael A R; Bräse, Stefan; Powell, Annie K; Balaban, Teodor Silviu; Breitling, Frank; Nesterov-Mueller, Alexander
2016-06-14
Laser writing is used to structure surfaces in many different ways in materials and life sciences. However, combinatorial patterning applications are still limited. Here we present a method for cost-efficient combinatorial synthesis of very-high-density peptide arrays with natural and synthetic monomers. A laser automatically transfers nanometre-thin solid material spots from different donor slides to an acceptor. Each donor bears a thin polymer film, embedding one type of monomer. Coupling occurs in a separate heating step, where the matrix becomes viscous and building blocks diffuse and couple to the acceptor surface. Furthermore, we can consecutively deposit two material layers of activation reagents and amino acids. Subsequent heat-induced mixing facilitates an in situ activation and coupling of the monomers. This allows us to incorporate building blocks with click chemistry compatibility or a large variety of commercially available non-activated, for example, posttranslationally modified building blocks into the array's peptides with >17,000 spots per cm^2.
MIFT: GIFT Combinatorial Geometry Input to VCS Code
1977-03-01
BRL Report No. 1967: MIFT: GIFT Combinatorial Geometry Input to VCS Code (final report). A Vehicle Code System (VCS) code called MORSE was modified to accept the GIFT combinatorial geometry package.
On k-ary n-cubes: Theory and applications
NASA Technical Reports Server (NTRS)
Mao, Weizhen; Nicol, David M.
1994-01-01
Many parallel processing networks can be viewed as graphs called k-ary n-cubes, whose special cases include rings, hypercubes and toruses. In this paper, combinatorial properties of k-ary n-cubes are explored. In particular, the problem of characterizing the subgraph of a given number of nodes with the maximum edge count is studied. These theoretical results are then used to compute a lower bounding function in branch-and-bound partitioning algorithms and to establish the optimality of some irregular partitions.
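A brute-force sketch of the combinatorial quantity studied above: the number of edges induced by a given node subset of a k-ary n-cube, i.e., the edge count whose maximum over subsets of a fixed size underlies the lower bounding function. The construction is naive and the example sizes are tiny.

```python
from itertools import product

def kary_ncube(k, n):
    """Nodes and edges of the k-ary n-cube: nodes are n-digit base-k vectors,
    adjacent when they differ by +/-1 (mod k) in exactly one digit."""
    nodes = list(product(range(k), repeat=n))
    edges = set()
    for v in nodes:
        for d in range(n):
            for step in (1, -1):
                u = list(v)
                u[d] = (u[d] + step) % k
                if tuple(u) != v:
                    edges.add(frozenset((v, tuple(u))))
    return nodes, edges

def induced_edge_count(subset, edges):
    """Edges with both endpoints in the subset -- the count whose maximum over
    subsets of a given size is the quantity characterized in the paper."""
    s = set(subset)
    return sum(1 for e in edges if e <= s)

nodes, edges = kary_ncube(k=3, n=2)            # the 3-ary 2-cube is a 3 x 3 torus
block = [(0, 0), (0, 1), (1, 0), (1, 1)]       # a 2 x 2 block of that torus
print(len(nodes), "nodes,", len(edges), "edges; induced by block:", induced_edge_count(block, edges))
```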
Probabilistic Analysis of Combinatorial Optimization Problems on Hypergraph Matchings
2012-02-01
... per dimension (recall that d is equal to the number of independent subsets of vertices V_k in the hypergraph, and n denotes the number of ...) ... disjoint solutions whose costs are iid random variables. First, recalling the interpretation of feasible MAP solutions as paths in the index graph G, ... a (feasible) path in G can be described as a set of n vectors ...
A stochastic discrete optimization model for designing container terminal facilities
NASA Astrophysics Data System (ADS)
Zukhruf, Febri; Frazila, Russ Bona; Burhani, Jzolanda Tsavalista
2017-11-01
As uncertainty essentially affects the total transportation cost, it remains an important consideration in a container terminal that incorporates several modes and transshipment processes. This paper therefore presents a stochastic discrete optimization model for designing the container terminal, which involves decisions on facility improvement actions. The container terminal operation model is constructed by accounting for variation in demand and facility performance. In addition, to illustrate the conflicting issues that arise in practice in terminal operation, the model also takes into account the possible additional delay of facilities due to the increasing number of equipment, especially container trucks. These variations reflect the uncertainty in container terminal operation. A Monte Carlo simulation is invoked to propagate the variations by following the observed distributions. The problem is constructed within the framework of combinatorial optimization to investigate the optimal facility improvement decisions. A new variant of glow-worm swarm optimization (GSO), which is rarely explored in the transportation field, is thus proposed for solving the optimization problem. The model's applicability is tested by considering the actual characteristics of the container terminal.
Dynamics of Quantum Adiabatic Evolution Algorithm for Number Partitioning
NASA Technical Reports Server (NTRS)
Smelyanskiy, V. N.; Toussaint, U. V.; Timucin, D. A.
2002-01-01
We have developed a general technique to study the dynamics of the quantum adiabatic evolution algorithm applied to random combinatorial optimization problems in the asymptotic limit of large problem size n. We use as an example the NP-complete Number Partitioning problem and map the algorithm dynamics to that of an auxiliary quantum spin glass system with a slowly varying Hamiltonian. We use a Green function method to obtain the adiabatic eigenstates and the minimum excitation gap, g_min = O(n 2^(-n/2)), corresponding to the exponential complexity of the algorithm for Number Partitioning. The key element of the analysis is the conditional energy distribution computed for the set of all spin configurations generated from a given (ancestor) configuration by simultaneous flipping of a fixed number of spins. For the problem in question this distribution is shown to depend on the ancestor spin configuration only via a certain parameter related to the energy of the configuration. As a result, the algorithm dynamics can be described in terms of one-dimensional quantum diffusion in the energy space. This effect provides a general limitation of quantum adiabatic computation in random optimization problems. Analytical results are in agreement with the numerical simulation of the algorithm.
NASA Astrophysics Data System (ADS)
Christen, Hans M.; Ohkubo, Isao; Rouleau, Christopher M.; Jellison, Gerald E., Jr.; Puretzky, Alex A.; Geohegan, David B.; Lowndes, Douglas H.
2005-01-01
Parallel (multi-sample) approaches, such as discrete combinatorial synthesis or continuous compositional-spread (CCS), can significantly increase the rate of materials discovery and process optimization. Here we review our generalized CCS method, based on pulsed-laser deposition, in which the synchronization between laser firing and substrate translation (behind a fixed slit aperture) yields the desired variations of composition and thickness. In situ alloying makes this approach applicable to the non-equilibrium synthesis of metastable phases. Deposition on a heater plate with a controlled spatial temperature variation can additionally be used for growth-temperature-dependence studies. Composition and temperature variations are controlled on length scales large enough to yield sample sizes sufficient for conventional characterization techniques (such as temperature-dependent measurements of resistivity or magnetic properties). This technique has been applied to various experimental studies, and we present here the results for the growth of electro-optic materials (SrxBa1-xNb2O6) and magnetic perovskites (Sr1-xCaxRuO3), and discuss the application to the understanding and optimization of catalysts used in the synthesis of dense forests of carbon nanotubes.
NASA Astrophysics Data System (ADS)
Winfield, J. M.; Douglas, N. H. M.; deSouza, N. M.; Collins, D. J.
2014-05-01
We present the development and application of a phantom for assessment and optimization of fat suppression over a large field-of-view in diffusion-weighted magnetic resonance imaging at 1.5 T and 3 T. A Perspex cylinder (inner diameter 185 mm, height 300 mm) which contains a second cylinder (inner diameter 140 mm) was constructed. The inner cylinder was filled with water doped with copper sulphate and sodium chloride and the annulus was filled with corn oil, which closely matches the spectrum and longitudinal relaxation times of subcutaneous abdominal fat. Placement of the phantom on the couch at 45° to the z-axis presented an elliptical cross-section, which was of a similar size and shape to axial abdominal images. The use of a phantom for optimization of fat suppression allowed quantitative comparison between studies without the differences introduced by variability between human subjects. We have demonstrated that the phantom is suitable for selection of inversion delay times, spectral adiabatic inversion recovery delays and assessment of combinatorial methods of fat suppression. The phantom is valuable in protocol development and the assessment of new techniques, particularly in multi-centre trials.
NASA Astrophysics Data System (ADS)
Kunze, Herb; La Torre, Davide; Lin, Jianyi
2017-01-01
We consider the inverse problem associated with IFSM: given a target function f, find an IFSM such that its fixed point f̄ is sufficiently close to f in the Lp distance. Forte and Vrscay [1] showed how to reduce this problem to a quadratic optimization model. In this paper, we extend the collage-based method developed by Kunze, La Torre and Vrscay ([2][3][4]) by proposing the minimization of the 1-norm instead of the 0-norm. In fact, optimization problems involving the 0-norm are combinatorial in nature, and hence in general NP-hard. To overcome these difficulties, we introduce the 1-norm and propose a Sequential Quadratic Programming algorithm to solve the corresponding inverse problem. As in Kunze, La Torre and Vrscay [3], in our formulation the minimization of collage error is treated as a multi-criteria problem that includes three different and conflicting criteria: collage error, entropy and sparsity. This multi-criteria program is solved by means of a scalarization technique which reduces the model to a single-criterion program by combining all objective functions with different trade-off weights. The results of some numerical computations are presented.
Selective host molecules obtained by dynamic adaptive chemistry.
Matache, Mihaela; Bogdan, Elena; Hădade, Niculina D
2014-02-17
Until about 20 years ago, there were two mainstream lines of thought for endowing molecules with function. One was to rationally design the positioning of chemical functionalities within candidate molecules, followed by an iterative synthesis-optimization process. The second was the use of a "brute force" approach of combinatorial chemistry coupled with advanced screening for function. Although both methods provided important results, "rational design" often resulted in time-consuming efforts of modeling and synthesis only to find that the candidate molecule was not performing the designed job. "Combinatorial chemistry" suffered from a fundamental limitation related to the focusing of the libraries employed, often using lead compounds that limit its scope. Dynamic constitutional chemistry has developed as a combination of the two approaches above. Through the rational use of reversible chemical bonds together with a large plethora of precursor libraries, one is now able to build functional structures ranging from quite simple molecules up to large polymeric structures. Thus, by introduction of the dynamic component within molecular recognition processes, a new perspective on deciphering the world of molecular events has arisen, together with a new field of chemistry. Since its birth, dynamic constitutional chemistry has continuously gained attention, in particular due to its ability to easily create from scratch outstanding molecular structures as well as to add adaptive features. The fundamental concepts defining dynamic constitutional chemistry have been continuously extended to currently place it at the intersection between supramolecular chemistry and the newly defined adaptive chemistry, a pivotal feature towards evolutive chemistry. Copyright © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Combinatorial Nano-Bio Interfaces.
Cai, Pingqiang; Zhang, Xiaoqian; Wang, Ming; Wu, Yun-Long; Chen, Xiaodong
2018-06-08
Nano-bio interfaces are emerging from the convergence of engineered nanomaterials and biological entities. Despite rapid growth, clinical translation of biomedical nanomaterials is heavily compromised by the lack of comprehensive understanding of biophysicochemical interactions at nano-bio interfaces. In the past decade, a few investigations have adopted a combinatorial approach toward decoding nano-bio interfaces. Combinatorial nano-bio interfaces comprise the design of nanocombinatorial libraries and high-throughput bioevaluation. In this Perspective, we address challenges in combinatorial nano-bio interfaces and call for multiparametric nanocombinatorics (composition, morphology, mechanics, surface chemistry), multiscale bioevaluation (biomolecules, organelles, cells, tissues/organs), and the recruitment of computational modeling and artificial intelligence. Leveraging combinatorial nano-bio interfaces will shed light on precision nanomedicine and its potential applications.
Koyama, Michihisa; Tsuboi, Hideyuki; Endou, Akira; Takaba, Hiromitsu; Kubo, Momoji; Del Carpio, Carlos A; Miyamoto, Akira
2007-02-01
Computational chemistry can provide fundamental knowledge regarding various aspects of materials. While its impact in scientific research is greatly increasing, its contributions to industrially important issues are far from satisfactory. In order to realize industrial innovation by computational chemistry, a new concept "combinatorial computational chemistry" has been proposed by introducing the concept of combinatorial chemistry to computational chemistry. This combinatorial computational chemistry approach enables theoretical high-throughput screening for materials design. In this manuscript, we review the successful applications of combinatorial computational chemistry to deNO(x) catalysts, Fischer-Tropsch catalysts, lanthanoid complex catalysts, and cathodes of the lithium ion secondary battery.
Apparatus for combinatorial screening of electrochemical materials
Kepler, Keith Douglas [Belmont, CA; Wang, Yu [Foster City, CA
2009-12-15
A high throughput combinatorial screening method and apparatus for the evaluation of electrochemical materials using a single voltage source (2) is disclosed wherein temperature changes arising from the application of an electrical load to a cell array (1) are used to evaluate the relative electrochemical efficiency of the materials comprising the array. The apparatus may include an array of electrochemical cells (1) that are connected to each other in parallel or in series, an electronic load (2) for applying a voltage or current to the electrochemical cells (1), and a device (3), external to the cells, for monitoring the relative temperature of each cell when the load is applied.
Defining Clonal Color in Fluorescent Multi-Clonal Tracking
Wu, Juwell W.; Turcotte, Raphaël; Alt, Clemens; Runnels, Judith M.; Tsao, Hensin; Lin, Charles P.
2016-01-01
Clonal heterogeneity and selection underpin many biological processes including development and tumor progression. Combinatorial fluorescent protein expression in germline cells has proven its utility for tracking the formation and regeneration of different organ systems. Such cell populations encoded by combinatorial fluorescent proteins are also attractive tools for understanding clonal expansion and clonal competition in cancer. However, the assignment of clonal identity requires an analytical framework in which clonal markings can be parameterized and validated. Here we present a systematic and quantitative method for RGB analysis of fluorescent melanoma cancer clones. We then demonstrate refined clonal trackability of melanoma cells using this scheme. PMID:27073117
Antenna array geometry optimization for a passive coherent localisation system
NASA Astrophysics Data System (ADS)
Knott, Peter; Kuschel, Heiner; O'Hagan, Daniel
2012-11-01
Passive Coherent Localisation (PCL), also known as Passive Radar, making use of RF sources of opportunity such as Radio or TV Broadcasting Stations, Cellular Phone Network Base Stations, etc. is an advancing technology for covert operation because no active radar transmitter is required. It is also an attractive addition to existing active radar stations because it has the potential to discover low-flying and low-observable targets. The CORA (Covert Radar) experimental passive radar system currently developed at Fraunhofer-FHR features a multi-channel digital radar receiver and a circular antenna array with separate elements for the VHF- and the UHF-range and is used to exploit alternatively Digital Audio (DAB) or Video Broadcasting (DVB-T) signals. For an extension of the system, a wideband antenna array is being designed for which a new discone antenna element has been developed covering the full DVB-T frequency range. The present paper describes the outline of the system and the numerical modelling and optimisation methods applied to solve the complex task of antenna array design: Electromagnetic full wave analysis is required for the parametric design of the antenna elements while combinatorial optimization methods are applied to find the best array positions and excitation coefficients for a regular omni-directional antenna performance. The different steps are combined in an iterative loop until the optimum array layout is found. Simulation and experimental results for the current system will be shown.
Weber, Gerhard-Wilhelm; Ozöğür-Akyüz, Süreyya; Kropat, Erik
2009-06-01
An emerging research area in computational biology and biotechnology is devoted to mathematical modeling and prediction of gene-expression patterns; it nowadays requires mathematics to deeply understand its foundations. This article surveys data mining and machine learning methods for the analysis of complex systems in computational biology. It mathematically deepens recent advances in modeling and prediction by rigorously introducing the environment and aspects of errors and uncertainty into the genetic context within the framework of matrix and interval arithmetics. Given the data from DNA microarray experiments and environmental measurements, we extract nonlinear ordinary differential equations which contain parameters that are to be determined. This is done by a generalized Chebychev approximation and generalized semi-infinite optimization. Then, time-discretized dynamical systems are studied. By a combinatorial algorithm which constructs and follows polyhedra sequences, the region of parametric stability is detected. In addition, we analyze the topological landscape of gene-environment networks in terms of structural stability. As a second strategy, we review recent model selection and kernel learning methods for binary classification which can be used to classify microarray data for cancerous cells or for the discrimination of other kinds of diseases. This review is practically motivated and theoretically elaborated; it is devoted to a contribution to better health care, progress in medicine, better education, and healthier living conditions.