Collaboration pathway(s) using new tools for optimizing operational climate monitoring from space
NASA Astrophysics Data System (ADS)
Helmuth, Douglas B.; Selva, Daniel; Dwyer, Morgan M.
2014-10-01
Consistently collecting the earth's climate signatures remains a priority for world governments and international scientific organizations. Architecting a solution requires transforming scientific missions into an optimized, robust `operational' constellation that addresses the needs of decision makers, scientific investigators and global users for trusted data. The application of new tools offers pathways for global architecture collaboration. Recent (2014) rule-based decision engine modeling runs that targeted optimizing the intended NPOESS architecture become a surrogate for global operational climate monitoring architecture(s). These rule-based systems tools provide valuable insight for global climate architectures through the comparison and evaluation of alternatives considered and the exhaustive range of trade space explored. A representative optimization of a global ECV (essential climate variables) climate monitoring architecture is explored and described in some detail, with thoughts on appropriate rule-based valuations. The optimization tool(s) suggest and support global collaboration pathways and will hopefully elicit responses from the audience and climate science stakeholders.
Collaboration pathway(s) using new tools for optimizing `operational' climate monitoring from space
NASA Astrophysics Data System (ADS)
Helmuth, Douglas B.; Selva, Daniel; Dwyer, Morgan M.
2015-09-01
Consistently collecting the earth's climate signatures remains a priority for world governments and international scientific organizations. Architecting a long-term solution requires transforming scientific missions into an optimized, robust `operational' constellation that addresses the collective needs of policy makers, scientific communities and global academic users for trusted data. The application of new tools offers pathways for global architecture collaboration. Recent rule-based expert system (RBES) optimization modeling of the intended NPOESS architecture becomes a surrogate for global operational climate monitoring architecture(s). These rule-based systems tools provide valuable insight for global climate architectures through the comparison and evaluation of alternatives and the sheer range of trade space explored. Optimization of climate monitoring architecture(s) for a partial list of ECVs (essential climate variables) is explored and described in detail, with dialogue on appropriate rule-based valuations. These optimization tool(s) suggest global collaboration advantages and elicit responses from the audience and climate science community. This paper focuses on recent research exploring the joint requirement implications of the high-profile NPOESS architecture and extends the research and tools to optimization for a climate-centric case study. It reflects work from the SPIE RS Conferences of 2013 and 2014, abridged for simplification [30, 32]. First, the heavily scrutinized NPOESS architecture inspired the recent research question: was complexity (as a cost/risk factor) overlooked when considering the benefits of aggregating different missions onto a single platform? Now, years later, there is a complete reversal: should agencies consider disaggregation as the answer? We discuss what some academic research suggests. Second, we adopt the GCOS requirements for earth climate observations via ECVs (essential climate variables), many collected from space-based sensors, and accept their definitions of global coverage, intended to ensure that the needs of major global and international organizations (UNFCCC and IPCC) are met, as a core objective. How do new optimization tools like rule-based engines (RBES) offer alternative methods of evaluating collaborative architectures and constellations? What would the trade space of optimized operational climate monitoring architectures for ECVs look like? Third, using the RBES tool kit (2014), we demonstrate a climate-centric rule-based decision engine that optimizes architectural trades of earth observation satellite systems, allowing comparison(s) to existing architectures and yielding insights for global collaborative architectures. How difficult is it to pull together an optimized climate case study, utilizing, for example, 12 climate-based instruments on multiple existing platforms and a nominal handful of orbits, for the best cost and performance benefits against the collection requirements of a representative set of ECVs? How much effort and resources would an organization expect to invest to realize these analysis and utility benefits?
Orbit design and optimization based on global telecommunication performance metrics
NASA Technical Reports Server (NTRS)
Lee, Seungwon; Lee, Charles H.; Kerridge, Stuart; Cheung, Kar-Ming; Edwards, Charles D.
2006-01-01
The orbit selection of telecommunications orbiters is one of the critical design processes and should be guided by global telecom performance metrics and mission-specific constraints. In order to aid the orbit selection, we have coupled the Telecom Orbit Analysis and Simulation Tool (TOAST) with genetic optimization algorithms. As a demonstration, we have applied the developed tool to select an optimal orbit for general Mars telecommunications orbiters with the constraint of being a frozen orbit. While a typical optimization goal is to minimize telecommunications downtime, several relevant performance metrics are examined: 1) area-weighted average gap time, 2) global maximum of local maximum gap time, and 3) global maximum of local minimum gap time. Optimal solutions are found with each of the metrics. Common and differing features among the optimal solutions, as well as the advantages and disadvantages of each metric, are presented. The optimal solutions are compared with several candidate orbits that were considered during the development of the Mars Telecommunications Orbiter.
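To illustrate the kind of coupling described above, here is a minimal sketch of a genetic algorithm driving an orbit-coverage metric. The gap-time function, design variables, bounds and GA settings are all stand-in assumptions, not the TOAST/JPL implementation; a real evaluation would propagate each candidate orbit and accumulate coverage gaps over a surface grid.

```python
# A minimal sketch: genetic algorithm minimizing a surrogate coverage metric.
import numpy as np

rng = np.random.default_rng(0)

def area_weighted_gap_time(orbit):
    """Hypothetical stand-in for metric 1 (area-weighted average gap time)."""
    alt_km, inc_deg = orbit
    return (1.0 + (alt_km - 4450.0)**2 / 1e6) * (1.0 + np.cos(np.radians(inc_deg))**2)

bounds = np.array([[2000.0, 8000.0], [0.0, 90.0]])  # altitude (km), inclination (deg)
pop = rng.uniform(bounds[:, 0], bounds[:, 1], size=(40, 2))

for gen in range(100):
    fitness = np.array([area_weighted_gap_time(p) for p in pop])
    elite = pop[np.argsort(fitness)[:10]]             # selection: keep best quarter
    parents = elite[rng.integers(0, 10, size=(30, 2))]
    children = parents.mean(axis=1)                   # crossover: blend two parents
    children += rng.normal(0.0, 0.02, children.shape) * (bounds[:, 1] - bounds[:, 0])
    children = np.clip(children, bounds[:, 0], bounds[:, 1])  # mutation + bounds
    pop = np.vstack([elite, children])

best = pop[np.argmin([area_weighted_gap_time(p) for p in pop])]
print("best altitude/inclination:", best)
```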
Global Simulation of Aviation Operations
NASA Technical Reports Server (NTRS)
Sridhar, Banavar; Sheth, Kapil; Ng, Hok Kwan; Morando, Alex; Li, Jinhua
2016-01-01
The simulation and analysis of global air traffic is limited by a lack of simulation tools and the difficulty of accessing data sources. This paper provides a global simulation of aviation operations combining flight plans and real air traffic data with historical commercial city-pair aircraft type and schedule data and global atmospheric data. The resulting capability extends the simulation and optimization functions of NASA's Future Air Traffic Management Concept Evaluation Tool (FACET) to a global scale. This new capability is used to present results on the evolution of global air traffic patterns from a concentration of traffic inside the US, Europe and across the Atlantic Ocean to a more diverse traffic pattern across the globe with accelerated growth in Asia, Australia, Africa and South America. The simulation analyzes seasonal variation in the long-haul wind-optimal traffic patterns in six major regions of the world and provides potential time savings of wind-optimal routes compared with either great-circle routes or, where available, current flight plans.
Puig, V; Cembrano, G; Romera, J; Quevedo, J; Aznar, B; Ramón, G; Cabot, J
2009-01-01
This paper deals with the global control of the Riera Blanca catchment in the Barcelona sewer network using a predictive optimal control approach. This catchment has been modelled using a conceptual modelling approach based on decomposing the catchment into subcatchments and representing them as virtual tanks. This conceptual modelling approach allows real-time model calibration and control of the sewer network. The global control problem of the Riera Blanca catchment is solved using an optimal/predictive control algorithm. To implement the predictive optimal control of the Riera Blanca catchment, a software tool named CORAL is used. The on-line control is simulated by interfacing CORAL with a high-fidelity simulator of sewer networks (MOUSE). CORAL interchanges readings from the limnimeters and gate commands with MOUSE as if it were connected to the real SCADA system. Finally, the global control results obtained using the predictive optimal control are presented and compared against the results obtained using the current local control system. The results obtained using the global control are very satisfactory compared to those obtained using the local control.
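As a rough illustration of the virtual-tank idea, the sketch below runs a receding-horizon optimization for a single tank with the gate outflow as the decision variable. All numbers (tank capacity, gate limit, rain forecast) are invented for the example, and the optimizer is SciPy's general-purpose minimize rather than the CORAL/MOUSE toolchain.

```python
# A minimal receding-horizon (predictive optimal control) sketch for one tank.
import numpy as np
from scipy.optimize import minimize

dt, horizon = 300.0, 6                 # 5-min steps, 30-min prediction horizon
v_max, u_max = 5e4, 40.0               # tank capacity (m^3), max gate flow (m^3/s)
rain_forecast = np.array([25., 30., 35., 30., 20., 10.])  # assumed inflow (m^3/s)

def simulate(v0, u):
    v, vols = v0, []
    for k in range(horizon):
        v = max(v + dt * (rain_forecast[k] - u[k]), 0.0)  # virtual-tank balance
        vols.append(v)
    return np.array(vols)

def cost(u, v0):
    vols = simulate(v0, u)
    overflow = np.maximum(vols - v_max, 0.0)
    return 1e3 * overflow.sum() + (u**2).sum()   # penalize overflow, then effort

v = 1e4
for step in range(3):                  # receding horizon: re-optimize each step
    res = minimize(cost, x0=np.full(horizon, 10.0), args=(v,),
                   bounds=[(0.0, u_max)] * horizon)
    u0 = res.x[0]                      # apply only the first control move
    v = max(v + dt * (rain_forecast[0] - u0), 0.0)
    print(f"step {step}: gate flow {u0:.1f} m3/s, volume {v:.0f} m3")
```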
Hull Form Design and Optimization Tool Development
2012-07-01
The algorithm seeks the global minimum using metaheuristics, which allow it to examine a large area of the design space. The report covers further development of these tools, including the implementation and testing of a new optimization algorithm and the improvement of a rapid hull form generation capability, carried out under the 2012 Naval Research Enterprise Intern Program. Subject terms: hydrodynamic, hull form, generation, optimization, algorithm.
An Automatic Medium to High Fidelity Low-Thrust Global Trajectory Toolchain; EMTG-GMAT
NASA Technical Reports Server (NTRS)
Beeson, Ryne T.; Englander, Jacob A.; Hughes, Steven P.; Schadegg, Maximillian
2015-01-01
Solving the global optimization, low-thrust, multiple-flyby interplanetary trajectory problem with high-fidelity dynamical models requires an unreasonable amount of computational resources. A better approach, and one that is demonstrated in this paper, is a multi-step process whereby the solution of the aforementioned problem is solved at a lower-fidelity and this solution is used as an initial guess for a higher-fidelity solver. The framework presented in this work uses two tools developed by NASA Goddard Space Flight Center: the Evolutionary Mission Trajectory Generator (EMTG) and the General Mission Analysis Tool (GMAT). EMTG is a medium to medium-high fidelity low-thrust interplanetary global optimization solver, which now has the capability to automatically generate GMAT script files for seeding a high-fidelity solution using GMAT's local optimization capabilities. A discussion of the dynamical models as well as thruster and power modeling for both EMTG and GMAT are given in this paper. Current capabilities are demonstrated with examples that highlight the toolchains ability to efficiently solve the difficult low-thrust global optimization problem with little human intervention.
ConvAn: a convergence analyzing tool for optimization of biochemical networks.
Kostromins, Andrejs; Mozga, Ivars; Stalidzans, Egils
2012-01-01
Dynamic models of biochemical networks are usually described as a system of nonlinear differential equations. In the case of optimization of models for the purpose of parameter estimation or the design of new properties, mainly numerical methods are used. That causes problems of optimization predictability, as most numerical optimization methods have stochastic properties and the convergence of the objective function to the global optimum is hardly predictable. Determination of a suitable optimization method and the necessary duration of optimization becomes critical when evaluating a high number of combinations of adjustable parameters or in the case of large dynamic models. This task is complex due to the variety of optimization methods and software tools and the nonlinearity features of models in different parameter spaces. The software tool ConvAn is developed to analyze statistical properties of convergence dynamics for optimization runs with a particular optimization method, model, software tool, set of optimization method parameters and number of adjustable parameters of the model. The convergence curves can be normalized automatically to enable comparison of different methods and models on the same scale. With the help of the biochemistry-adapted graphical user interface of ConvAn, it is possible to compare different optimization methods in terms of their ability to find the global optimum, or values close to it, as well as the computational time necessary to reach them. It is possible to estimate the optimization performance for different numbers of adjustable parameters. The functionality of ConvAn enables statistical assessment of the necessary optimization time depending on the required optimization accuracy. Optimization methods that are not suitable for a particular optimization task can be rejected if they have poor repeatability or convergence properties. The software ConvAn is freely available at www.biosystems.lv/convan. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
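The core idea, repeating stochastic optimization runs and comparing best-so-far convergence curves statistically, can be sketched in a few lines. This is not ConvAn itself; the two toy optimizers, the Rosenbrock test function and the run counts below are assumptions for illustration.

```python
# A minimal convergence-statistics sketch: compare two stochastic optimizers
# across repeated runs by their best-so-far curves.
import numpy as np

rng = np.random.default_rng(1)
rosenbrock = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

def random_search(n_evals):
    best, curve = np.inf, []
    for _ in range(n_evals):
        best = min(best, rosenbrock(rng.uniform(-2, 2, 2)))
        curve.append(best)
    return np.array(curve)

def one_plus_one_es(n_evals, sigma=0.3):
    x = rng.uniform(-2, 2, 2); fx = rosenbrock(x)
    curve = []
    for _ in range(n_evals):
        y = x + rng.normal(0, sigma, 2); fy = rosenbrock(y)
        if fy < fx: x, fx = y, fy      # accept only improving moves
        curve.append(fx)
    return np.array(curve)

runs, n_evals = 20, 500
for name, opt in [("random search", random_search), ("(1+1)-ES", one_plus_one_es)]:
    curves = np.array([opt(n_evals) for _ in range(runs)])
    print(f"{name}: median final {np.median(curves[:, -1]):.3g}, "
          f"spread {curves[:, -1].std():.3g}")   # repeatability across runs
```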
Improving Environmental Model Calibration and Prediction
2011-01-18
Final report. First, we have continued to develop tools for efficient global optimization of environmental models. Our algorithms are hybrid algorithms that combine evolutionary strategies with other methods, working toward practical hybrid optimization tools for environmental models.
Global Design Optimization for Fluid Machinery Applications
NASA Technical Reports Server (NTRS)
Shyy, Wei; Papila, Nilay; Tucker, Kevin; Vaidyanathan, Raj; Griffin, Lisa
2000-01-01
Recent experiences in utilizing the global optimization methodology, based on polynomial and neural network techniques, for fluid machinery design are summarized. Global optimization methods can utilize the information collected from various sources and by different tools. These methods offer multi-criterion optimization, handle the existence of multiple design points and trade-offs via insight into the entire design space, can easily perform tasks in parallel, and are often effective in filtering the noise intrinsic to numerical and experimental data. Another advantage is that these methods do not need to calculate the sensitivity of each design variable locally. However, a successful application of the global optimization method needs to address issues related to data requirements, which grow with the number of design variables, and methods for predicting model performance. Examples of applications selected from rocket propulsion components, including a supersonic turbine, an injector element and a turbulent flow diffuser, are used to illustrate the usefulness of the global optimization method.
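A minimal sketch of the response-surface ingredient: fit a quadratic polynomial to noisy samples of an expensive evaluation, then optimize the cheap surrogate. The toy objective, noise level and sampling plan are assumptions; the paper's actual applications are CFD-based turbine, injector and diffuser designs.

```python
# A minimal quadratic response-surface sketch over two design variables.
import numpy as np

rng = np.random.default_rng(2)
def expensive_eval(x):          # stand-in for a noisy CFD/experimental data point
    return (x[0] - 0.3)**2 + 2 * (x[1] + 0.2)**2 + 0.01 * rng.normal()

X = rng.uniform(-1, 1, size=(25, 2))          # design of experiments
y = np.array([expensive_eval(x) for x in X])

def features(X):                              # full quadratic basis in 2 variables
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones_like(x1), x1, x2, x1*x2, x1**2, x2**2])

beta, *_ = np.linalg.lstsq(features(X), y, rcond=None)   # least-squares fit

grid = np.stack(np.meshgrid(np.linspace(-1, 1, 201),
                            np.linspace(-1, 1, 201)), axis=-1).reshape(-1, 2)
pred = features(grid) @ beta                  # cheap surrogate evaluations
print("surrogate optimum near:", grid[np.argmin(pred)])  # noise is filtered out
```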
Optimal Design of Multitype Groundwater Monitoring Networks Using Easily Accessible Tools.
Wöhling, Thomas; Geiges, Andreas; Nowak, Wolfgang
2016-11-01
Monitoring networks are expensive to establish and to maintain. In this paper, we extend an existing data-worth estimation method from the suite of PEST utilities with a global optimization method for optimal sensor placement (called optimal design) in groundwater monitoring networks. Design optimization can include multiple simultaneous sensor locations and multiple sensor types. Both location and sensor type are treated simultaneously as decision variables. Our method combines linear uncertainty quantification and a modified genetic algorithm for discrete multilocation, multitype search. The efficiency of the global optimization is enhanced by an archive of past samples and parallel computing. We demonstrate our methodology for a groundwater monitoring network at the Steinlach experimental site, south-western Germany, which has been established to monitor river-groundwater exchange processes. The target of optimization is the best possible exploration for minimum variance in predicting the mean travel time of the hyporheic exchange. Our results demonstrate that the information gain of monitoring network designs can be explored efficiently and with easily accessible tools prior to taking new field measurements or installing additional measurement points. The proposed methods proved to be efficient and can be applied for model-based optimal design of any type of monitoring network in approximately linear systems. Our key contributions are (1) the use of easy-to-implement tools for an otherwise complex task and (2) the consideration of data-worth interdependencies in the simultaneous optimization of multiple sensor locations and sensor types. © 2016, National Ground Water Association.
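The linear data-worth calculation at the heart of such a design loop can be sketched compactly. Everything here is synthetic (prior covariance, sensitivities, noise levels), and exhaustive enumeration replaces the paper's modified genetic algorithm, which only matters once the candidate set grows large.

```python
# A minimal linear-Gaussian data-worth sketch: pick the sensor subset that
# minimizes the variance of a scalar prediction w^T theta.
import numpy as np
from itertools import combinations

rng = np.random.default_rng(3)
n_par, n_cand, k = 6, 12, 3
P = np.diag(rng.uniform(0.5, 2.0, n_par))       # prior parameter covariance
H = rng.normal(size=(n_cand, n_par))            # sensitivities of candidate sensors
r = rng.uniform(0.05, 0.5, n_cand)              # noise variance per sensor type
w = rng.normal(size=n_par)                      # prediction (e.g., mean travel time)

def predictive_variance(sel):
    Hs, Rs = H[list(sel)], np.diag(r[list(sel)])
    post = np.linalg.inv(np.linalg.inv(P) + Hs.T @ np.linalg.inv(Rs) @ Hs)
    return w @ post @ w                          # posterior variance of w^T theta

best = min(combinations(range(n_cand), k), key=predictive_variance)
print("best 3-sensor design:", best, "variance:", predictive_variance(best))
```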
Investing in breastfeeding - the world breastfeeding costing initiative.
Holla-Bhar, Radha; Iellamo, Alessandro; Gupta, Arun; Smith, Julie P; Dadhich, Jai Prakash
2015-01-01
Despite scientific evidence substantiating the importance of breastfeeding in child survival and development and its economic benefits, assessments show gaps in many countries' implementation of the 2003 WHO and UNICEF Global Strategy for Infant and Young Child Feeding (Global Strategy). Optimal breastfeeding is a particular example: initiation of breastfeeding within the first hour of birth, exclusive breastfeeding for the first six months, and continued breastfeeding for two years or more, together with safe, adequate, appropriate, responsive complementary feeding starting in the sixth month. While the understanding of "optimal" may vary among countries, there is a need for governments to facilitate an enabling environment for women to achieve optimal breastfeeding. Lack of financial resources for key programs is a major impediment, making economic perspectives important for implementation. Globally, while achieving optimal breastfeeding could prevent more than 800,000 under-five deaths annually, in 2013 US$58 billion was spent on commercial baby food including milk formula. Support for improved breastfeeding is inadequately prioritized by policy and practice internationally. The World Breastfeeding Costing Initiative (WBCi), launched in 2013, attempts to determine the financial investment that is necessary to implement the Global Strategy, and to introduce a tool to estimate the costs for individual countries. The article presents detailed cost estimates for implementing the Global Strategy and outlines the WBCi Financial Planning Tool. Estimates use demographic data from UNICEF's State of the World's Children 2013. The WBCi takes a programmatic approach to scaling up interventions, including policy and planning, health and nutrition care systems, community services and mother support, media promotion, maternity protection, WHO International Code of Marketing of Breastmilk Substitutes implementation, and monitoring and research, for optimal breastfeeding practices. The financial cost of a program to implement the Global Strategy in 214 countries is estimated at US$17.5 billion ($130 per live birth). The major recurring cost is maternity entitlements. WBCi is a policy advocacy initiative to encourage integrated actions that enable breastfeeding. WBCi will help countries plan and prioritize actions and budget them accurately. International agencies and donors can also use the tool to calculate or track investments in breastfeeding.
Global Design Optimization for Aerodynamics and Rocket Propulsion Components
NASA Technical Reports Server (NTRS)
Shyy, Wei; Papila, Nilay; Vaidyanathan, Rajkumar; Tucker, Kevin; Turner, James E. (Technical Monitor)
2000-01-01
Modern computational and experimental tools for aerodynamics and propulsion applications have matured to a stage where they can provide substantial insight into engineering processes involving fluid flows, and can be fruitfully utilized to help improve the design of practical devices. In particular, rapid and continuous development in aerospace engineering demands that new design concepts be regularly proposed to meet goals for increased performance, robustness and safety while concurrently decreasing cost. To date, the majority of the effort in design optimization of fluid dynamics has relied on gradient-based search algorithms. Global optimization methods can utilize the information collected from various sources and by different tools. These methods offer multi-criterion optimization, handle the existence of multiple design points and trade-offs via insight into the entire design space, can easily perform tasks in parallel, and are often effective in filtering the noise intrinsic to numerical and experimental data. However, a successful application of the global optimization method needs to address issues related to data requirements with an increase in the number of design variables, and methods for predicting the model performance. In this article, we review recent progress made in establishing suitable global optimization techniques employing neural network and polynomial-based response surface methodologies. Issues addressed include techniques for construction of the response surface, design of experiment techniques for supplying information in an economical manner, optimization procedures and multi-level techniques, and assessment of relative performance between polynomials and neural networks. Examples drawn from wing aerodynamics, turbulent diffuser flows, gas-gas injectors, and supersonic turbines are employed to help demonstrate the issues involved in an engineering design context. Both the usefulness of the existing knowledge to aid current design practices and the need for future research are identified.
Optimization of Milling Parameters Employing Desirability Functions
NASA Astrophysics Data System (ADS)
Ribeiro, J. L. S.; Rubio, J. C. Campos; Abrão, A. M.
2011-01-01
The principal aim of this paper is to investigate the influence of tool material (one cermet and two coated carbide grades), cutting speed and feed rate on the machinability of hardened AISI H13 hot work steel, in order to identify the cutting conditions which lead to optimal performance. A multiple response optimization procedure based on tool life, surface roughness, milling forces and the machining time (required to produce a sample cavity) was employed. The results indicated that the TiCN-TiN coated carbide and cermet presented similar results concerning the global optimum values for cutting speed and feed rate per tooth, outperforming the TiN-TiCN-Al2O3 coated carbide tool.
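The multiple-response desirability procedure used here is easy to sketch: map each response onto a [0, 1] desirability, combine them geometrically, and search the grid of cutting conditions. The response models and desirability bounds below are invented placeholders, not the paper's measured tool-life, roughness and force data.

```python
# A minimal Derringer-style desirability sketch over speed and feed.
import numpy as np

speeds = np.linspace(100, 300, 41)     # cutting speed (m/min)
feeds = np.linspace(0.05, 0.25, 41)    # feed per tooth (mm)

def tool_life(v, f):   return 200 - 0.4*v - 300*f       # toy model, larger is better
def roughness(v, f):   return 0.2 + 0.02*v*f + 2*f      # toy model, smaller is better

def d_larger(y, lo, hi):  return np.clip((y - lo)/(hi - lo), 0, 1)
def d_smaller(y, lo, hi): return np.clip((hi - y)/(hi - lo), 0, 1)

best, best_vf = -1.0, None
for v in speeds:
    for f in feeds:
        D = (d_larger(tool_life(v, f), 40, 160) *
             d_smaller(roughness(v, f), 0.3, 1.2)) ** 0.5   # geometric mean
        if D > best:
            best, best_vf = D, (v, f)
print(f"optimal: v={best_vf[0]:.0f} m/min, f={best_vf[1]:.3f} mm, D={best:.2f}")
```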
Multifidelity Analysis and Optimization for Supersonic Design
NASA Technical Reports Server (NTRS)
Kroo, Ilan; Willcox, Karen; March, Andrew; Haas, Alex; Rajnarayan, Dev; Kays, Cory
2010-01-01
Supersonic aircraft design is a computationally expensive optimization problem, and multifidelity approaches offer a significant opportunity to reduce design time and computational cost. This report presents tools developed to improve supersonic aircraft design capabilities, including: aerodynamic tools for supersonic aircraft configurations; a systematic way to manage model uncertainty; and multifidelity model management concepts that incorporate uncertainty. The aerodynamic analysis tools developed are appropriate for use in a multifidelity optimization framework and include four analysis routines to estimate the lift and drag of a supersonic airfoil, plus a multifidelity supersonic drag code that estimates the drag of aircraft configurations with three different methods: an area-rule method, a panel method, and an Euler solver. In addition, five multifidelity optimization methods are developed, which include local and global methods as well as gradient-based and gradient-free techniques.
NASA Astrophysics Data System (ADS)
Auluck, S. K. H.
2014-12-01
The dense plasma focus (DPF) is known to produce highly energetic ions and electrons and a plasma environment that can be used for breeding short-lived isotopes, plasma nanotechnology and other material processing applications. Commercial utilization of the DPF in such areas would need a design tool that can be deployed in an automatic search for the best possible device configuration for a given application. The recently revisited (Auluck 2013 Phys. Plasmas 20 112501) Gratton-Vargas (GV) two-dimensional analytical snowplow model of the plasma focus provides a numerical formula for the dynamic inductance of a Mather-type plasma focus, fitted to thousands of automated computations, which enables the construction of such a design tool. This inductance formula is utilized in the present work to explore global optimization, based on first-principles optimality criteria, in a four-dimensional parameter subspace of the zero-resistance GV model. The optimization process is shown to reproduce the empirically observed constancy of the drive parameter over eight decades in capacitor bank energy. The optimized geometry of the plasma focus normalized to the anode radius is shown to be independent of voltage, while the optimized anode radius is shown to be related to the capacitor bank inductance.
Open Tools for Integrated Modelling to Understand SDG development - The OPTIMUS program
NASA Astrophysics Data System (ADS)
Howells, Mark; Zepeda, Eduardo; Rogner, H. Holger; Sanchez, Marco; Roehrl, Alexander; Cicowiez, Matrin; Mentis, Dimitris; Korkevelos, Alexandros; Taliotis, Constantinos; Broad, Oliver; Alfstad, Thomas
2016-04-01
The recently adopted Sustainable Development Goals (SDGs) - a set of 17 measurable and time-bound goals with 169 associated targets for 2030 - are highly inclusive challenges before the world community, ranging from eliminating poverty to human rights, inequality, a secure world and protection of the environment. Each individual goal or target by itself presents an enormous task; taken together they are overwhelming. There are strong and weak interlinkages, hence trade-offs and complementarities, among goals and targets. Some targets may affect several goals, while other goals and targets may conflict or be mutually exclusive (Ref). Meeting each of these requires the judicious exploitation of resources, with energy playing an important role. Such complexity demands to be addressed in an integrated way, using systems analysis tools to support informed policy formulation, planning, allocation of scarce resources, monitoring of progress, effectiveness and review at different scales. There is no one-size-fits-all methodology that could conceivably include all goals and targets simultaneously. But there are methodologies encapsulating critical subsets of the goals and targets with strong interlinkages, with a 'soft' reflection on the weak interlinkages. Universal food security or sustainable energy for all inherently supports goals and targets on human rights and equality, but possibly at the cost of biodiversity or desertification. Integrated analysis and planning tools are not yet commonplace at national universities - or indeed in many policy making organs. What is needed is a fundamental realignment of institutions and integration of their planning processes and decision making. We introduce a series of open source tools to support the SDG planning and implementation process. The Global User-friendly CLEW Open Source (GLUCOSE) tool optimizes resource interactions and constraints; the Global Electrification Tool kit (GETit) provides the first global spatially explicit electrification simulator; a national CLEW tool allows for the optimization of national-level integrated resource use; and Macro-CLEW presents the same while allowing for detailed economic-biophysical interactions. Finally, the open Model Management Infrastructure (MoManI) is presented, which allows for the rapid prototyping of additions to existing resource optimization tools, or of new ones. Collectively these tools provide insights into some fifteen of the SDGs and are made publicly available with support to governments and academic institutions.
NASA Astrophysics Data System (ADS)
Aittokoski, Timo; Miettinen, Kaisa
2008-07-01
Solving real-life engineering problems can be difficult because they often have multiple conflicting objectives, the objective functions involved are highly nonlinear and they contain multiple local minima. Furthermore, function values are often produced via a time-consuming simulation process. These facts suggest the need for an automated optimization tool that is efficient (in terms of number of objective function evaluations) and capable of solving global and multiobjective optimization problems. In this article, the requirements on a general simulation-based optimization system are discussed and such a system is applied to optimize the performance of a two-stroke combustion engine. In the example of a simulation-based optimization problem, the dimensions and shape of the exhaust pipe of a two-stroke engine are altered, and values of three conflicting objective functions are optimized. These values are derived from power output characteristics of the engine. The optimization approach involves interactive multiobjective optimization and provides a convenient tool to balance between conflicting objectives and to find good solutions.
Aeroelastic Optimization Study Based on X-56A Model
NASA Technical Reports Server (NTRS)
Li, Wesley; Pak, Chan-Gi
2014-01-01
A design process which incorporates the object-oriented multidisciplinary design, analysis, and optimization (MDAO) tool and the aeroelastic effects of high fidelity finite element models to characterize the design space was successfully developed and established. Two multidisciplinary design optimization studies using an object-oriented MDAO tool developed at NASA Armstrong Flight Research Center were presented. The first study demonstrates the use of aeroelastic tailoring concepts to minimize the structural weight while meeting the design requirements including strength, buckling, and flutter. A hybrid and discretization optimization approach was implemented to improve accuracy and computational efficiency of a global optimization algorithm. The second study presents a flutter mass balancing optimization study. The results provide guidance to modify the fabricated flexible wing design and move the design flutter speeds back into the flight envelope so that the original objective of X-56A flight test can be accomplished.
A multilevel control system for the large space telescope. [numerical analysis/optimal control
NASA Technical Reports Server (NTRS)
Siljak, D. D.; Sundareshan, S. K.; Vukcevic, M. B.
1975-01-01
A multilevel scheme was proposed for control of Large Space Telescope (LST) modeled by a three-axis-six-order nonlinear equation. Local controllers were used on the subsystem level to stabilize motions corresponding to the three axes. Global controllers were applied to reduce (and sometimes nullify) the interactions among the subsystems. A multilevel optimization method was developed whereby local quadratic optimizations were performed on the subsystem level, and global control was again used to reduce (nullify) the effect of interactions. The multilevel stabilization and optimization methods are presented as general tools for design and then used in the design of the LST Control System. The methods are entirely computerized, so that they can accommodate higher order LST models with both conceptual and numerical advantages over standard straightforward design techniques.
Application of the gravity search algorithm to multi-reservoir operation optimization
NASA Astrophysics Data System (ADS)
Bozorg-Haddad, Omid; Janbaz, Mahdieh; Loáiciga, Hugo A.
2016-12-01
Complexities in river discharge, variable rainfall regimes, and drought severity merit the use of advanced optimization tools in multi-reservoir operation. The gravity search algorithm (GSA) is an evolutionary optimization algorithm based on the law of gravity and mass interactions. This paper explores the GSA's efficacy for solving benchmark functions, single-reservoir, and four-reservoir operation optimization problems. The GSA's solutions are compared with those of the well-known genetic algorithm (GA) in three optimization problems. The results show that the GSA's results are closer to the optimal solutions than the GA's results in minimizing the benchmark functions. The average values of the objective function equal 1.218 and 1.746 with the GSA and GA, respectively, in solving the single-reservoir hydropower operation problem. The global solution equals 1.213 for this same problem. The GSA converged to 99.97% of the global solution in its average-performing history, while the GA converged to 97% of the global solution of the four-reservoir problem. Requiring fewer parameters for algorithmic implementation and reaching the optimal solution in fewer function evaluations are additional advantages of the GSA over the GA. The results of the three optimization problems demonstrate a superior performance of the GSA for optimizing general mathematical problems and the operation of reservoir systems.
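The standard GSA update (fitness-derived masses, a decaying gravitational constant, force-driven velocities) is compact enough to sketch on a benchmark function. This follows the generic published algorithm, not the paper's reservoir-operation formulation; the population size, decay constants and sphere objective are assumptions.

```python
# A minimal gravitational search algorithm (GSA) sketch on the sphere function.
import numpy as np

rng = np.random.default_rng(4)
sphere = lambda x: np.sum(x**2, axis=1)          # benchmark to minimize

n, dim, iters, G0, alpha = 30, 5, 200, 100.0, 20.0
X = rng.uniform(-5, 5, (n, dim))
V = np.zeros((n, dim))

for t in range(iters):
    fit = sphere(X)
    best, worst = fit.min(), fit.max()
    m = (worst - fit) / (worst - best + 1e-12)   # better fitness -> larger mass
    M = m / m.sum()
    G = G0 * np.exp(-alpha * t / iters)          # decaying gravitational constant
    A = np.zeros_like(X)
    for i in range(n):
        diff = X - X[i]
        R = np.linalg.norm(diff, axis=1) + 1e-12
        A[i] = np.sum(rng.random((n, 1)) * G * (M[:, None] * diff) / R[:, None],
                      axis=0)                    # acceleration from all masses
    V = rng.random((n, dim)) * V + A             # stochastic velocity update
    X = X + V

print("best objective:", sphere(X).min())
```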
Recent Advances in Source Localisation Using Range Measurements
2015-10-01
Two formulations of interest are the range least squares (R-LS) and the squared-range least squares (SR-LS). The R-LS-based formulation is of great interest and has been known for its optimal performance; a number of optimization tools may be applied to globally solve the R-LS problem and efficiently compute an R-LS position estimate. Topics covered include range-weighted SR-LS and geolocation using semidefinite programming.
Computational Approaches to Simulation and Optimization of Global Aircraft Trajectories
NASA Technical Reports Server (NTRS)
Ng, Hok Kwan; Sridhar, Banavar
2016-01-01
This study examines three possible approaches to improving the speed of generating wind-optimal routes for air traffic at the national or global level: (a) using the resources of a supercomputer, (b) running the computations on multiple commercially available computers and (c) implementing those same algorithms in NASA's Future ATM Concepts Evaluation Tool (FACET); it compares these to a standard implementation run on a single CPU. Wind-optimal aircraft trajectories are computed using global air traffic schedules. The run time and wait time on the supercomputer for trajectory optimization using various numbers of CPUs, ranging from 80 to 10,240 units, are compared with the total computational time for running the same computation on a single desktop computer and on multiple commercially available computers, to assess the potential computational enhancement through parallel processing on computer clusters. This study also re-implements the trajectory optimization algorithm to further reduce computational time through algorithm modifications, and integrates it with FACET to facilitate the use of the new features, which calculate time-optimal routes between worldwide airport pairs in a wind field for use with existing FACET applications. The implementations of the trajectory optimization algorithms use the MATLAB, Python, and Java programming languages. The performance evaluations are done by comparing computational efficiencies and are based on the potential application of optimized trajectories. The paper shows that, in the absence of special privileges on a supercomputer, a cluster of commercially available computers provides a feasible approach for national and global air traffic system studies.
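Because each city-pair optimization is independent, the parallelization pattern is embarrassingly parallel and easy to sketch. The cost function below is a planar-distance placeholder, not FACET's wind-optimal solver, and the pair counts are invented.

```python
# A minimal sketch: farm independent city-pair route optimizations out to
# worker processes.
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def route_cost(pair):
    """Stand-in for one wind-optimal trajectory optimization."""
    lat1, lon1, lat2, lon2 = pair
    return np.hypot(lat2 - lat1, lon2 - lon1)    # placeholder: planar distance

if __name__ == "__main__":
    rng = np.random.default_rng(10)
    pairs = rng.uniform(-60, 60, size=(1000, 4))  # 1000 airport pairs
    with ProcessPoolExecutor() as pool:           # each pair is independent
        costs = list(pool.map(route_cost, pairs, chunksize=50))
    print("total cost over all pairs:", sum(costs))
```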
Dynamic optimization of chemical processes using ant colony framework.
Rajesh, J; Gupta, K; Kusumakar, H S; Jayaraman, V K; Kulkarni, B D
2001-11-01
The ant colony framework is illustrated by considering the dynamic optimization of six important benchmark examples. This new computational tool is simple to implement and can tackle problems with state as well as terminal constraints in a straightforward fashion. It requires fewer grid points to reach the global optimum at relatively low computational effort. The examples analyzed here, with varying degrees of complexity, illustrate its potential for solving a large class of process optimization problems in chemical engineering.
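Dynamic optimization in this setting means choosing a control level at each time stage; ants sample control sequences from pheromone trails that are reinforced along good solutions. The toy first-order system below is an assumption for illustration, not one of the paper's six benchmarks.

```python
# A minimal ant-colony sketch for a discretized dynamic optimization problem.
import numpy as np

rng = np.random.default_rng(5)
stages, levels = 10, 11
u_grid = np.linspace(0.0, 1.0, levels)           # discrete control levels
tau = np.ones((stages, levels))                  # pheromone trails

def cost(u_idx):                                 # track state x toward target 0.7
    x, J, dt = 0.0, 0.0, 0.5
    for k in range(stages):
        u = u_grid[u_idx[k]]
        x = x + dt * (u - x)                     # first-order state update
        J += (x - 0.7)**2 + 0.05 * u**2
    return J

for it in range(100):
    p = tau / tau.sum(axis=1, keepdims=True)     # selection probabilities
    ants = np.array([[rng.choice(levels, p=p[k]) for k in range(stages)]
                     for _ in range(20)])
    costs = np.array([cost(a) for a in ants])
    tau *= 0.9                                   # evaporation
    best = ants[np.argmin(costs)]
    for k in range(stages):                      # reinforce the best ant's path
        tau[k, best[k]] += 1.0 / (1.0 + costs.min())

print("best cost:", min(cost(a) for a in ants))
```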
Generating community-built tools for data sharing and analysis in environmental networks
Read, Jordan S.; Gries, Corinna; Read, Emily K.; Klug, Jennifer; Hanson, Paul C.; Hipsey, Matthew R.; Jennings, Eleanor; O'Reilley, Catherine; Winslow, Luke A.; Pierson, Don; McBride, Christopher G.; Hamilton, David
2016-01-01
Rapid data growth in many environmental sectors has necessitated tools to manage and analyze these data. The development of tools often lags behind the proliferation of data, however, which may slow exploratory opportunities and scientific progress. The Global Lake Ecological Observatory Network (GLEON) collaborative model supports an efficient and comprehensive data–analysis–insight life cycle, including implementations of data quality control checks, statistical calculations/derivations, models, and data visualizations. These tools are community-built and openly shared. We discuss the network structure that enables tool development and a culture of sharing, leading to optimized output from limited resources. Specifically, data sharing and a flat collaborative structure encourage the development of tools that enable scientific insights from these data. Here we provide a cross-section of scientific advances derived from global-scale analyses in GLEON. We document enhancements to science capabilities made possible by the development of analytical tools and highlight opportunities to expand this framework to benefit other environmental networks.
PopED lite: An optimal design software for preclinical pharmacokinetic and pharmacodynamic studies.
Aoki, Yasunori; Sundqvist, Monika; Hooker, Andrew C; Gennemark, Peter
2016-04-01
Optimal experimental design approaches are seldom used in preclinical drug discovery. The objective is to develop an optimal design software tool specifically designed for preclinical applications in order to increase the efficiency of drug discovery in vivo studies. Several realistic experimental design case studies were collected and many preclinical experimental teams were consulted to determine the design goal of the software tool. The tool obtains an optimized experimental design by solving a constrained optimization problem, where each experimental design is evaluated using some function of the Fisher Information Matrix. The software was implemented in C++ using the Qt framework to assure a responsive user-software interaction through a rich graphical user interface, while at the same time achieving the desired computational speed. In addition, a discrete global optimization algorithm was developed and implemented. The software design goals were simplicity, speed and intuition. Based on these design goals, we have developed the publicly available software PopED lite (http://www.bluetree.me/PopED_lite). Optimization computation was, on average over 14 test problems, 30 times faster in PopED lite compared to an already existing optimal design software tool. PopED lite is now used in real drug discovery projects and a few of these case studies are presented in this paper. PopED lite is designed to be simple, fast and intuitive. Simple, to give many users access to basic optimal design calculations. Fast, to fit a short design-execution cycle and allow interactive experimental design (test one design, discuss the proposed design, test another design, etc.). Intuitive, so that the input to and output from the software tool can easily be understood by users without knowledge of the theory of optimal design. In this way, PopED lite is highly useful in practice and complements existing tools. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
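The Fisher-Information idea is simple to demonstrate: score each candidate design by a function of the FIM (here the determinant, i.e. D-optimality) and search the design space. The one-compartment elimination model, nominal parameters and candidate times below are assumptions; PopED lite handles far richer models, constraints and a dedicated discrete global optimizer.

```python
# A minimal D-optimal design sketch for y = A*exp(-k*t): choose 3 sampling
# times that maximize det(FIM).
import numpy as np
from itertools import combinations

A, k = 10.0, 0.3                                 # nominal parameter values
candidates = np.linspace(0.5, 24.0, 48)          # candidate sampling times (h)

def fim(times):
    t = np.asarray(times)
    J = np.column_stack([np.exp(-k*t),           # dy/dA
                         -A * t * np.exp(-k*t)]) # dy/dk
    return J.T @ J                               # Fisher Information (unit noise)

best = max(combinations(candidates, 3),
           key=lambda ts: np.linalg.det(fim(ts)))
print("D-optimal 3-point design (h):", [round(t, 1) for t in best])
```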
Fan, Mingyi; Hu, Jiwei; Cao, Rensheng; Ruan, Wenqian; Wei, Xionghui
2018-06-01
Water pollution occurs mainly due to inorganic and organic pollutants, such as nutrients, heavy metals and persistent organic pollutants. For the modeling and optimization of pollutant removal, artificial intelligence (AI) has been used as a major tool in experimental design, since it can generate the optimal operational variables and has recently advanced tremendously. The present review describes the fundamentals, advantages and limitations of AI tools. Artificial neural networks (ANNs) are the AI tools most frequently adopted to predict pollutant removal processes because of their capabilities of self-learning and self-adapting, while genetic algorithms (GA) and particle swarm optimization (PSO) are also useful AI methodologies for efficient search for the global optima. This article summarizes the modeling and optimization of pollutant removal processes in water treatment using multilayer perceptron, fuzzy neural, radial basis function and self-organizing map networks. Furthermore, the results conclude that hybrid models of ANNs with GA and PSO can be successfully applied in water treatment with satisfactory accuracy. Finally, the limitations of current AI tools and their new developments are also highlighted for prospective applications in environmental protection. Copyright © 2018 Elsevier Ltd. All rights reserved.
Strategies for global optimization in photonics design.
Vukovic, Ana; Sewell, Phillip; Benson, Trevor M
2010-10-01
This paper reports on two important issues that arise in the context of the global optimization of photonic components where large problem spaces must be investigated. The first is the implementation of a fast simulation method and associated matrix solver for assessing particular designs and the second, the strategies that a designer can adopt to control the size of the problem design space to reduce runtimes without compromising the convergence of the global optimization tool. For this study an analytical simulation method based on Mie scattering and a fast matrix solver exploiting the fast multipole method are combined with genetic algorithms (GAs). The impact of the approximations of the simulation method on the accuracy and runtime of individual design assessments and the consequent effects on the GA are also examined. An investigation of optimization strategies for controlling the design space size is conducted on two illustrative examples, namely, 60° and 90° waveguide bends based on photonic microstructures, and their effectiveness is analyzed in terms of a GA's ability to converge to the best solution within an acceptable timeframe. Finally, the paper describes some particular optimized solutions found in the course of this work.
Optimal water networks in protein cavities with GAsol and 3D-RISM.
Fusani, Lucia; Wall, Ian; Palmer, David; Cortes, Alvaro
2018-06-01
Water molecules in protein binding sites play essential roles in biological processes. The popular 3D-RISM prediction method can calculate the solvent density distribution within minutes, but it is difficult to convert that distribution into explicit water molecules. We present GAsol, a tool that is capable of finding the network of water molecules that best fits a particular 3D-RISM density distribution in a fast and accurate manner and that outperforms other available tools by finding the globally optimal solution thanks to its genetic algorithm. https://github.com/accsc/GAsol. BSD 3-clause license. alvaro.x.cortes@gsk.com. Supplementary data are available at Bioinformatics online.
System Risk Assessment and Allocation in Conceptual Design
NASA Technical Reports Server (NTRS)
Mahadevan, Sankaran; Smith, Natasha L.; Zang, Thomas A. (Technical Monitor)
2003-01-01
As aerospace systems continue to evolve in addressing newer challenges in air and space transportation, there exists a heightened priority for significant improvement in system performance, cost effectiveness, reliability, and safety. Tools, which synthesize multidisciplinary integration, probabilistic analysis, and optimization, are needed to facilitate design decisions allowing trade-offs between cost and reliability. This study investigates tools for probabilistic analysis and probabilistic optimization in the multidisciplinary design of aerospace systems. A probabilistic optimization methodology is demonstrated for the low-fidelity design of a reusable launch vehicle at two levels, a global geometry design and a local tank design. Probabilistic analysis is performed on a high fidelity analysis of a Navy missile system. Furthermore, decoupling strategies are introduced to reduce the computational effort required for multidisciplinary systems with feedback coupling.
Martian resource locations: Identification and optimization
NASA Astrophysics Data System (ADS)
Chamitoff, Gregory; James, George; Barker, Donald; Dershowitz, Adam
2005-04-01
The identification and utilization of in situ Martian natural resources is the key to enable cost-effective long-duration missions and permanent human settlements on Mars. This paper presents a powerful software tool for analyzing Martian data from all sources, and for optimizing mission site selection based on resource collocation. This program, called Planetary Resource Optimization and Mapping Tool (PROMT), provides a wide range of analysis and display functions that can be applied to raw data or imagery. Thresholds, contours, custom algorithms, and graphical editing are some of the various methods that can be used to process data. Output maps can be created to identify surface regions on Mars that meet any specific criteria. The use of this tool for analyzing data, generating maps, and collocating features is demonstrated using data from the Mars Global Surveyor and the Odyssey spacecraft. The overall mission design objective is to maximize a combination of scientific return and self-sufficiency based on utilization of local materials. Landing site optimization involves maximizing accessibility to collocated science and resource features within a given mission radius. Mission types are categorized according to duration, energy resources, and in situ resource utilization. Preliminary optimization results are shown for a number of mission scenarios.
A Cost Impact Assessment Tool for PFS Logistics Consulting.
1997-02-18
Optimization is explored extensively in the mathematical programming literature (Sengupta and Turnbull, 1996; Arntzen, Brown, Harrison, and Trafton, 1995), including Arntzen, Brown, Harrison, and Trafton's work on global supply chain management at Digital Equipment Corporation.
Integration of Linear Dynamic Emission and Climate Models with Air Traffic Simulations
NASA Technical Reports Server (NTRS)
Sridhar, Banavar; Ng, Hok K.; Chen, Neil Y.
2012-01-01
Future air traffic management systems are required to balance the conflicting objectives of maximizing the safety and efficiency of traffic flows while minimizing the climate impact of aviation emissions and contrails. Integrating emission and climate models with air traffic simulations improves the understanding of the complex interaction between the physical climate system, carbon and other greenhouse gas emissions, and aviation activity. This paper integrates a national-level air traffic simulation and optimization capability with simple climate models, carbon cycle models, and climate metrics to assess the impact of aviation on climate. The capability can be used to make trade-offs between extra fuel cost and reduction in global surface temperature change. The parameters in the simulation can be used to evaluate the effect of various uncertainties in emission models and contrails and the impact of different decision horizons. Alternatively, the optimization results from the simulation can be used as inputs to other tools that monetize global climate impacts, like the FAA's Aviation Environmental Portfolio Management Tool for Impacts.
A traveling salesman approach for predicting protein functions.
Johnson, Olin; Liu, Jing
2006-10-12
Protein-protein interaction information can be used to predict unknown protein functions and to help study biological pathways. Here we present a new approach utilizing the classic Traveling Salesman Problem to study protein-protein interactions and to predict protein functions in the budding yeast Saccharomyces cerevisiae. We apply a global optimization tool from combinatorial optimization algorithms to cluster the yeast proteins based on the global protein interaction information. We then use this clustering information to help us predict protein functions. We use our algorithm together with the direct neighbor algorithm [1] on characterized proteins and compare the prediction accuracy of the two methods. We show our algorithm can produce better predictions than the direct neighbor algorithm, which only considers the immediate neighbors of the query protein. Our method is a promising one to be used as a general tool to predict functions of uncharacterized proteins and a successful example of using computer science knowledge and algorithms to study biological problems.
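A minimal sketch of the tour-then-cluster idea: build a tour over proteins with a cheap TSP heuristic, cut the longest tour edges, and treat the resulting segments as clusters. The random distance matrix and the nearest-neighbor heuristic are stand-ins; the paper works on real yeast interaction data with a full combinatorial solver.

```python
# A minimal tour-then-cluster sketch on a synthetic protein distance matrix.
import numpy as np

rng = np.random.default_rng(6)
n = 12
D = rng.uniform(1, 10, (n, n)); D = (D + D.T) / 2; np.fill_diagonal(D, 0)

def nn_tour(D):                                  # nearest-neighbor TSP heuristic
    unvisited, tour = set(range(1, len(D))), [0]
    while unvisited:
        nxt = min(unvisited, key=lambda j: D[tour[-1], j])
        tour.append(nxt); unvisited.remove(nxt)
    return tour

tour = nn_tour(D)
edges = [D[tour[i], tour[i + 1]] for i in range(n - 1)]
cut = np.argsort(edges)[-2:]                     # cut the 2 longest tour edges
clusters, start = [], 0
for c in sorted(cut):
    clusters.append(tour[start:c + 1]); start = c + 1
clusters.append(tour[start:])
print("clusters (proteins grouped by tour proximity):", clusters)
# an uncharacterized protein would inherit the majority function of its cluster
```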
Composite Structure Optimization with Genetic Algorithm
NASA Astrophysics Data System (ADS)
Deslandes, Olivier
2014-06-01
In the frame of optimization studies in the CNES launcher directorate's structure, thermal and materials department, the need for an optimization tool based on metaheuristics and finite element models for composite structural dimensioning was underlined. Indeed, composite structures need complex optimization methodologies in order to be fairly compared to metallic structures with regard to mass, static strength and stiffness constraints (optimization methods for metallic structures being better known). After some bibliographic research, a genetic algorithm coupled with design of experiments to generate the initial population was chosen. Academic functions were used to validate the optimization process, and it was then applied to an industrial study aiming to optimize an interstage skirt with regard to its mass, stiffness and stability (global buckling).
Inverse design of bulk morphologies in block copolymers using particle swarm optimization
NASA Astrophysics Data System (ADS)
Khadilkar, Mihir; Delaney, Kris; Fredrickson, Glenn
Multiblock polymers are a versatile platform for creating a large range of nanostructured materials with novel morphologies and properties. However, achieving desired structures or property combinations is difficult due to a vast design space comprised of parameters including monomer species, block sequence, block molecular weights and dispersity, copolymer architecture, and binary interaction parameters. Navigating through such vast design spaces to achieve an optimal formulation for a target structure or property set requires an efficient global optimization tool wrapped around a forward simulation technique such as self-consistent field theory (SCFT). We report on such an inverse design strategy utilizing particle swarm optimization (PSO) as the global optimizer and SCFT as the forward prediction engine. To avoid metastable states in forward prediction, we utilize pseudo-spectral variable cell SCFT initiated from a library of defect free seeds of known block copolymer morphologies. We demonstrate that our approach allows for robust identification of block copolymers and copolymer alloys that self-assemble into a targeted structure, optimizing parameters such as block fractions, blend fractions, and Flory chi parameters.
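Since SCFT is the expensive forward engine here, a toy objective can stand in for it to show the PSO wrapper itself. The mismatch function, the [0, 1] design box and the swarm constants below are assumptions; only the velocity/position update is standard PSO.

```python
# A minimal particle swarm optimization sketch; the objective stands in for
# the distance between a predicted and a target self-assembled structure.
import numpy as np

rng = np.random.default_rng(7)

def mismatch(x):          # stand-in for an SCFT forward simulation + comparison
    return np.sum((x - np.array([0.3, 0.7]))**2)

n, dim, iters = 20, 2, 100
X = rng.uniform(0, 1, (n, dim)); V = np.zeros((n, dim))
pbest = X.copy(); pbest_f = np.array([mismatch(x) for x in X])
gbest = pbest[pbest_f.argmin()].copy()

for _ in range(iters):
    r1, r2 = rng.random((n, dim)), rng.random((n, dim))
    V = 0.7 * V + 1.5 * r1 * (pbest - X) + 1.5 * r2 * (gbest - X)
    X = np.clip(X + V, 0, 1)                     # keep block fractions in [0, 1]
    f = np.array([mismatch(x) for x in X])
    improved = f < pbest_f                       # update personal bests
    pbest[improved], pbest_f[improved] = X[improved], f[improved]
    gbest = pbest[pbest_f.argmin()].copy()       # update global best

print("best design parameters:", gbest)
```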
Optimization of Microelectronic Devices for Sensor Applications
NASA Technical Reports Server (NTRS)
Cwik, Tom; Klimeck, Gerhard
2000-01-01
The NASA/JPL goal to reduce payload in future space missions while increasing mission capability demands miniaturization of active and passive sensors, analytical instruments and communication systems, among others. Currently, typical system requirements include the detection of particular spectral lines, associated data processing, and communication of the acquired data to other systems. Advances in lithography and deposition methods result in more advanced devices for space application, while the sub-micron resolution currently available opens a vast design space. Though an experimental exploration of this widening design space, searching for optimized performance by repeated fabrication efforts, is unfeasible, it does motivate the development of reliable software design tools. These tools necessitate models based on the fundamental physics and mathematics of the device to accurately model effects such as diffraction and scattering in opto-electronic devices, or bandstructure and scattering in heterostructure devices. The software tools must have convenient turn-around times and interfaces that allow effective usage. The first issue is addressed by the application of high-performance computers and the second by the development of graphical user interfaces driven by properly developed data structures. These tools can then be integrated into an optimization environment, and with the available memory capacity and computational speed of high-performance parallel platforms, simulation of optimized components can proceed. In this paper, specific applications of the electromagnetic modeling of infrared filtering, as well as heterostructure device design, will be presented using genetic algorithm global optimization methods.
A global carbon assimilation system based on a dual optimization method
NASA Astrophysics Data System (ADS)
Zheng, H.; Li, Y.; Chen, J. M.; Wang, T.; Huang, Q.; Huang, W. X.; Wang, L. H.; Li, S. M.; Yuan, W. P.; Zheng, X.; Zhang, S. P.; Chen, Z. Q.; Jiang, F.
2015-02-01
Ecological models are effective tools for simulating the distribution of global carbon sources and sinks. However, these models often suffer from substantial biases due to inaccurate simulations of complex ecological processes. We introduce a set of scaling factors (parameters) to an ecological model on the basis of plant functional type (PFT) and latitudes. A global carbon assimilation system (GCAS-DOM) is developed by employing a dual optimization method (DOM) to invert the time-dependent ecological model parameter state and the net carbon flux state simultaneously. We use GCAS-DOM to estimate the global distribution of the CO2 flux on 1° × 1° grid cells for the period from 2001 to 2007. Results show that land and ocean absorb -3.63 ± 0.50 and -1.82 ± 0.16 Pg C yr-1, respectively. North America, Europe and China contribute -0.98 ± 0.15, -0.42 ± 0.08 and -0.20 ± 0.29 Pg C yr-1, respectively. The uncertainties in the flux after optimization by GCAS-DOM have been remarkably reduced by more than 60%. Through parameter optimization, GCAS-DOM can provide improved estimates of the carbon flux for each PFT. Coniferous forest (-0.97 ± 0.27 Pg C yr-1) is the largest contributor to the global carbon sink. Fluxes of once-dominant deciduous forest generated by the Boreal Ecosystems Productivity Simulator (BEPS) are reduced to -0.78 ± 0.23 Pg C yr-1, the third largest carbon sink.
An R package for the design, analysis and operation of reservoir systems
NASA Astrophysics Data System (ADS)
Turner, Sean; Ng, Jia Yi; Galelli, Stefano
2016-04-01
We present a new R package - named "reservoir" - which has been designed for rapid and easy routing of runoff through storage. The package comprises well-established tools for capacity design (e.g., the sequent peak algorithm), performance analysis (storage-yield-reliability and reliability-resilience-vulnerability analysis) and release policy optimization (Stochastic Dynamic Programming). Operating rules can be optimized for water supply, flood control and amenity objectives, as well as for maximum hydropower production. Storage-depth-area relationships are in-built, allowing users to incorporate evaporation from the reservoir surface. We demonstrate the capabilities of the software for global studies using thousands of reservoirs from the Global Reservoir and Dam (GRanD) database fed by historical monthly inflow time series from a 0.5 degree gridded global runoff dataset. The package is freely available through the Comprehensive R Archive Network (CRAN).
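Of the in-built tools mentioned, the sequent peak algorithm for capacity design is the simplest to sketch. The synthetic monthly inflows and constant draft below are assumptions for illustration; the package itself is in R, while this sketch follows the rest of this compilation's examples in Python.

```python
# A minimal sequent-peak sketch: size the active storage needed to meet a
# constant draft from a monthly inflow series.
import numpy as np

rng = np.random.default_rng(8)
inflow = np.maximum(rng.normal(100, 40, 120), 0.0)   # monthly inflow volumes
demand = 90.0                                        # constant monthly draft

def sequent_peak(inflow, demand):
    k, k_max = 0.0, 0.0
    for q in inflow:
        k = max(k + demand - q, 0.0)   # cumulative deficit since last spill
        k_max = max(k_max, k)
    return k_max                       # required active storage capacity

print("required storage:", round(sequent_peak(inflow, demand), 1))
```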
NASA Technical Reports Server (NTRS)
Krasteva, Denitza T.
1998-01-01
Multidisciplinary design optimization (MDO) for large-scale engineering problems poses many challenges (e.g., the design of an efficient concurrent paradigm for global optimization based on disciplinary analyses, expensive computations over vast data sets, etc.). This work focuses on the application of distributed schemes for massively parallel architectures to MDO problems, as a tool for reducing computation time and solving larger problems. The specific problem considered here is configuration optimization of a high speed civil transport (HSCT), and the efficient parallelization of the embedded paradigm for reasonable design space identification. Two distributed dynamic load balancing techniques (random polling and global round robin with message combining) and two necessary termination detection schemes (global task count and token passing) were implemented and evaluated in terms of effectiveness and scalability to large problem sizes and a thousand processors. The effect of certain parameters on execution time was also inspected. Empirical results demonstrated stable performance and effectiveness for all schemes, and the parametric study showed that the selected algorithmic parameters have a negligible effect on performance.
Cascade Optimization Strategy for Aircraft and Air-Breathing Propulsion System Concepts
NASA Technical Reports Server (NTRS)
Patnaik, Surya N.; Lavelle, Thomas M.; Hopkins, Dale A.; Coroneos, Rula M.
1996-01-01
Design optimization for subsonic and supersonic aircraft and for air-breathing propulsion engine concepts has been accomplished by soft-coupling the Flight Optimization System (FLOPS) and the NASA Engine Performance Program analyzer (NEPP) to the NASA Lewis multidisciplinary optimization tool COMETBOARDS. Aircraft and engine design problems, with their associated constraints and design variables, were cast as nonlinear optimization problems with aircraft weight and engine thrust as the respective merit functions. Because of the diversity of constraint types and the overall distortion of the design space, the most reliable single optimization algorithm available in COMETBOARDS could not produce a satisfactory feasible optimum solution. Some of COMETBOARDS' unique features, which include a cascade strategy, variable and constraint formulations, and scaling devised especially for difficult multidisciplinary applications, successfully optimized the performance of both aircraft and engines. The cascade method has two principal steps: in the first, the solution initiates from a user-specified design and optimizer; in the second, the optimum design obtained in the first step, with some random perturbation, is used to begin the next specified optimizer. The second step is repeated for a specified sequence of optimizers or until a successful solution of the problem is achieved. A successful solution should satisfy the specified convergence criteria and have several active constraints but no violated constraints. The cascade strategy available in the combined COMETBOARDS, FLOPS, and NEPP design tool converges to the same global optimum solution even when it starts from different design points. This reliable and robust design tool eliminates manual intervention in the design of aircraft and of air-breathing propulsion engines, and it eases the cycle analysis procedures. The combined code is also much easier to use, which is an added benefit. This paper describes COMETBOARDS and its cascade strategy and illustrates the capability of the combined design tool through the optimization of a subsonic aircraft and a high-bypass-turbofan wave-rotor-topped engine.
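A minimal sketch of a cascade strategy in this spirit, chaining SciPy optimizers and perturbing each stage's optimum before handing it to the next; it is not COMETBOARDS code, and the optimizer sequence, perturbation size, and test function are illustrative only:

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative cascade: run a sequence of optimizers, feeding each the
# randomly perturbed optimum of the previous stage.

def cascade(objective, x0, methods=("Nelder-Mead", "Powell", "BFGS"),
            perturbation=0.05, seed=0):
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for method in methods:
        result = minimize(objective, x, method=method)
        # Perturb the stage optimum before starting the next optimizer.
        x = result.x * (1.0 + perturbation * rng.standard_normal(result.x.size))
    return result

rosen = lambda x: 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2
print(cascade(rosen, x0=[-1.2, 1.0]).x)  # should approach [1, 1]
```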
Huang, Si-Da; Shang, Cheng; Zhang, Xiao-Jie; Liu, Zhi-Pan
2017-09-01
While the underlying potential energy surface (PES) determines the structure and other properties of a material, it has been frustrating to predict new materials from theory even with the advent of supercomputing facilities. The accuracy of the PES and the efficiency of PES sampling are two major bottlenecks, not least because of the great complexity of the material PES. This work introduces a "Global-to-Global" approach for material discovery by combining, for the first time, a global optimization method with neural network (NN) techniques. The novel global optimization method, named the stochastic surface walking (SSW) method, is carried out massively in parallel to generate a global training data set, the fitting of which by an atom-centered NN produces a multi-dimensional global PES; the subsequent SSW exploration of large systems with the analytical NN PES can provide key information on the thermodynamic and kinetic stability of unknown phases identified from global PESs. We describe in detail the current implementation of the SSW-NN method, with a particular focus on the size of the global data set and the simultaneous energy/force/stress NN training procedure. An important functional material, TiO2, is used as an example to demonstrate the automated global data set generation, the improved NN training procedure, and the application in material discovery. Two new TiO2 porous crystal structures are identified, which have thermodynamic stability similar to the common TiO2 rutile phase, and the kinetic stability of one of them is further confirmed by SSW pathway sampling. As a general tool for material simulation, the SSW-NN method provides an efficient and predictive platform for large-scale computational material screening.
Van Derlinden, E; Bernaerts, K; Van Impe, J F
2010-05-21
Optimal experiment design for parameter estimation (OED/PE) has become a popular tool for efficient and accurate estimation of kinetic model parameters. When the kinetic model under study involves multiple parameters, different optimization strategies can be constructed. The most straightforward approach is to estimate all parameters simultaneously from one optimal experiment (single OED/PE strategy). However, due to the complexity of the optimization problem or stringent limitations on the system's dynamics, the experimental information can be limited and parameter estimation convergence problems can arise. As an alternative, we propose to reduce the optimization problem to a series of two-parameter estimation problems, i.e., an optimal experiment is designed for a combination of two parameters while presuming the other parameters known. Two different approaches can be followed: (i) all two-parameter optimal experiments are designed based on identical initial parameter estimates and the parameters are estimated simultaneously from all resulting experimental data (global OED/PE strategy), and (ii) optimal experiments are calculated and implemented sequentially, whereby the parameter values are updated intermediately (sequential OED/PE strategy). This work exploits OED/PE for the identification of the Cardinal Temperature Model with Inflection (CTMI) (Rosso et al., 1993). This kinetic model describes the effect of temperature on the microbial growth rate and involves four parameters. The three OED/PE strategies are considered and the impact of the design strategy on the accuracy of the CTMI parameter estimation is evaluated. Based on a simulation study, it is observed that the parameter values derived from the sequential approach deviate more from the true parameters than the single and global strategy estimates. The single and global OED/PE strategies are further compared based on experimental data obtained from design implementation in a bioreactor. Comparable estimates are obtained, but global OED/PE estimates are, in general, more accurate and reliable. Copyright (c) 2010 Elsevier Ltd. All rights reserved.
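The sketch below illustrates the basic OED/PE ingredient, D-optimal selection of sampling times for a two-parameter model, by maximizing the determinant of the Fisher information matrix; the exponential-decay model and candidate grid are stand-ins, not the CTMI model used in the paper:

```python
import numpy as np
from itertools import combinations

# Hedged sketch of D-optimal experiment design for a two-parameter model
# y(t) = A * exp(-k t): pick sampling times maximizing det(J^T J).

def jacobian(times, A, k):
    # Sensitivities dy/dA and dy/dk.
    e = np.exp(-k * times)
    return np.column_stack([e, -A * times * e])

def d_optimal_design(candidates, n_samples, A=1.0, k=0.5):
    best_det, best_design = -np.inf, None
    for design in combinations(candidates, n_samples):
        J = jacobian(np.array(design), A, k)
        det = np.linalg.det(J.T @ J)       # D-optimality criterion
        if det > best_det:
            best_det, best_design = det, design
    return best_design, best_det

design, det = d_optimal_design(np.linspace(0.1, 10.0, 25), n_samples=4)
print(design, det)
```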
Mars Mission Optimization Based on Collocation of Resources
NASA Technical Reports Server (NTRS)
Chamitoff, G. E.; James, G. H.; Barker, D. C.; Dershowitz, A. L.
2003-01-01
This paper presents a powerful approach for analyzing Martian data and for optimizing mission site selection based on resource collocation. This approach is implemented in a program called PROMT (Planetary Resource Optimization and Mapping Tool), which provides a wide range of analysis and display functions that can be applied to raw data or imagery. Thresholds, contours, custom algorithms, and graphical editing are some of the various methods that can be used to process data. Output maps can be created to identify surface regions on Mars that meet any specific criteria. The use of this tool for analyzing data, generating maps, and collocating features is demonstrated using data from the Mars Global Surveyor and the Odyssey spacecraft. The overall mission design objective is to maximize a combination of scientific return and self-sufficiency based on utilization of local materials. Landing site optimization involves maximizing accessibility to collocated science and resource features within a given mission radius. Mission types are categorized according to duration, energy resources, and in-situ resource utilization. Optimization results are shown for a number of mission scenarios.
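A toy version of the thresholding and collocation operations described above, using synthetic grids in place of Mars data layers (the thresholds, layer names, and dilation radius are invented):

```python
import numpy as np
from scipy.ndimage import binary_dilation

# Toy illustration of threshold-and-collocate map logic; the arrays stand
# in for gridded Mars data layers.

rng = np.random.default_rng(0)
slope = rng.uniform(0, 30, size=(100, 100))     # degrees, synthetic
hydrogen = rng.uniform(0, 1, size=(100, 100))   # normalized abundance

safe_landing = slope < 5.0                      # threshold criterion
water_rich = hydrogen > 0.8                     # resource criterion

# Collocation: safe-landing cells within ~3 cells of a water-rich cell.
near_water = binary_dilation(water_rich, iterations=3)
candidate_sites = safe_landing & near_water
print(candidate_sites.sum(), "candidate grid cells")
```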
NASA Astrophysics Data System (ADS)
Chu, J.; Zhang, C.; Fu, G.; Li, Y.; Zhou, H.
2015-08-01
This study investigates the effectiveness of a sensitivity-informed method for multi-objective operation of reservoir systems, which uses global sensitivity analysis as a screening tool to reduce computational demands. Sobol's method is used to screen insensitive decision variables and guide the formulation of the optimization problems with a significantly reduced number of decision variables. This sensitivity-informed method dramatically reduces the computational demands required for attaining high-quality approximations of optimal trade-off relationships between conflicting design objectives. The search results obtained from the reduced complexity multi-objective reservoir operation problems are then used to pre-condition the full search of the original optimization problem. In two case studies, the Dahuofang reservoir and the inter-basin multi-reservoir system in Liaoning province, China, sensitivity analysis results show that reservoir performance is strongly controlled by a small proportion of decision variables. Sensitivity-informed dimension reduction and pre-conditioning are evaluated in their ability to improve the efficiency and effectiveness of multi-objective evolutionary optimization. Overall, this study illustrates the efficiency and effectiveness of the sensitivity-informed method and the use of global sensitivity analysis to inform dimension reduction of optimization problems when solving complex multi-objective reservoir operation problems.
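A minimal sketch of the screening step using the SALib package (assumed installed); the toy response stands in for a reservoir simulation, and decision variables with small total-order indices would be fixed before the multi-objective search:

```python
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

# Sobol screening sketch: variable names, bounds, and the "model" are
# placeholders for a real reservoir operation simulation.

problem = {
    "num_vars": 4,
    "names": ["release1", "release2", "release3", "release4"],
    "bounds": [[0.0, 1.0]] * 4,
}

X = saltelli.sample(problem, 1024)                  # Sobol sampling
Y = X[:, 0] ** 2 + 0.1 * X[:, 1] + 0.001 * X[:, 2]  # toy response
Si = sobol.analyze(problem, Y)

for name, st in zip(problem["names"], Si["ST"]):
    print(f"{name}: total-order index = {st:.3f}")
```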
Aeroelastic Optimization Study Based on the X-56A Model
NASA Technical Reports Server (NTRS)
Li, Wesley W.; Pak, Chan-Gi
2014-01-01
One way to increase the aircraft fuel efficiency is to reduce structural weight while maintaining adequate structural airworthiness, both statically and aeroelastically. A design process which incorporates the object-oriented multidisciplinary design, analysis, and optimization (MDAO) tool and the aeroelastic effects of high fidelity finite element models to characterize the design space was successfully developed and established. This paper presents two multidisciplinary design optimization studies using an object-oriented MDAO tool developed at NASA Armstrong Flight Research Center. The first study demonstrates the use of aeroelastic tailoring concepts to minimize the structural weight while meeting the design requirements including strength, buckling, and flutter. Such an approach exploits the anisotropic capabilities of the fiber composite materials chosen for this analytical exercise with ply stacking sequence. A hybrid and discretization optimization approach improves accuracy and computational efficiency of a global optimization algorithm. The second study presents a flutter mass balancing optimization study for the fabricated flexible wing of the X-56A model since a desired flutter speed band is required for the active flutter suppression demonstration during flight testing. The results of the second study provide guidance to modify the wing design and move the design flutter speeds back into the flight envelope so that the original objective of X-56A flight test can be accomplished successfully. The second case also demonstrates that the object-oriented MDAO tool can handle multiple analytical configurations in a single optimization run.
Characterizing L1-norm best-fit subspaces
NASA Astrophysics Data System (ADS)
Brooks, J. Paul; Dulá, José H.
2017-05-01
Fitting affine objects to data is the basis of many tools and methodologies in statistics, machine learning, and signal processing. The L1 norm is often employed to produce subspaces exhibiting a robustness to outliers and faulty observations. The L1-norm best-fit subspace problem is directly formulated as a nonlinear, nonconvex, and nondifferentiable optimization problem. The case when the subspace is a hyperplane can be solved to global optimality efficiently by solving a series of linear programs. The problem of finding the best-fit line has recently been shown to be NP-hard. We present necessary conditions for optimality for the best-fit subspace problem, and use them to characterize properties of optimal solutions.
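For the hyperplane case, the series-of-linear-programs idea can be illustrated with a least-absolute-deviations fit via scipy.optimize.linprog; this is a simplified single-response sketch, not the authors' full formulation over all coordinate choices:

```python
import numpy as np
from scipy.optimize import linprog

# L1 (least absolute deviations) hyperplane fit as one linear program:
# variables are the coefficients beta (free) plus one residual bound
# e_i >= |x_i beta - y_i| per point; minimize sum(e).

def l1_hyperplane(X, y):
    n, d = X.shape
    c = np.concatenate([np.zeros(d), np.ones(n)])    # minimize sum(e)
    A_ub = np.block([[X, -np.eye(n)],                #  X b - e <=  y
                     [-X, -np.eye(n)]])              # -X b - e <= -y
    b_ub = np.concatenate([y, -y])
    bounds = [(None, None)] * d + [(0, None)] * n
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[:d], res.fun                        # coefficients, L1 error

rng = np.random.default_rng(0)
X = np.column_stack([rng.normal(size=50), np.ones(50)])  # slope + intercept
y = 2.0 * X[:, 0] + 1.0 + rng.laplace(scale=0.3, size=50)
beta, err = l1_hyperplane(X, y)
print(beta)   # should be near [2.0, 1.0]
```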
A global carbon assimilation system based on a dual optimization method
NASA Astrophysics Data System (ADS)
Zheng, H.; Li, Y.; Chen, J. M.; Wang, T.; Huang, Q.; Huang, W. X.; Li, S. M.; Yuan, W. P.; Zheng, X.; Zhang, S. P.; Chen, Z. Q.; Jiang, F.
2014-10-01
Ecological models are effective tools to simulate the distribution of global carbon sources and sinks. However, these models often suffer from substantial biases due to inaccurate simulations of complex ecological processes. We introduce a set of scaling factors (parameters) to an ecological model on the basis of plant functional type (PFT) and latitudes. A global carbon assimilation system (GCAS-DOM) is developed by employing a Dual Optimization Method (DOM) to invert the time-dependent ecological model parameter state and the net carbon flux state simultaneously. We use GCAS-DOM to estimate the global distribution of the CO2 flux on 1° × 1° grid cells for the period from 2000 to 2007. Results show that land and ocean absorb -3.69 ± 0.49 Pg C year-1 and -1.91 ± 0.16 Pg C year-1, respectively. North America, Europe and China contribute -0.96 ± 0.15 Pg C year-1, -0.42 ± 0.08 Pg C year-1 and -0.21 ± 0.28 Pg C year-1, respectively. The uncertainties in the flux after optimization by GCAS-DOM have been remarkably reduced by more than 60%. Through parameter optimization, GCAS-DOM can provide improved estimates of the carbon flux for each PFT. Coniferous forest (-0.97 ± 0.27 Pg C year-1) is the largest contributor to the global carbon sink. Fluxes of once-dominant deciduous forest generated by BEPS are reduced to -0.79 ± 0.22 Pg C year-1, making it the third largest carbon sink.
NASA Astrophysics Data System (ADS)
de Pascale, P.; Vasile, M.; Casotto, S.
The design of interplanetary trajectories requires the solution of an optimization problem, which has traditionally been solved by resorting to various local optimization techniques. All such approaches, apart from the specific method employed (direct or indirect), require an initial guess, which deeply influences the convergence to the optimal solution. Recent developments in low-thrust propulsion have widened the perspectives of exploration of the Solar System, while at the same time increasing the difficulty of the trajectory design process. Continuous-thrust transfers, typically characterized by multiple spiraling arcs, have a large number of design parameters, and thanks to the flexibility offered by such engines they typically turn out to be characterized by a multi-modal domain, with a consequently larger number of optimal solutions. Thus the definition of first guesses is even more challenging, particularly for a broad search over the design parameters, and it requires an extensive investigation of the domain in order to locate the largest number of optimal candidate solutions and possibly the global optimum. In this paper a tool for the preliminary definition of interplanetary transfers with coast-thrust arcs and multiple swing-bys is presented. This goal is achieved by combining a novel methodology for the description of low-thrust arcs with a global optimization algorithm based on a hybridization of an evolutionary step and a deterministic step. Low-thrust arcs are described in a 3D model, in order to account for the beneficial effects of low-thrust propulsion on changes of inclination, by resorting to a new methodology based on an inverse method. The two-point boundary value problem (TPBVP) associated with a thrust arc is solved by imposing a properly parameterized evolution of the orbital parameters, from which the acceleration required to follow the given trajectory, subject to the constraint set, is obtained through simple algebraic computation. By this method a low-thrust transfer satisfying the boundary conditions on position and velocity can be quickly assessed, with low computational effort, since no numerical propagation is required. The hybrid global optimization algorithm consists of two steps: through the evolutionary search a large number of optima, and eventually the global one, are located, while the deterministic step consists of a branching process that exhaustively partitions the domain in order to obtain an extensive characterization of such a complex space of solutions. Furthermore, the approach implements a novel direct constraint-handling technique allowing the treatment of the mixed-integer nonlinear programming (MINLP) problems typical of multiple swing-by trajectories. A low-thrust transfer to Mars is studied as a test bed for the low-thrust model, presenting the main characteristics of the different shapes proposed and the features of the possible sub-arc segmentations between two planets with respect to different objective functions: minimum-time and minimum-fuel transfers. Various other test cases are also shown and further optimized, demonstrating the capability of the proposed tool.
NASA Astrophysics Data System (ADS)
Toker, C.; Gokdag, Y. E.; Arikan, F.; Arikan, O.
2012-04-01
The ionosphere is a very important part of space weather. Modeling and monitoring of ionospheric variability is a major part of satellite communication, navigation, and positioning systems. Total Electron Content (TEC), defined as the line integral of the electron density along a ray path, is one of the parameters used to investigate ionospheric variability. Dual-frequency GPS receivers, with their worldwide availability and efficiency in TEC estimation, have become a major source of global and regional TEC modeling. When the Global Ionospheric Maps (GIM) of International GPS Service (IGS) centers (http://iono.jpl.nasa.gov/gim.html) are investigated, it can be observed that the regional ionosphere along the midlatitude regions can be modeled as a constant, linear, or quadratic surface. Globally, especially around the magnetic equator, the TEC surfaces resemble twisted and dispersed single-centered or double-centered Gaussian functions. Particle Swarm Optimization (PSO) has proven itself a fast-converging and effective optimization tool in diverse fields. Yet, in order to apply this optimization technique to TEC modeling, the method has to be modified for higher efficiency and accuracy in the extraction of geophysical parameters such as the model parameters of TEC surfaces. In this study, a modified PSO (mPSO) method is applied to regional and global synthetic TEC surfaces. The synthetic surfaces, which represent the trend and small-scale variability of various ionospheric states, are necessary to compare the performance of mPSO over the number of iterations, the accuracy of parameter estimation, and overall surface reconstruction. The Cramer-Rao bounds for each surface type and model are also investigated, and the performance of mPSO is tested with respect to these bounds. For global models, the sample points used in the optimization are obtained using the IGS receiver network. For regional TEC models, regional networks such as the Turkish National Permanent GPS Network (TNPGN-Active) receiver sites are used. The regional TEC models are grouped into constant (one parameter), linear (two parameters), and quadratic (six parameters) surfaces which are functions of latitude and longitude. Global models require seven parameters for a single-centered Gaussian and 13 parameters for a double-centered Gaussian function. The error criterion is the normalized percentage error for both the surface and the parameters. It is observed that mPSO is very successful in the parameter extraction of various regional and global models. The normalized reconstruction error varies from 10-4 for constant surfaces to 10-3 for quadratic surfaces in regional models, sampled with regional networks. Even for the case of a severe geomagnetic storm that affects measurements globally, with the IGS network the reconstruction error is on the order of 10-1, even though individual parameters have higher normalized errors. The modified PSO technique proved itself to be a useful tool for the parameter extraction of more complicated TEC models. This study is supported by TUBITAK EEEAG under Grant No: 109E055.
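For orientation, a standard particle swarm optimizer is sketched below and used to recover the coefficients of a planar TEC-like trend from synthetic data; the paper's mPSO adds problem-specific modifications that are not reproduced here:

```python
import numpy as np

# Textbook PSO for continuous minimization; illustrative only.

def pso(f, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds).T
    dim = lo.size
    x = rng.uniform(lo, hi, size=(n_particles, dim))    # positions
    v = np.zeros_like(x)                                # velocities
    pbest, pbest_f = x.copy(), np.apply_along_axis(f, 1, x)
    g = pbest[pbest_f.argmin()].copy()                  # global best
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        fx = np.apply_along_axis(f, 1, x)
        improved = fx < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], fx[improved]
        g = pbest[pbest_f.argmin()].copy()
    return g, pbest_f.min()

# Toy use: recover constant + linear trend coefficients, as in the
# regional (linear-surface) models above; data are synthetic.
rng = np.random.default_rng(1)
lat, lon = rng.uniform(-1, 1, (2, 100))
tec = 10 + 2 * lat - 3 * lon + 0.01 * rng.normal(size=100)
err = lambda p: np.mean((p[0] + p[1] * lat + p[2] * lon - tec) ** 2)
print(pso(err, bounds=[(-20, 20)] * 3)[0])
```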
Leal, Miguel Costa; Pimentel, Tânia; Ricardo, Fernando; Rosa, Rui; Calado, Ricardo
2015-06-01
Market globalization and recurring food safety alerts have resulted in a growing consumer awareness of the need for food traceability. This is particularly relevant for seafood due to its perishable nature and importance as a key protein source for the world's population. Here, we provide an overview of the current needs for seafood origin traceability, along with the limitations and challenges for its implementation. We focus on geochemical, biochemical, and molecular tools and how they should be optimized to be implemented globally and to address our societal needs. We suggest that seafood traceability is key to enforcing food safety regulations and fisheries control, combating fraud, and fulfilling present and future expectations of conscientious producers, consumers, and authorities. Copyright © 2015 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Chu, J. G.; Zhang, C.; Fu, G. T.; Li, Y.; Zhou, H. C.
2015-04-01
This study investigates the effectiveness of a sensitivity-informed method for multi-objective operation of reservoir systems, which uses global sensitivity analysis as a screening tool to reduce the computational demands. Sobol's method is used to screen insensitive decision variables and guide the formulation of the optimization problems with a significantly reduced number of decision variables. This sensitivity-informed problem decomposition dramatically reduces the computational demands required for attaining high quality approximations of optimal tradeoff relationships between conflicting design objectives. The search results obtained from the reduced complexity multi-objective reservoir operation problems are then used to pre-condition the full search of the original optimization problem. In two case studies, the Dahuofang reservoir and the inter-basin multi-reservoir system in Liaoning province, China, sensitivity analysis results show that reservoir performance is strongly controlled by a small proportion of decision variables. Sensitivity-informed problem decomposition and pre-conditioning are evaluated in their ability to improve the efficiency and effectiveness of multi-objective evolutionary optimization. Overall, this study illustrates the efficiency and effectiveness of the sensitivity-informed method and the use of global sensitivity analysis to inform problem decomposition when solving the complex multi-objective reservoir operation problems.
Modeling the Environmental Impact of Air Traffic Operations
NASA Technical Reports Server (NTRS)
Chen, Neil
2011-01-01
There is increased interest in understanding and mitigating the impacts of air traffic on the climate, since greenhouse gases, nitrogen oxides, and contrails generated by air traffic can have adverse climate effects. The models described in this presentation are useful for quantifying these impacts and for studying alternative environmentally aware operational concepts. These models have been developed by leveraging and building upon existing simulation and optimization techniques developed for the design of efficient traffic flow management strategies. Specific enhancements to the existing simulation and optimization techniques include new models that simulate aircraft fuel flow, emissions, and contrails. To ensure that these new models are beneficial to the larger climate research community, their outputs are compatible with existing global climate modeling tools like the FAA's Aviation Environmental Design Tool.
ABCluster: the artificial bee colony algorithm for cluster global optimization.
Zhang, Jun; Dolg, Michael
2015-10-07
Global optimization of cluster geometries is of fundamental importance in chemistry and an interesting problem in applied mathematics. In this work, we introduce a relatively new swarm intelligence algorithm, i.e., the artificial bee colony (ABC) algorithm proposed in 2005, to this field. It is inspired by the foraging behavior of a bee colony, and only three parameters are needed to control it. We applied it to several potential functions of quite different nature, i.e., the Coulomb-Born-Mayer, Lennard-Jones, Morse, Z and Gupta potentials. The benchmarks reveal that for long-ranged potentials the ABC algorithm is very efficient in locating the global minimum, while for short-ranged ones it is sometimes trapped in a local minimum funnel on the potential energy surface of large clusters. We have released an efficient, user-friendly, and free program "ABCluster" to realize the ABC algorithm. It is a black-box program for non-experts as well as experts and might become a useful tool for chemists to study clusters.
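A compact, illustrative ABC loop for continuous minimization appears below; ABCluster itself is a compiled program, and only the three control parameters the abstract alludes to (colony size, abandonment limit, and cycle count) are mirrored here:

```python
import numpy as np

# Minimal artificial bee colony (ABC) sketch; not ABCluster's code.

def abc_minimize(f, bounds, n_food=20, limit=30, cycles=500, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds).T
    dim = lo.size
    x = rng.uniform(lo, hi, (n_food, dim))
    fx = np.array([f(xi) for xi in x])
    trials = np.zeros(n_food, dtype=int)

    def try_neighbor(i):
        k = rng.integers(n_food)        # random partner source
        j = rng.integers(dim)           # random dimension
        cand = x[i].copy()
        cand[j] += rng.uniform(-1, 1) * (x[i, j] - x[k, j])
        cand = np.clip(cand, lo, hi)
        fc = f(cand)
        if fc < fx[i]:                  # greedy selection
            x[i], fx[i], trials[i] = cand, fc, 0
        else:
            trials[i] += 1

    for _ in range(cycles):
        for i in range(n_food):                         # employed bees
            try_neighbor(i)
        fit = 1.0 / (1.0 + fx - fx.min())               # onlooker weights
        p = fit / fit.sum()
        for i in rng.choice(n_food, size=n_food, p=p):  # onlooker bees
            try_neighbor(i)
        worn = trials > limit                           # scouts: abandon
        x[worn] = rng.uniform(lo, hi, (worn.sum(), dim))
        fx[worn] = np.array([f(xi) for xi in x[worn]])
        trials[worn] = 0
    best = fx.argmin()
    return x[best], fx[best]

sphere = lambda v: float(np.sum(v ** 2))
print(abc_minimize(sphere, [(-5, 5)] * 4))
```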
2013-01-01
The Global Fund is experiencing increased pressure to optimize results and improve its impact per dollar spent. It is also in transition from a provider of emergency funding to a long-term, sustainable financing mechanism. This paper assesses the efficacy of current Global Fund investment and examines how health technology assessments (HTAs) can be used to provide guidance on the relative priority of health interventions currently subsidized by the Global Fund. In addition, this paper identifies areas where the application of HTAs can exert the greatest impact and proposes ways in which this tool could be incorporated, as a routine component, into application, decision, implementation, and monitoring and evaluation processes. Finally, it addresses the challenges facing the Global Fund in realizing the full potential of HTAs. PMID:23965222
Analysis and Design of Rotors at Ultra-Low Reynolds Numbers
NASA Technical Reports Server (NTRS)
Kunz, Peter J.; Strawn, Roger C.
2003-01-01
Design tools have been developed for ultra-low Reynolds number rotors, combining enhanced actuator-ring / blade-element theory with airfoil section data based on two-dimensional Navier-Stokes calculations. This performance prediction method is coupled with an optimizer for both design and analysis applications. Performance predictions from these tools have been compared with three-dimensional Navier Stokes analyses and experimental data for a 2.5 cm diameter rotor with chord Reynolds numbers below 10,000. Comparisons among the analyses and experimental data show reasonable agreement both in the global thrust and power required, but the spanwise distributions of these quantities exhibit significant deviations. The study also reveals that three-dimensional and rotational effects significantly change local airfoil section performance. The magnitude of this issue, unique to this operating regime, may limit the applicability of blade-element type methods for detailed rotor design at ultra-low Reynolds numbers, but these methods are still useful for evaluating concept feasibility and rapidly generating initial designs for further analysis and optimization using more advanced tools.
NASA Astrophysics Data System (ADS)
Hassan, Rania A.
In the design of complex large-scale spacecraft systems that involve a large number of components and subsystems, many specialized state-of-the-art design tools are employed to optimize the performance of various subsystems. However, there is no structured system-level concept-architecting process. Currently, spacecraft design is heavily based on the heritage of the industry. Old spacecraft designs are modified to adapt to new mission requirements, and feasible solutions---rather than optimal ones---are often all that is achieved. During the conceptual phase of the design, the choices available to designers are predominantly discrete variables describing major subsystems' technology options and redundancy levels. The complexity of spacecraft configurations makes the number of the system design variables that need to be traded off in an optimization process prohibitive when manual techniques are used. Such a discrete problem is well suited for solution with a Genetic Algorithm, which is a global search technique that performs optimization-like tasks. This research presents a systems engineering framework that places design requirements at the core of the design activities and transforms the design paradigm for spacecraft systems to a top-down approach rather than the current bottom-up approach. To facilitate decision-making in the early phases of the design process, the population-based search nature of the Genetic Algorithm is exploited to provide computationally inexpensive---compared to the state-of-the-practice---tools for both multi-objective design optimization and design optimization under uncertainty. In terms of computational cost, those tools are nearly on the same order of magnitude as that of standard single-objective deterministic Genetic Algorithm. The use of a multi-objective design approach provides system designers with a clear tradeoff optimization surface that allows them to understand the effect of their decisions on all the design objectives under consideration simultaneously. Incorporating uncertainties avoids large safety margins and unnecessary high redundancy levels. The focus on low computational cost for the optimization tools stems from the objective that improving the design of complex systems should not be achieved at the expense of a costly design methodology.
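A toy genetic algorithm over discrete subsystem options, in the spirit of the framework described above; the option tables and merit function are invented placeholders, not spacecraft data:

```python
import random

# Toy GA over discrete technology options per subsystem; illustrative only.

OPTIONS = {                 # (mass_kg, reliability) per technology option
    "power":    [(50, 0.90), (65, 0.95), (80, 0.99)],
    "comm":     [(20, 0.92), (30, 0.97)],
    "attitude": [(15, 0.90), (25, 0.96), (35, 0.99)],
}
KEYS = list(OPTIONS)

def fitness(genome):
    mass = sum(OPTIONS[k][g][0] for k, g in zip(KEYS, genome))
    rel = 1.0
    for k, g in zip(KEYS, genome):
        rel *= OPTIONS[k][g][1]
    return rel - 0.002 * mass          # reward reliability, penalize mass

def evolve(pop_size=40, generations=60, p_mut=0.1, seed=0):
    rng = random.Random(seed)
    rand_genome = lambda: [rng.randrange(len(OPTIONS[k])) for k in KEYS]
    pop = [rand_genome() for _ in range(pop_size)]
    for _ in range(generations):
        def parent():                   # binary tournament selection
            a, b = rng.sample(pop, 2)
            return a if fitness(a) > fitness(b) else b
        children = []
        while len(children) < pop_size:
            pa, pb = parent(), parent()
            cut = rng.randrange(1, len(KEYS))       # one-point crossover
            child = pa[:cut] + pb[cut:]
            child = [rng.randrange(len(OPTIONS[k])) if rng.random() < p_mut
                     else g for k, g in zip(KEYS, child)]  # mutation
            children.append(child)
        pop = children
    return max(pop, key=fitness)

print(evolve())
```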
Global surgery: current evidence for improving surgical care.
Fuller, Jennifer C; Shaye, David A
2017-08-01
The field of global surgery is undergoing rapid transformation, owing to several recent prominent reports positioning it as a cost-effective means of relieving global disease burden. The purpose of this article is to review the recent advances in the field of global surgery. Efforts to grow the global surgical workforce and procedural capacity have focused on innovative methods to increase surgeon training, enhance international collaboration, leverage technology, optimize existing health systems, and safely implement task-sharing. Computer modeling offers a novel means of informing policy to optimize timely access to care, equitably promote health and financial protection, and efficiently grow infrastructure. Tools and checklists have recently been developed to enhance data collection and ensure methodologically rigorous publications to inform planning, benchmark surgical systems, promote accurate modeling, track key health indicators, and promote safety. Creation of institutional partnerships and trainee exchanges can enrich training, stimulate commitment to humanitarian work, and promote the equal exchange of ideas and expertise. The recent body of work creates a strong foundation upon which work toward the goal of universal access to safe, affordable surgical care can be built; however, further collection and analysis of country-specific data is necessary for accurate modeling and outcomes research into the efficacy of policies such as task-sharing is greatly needed.
Selection of Thermal Worst-Case Orbits via Modified Efficient Global Optimization
NASA Technical Reports Server (NTRS)
Moeller, Timothy M.; Wilhite, Alan W.; Liles, Kaitlin A.
2014-01-01
Efficient Global Optimization (EGO) was used to select orbits with worst-case hot and cold thermal environments for the Stratospheric Aerosol and Gas Experiment (SAGE) III. The SAGE III system thermal model had changed substantially since the previous selection of worst-case orbits (which did not use the EGO method), so the selections were revised to ensure that the worst cases are being captured. The EGO method consists of first conducting an initial set of parametric runs, generated with a space-filling Design of Experiments (DoE) method, then fitting a surrogate model to the data and searching for points of maximum Expected Improvement (EI) at which to conduct additional runs. The general EGO method was modified by using a multi-start optimizer to identify multiple new test points at each iteration. This modification facilitates parallel computing and decreases the burden of user interaction when the optimizer code is not integrated with the model. Thermal worst-case orbits for SAGE III were successfully identified and shown by direct comparison to be more severe than those identified in the previous selection. The EGO method is a useful tool for this application and can result in computational savings if the initial DoE is selected appropriately.
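The expected-improvement step at the heart of EGO can be sketched with a scikit-learn Gaussian process surrogate; the 1-D toy function stands in for the SAGE III thermal model, and the single-start argmax below omits the paper's multi-start modification:

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor

# Minimal EGO loop maximizing a toy "hot case" metric; illustrative only.

def expected_improvement(gp, X_cand, y_best):
    mu, sigma = gp.predict(X_cand, return_std=True)
    sigma = np.maximum(sigma, 1e-12)
    z = (mu - y_best) / sigma
    return (mu - y_best) * norm.cdf(z) + sigma * norm.pdf(z)

f = lambda x: np.sin(3 * x) + 0.5 * x            # stand-in for the model
X = np.array([[0.2], [1.1], [2.3], [2.9]])       # initial DoE runs
y = f(X).ravel()

for _ in range(10):                              # EGO iterations
    gp = GaussianProcessRegressor(alpha=1e-6).fit(X, y)  # jitter for stability
    cand = np.linspace(0, 3, 301).reshape(-1, 1)
    ei = expected_improvement(gp, cand, y.max())
    x_new = cand[np.argmax(ei)]                  # point of maximum EI
    X = np.vstack([X, [x_new]])
    y = np.append(y, f(x_new)[0])

print(X[np.argmax(y)], y.max())
```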
NASA Astrophysics Data System (ADS)
Le-Duc, Thang; Ho-Huu, Vinh; Nguyen-Thoi, Trung; Nguyen-Quoc, Hung
2016-12-01
In recent years, various types of magnetorheological brakes (MRBs) have been proposed and optimized by different optimization algorithms integrated in commercial software such as ANSYS and Comsol Multiphysics. However, many of these optimization algorithms possess noteworthy shortcomings, such as trapping solutions at local extrema, limits on the number of design variables, or difficulty in dealing with discrete design variables. Thus, to overcome these limitations and develop an efficient computational tool for the optimal design of MRBs, this paper proposes an optimization procedure that combines differential evolution (DE), a gradient-free global optimization method, with finite element analysis (FEA). The proposed approach is then applied to the optimal design of MRBs with different configurations, including conventional MRBs and MRBs with coils placed on the side housings. Moreover, to approach a real-life design, some design variables of the MRBs are treated as discrete variables in the optimization process. The obtained optimal designs are compared with available optimal designs in the literature. The results reveal that the proposed method outperforms some traditional approaches.
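A sketch of DE on a mixed discrete/continuous design vector using SciPy's differential_evolution, with the common workaround of rounding discrete variables inside the objective; the "torque" surrogate is made up and does not represent an FEA model:

```python
import numpy as np
from scipy.optimize import differential_evolution

# DE with one discrete variable handled by rounding inside the objective.
# The torque expression is a toy surrogate, not a magnetic-circuit model.

def objective(x):
    n_turns = round(x[0])          # discrete: coil turns
    gap = x[1]                     # continuous: gap width (mm)
    radius = x[2]                  # continuous: rotor radius (mm)
    torque = n_turns * radius ** 2 / (1.0 + 10.0 * gap)
    return -torque + 0.001 * n_turns ** 2      # maximize torque, penalize coil

bounds = [(50, 400), (0.5, 2.0), (20, 60)]
result = differential_evolution(objective, bounds, seed=1, tol=1e-8)
print(round(result.x[0]), result.x[1:], -result.fun)
```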
NASA Technical Reports Server (NTRS)
Pi, Xiaoqing; Mannucci, Anthony J.; Verkhoglyadova, Olga P.; Stephens, Philip; Wilson, Brian D.; Akopian, Vardan; Komjathy, Attila; Iijima, Byron A.
2013-01-01
ISOGAME is designed and developed to assess quantitatively the impact of new observation systems on the capability of imaging and modeling the ionosphere. With ISOGAME, one can perform observation system simulation experiments (OSSEs). A typical OSSE using ISOGAME would involve: (1) simulating various ionospheric conditions on global scales; (2) simulating ionospheric measurements made from a constellation of low-Earth-orbiters (LEOs), particularly Global Navigation Satellite System (GNSS) radio occultation data, and from ground-based global GNSS networks; (3) conducting ionospheric data assimilation experiments with the Global Assimilative Ionospheric Model (GAIM); and (4) analyzing modeling results with visualization tools. ISOGAME can provide quantitative assessment of the accuracy of assimilative modeling with the observation system of interest. Observation systems other than those based on GNSS can also be analyzed. The system is composed of a suite of software that combines the GAIM, including a 4D first-principles ionospheric model and data assimilation modules, the International Reference Ionosphere (IRI) model developed by international ionospheric research communities, an observation simulator, visualization software, and orbit design, simulation, and optimization software. The core GAIM model used in ISOGAME is based on the GAIM++ code (written in C++) that includes a new high-fidelity geomagnetic field representation (multi-dipole). New visualization tools and analysis algorithms for the OSSEs are now part of ISOGAME.
Enhancing Polyhedral Relaxations for Global Optimization
ERIC Educational Resources Information Center
Bao, Xiaowei
2009-01-01
During the last decade, global optimization has attracted a lot of attention due to the increased practical need for obtaining global solutions and the success in solving many global optimization problems that were previously considered intractable. In general, the central question of global optimization is to find an optimal solution to a given…
The mechanisms of labor division from the perspective of individual optimization
NASA Astrophysics Data System (ADS)
Zhu, Lirong; Chen, Jiawei; Di, Zengru; Chen, Liujun; Liu, Yan; Stanley, H. Eugene
2017-12-01
Although the tools of complexity research have been applied to the phenomenon of labor division, its underlying mechanisms are still unclear. Researchers have used evolutionary models to study labor division in terms of global optimization, but focusing on individual optimization is a more realistic, real-world approach. We do this by first developing a multi-agent model that takes into account information-sharing and learning-by-doing and by using simulations to demonstrate the emergence of labor division. We then use a master equation method and find that the computational results are consistent with the results of the simulation. Finally we find that the core underlying mechanisms that cause labor division are learning-by-doing, information cost, and random fluctuation.
Parasail: SIMD C library for global, semi-global, and local pairwise sequence alignments.
Daily, Jeff
2016-02-10
Sequence alignment algorithms are a key component of many bioinformatics applications. Though various fast Smith-Waterman local sequence alignment implementations have been developed for x86 CPUs, most are embedded into larger database search tools. In addition, fast implementations of Needleman-Wunsch global sequence alignment and its semi-global variants are not as widespread. This article presents the first software library for local, global, and semi-global pairwise intra-sequence alignments and improves the performance of previous intra-sequence implementations. A faster intra-sequence local pairwise alignment implementation is described and benchmarked, including new global and semi-global variants. Using a 375-residue query sequence, a speed of 136 billion cell updates per second (GCUPS) was achieved on a dual Intel Xeon E5-2670 24-core processor system, the highest reported for an implementation based on Farrar's 'striped' approach. Rognes's SWIPE optimal database search application is still generally the fastest available, at 1.2 to at best 2.4 times faster than Parasail for sequences shorter than 500 amino acids; however, Parasail was faster for longer sequences. For global alignments, Parasail's prefix scan implementation is generally the fastest, faster even than Farrar's 'striped' approach; however, the opal library is faster for single-threaded applications. The software library is designed for 64-bit Linux, OS X, or Windows on processors with SSE2, SSE41, or AVX2. Source code is available from https://github.com/jeffdaily/parasail under the Battelle BSD-style license. Applications that require optimal alignment scores could benefit from the improved performance. For the first time, SIMD global, semi-global, and local alignments are available in a stand-alone C library.
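Since Parasail is a C library, the snippet below instead shows the textbook Needleman-Wunsch global-alignment recursion in Python, purely to illustrate what a global alignment score is; Parasail's vectorized, affine-gap implementations are far more elaborate:

```python
# Textbook Needleman-Wunsch global alignment scoring (linear gap penalty).
# Illustrative only; not Parasail code.

def nw_score(a, b, match=2, mismatch=-1, gap=-2):
    rows, cols = len(a) + 1, len(b) + 1
    H = [[0] * cols for _ in range(rows)]
    for i in range(1, rows):          # leading gaps in b
        H[i][0] = i * gap
    for j in range(1, cols):          # leading gaps in a
        H[0][j] = j * gap
    for i in range(1, rows):
        for j in range(1, cols):
            diag = H[i-1][j-1] + (match if a[i-1] == b[j-1] else mismatch)
            H[i][j] = max(diag, H[i-1][j] + gap, H[i][j-1] + gap)
    return H[-1][-1]

print(nw_score("GATTACA", "GCATGCU"))
```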
A globally complete map of supraglacial debris cover and a new toolkit for debris cover research
NASA Astrophysics Data System (ADS)
Herreid, Sam; Pellicciotti, Francesca
2017-04-01
A growing canon of literature is focused on resolving the processes and implications of debris cover on glaciers. However, this work is often confined to a handful of glaciers that were likely selected based on criteria optimizing their suitability to test a specific hypothesis, or on logistical ease. The role of debris cover in a glacier system is unlikely to be overlooked in forthcoming research, yet the magnitude of this role at a global scale has not yet been fully described. Here, we present a map of debris cover for all glacierized regions on Earth, including the Greenland Ice Sheet, using 30 m Landsat data. This dataset will begin to open a wider context for the high-quality, localized findings from the debris-covered glacier research community and help inform large-scale modeling efforts. A global map of debris cover also facilitates analysis attempting to isolate first-order geomorphological and climate controls on supraglacial debris production. Furthering the objective of expanding the inclusion of debris cover in forthcoming research, we also present a suite of open-source, Python-based tools that is under development. Using minimal and often freely available input data, the tools automate the mapping of: i) debris cover, ii) ice cliffs, iii) debris cover evolution over the Landsat era, and iv) glacier flow instabilities from altered debris structures. At present, debris extent is the only globally complete quantity, but with the expanding repository of high-quality global datasets and further tool development minimizing manual tasks and computational cost, we foresee all of these tools being applied globally in the near future.
A tool for simulating parallel branch-and-bound methods
NASA Astrophysics Data System (ADS)
Golubeva, Yana; Orlov, Yury; Posypkin, Mikhail
2016-01-01
The Branch-and-Bound method is known as one of the most powerful but very resource-consuming global optimization methods. Parallel and distributed computing can efficiently cope with this issue. The major difficulty in the parallel B&B method is the need for dynamic load redistribution; therefore, the design and study of load balancing algorithms is a separate and very important research topic. This paper presents a tool for simulating the parallel Branch-and-Bound method. The simulator allows one to run load balancing algorithms with various numbers of processors, search-tree sizes, and supercomputer interconnect characteristics, thereby fostering deep study of load distribution strategies. The process of resolving the optimization problem by the B&B method is replaced by a stochastic branching process. Data exchanges are modeled using the concept of logical time. The user-friendly graphical interface to the simulator provides efficient visualization and convenient performance analysis.
NASA Astrophysics Data System (ADS)
Wang, H.; Chen, H.; Chen, X.; Wu, Q.; Wang, Z.
2016-12-01
The Global Nested Air Quality Prediction Modeling System for Hg (GNAQPMS-Hg) is a global chemical transport model coupled with a mercury transport module to investigate mercury pollution. In this study, we present our work porting the GNAQPMS model to the Intel Xeon Phi processor Knights Landing (KNL) to accelerate the model. KNL is the second-generation product of the Many Integrated Core (MIC) architecture. Compared with the first-generation Knights Corner (KNC), KNL has new hardware features and can be used as a standalone processor as well as a coprocessor alongside other CPUs. Using the Vtune tool, the high-overhead modules in the GNAQPMS model were identified, including the CBMZ gas chemistry, the advection and convection module, and the wet deposition module. These modules were accelerated by optimizing the code and exploiting new KNL capabilities. The following optimization measures were taken: 1) changing the pure MPI parallel mode to a hybrid MPI/OpenMP mode; 2) vectorizing the code to use the 512-bit wide vector computation unit; 3) reducing unnecessary memory access and calculation; 4) reducing Thread Local Storage (TLS) for common variables within each OpenMP thread in CBMZ; and 5) changing global communication from file writing and reading to MPI functions. After optimization, the performance of GNAQPMS increased greatly on both the CPU and KNL platforms; single-node tests showed a 2.6x speedup of the optimized version on a two-socket CPU platform and a 3.3x speedup on a one-socket KNL platform relative to the baseline code, meaning the KNL node achieved a 1.29x speedup over the two-socket CPU platform.
Component-based integration of chemistry and optimization software.
Kenny, Joseph P; Benson, Steven J; Alexeev, Yuri; Sarich, Jason; Janssen, Curtis L; McInnes, Lois Curfman; Krishnan, Manojkumar; Nieplocha, Jarek; Jurrus, Elizabeth; Fahlstrom, Carl; Windus, Theresa L
2004-11-15
Typical scientific software designs make rigid assumptions regarding programming language and data structures, frustrating software interoperability and scientific collaboration. Component-based software engineering is an emerging approach to managing the increasing complexity of scientific software. Component technology facilitates code interoperability and reuse. Through the adoption of methodology and tools developed by the Common Component Architecture Forum, we have developed a component architecture for molecular structure optimization. Using the NWChem and Massively Parallel Quantum Chemistry packages, we have produced chemistry components that provide capacity for energy and energy derivative evaluation. We have constructed geometry optimization applications by integrating the Toolkit for Advanced Optimization, Portable Extensible Toolkit for Scientific Computation, and Global Arrays packages, which provide optimization and linear algebra capabilities. We present a brief overview of the component development process and a description of abstract interfaces for chemical optimizations. The components conforming to these abstract interfaces allow the construction of applications using different chemistry and mathematics packages interchangeably. Initial numerical results for the component software demonstrate good performance, and highlight potential research enabled by this platform.
Han, Dianwei; Zhang, Jun; Tang, Guiliang
2012-01-01
An accurate prediction of the pre-microRNA secondary structure is important in miRNA informatics. Based on a recently proposed model for predicting RNA secondary structure, nucleotide cyclic motifs (NCM), we propose and implement a Modified NCM (MNCM) model with a physics-based scoring strategy to tackle the problem of pre-microRNA folding. Our microRNAfold is implemented using a globally optimal algorithm built bottom-up from locally optimal solutions. Our experimental results show that microRNAfold outperforms the current leading prediction tools in terms of True Negative rate, False Negative rate, Specificity, and the Matthews correlation coefficient.
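As a minimal cousin of such folding algorithms, the bottom-up dynamic program below maximizes base pairs in the Nussinov style; real predictors, including NCM-based ones, use much richer scoring than simple pair counts:

```python
# Nussinov-style bottom-up dynamic program maximizing base pairs.
# Illustrative only; real tools use thermodynamic or NCM scoring.

def nussinov_pairs(seq, min_loop=3):
    pairs = {("A", "U"), ("U", "A"), ("G", "C"),
             ("C", "G"), ("G", "U"), ("U", "G")}
    n = len(seq)
    M = [[0] * n for _ in range(n)]
    for span in range(min_loop + 1, n):        # build up from short spans
        for i in range(n - span):
            j = i + span
            best = max(M[i + 1][j], M[i][j - 1])         # i or j unpaired
            if (seq[i], seq[j]) in pairs:
                best = max(best, M[i + 1][j - 1] + 1)    # i pairs with j
            for k in range(i + 1, j):                    # bifurcation
                best = max(best, M[i][k] + M[k + 1][j])
            M[i][j] = best
    return M[0][n - 1]

print(nussinov_pairs("GGGAAAUCC"))
```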
Swarm intelligence metaheuristics for enhanced data analysis and optimization.
Hanrahan, Grady
2011-09-21
The swarm intelligence (SI) computing paradigm has proven itself as a comprehensive means of solving complicated analytical chemistry problems by emulating biologically-inspired processes. As global optimum search metaheuristics, associated algorithms have been widely used in training neural networks, function optimization, prediction and classification, and in a variety of process-based analytical applications. The goal of this review is to provide readers with critical insight into the utility of swarm intelligence tools as methods for solving complex chemical problems. Consideration will be given to algorithm development, ease of implementation and model performance, detailing subsequent influences on a number of application areas in the analytical, bioanalytical and detection sciences.
Adekanmbi, Oluwole; Olugbara, Oludayo; Adeyemo, Josiah
2014-01-01
This paper presents annual multiobjective crop-mix planning as a problem of concurrently maximizing net profit and crop production to determine an optimal cropping pattern. The optimal crop production in a particular planting season is a crucial decision-making task from the perspectives of economic management and sustainable agriculture. A multiobjective optimal crop-mix problem is formulated and solved using the generalized differential evolution 3 (GDE3) metaheuristic to generate a globally optimal solution. The performance of the GDE3 metaheuristic is investigated by comparing its results with those obtained using the epsilon-constrained and nondominated sorting genetic algorithms, two representatives of the state of the art in evolutionary optimization. The performance metrics of additive epsilon, generational distance, inverted generational distance, and spacing are considered to establish comparability. In addition, a graphical comparison with respect to the true Pareto front for the multiobjective optimal crop-mix planning problem is presented. Empirical results generally show GDE3 to be a viable alternative tool for solving a multiobjective optimal crop-mix planning problem.
Optimizing the Attitude Control of Small Satellite Constellations for Rapid Response Imaging
NASA Astrophysics Data System (ADS)
Nag, S.; Li, A.
2016-12-01
Distributed Space Missions (DSMs), such as formation flight and constellations, are being recognized as important solutions to increase measurement samples over space and time. Given the increasingly accurate attitude control systems emerging in the commercial market, small spacecraft now have the ability to slew and point within a few minutes' notice. Despite hardware development for CubeSat payloads (e.g., NASA InVEST) and subsystems (e.g., Blue Canyon Technologies), software development for tradespace analysis in constellation design (e.g., Goddard's TAT-C), planning and scheduling for single spacecraft (e.g., GEO-CAPE), and aerial flight path optimization for UAVs (e.g., NASA Sensor Web), there is a gap in open-source, open-access software tools for planning and scheduling distributed satellite operations in terms of pointing and observing targets. This paper demonstrates results from a tool being developed for scheduling pointing operations of narrow field-of-view (FOV) sensors over a mission lifetime to maximize metrics such as global coverage and revisit statistics. Past research has shown the need for at least fourteen satellites to cover the Earth globally every day using a Landsat-like sensor. Increasing the FOV threefold reduces the need to four satellites, but adds image distortion and BRDF complexities to the observed reflectance. If narrow-FOV sensors on a small satellite constellation were commanded using robust algorithms to slew dynamically, they could cover the global landmass in a coordinated fashion much faster, without compromising spatial resolution or incurring BRDF effects. Our algorithm to optimize constellation satellite pointing is based on a dynamic programming approach under the constraints of orbital mechanics and existing attitude control systems for small satellites. As a case study for the algorithm, we minimize the time required to cover the 17000 Landsat images with maximum signal-to-noise-ratio fall-off and minimum image distortion among the satellites, using Landsat's specifications. Attitude-specific constraints such as power consumption, response time, and stability were factored into the optimality computations. The algorithm can integrate cloud cover predictions, specific ground and air assets, and angular constraints.
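A skeletal dynamic program over discrete time steps and pointing angles, echoing the approach described above; the rewards, slew limit, and grid sizes are invented placeholders:

```python
import numpy as np

# Toy DP for a pointing schedule: at each time step choose an angle bin,
# reachable bins are limited by a slew-rate constraint, and the cumulative
# coverage reward is maximized. All numbers are illustrative.

T, A = 50, 21                            # time steps, discrete angle bins
rng = np.random.default_rng(0)
reward = rng.random((T, A))              # coverage value of angle a at t
max_slew = 2                             # max angle-bin change per step

value = np.full((T, A), -np.inf)
value[0] = reward[0]
choice = np.zeros((T, A), dtype=int)
for t in range(1, T):
    for a in range(A):
        lo, hi = max(0, a - max_slew), min(A, a + max_slew + 1)
        prev = value[t - 1, lo:hi]       # reachable predecessor angles
        best = int(np.argmax(prev))
        value[t, a] = prev[best] + reward[t, a]
        choice[t, a] = lo + best

# Backtrack the optimal pointing schedule.
schedule = [int(np.argmax(value[-1]))]
for t in range(T - 1, 0, -1):
    schedule.append(int(choice[t, schedule[-1]]))
print(list(reversed(schedule)))
```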
Synthesis: Deriving a Core Set of Recommendations to Optimize Diabetes Care on a Global Scale.
Mechanick, Jeffrey I; Leroith, Derek
2015-01-01
Diabetes afflicts 382 million people worldwide, with increasing prevalence rates and adverse effects on health, well-being, and society in general. There are many drivers for the complex presentation of diabetes, including environmental and genetic/epigenetic factors. The aim was to synthesize a core set of recommendations from information from 14 countries that can be used to optimize diabetes care on a global scale. Information from 14 papers in this special issue of Annals of Global Health was reviewed, analyzed, and sorted to synthesize recommendations. PubMed was searched for relevant studies on diabetes and global health. Key findings are as follows: (1) Population-based transitions distinguish region-specific diabetes care; (2) biological drivers for diabetes differ among various populations and need to be clarified scientifically; (3) principal resource availability determines quality-of-care metrics; and (4) governmental involvement, independent of economic barriers, improves the contextualization of diabetes care. Core recommendations are as follows: (1) Each nation should assess region-specific epidemiology, the scientific evidence base, and population-based transitions to establish risk-stratified guidelines for diagnosis and therapeutic interventions; (2) each nation should establish a public health imperative to provide tools and funding to successfully implement these guidelines; and (3) each nation should commit to education and research to optimize recommendations for a durable effect. Systematic acquisition of information about diabetes care can be analyzed, extrapolated, and then used to provide a core set of actionable recommendations that may be further studied and implemented to improve diabetes care on a global scale. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
A Framework for Optimizing the Placement of Tidal Turbines
NASA Astrophysics Data System (ADS)
Nelson, K. S.; Roberts, J.; Jones, C.; James, S. C.
2013-12-01
Power generation with marine hydrokinetic (MHK) current energy converters (CECs), often in the form of underwater turbines, is receiving growing global interest. Because of reasonable investment, maintenance, reliability, and environmental friendliness, this technology can contribute to national (and global) energy markets and is worthy of research investment. Furthermore, in remote areas, small-scale MHK energy from river, tidal, or ocean currents can provide a local power supply. However, little is known about the potential environmental effects of CEC operation in coastal embayments, estuaries, or rivers, or of the cumulative impacts of these devices on aquatic ecosystems over years or decades of operation. There is an urgent need for practical, accessible tools and peer-reviewed publications to help industry and regulators evaluate environmental impacts and mitigation measures, while establishing best siting and design practices. Sandia National Laboratories (SNL) and Sea Engineering, Inc. (SEI) have investigated the potential environmental impacts and performance of individual tidal energy converters (TECs) in Cobscook Bay, ME; TECs are a subset of CECs that are specifically deployed in tidal channels. Cobscook Bay is the first deployment location of Ocean Renewable Power Company's (ORPC) TidGen™ unit. One unit is currently in place, with four more to follow. Together, SNL and SEI built a coarse-grid, regional-scale model that included Cobscook Bay and all other landward embayments using the modeling platform SNL-EFDC. Within SNL-EFDC, tidal turbines are represented using a unique set of momentum extraction, turbulence generation, and turbulence dissipation equations at TEC locations. The global model was then coupled to a local-scale model centered on the proposed TEC deployment locations. An optimization framework was developed that used the refined model to determine optimal device placements that maximize array performance. Within the framework, environmental effects are considered to minimize the possibility of altering flows to an extent that would affect fish-swimming behavior and sediment-transport trends. Simulation results were compared between model runs with the optimized array configuration and the originally proposed deployment locations; the optimized array showed a 17% increase in power generation. The developed framework can provide regulators and developers with a tool for assessing environmental impacts and device-performance parameters for the deployment of MHK devices. The more thoroughly this promising technology is understood, the more likely it is to become a viable source of alternative energy.
OpenMDAO: Framework for Flexible Multidisciplinary Design, Analysis and Optimization Methods
NASA Technical Reports Server (NTRS)
Heath, Christopher M.; Gray, Justin S.
2012-01-01
The OpenMDAO project is underway at NASA to develop a framework which simplifies the implementation of state-of-the-art tools and methods for multidisciplinary design, analysis and optimization. Foremost, OpenMDAO has been designed to handle variable problem formulations, encourage reconfigurability, and promote model reuse. This work demonstrates the concept of iteration hierarchies in OpenMDAO to achieve a flexible environment for supporting advanced optimization methods which include adaptive sampling and surrogate modeling techniques. In this effort, two efficient global optimization methods were applied to solve a constrained single-objective and a constrained multiobjective version of a joint aircraft/engine sizing problem. The aircraft model, NASA's next-generation advanced single-aisle civil transport, is being studied as part of the Subsonic Fixed Wing project to help meet simultaneous program goals for reduced fuel burn, emissions, and noise. This analysis serves as a realistic test problem to demonstrate the flexibility and reconfigurability offered by OpenMDAO.
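As a flavor of the problem formulations OpenMDAO supports, here is a minimal sketch using the current `openmdao.api` (which has evolved considerably since the 2012-era framework described above); the paraboloid objective is a cheap stand-in for the coupled aircraft/engine analysis.

```python
import openmdao.api as om

# Minimal problem: minimize a paraboloid over design variables (x, y)
prob = om.Problem()
prob.model.add_subsystem(
    'parab', om.ExecComp('f = (x - 3.0)**2 + x*y + (y + 4.0)**2 - 3.0'),
    promotes=['*'])

prob.driver = om.ScipyOptimizeDriver(optimizer='SLSQP')
prob.model.add_design_var('x', lower=-50.0, upper=50.0)
prob.model.add_design_var('y', lower=-50.0, upper=50.0)
prob.model.add_objective('f')

prob.setup()
prob.set_val('x', 0.0)
prob.set_val('y', 0.0)
prob.run_driver()
print(prob.get_val('x'), prob.get_val('y'), prob.get_val('f'))  # ~ (6.67, -7.33, -27.33)
```

Surrogate-assisted global optimization of the kind described in the abstract would swap the driver and wrap the expensive analyses, but the problem declaration pattern is the same.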
IPMP Global Fit - A one-step direct data analysis tool for predictive microbiology.
Huang, Lihan
2017-12-04
The objective of this work is to develop and validate a unified optimization algorithm for performing one-step global regression analysis of isothermal growth and survival curves for determination of kinetic parameters in predictive microbiology. The algorithm is incorporated with user-friendly graphical user interfaces (GUIs) to develop a data analysis tool, the USDA IPMP-Global Fit. The GUIs are designed to guide users to easily navigate through the data analysis process and properly select the initial parameters for different combinations of mathematical models. The software performs one-step kinetic analysis to directly construct tertiary models by minimizing the global error between the experimental observations and the mathematical models. The current version of the software is specifically designed for constructing tertiary models with time and temperature as the independent variables. The software is tested with a total of 9 different combinations of primary and secondary models for growth and survival of various microorganisms. The results of data analysis show that this software provides accurate estimates of kinetic parameters. In addition, it can be used to improve the experimental design and data collection for more accurate estimation of kinetic parameters. IPMP-Global Fit can be used in combination with the regular USDA-IPMP for solving the inverse problems and developing tertiary models in predictive microbiology. Published by Elsevier B.V.
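To illustrate the one-step (global) regression idea: rather than fitting each isothermal curve separately and then regressing the rate parameters against temperature, all curves are fitted at once against the combined primary/secondary model. The log-linear primary model and Bigelow-type secondary model below are illustrative choices, not necessarily the ones shipped with IPMP Global Fit.

```python
import numpy as np
from scipy.optimize import least_squares

# Primary: log10 N(t) = log10 N0 - t / D(T); secondary (Bigelow): log10 D(T) = log10 Dref + (Tref - T) / z
def model(params, t, T, Tref=60.0):
    logN0, logDref, z = params
    logD = logDref + (Tref - T) / z
    return logN0 - t / 10.0 ** logD

def residuals(params, t, T, logN):
    return model(params, t, T) - logN

# Synthetic isothermal survival data at three temperatures (stand-in for experiments)
rng = np.random.default_rng(1)
t = np.tile(np.linspace(0, 10, 8), 3)
T = np.repeat([55.0, 60.0, 65.0], 8)
true = (8.0, 0.5, 6.0)                     # log10 N0, log10 Dref, z-value
logN = model(true, t, T) + rng.normal(0, 0.1, t.size)

fit = least_squares(residuals, x0=(7.0, 0.0, 5.0), args=(t, T, logN))
print("estimated (log10 N0, log10 Dref, z):", fit.x)
```

Fitting all temperatures jointly uses every observation to constrain the shared secondary-model parameters, which is the source of the accuracy gain the abstract reports.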
The Global Hidden Hunger Indices and Maps: An Advocacy Tool for Action
Muthayya, Sumithra; Rah, Jee Hyun; Sugimoto, Jonathan D.; Roos, Franz F.; Kraemer, Klaus; Black, Robert E.
2013-01-01
The unified global efforts to mitigate the high burden of vitamin and mineral deficiency, known as hidden hunger, in populations around the world are crucial to the achievement of most of the Millennium Development Goals (MDGs). We developed indices and maps of global hidden hunger to help prioritize program assistance, and to serve as an evidence-based global advocacy tool. Two types of hidden hunger indices and maps were created based on i) national prevalence data on stunting, anemia due to iron deficiency, and low serum retinol levels among preschool-aged children in 149 countries; and ii) estimates of Disability Adjusted Life Years (DALYs) attributed to micronutrient deficiencies in 136 countries. A number of countries in sub-Saharan Africa, as well as India and Afghanistan, had an alarmingly high level of hidden hunger, with stunting, iron deficiency anemia, and vitamin A deficiency all being highly prevalent. The total DALY rates per 100,000 population, attributed to micronutrient deficiencies, were generally the highest in sub-Saharan African countries. In 36 countries, home to 90% of the world’s stunted children, deficiencies of micronutrients were responsible for 1.5-12% of the total DALYs. The pattern and magnitude of iodine deficiency did not conform to that of other micronutrients. The greatest proportions of children with iodine deficiency were in the Eastern Mediterranean (46.6%), European (44.2%), and African (40.4%) regions. The current indices and maps provide crucial data to optimize the prioritization of program assistance addressing global multiple micronutrient deficiencies. Moreover, the indices and maps serve as a useful advocacy tool in the call for increased commitments to scale up effective nutrition interventions. PMID:23776712
A Tool for Conditions Tag Management in ATLAS
NASA Astrophysics Data System (ADS)
Sharmazanashvili, A.; Batiashvili, G.; Gvaberidze, G.; Shekriladze, L.; Formica, A.; Atlas Collaboration
2014-06-01
ATLAS Conditions data include about 2 TB in a relational database and 400 GB of files referenced from the database. Conditions data are entered and retrieved using COOL, the API for accessing data in the LCG Conditions Database infrastructure. They are managed using an ATLAS-customized Python-based tool set. Conditions data are required for every reconstruction and simulation job, so access to them is crucial for all aspects of ATLAS data taking and analysis, as well as for preceding tasks that derive optimal corrections for reconstruction. Optimized sets of conditions for processing are accomplished using strict version control on those conditions: a process which assigns COOL Tags to sets of conditions, and then unifies those conditions over data-taking intervals into a COOL Global Tag. This Global Tag identifies the set of conditions used to process data so that the underlying conditions can be uniquely identified with 100% reproducibility should the processing be executed again. Understanding shifts in the underlying conditions from one tag to another and ensuring interval completeness for all detectors for a set of runs to be processed is a complex task, requiring tools beyond the above-mentioned Python utilities. Therefore, a JavaScript/PHP-based utility called the Conditions Tag Browser (CTB) has been developed. CTB gives detector and conditions experts the possibility to navigate through the different databases and COOL folders; explore the content of given tags and the differences between them, as well as their extent in time; and visualize the content of channels associated with leaf tags. This report describes the structure and the PHP/JavaScript function classes of the CTB.
NASA Astrophysics Data System (ADS)
Rakotomanga, Prisca; Soussen, Charles; Blondel, Walter C. P. M.
2017-03-01
Diffuse reflectance spectroscopy (DRS) has been acknowledged as a valuable optical biopsy tool for in vivo characterization of pathological modifications in epithelial tissues such as cancer. In spatially resolved DRS, accurate and robust estimation of the optical parameters (OP) of biological tissues is a major challenge due to the complexity of the physical models. Solving this inverse problem requires considering three components: the forward model, the cost function, and the optimization algorithm. This paper presents a comparative numerical study of the performance in estimating OP depending on the choice made for each of these components. Mono- and bi-layer tissue models are considered. Monowavelength (scalar) absorption and scattering coefficients are estimated. As a forward model, diffusion approximation analytical solutions with and without noise are implemented. Several cost functions are evaluated, possibly including normalized data terms. Two local optimization methods, Levenberg-Marquardt and Trust-Region-Reflective, are considered. Because they may be sensitive to the initial setting, a global optimization approach is proposed to improve the estimation accuracy. This algorithm is based on repeated calls to the above-mentioned local methods, with initial parameters randomly sampled. Two global optimization methods, Genetic Algorithm (GA) and Particle Swarm Optimization (PSO), are also implemented. Estimation performance is evaluated in terms of relative errors between the ground truth and the estimated values for each set of unknown OP. The combinations of the number of variables to be estimated, the nature of the forward model, the cost function to be minimized, and the optimization method are discussed.
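The repeated-local-search strategy can be sketched generically: draw random initial optical parameters within physical bounds, run a local least-squares solver from each, and keep the best fit. The two-parameter diffusion-style forward model below is a placeholder, not the paper's actual analytical solution.

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)
r = np.linspace(0.5, 5.0, 20)                      # source-detector distances (mm)

def forward(mu_a, mu_s, r):
    # Placeholder diffuse-reflectance shape: exp(-mu_eff * r) / r**2
    mu_eff = np.sqrt(3.0 * mu_a * (mu_a + mu_s))
    return np.exp(-mu_eff * r) / r ** 2

truth = (0.02, 1.5)                                # "unknown" OP (mm^-1)
data = forward(*truth, r) * (1 + rng.normal(0, 0.02, r.size))

bounds = ([1e-4, 0.1], [0.5, 5.0])
best = None
for _ in range(20):                                # multistart local optimization
    x0 = rng.uniform(bounds[0], bounds[1])
    fit = least_squares(lambda x: forward(x[0], x[1], r) - data, x0, bounds=bounds)
    if best is None or fit.cost < best.cost:
        best = fit
print("estimated (mu_a, mu_s):", best.x)
```

Restarting from random initial points is what protects the local solvers against the initialization sensitivity the abstract highlights.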
Wu, Yiping; Liu, Shuguang; Huang, Zhihong; Yan, Wende
2014-01-01
Ecosystem models are useful tools for understanding ecological processes and for sustainable management of resources. In the biogeochemical field, numerical models have been widely used for investigating carbon dynamics under global changes from site to regional and global scales. However, it is still challenging to optimize parameters and estimate parameterization uncertainty for complex process-based models such as the Erosion Deposition Carbon Model (EDCM), a modified version of CENTURY, which considers the carbon, water, and nutrient cycles of ecosystems. This study was designed to conduct the parameter identifiability, optimization, sensitivity, and uncertainty analysis of EDCM using our developed EDCM-Auto, which incorporates a comprehensive R package, the Flexible Modeling Framework (FME), and the Shuffled Complex Evolution (SCE) algorithm. Using a forest flux tower site as a case study, we implemented a comprehensive modeling analysis involving nine parameters and four target variables (carbon and water fluxes) with their corresponding measurements based on the eddy covariance technique. The local sensitivity analysis shows that the plant production-related parameters (e.g., PPDF1 and PRDX) are the most sensitive with respect to the model cost function. Both SCE and FME are comparable and performed well in deriving the optimal parameter set with satisfactory simulations of target variables. Global sensitivity and uncertainty analysis indicate that the parameter uncertainty and the resulting output uncertainty can be quantified, and that the magnitude of parameter-uncertainty effects depends on variables and seasons. This study also demonstrates that using cutting-edge R functions such as FME can be feasible and attractive for conducting comprehensive parameter analysis for ecosystem modeling.
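A generic calibration loop of this kind can be sketched with an evolutionary global optimizer; `differential_evolution` below stands in for the SCE algorithm (which SciPy does not ship), and the one-parameter-pair toy model stands in for EDCM.

```python
import numpy as np
from scipy.optimize import differential_evolution

obs_t = np.linspace(0, 10, 30)
obs = 5.0 * (1 - np.exp(-0.4 * obs_t))            # synthetic "flux" observations

def cost(params):
    amp, k = params
    sim = amp * (1 - np.exp(-k * obs_t))          # toy ecosystem response model
    return np.sum((sim - obs) ** 2)               # model cost function (SSE)

result = differential_evolution(cost, bounds=[(0.1, 20.0), (0.01, 2.0)], seed=3)
print("optimal parameters:", result.x, "cost:", result.fun)
```

In a real EDCM-Auto run the cost function wraps the full model simulation against eddy covariance measurements for all four target variables.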
Why don’t you use Evolutionary Algorithms in Big Data?
NASA Astrophysics Data System (ADS)
Stanovov, Vladimir; Brester, Christina; Kolehmainen, Mikko; Semenkina, Olga
2017-02-01
In this paper we raise the question of using evolutionary algorithms in the area of Big Data processing. We show that evolutionary algorithms provide evident advantages due to their high scalability and flexibility and their ability to solve global optimization problems and to optimize several criteria at the same time for feature selection, instance selection, and other data reduction problems. In particular, we consider the use of evolutionary algorithms with all kinds of machine learning tools, such as neural networks and fuzzy systems. Our examples show that Evolutionary Machine Learning is becoming more and more important in data analysis, and we expect to see further development of this field, especially with respect to Big Data.
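A minimal sketch of the evolutionary feature selection advocated here: binary chromosomes mask features, fitness is a cross-validated score, and standard selection/crossover/mutation evolve the mask. The classifier and synthetic dataset are arbitrary stand-ins.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=300, n_features=20, n_informative=5, random_state=0)

def fitness(mask):
    if not mask.any():
        return 0.0
    return cross_val_score(LogisticRegression(max_iter=500), X[:, mask], y, cv=3).mean()

pop = rng.integers(0, 2, size=(20, X.shape[1])).astype(bool)   # binary chromosomes
for gen in range(15):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[-10:]]                    # truncation selection
    children = []
    for _ in range(len(pop)):
        a, b = parents[rng.integers(10)], parents[rng.integers(10)]
        cut = rng.integers(1, X.shape[1])                      # one-point crossover
        child = np.concatenate([a[:cut], b[cut:]])
        flip = rng.random(child.size) < 0.05                   # bit-flip mutation
        children.append(np.where(flip, ~child, child))
    pop = np.array(children)

best = pop[np.argmax([fitness(ind) for ind in pop])]
print("selected features:", np.flatnonzero(best))
```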
Zhou, Yongquan; Xie, Jian; Li, Liangliang; Ma, Mingzhi
2014-01-01
Bat algorithm (BA) is a novel stochastic global optimization algorithm. Cloud model is an effective tool for transforming between qualitative concepts and their quantitative representation. Based on the bat echolocation mechanism and the excellent characteristics of the cloud model for representing uncertain knowledge, a new cloud model bat algorithm (CBA) is proposed. This paper focuses on remodeling the echolocation model based on the living and preying characteristics of bats, utilizing the transformation theory of the cloud model to depict the qualitative concept "bats approach their prey." Furthermore, a Lévy flight mode and a population information communication mechanism are introduced to balance exploration and exploitation. The simulation results show that the cloud model bat algorithm performs well on function optimization. PMID:24967425
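For reference, a compact sketch of the standard bat algorithm that CBA modifies (the cloud-model and Lévy-flight components are not reproduced, and the loudness/pulse-rate schedules are held fixed for brevity):

```python
import numpy as np

def bat_algorithm(f, dim, n=20, iters=200, fmin=0.0, fmax=2.0, lb=-5.0, ub=5.0):
    rng = np.random.default_rng(42)
    x = rng.uniform(lb, ub, (n, dim))          # bat positions
    v = np.zeros((n, dim))                     # velocities
    loud, pulse = 0.9, 0.5                     # loudness A and pulse rate r (fixed here)
    fit = np.apply_along_axis(f, 1, x)
    best = x[np.argmin(fit)].copy()
    for _ in range(iters):
        for i in range(n):
            freq = fmin + (fmax - fmin) * rng.random()   # echolocation frequency
            v[i] += (x[i] - best) * freq
            cand = np.clip(x[i] + v[i], lb, ub)
            if rng.random() > pulse:                      # local walk around the best bat
                cand = np.clip(best + 0.01 * rng.normal(size=dim), lb, ub)
            if f(cand) < fit[i] and rng.random() < loud:  # accept, gated by loudness
                x[i], fit[i] = cand, f(cand)
                if fit[i] < f(best):
                    best = x[i].copy()
    return best, f(best)

best, val = bat_algorithm(lambda z: np.sum(z ** 2), dim=5)
print(best, val)
```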
Developing a Hydrologic Assessment Tool for Designing Bioretention in a watershed
NASA Astrophysics Data System (ADS)
Baek, Sangsoo; Ligaray, Mayzonee; Park, Jeong-Pyo; Kwon, Yongsung; Cho, Kyung Hwa
2017-04-01
Continuous urbanization has negatively impacted the ecological and hydrological environments at the global, regional, and local scales. This issue has been addressed by developing Low Impact Development (LID) practices to deliver better hydrologic function and improve environmental, economic, social, and cultural outcomes. This study developed modeling software to simulate and optimize bioretention cells, one type of LID practice, in a given watershed. The model calculates the detailed soil infiltration process in a bioretention cell under given hydrological conditions and hydraulic facilities (e.g., riser and underdrain), and generates an optimized plan using the Flow Duration Curve (FDC). The optimization results from the simulation demonstrated that the location and size of a bioretention cell, as well as the soil texture, are important elements for an efficient bioretention design. We hope that the software developed in this study will be useful for establishing an appropriate scheme of LID installation.
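The Flow Duration Curve used as the optimization target can be computed in a few lines; this sketch uses the standard Weibull plotting position on a synthetic flow series.

```python
import numpy as np

def flow_duration_curve(flows):
    """Return exceedance probabilities and the corresponding sorted flows."""
    q = np.sort(np.asarray(flows))[::-1]              # flows in descending order
    exceed = np.arange(1, q.size + 1) / (q.size + 1)  # Weibull plotting position
    return exceed, q

rng = np.random.default_rng(7)
daily_flow = rng.lognormal(mean=1.0, sigma=0.8, size=365)   # synthetic runoff (m3/s)
p, q = flow_duration_curve(daily_flow)
print("Q at 10% / 50% / 90% exceedance:", np.interp([0.1, 0.5, 0.9], p, q))
```

Comparing pre- and post-development FDCs (and pushing the developed curve back toward the predevelopment one) is the kind of objective such an optimization can use.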
Systems metabolic engineering: genome-scale models and beyond.
Blazeck, John; Alper, Hal
2010-07-01
The advent of high-throughput genome-scale bioinformatics has led to an exponential increase in available cellular system data. Systems metabolic engineering attempts to use data-driven approaches, based on data collected with high-throughput technologies, to identify gene targets and optimize phenotypic properties on a systems level. Current systems metabolic engineering tools are limited for predicting and defining complex phenotypes such as chemical tolerances and other global, multigenic traits. The most pragmatic systems-based tool for metabolic engineering to arise is the in silico genome-scale metabolic reconstruction. This tool has seen wide adoption for modeling cell growth and predicting beneficial gene knockouts, and we examine here how this approach can be expanded for novel organisms. This review will highlight advances of the systems metabolic engineering approach with a focus on de novo development and use of genome-scale metabolic reconstructions for metabolic engineering applications. We will then discuss the challenges and prospects for this emerging field to enable model-based metabolic engineering. Specifically, we argue that current state-of-the-art systems metabolic engineering techniques represent a viable first step for improving product yield that still must be followed by combinatorial techniques or random strain mutagenesis to achieve optimal cellular systems.
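As an illustration of what a genome-scale reconstruction enables in practice, here is a minimal flux balance analysis sketch with the open-source COBRApy package (one of several tools in this space, not necessarily the one the review discusses); it assumes a local SBML model file such as the published E. coli core model.

```python
import cobra

# Load a (core) metabolic reconstruction from SBML; file name is an assumption
model = cobra.io.read_sbml_model("e_coli_core.xml")

# Predict growth: maximize the biomass objective via flux balance analysis
solution = model.optimize()
print("wild-type growth rate:", solution.objective_value)

# Screen single-gene knockouts for their effect on predicted growth
for gene in list(model.genes)[:5]:        # first few genes, for brevity
    with model:                           # changes are reverted on context exit
        gene.knock_out()
        print(gene.id, model.slim_optimize())
```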
Egea, Jose A; Henriques, David; Cokelaer, Thomas; Villaverde, Alejandro F; MacNamara, Aidan; Danciu, Diana-Patricia; Banga, Julio R; Saez-Rodriguez, Julio
2014-05-10
Optimization is the key to solving many problems in computational biology. Global optimization methods, which provide a robust methodology, and metaheuristics in particular have proven to be the most efficient methods for many applications. Despite their utility, there is a limited availability of metaheuristic tools. We present MEIGO, an R and Matlab optimization toolbox (also available in Python via a wrapper of the R version), that implements metaheuristics capable of solving diverse problems arising in systems biology and bioinformatics. The toolbox includes the enhanced scatter search method (eSS) for continuous nonlinear programming (cNLP) and mixed-integer programming (MINLP) problems, and variable neighborhood search (VNS) for Integer Programming (IP) problems. Additionally, the R version includes BayesFit for parameter estimation by Bayesian inference. The eSS and VNS methods can be run on a single-thread or in parallel using a cooperative strategy. The code is supplied under GPLv3 and is available at http://www.iim.csic.es/~gingproc/meigo.html. Documentation and examples are included. The R package has been submitted to BioConductor. We evaluate MEIGO against optimization benchmarks, and illustrate its applicability to a series of case studies in bioinformatics and systems biology where it outperforms other state-of-the-art methods. MEIGO provides a free, open-source platform for optimization that can be applied to multiple domains of systems biology and bioinformatics. It includes efficient state of the art metaheuristics, and its open and modular structure allows the addition of further methods.
NASA Astrophysics Data System (ADS)
Turner, D.
2014-12-01
Understanding the potential economic and physical impacts of climate change on coastal resources involves evaluating a number of distinct adaptive responses. This paper presents a tool for such analysis, a spatially-disaggregated optimization model for adaptation to sea level rise (SLR) and storm surge, the Coastal Impact and Adaptation Model (CIAM). This decision-making framework fills a gap between very detailed studies of specific locations and overly aggregate global analyses. While CIAM is global in scope, the optimal adaptation strategy is determined at the local level, evaluating over 12,000 coastal segments as described in the DIVA database (Vafeidis et al. 2006). The decision to pursue a given adaptation measure depends on local socioeconomic factors like income, population, and land values and how they develop over time, relative to the magnitude of potential coastal impacts, based on geophysical attributes like inundation zones and storm surge. For example, the model's decision to protect or retreat considers the costs of constructing and maintaining coastal defenses versus those of relocating people and capital to minimize damages from land inundation and coastal storms. Uncertain storm surge events are modeled with a generalized extreme value distribution calibrated to data on local surge extremes. Adaptation is optimized for the near-term outlook, in an "act then learn then act" framework that is repeated over the model time horizon. This framework allows the adaptation strategy to be flexibly updated, reflecting the process of iterative risk management. CIAM provides new estimates of the economic costs of SLR; moreover, these detailed results can be compactly represented in a set of adaptation and damage functions for use in integrated assessment models. Alongside the optimal result, CIAM evaluates suboptimal cases and finds that global costs could increase by an order of magnitude, illustrating the importance of adaptive capacity and coastal policy.
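The storm-surge component described above can be sketched with SciPy's generalized extreme value distribution: fit annual surge maxima, then read off the return levels used to size coastal defenses. The data here are synthetic.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(5)
annual_max_surge = genextreme.rvs(c=-0.1, loc=1.2, scale=0.3,
                                  size=60, random_state=rng)   # synthetic record (m)

# Fit a GEV distribution to the local surge extremes
c, loc, scale = genextreme.fit(annual_max_surge)

# Return level: the surge exceeded on average once every T years
for T in (10, 100, 500):
    level = genextreme.isf(1.0 / T, c, loc, scale)
    print(f"{T}-year surge: {level:.2f} m")
```

In CIAM-style analysis, these return levels feed the expected flood damages that the protect-versus-retreat decision weighs against defense costs.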
An Optimization-Based Method for Feature Ranking in Nonlinear Regression Problems.
Bravi, Luca; Piccialli, Veronica; Sciandrone, Marco
2017-04-01
In this paper, we consider the feature ranking problem, where, given a set of training instances, the task is to associate a score with the features in order to assess their relevance. Feature ranking is a very important tool for decision support systems, and may be used as an auxiliary step of feature selection to reduce the high dimensionality of real-world data. We focus on regression problems by assuming that the process underlying the generated data can be approximated by a continuous function (for instance, a feedforward neural network). We formally state the notion of relevance of a feature by introducing a minimum zero-norm inversion problem of a neural network, which is a nonsmooth, constrained optimization problem. We employ a concave approximation of the zero-norm function, and we define a smooth, global optimization problem to be solved in order to assess the relevance of the features. We present the new feature ranking method based on the solution of instances of the global optimization problem depending on the available training data. Computational experiments on both artificial and real data sets show that the proposed feature ranking method is a valid alternative to existing methods in terms of effectiveness. The results also show that the method is costly in terms of CPU time, which may be a limitation in the solution of large-dimensional problems.
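The concave zero-norm surrogate can be illustrated directly: replace ||w||_0 with sum_i (1 - exp(-alpha*|w_i|)), one common choice (the paper's exact approximation may differ), and minimize a data-misfit term plus this penalty; features are then ranked by coefficient magnitude.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)
A = rng.normal(size=(50, 10))
w_true = np.zeros(10); w_true[[1, 4]] = (2.0, -3.0)        # only two relevant features
b = A @ w_true + rng.normal(0, 0.05, 50)

alpha, lam = 5.0, 0.5
def objective(w):
    misfit = np.sum((A @ w - b) ** 2)
    zero_norm_approx = np.sum(1.0 - np.exp(-alpha * np.abs(w)))  # concave surrogate
    return misfit + lam * zero_norm_approx

res = minimize(objective, x0=np.full(10, 0.1), method='Powell')
ranking = np.argsort(-np.abs(res.x))                       # rank features by |w|
print("feature ranking:", ranking)
```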
2017-01-01
In this paper, we propose a new automatic hyperparameter selection approach for determining the optimal network configuration (network structure and hyperparameters) for deep neural networks using particle swarm optimization (PSO) in combination with a steepest gradient descent algorithm. In the proposed approach, network configurations were coded as a set of real-number m-dimensional vectors as the individuals of the PSO algorithm in the search procedure. During the search procedure, the PSO algorithm is employed to search for optimal network configurations via the particles moving in a finite search space, and the steepest gradient descent algorithm is used to train the DNN classifier with a few training epochs (to find a local optimal solution) during the population evaluation of PSO. After the optimization scheme, the steepest gradient descent algorithm is performed with more epochs and the final solutions (pbest and gbest) of the PSO algorithm to train a final ensemble model and individual DNN classifiers, respectively. The local search ability of the steepest gradient descent algorithm and the global search capabilities of the PSO algorithm are exploited to determine an optimal solution that is close to the global optimum. We conducted several experiments on handwritten character and biological activity prediction datasets to show that the DNN classifiers trained by the network configurations expressed by the final solutions of the PSO algorithm, employed to construct an ensemble model and individual classifier, outperform the random approach in terms of the generalization performance. Therefore, the proposed approach can be regarded as an alternative tool for automatic network structure and parameter selection for deep neural networks. PMID:29236718
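A bare-bones sketch of the outer PSO loop for hyperparameter search; the quadratic "validation loss" is a cheap stand-in for training a DNN for a few gradient-descent epochs at each particle evaluation.

```python
import numpy as np

rng = np.random.default_rng(0)

def val_loss(cfg):
    # Stand-in for: train the DNN briefly with config cfg, return validation loss
    log_lr, log_units = cfg
    return (log_lr + 3.0) ** 2 + 0.5 * (log_units - 2.0) ** 2

n, dim, iters = 15, 2, 40
lb, ub = np.array([-6.0, 0.0]), np.array([0.0, 3.0])   # log10(lr), log10(hidden units)
x = rng.uniform(lb, ub, (n, dim)); v = np.zeros((n, dim))
pbest, pbest_f = x.copy(), np.array([val_loss(p) for p in x])
gbest = pbest[np.argmin(pbest_f)].copy()

w, c1, c2 = 0.7, 1.5, 1.5                              # inertia, cognitive, social weights
for _ in range(iters):
    r1, r2 = rng.random((n, dim)), rng.random((n, dim))
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = np.clip(x + v, lb, ub)
    f = np.array([val_loss(p) for p in x])
    better = f < pbest_f
    pbest[better], pbest_f[better] = x[better], f[better]
    gbest = pbest[np.argmin(pbest_f)].copy()

print("best config (log10 lr, log10 units):", gbest)
```

In the paper's scheme, the final pbest and gbest configurations are then retrained with many more epochs to build the ensemble and individual classifiers.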
LCA-based optimization of wood utilization under special consideration of a cascading use of wood.
Höglmeier, Karin; Steubing, Bernhard; Weber-Blaschke, Gabriele; Richter, Klaus
2015-04-01
Cascading, the use of the same unit of a resource in multiple successional applications, is considered a viable means to improve the efficiency of resource utilization and to decrease environmental impacts. Wood, as a regrowing but nevertheless limited and increasingly in-demand resource, can be used in cascades, thereby increasing the potential efficiency per unit of wood. This study aims to assess the influence of cascading wood utilization on optimizing the overall environmental impact of wood use. By combining a material flow model of existing wood applications - both for materials provision and energy production - with an algebraic optimization tool, the effects of the use of wood in cascades can be modelled and quantified based on life cycle impact assessment results for all production processes. To identify the most efficient wood allocation, the effects of a potential substitution of non-wood products were taken into account in a part of the model runs. The considered environmental indicators were global warming potential, particulate matter formation, land occupation and an aggregated single score indicator. We found that optimizing either the overall global warming potential or the value of the single score indicator of the system leads to a simultaneous relative decrease of all other considered environmental impacts. The relative differences between the impacts of the model run with and without the possibility of a cascading use of wood were 7% for global warming potential and the single score indicator, despite cascading only influencing a small part of the overall system, namely wood panel production. Cascading led to savings of up to 14% of the annual primary wood supply of the study area. We conclude that cascading can improve the overall performance of a wood utilization system. Copyright © 2015 Elsevier Ltd. All rights reserved.
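The allocation step pairing a material-flow model with algebraic optimization can be sketched as a small linear program: choose how much wood goes to panels (fresh versus cascaded) and to energy so that demands are met at minimum global warming potential. All coefficients below are invented for illustration.

```python
from scipy.optimize import linprog

# Decision variables (Mg wood): x0 = fresh wood to panels,
# x1 = recovered (cascaded) wood to panels, x2 = wood to energy
gwp = [120.0, 40.0, 15.0]            # kg CO2-eq per Mg in each pathway (illustrative)

# Panel demand must be met exactly; energy demand and recovered-wood supply are bounded
A_eq = [[1, 1, 0]]
b_eq = [1000.0]                      # Mg of panels required
A_ub = [[0, 0, -1], [0, 1, 0]]
b_ub = [-400.0, 300.0]               # at least 400 Mg to energy; at most 300 Mg recovered

res = linprog(gwp, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
print("allocation (fresh, cascaded, energy):", res.x, "GWP:", res.fun)
```

The solver routes as much demand as possible through the low-impact cascaded pathway, which is exactly the mechanism behind the reported primary-wood savings.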
A General-Purpose Optimization Engine for Multi-Disciplinary Design Applications
NASA Technical Reports Server (NTRS)
Patnaik, Surya N.; Hopkins, Dale A.; Berke, Laszlo
1996-01-01
A general purpose optimization tool for multidisciplinary applications, which in the literature is known as COMETBOARDS, is being developed at NASA Lewis Research Center. The modular organization of COMETBOARDS includes several analyzers and state-of-the-art optimization algorithms along with their cascading strategy. The code structure allows quick integration of new analyzers and optimizers. The COMETBOARDS code reads input information from a number of data files, formulates a design as a set of multidisciplinary nonlinear programming problems, and then solves the resulting problems. COMETBOARDS can be used to solve a large problem which can be defined through multiple disciplines, each of which can be further broken down into several subproblems. Alternatively, a small portion of a large problem can be optimized in an effort to improve an existing system. Some of the other unique features of COMETBOARDS include design variable formulation, constraint formulation, subproblem coupling strategy, global scaling technique, analysis approximation, use of either sequential or parallel computational modes, and so forth. The special features and unique strengths of COMETBOARDS assist convergence and reduce the amount of CPU time used to solve the difficult optimization problems of aerospace industries. COMETBOARDS has been successfully used to solve a number of problems, including structural design of space station components, design of nozzle components of an air-breathing engine, configuration design of subsonic and supersonic aircraft, mixed flow turbofan engines, wave rotor topped engines, and so forth. This paper introduces the COMETBOARDS design tool and its versatility, which is illustrated by citing examples from structures, aircraft design, and air-breathing propulsion engine design.
Global Change adaptation in water resources management: the Water Change project.
Pouget, Laurent; Escaler, Isabel; Guiu, Roger; Mc Ennis, Suzy; Versini, Pierre-Antoine
2012-12-01
In recent years, water resources management has been facing new challenges due to increasing changes and their associated uncertainties, such as changes in climate, water demand or land use, which can be grouped under the term Global Change. The Water Change project (LIFE+ funding) developed a methodology and a tool to assess Global Change impacts on water resources, thus helping river basin agencies and water companies in their long-term planning and in the definition of adaptation measures. The main result of the project was the creation of a step-by-step methodology to assess Global Change impacts and define adaptation strategies. This methodology was tested in the Llobregat river basin (Spain) with the objective of being applicable to any water system. It includes several steps, such as setting up the problem with a DPSIR framework, developing Global Change scenarios, running river basin models, and performing a cost-benefit analysis to define optimal adaptation strategies. The methodology is supported by a flexible modelling system, which can link a wide range of models, such as hydrological, water quality, and water management models. The tool allows users to integrate their own models into the system, which can then exchange information among them automatically. This makes it possible to simulate the interactions among multiple components of the water cycle and to run a large number of Global Change scenarios quickly. The outcomes of this project make it possible to define and test different sets of adaptation measures for the basin that can be further evaluated through cost-benefit analysis. The integration of the results contributes to efficient decision-making on how to adapt to Global Change impacts. Copyright © 2012 Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
1995-01-01
The design of a High-Speed Civil Transport (HSCT) air-breathing propulsion system for multimission, variable-cycle operations was successfully optimized through a soft coupling of the engine performance analyzer, the NASA Engine Performance Program (NEPP), to COMETBOARDS, a multidisciplinary optimization tool developed at the NASA Lewis Research Center. The design optimization of this engine was cast as a nonlinear optimization problem, with engine thrust as the merit function and the bypass ratios, r-values of fans, fuel flow, and other factors as important active design variables. Constraints were specified on factors including the maximum speed of the compressors, the positive surge margins for the compressors with specified safety factors, the discharge temperature, the pressure ratios, and the mixer extreme Mach number. Solving the problem by using the most reliable optimization algorithm available in COMETBOARDS would provide feasible optimum results only for a portion of the aircraft flight regime because of the large number of mission points (defined by altitudes, Mach numbers, flow rates, and other factors), diverse constraint types, and overall poor conditioning of the design space. Only the cascade optimization strategy of COMETBOARDS, which was devised especially for difficult multidisciplinary applications, could successfully solve a number of engine design problems over their flight regimes. Furthermore, the cascade strategy converged to the same global optimum solution even when it was initiated from different design points. Multiple optimizers in a specified sequence, pseudorandom damping, and reduction of design space distortion via a global scaling scheme are some of the key features of the cascade strategy. A COMETBOARDS solution for an HSCT engine (Mach-2.4 mixed-flow turbofan) along with its configuration is shown, with the optimum thrust normalized with respect to NEPP results. COMETBOARDS added value in the design optimization of the HSCT engine.
Zhang, Yong-Feng; Chiang, Hsiao-Dong
2017-09-01
A novel three-stage methodology, termed the "consensus-based particle swarm optimization (PSO)-assisted Trust-Tech methodology," to find global optimal solutions for nonlinear optimization problems is presented. It is composed of Trust-Tech methods, consensus-based PSO, and local optimization methods that are integrated to compute a set of high-quality local optimal solutions that can contain the global optimal solution. The proposed methodology compares very favorably with several recently developed PSO algorithms based on a set of small-dimension benchmark optimization problems and 20 large-dimension test functions from the CEC 2010 competition. The analytical basis for the proposed methodology is also provided. Experimental results demonstrate that the proposed methodology can rapidly obtain high-quality optimal solutions that can contain the global optimal solution. The scalability of the proposed methodology is promising.
Meta-heuristic algorithms as tools for hydrological science
NASA Astrophysics Data System (ADS)
Yoo, Do Guen; Kim, Joong Hoon
2014-12-01
In this paper, meta-heuristic optimization techniques and their applications to water resources engineering, particularly in hydrological science, are reviewed. In recent years, meta-heuristic optimization techniques have been introduced that can overcome the problems inherent in iterative simulations. These methods are able to find good solutions with limited computation time and memory use, and without requiring complex derivatives. Simulation-based meta-heuristic methods such as Genetic Algorithms (GAs) and Harmony Search (HS) have powerful searching abilities, which can overcome several drawbacks of traditional mathematical methods. HS, for example, is conceptualized from the process by which musicians seek ever better harmony; the algorithm seeks a near-global optimum of an objective function in much the same way. In this paper, meta-heuristic algorithms and their applications (focusing on GAs and HS) in hydrological science are discussed by subject, including a review of the existing literature in the field. Recent trends in optimization are then presented, and a relatively new technique, the Smallest Small World Cellular Harmony Search (SSWCHS), is briefly introduced, with a summary of promising results obtained in previous studies. Overall, previous studies have demonstrated that meta-heuristic algorithms are effective tools for the development of hydrological models and the management of water resources.
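The HS mechanics referenced here fit in a few lines: each new "harmony" is assembled from harmony memory (with probability HMCR), optionally pitch-adjusted, and replaces the worst memory member if better. A sphere function stands in for a hydrological objective.

```python
import numpy as np

def harmony_search(f, dim, lb, ub, hms=10, hmcr=0.9, par=0.3, bw=0.05, iters=2000):
    rng = np.random.default_rng(11)
    hm = rng.uniform(lb, ub, (hms, dim))              # harmony memory
    fit = np.apply_along_axis(f, 1, hm)
    for _ in range(iters):
        new = np.empty(dim)
        for j in range(dim):
            if rng.random() < hmcr:                   # memory consideration
                new[j] = hm[rng.integers(hms), j]
                if rng.random() < par:                # pitch adjustment
                    new[j] += bw * rng.uniform(-1, 1) * (ub - lb)
            else:                                     # random selection
                new[j] = rng.uniform(lb, ub)
        new = np.clip(new, lb, ub)
        fnew = f(new)
        worst = np.argmax(fit)
        if fnew < fit[worst]:                         # replace the worst harmony
            hm[worst], fit[worst] = new, fnew
    return hm[np.argmin(fit)], fit.min()

x, fx = harmony_search(lambda z: np.sum(z ** 2), dim=4, lb=-10.0, ub=10.0)
print(x, fx)
```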
Material Distribution Optimization for the Shell Aircraft Composite Structure
NASA Astrophysics Data System (ADS)
Shevtsov, S.; Zhilyaev, I.; Oganesyan, P.; Axenov, V.
2016-09-01
One of the main goals in aircraft structure design is decreasing weight while increasing stiffness. Composite structures have recently become popular in aircraft because of their mechanical properties and wide range of optimization possibilities. Weight distribution and lay-up are key to creating lightweight, stiff structures. In this paper we discuss the optimization of a specific structure that undergoes non-uniform air pressure at different flight conditions, reducing the level of noise caused by airflow-induced vibrations at a constrained weight of the part. The initial model was created with the CAD tool Siemens NX; finite element analysis and post-processing were performed with COMSOL Multiphysics® and MATLAB®. Numerical solutions of the Reynolds-averaged Navier-Stokes (RANS) equations, supplemented by a k-ω turbulence model, provide the spatial distributions of air pressure applied to the shell surface. In the optimization problem formulation, the global strain energy calculated within the optimized shell was taken as the objective. Wall thickness was varied parametrically by introducing an auxiliary sphere whose radius and center coordinates served as the design variables. To avoid local stress concentrations, the wall thickness increment was defined as a smooth function on the shell surface depending on the auxiliary sphere's position and size. Our study consists of multiple steps: CAD/CAE transformation of the model, determining wind pressure for different flow angles, optimizing the wall thickness distribution for specific flow angles, and designing a lay-up for the optimal material distribution. The studied structure was improved in terms of maximum and average strain energy at a constrained expense of weight growth. The developed methods and tools can be applied to a wide range of shell-like structures made of multilayered quasi-isotropic laminates.
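The smooth thickness increment controlled by the auxiliary sphere can be expressed, for instance, as a Gaussian bump whose center and radius are the design variables; the paper's exact functional form is not given, so this is only an illustrative choice.

```python
import numpy as np

def thickness(points, t0, dt, center, radius):
    """Base thickness t0 plus a smooth increment dt that decays with
    distance from the auxiliary sphere's center (radius sets the spread)."""
    d = np.linalg.norm(points - center, axis=1)
    return t0 + dt * np.exp(-(d / radius) ** 2)

# Shell surface sampled at a few points; design variables: sphere center and radius
pts = np.array([[0.0, 0.0, 0.0], [0.1, 0.0, 0.0], [0.5, 0.2, 0.0]])
print(thickness(pts, t0=2.0, dt=1.0, center=np.array([0.0, 0.0, 0.0]), radius=0.2))
```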
NASA Astrophysics Data System (ADS)
Dwyer, Linnea; Yadav, Kamini; Congalton, Russell G.
2017-04-01
Providing adequate food and water for a growing global population continues to be a major challenge. Mapping and monitoring crops are useful tools for estimating the extent of crop productivity. GFSAD30 (Global Food Security-support Analysis Data at 30 m) is a NASA-funded program that is producing global cropland maps using field measurements and remote sensing images. This program studies 8 major crop types and includes information on cropland area/extent, whether crops are irrigated or rainfed, and the cropping intensities. Using results from the US and the extensive reference data available in the USDA Cropland Data Layer (CDL), we will experiment with various sampling simulations to determine optimal sampling for thematic map accuracy assessment. These simulations will include varying the sampling unit, the sampling strategy, and the sample number. The results of these simulations will allow us to recommend assessment approaches that handle different cropping scenarios.
Raja, Muhammad Asif Zahoor; Khan, Junaid Ali; Ahmad, Siraj-ul-Islam; Qureshi, Ijaz Mansoor
2012-01-01
A methodology for the solution of Painlevé equation-I is presented using a computational intelligence technique based on neural networks and particle swarm optimization hybridized with an active set algorithm. The mathematical model of the equation is developed with the help of a linear combination of feed-forward artificial neural networks that defines the unsupervised error of the model. This error is minimized subject to the availability of appropriate weights of the networks. The learning of the weights is carried out using the particle swarm optimization algorithm, a viable global search method, hybridized with an active set algorithm for rapid local convergence. The accuracy, convergence rate, and computational complexity of the scheme are analyzed based on a large number of independent runs and their comprehensive statistical analysis. Comparative studies of the results obtained are made with MATHEMATICA solutions, as well as with the variational iteration method and the homotopy perturbation method. PMID:22919371
Optimal implicit 2-D finite differences to model wave propagation in poroelastic media
NASA Astrophysics Data System (ADS)
Itzá, Reymundo; Iturrarán-Viveros, Ursula; Parra, Jorge O.
2016-08-01
Numerical modeling of seismic waves in heterogeneous porous reservoir rocks is an important tool for the interpretation of seismic surveys in reservoir engineering. We apply globally optimal implicit staggered-grid finite differences (FD) to model 2-D wave propagation in heterogeneous poroelastic media at a low-frequency range (<10 kHz). We validate the numerical solution by comparing it to an analytical transient solution, obtaining clear seismic wavefields that include the fast P, slow P, and S waves (for a porous medium saturated with fluid). The numerical dispersion and stability conditions are derived using von Neumann analysis, showing that over a wide range of porous materials the Courant condition governs the stability, and that this optimal implicit scheme improves the stability of explicit schemes. High-order explicit FD can be replaced by lower-order optimal implicit FD, so the computational cost is reduced while the accuracy is maintained. Here, we compute weights for the optimal implicit FD scheme to attain an accuracy of γ = 10⁻⁸. The implicit spatial differentiation involves solving tridiagonal linear systems of equations through Thomas' algorithm.
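Each implicit spatial derivative reduces to a tridiagonal solve; the Thomas algorithm mentioned above is a short forward-elimination and back-substitution pass, sketched here with a check against a dense solver.

```python
import numpy as np

def thomas(a, b, c, d):
    """Solve a tridiagonal system with sub-diagonal a, diagonal b,
    super-diagonal c, and right-hand side d (all length n; a[0], c[-1] unused)."""
    n = len(d)
    cp, dp = np.empty(n), np.empty(n)
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):                       # forward elimination
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = np.empty(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):              # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# Quick check against a dense solve
n = 6
a = np.full(n, -1.0); b = np.full(n, 4.0); c = np.full(n, -1.0)
d = np.arange(1.0, n + 1)
A = np.diag(b) + np.diag(a[1:], -1) + np.diag(c[:-1], 1)
print(np.allclose(thomas(a, b, c, d), np.linalg.solve(A, d)))  # True
```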
Constraint Optimization Problem For The Cutting Of A Cobalt Chrome Refractory Material
NASA Astrophysics Data System (ADS)
Lebaal, Nadhir; Schlegel, Daniel; Folea, Milena
2011-05-01
This paper shows a complete approach to solving a given problem, from experimentation to the optimization of different cutting parameters. In response to an industrial problem of slotting FSX 414, a Cobalt-based refractory material, we implemented a design of experiments to determine the most influential parameters on tool life, surface roughness, and cutting forces. After these trials, an optimization approach was implemented to find the lowest manufacturing cost while respecting the roughness constraints and cutting force limitation constraints. The optimization approach is based on the Response Surface Method (RSM), using the Sequential Quadratic Programming (SQP) algorithm for the constrained problem. To avoid a local optimum and to obtain an accurate solution at low cost, an efficient strategy, which improves the RSM accuracy in the vicinity of the global optimum, is presented. With these models and these trials, we could apply and compare our optimization methods in order to get the lowest cost for the best quality, i.e., a satisfying surface roughness and limited cutting forces.
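The RSM-plus-SQP chain can be sketched generically: fit quadratic response surfaces to DOE observations of cost and roughness, then run an SQP solver (SciPy's SLSQP here) on the surrogates. All coefficients and limits below are invented for illustration.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(4)

def quad_features(X):
    v, f = X[:, 0], X[:, 1]                       # normalized cutting speed, feed
    return np.column_stack([np.ones(len(X)), v, f, v * f, v ** 2, f ** 2])

# DOE trials with "measured" cost and roughness (synthetic responses)
X = rng.uniform(0, 1, (15, 2))
cost_obs = 5 - 3 * X[:, 0] - X[:, 1] + 2 * X[:, 0] ** 2 + rng.normal(0, 0.05, 15)
ra_obs = 1 + 2 * X[:, 1] + X[:, 0] * X[:, 1] + rng.normal(0, 0.02, 15)

beta_cost = np.linalg.lstsq(quad_features(X), cost_obs, rcond=None)[0]
beta_ra = np.linalg.lstsq(quad_features(X), ra_obs, rcond=None)[0]

surrogate = lambda x, b: (quad_features(np.atleast_2d(x)) @ b)[0]
res = minimize(lambda x: surrogate(x, beta_cost), x0=[0.5, 0.5], method='SLSQP',
               bounds=[(0, 1), (0, 1)],
               constraints=[{'type': 'ineq',
                             'fun': lambda x: 2.5 - surrogate(x, beta_ra)}])  # Ra <= 2.5
print("optimal (speed, feed):", res.x, "predicted cost:", res.fun)
```

The strategy in the paper additionally refines the response surface around the incumbent optimum, which the sketch omits.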
Improving global CD uniformity by optimizing post-exposure bake and develop sequences
NASA Astrophysics Data System (ADS)
Osborne, Stephen P.; Mueller, Mark; Lem, Homer; Reyland, David; Baik, KiHo
2003-12-01
Improvements in the final uniformity of masks can be obscured by error contributions from many sources. The final Global CD Uniformity (GCDU) of a mask is degraded by individual contributions of the writing tool, the Post Applied Bake (PAB), the Post Exposure Bake (PEB), the develop sequence, and the etch step. Final global uniformity improves by isolating and minimizing the variability of the PEB and develop steps. We achieved this de-coupling of the PEB and develop process from the whole process stream by using "dark loss," which is the loss of unexposed resist during the develop process. We confirmed a correspondence between Angstroms of dark loss and nanometer-sized deviations in the chrome CD. A plate with a distinctive dark loss pattern was related to a nearly identical pattern in the chrome CD. This pattern was verified to have originated during the PEB process and displayed a [Δ(Final CD)/Δ(Dark Loss)] ratio of 6 for TOK REAP200 resist. Previous papers have reported a sensitive linkage between Angstroms of dark loss and nanometers in the final uniformity of the written plate. Those initial studies reported using this method to improve the PAB of resists for greater uniformity of sensitivity and contrast. Similarly, this paper demonstrates an outstanding optimization of the PEB and develop processes.
Global Sensitivity Analysis with Small Sample Sizes: Ordinary Least Squares Approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davis, Michael J.; Liu, Wei; Sivaramakrishnan, Raghu
2016-12-21
A new version of global sensitivity analysis is developed in this paper. This new version, coupled with tools from statistics, machine learning, and optimization, can devise small sample sizes that allow for the accurate ordering of sensitivity coefficients for the first 10-30 most sensitive chemical reactions in complex chemical-kinetic mechanisms, and is particularly useful for studying the chemistry in realistic devices. A key part of the paper is the calibration of these small samples. Because these small sample sizes are developed for use in realistic combustion devices, the calibration is done over the ranges of conditions in such devices, with a test case being the operating conditions of a compression ignition engine studied earlier. Compression ignition engines operate under low-temperature combustion conditions with quite complicated chemistry, making this calibration difficult and leading to the possibility of false positives and false negatives in the ordering of the reactions. An important aspect of the paper is therefore showing how to handle the trade-off between false positives and false negatives using ideas from the multiobjective optimization literature. The combination of the new global sensitivity method and the calibration yields sample sizes approximately a factor of 10 smaller than were available with our previous algorithm.
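The OLS flavor of global sensitivity analysis can be sketched as follows: sample the inputs over their device-relevant ranges, regress the output on the normalized inputs, and rank reactions by the magnitude of their regression coefficients. The toy "mechanism" below is invented.

```python
import numpy as np

rng = np.random.default_rng(8)
n_samples, n_reactions = 40, 10                    # deliberately small sample

# Sample rate-coefficient multipliers over their uncertainty ranges (normalized)
X = rng.uniform(-1, 1, (n_samples, n_reactions))

def ignition_delay(x):
    # Invented stand-in for a chemical-kinetics simulation
    true_coeffs = np.array([3.0, -2.0, 0.0, 0.5, 0.0, 0.0, 1.0, 0.0, 0.0, -0.2])
    return x @ true_coeffs + 0.1 * rng.normal()

y = np.array([ignition_delay(x) for x in X])

# OLS fit; coefficient magnitudes order the reactions by sensitivity
A = np.column_stack([np.ones(n_samples), X])
coeffs = np.linalg.lstsq(A, y, rcond=None)[0][1:]
print("reaction ranking (most sensitive first):", np.argsort(-np.abs(coeffs)))
```

With noisy outputs and few samples, the tail of this ranking is where the false positives and negatives discussed above arise.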
Multidisciplinary Optimization for Aerospace Using Genetic Optimization
NASA Technical Reports Server (NTRS)
Pak, Chan-gi; Hahn, Edward E.; Herrera, Claudia Y.
2007-01-01
In support of the ARMD guidelines, NASA's Dryden Flight Research Center is developing a multidisciplinary design and optimization tool. This tool will leverage existing tools and practices, and allow the easy integration and adoption of new state-of-the-art software. Optimization has made its way into many mainstream applications. For example, NASTRAN™ has its solution sequence 200 for design optimization, and MATLAB™ has an Optimization Toolbox. Other packages, such as the ZAERO™ aeroelastic panel code and the CFL3D™ Navier-Stokes solver, have no built-in optimizer. The goal of the tool development is to generate a central executive capable of using disparate software packages in a cross-platform network environment so as to quickly perform optimization and design tasks in a cohesive, streamlined manner. Figure 1 shows a typical set of tools and their relation to the central executive. Optimization can take place within each individual tool, or in a loop between the executive and the tool, or both.
NASA Astrophysics Data System (ADS)
Shen, Chengcheng; Shi, Honghua; Liu, Yongzhi; Li, Fen; Ding, Dewen
2016-07-01
Marine ecosystem dynamic models (MEDMs) are important tools for the simulation and prediction of marine ecosystems. This article summarizes the methods and strategies used for the improvement and assessment of MEDM skill, and it attempts to establish a technical framework to inspire further ideas concerning MEDM skill improvement. The skill of MEDMs can be improved by parameter optimization (PO), which is an important step in model calibration. An efficient approach to solve the problem of PO constrained by MEDMs is the global treatment of both sensitivity analysis and PO. Model validation is an essential step following PO, which validates the efficiency of model calibration by analyzing and estimating the goodness-of-fit of the optimized model. Additionally, by focusing on the degree of impact of various factors on model skill, model uncertainty analysis can supply model users with a quantitative assessment of model confidence. Research on MEDMs is ongoing; however, improvement in model skill still lacks global treatments and its assessment is not integrated. Thus, the predictive performance of MEDMs is not strong and model uncertainties lack quantitative descriptions, limiting their application. Therefore, a large number of case studies concerning model skill should be performed to promote the development of a scientific and normative technical framework for the improvement of MEDM skill.
Nallasivam, Ulaganathan; Shah, Vishesh H.; Shenvi, Anirudh A.; ...
2016-02-10
We present a general Global Minimization Algorithm (GMA) to identify basic or thermally coupled distillation configurations that require the least vapor duty under minimum reflux conditions for separating any ideal or near-ideal multicomponent mixture into a desired number of product streams. In this algorithm, global optimality is guaranteed by modeling the system using the Underwood equations and reformulating the resulting constraints as bilinear inequalities. The speed of convergence to the globally optimal solution is increased by using appropriate feasibility- and optimality-based variable-range reduction techniques and by developing valid inequalities. As a result, the GMA can be coupled with already developed techniques that enumerate basic and thermally coupled distillation configurations to provide, for the first time, a global-optimization-based rank list of distillation configurations.
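As a flavor of the Underwood equations at the core of the GMA, the sketch below finds the active Underwood root θ between adjacent relative volatilities for a saturated-liquid feed and evaluates the minimum vapor duty for a sharp split; the three-component mixture is illustrative, and the full GMA handles these relations as bilinear constraints rather than a single root solve.

```python
import numpy as np
from scipy.optimize import brentq

alpha = np.array([4.0, 2.0, 1.0])     # relative volatilities (A, B, C)
z = np.array([0.4, 0.3, 0.3])         # feed mole fractions, F = 1, q = 1 (sat. liquid)

def underwood(theta):
    return np.sum(alpha * z / (alpha - theta))  # equals 1 - q = 0 at the root

# Active root for the sharp A / BC split lies between alpha_B and alpha_A
theta = brentq(underwood, alpha[1] + 1e-6, alpha[0] - 1e-6)

d = np.array([0.4, 0.0, 0.0])         # sharp split: all of A to the distillate
Vmin = np.sum(alpha * d / (alpha - theta))
print(f"theta = {theta:.4f}, minimum vapor duty Vmin = {Vmin:.4f}")
```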
Model-based setup assistant for progressive tools
NASA Astrophysics Data System (ADS)
Springer, Robert; Gräler, Manuel; Homberg, Werner; Henke, Christian; Trächtler, Ansgar
2018-05-01
In the field of production systems, globalization and technological progress lead to increasing requirements regarding part quality, delivery time, and costs. Hence, today's production is challenged much more than a few years ago: it has to be very flexible and produce small batch sizes economically to satisfy consumers' demands and avoid unnecessary stock. Furthermore, a trend towards increasing functional integration continues to lead to an ongoing miniaturization of sheet metal components. In the electric connectivity industry, for example, miniaturized connectors are manufactured with progressive tools, which are usually used for very large batches. These tools are installed in mechanical presses and then set up by a technician, who has to manually adjust a wide range of punch-bending operations. Disturbances like material thickness, temperatures, lubrication, or tool wear complicate the setup procedure. In view of the increasing demand for production flexibility, this time-consuming process has to be handled more and more often. In this paper, a new approach for a model-based setup assistant is proposed as a solution, applied exemplarily in combination with a progressive tool. First, progressive tools, and more specifically their setup process, are described, and based on that the challenges are pointed out. As a result, a systematic process to set up the machines is introduced. Next, the process is investigated with an FE analysis regarding the effects of the disturbances. Design of experiments is then used to systematically develop a regression model of the system's behaviour. This model is integrated within an optimization in order to calculate optimal machine parameters and the necessary adjustment of the progressive tool under the disturbances. Finally, the assistant is tested in a production environment and the results are discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Durfee, Justin David; Frazier, Christopher Rawls; Bandlow, Alisa
Sandia National Laboratories (Sandia) is in Phase 3 Sustainment of development of a prototype tool, currently referred to as the Contingency Contractor Optimization Tool - Prototype (CCOT-P), under the direction of OSD Program Support. CCOT-P is intended to help provide senior Department of Defense (DoD) leaders with comprehensive insight into the global availability, readiness and capabilities of the Total Force Mix. CCOT-P will allow senior decision makers to quickly and accurately assess the impacts, risks and mitigating strategies for proposed changes to force/capabilities assignments, apportionments and allocations options, focusing specifically on contingency contractor planning. During Phase 2 of the program, conducted during fiscal year 2012, Sandia developed an electronic storyboard prototype of the Contingency Contractor Optimization Tool that can be used for communication with senior decision makers and other Operational Contract Support (OCS) stakeholders. Phase 3 used feedback from demonstrations of the electronic storyboard prototype to develop an engineering prototype for planners to evaluate. Sandia worked with the DoD and Joint Chiefs of Staff strategic planning community to get feedback and input to ensure that the engineering prototype was developed to closely align with future planning needs. The intended deployment environment was also a key consideration as this prototype was developed. Initial release of the engineering prototype was done on servers at Sandia in the middle of Phase 3. In 2013, the tool was installed on a production pilot server managed by the OUSD(AT&L) eBusiness Center. The purpose of this document is to specify the CCOT-P engineering prototype platform requirements as of May 2016. Sandia developed the CCOT-P engineering prototype using common technologies to minimize the likelihood of deployment issues. The CCOT-P engineering prototype was architected and designed to be as independent as possible of the major deployment components such as the server hardware, the server operating system, the database, and the web server. This document describes the platform requirements, the architecture, and the implementation details of the CCOT-P engineering prototype.
A coarse-grained model for DNA origami.
Reshetnikov, Roman V; Stolyarova, Anastasia V; Zalevsky, Arthur O; Panteleev, Dmitry Y; Pavlova, Galina V; Klinov, Dmitry V; Golovin, Andrey V; Protopopova, Anna D
2018-02-16
Modeling tools provide a valuable support for DNA origami design. However, current solutions have limited application for conformational analysis of the designs. In this work we present a tool for a thorough study of DNA origami structure and dynamics. The tool is based on a novel coarse-grained model dedicated to geometry optimization and conformational analysis of DNA origami. We explored the ability of the model to predict dynamic behavior, global shapes, and fine details of two single-layer systems designed in hexagonal and square lattices using atomic force microscopy, Förster resonance energy transfer spectroscopy, and all-atom molecular dynamic simulations for validation of the results. We also examined the performance of the model for multilayer systems by simulation of DNA origami with published cryo-electron microscopy and atomic force microscopy structures. A good agreement between the simulated and experimental data makes the model suitable for conformational analysis of DNA origami objects. The tool is available at http://vsb.fbb.msu.ru/cosm as a web-service and as a standalone version.
Mechanick, Jeffrey I; Harrell, R Mack; Allende-Vigo, Myriam Z; Alvayero, Carlos; Arita-Melzer, Onix; Aschner, Pablo; Camacho, Pauline M; Castillo, Rogelio Zacarias; Cerdas, Sonia; Coutinho, Walmir F; Davidson, Jaime A; Garber, Jeffrey R; Garvey, W Timothy; González, Fernando Javier Lavalle; Granados, Denis O; Hamdy, Osama; Handelsman, Yehuda; Jiménez-Navarrete, Manuel Francisco; Lupo, Mark A; Mendoza, Enrique J; Jiménez-Montero, José G; Zangeneh, Farhad
2016-04-01
The American Association of Clinical Endocrinologists (AACE) and American College of Endocrinology (ACE) convened their first Workshop for recommendations to optimize Clinical Practice Algorithm (CPA) development for Latin America (LA) in diabetes (focusing on glycemic control), obesity (focusing on weight loss), thyroid (focusing on thyroid nodule diagnostics), and bone (focusing on postmenopausal osteoporosis) on February 28, 2015, in San Jose, Costa Rica. A standardized methodology is presented incorporating various transculturalization factors: resource availability (including imaging equipment and approved pharmaceuticals), health care professional and patient preferences, lifestyle variables, socio-economic parameters, web-based global accessibility, electronic implementation, and need for validation protocols. A standardized CPA template with node-specific recommendations to assist the local transculturalization process is provided. Participants unanimously agreed on the following five overarching principles for LA: (1) there is only one level of optimal endocrine care, (2) hemoglobin A1C should be utilized at every level of diabetes care, (3) nutrition education and increased pharmaceutical options are necessary to optimize the obesity care model, (4) quality neck ultrasound must be part of an optimal thyroid nodule care model, and (5) more scientific evidence is needed on osteoporosis prevalence and cost to justify intervention by governmental health care authorities. This 2015 AACE/ACE Workshop marks the beginning of a structured activity that assists local experts in creating culturally sensitive, evidence-based, and easy-to-implement tools for optimizing endocrine care on a global scale.
NASA Astrophysics Data System (ADS)
Dambreville, Frédéric
2013-10-01
While there is a variety of approaches and algorithms for optimizing the mission of an unmanned moving sensor, far fewer works deal with the implementation of several sensors within a human organization. In that case, the sensors are managed through at least one human decision layer, and sensor management as a whole arises as a bi-level optimization process. In this work, the following hypotheses are considered realistic: sensor handlers at the first level plan their sensors by means of elaborate algorithmic tools based on accurate modelling of the environment; the higher level plans the handled sensors according to a global observation mission, on the basis of an approximated model of the environment and of the first-level sub-processes. This problem is formalized very generally as the maximization of an unknown function, defined a priori by sampling a known random function (the law of model error). In such a case, each actual evaluation of the function increases the knowledge about the function, and subsequently the efficiency of the maximization. The issue is to optimize the sequence of values to be evaluated with regard to the evaluation costs. There is a fundamental link here with the domain of experiment design. Jones, Schonlau and Welch proposed a general method, Efficient Global Optimization (EGO), for solving this problem in the case of an additive functional Gaussian law. In our work, a generalization of EGO is proposed, based on a rare-event simulation approach. It is applied to the aforementioned bi-level sensor planning.
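The EGO loop referenced above is compact enough to sketch. The following is a minimal illustration, not the paper's rare-event generalization: a Gaussian-process surrogate with an RBF kernel is refit after each evaluation, and the next sample point is the one maximizing expected improvement. The objective function, kernel length scale, and evaluation budget are all invented for illustration.

import numpy as np
from scipy.stats import norm

def rbf_kernel(a, b, length=0.15):
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length) ** 2)

def gp_posterior(X, y, Xs, noise=1e-6):
    # standard GP regression equations via Cholesky factorization
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    Ks = rbf_kernel(X, Xs)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mu = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.clip(1.0 - np.sum(v ** 2, axis=0), 1e-12, None)  # k(x,x)=1 for this kernel
    return mu, np.sqrt(var)

def expected_improvement(mu, sigma, y_best):
    # closed-form EI for maximization
    z = (mu - y_best) / sigma
    return (mu - y_best) * norm.cdf(z) + sigma * norm.pdf(z)

f = lambda x: x * np.sin(10 * x)      # hypothetical expensive objective
X = np.array([0.1, 0.5, 0.9])         # initial design
y = f(X)
grid = np.linspace(0.0, 1.0, 501)
for _ in range(15):                   # each iteration buys one evaluation
    mu, sigma = gp_posterior(X, y, grid)
    x_next = grid[np.argmax(expected_improvement(mu, sigma, y.max()))]
    X, y = np.append(X, x_next), np.append(y, f(x_next))
print("best x, f(x):", X[y.argmax()], y.max())

Because each new evaluation updates the surrogate, the sampling sequence balances exploring uncertain regions against exploiting the current best, which is exactly the evaluation-cost trade-off the abstract describes.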
LDRD Final Report: Global Optimization for Engineering Science Problems
DOE Office of Scientific and Technical Information (OSTI.GOV)
HART,WILLIAM E.
1999-12-01
For a wide variety of scientific and engineering problems the desired solution corresponds to an optimal set of objective function parameters, where the objective function measures a solution's quality. The main goal of the LDRD ''Global Optimization for Engineering Science Problems'' was the development of new robust and efficient optimization algorithms that can be used to find globally optimal solutions to complex optimization problems. This SAND report summarizes the technical accomplishments of this LDRD, discusses lessons learned and describes open research issues.
A meta-analysis of telemedicine success in Africa
Wamala, Dan S.; Augustine, Kaddu
2013-01-01
The use of information and communication technology (ICT) tools to improve professional efficiency at work is steadily increasing in the dynamic digital environment. Tools such as telemedicine, tele-education, and health informatics have lately been incorporated in the health sector to enable easy access to essential services, for example allowing patients to reach medical expertise from referral centers and enabling doctor-to-doctor consultations for the benefit of patients. Unfortunately, observations indicate a dearth of effort and commitment to optimize the use of these tools in the majority of the countries south of the Sahara. Sub-Saharan Africa has been left behind much of the rest of the world in terms of development, having gone through decades of economic exploitation, especially by the West, of its natural and human resources. These factors, ethnic conflicts, and endless wars have continued to ruin sub-Saharan Africa's socio-economic development. Information was obtained through a network of telemedicine practitioners in different African countries using internet communication, through e-mail and by reviewing existing literature on their activities. This information was compiled from representative countries in each African region and from the authors' experiences as telemedicine practitioners. Most of these countries have inadequate ICT infrastructure, which results in sub-optimal application. Sub-Saharan Africa, home to 33 of the 48 poorest countries globally, has to extend its ICT diffusion and policy to match the ever-developing global economy. In some countries such as Ethiopia and South Africa there is significant progress in telemedicine, while in countries such as Burkina Faso and Nigeria progress is slow because of a lack of political support. Almost all reference to Africa here is made with respect to sub-Saharan Africa, a region with large social, economic, and political problems and resultant high morbidity and mortality rates. This also highlights the under-representation of African researchers in the global realm of information systems research. Telemedicine in Africa, though it has not attracted enough political support, is potentially a very useful conduit of health care given that the continent is resource limited and still enduring the effects of scarce human resources, especially in health. PMID:23858382
Visualization of Global Disease Burden for the Optimization of Patient Management and Treatment.
Schlee, Winfried; Hall, Deborah A; Edvall, Niklas K; Langguth, Berthold; Canlon, Barbara; Cederroth, Christopher R
2017-01-01
The assessment and treatment of complex disorders is challenged by the multiple domains and instruments used to evaluate clinical outcome. With the large number of assessment tools typically used in complex disorders comes the challenge of obtaining an integrative view of disease status to further evaluate treatment outcome, both at the individual level and at the group level. Radar plots are an attractive visual tool for displaying multivariate data in a two-dimensional graphical illustration. Here, we describe the use of radar plots for the visualization of disease characteristics in the context of tinnitus, a complex and heterogeneous condition whose treatment has shown mixed success. Data from two different cohorts, the Swedish Tinnitus Outreach Project (STOP) and the Tinnitus Research Initiative (TRI) database, were used. STOP is a population-based cohort from which cross-sectional data on 1,223 non-tinnitus and 933 tinnitus subjects were analyzed. By contrast, the TRI contained data from 571 patients who underwent various treatments and whose Clinical Global Impression (CGI) score was accessible to infer treatment outcome. In the latter, 34,560 permutations were tested to evaluate whether a particular ordering of the instruments could better reflect the treatment outcome measured with the CGI. Radar plots confirmed that tinnitus subtypes such as occasional and chronic tinnitus from the STOP cohort could be strikingly different, and helped reveal a gender bias in tinnitus severity. Radar plots with greater surface areas were consistent with greater burden, and enabled a rapid appreciation of the global distress associated with tinnitus in patients categorized according to tinnitus severity. Permuting the arrangement of instruments allowed us to identify a configuration with minimal variance and maximized surface difference between CGI groups from the TRI database, thus affording a means of optimally evaluating the outcomes in individual patients. We anticipate that such a tool will become a starting point for more sophisticated measures of clinical outcome, applicable not only in the context of tinnitus but also in other complex diseases where the integration of multiple variables is needed for a comprehensive evaluation of treatment response.
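Radar plots of this kind are straightforward to produce. The sketch below draws normalized scores from several instruments for one hypothetical patient on a polar axis, so that the enclosed surface area tracks overall burden; the instrument names and scores are invented placeholders, and, as the abstract's permutation test implies, the enclosed area depends on the ordering of the axes.

import numpy as np
import matplotlib.pyplot as plt

instruments = ["THI", "Loudness", "Anxiety", "Depression", "Sleep", "QoL"]
scores = np.array([0.7, 0.5, 0.4, 0.3, 0.6, 0.5])    # hypothetical, scaled to 0..1

angles = np.linspace(0, 2 * np.pi, len(instruments), endpoint=False)
angles = np.concatenate([angles, angles[:1]])        # close the polygon
values = np.concatenate([scores, scores[:1]])

fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
ax.plot(angles, values, linewidth=1.5)
ax.fill(angles, values, alpha=0.25)                  # shaded area tracks burden
ax.set_xticks(angles[:-1])
ax.set_xticklabels(instruments)
ax.set_ylim(0, 1)
plt.show()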
Trajectory optimization for the National Aerospace Plane
NASA Technical Reports Server (NTRS)
Lu, Ping
1993-01-01
The objective of this second-phase research is to investigate the optimal ascent trajectory for the National Aerospace Plane (NASP) from runway take-off to orbital insertion and to address the unique problems associated with hypersonic flight trajectory optimization. The trajectory optimization problem for an aerospace plane is highly challenging because of the complexity involved. Previous work succeeded in obtaining sub-optimal trajectories by using energy-state approximation and time-scale decomposition techniques, but the energy-state approximation is known to be invalid in certain portions of the trajectory. This research employs the full dynamics of the aerospace plane and emphasizes direct trajectory optimization methods. The major accomplishments of this research include the first development of an inverse dynamics approach in trajectory optimization, which enables optimal trajectories for the aerospace plane to be generated efficiently and reliably, and general analytical solutions to constrained hypersonic trajectories that have wide application in trajectory optimization as well as in guidance and flight dynamics. Optimal trajectories in abort landing and in ascent augmented with rocket propulsion and thrust vectoring control were also investigated. Motivated by this study, a new global trajectory optimization tool using continuous simulated annealing and a nonlinear predictive feedback guidance law have been under investigation, and some promising results have been obtained, which may well lead to more significant development and application in the near future.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Xingyuan; He, Zhili; Zhou, Jizhong
2005-10-30
The oligonucleotide specificity for microarray hybridization can be predicted by its sequence identity to non-targets, continuous stretch to non-targets, and/or binding free energy to non-targets. Most currently available programs only use one or two of these criteria, which may choose 'false' specific oligonucleotides or miss 'true' optimal probes in a considerable proportion. We have developed a software tool, called CommOligo, using new algorithms and all three criteria for selection of optimal oligonucleotide probes. A series of filters, including sequence identity, free energy, continuous stretch, GC content, self-annealing, distance to the 3'-untranslated region (3'-UTR) and melting temperature (Tm), are used to check each possible oligonucleotide. Sequence identity is calculated based on gapped global alignments. A traversal algorithm is used to generate alignments for free energy calculation. The optimal Tm interval is determined based on probe candidates that have passed all other filters. Final probes are picked using a combination of user-configurable piece-wise linear functions and an iterative process. The thresholds for identity, stretch and free energy filters are automatically determined from experimental data by an accessory software tool, CommOligo_PE (CommOligo Parameter Estimator). The program was used to design probes for both whole-genome and highly homologous sequence data. CommOligo and CommOligo_PE are freely available to academic users upon request.
Optimization of the resources management in fighting wildfires.
Martin-Fernández, Susana; Martínez-Falero, Eugenio; Pérez-González, J Manuel
2002-09-01
Wildfires lead to important economic, social, and environmental losses, especially in areas of Mediterranean climate where they are of high intensity and frequency. Over the past 30 years there has been a dramatic surge in the development and use of fire spread models. However, given the chaotic nature of environmental systems, it is very difficult to develop real-time fire-extinguishing models. This article proposes a method of optimizing the performance of wildfire fighting resources such that losses are kept to a minimum. The optimization procedure includes discrete simulation algorithms and Bayesian optimization methods for discrete and continuous problems (simulated annealing and Bayesian global optimization). Fast calculation algorithms provide optimization outcomes quickly enough that the predictions of the model remain close to the real behavior of the fire, the combat resources, and the meteorological conditions. In addition, adaptive algorithms take into account the chaotic behavior of wildfires, so that the system can be updated with data corresponding to the real situation to obtain a new optimum solution. The application of this method to the Northwest Forest of Madrid (Spain) is also described. This application allowed us to confirm that it is a helpful tool in the decision-making process.
Collaborative Effort for a Centralized Worldwide Tuberculosis Relational Sequencing Data Platform.
Starks, Angela M; Avilés, Enrique; Cirillo, Daniela M; Denkinger, Claudia M; Dolinger, David L; Emerson, Claudia; Gallarda, Jim; Hanna, Debra; Kim, Peter S; Liwski, Richard; Miotto, Paolo; Schito, Marco; Zignol, Matteo
2015-10-15
Continued progress in addressing challenges associated with detection and management of tuberculosis requires new diagnostic tools. These tools must be able to provide rapid and accurate information for detecting resistance to guide selection of the treatment regimen for each patient. To achieve this goal, globally representative genotypic, phenotypic, and clinical data are needed in a standardized and curated data platform. A global partnership of academic institutions, public health agencies, and nongovernmental organizations has been established to develop a tuberculosis relational sequencing data platform (ReSeqTB) that seeks to increase understanding of the genetic basis of resistance by correlating molecular data with results from drug susceptibility testing and, optimally, associated patient outcomes. These data will inform development of new diagnostics, facilitate clinical decision making, and improve surveillance for drug resistance. ReSeqTB offers an opportunity for collaboration to achieve improved patient outcomes and to advance efforts to prevent and control this devastating disease. Published by Oxford University Press on behalf of the Infectious Diseases Society of America 2015. This work is written by (a) US Government employee(s) and is in the public domain in the US.
A Decision Analysis Tool for Climate Impacts, Adaptations, and Vulnerabilities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Omitaomu, Olufemi A; Parish, Esther S; Nugent, Philip J
Climate change related extreme events (such as flooding, storms, and drought) are already impacting millions of people globally at a cost of billions of dollars annually. Hence, there is an urgent need for urban areas to develop adaptation strategies that will alleviate the impacts of these extreme events. However, the lack of appropriate decision support tools that match local applications is limiting local planning efforts. In this paper, we present a quantitative analysis and optimization system with customized decision support modules built on a geographic information system (GIS) platform to bridge this gap. This platform is called the Urban Climate Adaptation Tool (Urban-CAT). For all Urban-CAT models, we divide a city into a grid with tens of thousands of cells, then compute a list of metrics for each cell from the GIS data. These metrics are used as independent variables to predict climate impacts, compute vulnerability scores, and evaluate adaptation options. Overall, the Urban-CAT system has three layers: a data layer (containing spatial data, socio-economic and environmental data, and analytic data), a middle layer (handling data processing, model management, and GIS operations), and an application layer (providing climate impact forecasts, adaptation optimization, and site evaluation). The Urban-CAT platform can guide city and county governments in identifying and planning for effective climate change adaptation strategies.
Clustering methods for the optimization of atomic cluster structure
NASA Astrophysics Data System (ADS)
Bagattini, Francesco; Schoen, Fabio; Tigli, Luca
2018-04-01
In this paper, we propose a revised global optimization method and apply it to large scale cluster conformation problems. In the 1990s, the so-called clustering methods were considered among the most efficient general purpose global optimization techniques; however, their usage has quickly declined in recent years, mainly due to the inherent difficulties of clustering approaches in high dimensional spaces. Inspired by the machine learning literature, we redesigned clustering methods in order to deal with molecular structures in a reduced feature space. Our aim is to show that by suitably choosing a good set of geometrical features coupled with a very efficient descent method, an effective optimization tool is obtained which is capable of finding, with a very high success rate, all known putative optima for medium-size clusters without any prior information, both for Lennard-Jones and Morse potentials. The main result is that, beyond being a reliable approach, the proposed method, based on the idea of starting a computationally expensive deep local search only when it seems worth doing so, is capable of saving a huge number of searches with respect to an analogous algorithm which does not employ a clustering phase. In this paper, we are not claiming the superiority of the proposed method compared to specific, refined, state-of-the-art procedures, but rather indicating a quite straightforward way to save local searches by means of a clustering scheme working in a reduced variable space, which might prove useful when included in many modern methods.
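The economy the authors describe, starting a deep local search only when it seems worth doing so, can be illustrated in a few lines. In this hedged sketch (ours, not the paper's algorithm), a candidate start point is discarded whenever it falls within a fixed radius of an already-located minimum, on the assumption that it would only drain into that minimum again; the toy objective, radius, and sampling box are invented.

import numpy as np
from scipy.optimize import minimize

def f(x):
    # hypothetical multimodal surrogate for a cluster potential
    return np.sum(x ** 2) + 2.0 * np.sum(np.sin(5.0 * x))

rng = np.random.default_rng(0)
minima, radius, saved = [], 0.5, 0
for _ in range(200):
    x0 = rng.uniform(-2.0, 2.0, size=2)
    if any(np.linalg.norm(x0 - m) < radius for m in minima):
        saved += 1                      # skip: assumed to drain to a known minimum
        continue
    res = minimize(f, x0)               # the computationally expensive local search
    if not any(np.linalg.norm(res.x - m) < 1e-3 for m in minima):
        minima.append(res.x)
print(f"{len(minima)} distinct minima found, {saved} local searches saved")

Classical clustering methods replace the fixed radius with a statistically derived critical distance that shrinks as sampling densifies, and the feature-space variant described above would cluster on geometric descriptors rather than raw coordinates.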
Li, Hongyu; Walker, David; Yu, Guoyu; Sayle, Andrew; Messelink, Wilhelmus; Evans, Rob; Beaucamp, Anthony
2013-01-14
Edge mis-figure is regarded as one of the most difficult technical issues in manufacturing the segments of extremely large telescopes and can dominate key aspects of performance. A novel edge-control technique has been developed, based on the 'Precessions' polishing technique, for which accurate and stable edge tool influence functions (TIFs) are crucial. In the first paper in this series [D. Walker, Opt. Express 20, 19787-19798 (2012)], multiple parameters were experimentally optimized using an extended set of experiments. The first purpose of this new work is to short-circuit this procedure through modeling. This also offers the prospect of optimizing local (as distinct from global) polishing for edge mis-figure, now under separate development. This paper presents a model that can predict edge TIFs based on surface-speed profiles and pressure distributions over the polishing spot at the edge of the part, the latter calculated by finite element analysis and verified by direct force measurement. This paper also presents a hybrid-measurement method for edge TIFs to verify the simulation results. Experimental and simulation results show good agreement.
Writing Technical Reports for Simulation in Education for Health Professionals: Suggested Guidelines
Dubrowski, Adam; Alani, Sabrina; Bankovic, Tina; Crowe, Andrea; Pollard, Megan
2015-11-02
Simulation is an important training tool used in a variety of influential fields. However, development of simulation scenarios - the key component of simulation - occurs in isolation; sharing of scenarios is almost non-existent. This can make simulation use a costly task in terms of resources and time, with possible redundancy of effort. To alleviate these issues, the goal is to strive for an open community of practice (CoP) surrounding simulation. To facilitate this goal, this report describes a set of guidelines for writing technical reports about simulation use for educating health professionals. Using an accepted set of guidelines will allow for homogeneity when building simulation scenarios and facilitate open sharing among simulation users. In addition to optimizing simulation efforts in institutions that are currently using simulation as an educational tool, the development of such a repository may have direct implications for developing countries, where simulation is only starting to be used systematically. Our project facilitates equivalent and global access to information, knowledge, and highest-caliber education - in this context, simulation - which are collectively the building blocks of optimal healthcare. PMID:26677421
GLOBAL SOLUTIONS TO FOLDED CONCAVE PENALIZED NONCONVEX LEARNING
Liu, Hongcheng; Yao, Tao; Li, Runze
2015-01-01
This paper is concerned with solving nonconvex learning problems with folded concave penalties. Although their global solutions entail desirable statistical properties, optimization techniques that guarantee global optimality in a general setting have been lacking. In this paper, we show that a class of nonconvex learning problems is equivalent to general quadratic programs. This equivalence allows us to develop mixed integer linear programming reformulations, which admit finite algorithms that find a provably global optimal solution. We refer to this reformulation-based technique as mixed integer programming-based global optimization (MIPGO). To our knowledge, this is the first global optimization scheme with a theoretical guarantee for folded concave penalized nonconvex learning with the SCAD penalty (Fan and Li, 2001) and the MCP penalty (Zhang, 2010). Numerical results indicate that MIPGO significantly outperforms the state-of-the-art solution scheme, local linear approximation, and other alternative solution techniques in the literature in terms of solution quality. PMID:27141126
Online optimal obstacle avoidance for rotary-wing autonomous unmanned aerial vehicles
NASA Astrophysics Data System (ADS)
Kang, Keeryun
This thesis presents an integrated framework for online obstacle avoidance by rotary-wing unmanned aerial vehicles (UAVs), which provides UAVs an obstacle-field navigation capability in a partially or completely unknown obstacle-rich environment. The framework is composed of a LIDAR interface, local obstacle grid generation, a receding horizon (RH) trajectory optimizer, a global shortest-path search algorithm, and climb rate limit detection logic. The key feature of the framework is the use of optimization-based trajectory generation, in which the obstacle avoidance problem is formulated as a nonlinear trajectory optimization problem with state and input constraints over the finite range of the sensor. This local trajectory optimization is combined with a global path search algorithm, which provides a useful initial guess to the nonlinear optimization solver. Optimization is the natural process of finding the best trajectory that is simultaneously dynamically feasible, safe within the vehicle's flight envelope, and collision-free. The optimal trajectory is continuously updated in real time by the numerical optimization solver, Nonlinear Trajectory Generation (NTG), a direct solver based on spline approximation of the trajectory for differentially flat systems. The overall approach of this thesis to finding the optimal trajectory is similar to model predictive control (MPC) or receding horizon control (RHC), except that this thesis followed a two-layer design; thus, the optimal solution serves as a guidance command to be followed by the controller of the vehicle. The framework is implemented in a real-time simulation environment, the Georgia Tech UAV Simulation Tool (GUST), and integrated into the onboard software of the rotary-wing UAV test-bed at Georgia Tech. The 2D vertical avoidance capability was first tested in flight against real obstacles. The flight test evaluations were extended to benchmark tests of 3D avoidance capability over virtual obstacles, and the capability was finally demonstrated on real obstacles located at the McKenna MOUT site in Fort Benning, Georgia. Simulations and flight test evaluations demonstrate the feasibility of the developed framework for UAV applications involving low-altitude flight in an urban area.
New Directions in the Economic Theory of the Environment
NASA Astrophysics Data System (ADS)
Carraro, Carlo; Siniscalco, Domenico
1998-01-01
This volume provides a broad survey of the recent developments in the new economics of the environment and reports the state of the art on a new set of environmental problems, analytical tools and economic policies. Throughout the volume environmental problems are analyzed in an open, generally noncompetitive economy with transnational or global externalities. The first part deals with the relationship between the environment, economic growth and technological innovation. The second part analyzes the optimal design of environmental taxation, while the third part considers the international dimension of environmental policy.
Numerical simulation of thermal stress distributions in Czochralski-grown silicon crystals
NASA Astrophysics Data System (ADS)
Kumar, M. Avinash; Srinivasan, M.; Ramasamy, P.
2018-04-01
Numerical simulation is one of the important tools in the investigation and optimization of single-crystal silicon growth by the Czochralski (Cz) method. A 2D steady global heat transfer model was used to investigate the temperature distribution and the thermal stress distributions at a particular crystal position during the Cz growth process. The computation determines thermal stresses such as the von Mises stress and the maximum shear stress distribution along the grown crystal and shows a possible reason for dislocation formation in Cz-grown single-crystal silicon.
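For reference, the two stress measures named above have standard definitions in terms of the principal stresses (this is the textbook form, not notation taken from the paper):

\[
\sigma_{\mathrm{vM}} = \sqrt{\tfrac{1}{2}\left[(\sigma_1-\sigma_2)^2 + (\sigma_2-\sigma_3)^2 + (\sigma_3-\sigma_1)^2\right]},
\qquad
\tau_{\max} = \tfrac{1}{2}\,(\sigma_{\max}-\sigma_{\min}).
\]

Regions of the crystal where such measures exceed the temperature-dependent critical resolved shear stress are the usual candidates for dislocation generation.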
The potential application of the blackboard model of problem solving to multidisciplinary design
NASA Technical Reports Server (NTRS)
Rogers, J. L.
1989-01-01
Problems associated with the sequential approach to multidisciplinary design are discussed. A blackboard model is suggested as a potential tool for implementing the multilevel decomposition approach to overcome these problems. The blackboard model serves as a global database for the solution, with each discipline acting as a knowledge source for updating the solution. With this approach, engineers can improve coordination, communication, and cooperation in the conceptual design process, allowing them to achieve a better design from an interdisciplinary standpoint.
NASA Astrophysics Data System (ADS)
Saranya, Kunaparaju; John Rozario Jegaraj, J.; Ramesh Kumar, Katta; Venkateshwara Rao, Ghanta
2016-06-01
With the increased trend toward automation in the modern manufacturing industry, human intervention in routine, repetitive and data-specific activities of manufacturing is greatly reduced. In this paper, an attempt has been made to reduce human intervention in the selection of optimal cutting tools and process parameters for metal cutting applications, using Artificial Intelligence techniques. Generally, the selection of the appropriate cutting tool and parameters in metal cutting is carried out by an experienced technician or cutting tool expert based on their knowledge or an extensive search of a huge cutting tool database. The proposed approach replaces the existing practice of physically searching for tools in data books and tool catalogues with an intelligent knowledge-based selection system. This system employs artificial intelligence techniques such as artificial neural networks, fuzzy logic and genetic algorithms for decision making and optimization. This intelligence-based optimal tool selection strategy was developed and implemented using Mathworks Matlab Version 7.11.0. The cutting tool database was obtained from the tool catalogues of different tool manufacturers. This paper discusses in detail the methodology and strategies employed for selection of the appropriate cutting tool and optimization of process parameters based on multi-objective optimization criteria considering material removal rate, tool life and tool cost.
NASA Astrophysics Data System (ADS)
Roy, Satadru
Traditional approaches to designing and optimizing a new system often use a system-centric objective and do not take into consideration how the operator will use the new system alongside other existing systems. This "hand-off" between the design of the new system and how the new system operates alongside other systems can lead to sub-optimal performance with respect to the operator-level objective. In other words, the system that is optimal for its system-level objective might not be best for the system-of-systems level objective of the operator. Among the few available references that describe attempts to address this hand-off, most follow an MDO-motivated subspace decomposition approach of first designing a very good system and then providing this system to the operator, who decides the best way to use it along with the existing systems. The motivating example in this dissertation presents one such problem that includes aircraft design, airline operations and revenue management "subspaces". The research here develops an approach that can simultaneously solve these subspaces posed as a monolithic optimization problem. The monolithic approach makes the problem a Mixed Integer/Discrete Non-Linear Programming (MINLP/MDNLP) problem, a class that is extremely difficult to solve. The presence of expensive, sophisticated engineering analyses further aggravates the problem. To tackle this challenge problem, the work here presents a new optimization framework that simultaneously solves the subspaces to capture the "synergism" in the problem that previous decomposition approaches may not have exploited, addresses mixed-integer/discrete design variables in an efficient manner, and accounts for computationally expensive analysis tools. The framework combines concepts from efficient global optimization, Kriging partial least squares, and gradient-based optimization. The approach is demonstrated on an 11-route airline network problem consisting of 94 decision variables, including 33 integer and 61 continuous variables. This application problem represents an interacting group of systems and poses key challenges to the optimization framework, as reflected by the presence of a moderate number of integer and continuous design variables and an expensive analysis tool. The results indicate that simultaneously solving the subspaces can lead to significant improvement in the fleet-level objective of the airline when compared to the previously developed sequential subspace decomposition approach. In developing the approach to solve the MINLP/MDNLP challenge problem, several test problems provided the ability to explore the performance of the framework. While solving these test problems, the framework showed that it could solve other MDNLP problems, including those with categorically discrete variables, indicating that the framework could have broader application than the aircraft design-fleet allocation-revenue management problem.
Multidisciplinary Shape Optimization of a Composite Blended Wing Body Aircraft
NASA Astrophysics Data System (ADS)
Boozer, Charles Maxwell
A multidisciplinary shape optimization tool coupling aerodynamics, structures, and performance was developed for battery-powered aircraft. Utilizing high-fidelity computational fluid dynamics analysis tools and a structural wing weight tool, coupled according to the multidisciplinary feasible optimization architecture, the tool modifies aircraft geometry to optimize the aircraft's range or endurance. The developed tool is applied to three geometries: a hybrid blended wing body delta wing UAS, the ONERA M6 wing, and a modified ONERA M6 wing. First, the optimization problem is presented with the objective function, constraints, and design vector. Next, the tool's architecture and the analysis tools that are utilized are described. Finally, various optimizations are described and their results analyzed for all test subjects. Results show that less computationally expensive inviscid optimizations yield positive performance improvements using planform, airfoil, and three-dimensional degrees of freedom. From the results obtained through a series of optimizations, it is concluded that the newly developed tool is both effective at improving performance and serves as a platform ready to receive additional performance modules, further improving its computational design support potential.
Simultaneous Intrinsic and Extrinsic Parameter Identification of a Hand-Mounted Laser-Vision Sensor
Lee, Jong Kwang; Kim, Kiho; Lee, Yongseok; Jeong, Taikyeong
2011-01-01
In this paper, we propose simultaneous intrinsic and extrinsic parameter identification of a hand-mounted laser-vision sensor (HMLVS). A laser-vision sensor (LVS), consisting of a camera and a laser stripe projector, is used as a sensor component of the robotic measurement system, and it measures range data with respect to the robot base frame using the robot forward kinematics and the optical triangulation principle. For the optimal estimation of the model parameters, we applied two optimization techniques: a nonlinear least-squares optimizer and a particle swarm optimizer. Best-fit parameters, including both the intrinsic and extrinsic parameters of the HMLVS, are simultaneously obtained based on the least-squares criterion. The simulation and experimental results show that the parameter identification problem considered is characterized by a highly multimodal landscape; thus, a global optimization technique such as particle swarm optimization is a promising tool for identifying the model parameters of an HMLVS, while the nonlinear least-squares optimizer often failed to find an optimal solution even when the initial candidate solutions were selected close to the true optimum. The proposed optimization method does not require good initial guesses of the system parameters to converge at a very stable solution, and it can be applied to a kinematically dissimilar robot system without loss of generality. PMID:22164104
SPOTting model parameters using a ready-made Python package
NASA Astrophysics Data System (ADS)
Houska, Tobias; Kraft, Philipp; Breuer, Lutz
2015-04-01
The selection and parameterization of reliable process descriptions in ecological modelling is driven by several uncertainties. The procedure is highly dependent on various criteria, such as the algorithm used, the likelihood function selected and the definition of the prior parameter distributions. A wide variety of tools have been developed in the past decades to optimize parameters. Some of the tools are closed source. Due to this, the choice of a specific parameter estimation method is sometimes driven more by its availability than by its performance. A toolbox with a large set of methods can support users in deciding on the most suitable method. Further, it enables testing and comparing different methods. We developed SPOT (Statistical Parameter Optimization Tool), an open source Python package containing a comprehensive set of modules to analyze and optimize parameters of (environmental) models. SPOT comes with a selected set of algorithms for parameter optimization and uncertainty analyses (Monte Carlo, MC; Latin Hypercube Sampling, LHS; Maximum Likelihood Estimation, MLE; Markov Chain Monte Carlo, MCMC; Shuffled Complex Evolution, SCE-UA; Differential Evolution Markov Chain, DE-MCZ), together with several likelihood functions (Bias, (log-) Nash-Sutcliffe model efficiency, Correlation Coefficient, Coefficient of Determination, Covariance, (Decomposed-, Relative-, Root-) Mean Squared Error, Mean Absolute Error, Agreement Index) and prior distributions (Binomial, Chi-Square, Dirichlet, Exponential, Laplace, (log-, multivariate-) Normal, Pareto, Poisson, Cauchy, Uniform, Weibull) to sample from. The model-independent structure makes it suitable for analyzing a wide range of applications. We apply all algorithms of the SPOT package in three different case studies. Firstly, we investigate the response of the Rosenbrock function, where the MLE algorithm shows its strengths. Secondly, we study the Griewank function, which has a challenging response surface for optimization methods. Here we see simple algorithms like MCMC struggling to find the global optimum of the function, while algorithms like SCE-UA and DE-MCZ show their strengths. Thirdly, we apply an uncertainty analysis of a one-dimensional physically based hydrological model built with the Catchment Modelling Framework (CMF). The model is driven by meteorological and groundwater data from a Free Air Carbon Enrichment (FACE) experiment in Linden (Hesse, Germany). Simulation results are evaluated with measured soil moisture data. We search for optimal parameter sets of the van Genuchten-Mualem function and find different, equally optimal solutions with some of the algorithms. The case studies reveal that the implemented SPOT methods work sufficiently well. They further show the benefit of having one tool at hand that includes a number of parameter search methods, likelihood functions and a priori parameter distributions within one platform-independent package.
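Of the samplers listed, Latin Hypercube Sampling is the simplest to reproduce from scratch. The sketch below is a generic illustration and not SPOT's implementation: each parameter range is divided into n equal-probability strata, every stratum is sampled exactly once, and the strata are paired randomly across dimensions; uniform priors and the example bounds are assumed for illustration.

import numpy as np

def latin_hypercube(n, lower, upper, seed=0):
    # one permutation of the n strata per dimension, then a uniform
    # jitter inside each stratum, scaled to the parameter bounds
    rng = np.random.default_rng(seed)
    d = len(lower)
    strata = rng.permuted(np.tile(np.arange(n), (d, 1)), axis=1).T
    u = (strata + rng.random((n, d))) / n
    return lower + u * (upper - lower)

# ten samples of two parameters with very different scales
samples = latin_hypercube(10, np.array([0.0, 1e-4]), np.array([1.0, 1e-2]))
print(samples)

Compared with plain Monte Carlo, the stratification guarantees that every marginal range is covered even for small sample counts, which is why LHS is a common default for expensive model runs.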
Inspection planning development: An evolutionary approach using reliability engineering as a tool
NASA Technical Reports Server (NTRS)
Graf, David A.; Huang, Zhaofeng
1994-01-01
This paper proposes an evolutionary approach to inspection planning which introduces various reliability engineering tools into the process and assesses system trade-offs among reliability, engineering requirements, manufacturing capability and inspection cost to establish an optimal inspection plan. The examples presented in the paper illustrate some advantages and benefits of the new approach. Through the analysis, reliability and engineering impacts due to manufacturing process capability and inspection uncertainty are clearly understood; the most cost-effective and efficient inspection plan can be established and the associated risks are well controlled; some inspection reductions and relaxations are well justified; and design feedback and changes may be initiated from the analysis conclusions to further enhance reliability and reduce cost. The approach is particularly promising as global competition and customer expectations for quality improvement are rapidly increasing.
Cheng, Wen-Chang
2012-01-01
In this paper we propose a robust lane detection and tracking method that combines particle filters with the particle swarm optimization method. This method mainly uses the particle filters to detect and track the local optimum of the lane model in the input image and then seeks the global optimal solution of the lane model by a particle swarm optimization method. The particle filter can effectively complete lane detection and tracking in complicated or variable lane environments. However, the result obtained is usually a local optimal system status rather than the global optimal system status. Thus, the particle swarm optimization method is used to further refine the global optimal system status among all system statuses. Since the particle swarm optimization method is a global optimization algorithm based on iterative computing, it can find the global optimal lane model by simulating the food-finding behavior of fish schools or insect swarms through the mutual cooperation of all particles. In verification testing, the test environments included highways and ordinary roads as well as straight and curved lanes, uphill and downhill lanes, lane changes, etc. Our proposed method can complete the lane detection and tracking more accurately and effectively than existing options. PMID:23235453
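The cooperative refinement described above follows the standard global-best PSO update: each particle's velocity is pulled toward its own best position and toward the swarm's best, so information about good regions spreads through the population. The sketch below is a generic PSO, not the authors' lane-model implementation; the inertia and acceleration coefficients and the Rastrigin test function are conventional illustration choices.

import numpy as np

def pso(f, lo, hi, n_particles=30, iters=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.uniform(lo, hi, (n_particles, len(lo)))    # positions
    v = np.zeros_like(x)                               # velocities
    pbest = x.copy()                                   # per-particle best
    pbest_val = np.apply_along_axis(f, 1, x)
    g = pbest[pbest_val.argmin()]                      # swarm (global) best
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        val = np.apply_along_axis(f, 1, x)
        better = val < pbest_val
        pbest[better], pbest_val[better] = x[better], val[better]
        g = pbest[pbest_val.argmin()]
    return g, pbest_val.min()

# toy usage on a multimodal test function
rastrigin = lambda z: 10 * len(z) + np.sum(z**2 - 10 * np.cos(2 * np.pi * z))
best_x, best_val = pso(rastrigin, np.array([-5.12, -5.12]), np.array([5.12, 5.12]))
print(best_x, best_val)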
Structure and software tools of AIDA.
Duisterhout, J S; Franken, B; Witte, F
1987-01-01
AIDA consists of a set of software tools that allow fast development of easy-to-maintain Medical Information Systems. AIDA supports all aspects of such a system during both development and operation. It contains tools to build and maintain forms for interactive data entry and on-line input validation, a database management system including a data dictionary and a set of run-time routines for database access, and routines for querying the database and formatting output. Unlike an application generator, the user of AIDA may select parts of the tools to fulfill his needs and program other subsystems not developed with AIDA. The AIDA software uses as its host language the ANSI-standard programming language MUMPS, an interpreted language embedded in an integrated database and programming environment. This greatly facilitates the portability of AIDA applications. The database facilities supported by AIDA are based on a relational data model. This data model is built on top of the MUMPS database, the so-called global structure. The relational model overcomes the restrictions of the global structure regarding string length, while the global structure is especially powerful for sorting purposes. Using MUMPS as a host language gives the user an easy interface between user-defined data validation checks or other user-defined code and the AIDA tools. AIDA has been designed primarily for prototyping and for the construction of Medical Information Systems in a research environment that requires a flexible approach. The prototyping facility of AIDA operates terminal-independently and is even, to a great extent, multi-lingual. Most of these features are table-driven; this allows on-line changes in the use of terminal type and language, but also causes overhead. AIDA has a set of optimizing tools with which it is possible to build faster, but (of course) less flexible code from these table definitions. By separating the AIDA software into a source and a run-time version, one is able to write implementation-specific code which can be selected and loaded by a special source loader that is part of the AIDA software. This feature is also useful for maintaining software at different sites and on different installations.
Siddiqua, Shaila; Mamun, Abdullah Al; Enayetul Babar, Sheikh Md
2015-01-01
Renewable biodiesels are needed as an alternative to petroleum-derived transport fuels, which contribute to global warming and are of limited availability. Algae biomass is a potential source of renewable energy that can be converted into fuels such as biodiesel. This study introduces an integrated method for the production of biodiesel from Chara vulgaris algae collected from the coastal region of Bangladesh. The Box-Behnken design, based on response surface methodology (RSM), was used as the statistical tool to optimize three variables for predicting the best performing conditions (calorific value and yield) of algae biodiesel. The three production-condition parameters were chloroform volume (X1), sodium chloride concentration (X2) and temperature (X3). Optimal conditions were estimated with the aid of statistical regression analysis and surface plots. The optimal biodiesel production conditions for 12 g of dry algae biomass were observed to be 198 ml chloroform with 0.75% sodium chloride at 65 °C, where the calorific value of the biodiesel is 9255.106 kcal/kg and the yield 3.6 ml.
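The response-surface step behind a Box-Behnken design reduces to fitting a full quadratic model y = b0 + sum(bi xi) + sum(bij xi xj) to the design runs and optimizing the fitted surface over the coded region. The sketch below is generic, not this study's analysis; the design table is the standard three-factor Box-Behnken layout in coded units, and the responses are synthetic placeholders.

import numpy as np
from itertools import combinations_with_replacement

def quad_features(X):
    # columns: intercept, linear terms, then squares and pairwise interactions
    cols = [np.ones(len(X))] + [X[:, i] for i in range(X.shape[1])]
    cols += [X[:, i] * X[:, j]
             for i, j in combinations_with_replacement(range(X.shape[1]), 2)]
    return np.column_stack(cols)

# three-factor Box-Behnken design (12 edge midpoints + 3 center replicates)
X = np.array([[-1,-1,0],[1,-1,0],[-1,1,0],[1,1,0],
              [-1,0,-1],[1,0,-1],[-1,0,1],[1,0,1],
              [0,-1,-1],[0,1,-1],[0,-1,1],[0,1,1],
              [0,0,0],[0,0,0],[0,0,0]], dtype=float)
rng = np.random.default_rng(1)
y = 3 + X @ np.array([0.5, 0.2, 0.8]) - 0.3 * (X**2).sum(axis=1) \
    + rng.normal(0, 0.05, len(X))            # synthetic responses

beta, *_ = np.linalg.lstsq(quad_features(X), y, rcond=None)
grid = np.stack(np.meshgrid(*[np.linspace(-1, 1, 41)] * 3), axis=-1).reshape(-1, 3)
pred = quad_features(grid) @ beta
print("predicted optimum (coded units):", grid[pred.argmax()])

Decoding the optimum back to physical units (ml of chloroform, % NaCl, degrees C) is a linear rescaling of each coded axis to its experimental range.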
Photonic crystal enhanced silicon cell based thermophotovoltaic systems
Yeng, Yi Xiang; Chan, Walker R.; Rinnerbauer, Veronika; ...
2015-01-30
We report the design, optimization, and experimental results of large area commercial silicon solar cell based thermophotovoltaic (TPV) energy conversion systems. Using global non-linear optimization tools, we demonstrate theoretically a maximum radiative heat-to-electricity efficiency of 6.4% and a corresponding output electrical power density of 0.39 W cm⁻² at temperature T = 1660 K when implementing both the optimized two-dimensional (2D) tantalum photonic crystal (PhC) selective emitter, and the optimized 1D tantalum pentoxide - silicon dioxide PhC cold-side selective filter. In addition, we have developed an experimental large area TPV test setup that enables accurate measurement of radiative heat-to-electricity efficiency for any emitter-filter-TPV cell combination of interest. In fact, the experimental results match extremely well with predictions of our numerical models. Our experimental setup achieved a maximum output electrical power density of 0.10 W cm⁻² and radiative heat-to-electricity efficiency of 1.18% at T = 1380 K using commercial wafer size back-contacted silicon solar cells.
Solar electricity supply isolines of generation capacity and storage.
Grossmann, Wolf; Grossmann, Iris; Steininger, Karl W
2015-03-24
The recent sharp drop in the cost of photovoltaic (PV) electricity generation accompanied by globally rapidly increasing investment in PV plants calls for new planning and management tools for large-scale distributed solar networks. Of major importance are methods to overcome intermittency of solar electricity, i.e., to provide dispatchable electricity at minimal costs. We find that pairs of electricity generation capacity G and storage S that give dispatchable electricity and are minimal with respect to S for a given G exhibit a smooth relationship of mutual substitutability between G and S. These isolines between G and S support the solving of several tasks, including the optimal sizing of generation capacity and storage, optimal siting of solar parks, optimal connections of solar parks across time zones for minimizing intermittency, and management of storage in situations of far below average insolation to provide dispatchable electricity. G-S isolines allow determining the cost-optimal pair (G,S) as a function of the cost ratio of G and S. G-S isolines provide a method for evaluating the effect of geographic spread and time zone coverage on costs of solar electricity.
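The practical value of the isolines comes from a one-line tangency argument. Writing an isoline as S = f(G) (our notation, assuming f is smooth and decreasing) with unit costs c_G and c_S, the cost-minimal pair solves

\[
\min_{G}\; C(G) = c_G\,G + c_S\,f(G)
\quad\Longrightarrow\quad
f'(G^{\ast}) = -\,\frac{c_G}{c_S},
\]

so the optimum sits where the isoline's slope equals the negative cost ratio, which is why the cost ratio of generation to storage alone picks out the cost-optimal (G, S).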
Automation of POST Cases via External Optimizer and "Artificial p2" Calculation
NASA Technical Reports Server (NTRS)
Dees, Patrick D.; Zwack, Mathew R.
2017-01-01
During early conceptual design of complex systems, speed and accuracy are often at odds with one another. While many characteristics of the design fluctuate rapidly during this phase, there is nonetheless a need to acquire accurate data from which to down-select designs, as these decisions will have a large impact upon program life-cycle cost. Therefore, enabling the conceptual designer to produce accurate data in a timely manner is tantamount to program viability. For conceptual design of launch vehicles, trajectory analysis and optimization is a large hurdle. Tools such as the industry-standard Program to Optimize Simulated Trajectories (POST) have traditionally required an expert in the loop for setting up inputs, running the program, and analyzing the output. The solution space for trajectory analysis is in general non-linear and multi-modal, requiring an experienced analyst to weed out sub-optimal designs in pursuit of the global optimum. While an experienced analyst presented with a vehicle similar to one they have already worked on can likely produce optimal performance figures in a timely manner, as soon as the "experienced" or "similar" adjectives are invalid the process can become lengthy. In addition, an experienced analyst working on a similar vehicle may go into the analysis with preconceived ideas about what the vehicle's trajectory should look like, which can result in sub-optimal performance being recorded. Thus, in any case but the ideal, either time or accuracy can be sacrificed. In the authors' previous work a tool called multiPOST was created which captures the heuristics of a human analyst over the process of executing trajectory analysis with POST. However, without the instincts of a human in the loop, this method relied upon Monte Carlo simulation to find successful trajectories. Overall the method has mixed results, and in the context of optimizing multiple vehicles it is inefficient in comparison to the method presented here. POST's internal optimizer functions like any other gradient-based optimizer. It has a specified variable to optimize, whose value is represented as optval; a set of dependent constraints to meet, with associated forms and tolerances, whose value is represented as p2; and a set of independent variables, known as the u-vector, to modify in pursuit of optimality. Each of these quantities is calculated or manipulated at a certain phase within the trajectory. The optimizer is further constrained by the requirement that the input u-vector must result in a trajectory which proceeds through each of the prescribed events in the input file. For example, if the input u-vector causes the vehicle to crash before it can achieve the orbital parameters required for a parking orbit, then the run will fail without engaging the optimizer, and a p2 value of exactly zero is returned. This poses a problem, as this "non-connecting" region of the u-vector space is far larger than the "connecting" region which returns a non-zero value of p2 and can be worked on by the internal optimizer. Finding this connecting region, and more specifically the global optimum within it, has traditionally required the use of an expert analyst.
Salceda, Susana; Barican, Arnaldo; Buscaino, Jacklyn; Goldman, Bruce; Klevenberg, Jim; Kuhn, Melissa; Lehto, Dennis; Lin, Frank; Nguyen, Phong; Park, Charles; Pearson, Francesca; Pittaro, Rick; Salodkar, Sayali; Schueren, Robert; Smith, Corey; Troup, Charles; Tsou, Dean; Vangbo, Mattias; Wunderle, Justus; King, David
2017-05-01
The RapidHIT® ID is a fully automated sample-to-answer system for short tandem repeat (STR)-based human identification. The RapidHIT ID has been optimized for use in decentralized environments and processes presumed single-source DNA samples, generating Combined DNA Index System (CODIS)-compatible DNA profiles in less than 90 min. The system is easy to use, requiring less than one minute of hands-on time. Profiles are reviewed using centralized linking software, RapidLINK™ (IntegenX, Pleasanton, CA), a software tool designed to collate DNA profiles from single or multiple RapidHIT ID systems at different geographic locations. The RapidHIT ID has been designed to employ GlobalFiler® Express and AmpFLSTR® NGMSElect™ (Thermo Fisher Scientific, Waltham, MA) STR chemistries. The developmental validation studies were performed using GlobalFiler® Express with single-source reference samples according to Scientific Working Group on DNA Analysis Methods guidelines. These results show that multiple RapidHIT ID systems networked with RapidLINK software form a highly reliable system for wide-scale deployment in locations such as police booking stations and border crossings, enabling real-time testing of arrestees, potential human trafficking victims, and other instances where rapid turnaround is essential. Copyright © 2017 The Author(s). Published by Elsevier B.V. All rights reserved.
Cruz-Monteagudo, Maykel; Borges, Fernanda; Cordeiro, M Natália D S; Cagide Fajin, J Luis; Morell, Carlos; Ruiz, Reinaldo Molina; Cañizares-Carmenate, Yudith; Dominguez, Elena Rosa
2008-01-01
Up to now, very few applications of multiobjective optimization (MOOP) techniques to quantitative structure-activity relationship (QSAR) studies have been reported in the literature, and none of them address the optimization of objectives related directly to the final pharmaceutical profile of a drug. In this paper, a MOOP method based on Derringer's desirability function is introduced that allows global QSAR studies to be conducted while simultaneously considering the potency, bioavailability, and safety of a set of drug candidates. The results of the desirability-based MOOP (the levels of the predictor variables concurrently producing the best possible compromise between the properties determining an optimal drug candidate) are used for the implementation of a ranking method that is also based on the application of desirability functions. This method allows ranking of drug candidates with unknown pharmaceutical properties from combinatorial libraries according to their degree of similarity with the previously determined optimal candidate. Application of this method makes it possible to filter the most promising drug candidates of a library (the best-ranked candidates), which should have the best pharmaceutical profile (the best compromise between potency, safety and bioavailability). In addition, a validation method for the ranking process, as well as a quantitative measure of the quality of a ranking, the ranking quality index (Psi), is proposed. The usefulness of the desirability-based methods of MOOP and ranking is demonstrated by their application to a library of 95 fluoroquinolones with reported gram-negative antibacterial activity and mammalian cell cytotoxicity. Finally, the combined use of the desirability-based methods of MOOP and ranking proposed here seems to be a valuable tool for rational drug discovery and development.
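For orientation, a Derringer-type one-sided desirability for a property Y_i that is to be maximized, and the overall desirability D combining the k properties, take the standard form (limits L_i, targets T_i and exponents s_i are case-specific; this is the textbook parameterization, not necessarily the paper's exact one):

\[
d_i(Y_i)=
\begin{cases}
0, & Y_i \le L_i,\\[4pt]
\left(\dfrac{Y_i-L_i}{T_i-L_i}\right)^{s_i}, & L_i < Y_i < T_i,\\[4pt]
1, & Y_i \ge T_i,
\end{cases}
\qquad
D=\Bigl(\prod_{i=1}^{k} d_i\Bigr)^{1/k}.
\]

Because D is a geometric mean, any single property falling at its lower limit drives the overall desirability to zero, which is what forces a genuine compromise among potency, safety and bioavailability rather than a trade of one for the others.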
Parameter estimation of a pulp digester model with derivative-free optimization strategies
NASA Astrophysics Data System (ADS)
Seiça, João C.; Romanenko, Andrey; Fernandes, Florbela P.; Santos, Lino O.; Fernandes, Natércia C. P.
2017-07-01
The work concerns parameter estimation in the context of the mechanistic modelling of a pulp digester. The problem is cast as a box-bounded nonlinear global optimization problem in order to minimize the mismatch between the model outputs and the experimental data observed at a real pulp and paper plant. The MCSFilter and Simulated Annealing global optimization methods were used to solve the optimization problem. While the former took longer to converge to the global minimum, the latter terminated faster but at a significantly higher value of the objective function and, thus, failed to find the global solution.
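As a hedged illustration of the box-bounded formulation, the sketch below fits a toy exponential surrogate (standing in for the mechanistic digester model) with SciPy's dual_annealing, an optimizer from the simulated-annealing family; the data, bounds, and kinetics are invented.

```python
import numpy as np
from scipy.optimize import dual_annealing

# Toy surrogate: model output as a function of two kinetic parameters.
t = np.linspace(0.0, 10.0, 50)
true_theta = (0.7, 2.3)
rng = np.random.default_rng(0)
data = true_theta[0] * np.exp(-t / true_theta[1]) + 0.01 * rng.standard_normal(t.size)

def mismatch(theta):
    """Least-squares mismatch between model output and 'plant' data."""
    model = theta[0] * np.exp(-t / theta[1])
    return np.sum((model - data) ** 2)

# Box-bounded global search over the two parameters.
result = dual_annealing(mismatch, bounds=[(0.1, 2.0), (0.5, 5.0)], seed=1)
print(result.x, result.fun)
```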
Optimizing energy growth as a tool for finding exact coherent structures
NASA Astrophysics Data System (ADS)
Olvera, D.; Kerswell, R. R.
2017-08-01
We discuss how searching for finite-amplitude disturbances of a given energy that maximize their subsequent energy growth after a certain later time T can be used to probe the phase space around a reference state and ultimately to find other nearby solutions. The procedure relies on the fact that, of all the initial disturbances on a constant-energy hypersphere, the optimization procedure will naturally select the one that lies closest to the stable manifold of a nearby solution in phase space if T is large enough. Then, when in its subsequent evolution the optimal disturbance transiently approaches the new solution, a flow state at this point can be used as an initial guess to converge the solution to machine precision. We illustrate this approach in plane Couette flow by rediscovering the spanwise-localized "snake" solutions of Schneider et al. [Phys. Rev. Lett. 104, 104501 (2010), 10.1103/PhysRevLett.104.104501], probing phase space at very low Reynolds numbers (less than 127.7) where the constant-shear solution is believed to be the global attractor, and examining how the edge between laminar and turbulent flow evolves when stable stratification eliminates the turbulence. We also show that the steady snake solution smoothly delocalizes as unstable stratification is gradually turned on until it connects (via an intermediary global three-dimensional solution) to two-dimensional Rayleigh-Bénard roll solutions.
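The full Navier-Stokes optimization is nonlinear and typically adjoint-based; the linear analogue below is only a sketch of the core idea under that simplifying assumption: for dx/dt = Ax, the unit-energy disturbance maximizing energy growth at time T is the leading right singular vector of exp(AT). The matrix A is an illustrative non-normal example, not a Couette-flow operator.

```python
import numpy as np
from scipy.linalg import expm

A = np.array([[-0.05, 1.0],
              [ 0.00, -0.10]])   # non-normal: allows transient growth
T = 20.0

U, s, Vt = np.linalg.svd(expm(A * T))
optimal_disturbance = Vt[0]      # initial condition on the unit-energy sphere
max_energy_growth = s[0] ** 2    # G(T) = largest squared singular value
print(optimal_disturbance, max_energy_growth)
```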
Case studies on optimization problems in MATLAB and COMSOL multiphysics by means of the livelink
NASA Astrophysics Data System (ADS)
Ozana, Stepan; Pies, Martin; Docekal, Tomas
2016-06-01
LiveLink for COMSOL is a tool that integrates COMSOL Multiphysics with MATLAB to extend one's modeling with scripting programming in the MATLAB environment. It allows the user to utilize the full power of MATLAB and its toolboxes in preprocessing, model manipulation, and postprocessing. First, the head script launches COMSOL with MATLAB, defines initial values of all parameters, refers to the objective function J, and creates and runs the defined optimization task. Once the task is launched, the COMSOL model is called in the iteration loop (from the MATLAB environment via the API interface), changing the defined optimization parameters so that the objective function is minimized, using the fmincon function to find a local or global minimum of a constrained linear or nonlinear multivariable function. Once the minimum is found, the routine returns an exit flag, terminates the optimization, and returns the optimized values of the parameters. The cooperation with MATLAB via LiveLink enhances a powerful computational environment with complex multiphysics simulations. The paper introduces the use of LiveLink for COMSOL for chosen case studies in the field of technical cybernetics and bioengineering.
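A minimal Python analogue of the loop just described, with scipy.optimize.minimize standing in for MATLAB's fmincon and a cheap analytic function standing in for the COMSOL evaluation; the function, bounds, and starting point are invented.

```python
import numpy as np
from scipy.optimize import minimize

def run_simulation(params):
    """Stand-in for the COMSOL call made through the LiveLink API;
    a cheap analytic function replaces the multiphysics model."""
    x, y = params
    return (x - 1.0) ** 2 + 10.0 * (y - x ** 2) ** 2

def objective(params):
    return run_simulation(params)   # J evaluated by the 'simulation'

# Analogue of fmincon: bound-constrained minimization from an initial guess.
x0 = np.array([0.0, 0.0])
res = minimize(objective, x0, method="L-BFGS-B", bounds=[(-2, 2), (-2, 2)])
print(res.x, res.fun, res.status)   # optimized parameters, J, exit flag
```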
NASA Astrophysics Data System (ADS)
Dong, Bo-Qing; Jia, Yan; Li, Jingna; Wu, Jiahong
2018-05-01
This paper focuses on a system of the 2D magnetohydrodynamic (MHD) equations with the kinematic dissipation given by the fractional operator (-Δ)^α and the magnetic diffusion by a partial Laplacian. We are able to show that this system with any α > 0 always possesses a unique global smooth solution when the initial data is sufficiently smooth. In addition, we make a detailed study on the large-time behavior of these smooth solutions and obtain optimal large-time decay rates. Since the magnetic diffusion is only partial here, some classical tools such as the maximal regularity property for the 2D heat operator can no longer be applied. A key observation on the structure of the MHD equations allows us to get around the difficulties due to the lack of full Laplacian magnetic diffusion. The results presented here are the sharpest on the global regularity problem for the 2D MHD equations with only partial magnetic diffusion.
Global Optimization of Low-Thrust Interplanetary Trajectories Subject to Operational Constraints
NASA Technical Reports Server (NTRS)
Englander, Jacob A.; Vavrina, Matthew A.; Hinckley, David
2016-01-01
Low-thrust interplanetary space missions are highly complex and there can be many locally optimal solutions. While several techniques exist to search for globally optimal solutions to low-thrust trajectory design problems, they are typically limited to unconstrained trajectories. The operational design community in turn has largely avoided using such techniques and has primarily focused on accurate constrained local optimization combined with grid searches and intuitive design processes, at the expense of efficient exploration of the global design space. This work is an attempt to bridge the gap between the global optimization and operational design communities by presenting a mathematical framework for global optimization of low-thrust trajectories subject to complex constraints, including the targeting of planetary landing sites, a solar range constraint to simplify the thermal design of the spacecraft, and a real-world multi-thruster electric propulsion system that must switch thrusters on and off as available power changes over the course of a mission.
NASA Technical Reports Server (NTRS)
Meyn, Larry A.
2018-01-01
One of the goals of NASA's Revolutionary Vertical Lift Technology Project (RVLT) is to provide validated tools for multidisciplinary design, analysis and optimization (MDAO) of vertical lift vehicles. As part of this effort, the software package, RotorCraft Optimization Tools (RCOTOOLS), is being developed to facilitate incorporating key rotorcraft conceptual design codes into optimizations using the OpenMDAO multidisciplinary optimization framework written in Python. RCOTOOLS, also written in Python, currently supports the incorporation of the NASA Design and Analysis of RotorCraft (NDARC) vehicle sizing tool and the Comprehensive Analytical Model of Rotorcraft Aerodynamics and Dynamics II (CAMRAD II) analysis tool into OpenMDAO-driven optimizations. Both of these tools use detailed, file-based inputs and outputs, so RCOTOOLS provides software wrappers to update input files with new design variable values, execute these codes and then extract specific response variable values from the file outputs. These wrappers are designed to be flexible and easy to use. RCOTOOLS also provides several utilities to aid in optimization model development, including Graphical User Interface (GUI) tools for browsing input and output files in order to locate the text strings that identify specific variables as optimization input and response variables. This paper provides an overview of RCOTOOLS and its use.
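The wrapper pattern described above can be sketched generically; the snippet below is a hypothetical file-based wrapper in the spirit of RCOTOOLS, not its actual API - the template placeholders, executable name, and regular expression in the usage comment are all assumptions.

```python
import re
import subprocess
from pathlib import Path

def run_filebased_code(template, exe, design_vars, output_file, response_pattern):
    """Substitute design-variable values into an input file, run the
    analysis code, and pull a response value out of its text output."""
    text = Path(template).read_text()
    for name, value in design_vars.items():
        text = text.replace(f"{{{name}}}", str(value))   # e.g. "{rotor_radius}"
    Path("case.in").write_text(text)
    subprocess.run([exe, "case.in"], check=True)          # execute the code
    out = Path(output_file).read_text()
    match = re.search(response_pattern, out)              # locate the response
    return float(match.group(1))

# Hypothetical usage (names and pattern invented):
# gross_weight = run_filebased_code("sizing_template.in", "./sizing_code",
#                                   {"rotor_radius": 8.2}, "sizing.out",
#                                   r"GROSS WEIGHT\s*=\s*([0-9.]+)")
```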
A genetic algorithm approach in interface and surface structure optimization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Jian
The thesis is divided into two parts. In the first part, a global optimization method is developed for interface and surface structure optimization. Two prototype systems are chosen for study: Si[001] symmetric tilted grain boundaries and the Ag/Au-induced Si(111) surface. It is found that the Genetic Algorithm is very efficient in finding lowest-energy structures in both cases. Not only can structures seen in experiments be reproduced, but many new structures can be predicted using the Genetic Algorithm. It is thus shown that the Genetic Algorithm is an extremely powerful tool for materials structure prediction. The second part of the thesis is devoted to the explanation of an experimental observation of thermal radiation from three-dimensional tungsten photonic crystal structures. The experimental results seemed astounding and confusing, yet the theoretical models in the paper revealed the physical insight behind the phenomena and reproduced the experimental results well.
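A minimal genetic-algorithm skeleton of the kind used for such searches, with the Rastrigin function as a stand-in for a structure's total-energy evaluation; the population size, operators, and real-vector encoding are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(42)

def energy(x):
    """Stand-in for a DFT/empirical total-energy evaluation of a structure
    encoded as a real vector (atomic coordinates, site occupancies, ...)."""
    return np.sum(x ** 2 - 10.0 * np.cos(2.0 * np.pi * x)) + 10.0 * x.size

def genetic_minimize(dim=6, pop_size=40, generations=200, mut_sigma=0.3):
    pop = rng.uniform(-5.12, 5.12, size=(pop_size, dim))
    for _ in range(generations):
        fitness = np.array([energy(ind) for ind in pop])
        parents = pop[np.argsort(fitness)[: pop_size // 2]]  # truncation selection
        cut = rng.integers(1, dim, size=pop_size // 2)       # one-point crossover
        mates = parents[rng.permutation(len(parents))]
        children = np.where(np.arange(dim) < cut[:, None], parents, mates)
        children += rng.normal(0.0, mut_sigma, children.shape)  # mutation
        pop = np.vstack([parents, children])                 # elitism: keep parents
    best = min(pop, key=energy)
    return best, energy(best)

print(genetic_minimize())
```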
I-FORCAST: Rapid Flight Planning Tool
NASA Technical Reports Server (NTRS)
Oaida, Bogdan; Khan, Mohammed; Mercury, Michael B.
2012-01-01
I-FORCAST (Instrument - Field of Regard Coverage Analysis and Simulation Tool) is a flight planning tool specifically designed for quickly verifying the feasibility and estimating the cost of airborne remote sensing campaigns. Flights are simulated by being broken into three predefined routing algorithms as necessary: mapping in a snaking pattern, mapping the area around a point target (like a volcano) with a star pattern, and mapping the area between a list of points. The tool has been used to plan missions for radar, lidar, and in-situ atmospheric measuring instruments for a variety of aircraft. It has also been used for global and regional scale campaigns and automatically includes landings when refueling is required. The software has been compared to known commercial aircraft route travel times, as well as a UAVSAR (Uninhabited Aerial Vehicle Synthetic Aperture Radar) campaign, and was within 15% of the actual flight time. Most of the discrepancy is due to non-optimal flight paths taken by actual aircraft to avoid restricted airspace and to follow landing and take-off corridors.
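The snaking (boustrophedon) routing mentioned above can be sketched as a simple waypoint generator; the rectangle, swath width, and flat x/y coordinate convention below are invented for illustration and ignore great-circle geometry.

```python
def snake_waypoints(x_min, x_max, y_min, y_max, swath_width):
    """Generate a boustrophedon ('snaking') mapping pattern over a rectangle,
    with track spacing set by the instrument swath width."""
    waypoints, y, heading_east = [], y_min, True
    while y <= y_max:
        row = [(x_min, y), (x_max, y)]
        waypoints.extend(row if heading_east else row[::-1])
        y += swath_width                  # step over by one swath
        heading_east = not heading_east   # reverse direction each pass
    return waypoints

# Example: a 100 km x 40 km box mapped with a 10 km swath.
for wp in snake_waypoints(0.0, 100.0, 0.0, 40.0, 10.0):
    print(wp)
```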
Sghaier, W; Hergon, E; Desroches, A
2015-08-01
Risk management is a fundamental component of any successful company, whether in its economic, societal or environmental aspects. It is an especially important activity for companies for which the challenge of ensuring optimal security of products and services is great, as is especially the case for health sector institutions. Risk management is therefore a decision support tool and a means to ensure the sustainability of an organization. In this context, what methods and approaches are implemented to manage the risks? Through this state of the art, we are interested in the concept of risk and in risk management processes. Then we focus on the different methods of risk management and the criteria for choosing among these methods. Finally, we highlight the need to supplement these methods with a systemic and global approach, including risk assessment by audits.
A new graph-based method for pairwise global network alignment
Klau, Gunnar W
2009-01-01
Background In addition to component-based comparative approaches, network alignments provide the means to study conserved network topology such as common pathways and more complex network motifs. Yet, unlike in classical sequence alignment, the comparison of networks becomes computationally more challenging, as most meaningful assumptions instantly lead to NP-hard problems. Most previous algorithmic work on network alignments is heuristic in nature. Results We introduce the graph-based maximum structural matching formulation for pairwise global network alignment. We relate the formulation to previous work and prove NP-hardness of the problem. Based on the new formulation we build upon recent results in computational structural biology and present a novel Lagrangian relaxation approach that, in combination with a branch-and-bound method, computes provably optimal network alignments. The Lagrangian algorithm alone is a powerful heuristic method, which produces solutions that are often near-optimal and – unlike those computed by pure heuristics – come with a quality guarantee. Conclusion Computational experiments on the alignment of protein-protein interaction networks and on the classification of metabolic subnetworks demonstrate that the new method is reasonably fast and has advantages over pure heuristics. Our software tool is freely available as part of the LISA library. PMID:19208162
Energy efficiency drives the global seasonal distribution of birds.
Somveille, Marius; Rodrigues, Ana S L; Manica, Andrea
2018-06-01
The uneven distribution of biodiversity on Earth is one of the most general and puzzling patterns in ecology. Many hypotheses have been proposed to explain it, based on evolutionary processes or on constraints related to geography and energy. However, previous studies investigating these hypotheses have been largely descriptive due to the logistical difficulties of conducting controlled experiments on such large geographical scales. Here, we use bird migration - the seasonal redistribution of approximately 15% of bird species across the world - as a natural experiment for testing the species-energy relationship, the hypothesis that animal diversity is driven by energetic constraints. We develop a mechanistic model of bird distributions across the world, and across seasons, based on simple ecological and energetic principles. Using this model, we show that bird species distributions optimize the balance between energy acquisition and energy expenditure while taking into account competition with other species. These findings support, and provide a mechanistic explanation for, the species-energy relationship. The findings also provide a general explanation of migration as a mechanism that allows birds to optimize their energy budget in the face of seasonality and competition. Finally, our mechanistic model provides a tool for predicting how ecosystems will respond to global anthropogenic change.
Fractional Gaussian model in global optimization
NASA Astrophysics Data System (ADS)
Dimri, V. P.; Srivastava, R. P.
2009-12-01
The earth system is inherently non-linear, and it can be characterized well if we incorporate non-linearity in the formulation and solution of the problem. The general tool often used for characterization of the earth system is inversion. Traditionally, inverse problems are solved using least-squares-based inversion after linearizing the formulation. The initial model in such inversion schemes is often assumed to follow a Gaussian probability distribution. It is now well established that most physical properties of the earth follow a power law (fractal distribution). Thus, selecting the initial model from a power-law probability distribution will provide a more realistic solution. We present a new method which can draw samples of the posterior probability density function very efficiently using fractal-based statistics. The application of the method is demonstrated by inverting band-limited seismic data with well control. We used a fractal-based probability density function, parameterized by the mean, variance and Hurst coefficient of the model space, to draw the initial model. This initial model is then used in a global optimization inversion scheme. Inversion results using initial models generated by our method give higher-resolution estimates of the model parameters than the hitherto-used gradient-based linear inversion method.
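A hedged sketch of drawing a fractal (power-law) initial model by spectral synthesis: the spectrum is shaped as P(f) ~ f^-(2H+1), one common convention for fBm-like profiles that may differ from the authors' parameterization, and the trace is then rescaled to a target mean and standard deviation (values invented).

```python
import numpy as np

def fractal_model(n, hurst, mean, std, seed=0):
    """Draw a candidate earth model whose spectrum follows a power law,
    then rescale to the target mean/variance. A simple spectral-synthesis
    stand-in for a fractal-statistics sampler."""
    rng = np.random.default_rng(seed)
    freqs = np.fft.rfftfreq(n)
    freqs[0] = freqs[1]                              # avoid divide-by-zero at DC
    amplitude = freqs ** (-(2.0 * hurst + 1.0) / 2.0)
    phases = rng.uniform(0.0, 2.0 * np.pi, freqs.size)
    spectrum = amplitude * np.exp(1j * phases)       # random-phase synthesis
    model = np.fft.irfft(spectrum, n)
    model = (model - model.mean()) / model.std()
    return mean + std * model

# e.g. an acoustic-impedance trace with Hurst coefficient 0.7:
trace = fractal_model(512, hurst=0.7, mean=6.5e6, std=4.0e5)
```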
Global Snow from Space: Development of a Satellite-based, Terrestrial Snow Mission Planning Tool
NASA Astrophysics Data System (ADS)
Forman, B. A.; Kumar, S.; LeMoigne, J.; Nag, S.
2017-12-01
A global, satellite-based, terrestrial snow mission planning tool is proposed to help inform experimental mission design with relevance to snow depth and snow water equivalent (SWE). The idea leverages the capabilities of NASA's Land Information System (LIS) and the Tradespace Analysis Tool for Constellations (TAT-C) to harness the information content of Earth science mission data across a suite of hypothetical sensor designs, orbital configurations, data assimilation algorithms, and optimization and uncertainty techniques, including cost estimates and risk assessments of each hypothetical permutation. One objective of the proposed observing system simulation experiment (OSSE) is to assess the complementary - or perhaps contradictory - information content derived from the simultaneous collection of passive microwave (radiometer), active microwave (radar), and LIDAR observations from space-based platforms. The integrated system will enable a true end-to-end OSSE that can help quantify the value of observations based on their utility towards both scientific research and applications as well as to better guide future mission design. Science and mission planning questions addressed as part of this concept include: What observational records are needed (in space and time) to maximize terrestrial snow experimental utility? How might observations be coordinated (in space and time) to maximize this utility? What is the additional utility associated with an additional observation? How can future mission costs be minimized while ensuring Science requirements are fulfilled?
Towards the Development of a Global, Satellite-Based, Terrestrial Snow Mission Planning Tool
NASA Technical Reports Server (NTRS)
Forman, Bart; Kumar, Sujay; Le Moigne, Jacqueline; Nag, Sreeja
2017-01-01
A global, satellite-based, terrestrial snow mission planning tool is proposed to help inform experimental mission design with relevance to snow depth and snow water equivalent (SWE). The idea leverages the capabilities of NASA's Land Information System (LIS) and the Tradespace Analysis Tool for Constellations (TAT-C) to harness the information content of Earth science mission data across a suite of hypothetical sensor designs, orbital configurations, data assimilation algorithms, and optimization and uncertainty techniques, including cost estimates and risk assessments of each hypothetical permutation. One objective of the proposed observing system simulation experiment (OSSE) is to assess the complementary or perhaps contradictory information content derived from the simultaneous collection of passive microwave (radiometer), active microwave (radar), and LIDAR observations from space-based platforms. The integrated system will enable a true end-to-end OSSE that can help quantify the value of observations based on their utility towards both scientific research and applications as well as to better guide future mission design. Science and mission planning questions addressed as part of this concept include: What observational records are needed (in space and time) to maximize terrestrial snow experimental utility? How might observations be coordinated (in space and time) to maximize this utility? What is the additional utility associated with an additional observation? How can future mission costs be minimized while ensuring Science requirements are fulfilled?
Global Futures: The Emerging Scenario.
ERIC Educational Resources Information Center
Seth, Satish C.
1983-01-01
Acknowledging global interdependence, especially in economics, may be the most important step toward resolving international conflicts. Describes seven major global dangers and gives scenarios for exploring likely global futures. As "tools of prescription" these global models are inadequate, but as "tools of analysis" they have…
NASA Astrophysics Data System (ADS)
Fuchs, Christian; Poulenard, Sylvain; Perlot, Nicolas; Riedi, Jerome; Perdigues, Josep
2017-02-01
Optical satellite communications play an increasingly important role in a number of space applications. However, if the system concept includes optical links to the surface of the Earth, the limited availability due to clouds and other atmospheric impacts needs to be considered to give a reliable estimate of the system performance. An optical ground station (OGS) network is required to increase the availability to acceptable figures. In order to realistically estimate the performance and achievable throughput in various scenarios, a simulation tool has been developed under ESA contract. The tool is based on a database of 5 years of cloud data with global coverage and can thus easily simulate different optical ground station network topologies for LEO- and GEO-to-ground links. Further parameters, such as limited availability due to sun blinding and atmospheric turbulence, are considered as well. This paper gives an overview of the simulation tool, the cloud database, and the modelling behind the simulation scheme. Several scenarios have been investigated: LEO-to-ground links, GEO feeder links, and GEO relay links. The key results of the optical ground station network optimization and throughput estimations will be presented. The implications of key technical parameters, such as the memory size aboard the satellite, will be discussed. Finally, potential system designs for LEO- and GEO-systems will be presented.
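To illustrate why an OGS network raises availability, here is a toy computation that assumes statistically independent cloud cover between stations (real studies, including the one above, must model spatial cloud correlation from the database); the site names and clear-sky probabilities are invented.

```python
from itertools import combinations

# Illustrative single-site cloud-free probabilities (not from the ESA database).
p_clear = {"Tenerife": 0.72, "Calar Alto": 0.68, "Nemea": 0.65, "Matera": 0.60}

def network_availability(sites, probs):
    """Availability if cloud cover were independent between sites:
    the link works when at least one station is cloud-free."""
    p_all_cloudy = 1.0
    for s in sites:
        p_all_cloudy *= 1.0 - probs[s]
    return 1.0 - p_all_cloudy

# Compare every 2-station subset of the candidate network.
for pair in combinations(p_clear, 2):
    print(pair, round(network_availability(pair, p_clear), 3))
```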
Pigache, Francois; Messine, Frédéric; Nogarede, Bertrand
2007-07-01
This paper deals with a deterministic and rational way to design piezoelectric transformers in radial mode. The proposed approach is based on the study of the inverse problem of design and on its reformulation as a mixed constrained global optimization problem. The methodology relies on the association of the analytical models for describing the corresponding optimization problem and on an exact global optimization software, named IBBA and developed by the second author to solve it. Numerical experiments are presented and compared in order to validate the proposed approach.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Durfee, Justin David; Frazier, Christopher Rawls; Bandlow, Alisa
This document describes the final software design of the Contingency Contractor Optimization Tool - Prototype. Its purpose is to provide the overall architecture of the software and the logic behind this architecture. Documentation for the individual classes is provided in the application Javadoc. The Contingency Contractor Optimization project is intended to address Department of Defense mandates by delivering a centralized strategic planning tool that allows senior decision makers to quickly and accurately assess the impacts, risks, and mitigation strategies associated with utilizing contract support. The Contingency Contractor Optimization Tool - Prototype was developed in Phase 3 of the OSD ATL Contingency Contractor Optimization project to support strategic planning for contingency contractors. The planning tool uses a model to optimize the Total Force mix by minimizing the combined total costs for selected mission scenarios. The model optimizes the match of personnel types (military, DoD civilian, and contractors) and capabilities to meet mission requirements as effectively as possible, based on risk, cost, and other requirements.
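The Total Force mix optimization can be sketched as a small linear program; the costs, capability coefficients, and demands below are invented for illustration, and the prototype's actual model is far richer.

```python
import numpy as np
from scipy.optimize import linprog

# Toy force mix: counts of military, DoD civilian, and contractor personnel
# chosen to cover two capability requirements at minimum cost.
cost = np.array([120.0, 90.0, 100.0])          # cost per person-year, $k (invented)
capability = np.array([[1.0, 1.0, 1.0],        # capability A supplied per person
                       [0.5, 1.0, 0.8]])       # capability B supplied per person
demand = np.array([300.0, 220.0])

res = linprog(c=cost,
              A_ub=-capability, b_ub=-demand,  # coverage >= demand
              bounds=[(0, None)] * 3)
print(res.x, res.fun)                          # optimal mix and total cost
```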
Clinical microbiology informatics.
Rhoads, Daniel D; Sintchenko, Vitali; Rauch, Carol A; Pantanowitz, Liron
2014-10-01
The clinical microbiology laboratory has responsibilities ranging from characterizing the causative agent in a patient's infection to helping detect global disease outbreaks. All of these processes are increasingly becoming partnered more intimately with informatics. Effective application of informatics tools can increase the accuracy, timeliness, and completeness of microbiology testing while decreasing the laboratory workload, which can lead to optimized laboratory workflow and decreased costs. Informatics is poised to be increasingly relevant in clinical microbiology, with the advent of total laboratory automation, complex instrument interfaces, electronic health records, clinical decision support tools, and the clinical implementation of microbial genome sequencing. This review discusses the diverse informatics aspects that are relevant to the clinical microbiology laboratory, including the following: the microbiology laboratory information system, decision support tools, expert systems, instrument interfaces, total laboratory automation, telemicrobiology, automated image analysis, nucleic acid sequence databases, electronic reporting of infectious agents to public health agencies, and disease outbreak surveillance. The breadth and utility of informatics tools used in clinical microbiology have made them indispensable to contemporary clinical and laboratory practice. Continued advances in technology and development of these informatics tools will further improve patient and public health care in the future.
Bankruptcy Prevention: New Effort to Reflect on Legal and Social Changes.
Kliestik, Tomas; Misankova, Maria; Valaskova, Katarina; Svabova, Lucia
2018-04-01
Every corporation has an economic and moral responsibility to its stockholders to perform well financially. However, the number of bankruptcies in Slovakia has been growing for several years without an apparent macroeconomic cause. To prevent rapid deterioration and the outflow of foreign capital, various efforts are being zealously implemented. Robust analysis using conventional bankruptcy prediction tools revealed that the existing models are adaptable to local conditions, particularly local legislation. Furthermore, it was confirmed that most of these outdated tools have sufficient capability to warn of impending financial problems several years in advance. A novel bankruptcy prediction tool that outperforms the conventional models was developed. However, it is increasingly challenging to predict bankruptcy risk as corporations have become more global and more complex and as they have developed sophisticated schemes to hide their actual situations under the guise of "optimization" for tax authorities. Nevertheless, scepticism remains because economic engineers have established bankruptcy as a strategy to limit the liability resulting from court-imposed penalties.
NASA Astrophysics Data System (ADS)
Lu, Yanrong; Liao, Fucheng; Deng, Jiamei; Liu, Huiyang
2017-09-01
This paper investigates the cooperative global optimal preview tracking problem of linear multi-agent systems under the assumption that the output of a leader is a previewable periodic signal and the topology graph contains a directed spanning tree. First, a type of distributed internal model is introduced, and the cooperative preview tracking problem is converted to a global optimal regulation problem of an augmented system. Second, an optimal controller, which can guarantee the asymptotic stability of the augmented system, is obtained by means of standard linear quadratic optimal preview control theory. Third, on the basis of proving the existence conditions of the controller, sufficient conditions are given for the original problem to be solvable, and a cooperative global optimal controller with error integral and preview compensation is derived. Finally, the validity of the theoretical results is demonstrated by a numerical simulation.
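The linear quadratic core of such designs can be sketched with SciPy; this shows only the standard discrete-time LQR gain obtained from the Riccati equation, omitting the paper's distributed internal model, error integral, and preview compensation, and the matrices are illustrative.

```python
import numpy as np
from scipy.linalg import solve_discrete_are

# Double-integrator-like agent dynamics (invented for illustration).
A = np.array([[1.0, 0.1],
              [0.0, 1.0]])
B = np.array([[0.0],
              [0.1]])
Q = np.diag([10.0, 1.0])   # state weighting
R = np.array([[1.0]])      # input weighting

P = solve_discrete_are(A, B, Q, R)                   # Riccati solution
K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)    # optimal feedback gain
print(K)  # control law u[k] = -K x[k]
```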
Dispositional optimism and sleep quality: a test of mediating pathways.
Uchino, Bert N; Cribbet, Matthew; de Grey, Robert G Kent; Cronan, Sierra; Trettevik, Ryan; Smith, Timothy W
2017-04-01
Dispositional optimism has been related to beneficial influences on physical health outcomes. However, its links to global sleep quality and the psychological mediators responsible for such associations are less studied. This study thus examined if trait optimism predicted global sleep quality, and if measures of subjective well-being were statistical mediators of such links. A community sample of 175 participants (93 men, 82 women) completed measures of trait optimism, depression, and life satisfaction. Global sleep quality was assessed using the Pittsburgh Sleep Quality Index. Results indicated that trait optimism was a strong predictor of better PSQI global sleep quality. Moreover, this association was mediated by depression and life satisfaction in both single and multiple mediator models. These results highlight the importance of optimism for the restorative process of sleep, as well as the utility of multiple mediator models in testing distinct psychological pathways.
Non-adaptive and adaptive hybrid approaches for enhancing water quality management
NASA Astrophysics Data System (ADS)
Kalwij, Ineke M.; Peralta, Richard C.
2008-09-01
Summary: Using optimization to help solve groundwater management problems cost-effectively is becoming increasingly important. Hybrid optimization approaches, that combine two or more optimization algorithms, will become valuable and common tools for addressing complex nonlinear hydrologic problems. Hybrid heuristic optimizers have capabilities far beyond those of a simple genetic algorithm (SGA), and are continuously improving. SGAs having only parent selection, crossover, and mutation are inefficient and rarely used for optimizing contaminant transport management. Even an advanced genetic algorithm (AGA) that includes elitism (to emphasize using the best strategies as parents) and healing (to help assure optimal strategy feasibility) is undesirably inefficient. Much more efficient than an AGA is the presented hybrid (AGCT), which adds comprehensive tabu search (TS) features to an AGA. TS mechanisms (TS probability, tabu list size, search coarseness and solution space size, and a TS threshold value) force the optimizer to search portions of the solution space that yield superior pumping strategies, and to avoid reproducing similar or inferior strategies. An AGCT characteristic is that TS control parameters are unchanging during optimization. However, TS parameter values that are ideal for optimization commencement can be undesirable when nearing assumed global optimality. The second presented hybrid, termed global converger (GC), is significantly better than the AGCT. GC includes AGCT plus feedback-driven auto-adaptive control that dynamically changes TS parameters during run-time. Before comparing AGCT and GC, we empirically derived scaled dimensionless TS control parameter guidelines by evaluating 50 sets of parameter values for a hypothetical optimization problem. For the hypothetical area, AGCT optimized both well locations and pumping rates. The parameters are useful starting values because using trial-and-error to identify an ideal combination of control parameter values for a new optimization problem can be time consuming. For comparison, AGA, AGCT, and GC are applied to optimize pumping rates for assumed well locations of a complex large-scale contaminant transport and remediation optimization problem at the Blaine Naval Ammunition Depot (NAD). Both hybrid approaches converged more closely to the optimal solution than the non-hybrid AGA. GC averaged 18.79% better convergence than AGCT, and 31.9% better than AGA, within the same computation time (12.5 days). AGCT averaged 13.1% better convergence than AGA. The GC can significantly reduce the burden of employing computationally intensive hydrologic simulation models within a limited time period and for real-world optimization problems. Although demonstrated for a groundwater quality problem, it is also applicable to other arenas, such as managing salt water intrusion and surface water contaminant loading.
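A compact sketch of the AGCT idea only - a genetic algorithm with elitism plus a tabu list that rejects candidate strategies too similar to recently kept ones; the objective function, distance threshold, and control parameters are invented stand-ins for the transport-simulation-based evaluation, and the GC's adaptive parameter control is not shown.

```python
import numpy as np

rng = np.random.default_rng(7)

def cost(x):
    # Stand-in for an expensive transport simulation scoring a pumping strategy.
    return np.sum((x - 1.5) ** 2) + 2.0 * np.sin(5.0 * x).sum()

def is_tabu(x, tabu, radius):
    # Reject candidates too similar to recently kept strategies.
    return any(np.linalg.norm(x - t) < radius for t in tabu)

def ga_with_tabu(dim=4, pop=30, gens=150, tabu_size=25, radius=0.2):
    X = rng.uniform(-3.0, 3.0, (pop, dim))
    tabu = []
    for _ in range(gens):
        f = np.array([cost(x) for x in X])
        elite = X[np.argsort(f)[: pop // 3]]             # elitism: keep best third
        tabu = (tabu + list(elite))[-tabu_size:]         # bounded tabu list
        children, attempts = [], 0
        while len(children) < pop - len(elite) and attempts < 20 * pop:
            attempts += 1
            a, b = elite[rng.integers(len(elite), size=2)]
            child = 0.5 * (a + b) + rng.normal(0.0, 0.3, dim)  # crossover + mutation
            if not is_tabu(child, tabu, radius):
                children.append(child)
        while len(children) < pop - len(elite):          # fallback if tabu too strict
            children.append(rng.uniform(-3.0, 3.0, dim))
        X = np.vstack([elite, children])
    return min(X, key=cost)

print(ga_with_tabu())
```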
Joint Meteorological Statistics of Observing Sites for the Event Horizon Telescope
NASA Astrophysics Data System (ADS)
Lope Córdova Rosado, Rodrigo Eduardo; Doeleman, Sheperd; Paine, Scott; Johnson, Michael; Event Horizon Telescope (EHT)
2018-01-01
The Event Horizon Telescope (EHT) aims to resolve the general relativistic shadow of Sgr A*, the supermassive black hole at the center of our galaxy, via Very Long Baseline Interferometry (VLBI) measurements with a multinational array of radio observatories. In order to optimize the scheduling of future observations, we have developed tools to model the atmospheric opacity at each EHT site using the past 10 years of Global Forecast System (GFS) data describing the atmospheric state. These tools allow us to determine the ideal observing windows for EHT observations and to assess the suitability and impact of new EHT sites. We describe our modeling framework, compare our models to in-situ measurements at EHT sites, and discuss the implications of weather limitations for planned extensions of the EHT to higher frequencies, as well as additional sites and observation windows.
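The joint-statistics idea can be illustrated with a toy computation: given per-site opacity time series (here random draws standing in for the GFS-derived archive), estimate single-site and simultaneous usability fractions under an assumed opacity threshold; all numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical 6-hourly 225 GHz opacities per site over 10 years.
sites = {"ALMA": 0.06, "SMA": 0.12, "LMT": 0.15, "SPT": 0.05}
tau = {s: rng.gamma(2.0, scale / 2.0, size=10 * 365 * 4)
       for s, scale in sites.items()}

threshold = 0.1   # usable-for-VLBI opacity cut (illustrative)
usable = {s: (t < threshold) for s, t in tau.items()}

# Fraction of epochs when every site is simultaneously usable - the joint
# statistic that drives observing-window selection for a VLBI array.
joint = np.logical_and.reduce(list(usable.values()))
print({s: round(float(u.mean()), 3) for s, u in usable.items()},
      "joint:", round(float(joint.mean()), 3))
```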
Vaccine Design: Emerging Concepts and Renewed Optimism
Grimm, Sebastian K.; Ackerman, Margaret E.
2013-01-01
Arguably, vaccination represents the single most effective medical intervention ever developed. Yet, vaccines have failed to provide any or adequate protection against some of the most significant global diseases. The pathogens responsible for these vaccine-recalcitrant diseases have properties that allow them to evade immune surveillance and misdirect or eliminate the immune response. However, genomic and systems biology tools, novel adjuvants and delivery systems, and refined molecular insight into protective immunity have started to redefine the landscape, and results from recent efficacy trials of HIV and malaria vaccines have instilled hope that another golden age of vaccines may be on the horizon. PMID:23474232
Characterization of technical surfaces by structure function analysis
NASA Astrophysics Data System (ADS)
Kalms, Michael; Kreis, Thomas; Bergmann, Ralf B.
2018-03-01
The structure function is a tool for characterizing technical surfaces that exhibits a number of advantages over Fourier-based analysis methods, and it is optimally suited for analyzing the height distributions of surfaces measured by full-field non-contacting methods. The structure function is thus a useful means to extract global or local criteria such as periodicities, waviness, lay, or roughness in order to analyze and evaluate technical surfaces. After defining the line- and area-structure functions and offering effective procedures for their calculation, this paper presents examples using simulated and measured data of technical surfaces, including aircraft parts.
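A minimal implementation of the line structure function S(tau) = <(h(x+tau) - h(x))^2> for a sampled height profile; the test profile, mixing sinusoidal waviness with random roughness, is invented for illustration.

```python
import numpy as np

def structure_function(h, dx=1.0, max_lag=None):
    """Line structure function S(tau) of a height profile h sampled at
    spacing dx: the mean squared height difference at each lag."""
    n = h.size
    max_lag = max_lag or n // 2
    lags = np.arange(1, max_lag)
    S = np.array([np.mean((h[m:] - h[:-m]) ** 2) for m in lags])
    return lags * dx, S

# Example: waviness plus roughness. The random part makes S plateau near
# 2*sigma^2; the periodic part makes S oscillate with the waviness period.
x = np.arange(2048) * 0.01
h = 0.5 * np.sin(2 * np.pi * x / 2.0) \
    + 0.05 * np.random.default_rng(1).standard_normal(x.size)
taus, S = structure_function(h, dx=0.01)
```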
Parameter Trade Studies For Coherent Lidar Wind Measurements of Wind from Space
NASA Technical Reports Server (NTRS)
Kavaya, Michael J.; Frehlich, Rod G.
2007-01-01
The design of an orbiting wind-profiling lidar requires selection of dozens of lidar, measurement scenario, and mission geometry parameters, in addition to prediction of atmospheric parameters. Typical mission designs do not include a thorough trade optimization of all of these parameters. We report here the integration of a recently published parameterization of coherent lidar wind velocity measurement performance with an orbiting coherent wind lidar computer simulation, and the use of these combined tools to perform some preliminary parameter trades. We use the 2006 NASA Global Wind Observing Sounder mission design as the starting point for the trades.
Falk, Marni J; Shen, Lishuang; Gonzalez, Michael; Leipzig, Jeremy; Lott, Marie T; Stassen, Alphons P M; Diroma, Maria Angela; Navarro-Gomez, Daniel; Yeske, Philip; Bai, Renkui; Boles, Richard G; Brilhante, Virginia; Ralph, David; DaRe, Jeana T; Shelton, Robert; Terry, Sharon F; Zhang, Zhe; Copeland, William C; van Oven, Mannis; Prokisch, Holger; Wallace, Douglas C; Attimonelli, Marcella; Krotoski, Danuta; Zuchner, Stephan; Gai, Xiaowu
2015-03-01
Success rates for genomic analyses of highly heterogeneous disorders can be greatly improved if a large cohort of patient data is assembled to enhance collective capabilities for accurate sequence variant annotation, analysis, and interpretation. Indeed, molecular diagnostics requires the establishment of robust data resources to enable data sharing that informs accurate understanding of genes, variants, and phenotypes. The "Mitochondrial Disease Sequence Data Resource (MSeqDR) Consortium" is a grass-roots effort facilitated by the United Mitochondrial Disease Foundation to identify and prioritize specific genomic data analysis needs of the global mitochondrial disease clinical and research community. A central Web portal (https://mseqdr.org) facilitates the coherent compilation, organization, annotation, and analysis of sequence data from both nuclear and mitochondrial genomes of individuals and families with suspected mitochondrial disease. This Web portal provides users with a flexible and expandable suite of resources to enable variant-, gene-, and exome-level sequence analysis in a secure, Web-based, and user-friendly fashion. Users can also elect to share data with other MSeqDR Consortium members, or even the general public, either by custom annotation tracks or through the use of a convenient distributed annotation system (DAS) mechanism. A range of data visualization and analysis tools are provided to facilitate user interrogation and understanding of genomic, and ultimately phenotypic, data of relevance to mitochondrial biology and disease. Currently available tools for nuclear and mitochondrial gene analyses include an MSeqDR GBrowse instance that hosts optimized mitochondrial disease and mitochondrial DNA (mtDNA) specific annotation tracks, as well as an MSeqDR locus-specific database (LSDB) that curates variant data on more than 1300 genes that have been implicated in mitochondrial disease and/or encode mitochondria-localized proteins. MSeqDR is integrated with a diverse array of mtDNA data analysis tools that are both freestanding and incorporated into an online exome-level dataset curation and analysis resource (GEM.app) that is being optimized to support needs of the MSeqDR community. In addition, MSeqDR supports mitochondrial disease phenotyping and ontology tools, and provides variant pathogenicity assessment features that enable community review, feedback, and integration with the public ClinVar variant annotation resource. A centralized Web-based informed consent process is being developed, with implementation of a Global Unique Identifier (GUID) system to integrate data deposited on a given individual from different sources. Community-based data deposition into MSeqDR has already begun. Future efforts will enhance capabilities to incorporate phenotypic data that enhance genomic data analyses. MSeqDR will fill the existing void in bioinformatics tools and centralized knowledge that are necessary to enable efficient nuclear and mtDNA genomic data interpretation by a range of stakeholders across both clinical diagnostic and research settings. Ultimately, MSeqDR is focused on empowering the global mitochondrial disease community to better define and explore mitochondrial diseases.
Multiple-copy state discrimination: Thinking globally, acting locally
NASA Astrophysics Data System (ADS)
Higgins, B. L.; Doherty, A. C.; Bartlett, S. D.; Pryde, G. J.; Wiseman, H. M.
2011-05-01
We theoretically investigate schemes to discriminate between two nonorthogonal quantum states given multiple copies. We consider a number of state discrimination schemes as applied to nonorthogonal, mixed states of a qubit. In particular, we examine the difference that local and global optimization of local measurements makes to the probability of obtaining an erroneous result, in the regime of finite numbers of copies N, and in the asymptotic limit as N→∞. Five schemes are considered: optimal collective measurements over all copies, locally optimal local measurements in a fixed single-qubit measurement basis, globally optimal fixed local measurements, locally optimal adaptive local measurements, and globally optimal adaptive local measurements. Here an adaptive measurement is one in which the measurement basis can depend on prior measurement results. For each of these measurement schemes we determine the probability of error (for finite N) and the scaling of this error in the asymptotic limit. In the asymptotic limit, it is known analytically (and we verify numerically) that adaptive schemes have no advantage over the optimal fixed local scheme. Here we show moreover that, in this limit, the most naive scheme (locally optimal fixed local measurements) is as good as any noncollective scheme except for states with less than 2% mixture. For finite N, however, the most sophisticated local scheme (globally optimal adaptive local measurements) is better than any other noncollective scheme for any degree of mixture.
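For the collective-measurement benchmark, the minimum error probability is given by the Helstrom bound; a small numerical sketch for N copies of slightly mixed qubit states follows, with all state parameters invented for illustration.

```python
import numpy as np
from functools import reduce

def helstrom_error(rho1, rho2, n_copies, p1=0.5):
    """Minimum (collective-measurement) error probability for discriminating
    rho1^(xN) from rho2^(xN): P_e = (1 - || p1*R1 - p2*R2 ||_1) / 2."""
    R1 = reduce(np.kron, [rho1] * n_copies)
    R2 = reduce(np.kron, [rho2] * n_copies)
    eigs = np.linalg.eigvalsh(p1 * R1 - (1.0 - p1) * R2)
    return 0.5 * (1.0 - np.abs(eigs).sum())   # trace norm via eigenvalues

def qubit(theta, purity):
    """Slightly mixed qubit state along a Bloch direction (illustrative)."""
    psi = np.array([np.cos(theta), np.sin(theta)])
    return purity * np.outer(psi, psi) + (1 - purity) * np.eye(2) / 2

rho1, rho2 = qubit(0.0, 0.95), qubit(np.pi / 8, 0.95)
for N in (1, 2, 4, 6):
    print(N, helstrom_error(rho1, rho2, N))
```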
NASA Astrophysics Data System (ADS)
Martins, T. M.; Kelman, R.; Metello, M.; Ciarlini, A.; Granville, A. C.; Hespanhol, P.; Castro, T. L.; Gottin, V. M.; Pereira, M. V. F.
2015-12-01
The hydroelectric potential of a river is proportional to its head and water flows. Selecting the best development alternative for greenfield watershed projects is a difficult task, since it must balance demands for infrastructure, especially in the developing world where a large potential remains unexplored, with environmental conservation. Discussions usually diverge into antagonistic views, as in recent projects in the Amazon forest, for example. This motivates the construction of a computational tool that will support a more qualified debate regarding development/conservation options. HERA provides the optimal head division partition of a river considering technical, economic and environmental aspects. HERA has three main components: (i) GIS pre-processing of topographic and hydrologic data; (ii) automatic engineering and equipment design and budget estimation for candidate projects; (iii) translation of the division-partition problem into a mathematical programming model. By integrating automatic calculation with geoprocessing tools, cloud computation and optimization techniques, HERA makes it possible for countless head partition alternatives to be intrinsically compared - a great advantage with respect to traditional field surveys followed by engineering design methods. Based on optimization techniques, HERA determines which hydro plants should be built, including location, design, technical data (e.g. water head, reservoir area and volume), engineering design (dam, spillways, etc.) and costs. The results can be visualized in the HERA interface, exported to GIS software, Google Earth or CAD systems. HERA has a global scope of application since the main input data are a Digital Terrain Model and water inflows at gauging stations. The objective is to contribute to an increased rationality of decisions by presenting to the stakeholders a clear and quantitative view of the alternatives, their opportunities and threats.
Nelson, Carl A; Miller, David J; Oleynikov, Dmitry
2008-01-01
As modular systems come into the forefront of robotic telesurgery, streamlining the process of selecting surgical tools becomes an important consideration. This paper presents a method for optimal queuing of tools in modular surgical tool systems, based on patterns in tool-use sequences, in order to minimize time spent changing tools. The solution approach is to model the set of tools as a graph, with tool-change frequency expressed as edge weights in the graph, and to solve the Traveling Salesman Problem for the graph. In a set of simulations, this method has shown superior performance at optimizing tool arrangements for streamlining surgical procedures.
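A brute-force sketch of the idea: treat tools as nodes of a graph weighted by tool-change frequency and pick the path through all tools that maximizes adjacent-use frequency (a path variant of the Traveling Salesman Problem the paper solves); the frequency matrix is invented, and exhaustive search is only viable for the small tool sets typical of modular systems.

```python
import numpy as np
from itertools import permutations

# Symmetric matrix of observed tool-change frequencies: entry [i][j] counts
# how often tool j follows tool i in recorded procedures (invented numbers).
freq = np.array([[0, 8, 1, 3],
                 [8, 0, 6, 1],
                 [1, 6, 0, 5],
                 [3, 1, 5, 0]])

# Queue tools so that frequently consecutive tools sit next to each other:
# maximize summed frequency along a path visiting every tool once.
best_order, best_score = None, -1
for order in permutations(range(freq.shape[0])):
    score = sum(freq[order[i], order[i + 1]] for i in range(len(order) - 1))
    if score > best_score:
        best_order, best_score = order, score

print(best_order, best_score)   # brute force suffices for a handful of tools
```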
NASA Astrophysics Data System (ADS)
Vu, Duy-Duc; Monies, Frédéric; Rubio, Walter
2018-05-01
A large number of studies, based on 3-axis end milling of free-form surfaces, seek to optimize tool path planning. Approaches try to optimize the machining time by reducing the total tool path length while respecting the criterion of the maximum scallop height. Theoretically, the tool path trajectories that remove the most material follow the directions in which the machined width is the largest. The free-form surface is often considered as a single machining area, so optimization over the entire surface is limited: it is difficult to define tool trajectories with optimal feed directions which generate the largest machined widths. Another point limiting how effectively previous approaches reduce machining time is the inadequate choice of the tool, as researchers generally use a spherical tool on the entire surface; the gains proposed by these methods therefore lead to relatively small time savings. This study proposes a new method, using toroidal milling tools, for generating tool paths in different regions of the machined surface. The surface is divided into several regions based on machining intervals. These intervals ensure that the effective radius of the tool, at each cutter-contact point on the surface, is always greater than the radius of the tool in an optimized feed direction. A parallel plane strategy is then used on the sub-surfaces with an optimal specific feed direction for each sub-surface. This method allows one to mill the entire surface with greater efficiency than with a spherical tool. The proposed method is calculated and modeled using Maple software to find optimal regions and feed directions in each region. This new method is tested on a free-form surface. A comparison is made with a spherical cutter to show the significant gains obtained with a toroidal milling cutter. Comparisons with CAM software and experimental validations are also presented. The results show the efficiency of the method.
Integrating NOE and RDC using sum-of-squares relaxation for protein structure determination.
Khoo, Y; Singer, A; Cowburn, D
2017-07-01
We revisit the problem of protein structure determination from geometrical restraints from NMR, using convex optimization. It is well known that the NP-hard distance geometry problem of determining atomic positions from pairwise distance restraints can be relaxed into a convex semidefinite program (SDP). However, the NOE distance restraints are often too imprecise and sparse for accurate structure determination. Residual dipolar coupling (RDC) measurements provide additional geometric information on the angles between atom-pair directions and the axes of the principal-axis frame. The optimization problem involving RDC is highly non-convex and requires a good initialization even within the simulated annealing framework. In this paper, we model the protein backbone as an articulated structure composed of rigid units. Determining the rotation of each rigid unit gives the full protein structure. We propose solving the non-convex optimization problems using the sum-of-squares (SOS) hierarchy, a hierarchy of convex relaxations with increasing complexity and approximation power. Unlike classical global optimization approaches, SOS optimization returns a certificate of optimality if the global optimum is found. Based on the SOS method, we propose two algorithms, RDC-SOS and RDC-NOE-SOS, which have polynomial time complexity in the number of amino-acid residues and run efficiently on a standard desktop. In many instances, the proposed methods exactly recover the solution to the original non-convex optimization problem. To the best of our knowledge, this is the first time an SOS relaxation has been introduced to solve non-convex optimization problems in structural biology. We further introduce a statistical tool, the Cramér-Rao bound (CRB), to provide an information-theoretic bound on the highest resolution one can hope to achieve when determining protein structure from noisy measurements using any unbiased estimator. Our simulation results show that when the RDC measurements are corrupted by Gaussian noise of realistic variance, both SOS-based algorithms attain the CRB. We successfully apply our method in a divide-and-conquer fashion to determine the structure of ubiquitin from experimental NOE and RDC measurements obtained in two alignment media, achieving more accurate and faster reconstructions compared to the current state of the art.
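The classical SDP relaxation of the distance-geometry (NOE-only) subproblem mentioned above can be sketched with CVXPY; this is the convex relaxation the paper builds on, not its SOS hierarchy or RDC terms, the toy coordinates and measured pairs are invented, and recovery is exact only when the relaxation is tight (up to rotation/reflection).

```python
import numpy as np
import cvxpy as cp

# Toy configuration and a sparse set of exact pairwise distances.
pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.4, 0.9], [1.2, 1.1]])
pairs = [(0, 1), (0, 2), (1, 2), (1, 3), (2, 3)]
d = {(i, j): np.linalg.norm(pts[i] - pts[j]) for i, j in pairs}

# Relax the rank-2 Gram matrix G = X X^T to any PSD matrix.
n = len(pts)
G = cp.Variable((n, n), PSD=True)
cons = [cp.sum(G) == 0]   # center the configuration at the origin
cons += [G[i, i] + G[j, j] - 2 * G[i, j] == d[i, j] ** 2 for i, j in pairs]
prob = cp.Problem(cp.Minimize(cp.trace(G)), cons)
prob.solve()

# Recover coordinates from the top eigenvectors of G.
w, V = np.linalg.eigh(G.value)
X = V[:, -2:] * np.sqrt(np.maximum(w[-2:], 0.0))
print(np.round(X, 3))
```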
NASA Astrophysics Data System (ADS)
Macian-Sorribes, Hector; Pulido-Velazquez, Manuel; Tilmant, Amaury
2015-04-01
Stochastic programming methods are better suited to deal with the inherent uncertainty of inflow time series in water resource management. However, one of the most important hurdles to their use in practical implementations is the lack of generalized Decision Support System (DSS) shells, which are usually based on a deterministic approach. The purpose of this contribution is to present a general-purpose DSS shell, named Explicit Stochastic Programming Advanced Tool (ESPAT), able to build and solve stochastic programming problems for most water resource systems. It implements a hydro-economic approach, optimizing the total system benefits as the sum of the benefits obtained by each user. It has been coded using GAMS and implements a Microsoft Excel interface with a GAMS-Excel link that allows the user to introduce the required data and recover the results, so no GAMS skills are required to run the program. The tool is divided into four modules according to its capabilities: 1) the ESPATR module, which performs stochastic optimization procedures in surface water systems using a Stochastic Dual Dynamic Programming (SDDP) approach; 2) the ESPAT_RA module, which optimizes coupled surface-groundwater systems using a modified SDDP approach; 3) the ESPAT_SDP module, capable of performing stochastic optimization procedures in small-size surface systems using a standard SDP approach; and 4) the ESPAT_DET module, which implements a deterministic programming procedure using non-linear programming, able to solve deterministic optimization problems in complex surface-groundwater river basins. The case study of the Mijares river basin (Spain) is used to illustrate the method. It consists of two reservoirs in series, one aquifer and four agricultural demand sites, currently managed using historical (XIV century) rights that give priority to the most traditional irrigation district over the XX century agricultural developments. Its size makes it possible to use either the SDP or the SDDP method. The independent use of surface and groundwater can be examined with and without the aquifer. The ESPAT_DET, ESPATR and ESPAT_SDP modules were executed for the surface system, while the ESPAT_RA and ESPAT_DET modules were run for the surface-groundwater system. The surface system's results show similar performance of the ESPAT_SDP and ESPATR modules, which outperform the current policies while being outperformed by the ESPAT_DET results, which have the advantage of perfect foresight. The surface-groundwater system's results show a robust situation in which the differences between the modules' results and the current policies are smaller, due to the use of pumped groundwater for the XX century crops when surface water is scarce. The results are realistic, with the deterministic optimization outperforming the stochastic one, which in turn outperforms the current policies, showing that the tool is able to stochastically optimize river-aquifer water resource systems. We are currently working on the application of these tools to the analysis of changes in system operation under global change conditions. ACKNOWLEDGEMENT: This study has been partially supported by the IMPADAPT project (CGL2013-48424-C2-1-R) with Spanish MINECO (Ministerio de Economía y Competitividad) funds.
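For readers unfamiliar with the SDP recursion behind a module like ESPAT_SDP, a minimal single-reservoir sketch follows: a backward recursion over discretized storages and inflow scenarios. The benefit function, discretizations and probabilities are illustrative, and this is not the GAMS-based tool itself:

    import numpy as np

    # Toy single-reservoir SDP: maximize expected benefit over T monthly stages.
    T = 12
    storages = np.linspace(0.0, 100.0, 21)     # discretized storage grid
    inflows = np.array([10.0, 20.0, 30.0])     # discretized inflow scenarios
    probs = np.array([0.3, 0.4, 0.3])          # scenario probabilities

    def benefit(release):
        # Toy concave benefit of released water (hypothetical user benefit)
        return np.sqrt(release)

    V = np.zeros(len(storages))                # terminal value function
    for t in range(T):                         # backward recursion
        V_new = np.empty_like(V)
        for i, s in enumerate(storages):
            expected = 0.0
            for q, p in zip(inflows, probs):
                # Water balance: next storage s2 implies release r = s + q - s2
                r = s + q - storages
                values = np.where(r >= 0.0,
                                  benefit(np.clip(r, 0.0, None)) + V, -np.inf)
                expected += p * values.max()
            V_new[i] = expected
        V = V_new
    print(V.round(1))   # expected benefit-to-go vs. initial storage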
Parasail: SIMD C library for global, semi-global, and local pairwise sequence alignments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Daily, Jeffrey A.
2016-02-10
Sequence alignment algorithms are a key component of many bioinformatics applications. Though various fast Smith-Waterman local sequence alignment implementations have been developed for x86 CPUs, most are embedded into larger database search tools. In addition, fast implementations of Needleman-Wunsch global sequence alignment and its semi-global variants are not as widespread. This article presents the first software library for local, global, and semi-global pairwise intra-sequence alignments and improves the performance of previous intra-sequence implementations. As a result, a faster intra-sequence pairwise alignment implementation is described and benchmarked. Using a 375-residue query sequence, a speed of 136 billion cell updates per second (GCUPS) was achieved on a dual Intel Xeon E5-2670 12-core processor system, the highest reported for an implementation based on Farrar's 'striped' approach. When using only a single thread, parasail was 1.7 times faster than Rognes's SWIPE. For many score matrices, parasail is faster than BLAST. The software library is designed for 64-bit Linux, OS X, or Windows on processors with SSE2, SSE4.1, or AVX2. Source code is available from https://github.com/jeffdaily/parasail under the Battelle BSD-style license. In conclusion, applications that require optimal alignment scores could benefit from the improved performance. For the first time, SIMD global, semi-global, and local alignments are available in a stand-alone C library.
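The GCUPS figure above counts dynamic-programming cell updates per second. For reference, a minimal, unoptimized Smith-Waterman scoring recurrence is sketched below (linear gap penalty for brevity; parasail itself implements affine gaps with striped SIMD):

    import numpy as np

    def smith_waterman_score(a, b, match=2, mismatch=-1, gap=-2):
        # Plain O(len(a)*len(b)) Smith-Waterman local alignment score.
        # Each (i, j) iteration is one "cell update" in the GCUPS metric.
        H = np.zeros((len(a) + 1, len(b) + 1), dtype=np.int64)
        best = 0
        for i in range(1, len(a) + 1):
            for j in range(1, len(b) + 1):
                sub = match if a[i - 1] == b[j - 1] else mismatch
                H[i, j] = max(0, H[i - 1, j - 1] + sub,
                              H[i - 1, j] + gap, H[i, j - 1] + gap)
                best = max(best, H[i, j])
        return best

    print(smith_waterman_score("HEAGAWGHEE", "PAWHEAE"))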
Optimization of the propulsion for multistage solid rocket motor launchers
NASA Astrophysics Data System (ADS)
Calabro, M.; Dufour, A.; Macaire, A.
2002-02-01
Some tools focused on a rapid multidisciplinary optimization capability for multistage launch vehicle design were developed at EADS-LV. These tools may be broken down into two categories: those related to propulsion design optimization, and a computer code devoted to trajectory optimization under constraints. Both are linked in order to obtain an optimal vehicle design through an iterative process. After a description of the two categories of tools, an example of application to a small space launcher is given.
Educational Tool for Optimal Controller Tuning Using Evolutionary Strategies
ERIC Educational Resources Information Center
Carmona Morales, D.; Jimenez-Hornero, J. E.; Vazquez, F.; Morilla, F.
2012-01-01
In this paper, an optimal tuning tool is presented for control structures based on multivariable proportional-integral-derivative (PID) control, using genetic algorithms as an alternative to traditional optimization algorithms. From an educational point of view, this tool provides students with the necessary means to consolidate their knowledge on…
Globally optimal trial design for local decision making.
Eckermann, Simon; Willan, Andrew R
2009-02-01
Value of information methods allow decision makers to identify efficient trial designs following a principle of maximizing the expected value to decision makers of information from potential trial designs relative to their expected cost. However, in health technology assessment (HTA) the restrictive assumption has been made that, prospectively, there is expected value of sample information only from research commissioned within the jurisdiction. This paper extends the framework for optimal trial design and decision making within a jurisdiction to allow for optimal trial design across jurisdictions. This is illustrated by identifying an optimal trial design for decision making across the US, the UK and Australia for early versus late external cephalic version for pregnant women presenting in the breech position. The expected net gain from locally optimal trial designs of US$0.72M is shown to increase to US$1.14M with a globally optimal trial design. In general, the proposed method of globally optimal trial design improves on optimal trial design within jurisdictions by: (i) reflecting the global value of non-rival information; (ii) allowing optimal allocation of the trial sample across jurisdictions; and (iii) avoiding the market failure associated with free-rider effects, sub-optimal spreading of fixed costs and heterogeneity of trial information with multiple trials. Copyright (c) 2008 John Wiley & Sons, Ltd.
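To make the expected-net-gain calculation concrete, a minimal single-jurisdiction sketch follows, using the textbook normal-normal (conjugate) model of expected value of sample information. All numbers are illustrative, and the multi-jurisdiction pooling that is the paper's contribution is not modeled:

    import numpy as np

    rng = np.random.default_rng(0)

    mu0, s0 = 500.0, 400.0     # prior mean/sd of incremental net benefit per patient
    sigma = 4000.0             # per-patient sd of the trial outcome
    pop = 100_000              # patients governed by the adoption decision
    cost_per_patient = 2000.0  # marginal cost per enrolled patient

    def expected_net_gain(n, draws=200_000):
        # Normal-normal conjugate model: a two-arm trial with n per arm
        # estimates the difference with variance 2*sigma^2/n, and the
        # pre-posterior mean is normal with sd s0^2 / sqrt(s0^2 + that variance).
        sv2 = 2.0 * sigma ** 2 / n
        tau = s0 ** 2 / np.sqrt(s0 ** 2 + sv2)
        m = rng.normal(mu0, tau, size=draws)       # simulated posterior means
        evsi = pop * (np.maximum(m, 0.0).mean() - max(mu0, 0.0))
        return evsi - 2.0 * n * cost_per_patient   # net of trial cost

    for n in (50, 200, 800):
        print(n, round(expected_net_gain(n)))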
Experimental investigation on ignition schemes of partially covered cavities in a supersonic flow
NASA Astrophysics Data System (ADS)
Cai, Zun; Sun, Mingbo; Wang, Hongbo; Wang, Zhenguo
2016-04-01
In this study, ignition schemes of the partially covered cavity in a scramjet combustor were investigated under inflow conditions of Ma = 2.1 with stagnation pressure P0 = 0.7 MPa and stagnation temperature T0 = 947 K. The results reveal that the ignition scheme of the partially covered cavity has a great impact on the ignition and flame stabilization process. There always exists an optimized global equivalence ratio for a fixed ignition scheme, and the optimized global equivalence ratio for ignition in the partially covered cavity is lower than that of the uncovered cavity. For tandem dual cavities, ignition in the partially covered cavity could be enhanced by optimizing the global equivalence ratio; however, it would be degraded by increasing the global equivalence ratio further. The global equivalence ratio and the jet penetration height are strongly coupled with the combustion flow field. For multi-cavities, fuel injected on the opposite side could hardly be ignited after ignition in the partially covered cavity, even at the optimized global equivalence ratio. It is possible to realize ignition enhancement in the partially covered cavity by optimizing the global equivalence ratio, but this is not beneficial for thrust increment during the steady combustion process.
Global Load Balancing with Parallel Mesh Adaption on Distributed-Memory Systems
NASA Technical Reports Server (NTRS)
Biswas, Rupak; Oliker, Leonid; Sohn, Andrew
1996-01-01
Dynamic mesh adaption on unstructured grids is a powerful tool for efficiently computing unsteady problems to resolve solution features of interest. Unfortunately, this causes load imbalance among processors on a parallel machine. This paper describes the parallel implementation of a tetrahedral mesh adaption scheme and a new global load balancing method. A heuristic remapping algorithm is presented that assigns partitions to processors such that the redistribution cost is minimized. Results indicate that the parallel performance of the mesh adaption code depends on the nature of the adaption region and show a 35.5X speedup on 64 processors of an SP2 when 35% of the mesh is randomly adapted. For large-scale scientific computations, our load balancing strategy gives almost a sixfold reduction in solver execution times over non-balanced loads. Furthermore, our heuristic remapper yields processor assignments that are less than 3% off the optimal solutions but requires only 1% of the computational time.
Global optimization methods for engineering design
NASA Technical Reports Server (NTRS)
Arora, Jasbir S.
1990-01-01
The problem is to find a global minimum for Problem P. Necessary and sufficient conditions are available for local optimality. However, a global solution can be assured only under the assumption of convexity of the problem. If the constraint set S is compact and the cost function is continuous on it, existence of a global minimum is guaranteed. However, since no global optimality conditions are available, a global solution can be found only by an exhaustive search to satisfy the inequality. The exhaustive search can be organized so that the entire design space need not be searched for the solution, which somewhat reduces the computational burden. It is concluded that the zooming algorithm for global optimization appears to be a good alternative to stochastic methods, although more testing is needed and a general, robust, and efficient local minimizer is required. IDESIGN, which is based on a sequential quadratic programming algorithm, was used in all numerical calculations; since the feasible set keeps shrinking, a good algorithm to find an initial feasible point is required. Such algorithms need to be developed and evaluated.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nallasivam, Ulaganathan; Shah, Vishesh H.; Shenvi, Anirudh A.
We present a general Global Minimization Algorithm (GMA) to identify basic or thermally coupled distillation configurations that require the least vapor duty under minimum reflux conditions for separating any ideal or near-ideal multicomponent mixture into a desired number of product streams. In this algorithm, global optimality is guaranteed by modeling the system using Underwood equations and reformulating the resulting constraints to bilinear inequalities. The speed of convergence to the globally optimal solution is increased by using appropriate feasibility- and optimality-based variable-range reduction techniques and by developing valid inequalities. As a result, the GMA can be coupled with already developed techniques that enumerate basic and thermally coupled distillation configurations, to provide, for the first time, a global optimization based rank-list of distillation configurations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lonchampt, J.; Fessart, K.
2013-07-01
The purpose of this paper is to describe the method and tool dedicated to optimizing investment planning for industrial assets. These investments may be preventive maintenance tasks, asset enhancements or logistic investments such as spare parts purchases. The three methodological points to investigate in such an issue are: 1) the measure of the profitability of a portfolio of investments; 2) the selection and planning of an optimal set of investments; and 3) the measure of the risk of a portfolio of investments. The measure of the profitability of a set of investments in the IPOP tool is synthesized in the Net Present Value (NPV) indicator. The NPV is the sum of the differences of discounted cash flows (direct costs, forced outages...) between the situations with and without a given investment. These cash flows are calculated through a pseudo-Markov reliability model representing independently the components of the industrial asset and the spare parts inventories. The component model has been widely discussed over the years, but the spare part model is a new one based on some approximations that will be discussed. This model, referred to as the NPV function, takes an investment portfolio as input and gives its NPV. The second issue is to optimize the NPV. If all investments were independent, this optimization would be an easy calculation; unfortunately, there are two sources of dependency. The first is introduced by the spare part model: while components are indeed independent in their reliability models, the fact that several components use the same inventory induces a dependency. The second dependency comes from economic, technical or logistic constraints, such as a global maintenance budget limit or a safety requirement limiting the residual risk of failure of a component or group of components, making the aggregation of individual optima not necessarily feasible. The algorithm used to solve such a difficult optimization problem is a genetic algorithm. After a description of the features of the software, a test case is presented showing the influence of the optimization algorithm parameters on its efficiency in finding an optimal investment planning. (authors)
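The NPV indicator above is simply the discounted sum of cash-flow differences between the with- and without-investment situations. A minimal sketch, with toy cash flows in place of the paper's pseudo-Markov reliability model:

    def npv(cashflows_with, cashflows_without, rate):
        # Sum of discounted differences between the cash flows of the
        # situations with and without the investment (year-end convention).
        return sum((w - wo) / (1.0 + rate) ** t
                   for t, (w, wo) in enumerate(zip(cashflows_with,
                                                   cashflows_without), 1))

    # Toy example: pay 100 in year 1 to avoid forced-outage losses of
    # 30/year for five years (all figures illustrative).
    with_investment = [-100.0, 0.0, 0.0, 0.0, 0.0]
    without_investment = [-30.0, -30.0, -30.0, -30.0, -30.0]
    print(round(npv(with_investment, without_investment, 0.08), 2))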
DOE Office of Scientific and Technical Information (OSTI.GOV)
Azunre, P.
In this paper, two novel techniques for bounding the solutions of parametric weakly coupled second-order semilinear parabolic partial differential equations are developed. The first provides a theorem to construct interval bounds, while the second provides a theorem to construct lower bounds convex and upper bounds concave in the parameter. The convex/concave bounds can be significantly tighter than the interval bounds because of the wrapping effect suffered by interval analysis in dynamical systems. Both types of bounds are computationally cheap to construct, requiring the solution of auxiliary systems twice and four times larger than the original system, respectively. An illustrative numerical example of bound construction and use for deterministic global optimization within a simple serial branch-and-bound algorithm, implemented numerically using interval arithmetic and a generalization of McCormick's relaxation technique, is presented. Finally, problems within the important class of reaction-diffusion systems may be optimized with these tools.
New efficient optimizing techniques for Kalman filters and numerical weather prediction models
NASA Astrophysics Data System (ADS)
Famelis, Ioannis; Galanis, George; Liakatas, Aristotelis
2016-06-01
The need for accurate local environmental predictions and simulations beyond classical meteorological forecasts has been increasing in recent years due to the great number of applications that are directly or indirectly affected: renewable energy resource assessment, natural hazard early warning systems, and questions on global warming and climate change can be listed among them. Within this framework, the utilization of numerical weather and wave prediction systems, in conjunction with advanced statistical techniques that support the elimination of model bias and the reduction of error variability, may successfully address the above issues. In the present work, new optimization methods are studied and tested in selected areas of Greece where the use of renewable energy sources is of critical importance. The added value of the proposed work lies in the solid mathematical background adopted, making use of information geometry and statistical techniques, new versions of Kalman filters, and state-of-the-art numerical analysis tools.
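The bias-elimination role of the Kalman filter mentioned above can be illustrated with a scalar filter that tracks the slowly varying systematic error of a forecast series. A minimal generic sketch (not the authors' variant; the noise settings are illustrative):

    import numpy as np

    def kalman_bias_correct(forecasts, observations, q=0.01, r=1.0):
        # Track forecast bias b_t with a random-walk state model:
        # b_t = b_{t-1} + w (var q); innovation obs - fc = b_t + v (var r).
        # Returns bias-corrected forecasts.
        b, p = 0.0, 1.0                  # state estimate and its variance
        corrected = []
        for fc, ob in zip(forecasts, observations):
            corrected.append(fc + b)     # correct with the current bias estimate
            p += q                       # predict step
            k = p / (p + r)              # Kalman gain
            b += k * ((ob - fc) - b)     # update with the observed error
            p *= (1.0 - k)
        return np.array(corrected)

    rng = np.random.default_rng(1)
    truth = 10.0 + np.sin(np.linspace(0.0, 6.0, 200))
    fc = truth - 1.5 + rng.normal(0.0, 0.5, 200)   # forecasts with a -1.5 bias
    print(np.mean(truth - kalman_bias_correct(fc, truth)))  # residual bias ~ 0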
Fast globally optimal segmentation of 3D prostate MRI with axial symmetry prior.
Qiu, Wu; Yuan, Jing; Ukwatta, Eranga; Sun, Yue; Rajchl, Martin; Fenster, Aaron
2013-01-01
We propose a novel global optimization approach to segmenting a given 3D prostate T2w magnetic resonance (MR) image, which enforces the inherent axial symmetry of the prostate shape and simultaneously performs a sequence of 2D axial slice-wise segmentations with a global 3D coherence prior. We show that the proposed challenging combinatorial optimization problem can be solved globally and exactly by means of convex relaxation. In this regard, we introduce a novel coupled continuous max-flow model, which is dual to the studied convex relaxed optimization formulation and leads to an efficient multiplier-augmented algorithm based on modern convex optimization theory. Moreover, the new continuous max-flow based algorithm was implemented on GPUs to achieve a substantial improvement in computation. Experimental results using public and in-house datasets demonstrate great advantages of the proposed method in terms of both accuracy and efficiency.
Advances in Optimizing Weather Driven Electric Power Systems.
NASA Astrophysics Data System (ADS)
Clack, C.; MacDonald, A. E.; Alexander, A.; Dunbar, A. D.; Xie, Y.; Wilczak, J. M.
2014-12-01
The importance of weather-driven renewable energies for the United States (and global) energy portfolio is growing. The main perceived problems with weather-driven renewable energies are their intermittent nature, low power density, and high costs. The National Energy with Weather System Simulator (NEWS) is a mathematical optimization tool that allows the construction of weather-driven energy sources that will work in harmony with the needs of the system. For example, it will match the electric load, reduce variability, decrease costs, and abate carbon emissions. One important test run included existing US carbon-free power sources, natural gas power when needed, and a High Voltage Direct Current (HVDC) power transmission network. This study shows that the costs and carbon emissions from an optimally designed national system decrease with geographic size. It shows that, with achievable estimates of wind and solar generation costs, the US could decrease its carbon emissions by up to 80% by the early 2030s without an increase in electric costs. The key requirement would be a 48-state network of HVDC transmission, creating a national market for electricity not possible in the current AC grid. These results were found without the need for storage. Further, we tested the effect of changing natural gas fuel prices on the optimal configuration of the national electric power system. We also carried out an extension to global regions, which shows that the same properties found in the US study extend to the most populous regions of the planet. The extension is a simplified version of the US study and is an area where much more research can be carried out. We compare our results to other model results.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shen, Bo; Abdelaziz, Omar; Shrestha, Som S
Oak Ridge National Laboratory (ORNL) recently conducted extensive laboratory drop-in investigations of lower Global Warming Potential (GWP) refrigerants to replace R-22 and R-410A. ORNL studied propane, DR-3, ARM-20B, N-20B and R-444B as lower-GWP refrigerant replacements for R-22 in a mini-split room air conditioner (RAC) originally designed for R-22, and R-32, DR-55, ARM-71A, and L41-2 in a mini-split RAC designed for R-410A. We obtained laboratory testing results with very good energy balance and nominal measurement uncertainty. Drop-in studies are not enough to judge the overall performance of the alternative refrigerants, since their thermodynamic and transport properties might favor different heat exchanger configurations, e.g., cross-flow, counter-flow, etc. This study compares optimized performances of the individual refrigerants using physics-based system modeling tools. The DOE/ORNL Heat Pump Design Model (HPDM) was used to model the mini-split RACs by inputting detailed heat exchanger geometries, compressor displacement and efficiencies, as well as other relevant system components. The RAC models were calibrated against the lab data for each individual refrigerant. The calibrated models were then used to conduct a design optimization for cooling performance by varying the compressor displacement to match the required capacity, and changing the number of circuits, refrigerant flow direction, tube diameters, and air flow rates in the condenser and evaporator at 100% and 50% cooling capacities. This paper compares the optimized performance results for all alternative refrigerants and highlights the best candidates for R-22 and R-410A replacement.
Acceleration techniques in the univariate Lipschitz global optimization
NASA Astrophysics Data System (ADS)
Sergeyev, Yaroslav D.; Kvasov, Dmitri E.; Mukhametzhanov, Marat S.; De Franco, Angela
2016-10-01
Univariate box-constrained Lipschitz global optimization problems are considered in this contribution. Geometric and information statistical approaches are presented. Novel, powerful local tuning and local improvement techniques are described, as well as the traditional ways to estimate the Lipschitz constant. The advantages of the presented local tuning and local improvement techniques are demonstrated using the operational characteristics approach for comparing deterministic global optimization algorithms on a class of 100 widely used test functions.
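Geometric methods of the kind surveyed above exploit the sawtooth lower bound f(x) >= f(x_k) - L|x - x_k|. A minimal Piyavskii-Shubert-style sketch with a fixed, assumed-known Lipschitz constant follows (the paper's local tuning techniques estimate L adaptively instead):

    import math

    def piyavskii(f, a, b, L, iters=50):
        # Univariate Lipschitz global minimization with a known constant L.
        # Evaluate f where the piecewise-linear lower bound, built from all
        # previous evaluations, is smallest.
        pts = [(a, f(a)), (b, f(b))]
        for _ in range(iters):
            pts.sort()
            best_x, best_lb = None, math.inf
            for (x1, f1), (x2, f2) in zip(pts, pts[1:]):
                # Intersection of the two bounding cones in this interval
                x = 0.5 * (x1 + x2) + (f1 - f2) / (2.0 * L)
                lb = 0.5 * (f1 + f2) - 0.5 * L * (x2 - x1)
                if lb < best_lb:
                    best_x, best_lb = x, lb
            pts.append((best_x, f(best_x)))
        return min(pts, key=lambda p: p[1])

    f = lambda x: math.sin(x) + math.sin(10.0 * x / 3.0)  # classical test function
    print(piyavskii(f, 2.7, 7.5, L=6.0))                  # global min near x ~ 5.15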
Mehri, Mehran
2014-07-01
The optimization algorithm of a model may have significant effects on the final optimal values of nutrient requirements in poultry enterprises. In poultry nutrition, the optimal values of dietary essential nutrients are very important for feed formulation, to optimize profit through minimizing feed cost and maximizing bird performance. This study was conducted to introduce a novel multi-objective algorithm, the desirability function, for optimizing bird response models based on response surface methodology (RSM) and artificial neural networks (ANN). Growth databases on a central composite design (CCD) were used to construct the RSM and ANN models, and optimal values for 3 essential amino acids (lysine, methionine, and threonine) in broiler chicks from 3 to 16 d of age were reevaluated using the desirability function in both analytical approaches. Multi-objective optimization results showed that the most desirable function was obtained for the ANN-based model (D = 0.99), where the optimal levels of digestible lysine (dLys), digestible methionine (dMet), and digestible threonine (dThr) for maximum desirability were 13.2, 5.0, and 8.3 g/kg of diet, respectively. However, the optimal levels of dLys, dMet, and dThr in the RSM-based model were estimated at 11.2, 5.4, and 7.6 g/kg of diet, respectively. This research documented that the application of ANN in the broiler chicken model, along with a multi-objective optimization algorithm such as the desirability function, could be a useful tool for optimization of dietary amino acids in fractional factorial experiments, in which the use of the global desirability function may be able to overcome the underestimation of dietary amino acids resulting from the RSM model. © 2014 Poultry Science Association Inc.
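A minimal sketch of the Derringer-type desirability transform behind such multi-objective optimization follows. The form is the generic textbook one and the bounds are illustrative, not the paper's fitted responses:

    import numpy as np

    def desirability_max(y, low, target, s=1.0):
        # "Larger-is-better" desirability: 0 below `low`, 1 above `target`,
        # a power ramp in between (exponent s shapes the ramp).
        d = ((y - low) / (target - low)) ** s
        return float(np.clip(d, 0.0, 1.0))

    def overall_desirability(ds):
        # Geometric mean of the individual desirabilities
        ds = np.asarray(ds, dtype=float)
        return float(ds.prod() ** (1.0 / len(ds)))

    # Toy: two responses, e.g. weight gain (maximize) and feed conversion
    # ratio (minimize, handled by negation); all numbers illustrative.
    d1 = desirability_max(610.0, low=500.0, target=650.0)
    d2 = desirability_max(-1.45, low=-1.8, target=-1.4)
    print(overall_desirability([d1, d2]))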
MOGO: Model-Oriented Global Optimization of Petascale Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Malony, Allen D.; Shende, Sameer S.
The MOGO project was initiated in 2008 under the DOE Program Announcement for Software Development Tools for Improved Ease-of-Use on Petascale Systems (LAB 08-19). The MOGO team consisted of Oak Ridge National Lab, Argonne National Lab, and the University of Oregon. The overall goal of MOGO was to attack petascale performance analysis by developing a general framework where empirical performance data could be efficiently and accurately compared with performance expectations at various levels of abstraction. This information could then be used to automatically identify and remediate performance problems. MOGO was based on performance models derived from application knowledge, performance experiments, and symbolic analysis. MOGO was able to make a reasonable impact on existing DOE applications and systems. New tools and techniques were developed, which, in turn, were used on important DOE applications on DOE LCF systems to show significant performance improvements.
NASA Astrophysics Data System (ADS)
Palanivel, M.; Priyan, S.; Mala, P.
2017-11-01
In the current global market, organizations use many promotional tools to increase their sales. One such tool is sales team initiatives or promotional policies, e.g., free gifts, discounts, packaging, etc. This phenomenon motivates the retailer or buyer to order a large inventory lot so as to take full benefit of the promotional policies. In view of this, the present paper considers a two-warehouse (owned and rented) inventory problem for a non-instantaneously deteriorating item with inflation and the time value of money over a finite planning horizon. Here, demand depends on the sales team's initiatives, and shortages are partially backlogged at a rate dependent on the duration of waiting time up to the arrival of the next lot. We design an algorithm to obtain the optimal replenishment strategies. Numerical analysis is also given to show the applicability of the proposed model to real-world two-warehouse inventory problems.
NASA Technical Reports Server (NTRS)
Franks, Shannon; Masek, Jeffrey G.; Headley, Rachel M.; Gasch, John; Arvidson, Terry
2009-01-01
The Global Land Survey (GLS) 2005 is a cloud-free, orthorectified collection of Landsat imagery acquired during the 2004-2007 epoch intended to support global land-cover and ecological monitoring. Due to the numerous complexities in selecting imagery for the GLS2005, NASA and the U.S. Geological Survey (USGS) sponsored the development of an automated scene selection tool, the Large Area Scene Selection Interface (LASSI), to aid in the selection of imagery for this data set. This innovative approach to scene selection applied a user-defined weighting system to various scene parameters: image cloud cover, image vegetation greenness, choice of sensor, and the ability of the Landsat 7 Scan Line Corrector (SLC)-off pair to completely fill image gaps, among others. The parameters considered in scene selection were weighted according to their relative importance to the data set, along with the algorithm's sensitivity to that weight. This paper describes the methodology and analysis that established the parameter weighting strategy, as well as the post-screening processes used in selecting the optimal data set for GLS2005.
Teixeira, Ana P; Carinhas, Nuno; Dias, João M L; Cruz, Pedro; Alves, Paula M; Carrondo, Manuel J T; Oliveira, Rui
2007-12-01
Systems biology is an integrative science that aims at the global characterization of biological systems. Huge amounts of data regarding gene expression, protein activity and metabolite concentrations are collected by designing systematic genetic or environmental perturbations. The challenge is then to integrate such data in a global model in order to provide a global picture of the cell. The analysis of these data is largely dominated by nonparametric modelling tools. In contrast, classical bioprocess engineering has been primarily founded on first-principles models, but it has systematically overlooked the details of the embedded biological system. The full complexity of biological systems is currently assumed by systems biology, and this knowledge can now be taken up by engineers to decide how to optimally design and operate their processes. This paper discusses possible methodologies for the integration of systems biology and bioprocess engineering, with emphasis on applications involving animal cell cultures. At the mathematical systems level, the discussion is focused on hybrid semi-parametric systems as a way to bridge systems biology and bioprocess engineering.
NASA Technical Reports Server (NTRS)
Pi, Xiaoqing; Mannucci, Anthony J.; Verkhoglyadova, Olga; Stephens, Philip; Iijima, Bryron A.
2013-01-01
Modeling and imaging the Earth's ionosphere, as well as understanding its structures, inhomogeneities, and disturbances, is a key part of NASA's Heliophysics Directorate science roadmap. This invention provides a design tool for scientific missions focused on the ionosphere. It is a scientifically important and technologically challenging task to quantitatively assess the impact of a new observation system on our capability for imaging and modeling the ionosphere. This question is often raised whenever a new satellite system is proposed, a new type of data is emerging, or a new modeling technique is developed. The proposed constellation would be part of a new observation system with more low-Earth orbiters tracking more radio occultation signals broadcast by Global Navigation Satellite Systems (GNSS) than those offered by the current GPS and COSMIC observation system. A simulation system was developed to fulfill this task. The system is composed of a suite of software that combines the Global Assimilative Ionospheric Model (GAIM), including first-principles and empirical ionospheric models, a multiple-dipole geomagnetic field model, data assimilation modules, an observation simulator, visualization software, and orbit design, simulation, and optimization software.
Measuring Spatial Dependence for Infectious Disease Epidemiology
Grabowski, M. Kate; Cummings, Derek A. T.
2016-01-01
Global spatial clustering is the tendency of points, here cases of infectious disease, to occur closer together than expected by chance. The extent of global clustering can provide a window into the spatial scale of disease transmission, thereby providing insights into the mechanism of spread, and informing optimal surveillance and control. Here the authors present an interpretable measure of spatial clustering, τ, which can be understood as a measure of relative risk. When biological or temporal information can be used to identify sets of potentially linked and likely unlinked cases, this measure can be estimated without knowledge of the underlying population distribution. The greater our ability to distinguish closely related (i.e., separated by few generations of transmission) from more distantly related cases, the more closely τ will track the true scale of transmission. The authors illustrate this approach using examples from the analyses of HIV, dengue and measles, and provide an R package implementing the methods described. The statistic presented, and measures of global clustering in general, can be powerful tools for analysis of spatially resolved data on infectious diseases. PMID:27196422
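A minimal sketch of the relative-risk form of the τ statistic described above: the probability that a pair of cases within the distance band (d1, d2] is potentially transmission-linked, divided by the same probability over all pairs. The pair-classification rule here (shared hypothetical serotype) is a toy stand-in for the biological/temporal rules used in practice, and the authors' own implementation is their R package:

    import numpy as np

    def tau_statistic(coords, related, d1, d2):
        # coords: (n, 2) case locations; related: boolean (n, n) matrix saying
        # whether a pair is potentially transmission-linked. Returns tau(d1, d2].
        n = len(coords)
        diff = coords[:, None, :] - coords[None, :, :]
        dist = np.sqrt((diff ** 2).sum(-1))
        off = ~np.eye(n, dtype=bool)              # exclude self-pairs
        in_band = off & (dist > d1) & (dist <= d2)
        pi_band = related[in_band].mean()         # P(related | pair in band)
        pi_all = related[off].mean()              # P(related) over all pairs
        return pi_band / pi_all

    rng = np.random.default_rng(2)
    coords = rng.uniform(0.0, 10.0, size=(200, 2))
    sero = rng.integers(0, 4, size=200)           # hypothetical serotypes
    related = sero[:, None] == sero[None, :]
    print(tau_statistic(coords, related, 0.0, 1.0))  # ~1 under no clustering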
NASA Astrophysics Data System (ADS)
Mahapatra, Prasant Kumar; Sethi, Spardha; Kumar, Amod
2015-10-01
In conventional tool positioning techniques, sensors embedded in the motion stages provide accurate tool position information. In this paper, a machine vision based system and an image processing technique for motion measurement of a lathe tool from two-dimensional sequential images, captured using a charge-coupled device (CCD) camera with a resolution of 250 microns, are described. An algorithm was developed to calculate the observed distance travelled by the tool from the captured images. As expected, error was observed in the value of the distance traversed by the tool calculated from these images. Minimization of errors due to the machine vision system, calibration, environmental factors, etc. in lathe tool movement was carried out using two soft computing techniques, namely artificial immune systems (AIS) and particle swarm optimization (PSO). The results show a better capability of AIS over PSO.
Global Optimal Trajectory in Chaos and NP-Hardness
NASA Astrophysics Data System (ADS)
Latorre, Vittorio; Gao, David Yang
This paper presents an unconventional theory and method for solving general nonlinear dynamical systems. Instead of direct iterative methods, the discretized nonlinear system is first formulated as a global optimization problem via the least squares method. A newly developed canonical duality theory shows that this nonconvex minimization problem can be solved deterministically in polynomial time if a global optimality condition is satisfied. The so-called pseudo-chaos produced by linear iterative methods is mainly due to intrinsic numerical error accumulation. Otherwise, the global optimization problem could be NP-hard and the nonlinear system can be truly chaotic. A conjecture is proposed, which reveals the connection between chaos in nonlinear dynamics and NP-hardness in computer science. The methodology and the conjecture are verified by applications to the well-known logistic equation, a forced memristive circuit and the Lorenz system. Computational results show that the canonical duality theory can be used to identify chaotic systems and to obtain realistic global optimal solutions in nonlinear dynamical systems. The method and results presented in this paper should bring some new insights into nonlinear dynamical systems and NP-hardness in computational complexity theory.
An Evaluation of the Sniffer Global Optimization Algorithm Using Standard Test Functions
NASA Astrophysics Data System (ADS)
Butler, Roger A. R.; Slaminka, Edward E.
1992-03-01
The performance of Sniffer—a new global optimization algorithm—is compared with that of Simulated Annealing. Using the number of function evaluations as a measure of efficiency, the new algorithm is shown to be significantly better at finding the global minimum of seven standard test functions. Several of the test functions used have many local minima and very steep walls surrounding the global minimum. Such functions are intended to thwart global minimization algorithms.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-30
... Limited Shijiazhuang Global New Century Tools Co., Ltd. Sichuan Huili Tools Co. Task Tools & Abrasives... Global Logistics (Shanghai) Co., Ltd. APS Qingdao Cangshan Qingshui Vegetable Foods Co., Ltd. Chengwu...
Page, Tessa; Nguyen, Huong Thi Huynh; Hilts, Lindsey; Ramos, Lorena; Hanrahan, Grady
2012-06-01
This work reveals a computational framework for parallel electrophoretic separation of complex biological macromolecules and model urinary metabolites. More specifically, the implementation of a particle swarm optimization (PSO) algorithm on a neural network platform for multiparameter optimization of multiplexed 24-capillary electrophoresis technology with UV detection is highlighted. Two experimental systems were examined: (1) separation of purified rabbit metallothioneins and (2) separation of model toluene urinary metabolites and selected organic acids. Results proved superior to the use of neural networks employing standard back propagation when examining training error, fitting response, and predictive abilities. Simulation runs were obtained as a result of metaheuristic examination of the global search space with experimental responses in good agreement with predicted values. Full separation of selected analytes was realized after employing optimal model conditions. This framework provides guidance for the application of metaheuristic computational tools to aid in future studies involving parallel chemical separation and screening. Adaptable pseudo-code is provided to enable users of varied software packages and modeling framework to implement the PSO algorithm for their desired use.
Particle Swarm Optimization with Double Learning Patterns.
Shen, Yuanxia; Wei, Linna; Zeng, Chuanhua; Chen, Jian
2016-01-01
Particle Swarm Optimization (PSO) is an effective tool for solving optimization problems. However, PSO usually suffers from premature convergence due to the rapid loss of swarm diversity. In this paper, we first analyze the motion behavior of the swarm based on the probability characteristics of the learning parameters. Then a PSO with double learning patterns (PSO-DLP) is developed, which employs a master swarm and a slave swarm with different learning patterns to achieve a trade-off between convergence speed and swarm diversity. The particles in the master swarm and the slave swarm are encouraged, respectively, to explore the search space to maintain swarm diversity and to learn from the global best particle to refine a promising solution. When the evolutionary states of the two swarms interact, an interaction mechanism is enabled. This mechanism can help the slave swarm jump out of local optima and improve the convergence precision of the master swarm. The proposed PSO-DLP is evaluated on 20 benchmark functions, including rotated multimodal and complex shifted problems. The simulation results and statistical analysis show that PSO-DLP obtains promising performance and outperforms eight PSO variants.
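For reference, a minimal sketch of the canonical global-best PSO update that variants such as PSO-DLP build on (standard textbook form with constriction-style coefficients, not the double-learning variant itself):

    import numpy as np

    def pso(f, dim, n=30, iters=200, w=0.729, c1=1.494, c2=1.494, bound=5.0):
        rng = np.random.default_rng(3)
        x = rng.uniform(-bound, bound, (n, dim))
        v = np.zeros((n, dim))
        pbest, pbest_f = x.copy(), np.apply_along_axis(f, 1, x)
        g = pbest[pbest_f.argmin()].copy()
        for _ in range(iters):
            r1, r2 = rng.random((n, dim)), rng.random((n, dim))
            # Canonical velocity update: inertia + cognitive + social terms
            v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
            x = np.clip(x + v, -bound, bound)
            fx = np.apply_along_axis(f, 1, x)
            improved = fx < pbest_f
            pbest[improved], pbest_f[improved] = x[improved], fx[improved]
            g = pbest[pbest_f.argmin()].copy()
        return g, pbest_f.min()

    sphere = lambda z: float((z ** 2).sum())
    print(pso(sphere, dim=10))   # converges near the origin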
NASA Astrophysics Data System (ADS)
Xie, Yan; Li, Mu; Zhou, Jin; Zheng, Chang-zheng
2009-07-01
Agricultural machinery total power is an important index that reflects and evaluates the level of agricultural mechanization. It is the power source of agricultural production and one of the main factors in enhancing comprehensive agricultural production capacity, expanding production scale and increasing farmers' income. Its demand is affected by natural, economic, technological, social and other "grey" factors; therefore, grey system theory can be used to analyze the development of agricultural machinery total power. A method based on a genetic algorithm for optimizing the grey modeling process is introduced in this paper. This method makes full use of the advantages of the grey prediction model and of the ability of genetic algorithms to find global optima, so the prediction model is more accurate. Using data from a province, a GM(1,1) model for predicting agricultural machinery total power was built based on grey system theory and a genetic algorithm. The result indicates that the model can be used as an effective tool for predicting agricultural machinery total power.
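A minimal sketch of the plain least-squares GM(1,1) construction that such a genetic algorithm would tune (the GA step, e.g. optimizing background-value or initial-condition coefficients, is omitted, and the input series is illustrative):

    import numpy as np

    def gm11(x0, horizon=3):
        # Grey model GM(1,1): fit dx1/dt + a*x1 = b on the accumulated series
        # x1 = cumsum(x0), then forecast and de-accumulate.
        x0 = np.asarray(x0, dtype=float)
        x1 = np.cumsum(x0)
        z1 = 0.5 * (x1[1:] + x1[:-1])                # background values
        B = np.column_stack([-z1, np.ones(len(z1))])
        a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]
        k = np.arange(len(x0) + horizon)
        x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
        return np.diff(np.concatenate([[0.0], x1_hat]))  # fitted + forecast

    series = [241.6, 253.1, 266.8, 281.3, 297.7]     # illustrative totals
    print(gm11(series).round(1))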
Carter, Patrick M; Desmond, Jeffery S; Akanbobnaab, Christopher; Oteng, Rockefeller A; Rominski, Sarah D; Barsan, William G; Cunningham, Rebecca M
2012-03-01
Although many global health programs focus on providing clinical care or medical education, improving clinical operations can have a significant effect on patient care delivery, especially in developing health systems without high-level operations management. Lean manufacturing techniques have been effective in decreasing emergency department (ED) length of stay, patient waiting times, numbers of patients leaving without being seen, and door-to-balloon times for ST-elevation myocardial infarction in developed health systems, but use of Lean in low to middle income countries with developing emergency medicine (EM) systems has not been well characterized. To describe the application of Lean manufacturing techniques to improve clinical operations at Komfo Anokye Teaching Hospital (KATH) in Ghana and to identify key lessons learned to aid future global EM initiatives. A 3-week Lean improvement program focused on the hospital admissions process at KATH was completed by a 14-person team in six stages: problem definition, scope of project planning, value stream mapping, root cause analysis, future state planning, and implementation planning. The authors identified eight lessons learned during our use of Lean to optimize the operations of an ED in a global health setting: 1) the Lean process aided in building a partnership with Ghanaian colleagues; 2) obtaining and maintaining senior institutional support is necessary and challenging; 3) addressing power differences among the team to obtain feedback from all team members is critical to successful Lean analysis; 4) choosing a manageable initial project is critical to influence long-term Lean use in a new environment; 5) data intensive Lean tools can be adapted and are effective in a less resourced health system; 6) several Lean tools focused on team problem-solving techniques worked well in a low-resource system without modification; 7) using Lean highlighted that important changes do not require an influx of resources; and 8) despite different levels of resources, root causes of system inefficiencies are often similar across health care systems, but require unique solutions appropriate to the clinical setting. Lean manufacturing techniques can be successfully adapted for use in developing health systems. Lessons learned from this Lean project will aid future introduction of advanced operations management techniques in low- to middle-income countries. © 2012 by the Society for Academic Emergency Medicine.
Patient-specific dosimetric endpoints based treatment plan quality control in radiotherapy.
Song, Ting; Staub, David; Chen, Mingli; Lu, Weiguo; Tian, Zhen; Jia, Xun; Li, Yongbao; Zhou, Linghong; Jiang, Steve B; Gu, Xuejun
2015-11-07
In intensity modulated radiotherapy (IMRT), the optimal plan for each patient is specific due to unique patient anatomy. To achieve such a plan, patient-specific dosimetric goals reflecting each patient's unique anatomy should be defined and adopted in the treatment planning procedure for plan quality control. This study aims to develop such a personalized treatment plan quality control tool by predicting patient-specific dosimetric endpoints (DEs). The incorporation of patient-specific DEs is realized by a multi-OAR geometry-dosimetry model capable of predicting optimal DEs based on the individual patient's geometry. The overall quality of a treatment plan is then judged with a numerical treatment plan quality indicator and characterized as optimal or suboptimal. Taking advantage of clinically available prostate volumetric modulated arc therapy (VMAT) treatment plans, we built and evaluated our proposed plan quality control tool. Using the developed tool, six of twenty evaluated plans were identified as suboptimal. After plan re-optimization, these suboptimal plans achieved better OAR dose sparing without sacrificing PTV coverage, and the dosimetric endpoints of the re-optimized plans agreed well with the model-predicted values, which validates the predictability of the proposed tool. In conclusion, the developed tool is able to accurately predict optimally achievable DEs of multiple OARs, identify suboptimal plans, and guide plan optimization. It is a useful tool for achieving patient-specific treatment plan quality control.
GeneWiz browser: An Interactive Tool for Visualizing Sequenced Chromosomes.
Hallin, Peter F; Stærfeldt, Hans-Henrik; Rotenberg, Eva; Binnewies, Tim T; Benham, Craig J; Ussery, David W
2009-09-25
We present an interactive web application for visualizing genomic data of prokaryotic chromosomes. The tool (GeneWiz browser) allows users to carry out various analyses such as mapping alignments of homologous genes to other genomes, mapping of short sequencing reads to a reference chromosome, and calculating DNA properties such as curvature or stacking energy along the chromosome. The GeneWiz browser produces an interactive graphic that enables zooming from a global scale down to single nucleotides, without changing the size of the plot. Its ability to disproportionally zoom provides optimal readability and increased functionality compared to other browsers. The tool allows the user to select the display of various genomic features, color setting and data ranges. Custom numerical data can be added to the plot allowing, for example, visualization of gene expression and regulation data. Further, standard atlases are pre-generated for all prokaryotic genomes available in GenBank, providing a fast overview of all available genomes, including recently deposited genome sequences. The tool is available online from http://www.cbs.dtu.dk/services/gwBrowser. Supplemental material including interactive atlases is available online at http://www.cbs.dtu.dk/services/gwBrowser/suppl/.
Helping the Warfighter Become Green! (Briefing Charts)
2011-02-01
WARFIGHTER-FOCUSED, GLOBALLY RESPONSIVE, FISCALLY RESPONSIBLE SUPPLY CHAIN LEADERSHIP. DOD EMALL: DOD's Online Shopping Tool. Web self...
Optimizing human activity patterns using global sensitivity analysis.
Fairchild, Geoffrey; Hickmann, Kyle S; Mniszewski, Susan M; Del Valle, Sara Y; Hyman, James M
2014-12-01
Implementing realistic activity patterns for a population is crucial for modeling, for example, disease spread, supply and demand, and disaster response. Using the dynamic activity simulation engine, DASim, we generate schedules for a population that capture regular (e.g., working, eating, and sleeping) and irregular activities (e.g., shopping or going to the doctor). We use the sample entropy (SampEn) statistic to quantify a schedule's regularity for a population. We show how to tune an activity's regularity by adjusting SampEn, thereby making it possible to realistically design activities when creating a schedule. The tuning process sets up a computationally intractable high-dimensional optimization problem. To reduce the computational demand, we use Bayesian Gaussian process regression to compute global sensitivity indices and identify the parameters that have the greatest effect on the variance of SampEn. We use the harmony search (HS) global optimization algorithm to locate global optima. Our results show that HS combined with global sensitivity analysis can efficiently tune the SampEn statistic with few search iterations. We demonstrate how global sensitivity analysis can guide statistical emulation and global optimization algorithms to efficiently tune activities and generate realistic activity patterns. Though our tuning methods are applied to dynamic activity schedule generation, they are general and represent a significant step in the direction of automated tuning and optimization of high-dimensional computer simulations.
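A minimal reference implementation of the SampEn statistic being tuned above, using the direct O(n²) definition (DASim and the harmony search loop are not reproduced; the tolerance convention is r times the series standard deviation):

    import numpy as np

    def sample_entropy(x, m=2, r=0.2):
        # SampEn(m, r) = -ln(A / B), where B counts template matches of
        # length m and A of length m+1, within Chebyshev tolerance r*std(x);
        # self-matches are excluded.
        x = np.asarray(x, dtype=float)
        tol = r * x.std()

        def count_matches(mm):
            templates = np.array([x[i:i + mm] for i in range(len(x) - mm + 1)])
            count = 0
            for i in range(len(templates) - 1):
                d = np.abs(templates[i + 1:] - templates[i]).max(axis=1)
                count += int((d <= tol).sum())
            return count

        B, A = count_matches(m), count_matches(m + 1)
        return -np.log(A / B) if A > 0 and B > 0 else np.inf

    rng = np.random.default_rng(4)
    print(sample_entropy(np.sin(np.linspace(0, 20, 500))))  # regular: low
    print(sample_entropy(rng.normal(size=500)))             # irregular: higher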
Optimizing human activity patterns using global sensitivity analysis
Hickmann, Kyle S.; Mniszewski, Susan M.; Del Valle, Sara Y.; Hyman, James M.
2014-01-01
Implementing realistic activity patterns for a population is crucial for modeling, for example, disease spread, supply and demand, and disaster response. Using the dynamic activity simulation engine, DASim, we generate schedules for a population that capture regular (e.g., working, eating, and sleeping) and irregular activities (e.g., shopping or going to the doctor). We use the sample entropy (SampEn) statistic to quantify a schedule’s regularity for a population. We show how to tune an activity’s regularity by adjusting SampEn, thereby making it possible to realistically design activities when creating a schedule. The tuning process sets up a computationally intractable high-dimensional optimization problem. To reduce the computational demand, we use Bayesian Gaussian process regression to compute global sensitivity indices and identify the parameters that have the greatest effect on the variance of SampEn. We use the harmony search (HS) global optimization algorithm to locate global optima. Our results show that HS combined with global sensitivity analysis can efficiently tune the SampEn statistic with few search iterations. We demonstrate how global sensitivity analysis can guide statistical emulation and global optimization algorithms to efficiently tune activities and generate realistic activity patterns. Though our tuning methods are applied to dynamic activity schedule generation, they are general and represent a significant step in the direction of automated tuning and optimization of high-dimensional computer simulations. PMID:25580080
NASA Astrophysics Data System (ADS)
Kyriacou, S.; Kontoleontos, E.; Weissenberger, S.; Mangani, L.; Casartelli, E.; Skouteropoulou, I.; Gattringer, M.; Gehrer, A.; Buchmayr, M.
2014-03-01
An efficient hydraulic optimization procedure, suitable for industrial use, requires an advanced optimization tool (EASY software), a fast solver (block coupled CFD) and a flexible geometry generation tool. EASY optimization software is a PCA-driven metamodel-assisted Evolutionary Algorithm (MAEA (PCA)) that can be used in both single- (SOO) and multiobjective optimization (MOO) problems. In MAEAs, low cost surrogate evaluation models are used to screen out non-promising individuals during the evolution and exclude them from the expensive, problem specific evaluation, here the solution of Navier-Stokes equations. For additional reduction of the optimization CPU cost, the PCA technique is used to identify dependences among the design variables and to exploit them in order to efficiently drive the application of the evolution operators. To further enhance the hydraulic optimization procedure, a very robust and fast Navier-Stokes solver has been developed. This incompressible CFD solver employs a pressure-based block-coupled approach, solving the governing equations simultaneously. This method, apart from being robust and fast, also provides a big gain in terms of computational cost. In order to optimize the geometry of hydraulic machines, an automatic geometry and mesh generation tool is necessary. The geometry generation tool used in this work is entirely based on b-spline curves and surfaces. In what follows, the components of the tool chain are outlined in some detail and the optimization results of hydraulic machine components are shown in order to demonstrate the performance of the presented optimization procedure.
Thermodynamical analysis of a quantum heat engine based on harmonic oscillators.
Insinga, Andrea; Andresen, Bjarne; Salamon, Peter
2016-07-01
Many models of heat engines have been studied with the tools of finite-time thermodynamics and an ensemble of independent quantum systems as the working fluid. Because of their convenient analytical properties, harmonic oscillators are the most frequently used example of a quantum system. We analyze different thermodynamical aspects with the final aim of optimizing the performance of the engine in terms of the mechanical power provided during a finite-time Otto cycle. The heat exchange mechanism between the working fluid and the thermal reservoirs is provided by the Lindblad formalism. We describe an analytical method to find the limit cycle and give conditions for a stable limit cycle to exist. We explore the power production landscape as the durations of the four branches of the cycle are varied, for short times, intermediate times, and special frictionless times. For short times we find a periodic structure with atolls of purely dissipative operation surrounding islands of divergent behavior where, rather than tending to a limit cycle, the working fluid accumulates more and more energy. For frictionless times the periodic structure is gone and we come very close to the globally optimal operation. The global optimum is found and, interestingly, comes with a particular value of the cycle time.
Method for using global optimization to estimate surface-consistent residual statics
Reister, David B.; Barhen, Jacob; Oblow, Edward M.
2001-01-01
An efficient method is presented for generating residual statics corrections to compensate for surface-consistent static time shifts in stacked seismic traces. The method includes a step of framing the residual statics corrections as a global optimization problem in a parameter space. It also decouples the global optimization problem involving all seismic traces into several one-dimensional problems. The method further utilizes a Stochastic Pijavskij Tunneling search to eliminate regions in the parameter space where a global minimum is unlikely to exist, so that the global minimum may be quickly discovered. The method finds the residual statics corrections by maximizing the total stack power. The stack power is a measure of seismic energy transferred from energy sources to receivers.
ToTem: a tool for variant calling pipeline optimization.
Tom, Nikola; Tom, Ondrej; Malcikova, Jitka; Pavlova, Sarka; Kubesova, Blanka; Rausch, Tobias; Kolarik, Miroslav; Benes, Vladimir; Bystry, Vojtech; Pospisilova, Sarka
2018-06-26
High-throughput bioinformatics analyses of next generation sequencing (NGS) data often require challenging pipeline optimization. The key problem is choosing appropriate tools and selecting the best parameters for optimal precision and recall. Here we introduce ToTem, a tool for automated pipeline optimization. ToTem is a stand-alone web application with a comprehensive graphical user interface (GUI). ToTem is written in Java and PHP with an underlying connection to a MySQL database. Its primary role is to automatically generate, execute and benchmark different variant calling pipeline settings. Our tool allows an analysis to be started from any level of the process, with the possibility of plugging in almost any tool or code. To prevent over-fitting of pipeline parameters, ToTem ensures their reproducibility by using cross-validation techniques that penalize the final precision, recall and F-measure. The results are interpreted as interactive graphs and tables, allowing an optimal pipeline to be selected based on the user's priorities. Using ToTem, we were able to optimize somatic variant calling from ultra-deep targeted gene sequencing (TGS) data and germline variant detection in whole genome sequencing (WGS) data. ToTem is a tool for automated pipeline optimization which is freely available as a web application at https://totem.software.
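As an illustration of the cross-validated benchmarking loop such a tool automates, the sketch below scores a single quality-threshold filter fold by fold on synthetic calls; the data, the lone `min_qual` parameter, and the scoring are hypothetical stand-ins for a real variant-calling pipeline and its parameter space.

```python
# Fold-averaged precision/recall/F-measure scoring of a candidate filter
# setting, so the reported score reflects stability across held-out data
# rather than a single over-fit split. All data here are synthetic.
import random
from statistics import mean

random.seed(1)
# Each candidate call: (quality_score, is_a_true_variant)
calls = [(random.gauss(30 if true else 18, 6), true)
         for true in random.choices([True, False], k=400)]

def evaluate(fold, min_qual):
    kept = [t for q, t in fold if q >= min_qual]       # calls passing filter
    missed = [t for q, t in fold if q < min_qual and t]  # false negatives
    tp = sum(kept)
    precision = tp / len(kept) if kept else 0.0
    recall = tp / (tp + len(missed)) if tp or missed else 0.0
    return 2 * precision * recall / (precision + recall) if tp else 0.0

def cv_score(min_qual, k=5):
    n = len(calls) // k
    folds = [calls[i * n:(i + 1) * n] for i in range(k)]
    return mean(evaluate(f, min_qual) for f in folds)

best = max(range(10, 40), key=cv_score)   # search the threshold grid
print("best min_qual:", best, "CV F-measure:", round(cv_score(best), 3))
```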
Optimal visual-haptic integration with articulated tools.
Takahashi, Chie; Watt, Simon J
2017-05-01
When we feel and see an object, the nervous system integrates visual and haptic information optimally, exploiting the redundancy in multiple signals to estimate properties more precisely than is possible from either signal alone. We examined whether optimal integration is similarly achieved when using articulated tools. Such tools (tongs, pliers, etc.) are a defining characteristic of human hand function, but complicate the classical sensory 'correspondence problem' underlying multisensory integration. Optimal integration requires establishing the relationship between signals acquired by different sensors (hand and eye), and therefore expressed in fundamentally unrelated units. The system must also determine when signals refer to the same property of the world (seeing and feeling the same thing) and only integrate those that do. This could be achieved by comparing the pattern of current visual and haptic input to known statistics of their normal relationship. Articulated tools disrupt this relationship, however, by altering the geometrical relationship between object properties and hand posture (the haptic signal). We examined whether different tool configurations are taken into account in visual-haptic integration. We indexed integration by measuring the precision of size estimates, and compared our results to optimal predictions from a maximum-likelihood integrator. Integration was near optimal, independent of tool configuration/hand posture, provided that visual and haptic signals referred to the same object in the world. Thus, sensory correspondence was determined correctly (trial-by-trial), taking tool configuration into account. This reveals highly flexible multisensory integration underlying tool use, consistent with the brain constructing internal models of tools' properties.
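The "optimal predictions" referenced above come from the standard maximum-likelihood cue-combination model; in our notation (not the paper's), with visual and haptic size estimates and their variances:

```latex
% Minimum-variance (maximum-likelihood) combination of visual (V) and
% haptic (H) size estimates; notation is ours, not the paper's.
\hat{S}_{VH} = w_V \hat{S}_V + w_H \hat{S}_H,
\qquad w_V = \frac{1/\sigma_V^2}{1/\sigma_V^2 + 1/\sigma_H^2},
\qquad w_H = 1 - w_V,
\qquad \sigma_{VH}^2 = \frac{\sigma_V^2\,\sigma_H^2}{\sigma_V^2 + \sigma_H^2}
  \le \min\left(\sigma_V^2, \sigma_H^2\right).
```

The variance reduction in the last relation is what the precision-of-size-estimates measurements index.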
Analysis and Optimization of Pulse Dynamics for Magnetic Stimulation
Goetz, Stefan M.; Truong, Cong Nam; Gerhofer, Manuel G.; Peterchev, Angel V.; Herzog, Hans-Georg; Weyh, Thomas
2013-01-01
Magnetic stimulation is a standard tool in brain research and has found important clinical applications in neurology, psychiatry, and rehabilitation. Whereas coil designs and the spatial field properties have been intensively studied in the literature, the temporal dynamics of the field has received less attention. Typically, the magnetic field waveform is determined by available device circuit topologies rather than by consideration of what is optimal for neural stimulation. This paper analyzes and optimizes the waveform dynamics using a nonlinear model of a mammalian axon. The optimization objective was to minimize the pulse energy loss. The energy loss drives power consumption and heating, which are the dominating limitations of magnetic stimulation. The optimization approach is based on a hybrid global-local method. Different coordinate systems for describing the continuous waveforms in a limited parameter space are defined for numerical stability. The optimization results suggest that there are waveforms with substantially higher efficiency than that of traditional pulse shapes. One class of optimal pulses is analyzed further. Although the coil voltage profile of these waveforms is almost rectangular, the corresponding current shape presents distinctive characteristics, such as a slow low-amplitude first phase which precedes the main pulse and reduces the losses. Representatives of this class of waveforms corresponding to different maximum voltages are linked by a nonlinear transformation. The main phase, however, scales with time only. As with conventional magnetic stimulation pulses, briefer pulses result in lower energy loss but require higher coil voltage than longer pulses. PMID:23469168
New Laboratory Tools for Emerging Bacterial Challenges.
Fournier, Pierre-Edouard; Drancourt, Michel; Raoult, Didier
2017-08-15
Since its creation, the Méditerranée-Infection foundation has aimed at optimizing the management of infectious diseases and surveying the local and global epidemiology. This pivotal role was permitted by the development of rational sampling, point-of-care tests, and extended automation as well as new technologies, including mass spectrometry for colony identification, real-time genomics for isolate characterization, and the development of versatile and permissive culture systems. By identifying and characterizing emerging microbial pathogens, these developments provided significant breakthroughs in infectious diseases.
Stahl, Randal; Waters, W. Ray; Palmer, Mitchell V.; Nol, Pauline; Rhyan, Jack C.; VerCauteren, Kurt C.; Koziel, Jacek A.
2017-01-01
Bovine tuberculosis is a zoonotic disease of global public health concern. Development of diagnostic tools to improve test accuracy and efficiency in domestic livestock and enable surveillance of wildlife reservoirs would improve disease management and eradication efforts. Use of volatile organic compound analysis in breath and fecal samples is being developed and optimized as a means to detect disease in humans and animals. In this study we demonstrate that VOCs present in fecal samples can be used to discriminate between non-vaccinated and BCG-vaccinated cattle prior to and after Mycobacterium bovis challenge. PMID:28686691
NASA Astrophysics Data System (ADS)
Gopalakrishnan, T.; Saravanan, R.
2017-03-01
Powerful management concepts step up product quality and save production time, thereby increasing the production rate and improving tools and techniques, work culture, the workplace, and employee motivation and morale. This paper discusses a case study of optimizing tool design and tool parameters to avoid an expansion plan, following the ECRS technique. The proposed designs and optimal tool parameters yielded the best results and met customer demand without the expansion plan. Hence the work yielded large savings in money (direct and indirect costs) and time, and significantly improved employee motivation and morale.
A Memetic Algorithm for Global Optimization of Multimodal Nonseparable Problems.
Zhang, Geng; Li, Yangmin
2016-06-01
Avoiding entrapment in local optima is a major challenge, especially when facing high-dimensional nonseparable problems where the interdependencies among vector elements are unknown. In order to improve the performance of optimization algorithms, a novel memetic algorithm (MA) called cooperative particle swarm optimizer-modified harmony search (CPSO-MHS) is proposed in this paper, where the CPSO is used for local search and the MHS for global search. The CPSO, as a local search method, uses 1-D swarms to search each dimension separately and thus converges fast. Moreover, according to our experimental results and analyses, it can obtain globally optimal elements. MHS implements the global search by recombining different vector elements and extracting globally optimal elements. The interaction between local search and global search creates a set of local search zones, where globally optimal elements reside within the search space. The CPSO-MHS algorithm is tested and compared with seven other optimization algorithms on a set of 28 standard benchmarks. Meanwhile, some MAs are also compared according to the results derived directly from their corresponding references. The experimental results demonstrate the good performance of the proposed CPSO-MHS algorithm in solving multimodal nonseparable problems.
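To make the global-search half concrete, here is a compact harmony search in its basic form; the sphere objective, bounds and rates (HMCR, PAR, bandwidth) are illustrative assumptions, and the MHS modifications of the paper are not reproduced.

```python
# Basic harmony search: new candidates recombine memory elements
# dimension by dimension, with occasional pitch adjustment or a fresh
# random value; the worst memory entry is replaced when improved upon.
import numpy as np

rng = np.random.default_rng(0)
f = lambda x: np.sum(x ** 2)                     # stand-in objective
dim, hms, hmcr, par, bw = 10, 20, 0.9, 0.3, 0.05
low, high = -5.0, 5.0

memory = rng.uniform(low, high, (hms, dim))      # harmony memory
fitness = np.array([f(h) for h in memory])

for _ in range(5000):
    new = np.empty(dim)
    for j in range(dim):
        if rng.random() < hmcr:                  # recombine a memory element
            new[j] = memory[rng.integers(hms), j]
            if rng.random() < par:               # pitch adjustment
                new[j] += bw * rng.uniform(-1, 1) * (high - low)
        else:                                    # random consideration
            new[j] = rng.uniform(low, high)
    fn = f(new)
    worst = fitness.argmax()
    if fn < fitness[worst]:                      # replace worst harmony
        memory[worst], fitness[worst] = new, fn

print("best:", fitness.min())
```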
NASA Astrophysics Data System (ADS)
Miyauchi, T.; Machimura, T.
2013-12-01
In simulations using an ecosystem process model, the adjustment of parameters is indispensable for improving the accuracy of prediction. This procedure, however, requires much time and effort to bring the simulation results close to the measurements in models consisting of various ecosystem processes. In this study, we applied a general-purpose optimization tool to the parameter optimization of an ecosystem model and examined its validity by comparing the simulated and measured biomass growth of a woody plantation. A biometric survey of tree biomass growth was performed in 2009 in an 11-year-old Eucommia ulmoides plantation in Henan Province, China. The climate of the site was dry temperate. Leaf, above- and below-ground woody biomass were measured from three cut trees and converted into carbon mass per area using measured carbon contents and stem density. Yearly woody biomass growth of the plantation was calculated according to allometric relationships determined by tree-ring analysis of seven cut trees. We used Biome-BGC (Thornton, 2002) to reproduce the biomass growth of the plantation. Air temperature and humidity from 1981 to 2010 were used as input climate conditions. The plant functional type was deciduous broadleaf, and non-optimized parameters were left at their defaults. 11-year normal simulations were performed following a spin-up run. In order to select the parameters to optimize, we analyzed the sensitivity of leaf, above- and below-ground woody biomass to eco-physiological parameters. Following the selection, parameter optimization was performed using the Dakota optimizer. Dakota is an optimizer developed by Sandia National Laboratories to provide a systematic and rapid means of obtaining optimal designs using simulation-based models. As the objective function, we calculated the sum of relative errors between simulated and measured leaf, above- and below-ground woody carbon in each of the eleven years. In an alternative run, errors in the last year (that of the field survey) were weighted for priority. We compared several global optimization methods available in Dakota, starting from the default parameters of Biome-BGC. In the sensitivity analysis, the carbon allocation parameters between coarse root and leaf and between stem and leaf, as well as SLA, contributed strongly to both leaf and woody biomass changes; these parameters were selected for optimization. The measured leaf, above- and below-ground woody biomass carbon densities in the last year were 0.22, 1.81 and 0.86 kgC m-2, respectively, whereas those simulated in the non-optimized control case using all default parameters were 0.12, 2.26 and 0.52 kgC m-2, respectively. After optimizing the parameters, the simulated values improved to 0.19, 1.81 and 0.86 kgC m-2, respectively. The coliny global optimization method gave better fitness than the efficient global and NCSU DIRECT methods. The optimized parameters showed higher carbon allocation rates to coarse roots and leaves and a lower SLA than the defaults, consistent with the general water-physiological response in a dry climate. The simulation using the weighted objective function produced closer agreement with the measurements in the last year, at the cost of lower fitness during the previous years.
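The calibration objective described above is straightforward to sketch. Below, `run_biome_bgc` is a hypothetical stand-in for a wrapped model run returning an 11-year by 3-compartment carbon trajectory; the optimizer would minimize this sum of relative errors over the selected allocation and SLA parameters, optionally weighting the survey year.

```python
# Sum-of-relative-errors calibration objective; the model wrapper and
# parameter-to-output mapping are hypothetical illustrations.
import numpy as np

def run_biome_bgc(params):
    """Hypothetical stand-in for a wrapped Biome-BGC run: maps three
    parameters to an (11 years x 3 compartments) carbon trajectory."""
    years = np.arange(1, 12)[:, None]
    return params[None, :] * np.log1p(years) * 0.1

def objective(params, measured, year_weights=None):
    """Sum of relative errors between simulated and measured leaf,
    above- and below-ground woody carbon over the 11 years."""
    sim = run_biome_bgc(np.asarray(params))
    rel_err = np.abs(sim - measured) / measured
    if year_weights is not None:          # e.g., prioritize the survey year
        rel_err *= year_weights[:, None]
    return rel_err.sum()

measured = run_biome_bgc(np.array([2.0, 9.0, 4.5]))   # synthetic "truth"
weights = np.ones(11); weights[-1] = 5.0
print(objective([1.5, 10.0, 4.0], measured, weights))
```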
NASA Technical Reports Server (NTRS)
Englander, Arnold C.; Englander, Jacob A.
2017-01-01
Interplanetary trajectory optimization problems are highly complex and are characterized by a large number of decision variables and equality and inequality constraints as well as many locally optimal solutions. Stochastic global search techniques, coupled with a large-scale NLP solver, have been shown to solve such problems but are inadequately robust when the problem constraints become very complex. In this work, we present a novel search algorithm that takes advantage of the fact that equality constraints effectively collapse the solution space to lower dimensionality. This new approach walks the "filament" of feasibility to efficiently find the globally optimal solution.
Development of Regional Excel-Based Stormwater/Nutrient BMP Optimization Tool (Opti-Tool)
During 2014, EPA Region 1 contracted with Tetra Tech, Inc. to work with a regional technical Advisory Committee to develop an Excel-based stormwater/nutrient BMP optimization tool (Opti-Tool) using regional precipitation data and regionally calibrated BMP performance data from UN...
NASA Astrophysics Data System (ADS)
Ross, A.; Stackhouse, P. W.; Tisdale, B.; Tisdale, M.; Chandler, W.; Hoell, J. M., Jr.; Kusterer, J.
2014-12-01
The NASA Langley Research Center Science Directorate and Atmospheric Science Data Center have initiated a pilot program to utilize Geographic Information System (GIS) tools that enable, generate and store climatological averages using spatial queries and calculations in a spatial database, resulting in greater accessibility of data for government agencies, industry and private-sector individuals. The major objectives of this effort include: 1) processing and reformulating current data to be consistent with ESRI and OpenGIS tools; 2) developing functions to improve capability and analysis that produce "on-the-fly" data products, extending these beyond a single location to regional and global scales; 3) updating the current web sites to enable both web-based and mobile application displays optimized for mobile platforms; 4) interacting with user communities in government and industry to test formats and usage; and 5) developing a series of metrics that allow for monitoring of progressive performance. Significant project results will include the development of Open Geospatial Consortium (OGC) compliant web services (WMS, WCS, WFS, WPS) that serve renewable energy and agricultural application products to users of GIS software and tools. Each data product and OGC service will be registered within ECHO, the Common Metadata Repository, the Geospatial Platform, and Data.gov to ensure the data are easily discoverable and provide data users with enhanced access to SSE data, parameters, services, and applications. This effort supports cross-agency and cross-organization interoperability of SSE data products and services through collaboration with DOI, NRCan, NREL, NCAR, and HOMER for requirements vetting and test-bed users before making the products available to the wider public.
Kim, Hyun Uk; Charusanti, Pep; Lee, Sang Yup; Weber, Tilmann
2016-08-27
Covering: 2012 to 2016. Metabolic engineering using systems biology tools is increasingly applied to overproduce secondary metabolites for their potential industrial production. In this Highlight, recent relevant metabolic engineering studies are analyzed with emphasis on host selection and engineering approaches for the optimal production of various prokaryotic secondary metabolites: native versus heterologous hosts (e.g., Escherichia coli) and rational versus random approaches. This comparative analysis is followed by discussions on systems biology tools deployed in optimizing the production of secondary metabolites. The potential contributions of additional systems biology tools are also discussed in the context of current challenges encountered during optimization of secondary metabolite production.
The Hico Image Processing System: A Web-Accessible Hyperspectral Remote Sensing Toolbox
NASA Astrophysics Data System (ADS)
Harris, A. T., III; Goodman, J.; Justice, B.
2014-12-01
As the quantity of Earth-observation data increases, the use-case for hosting analytical tools in geospatial data centers becomes increasingly attractive. To address this need, HySpeed Computing and Exelis VIS have developed the HICO Image Processing System, a prototype cloud computing system that provides online, on-demand, scalable remote sensing image processing capabilities. The system provides a mechanism for delivering sophisticated image processing analytics and data visualization tools into the hands of a global user community, who will only need a browser and internet connection to perform analysis. Functionality of the HICO Image Processing System is demonstrated using imagery from the Hyperspectral Imager for the Coastal Ocean (HICO), an imaging spectrometer located on the International Space Station (ISS) that is optimized for acquisition of aquatic targets. Example applications include a collection of coastal remote sensing algorithms that are directed at deriving critical information on water and habitat characteristics of our vulnerable coastal environment. The project leverages the ENVI Services Engine as the framework for all image processing tasks, and can readily accommodate the rapid integration of new algorithms, datasets and processing tools.
CalFitter: a web server for analysis of protein thermal denaturation data.
Mazurenko, Stanislav; Stourac, Jan; Kunka, Antonin; Nedeljkovic, Sava; Bednar, David; Prokop, Zbynek; Damborsky, Jiri
2018-05-14
Despite significant advances in the understanding of protein structure-function relationships, revealing protein folding pathways still poses a challenge due to a limited number of relevant experimental tools. Widely-used experimental techniques, such as calorimetry or spectroscopy, critically depend on a proper data analysis. Currently, there are only separate data analysis tools available for each type of experiment with a limited model selection. To address this problem, we have developed the CalFitter web server to be a unified platform for comprehensive data fitting and analysis of protein thermal denaturation data. The server allows simultaneous global data fitting using any combination of input data types and offers 12 protein unfolding pathway models for selection, including irreversible transitions often missing from other tools. The data fitting produces optimal parameter values, their confidence intervals, and statistical information to define unfolding pathways. The server provides an interactive and easy-to-use interface that allows users to directly analyse input datasets and simulate modelled output based on the model parameters. CalFitter web server is available free at https://loschmidt.chemi.muni.cz/calfitter/.
A Guiding Evolutionary Algorithm with Greedy Strategy for Global Optimization Problems
Cao, Leilei; Xu, Lihong; Goodman, Erik D.
2016-01-01
A Guiding Evolutionary Algorithm (GEA) with greedy strategy for global optimization problems is proposed. Inspired by Particle Swarm Optimization, the Genetic Algorithm, and the Bat Algorithm, the GEA was designed to retain some advantages of each method while avoiding some disadvantages. In contrast to the usual Genetic Algorithm, each individual in GEA is crossed with the current global best one instead of a randomly selected individual. The current best individual served as a guide to attract offspring to its region of genotype space. Mutation was added to offspring according to a dynamic mutation probability. To increase the capability of exploitation, a local search mechanism was applied to new individuals according to a dynamic probability of local search. Experimental results show that GEA outperformed the other three typical global optimization algorithms with which it was compared. PMID:27293421
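A minimal sketch of the mechanics just described (crossover with the current global best, a dynamic mutation probability, and a probabilistic local probe) on the multimodal Rastrigin benchmark; all rates and schedules are our own illustrative choices, not the paper's settings.

```python
# Guided evolutionary loop: every individual is crossed with the current
# global best; mutation decays over generations while local search ramps up.
import numpy as np

rng = np.random.default_rng(2)
def f(x):                                   # Rastrigin: multimodal benchmark
    return 10 * x.size + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

dim, pop_size, gens = 10, 30, 300
pop = rng.uniform(-5.12, 5.12, (pop_size, dim))
fit = np.apply_along_axis(f, 1, pop)

for g in range(gens):
    best = pop[fit.argmin()].copy()         # guide individual
    p_mut = 0.3 * (1 - g / gens)            # dynamic mutation probability
    p_ls = 0.1 * (g / gens)                 # local search ramps up late
    for i in range(pop_size):
        mask = rng.random(dim) < 0.5        # uniform crossover with best
        child = np.where(mask, best, pop[i])
        if rng.random() < p_mut:
            child = child + rng.normal(0, 0.5, dim)
        if rng.random() < p_ls:             # greedy local probe
            trial = child + rng.normal(0, 0.05, dim)
            child = trial if f(trial) < f(child) else child
        fc = f(child)
        if fc < fit[i]:                     # greedy replacement
            pop[i], fit[i] = child, fc

print("best value:", fit.min())
```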
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davis, Scott J.; Edwards, Shatiel B.; Teper, Gerald E.
We report that recent budget reductions have posed tremendous challenges to the U.S. Army in managing its portfolio of ground combat systems (tanks and other fighting vehicles), thus placing many important programs at risk. To address these challenges, the Army and a supporting team developed and applied the Capability Portfolio Analysis Tool (CPAT) to optimally invest in ground combat modernization over the next 25–35 years. CPAT provides the Army with the analytical rigor needed to help senior Army decision makers allocate scarce modernization dollars to protect soldiers and maintain capability overmatch. CPAT delivers unparalleled insight into multiple-decade modernization planning using a novel multiphase mixed-integer linear programming technique and illustrates a cultural shift toward analytics in the Army's acquisition thinking and processes. CPAT analysis helped shape decisions to continue modernization of the $10 billion Stryker family of vehicles (originally slated for cancellation) and to strategically reallocate over $20 billion to existing modernization programs by not pursuing the Ground Combat Vehicle program as originally envisioned. Ultimately, more than 40 studies have been completed using CPAT, applying operations research methods to optimally prioritize billions of taxpayer dollars and allowing Army acquisition executives to base investment decisions on analytically rigorous evaluations of portfolio trade-offs.
Solving geosteering inverse problems by stochastic Hybrid Monte Carlo method
Shen, Qiuyang; Wu, Xuqing; Chen, Jiefu; ...
2017-11-20
Inverse problems arise in almost all fields of science where real-world parameters are extracted from a set of measured data. The geosteering inversion plays an essential role in accurately predicting oncoming strata and in reliably guiding adjustment of the borehole position on the fly to reach one or more geological targets. This mathematical treatment is not easy: it requires finding an optimum solution in a large solution space, especially when the problem is non-linear and non-convex. Nowadays, a new generation of logging-while-drilling (LWD) tools has emerged on the market. These so-called azimuthal resistivity LWD tools have azimuthal sensitivity and a large depth of investigation. Hence, the associated inverse problems become much more difficult, since the earth model to be inverted has more detailed structure. Conventional deterministic methods are incapable of solving such a complicated inverse problem, as they suffer from local-minimum traps. Alternatively, stochastic optimizations are in general better at finding globally optimal solutions and handling uncertainty quantification. In this article, we investigate the Hybrid Monte Carlo (HMC) based statistical inversion approach and suggest that HMC-based inference is more efficient in dealing with the increased complexity and uncertainty faced by geosteering problems.
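For reference, a bare-bones HMC transition of the kind such statistical inversion builds on is sketched below; the Gaussian stand-in target, step size and path length are illustrative assumptions, not the geosteering posterior.

```python
# One leapfrog-integrated HMC step with a Metropolis accept/reject;
# the standard-normal target stands in for a misfit-based posterior.
import numpy as np

rng = np.random.default_rng(3)

def neg_log_post(x):        # stand-in for the misfit-based posterior
    return 0.5 * np.sum(x ** 2)

def grad(x):                # gradient of the potential above
    return x

def hmc_step(x, eps=0.1, steps=20):
    p = rng.normal(size=x.shape)                 # sample momentum
    x_new, p_new = x.copy(), p.copy()
    p_new -= 0.5 * eps * grad(x_new)             # leapfrog integration
    for _ in range(steps - 1):
        x_new += eps * p_new
        p_new -= eps * grad(x_new)
    x_new += eps * p_new
    p_new -= 0.5 * eps * grad(x_new)
    # Accept or reject based on the change in total energy
    h_old = neg_log_post(x) + 0.5 * p @ p
    h_new = neg_log_post(x_new) + 0.5 * p_new @ p_new
    return x_new if rng.random() < np.exp(h_old - h_new) else x

x = np.zeros(5)
samples = []
for _ in range(1000):
    x = hmc_step(x)
    samples.append(x.copy())
print("posterior std ~", np.std(samples))        # close to 1 for this target
```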
Integrated modeling approach for optimal management of water, energy and food security nexus
NASA Astrophysics Data System (ADS)
Zhang, Xiaodong; Vesselinov, Velimir V.
2017-03-01
Water, energy and food (WEF) are inextricably interrelated. Effective planning and management of limited WEF resources to meet current and future socioeconomic demands for sustainable development is challenging. WEF production/delivery may also produce environmental impacts; as a result, greenhouse-gas emission control will impact WEF nexus management as well. Nexus management for WEF security necessitates integrated tools for predictive analysis that are capable of identifying the tradeoffs among various sectors and of generating cost-effective planning and management strategies and policies. To address these needs, we have developed an integrated model analysis framework and tool called WEFO. WEFO provides a multi-period socioeconomic model for predicting how to satisfy WEF demands based on model inputs representing production costs, socioeconomic demands, and environmental controls. WEFO is applied to quantitatively analyze the interrelationships and trade-offs among system components including energy supply, electricity generation, water supply-demand, and food production, as well as mitigation of environmental impacts. WEFO is demonstrated on a hypothetical nexus management problem consistent with real-world management scenarios. Model parameters are analyzed using global sensitivity analysis and their effects on total system cost are quantified. The obtained results demonstrate how these types of analyses can help decision-makers and stakeholders make cost-effective decisions for optimal WEF management.
NASA Astrophysics Data System (ADS)
McKenney, D.; Pedlar, J.
2011-12-01
Climate is one of the major influences on forests, and much effort has gone into projecting the impacts of rapid climate change on forest distribution and productivity. Such efforts are premised on the notion that the current generation of Global Climate Models (GCMs) provides reasonably accurate representations of future climate. But what is the appropriate level of faith to put in these projections when making relatively fine-scale resource management decisions such as the movement of plant genetic material? In this talk we review recent outcomes of climate envelope models for North American tree species that suggest optimal climate regimes could move on average ~700 km within the next 100 years. Newer-generation GCMs seem to confirm these results, but much uncertainty remains for practical decision-making. Despite these uncertainties, assisted migration has been suggested as a climate change adaptation tool wherein populations of trees are moved up to a few hundred kilometers north (or a few hundred meters upslope) to keep pace with the anticipated changes in optimal climate regimes. A continent-wide, web-based tool (SEEDWHERE) is presented, which assists in identifying appropriate translocation distances for assisted migration initiatives. We finish with some suggestions for future work on the topic of forest regeneration decisions under an evolving and uncertain future climate.
3Drefine: an interactive web server for efficient protein structure refinement
Bhattacharya, Debswapna; Nowotny, Jackson; Cao, Renzhi; Cheng, Jianlin
2016-01-01
3Drefine is an interactive web server for consistent and computationally efficient protein structure refinement with the capability to perform web-based statistical and visual analysis. The 3Drefine refinement protocol utilizes iterative optimization of hydrogen bonding network combined with atomic-level energy minimization on the optimized model using a composite physics and knowledge-based force fields for efficient protein structure refinement. The method has been extensively evaluated on blind CASP experiments as well as on large-scale and diverse benchmark datasets and exhibits consistent improvement over the initial structure in both global and local structural quality measures. The 3Drefine web server allows for convenient protein structure refinement through a text or file input submission, email notification, provided example submission and is freely available without any registration requirement. The server also provides comprehensive analysis of submissions through various energy and statistical feedback and interactive visualization of multiple refined models through the JSmol applet that is equipped with numerous protein model analysis tools. The web server has been extensively tested and used by many users. As a result, the 3Drefine web server conveniently provides a useful tool easily accessible to the community. The 3Drefine web server has been made publicly available at the URL: http://sysbio.rnet.missouri.edu/3Drefine/. PMID:27131371
Automated parameterization of intermolecular pair potentials using global optimization techniques
NASA Astrophysics Data System (ADS)
Krämer, Andreas; Hülsmann, Marco; Köddermann, Thorsten; Reith, Dirk
2014-12-01
In this work, different global optimization techniques are assessed for the automated development of molecular force fields, as used in molecular dynamics and Monte Carlo simulations. The quest of finding suitable force field parameters is treated as a mathematical minimization problem. Intricate problem characteristics such as extremely costly and even abortive simulations, noisy simulation results, and especially multiple local minima naturally lead to the use of sophisticated global optimization algorithms. Five diverse algorithms (pure random search, recursive random search, CMA-ES, differential evolution, and taboo search) are compared to our own tailor-made solution named CoSMoS. CoSMoS is an automated workflow. It models the parameters' influence on the simulation observables to detect a globally optimal set of parameters. It is shown how and why this approach is superior to other algorithms. Applied to suitable test functions and simulations for phosgene, CoSMoS effectively reduces the number of required simulations and real time for the optimization task.
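As a concrete example of one of the compared classes, the snippet below runs differential evolution (via SciPy) on a noisy stand-in for the simulated-versus-target-observables loss; the toy loss, bounds and target values are assumptions, not the actual force-field problem. Disabling the final gradient polish is a sensible choice on noisy objectives.

```python
# Differential evolution on a noisy toy loss mimicking "match simulated
# observables to targets"; all numbers are illustrative.
import numpy as np
from scipy.optimize import differential_evolution

rng = np.random.default_rng(4)
target = np.array([1.38, 0.92, 2.10])        # hypothetical observables

def loss(params):
    simulated = np.array([params[0] * params[1],
                          np.sqrt(abs(params[1])),
                          params[0] + params[2] ** 2])
    noise = 0.01 * rng.normal(size=3)        # mimic noisy simulation results
    return np.sum(((simulated + noise - target) / target) ** 2)

result = differential_evolution(loss, bounds=[(0.1, 3.0)] * 3,
                                maxiter=200, tol=1e-8, seed=4, polish=False)
print(result.x, result.fun)
```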
Optimization of turning process through the analytic flank wear modelling
NASA Astrophysics Data System (ADS)
Del Prete, A.; Franchi, R.; De Lorenzis, D.
2018-05-01
In the present work, the approach used to optimize process capabilities for the machining of Oil&Gas components will be described. These components are machined by turning stainless steel cast workpieces. For this purpose, a proper Design Of Experiments (DOE) plan has been designed and executed; as output of the experimentation, data on tool wear have been collected. The DOE has been designed starting from the cutting speed and feed values recommended by the tool manufacturer; the depth of cut has been kept constant. Wear data have been obtained by observing the tool flank wear under an optical microscope, with data acquisition carried out at regular intervals of working time. Through statistical and regression analysis, analytical models of the flank wear and the tool life have been obtained. The optimization approach used is a multi-objective optimization, which minimizes the production time and the number of cutting tools used, under a constraint on a defined flank wear level. The technique used to solve the optimization problem is a Multi-Objective Particle Swarm Optimization (MOPS). The optimization results, validated by the execution of a further experimental campaign, highlighted the reliability of the work and confirmed the usability of the optimized process parameters and the potential benefit for the company.
Dynamic optimization case studies in DYNOPT tool
NASA Astrophysics Data System (ADS)
Ozana, Stepan; Pies, Martin; Docekal, Tomas
2016-06-01
Dynamic programming is typically applied to optimization problems. As analytical solutions are generally very difficult to obtain, software tools are widely used instead. These software packages are often third-party products bound to standard simulation tools on the market. As typical examples of such tools, TOMLAB and DYNOPT can be effectively applied to solve dynamic programming problems. DYNOPT is presented in this paper owing to its licensing policy (a free product under the GPL) and its simplicity of use. DYNOPT is a set of MATLAB functions for determining an optimal control trajectory from a given description of the process, the cost to be minimized, and equality and inequality constraints, using the method of orthogonal collocation on finite elements. The actual optimal control problem is solved by complete parameterization of both the control and the state profile vectors. It is assumed that the optimized dynamic model may be described by a set of ordinary differential equations (ODEs) or differential-algebraic equations (DAEs). This collection of functions extends the capability of the MATLAB Optimization Toolbox. The paper introduces the use of DYNOPT in the field of dynamic optimization by means of case studies involving chosen laboratory physical educational models.
Seasonal-Scale Optimization of Conventional Hydropower Operations in the Upper Colorado System
NASA Astrophysics Data System (ADS)
Bier, A.; Villa, D.; Sun, A.; Lowry, T. S.; Barco, J.
2011-12-01
Sandia National Laboratories is developing the Hydropower Seasonal Concurrent Optimization for Power and the Environment (Hydro-SCOPE) tool to examine basin-wide conventional hydropower operations at seasonal time scales. This tool is part of an integrated, multi-laboratory project designed to explore different aspects of optimizing conventional hydropower operations. The Hydro-SCOPE tool couples a one-dimensional reservoir model with a river routing model to simulate hydrology and water quality. An optimization engine wraps around this model framework to solve for long-term operational strategies that best meet the specific objectives of the hydrologic system while honoring operational and environmental constraints. The optimization routines are provided by Sandia's open source DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) software. Hydro-SCOPE allows for multi-objective optimization, which can be used to gain insight into the trade-offs that must be made between objectives. The Hydro-SCOPE tool is being applied to the Upper Colorado Basin hydrologic system. This system contains six reservoirs, each with its own set of objectives (such as maximizing revenue, optimizing environmental indicators, meeting water use needs, or other objectives) and constraints. This leads to a large optimization problem with strong connectedness between objectives. The systems-level approach used by the Hydro-SCOPE tool allows simultaneous analysis of these objectives, as well as understanding of potential trade-offs related to different objectives and operating strategies. The seasonal-scale tool will be tightly integrated with the other components of this project, which examine day-ahead and real-time planning, environmental performance, hydrologic forecasting, and plant efficiency.
NASA Astrophysics Data System (ADS)
Liu, Y.; Engel, B.; Collingsworth, P.; Pijanowski, B. C.
2017-12-01
Nutrient loading from the Maumee River watershed is a significant reason for the harmful algal blooms (HABs) problem in Lake Erie. Strategies to reduce nutrient loading from agricultural areas in the Maumee River watershed need to be explored. Best management practices (BMPs) are popular approaches for improving hydrology and water quality. Various scenarios of BMP implementation were simulated in the AXL watershed (an agricultural watershed in Maumee River watershed) using Soil and Water Assessment Tool (SWAT) and a new BMP cost tool to explore the cost-effectiveness of the practices. BMPs of interest included vegetative filter strips, grassed waterways, blind inlets, grade stabilization structures, wetlands, no-till, nutrient management, residue management, and cover crops. The following environmental concerns were considered: streamflow, Total Phosphorous (TP), Dissolved Reactive Phosphorus (DRP), Total Kjeldahl Nitrogen (TKN), and Nitrate+Nitrite (NOx). To obtain maximum hydrological and water quality benefits with minimum cost, an optimization tool was developed to optimally select and place BMPs by connecting SWAT, the BMP cost tool, and optimization algorithms. The optimization tool was then applied in AXL watershed to explore optimization focusing on critical areas (top 25% of areas with highest runoff volume/pollutant loads per area) vs. all areas of the watershed, optimization using weather data for spring (March to July, due to the goal of reducing spring phosphorus in watershed management plan) vs. full year, and optimization results of implementing BMPs to achieve the watershed management plan goal (reducing 2008 TP levels by 40%). The optimization tool and BMP optimization results can be used by watershed groups and communities to solve hydrology and water quality problems.
Aerodynamic Optimization of Rocket Control Surface Geometry Using Cartesian Methods and CAD Geometry
NASA Technical Reports Server (NTRS)
Nelson, Andrea; Aftosmis, Michael J.; Nemec, Marian; Pulliam, Thomas H.
2004-01-01
Aerodynamic design is an iterative process involving geometry manipulation and complex computational analysis subject to physical constraints and aerodynamic objectives. A design cycle consists of first establishing the performance of a baseline design, which is usually created with low-fidelity engineering tools, and then progressively optimizing the design to maximize its performance. Optimization techniques have evolved from relying exclusively on designer intuition and insight in traditional trial-and-error methods, to sophisticated local and global search methods. Recent attempts at automating the search through a large design space with formal optimization methods include both database-driven and direct evaluation schemes. Databases are being used in conjunction with surrogate and neural network models as a basis on which to run optimization algorithms. Optimization algorithms are also being driven by the direct evaluation of objectives and constraints using high-fidelity simulations. Surrogate methods use data points obtained from simulations, and possibly gradients evaluated at the data points, to create mathematical approximations of a database. Neural network models work in a similar fashion, using a number of high-fidelity database calculations as training iterations to create a database model. Optimal designs are obtained by coupling an optimization algorithm to the database model. Evaluation of the current best design then either gives a new local optimum and/or increases the fidelity of the approximation model for the next iteration. Surrogate methods have also been developed that iterate on the selection of data points to decrease the uncertainty of the approximation model prior to searching for an optimal design. The database approximation models for each of these cases, however, become computationally expensive with increasing dimensionality. Thus the method of using optimization algorithms to search a database model becomes problematic as the number of design variables increases.
Subsystem design in aircraft power distribution systems using optimization
NASA Astrophysics Data System (ADS)
Chandrasekaran, Sriram
2000-10-01
The research reported in this dissertation focuses on the development of optimization tools for the design of subsystems in a modern aircraft power distribution system. The baseline power distribution system is built around a 270V DC bus. One of the distinguishing features of this power distribution system is the presence of regenerative power from the electrically driven flight control actuators and structurally integrated smart actuators back to the DC bus. The key electrical components of the power distribution system are bidirectional switching power converters, which convert, control and condition electrical power between the sources and the loads. The dissertation is divided into three parts. Part I deals with the formulation of an optimization problem for a sample system consisting of a regulated DC-DC buck converter preceded by an input filter. The individual subsystems are optimized first followed by the integrated optimization of the sample system. It is shown that the integrated optimization provides better results than that obtained by integrating the individually optimized systems. Part II presents a detailed study of piezoelectric actuators. This study includes modeling, optimization of the drive amplifier and the development of a current control law for piezoelectric actuators coupled to a simple mechanical structure. Linear and nonlinear methods to study subsystem interaction and stability are studied in Part III. A multivariable impedance ratio criterion applicable to three phase systems is proposed. Bifurcation methods are used to obtain global stability characteristics of interconnected systems. The application of a nonlinear design methodology, widely used in power systems, to incrementally improve the robustness of a system to Hopf bifurcation instability is discussed.
Automation of POST Cases via External Optimizer and "Artificial p2" Calculation
NASA Technical Reports Server (NTRS)
Dees, Patrick D.; Zwack, Mathew R.; Michelson, Diane K.
2017-01-01
During conceptual design, speed and accuracy are often at odds. Specifically in the realm of launch vehicles, optimizing the ascent trajectory requires a large pool of analytical power and expertise. Experienced analysts working on familiar vehicles can produce optimal trajectories in a short time frame; however, whenever either "experienced" or "familiar" is not applicable, the optimization process can become quite lengthy. In order to construct a vehicle-agnostic method, an established global optimization algorithm is needed. In this work the authors develop an "artificial" error term that maps arbitrary control vectors to non-zero error, by which a global method can operate. Two global methods are compared alongside Design of Experiments and random sampling, and are shown to produce results comparable to analysis done by a human expert.
SPOT-A SENSOR PLACEMENT OPTIMIZATION TOOL FOR ...
This paper presents SPOT, a Sensor Placement Optimization Tool. SPOT provides a toolkit that facilitates research in sensor placement optimization and enables the practical application of sensor placement solvers to real-world CWS design applications. This paper provides an overview of SPOT's key features, and then illustrates how this tool can be flexibly applied to solve a variety of different types of sensor placement problems.
Update on HCDstruct - A Tool for Hybrid Wing Body Conceptual Design and Structural Optimization
NASA Technical Reports Server (NTRS)
Gern, Frank H.
2015-01-01
HCDstruct is a Matlab® based software tool to rapidly build a finite element model for structural optimization of hybrid wing body (HWB) aircraft at the conceptual design level. The tool uses outputs from a Flight Optimization System (FLOPS) performance analysis together with a conceptual outer mold line of the vehicle, e.g. created by Vehicle Sketch Pad (VSP), to generate a set of MSC Nastran® bulk data files. These files can readily be used to perform a structural optimization and weight estimation using Nastran’s® Solution 200 multidisciplinary optimization solver. Initially developed at NASA Langley Research Center to perform increased fidelity conceptual level HWB centerbody structural analyses, HCDstruct has grown into a complete HWB structural sizing and weight estimation tool, including a fully flexible aeroelastic loads analysis. Recent upgrades to the tool include the expansion to a full wing tip-to-wing tip model for asymmetric analyses like engine out conditions and dynamic overswings, as well as a fully actuated trailing edge, featuring up to 15 independently actuated control surfaces and twin tails. Several example applications of the HCDstruct tool are presented.
Watershed Management Optimization Support Tool (WMOST) Workshop.
EPA's Watershed Management Optimization Support Tool (WMOST) version 2 is a decision support tool designed to facilitate integrated water management by communities at the small watershed scale. WMOST allows users to look across management options in stormwater (including green i...
A survey of compiler optimization techniques
NASA Technical Reports Server (NTRS)
Schneck, P. B.
1972-01-01
Major optimization techniques of compilers are described and grouped into three categories: machine-dependent, architecture-dependent, and architecture-independent. Machine-dependent optimizations tend to be local and are performed on short spans of generated code, using particular properties of an instruction set to reduce the time or space required by a program. Architecture-dependent optimizations are global and are performed while generating code. These optimizations consider the structure of a computer, but not its detailed instruction set. Architecture-independent optimizations are also global but are based on analysis of the program flow graph and the dependencies among statements of the source program. A conceptual review of a universal optimizer that performs architecture-independent optimizations at the source-code level is also presented.
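A toy instance of an architecture-independent, source-level optimization of the kind surveyed: constant folding over a miniature expression tree. The tuple-based AST is our own illustration, not the paper's notation.

```python
# Constant folding: operator nodes whose operands are both constants are
# evaluated at "compile time"; nodes involving variables are left intact.
import operator

OPS = {'+': operator.add, '-': operator.sub,
       '*': operator.mul, '/': operator.truediv}

def fold(node):
    """Recursively replace operator nodes that have constant operands."""
    if isinstance(node, tuple):                 # ('op', lhs, rhs)
        op, lhs, rhs = node[0], fold(node[1]), fold(node[2])
        if isinstance(lhs, (int, float)) and isinstance(rhs, (int, float)):
            return OPS[op](lhs, rhs)            # fold the subexpression
        return (op, lhs, rhs)
    return node                                 # constant or variable name

# (x * (2 + 3)) - (8 / 4)  ->  ('-', ('*', 'x', 5), 2.0)
expr = ('-', ('*', 'x', ('+', 2, 3)), ('/', 8, 4))
print(fold(expr))
```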
Salehi, Mojtaba; Bahreininejad, Ardeshir
2011-08-01
Optimization of process planning is considered as the key technology for computer-aided process planning which is a rather complex and difficult procedure. A good process plan of a part is built up based on two elements: (1) the optimized sequence of the operations of the part; and (2) the optimized selection of the machine, cutting tool and Tool Access Direction (TAD) for each operation. In the present work, the process planning is divided into preliminary planning, and secondary/detailed planning. In the preliminary stage, based on the analysis of order and clustering constraints as a compulsive constraint aggregation in operation sequencing and using an intelligent searching strategy, the feasible sequences are generated. Then, in the detailed planning stage, using the genetic algorithm which prunes the initial feasible sequences, the optimized operation sequence and the optimized selection of the machine, cutting tool and TAD for each operation based on optimization constraints as an additive constraint aggregation are obtained. The main contribution of this work is the optimization of sequence of the operations of the part, and optimization of machine selection, cutting tool and TAD for each operation using the intelligent search and genetic algorithm simultaneously.
NASA Astrophysics Data System (ADS)
Ariyarit, Atthaphon; Sugiura, Masahiko; Tanabe, Yasutada; Kanazaki, Masahiro
2018-06-01
A multi-fidelity optimization technique based on an efficient global optimization process using a hybrid surrogate model is investigated for solving real-world design problems. The model constructs the local deviation using the kriging method and the global model using a radial basis function. The expected improvement is computed to decide on additional samples that can improve the model. The approach was first investigated by solving mathematical test problems. The results were compared with optimization results from an ordinary kriging method and a co-kriging method, and the proposed method produced the best solution. The proposed method was also applied to the aerodynamic design optimization of helicopter blades to obtain the maximum blade efficiency. The optimal shape obtained by the proposed method achieved performance almost equivalent to that obtained using single-fidelity optimization based on high-fidelity evaluations. Comparing all three methods, the proposed method required the lowest total number of high-fidelity evaluation runs to obtain a converged solution.
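The sampling criterion at the heart of such a process is easy to state in code. Below, `mu` and `sigma` stand for the surrogate's predictive mean and standard deviation at candidate points (filled here with made-up numbers); the hybrid kriging/RBF surrogate itself is not reproduced.

```python
# Expected improvement for a minimization problem: the candidate with the
# largest EI becomes the next expensive (high-fidelity) evaluation.
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, f_best):
    """EI = E[max(f_best - Y, 0)] with Y ~ N(mu, sigma^2)."""
    sigma = np.maximum(sigma, 1e-12)         # guard near-deterministic points
    z = (f_best - mu) / sigma
    return (f_best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

mu = np.array([0.9, 1.1, 0.7])               # surrogate predictive means
sigma = np.array([0.05, 0.3, 0.2])           # surrogate predictive stds
print(expected_improvement(mu, sigma, f_best=0.8).argmax())
```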
Wroe, Emily B; McBain, Ryan K; Michaelis, Annie; Dunbar, Elizabeth L; Hirschhorn, Lisa R; Cancedda, Corrado
2017-08-01
Despite rapid growth in the number of physicians and academic institutions entering the field of global health, there are few tools that inform global health curricula and assess physician readiness for this field. To address this gap, we describe the development and pilot testing of a new tool to assess nontechnical competencies and values in global health. Competencies assessed include systems-based practice, interpersonal and cross-cultural communication, professionalism and self-care, patient care, mentoring, teaching, management, and personal motivation and experience. The Global Health Delivery Competency Assessment Tool presents 15 case vignettes and open-ended questions related to situations a global health practitioner might encounter, and grades the quality of responses on a 6-point ordinal scale. We interviewed 17 of 18 possible global health residents (94%), matched with 17 residents not training in global health, for a total of 34 interviews. A second reviewer independently scored recordings of 13 interviews for reliability. Pilot testing indicated a high degree of discriminant validity, as measured by the instrument's ability to distinguish between residents who were and were not enrolled in a global health program ( P < .001). It also demonstrated acceptable consistency, as assessed by interrater reliability (κ = 0.53), with a range of item-level agreement from 84%-96%. The tool has potential applicability to a variety of academic and programmatic activities, including evaluation of candidates for global health positions and evaluating the success of training programs in equipping practitioners for entry into this field.
Coupled Low-thrust Trajectory and System Optimization via Multi-Objective Hybrid Optimal Control
NASA Technical Reports Server (NTRS)
Vavrina, Matthew A.; Englander, Jacob Aldo; Ghosh, Alexander R.
2015-01-01
The optimization of low-thrust trajectories is tightly coupled with the spacecraft hardware. Trading trajectory characteristics against system parameters to identify viable solutions and determine mission sensitivities across discrete hardware configurations is labor-intensive. Local independent optimization runs can sample the design space, but a global exploration that resolves the relationships between the system variables across multiple objectives enables a full mapping of the optimal solution space. A multi-objective, hybrid optimal control algorithm is formulated using a multi-objective genetic algorithm as an outer-loop systems optimizer around a global trajectory optimizer. The coupled problem is solved simultaneously to generate Pareto-optimal solutions in a single execution. The automated approach is demonstrated on two boulder return missions.
Levy-Mendelovich, Sarina; Barg, Assaf Arie; Rosenberg, Nurit; Avishai, Einat; Luboshitz, Jacob; Misgav, Mudi; Kenet, Gili; Livnat, Tami
2018-07-01
Congenital factor V deficiency (FVD) is a rare bleeding disorder with an estimated incidence of 1 in 1,000,000 in the general population. Since the common coagulation tests do not correlate with the bleeding tendency, there is an unmet need to predict FVD patients' bleeding hazard prior to surgical interventions. Our aim was to optimize treatment prior to surgical interventions using the global coagulation assays thrombin generation (TG) and rotational thromboelastometry (ROTEM). Our cohort included 5 patients with FVD, 4 severe and one mild. Two of them underwent TG and ROTEM prior to surgical interventions, including ex vivo spiking assays using bypassing agents and platelets. All five patients exhibited prolonged PT and PTT, independent of their bleeding tendency. Patient 1, who demonstrated a severe bleeding phenotype, underwent surgery treated by a combination of APCC (FEIBA) and platelet transfusion. Therapy was guided by the results of the global tests (TG as well as ROTEM). During the pre- and post-operative period, neither excessive bleeding nor any thrombosis was noted. In contrast, TG and ROTEM analysis of patient 4 led us to perform the surgery without any blood product support. Indeed, the patient did not encounter any bleeding. Global coagulation assays may be useful ancillary tools for guiding treatment decisions in FVD patients undergoing surgical procedures.
Additive manufacturing: Toward holistic design
Jared, Bradley H.; Aguilo, Miguel A.; Beghini, Lauren L.; ...
2017-03-18
Here, additive manufacturing offers unprecedented opportunities to design complex structures optimized for performance envelopes inaccessible under conventional manufacturing constraints. Additive processes also promote realization of engineered materials with microstructures and properties that are impossible via traditional synthesis techniques. Enthused by these capabilities, optimization design tools have experienced a recent revival. The current capabilities of additive processes and optimization tools are summarized briefly, while an emerging opportunity is discussed to achieve a holistic design paradigm whereby computational tools are integrated with stochastic process and material awareness to enable the concurrent optimization of design topologies, material constructs and fabrication processes.
Watershed Management Optimization Support Tool v3
The Watershed Management Optimization Support Tool (WMOST) is a decision support tool that facilitates integrated water management at the local or small watershed scale. WMOST models the environmental effects and costs of management decisions in a watershed context that is, accou...
NASA Astrophysics Data System (ADS)
Sergeyev, Yaroslav D.; Kvasov, Dmitri E.; Mukhametzhanov, Marat S.
2018-06-01
The necessity to find the global optimum of multiextremal functions arises in many applied problems where finding local solutions is insufficient. One of the desirable properties of global optimization methods is strong homogeneity meaning that a method produces the same sequences of points where the objective function is evaluated independently both of multiplication of the function by a scaling constant and of adding a shifting constant. In this paper, several aspects of global optimization using strongly homogeneous methods are considered. First, it is shown that even if a method possesses this property theoretically, numerically very small and large scaling constants can lead to ill-conditioning of the scaled problem. Second, a new class of global optimization problems where the objective function can have not only finite but also infinite or infinitesimal Lipschitz constants is introduced. Third, the strong homogeneity of several Lipschitz global optimization algorithms is studied in the framework of the Infinity Computing paradigm allowing one to work numerically with a variety of infinities and infinitesimals. Fourth, it is proved that a class of efficient univariate methods enjoys this property for finite, infinite and infinitesimal scaling and shifting constants. Finally, it is shown that in certain cases the usage of numerical infinities and infinitesimals can avoid ill-conditioning produced by scaling. Numerical experiments illustrating theoretical results are described.
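To make the strong-homogeneity property above concrete, the following is a minimal sketch (not the authors' method): a basic Piyavskii-Shubert-style univariate Lipschitz optimizer with an illustrative test function and Lipschitz bound chosen here. If the Lipschitz bound supplied for the scaled objective is scaled by the same multiplier, the method evaluates both objectives at exactly the same points.

    import numpy as np

    def piyavskii_points(f, a, b, L, n_iter=20):
        """Sample where the saw-tooth Lipschitz lower bound is minimal."""
        xs, fs = [a, b], [f(a), f(b)]
        for _ in range(n_iter):
            order = np.argsort(xs)
            x_s, f_s = np.array(xs)[order], np.array(fs)[order]
            best_v, best_x = np.inf, None
            for i in range(len(x_s) - 1):
                # intersection of the two downward cones over [x_i, x_{i+1}]
                x_new = 0.5*(x_s[i] + x_s[i+1]) + (f_s[i] - f_s[i+1])/(2.0*L)
                v = 0.5*(f_s[i] + f_s[i+1]) - 0.5*L*(x_s[i+1] - x_s[i])
                if v < best_v:
                    best_v, best_x = v, x_new
            xs.append(best_x)
            fs.append(f(best_x))
        return xs

    f = lambda x: np.sin(x) + 0.3*np.sin(3.0*x)
    L = 1.9                                   # valid Lipschitz bound for f on [0, 6]
    p1 = piyavskii_points(f, 0.0, 6.0, L)
    p2 = piyavskii_points(lambda x: 100.0*f(x) + 7.0, 0.0, 6.0, 100.0*L)
    print(np.allclose(p1, p2))                # True: identical evaluation sequences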
Watershed Management Optimization Support Tool (WMOST) ...
EPA's Watershed Management Optimization Support Tool (WMOST) version 2 is a decision support tool designed to facilitate integrated water management by communities at the small watershed scale. WMOST allows users to look across management options in stormwater (including green infrastructure), wastewater, drinking water, and land conservation programs to find the least cost solutions. The pdf version of these presentations accompanies the recorded webinar with closed captions to be posted on the WMOST web page. The webinar was recorded at the time a training workshop took place for EPA's Watershed Management Optimization Support Tool (WMOST, v2).
Analysis of the trade-off between high crop yield and low yield instability at the global scale
NASA Astrophysics Data System (ADS)
Ben-Ari, Tamara; Makowski, David
2016-10-01
Yield dynamics of major crop species vary remarkably among continents. The worldwide distribution of cropland influences both the expected levels and the interannual variability of global yields. An expansion of cultivated land in the most productive areas could theoretically increase global production, but also increase global yield instability if the most productive regions are characterized by high interannual yield variability. In this letter, we use portfolio analysis to quantify the tradeoff between the expected value and the interannual variance of global yield. We compute optimal frontiers for four crop species (maize, rice, soybean and wheat) and show how the distribution of cropland among large world regions can be optimized to either increase expected global crop production or decrease its interannual variability. We also show that a preferential allocation of cropland to the most productive regions can increase global expected yield at the expense of yield stability. Theoretically, optimizing the distribution of a small fraction of total cultivated areas can help find a good compromise between low instability and high crop yields at the global scale.
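A hedged illustration of the portfolio idea above, using invented numbers rather than the authors' data: hypothetical regional mean yields and an interannual yield covariance matrix are combined, and the optimal frontier is traced by minimizing variance subject to a target expected global yield.

    import numpy as np
    from scipy.optimize import minimize

    # Hypothetical regional statistics: mean yield (t/ha) and covariance across years
    mu = np.array([8.5, 6.0, 4.5, 7.0])
    cov = np.array([[1.5, 0.2, 0.1, 0.3],
                    [0.2, 0.6, 0.1, 0.1],
                    [0.1, 0.1, 0.4, 0.0],
                    [0.3, 0.1, 0.0, 0.9]])

    def min_variance_for_target(target):
        """Smallest interannual variance achievable at a given expected yield."""
        n = len(mu)
        cons = ({'type': 'eq', 'fun': lambda w: w.sum() - 1.0},     # shares sum to 1
                {'type': 'ineq', 'fun': lambda w: w @ mu - target}) # meet the target
        res = minimize(lambda w: w @ cov @ w, np.full(n, 1.0/n),
                       bounds=[(0.0, 1.0)]*n, constraints=cons)
        return res.x, res.fun

    for t in np.linspace(mu.min(), mu.max(), 5):    # sweep along the frontier
        w, var = min_variance_for_target(t)
        print(f"target {t:.2f} t/ha -> variance {var:.3f}, shares {np.round(w, 2)}")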
Mukherjee, Joydeep; Llewellyn, Lyndon E; Evans-Illidge, Elizabeth A
2008-01-01
Microbial marine biodiscovery is a recent scientific endeavour developing at a time when information and other technologies are also undergoing great technical strides. Global visualisation of datasets is now becoming available to the world through powerful and readily available software such as Worldwind™, ArcGIS Explorer™ and Google Earth™. Overlaying custom information upon these tools is within the hands of every scientist and more and more scientific organisations are making data available that can also be integrated into these global visualisation tools. The integrated global view that these tools enable provides a powerful desktop exploration tool. Here we demonstrate the value of this approach to marine microbial biodiscovery by developing a geobibliography that incorporates citations on tropical and near-tropical marine microbial natural products research with Google Earth™ and additional ancillary global data sets. The tools and software used are all readily available and the reader is able to use and install the material described in this article. PMID:19172194
Optimization of Regenerators for AMRR Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nellis, Gregory; Klein, Sanford; Brey, William
Active Magnetic Regenerative Refrigeration (AMRR) systems have no direct global warming potential or ozone depletion potential and hold the potential for providing refrigeration with efficiencies that are equal to or greater than the vapor compression systems used today. The work carried out in this project has developed and improved modeling tools that can be used to optimize and evaluate the magnetocaloric materials and geometric structure of the regenerator beds required for AMRR systems. There has been an explosion in the development of magnetocaloric materials for AMRR systems over the past few decades. The most attractive materials, based on the magnitude of the measured magnetocaloric effect, tend to also have large amounts of hysteresis. This project has provided for the first time a thermodynamically consistent method for evaluating these hysteretic materials in the context of an AMRR cycle. An additional, practical challenge that has been identified for AMRR systems is related to the participation of the regenerator wall in the cyclic process. The impact of housing heat capacity on both passive and active regenerative systems has been studied and clarified within this project. This report is divided into two parts corresponding to these two efforts. Part 1 describes the work related to modeling magnetic hysteresis while Part 2 discusses the modeling of the heat capacity of the housing. A key outcome of this project is the development of a publicly available modeling tool that allows researchers to identify a truly optimal magnetocaloric refrigerant. Typically, the refrigeration potential of a magnetocaloric material is judged entirely on the magnitude of the magnetocaloric effect, while other properties of the material are deemed unimportant. This project has shown that a material with a large magnetocaloric effect (as evidenced, for example, by a large adiabatic temperature change) may not be optimal when it is accompanied by a large hysteresis. The trade-off between these various material properties and the proper design of an AMRR system can only be evaluated correctly using the comprehensive, physics-based model developed by this project. The development of these modeling tools and optimization studies will provide the knowledge base that is required to achieve transformational discoveries. The widespread adoption of AMRR technology will change the character of energy demand in this country and provide manufacturing jobs as well as employment associated with retrofitting existing HVAC&R applications.
A water management decision support system contributing to sustainability
NASA Astrophysics Data System (ADS)
Horváth, Klaudia; van Esch, Bart; Baayen, Jorn; Pothof, Ivo; Talsma, Jan; van Heeringen, Klaas-Jan
2017-04-01
Deltares and Eindhoven University of Technology are developing a new decision support system (DSS) for regional water authorities. In order to maintain water levels in the Dutch polder system, water should be drained and pumped out from the polders to the sea. The time and amount of pumping depend on the current sea level, the water level in the polder, the weather forecast, the electricity price forecast and possibly local renewable power production. This is a multivariable optimisation problem, where the goal is to keep the water level in the polder within certain bounds. By optimizing the operation of the pumps, the energy usage and costs can be reduced, hence the operation of the regional water authorities can be more sustainable, while also anticipating an increasing share of renewables in the energy mix in a cost-effective way. The decision support system, based on Delft-FEWS as an operational data-integration platform, runs an optimization model built in RTC-Tools 2, which performs real-time optimization to calculate the pumping strategy, taking into account present and future circumstances. As the core of the real-time decision support system, RTC-Tools 2 fulfils the key requirements of a DSS: it is fast, robust and always finds the optimal solution. These properties are associated with convex optimization, in which the global optimum can always be found. The challenge in the development is to maintain a convex formulation of all the non-linear components in the system, i.e., open channels, hydraulic structures, and pumps. The system is introduced through four pilot projects, one of which is a pilot of the Dutch Water Authority Rivierenland. This is a typical Dutch polder system: several polders are drained to the main water system, the Linge. The water from the Linge can be released to the main rivers that are subject to tidal fluctuations. In case of low tide, water can be released via the gates. In case of high tide, water should be pumped. The goal of the pilot is to make the operation of the regional water authority more sustainable and cost-efficient. Sustainability can be achieved by minimizing CO2 production through minimizing the energy used for pumping. This work shows the functionalities of the new decision support system, using RTC-Tools 2, through the example of a pilot project.
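The pump-scheduling problem sketched above can be illustrated as a small convex (here linear) program. The following is a minimal sketch with invented prices, inflows and level bounds, not the RTC-Tools 2 formulation: pumped volumes are chosen to minimize energy cost while keeping the polder level within bounds.

    import numpy as np
    from scipy.optimize import linprog

    # Hypothetical 24-h horizon: hourly electricity price, inflow, pump and level limits
    T = 24
    price = 0.10 + 0.05*np.sin(np.arange(T)*2*np.pi/24)   # cost per pumped unit
    inflow = np.full(T, 2.0)                              # runoff entering the polder
    q_max, h0, h_min, h_max = 5.0, 10.0, 9.0, 11.0

    # Level evolves as h_t = h0 + cumsum(inflow - q); keep h_min <= h_t <= h_max.
    L = np.tril(np.ones((T, T)))                          # cumulative-sum operator
    cum_in = L @ inflow
    A_ub = np.vstack([L, -L])                             #  L q <= h0 + cum_in - h_min
    b_ub = np.concatenate([h0 + cum_in - h_min,           # -L q <= h_max - h0 - cum_in
                           h_max - h0 - cum_in])

    res = linprog(c=price, A_ub=A_ub, b_ub=b_ub, bounds=[(0.0, q_max)]*T)
    print(res.message)
    print("hourly pumping:", np.round(res.x, 2))          # pumping shifts to cheap hours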
Kastner, Monika; Perrier, Laure; Hamid, Jemila; Tricco, Andrea C; Cardoso, Roberta; Ivers, Noah M; Liu, Barbara; Marr, Sharon; Holroyd-Leduc, Jayna; Wong, Geoff; Graves, Lisa; Straus, Sharon E
2015-01-01
Introduction: The burden of chronic disease is a global phenomenon, particularly among people aged 65 years and older. More than half of older adults have more than one chronic disease and their care is not optimal. Chronic disease management (CDM) tools have the potential to meet this challenge but they are primarily focused on a single disease, which fails to address the growing number of seniors with multiple chronic conditions.
Methods and analysis: We will conduct a systematic review alongside a realist review to identify effective CDM tools that integrate one or more high-burden chronic diseases affecting older adults and to better understand for whom, under what circumstances, how and why they produce their outcomes. We will search MEDLINE, EMBASE, CINAHL, AgeLine and the Cochrane Library for experimental, quasi-experimental, observational and qualitative studies in any language investigating CDM tools that facilitate optimal disease management in one or more high-burden chronic diseases affecting adults aged ≥65 years. Study selection will involve calibration of reviewers to ensure reliability of screening and duplicate assessment of articles. Data abstraction and risk of bias assessment will also be performed independently. Analysis will include descriptive summaries of study and appraisal characteristics, effectiveness of each CDM tool (meta-analysis if appropriate); and a realist programme theory will be developed and refined to explain the outcome patterns within the included studies.
Ethics and dissemination: Ethics approval is not required for this study. We anticipate that our findings, pertaining to gaps in care across high-burden chronic diseases affecting seniors and highlighting specific areas that may require more research, will be of interest to a wide range of knowledge users and stakeholders. We will publish and present our findings widely, and also plan more active dissemination strategies such as workshops with our key stakeholders.
Trial registration number: Our protocol is registered with PROSPERO (registration number CRD42014014489). PMID:25649215
NASA Astrophysics Data System (ADS)
Wu, J.; Yang, Y.; Luo, Q.; Wu, J.
2012-12-01
This study presents a new hybrid multi-objective evolutionary algorithm, the niched Pareto tabu search combined with a genetic algorithm (NPTSGA), whereby the global search ability of niched Pareto tabu search (NPTS) is improved by the diversification of candidate solutions arising from the evolving nondominated sorting genetic algorithm II (NSGA-II) population. The NPTSGA, coupled with the commonly used groundwater flow and transport codes MODFLOW and MT3DMS, is developed for multi-objective optimal design of groundwater remediation systems. The proposed methodology is then applied to a large-scale field groundwater remediation system for cleanup of a large trichloroethylene (TCE) plume at the Massachusetts Military Reservation (MMR) in Cape Cod, Massachusetts. Furthermore, a master-slave (MS) parallelization scheme based on the Message Passing Interface (MPI) is incorporated into the NPTSGA to implement objective function evaluations in a distributed processor environment, which can greatly improve the efficiency of the NPTSGA in finding Pareto-optimal solutions to the real-world application. This study shows that the MS parallel NPTSGA, in comparison with the original NPTS and NSGA-II, can balance the tradeoff between diversity and optimality of solutions during the search process and is an efficient and effective tool for optimizing the multi-objective design of groundwater remediation systems under complicated hydrogeologic conditions.
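As a small illustration of the Pareto concepts underlying NPTSGA (a generic sketch, not the authors' code), the snippet below filters a set of candidate remediation designs down to the nondominated front for two minimization objectives; the cost and residual-mass numbers are invented.

    import numpy as np

    def pareto_filter(objs):
        """Return indices of nondominated points for a minimization problem.

        objs: (n_solutions, n_objectives) array, e.g. [cost, residual TCE mass].
        A point is dominated if another is no worse in every objective and
        strictly better in at least one."""
        n = objs.shape[0]
        keep = np.ones(n, dtype=bool)
        for i in range(n):
            if not keep[i]:
                continue
            dominates = np.all(objs <= objs[i], axis=1) & np.any(objs < objs[i], axis=1)
            if dominates.any():
                keep[i] = False
        return np.flatnonzero(keep)

    # Hypothetical remediation designs: (cost in M$, residual plume mass in kg)
    designs = np.array([[3.0, 120.0], [4.5, 60.0], [4.0, 90.0],
                        [5.0, 95.0], [6.0, 55.0]])
    print(pareto_filter(designs))   # -> indices of the Pareto-optimal designs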
Global optimization framework for solar building design
NASA Astrophysics Data System (ADS)
Silva, N.; Alves, N.; Pascoal-Faria, P.
2017-07-01
The generative modeling paradigm is a shift from static models to flexible models. It describes a modeling process using functions, methods and operators. The result is an algorithmic description of the construction process. Each evaluation of such an algorithm creates a model instance, which depends on its input parameters (width, height, volume, roof angle, orientation, location). These values are normally chosen according to aesthetic aspects and style. In this study, the model's parameters are automatically generated according to an objective function. A generative model can be optimized according to its parameters; in this way, the best solution for a constrained problem is determined. Besides the establishment of an overall framework design, this work consists of the identification of different building shapes and their main parameters, the creation of an algorithmic description for these main shapes, and the formulation of the objective function with respect to a building's energy consumption (solar energy, heating and insulation). Additionally, the conception of an optimization pipeline combining an energy calculation tool with a geometric scripting engine is presented. The methods developed lead to an automated and optimized 3D shape generation for the projected building (based on the desired conditions and according to specific constraints). The proposed approach will help in the construction of real buildings that account for less energy consumption and for a more sustainable world.
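A hedged sketch of the generative-model-plus-optimizer pipeline described above (all geometry parameters, coefficients and the energy balance are invented placeholders, not the authors' objective function): a parametric box building is scored by a toy energy objective and its parameters are tuned by a general-purpose optimizer.

    import numpy as np
    from scipy.optimize import minimize

    def building_energy(params):
        """Toy annual energy balance of a box-shaped building.

        params = (width, depth, height, orientation): the generative model's inputs.
        All coefficients are illustrative placeholders."""
        w, d, h, theta = params
        envelope = 2.0*h*(w + d) + w*d                  # wall + roof area
        south_facade = w*h*max(np.cos(theta), 0.0)      # area effectively facing the sun
        heating_loss = 0.8*envelope                     # transmission losses
        solar_gain = 1.5*south_facade                   # useful passive gain
        volume_penalty = 50.0*abs(w*d*h - 600.0)        # require ~600 m^3 usable volume
        return heating_loss - solar_gain + volume_penalty

    res = minimize(building_energy, x0=np.array([10.0, 10.0, 6.0, 0.3]),
                   method='Nelder-Mead')
    w, d, h, theta = res.x
    print(f"optimized shape: {w:.1f} x {d:.1f} x {h:.1f} m, "
          f"orientation {np.degrees(theta):.0f} deg")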
DOE Office of Scientific and Technical Information (OSTI.GOV)
Durfee, Justin David; Frazier, Christopher Rawls; Bandlow, Alisa
2016-05-01
The Contingency Contractor Optimization Tool - Prototype (CCOT-P) requires several third-party software packages. These are documented below for each of the CCOT-P elements: client, web server, database server, solver, web application and polling application.
Zeidán-Chuliá, Fares; Gürsoy, Mervi; Neves de Oliveira, Ben-Hur; Özdemir, Vural; Könönen, Eija; Gürsoy, Ulvi K
2015-01-01
Periodontitis, a formidable global health burden, is a common chronic disease that destroys tooth-supporting tissues. Biomarkers of the early phase of this progressive disease are of utmost importance for global health. In this context, saliva represents a non-invasive biosample. By using systems biology tools, we aimed to (1) identify an integrated interactome between matrix metalloproteinase (MMP)-REDOX/nitric oxide (NO) and apoptosis upstream pathways of periodontal inflammation, and (2) characterize the attendant topological network properties to uncover putative biomarkers to be tested in saliva from patients with periodontitis. Hence, we first generated a protein-protein network model of interactions ("BIOMARK" interactome) by using the STRING 10 database, a search tool for the retrieval of interacting genes/proteins, with "Experiments" and "Databases" as input options and a confidence score of 0.400. Second, we determined the centrality values (closeness, stress, degree or connectivity, and betweenness) for the "BIOMARK" members by using the Cytoscape software. We found Ubiquitin C (UBC), Jun proto-oncogene (JUN), and matrix metalloproteinase-14 (MMP14) as the most central hub- and non-hub-bottlenecks among the 211 genes/proteins of the whole interactome. We conclude that UBC, JUN, and MMP14 are likely an optimal candidate group of host-derived biomarkers, in combination with oral pathogenic bacteria-derived proteins, for detecting periodontitis at its early phase by using salivary samples from patients. These findings therefore have broader relevance for systems medicine in global health as well.
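To illustrate the centrality-based ranking step described above, here is a generic sketch using networkx with a toy edge list rather than the actual STRING-derived "BIOMARK" interactome; nodes other than UBC, JUN and MMP14 are arbitrary examples, not the study's data.

    import networkx as nx

    # Toy stand-in for the "BIOMARK" interactome; extra nodes are arbitrary examples
    edges = [("UBC", "JUN"), ("UBC", "MMP14"), ("UBC", "TP53"), ("JUN", "FOS"),
             ("JUN", "MMP14"), ("MMP14", "TIMP2"), ("TP53", "FOS"), ("TIMP2", "MMP2")]
    g = nx.Graph(edges)

    degree = dict(g.degree())                       # hubs: many direct partners
    betweenness = nx.betweenness_centrality(g)      # bottlenecks: on many shortest paths
    closeness = nx.closeness_centrality(g)

    # Rank candidates: high degree plus high betweenness marks a hub-bottleneck
    for node in sorted(g, key=lambda n: (betweenness[n], degree[n]), reverse=True):
        print(f"{node:6s} degree={degree[node]} betw={betweenness[node]:.2f} "
              f"close={closeness[node]:.2f}")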
PS-FW: A Hybrid Algorithm Based on Particle Swarm and Fireworks for Global Optimization
Chen, Shuangqing; Wei, Lixin; Guan, Bing
2018-01-01
Particle swarm optimization (PSO) and fireworks algorithm (FWA) are two recently developed optimization methods which have been applied in various areas due to their simplicity and efficiency. However, when being applied to high-dimensional optimization problems, PSO algorithm may be trapped in the local optima owing to the lack of powerful global exploration capability, and fireworks algorithm is difficult to converge in some cases because of its relatively low local exploitation efficiency for noncore fireworks. In this paper, a hybrid algorithm called PS-FW is presented, in which the modified operators of FWA are embedded into the solving process of PSO. In the iteration process, the abandonment and supplement mechanism is adopted to balance the exploration and exploitation ability of PS-FW, and the modified explosion operator and the novel mutation operator are proposed to speed up the global convergence and to avoid prematurity. To verify the performance of the proposed PS-FW algorithm, 22 high-dimensional benchmark functions have been employed, and it is compared with PSO, FWA, stdPSO, CPSO, CLPSO, FIPS, Frankenstein, and ALWPSO algorithms. Results show that the PS-FW algorithm is an efficient, robust, and fast converging optimization method for solving global optimization problems. PMID:29675036
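For reference, the following is a plain PSO core of the kind PS-FW builds on: a minimal sketch without the fireworks operators, the abandonment and supplement mechanism, or the modified explosion and mutation operators; bounds and coefficients are illustrative choices.

    import numpy as np

    def pso_minimize(f, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
        """Plain PSO core (the fireworks-based modifications are omitted here)."""
        rng = np.random.default_rng(0)
        x = rng.uniform(-5.0, 5.0, (n_particles, dim))
        v = np.zeros_like(x)
        pbest = x.copy()
        pbest_val = np.apply_along_axis(f, 1, x)
        gbest = pbest[pbest_val.argmin()].copy()
        for _ in range(iters):
            r1, r2 = rng.random(x.shape), rng.random(x.shape)
            # inertia + pull toward personal best + pull toward global best
            v = w*v + c1*r1*(pbest - x) + c2*r2*(gbest - x)
            x = x + v
            vals = np.apply_along_axis(f, 1, x)
            improved = vals < pbest_val
            pbest[improved] = x[improved]
            pbest_val[improved] = vals[improved]
            gbest = pbest[pbest_val.argmin()].copy()
        return gbest, pbest_val.min()

    sphere = lambda z: float(np.sum(z**2))
    best_x, best_f = pso_minimize(sphere, dim=10)
    print(f"best value found: {best_f:.2e}")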
Organizational Readiness Tools for Global Health Intervention: A Review
Dearing, James W.
2018-01-01
The ability of non-governmental organizations, government agencies, and corporations to deliver and support the availability and use of interventions for improved global public health depends on their readiness to do so. Yet readiness has proven to be a rather fluid concept in global public health, perhaps due to its multidimensional nature and because scholars and practitioners have applied the concept at different levels such as the individual, organization, and community. This review concerns 30 publicly available tools created for the purpose of organizational readiness assessment in order to carry out global public health objectives. Results suggest that these tools assess organizational capacity in the absence of measuring organizational motivation, thus overlooking a key aspect of organizational readiness. Moreover, the tools reviewed are mostly untested by their developers to establish whether the tools do, in fact, measure capacity. These results suggest opportunities for implementation science researchers. PMID:29552552
Meng, Qing-chun; Rong, Xiao-xia; Zhang, Yi-min; Wan, Xiao-le; Liu, Yuan-yuan; Wang, Yu-zhi
2016-01-01
CO2 emission influences not only global climate change but also international economic and political situations. Thus, reducing the emission of CO2, a major greenhouse gas, has become a major issue in China and around the world as regards preserving the environmental ecology. Energy consumption from coal, oil, and natural gas is primarily responsible for the production of greenhouse gases and air pollutants such as SO2 and NOX, which are the main air pollutants in China. In this study, a mathematical multi-objective optimization method was adopted to analyze the collaborative emission reduction of three kinds of gases on the basis of their common restraints in different ways of energy consumption to develop an economic, clean, and efficient scheme for energy distribution. The first part introduces the background research, the collaborative emission reduction for three kinds of gases, the multi-objective optimization, the main mathematical modeling, and the optimization method. The second part discusses the four mathematical tools utilized in this study, which include the Granger causality test to analyze the causality between air quality and pollutant emission, a function analysis to determine the quantitative relation between energy consumption and pollutant emission, a multi-objective optimization to set up the collaborative optimization model that considers energy consumption, and an optimality condition analysis for the multi-objective optimization model to design the optimal-pole algorithm and obtain an efficient collaborative reduction scheme. In the empirical analysis, the data of pollutant emission and final consumption of energies of Tianjin in 1996-2012 was employed to verify the effectiveness of the model and analyze the efficient solution and the corresponding dominant set. In the last part, several suggestions for collaborative reduction are recommended and the drawn conclusions are stated.
Watershed Management Optimization Support Tool (WMOST) v3: User Guide
The Watershed Management Optimization Support Tool (WMOST) is a decision support tool that facilitates integrated water management at the local or small watershed scale. WMOST models the environmental effects and costs of management decisions in a watershed context that is, accou...
Watershed Management Optimization Support Tool (WMOST) v3: Theoretical Documentation
The Watershed Management Optimization Support Tool (WMOST) is a decision support tool that facilitates integrated water management at the local or small watershed scale. WMOST models the environmental effects and costs of management decisions in a watershed context, accounting fo...
Watershed Management Optimization Support Tool (WMOST) v2: Theoretical Documentation
The Watershed Management Optimization Support Tool (WMOST) is a decision support tool that evaluates the relative cost-effectiveness of management practices at the local or watershed scale. WMOST models the environmental effects and costs of management decisions in a watershed c...
Outlook: directed development: catalysing a global biotech industry.
Sun, Anthony; Perkins, Tom
2005-09-01
Governments are increasingly relying on directed development tools or proactive public-policy approaches to stimulate scientific and economic development for their biotechnology industries. This article will discuss the four main tools of directed development in biotechnology and the lessons learned from current global efforts utilizing these tools.
Global mHealth policy arena: status check and future directions
Malvey, Donna M; Slovensky, Donna J
2017-01-01
In this review, we examine an important piece of the mHealth puzzle that has received scant attention—health policy. The question is whether health policy ultimately will serve to unite nations in advancing global mHealth or, as Mars and Scott suggested in 2010, keep nations isolated and ultimately making their policy decisions in “eHealth silos”. Such a non-collaborative approach seriously hampers the potential for using mobile health technologies to deliver health care across borders, assuring individuals access to affordable, convenient, and quality healthcare in underserved regions. From a global perspective, mHealth policy review is difficult as some important policies may be subsumed in comprehensive planning and strategy documents. Political, environmental, economic, organizational, and technology disparities across nations represent a significant impediment to developing mHealth products and services that can be deployed globally. To date, there is modest evidence that such challenges are being addressed. Even though payers can encourage adoption of mHealth with financial incentives for use, it appears that payment or reimbursement tends to be a roadblock for almost all nations, whether they are emerging or developed. If payment for mHealth services is not guaranteed, business models will not be sustainable and providers will have fewer opportunities for scalability. Furthermore, because mHealth policies typically are subject to some type of government scrutiny and oversight, many product developers and entrepreneurs may turn elsewhere for their investments. Global resource scarcity also challenges optimal mHealth deployment, and governments seek to ensure improved population health outcomes as return on their mHealth investments. Unfortunately, such justification is difficult as evaluation methods simply have not kept pace with mHealth technology capability. Requisite measurement tools are sorely lacking when it comes to evaluating efficacy of mHealth interventions, due in part to insufficient research to inform development of needed measurement tools. Because most robust mHealth research trials have been conducted in the developed world with its impressive technology infrastructure and not in developing nations where the health needs are greatest, evaluation of mobile technology intervention from a global perspective tends to be insufficient to inform policy decisions. PMID:29184893
NASA Astrophysics Data System (ADS)
Thomson, A. M.; Izaurralde, R. C.; Calvin, K.; Zhang, X.; Wise, M.; West, T. O.
2010-12-01
Climate change and food security are global issues increasingly linked through human decision making that takes place across all scales from on-farm management actions to international climate negotiations. Understanding how agricultural systems can respond to climate change, through mitigation or adaptation, while still supplying sufficient food to feed a growing global population, thus requires a multi-sector tool in a global economic framework. Integrated assessment models are one such tool, however they are typically driven by historical aggregate statistics of production in combination with exogenous assumptions of future trends in agricultural productivity; they are not yet capable of exploring agricultural management practices as climate adaptation or mitigation strategies. Yet there are agricultural models capable of detailed biophysical modeling of farm management and climate impacts on crop yield, soil erosion and C and greenhouse gas emissions, although these are typically applied at point scales that are incompatible with coarse resolution integrated assessment modeling. To combine the relative strengths of these modeling systems, we are using the agricultural model EPIC (Environmental Policy Integrated Climate), applied in a geographic data framework for regional analyses, to provide input to the global economic model GCAM (Global Change Assessment Model). The initial phase of our approach focuses on a pilot region of the Midwest United States, a highly productive agricultural area. We apply EPIC, a point based biophysical process model, at 60 m spatial resolution within this domain and aggregate the results to GCAM agriculture and land use subregions for the United States. GCAM is then initialized with multiple management options for key food and bioenergy crops. Using EPIC to distinguish these management options based on grain yield, residue yield, soil C change and cost differences, GCAM then simulates the optimum distribution of the available management options to meet demands for food and energy over the next century. The coupled models provide a new platform for evaluating future changes in agricultural management based on food demand, bioenergy demand, and changes in crop yield and soil C under a changing climate. This framework can be applied to evaluate the economically and biophysically optimal distribution of management under future climates.
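The EPIC-to-GCAM hand-off described above amounts to aggregating fine-resolution cell outputs up to economic subregions. A minimal sketch with invented data (subregion names, areas, yields and soil-C numbers are all hypothetical), using area-weighted means:

    import numpy as np
    import pandas as pd

    # Hypothetical EPIC output: one row per grid cell, tagged with its GCAM subregion
    cells = pd.DataFrame({
        "subregion":     ["corn_belt", "corn_belt", "lake_states", "lake_states"],
        "area_ha":       [120.0, 80.0, 150.0, 50.0],
        "grain_yield":   [9.2, 8.4, 7.1, 6.5],      # t/ha under one management option
        "soil_c_change": [0.12, 0.08, 0.20, 0.15],  # t C/ha/yr
    })

    # Area-weighted means per subregion become GCAM inputs for this management option
    for name, df in cells.groupby("subregion"):
        w = df["area_ha"]
        print(name,
              "yield:", round(np.average(df["grain_yield"], weights=w), 2),
              "soil C:", round(np.average(df["soil_c_change"], weights=w), 3),
              "area:", w.sum())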
Simultaneous optimization of micro-heliostat geometry and field layout using a genetic algorithm
NASA Astrophysics Data System (ADS)
Lazardjani, Mani Yousefpour; Kronhardt, Valentina; Dikta, Gerhard; Göttsche, Joachim
2016-05-01
A new optimization tool for micro-heliostat (MH) geometry and field layout is presented. The method aims at simultaneous performance improvement and cost reduction through iteration of heliostat geometry and field layout parameters. This tool was developed primarily for the optimization of a novel micro-heliostat concept developed at Solar-Institut Jülich (SIJ); however, the underlying optimization approach can be used for any heliostat type. During the optimization, the performance is calculated using the ray-tracing tool SolCal. The costs of the heliostats are calculated by use of a detailed cost function. A genetic algorithm is used to change heliostat geometry and field layout in an iterative process. Starting from an initial setup, the optimization tool generates several configurations of heliostat geometries and field layouts. For each configuration a cost-performance ratio is calculated. Based on that, the best geometry and field layout can be selected in each optimization step. In order to find the best configuration, this step is repeated until no significant improvement in the results is observed.
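A hedged sketch of the iterative loop described above, not the SIJ tool itself: a toy genetic-algorithm iteration in which candidate (geometry, layout) configurations are scored by a cost-performance ratio, the best are kept, and mutated children refill the population. The performance and cost functions below are invented placeholders for the SolCal ray tracer and the detailed cost function.

    import numpy as np

    rng = np.random.default_rng(1)

    def performance(cfg):
        """Stand-in for the ray-tracing result (annual optical output)."""
        facet_size, spacing = cfg
        return 100.0*facet_size/(1.0 + 0.3*spacing) - 4.0*facet_size**2

    def cost(cfg):
        """Stand-in for the detailed cost function."""
        facet_size, spacing = cfg
        return 20.0 + 15.0*facet_size + 2.0/max(spacing, 0.1)

    def ratio(cfg):
        return cost(cfg)/max(performance(cfg), 1e-9)   # cost-performance ratio

    pop = rng.uniform([0.5, 0.5], [3.0, 3.0], size=(20, 2))   # (geometry, layout) genomes
    for gen in range(50):
        elite = pop[np.argsort([ratio(c) for c in pop])[:10]]  # keep the best ratios
        children = elite + rng.normal(0.0, 0.1, elite.shape)   # mutate around the elite
        pop = np.clip(np.vstack([elite, children]), 0.5, 3.0)
    best = min(pop, key=ratio)
    print("best (facet size, spacing):", np.round(best, 2))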
2011-01-01
Background: Design of newly engineered microbial strains for biotechnological purposes would greatly benefit from the development of realistic mathematical models for the processes to be optimized. Such models can then be analyzed and, with the development and application of appropriate optimization techniques, one could identify the modifications that need to be made to the organism in order to achieve the desired biotechnological goal. As appropriate models to perform such an analysis are necessarily non-linear and typically non-convex, finding their global optimum is a challenging task. Canonical modeling techniques, such as Generalized Mass Action (GMA) models based on the power-law formalism, offer a possible solution to this problem because they have a mathematical structure that enables the development of specific algorithms for global optimization.
Results: Based on the GMA canonical representation, we have developed in previous works a highly efficient optimization algorithm and a set of related strategies for understanding the evolution of adaptive responses in cellular metabolism. Here, we explore the possibility of recasting kinetic non-linear models into an equivalent GMA model, so that global optimization on the recast GMA model can be performed. With this technique, optimization is greatly facilitated and the results are transposable to the original non-linear problem. This procedure is straightforward for a particular class of non-linear models known as Saturable and Cooperative (SC) models that extend the power-law formalism to deal with saturation and cooperativity.
Conclusions: Our results show that recasting non-linear kinetic models into GMA models is indeed an appropriate strategy that helps to overcome some of the numerical difficulties that arise during the global optimization task. PMID:21867520
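In the power-law formalism underlying GMA models, the kinetic order of a rate with respect to a metabolite is the log-log sensitivity g = ∂ln v/∂ln S at the chosen operating point, and the rate constant follows from matching the rate there. A minimal numeric sketch for a Michaelis-Menten rate, a simple saturable kinetic law with illustrative parameter values (this is the standard power-law approximation, not the authors' recasting procedure):

    import numpy as np

    def mm_rate(S, Vmax=10.0, Km=2.0):
        """Michaelis-Menten rate: a simple saturable kinetic law."""
        return Vmax*S/(Km + S)

    def kinetic_order(rate, S0, eps=1e-6):
        """Power-law kinetic order g = d ln v / d ln S at operating point S0."""
        up, dn = rate(S0*(1.0 + eps)), rate(S0*(1.0 - eps))
        return (np.log(up) - np.log(dn))/(2.0*eps)   # central difference in log space

    S0 = 1.0
    g = kinetic_order(mm_rate, S0)
    alpha = mm_rate(S0)/S0**g                        # match the rate at S0: v = alpha*S^g
    print(f"power-law term at S0: v ~ {alpha:.3f} * S^{g:.3f}")
    print("analytic kinetic order Km/(Km+S0):", 2.0/(2.0 + 1.0))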
Research on particle swarm optimization algorithm based on optimal movement probability
NASA Astrophysics Data System (ADS)
Ma, Jianhong; Zhang, Han; He, Baofeng
2017-01-01
The particle swarm optimization (PSO) algorithm can improve control precision and has great application value in fields such as neural network training and fuzzy system control. When the traditional particle swarm algorithm is used to train feed-forward neural networks, its search efficiency is low and it easily falls into local convergence. An improved particle swarm optimization algorithm based on error back-propagation gradient descent is therefore proposed. Particles are ranked by fitness so that the optimization problem is considered as a whole, and error back-propagation gradient descent is used to train the BP neural network. Each particle updates its velocity and position according to its individual optimum and the global optimum; by making particles learn more from the social optimum and less from their individual optimum, the algorithm keeps particles from falling into local optima, while the gradient information accelerates the local search ability of PSO and improves search efficiency. Simulation results show that the algorithm converges rapidly toward the global optimal solution in the initial stage and then stays close to it; within the same running time it achieves faster convergence and better search performance, in particular improving search efficiency in the later stages.
Wehmeyer, Christoph; Falk von Rudorff, Guido; Wolf, Sebastian; Kabbe, Gabriel; Schärf, Daniel; Kühne, Thomas D; Sebastiani, Daniel
2012-11-21
We present a stochastic, swarm intelligence-based optimization algorithm for the prediction of global minima on potential energy surfaces of molecular cluster structures. Our optimization approach is a modification of the artificial bee colony (ABC) algorithm which is inspired by the foraging behavior of honey bees. We apply our modified ABC algorithm to the problem of global geometry optimization of molecular cluster structures and show its performance for clusters with 2-57 particles and different interatomic interaction potentials.
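A bare-bones sketch in the spirit of the approach above, not the authors' modified ABC: an artificial-bee-colony-style loop (employed and scout phases only; the onlooker phase is omitted) searching for low-energy Lennard-Jones cluster geometries. Population sizes, bounds and iteration counts are arbitrary choices.

    import numpy as np

    rng = np.random.default_rng(0)

    def lj_energy(x):
        """Lennard-Jones energy of a cluster; x holds flattened 3-D coordinates."""
        p = x.reshape(-1, 3)
        iu = np.triu_indices(len(p), k=1)
        r = np.linalg.norm(p[:, None] - p[None, :], axis=-1)[iu]
        return float(np.sum(4.0*(r**-12 - r**-6)))

    def abc_minimize(n_atoms=5, n_sources=20, iters=300, limit=30):
        dim = 3*n_atoms
        food = rng.uniform(-1.5, 1.5, (n_sources, dim))   # candidate geometries
        fit = np.array([lj_energy(f) for f in food])
        stale = np.zeros(n_sources, dtype=int)
        for _ in range(iters):
            for i in range(n_sources):
                j = rng.integers(n_sources)               # random partner source
                trial = food[i] + rng.uniform(-1.0, 1.0, dim)*(food[i] - food[j])
                e = lj_energy(trial)
                if e < fit[i]:
                    food[i], fit[i], stale[i] = trial, e, 0   # greedy replacement
                else:
                    stale[i] += 1
                if stale[i] > limit:                      # scout: abandon stale source
                    food[i] = rng.uniform(-1.5, 1.5, dim)
                    fit[i], stale[i] = lj_energy(food[i]), 0
        return fit.min()

    # The known LJ-5 global minimum is about -9.104; a simple run typically lands
    # near, though not necessarily at, this value.
    print(f"best LJ-5 energy found: {abc_minimize():.3f}")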
NASA Astrophysics Data System (ADS)
Pérez-Jordán, G.; Castro-Almazán, J. A.; Muñoz-Tuñón, C.
2018-04-01
We validate the Weather Research and Forecasting (WRF) model for precipitable water vapour (PWV) forecasting as a fully operational tool for optimizing astronomical infrared (IR) observations at Roque de los Muchachos Observatory (ORM). For the model validation we used GNSS-based (Global Navigation Satellite System) data from the PWV monitor located at the ORM. We have run WRF every 24 h for nearly two months, with a horizon of 48 hours (hourly forecasts), from 2016 January 11 to 2016 March 4. These runs represent 1296 hourly forecast points. The validation is carried out using different approaches: performance as a function of the forecast range, time horizon accuracy, performance as a function of the PWV value, and performance of the operational WRF time series with 24- and 48-hour horizons. Excellent agreement was found between the model forecasts and observations, with R = 0.951 and R = 0.904 for the 24- and 48-h forecast time series respectively. The 48-h forecast was further improved by correcting a time lag of 2 h found in the predictions. The final errors, taking into account all the uncertainties involved, are 1.75 mm for the 24-h forecasts and 1.99 mm for 48 h. We found linear trends in both the correlation and RMSE of the residuals (measurements - forecasts) as a function of the forecast range within the horizons analysed (up to 48 h). In summary, the WRF performance is excellent and accurate, thus allowing it to be implemented as an operational tool at the ORM.
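The validation statistics described above (correlation, RMSE, and a lag correction) can be sketched generically as follows; the series below are synthetic stand-ins for the WRF forecasts and GNSS PWV measurements, with an artificial 2-h shift injected to mimic the lag reported in the paper.

    import numpy as np

    def validate(forecast, observed):
        """Correlation and RMSE between a forecast and a measured PWV series."""
        r = np.corrcoef(forecast, observed)[0, 1]
        rmse = np.sqrt(np.mean((observed - forecast)**2))
        return r, rmse

    def best_shift(forecast, observed, max_lag=6):
        """Circular shift (hours) of the forecast that maximizes correlation."""
        lags = list(range(-max_lag, max_lag + 1))
        scores = [np.corrcoef(np.roll(forecast, k), observed)[0, 1] for k in lags]
        return lags[int(np.argmax(scores))]

    # Synthetic stand-ins with an injected 2-h lag (circular shift for simplicity)
    t = np.arange(200)
    rng = np.random.default_rng(3)
    clean = 6.0 + 2.0*np.sin(t/10.0)
    observed = clean + rng.normal(0.0, 0.3, t.size)
    forecast = np.roll(clean, 2) + rng.normal(0.0, 0.3, t.size)

    k = best_shift(forecast, observed)
    r, rmse = validate(np.roll(forecast, k), observed)
    print(f"alignment shift: {k} h, corrected R={r:.3f}, RMSE={rmse:.2f} mm")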
Optimization of Compressor Mounting Bracket of a Passenger Car
NASA Astrophysics Data System (ADS)
Kalsi, Sachin; Singh, Daljeet; Saini, J. S.
2018-05-01
In the present work, CAE tools are used for the optimization of the compressor mounting bracket used in an automobile. Both static and dynamic analyses are performed for the bracket. With the objective of minimizing the mass and increasing the stiffness of the bracket, the new design is optimized using shape and topology optimization techniques. The optimized design given by the CAE tool is then validated experimentally. The new design results in lower vibration levels, lower mass and lower cost, which benefits the air-conditioning system as well as the efficiency of the vehicle. The results given by the CAE tool showed a very good correlation with the experimental results.
Computational tool for optimizing the essential oils utilization in inhibiting the bacterial growth
El-Attar, Noha E; Awad, Wael A
2017-01-01
Day after day, the importance of relying on nature in many fields such as the food, medical, and pharmaceutical industries is increasing. Essential oils (EOs) are considered one of the most significant natural products for use as antimicrobials, antioxidants, antitumorals, and anti-inflammatories. Optimizing the usage of EOs is a big challenge faced by scientific researchers because of the complexity of the chemical composition of every EO, in addition to the difficulty of determining which is best at inhibiting bacterial activity. The goal of this article is to present a new computational tool based on two methodologies: reduction using rough sets and optimization with particle swarm optimization. The developed tool, dubbed the Essential Oil Reduction and Optimization Tool, is applied to 24 types of EOs that have been tested against 17 different species of bacteria. PMID:28919787
Optimization of global model composed of radial basis functions using the term-ranking approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cai, Peng; Tao, Chao, E-mail: taochao@nju.edu.cn; Liu, Xiao-Jun
2014-03-15
A term-ranking method is put forward to optimize the global model composed of radial basis functions and thereby improve its predictability. The effectiveness of the proposed method is examined by numerical simulation and experimental data. Numerical simulations indicate that this method can significantly lengthen the prediction time and decrease the Bayesian information criterion of the model. The application to a real voice signal shows that the optimized global model can capture more of the predictable component in chaos-like voice data and simultaneously reduce the predictable component (periodic pitch) in the residual signal.
NASA Technical Reports Server (NTRS)
Yamaleev, N. K.; Diskin, B.; Nielsen, E. J.
2009-01-01
We study local-in-time adjoint-based methods for minimization of flow matching functionals subject to the 2-D unsteady compressible Euler equations. The key idea of the local-in-time method is to construct a very accurate approximation of the global-in-time adjoint equations and the corresponding sensitivity derivative by using only local information available on each time subinterval. In contrast to conventional time-dependent adjoint-based optimization methods which require backward-in-time integration of the adjoint equations over the entire time interval, the local-in-time method solves local adjoint equations sequentially over each time subinterval. Since each subinterval contains relatively few time steps, the storage cost of the local-in-time method is much lower than that of the global adjoint formulation, thus making the time-dependent optimization feasible for practical applications. The paper presents a detailed comparison of the local- and global-in-time adjoint-based methods for minimization of a tracking functional governed by the Euler equations describing the flow around a circular bump. Our numerical results show that the local-in-time method converges to the same optimal solution obtained with the global counterpart, while drastically reducing the memory cost as compared to the global-in-time adjoint formulation.
Computational Tools and Algorithms for Designing Customized Synthetic Genes
Gould, Nathan; Hendy, Oliver; Papamichail, Dimitris
2014-01-01
Advances in DNA synthesis have enabled the construction of artificial genes, gene circuits, and genomes of bacterial scale. Freedom in de novo design of synthetic constructs provides significant power in studying the impact of mutations in sequence features, and verifying hypotheses on the functional information that is encoded in nucleic and amino acids. To aid this goal, a large number of software tools of variable sophistication have been implemented, enabling the design of synthetic genes for sequence optimization based on rationally defined properties. The first generation of tools dealt predominantly with singular objectives such as codon usage optimization and unique restriction site incorporation. Recent years have seen the emergence of sequence design tools that aim to evolve sequences toward combinations of objectives. The design of optimal protein-coding sequences adhering to multiple objectives is computationally hard, and most tools rely on heuristics to sample the vast sequence design space. In this review, we study some of the algorithmic issues behind gene optimization and the approaches that different tools have adopted to redesign genes and optimize desired coding features. We utilize test cases to demonstrate the efficiency of each approach, as well as identify their strengths and limitations. PMID:25340050
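As a concrete example of the first-generation, singular-objective tools mentioned above, here is a minimal codon-usage optimization sketch: each residue is back-translated to the most frequent host codon. The usage table is truncated to a few amino acids and its frequencies are illustrative, not a real organism's table.

    # Minimal codon-usage optimization: pick the most frequent host codon per residue.
    CODON_USAGE = {          # hypothetical host usage table, truncated for brevity
        "M": {"ATG": 1.00},
        "K": {"AAA": 0.74, "AAG": 0.26},
        "F": {"TTT": 0.58, "TTC": 0.42},
        "L": {"CTG": 0.47, "TTA": 0.14, "TTG": 0.13, "CTT": 0.12, "CTC": 0.10, "CTA": 0.04},
        "*": {"TAA": 0.61, "TGA": 0.30, "TAG": 0.09},
    }

    def optimize_codons(protein):
        """Return a coding sequence using the most frequent codon for each residue."""
        best = {aa: max(table, key=table.get) for aa, table in CODON_USAGE.items()}
        return "".join(best[aa] for aa in protein)

    print(optimize_codons("MKFL*"))   # -> ATGAAATTTCTGTAA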
Object-Oriented Multi-Disciplinary Design, Analysis, and Optimization Tool
NASA Technical Reports Server (NTRS)
Pak, Chan-gi
2011-01-01
An Object-Oriented Optimization (O3) tool was developed that leverages existing tools and practices, and allows the easy integration and adoption of new state-of-the-art software. At the heart of the O3 tool is the Central Executive Module (CEM), which can integrate disparate software packages in a cross platform network environment so as to quickly perform optimization and design tasks in a cohesive, streamlined manner. This object-oriented framework can integrate the analysis codes for multiple disciplines instead of relying on one code to perform the analysis for all disciplines. The CEM was written in FORTRAN and the script commands for each performance index were submitted through the use of the FORTRAN Call System command. In this CEM, the user chooses an optimization methodology, defines objective and constraint functions from performance indices, and provides starting and side constraints for continuous as well as discrete design variables. The structural analysis modules such as computations of the structural weight, stress, deflection, buckling, and flutter and divergence speeds have been developed and incorporated into the O3 tool to build an object-oriented Multidisciplinary Design, Analysis, and Optimization (MDAO) tool.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Frazier, Christopher Rawls; Durfee, Justin David; Bandlow, Alisa
The Contingency Contractor Optimization Tool – Prototype (CCOT-P) database is used to store input and output data for the linear program model described in [1]. The database allows queries to retrieve this data and supports updating and inserting new input data.
C. elegans network biology: a beginning.
Piano, Fabio; Gunsalus, Kristin C; Hill, David E; Vidal, Marc
2006-01-01
The architecture and dynamics of molecular networks can provide an understanding of complex biological processes complementary to that obtained from the in-depth study of single genes and proteins. With a completely sequenced and well-annotated genome, a fully characterized cell lineage, and powerful tools available to dissect development, Caenorhabditis elegans, among metazoans, provides an optimal system to bridge cellular and organismal biology with the global properties of macromolecular networks. This chapter considers omic technologies available for C. elegans to describe molecular networks--encompassing transcriptional and phenotypic profiling as well as physical interaction mapping--and discusses how their individual and integrated applications are paving the way for a network-level understanding of C. elegans biology. PMID:18050437
Power System Simulations For The Globalstar2 Mission Using The PowerCap Software
NASA Astrophysics Data System (ADS)
Defoug, S.; Pin, R.
2011-10-01
The Globalstar system aims to enable customers to communicate all around the world thanks to its constellation of 48 LEO satellites. Thales Alenia Space is in charge of the design and manufacturing of the second generation of the Globalstar satellites. For such a long duration mission (15 years), and with a payload power consumption varying incessantly, the optimization of the solar arrays and battery has to be consolidated by an accurate power simulation tool. After a general overview of the Globalstar power system and of the PowerCap software, this paper presents the dedicated version elaborated for the GlobalStar2 mission, the simulation results and their correlation with the tests.
NASA Astrophysics Data System (ADS)
Sivarami Reddy, N.; Ramamurthy, D. V., Dr.; Prahlada Rao, K., Dr.
2017-08-01
This article addresses simultaneous scheduling of machines, AGVs and tools, where machines are allowed to share the tools, considering transfer times of jobs and tools between machines, to generate the best optimal sequences that minimize makespan in a multi-machine Flexible Manufacturing System (FMS). The performance of an FMS is expected to improve by effective utilization of its resources, through proper integration and synchronization of their scheduling. The Symbiotic Organisms Search (SOS) algorithm is a potent tool that has proven itself a better alternative for solving optimization problems like scheduling. The proposed SOS algorithm is tested on 22 job sets with makespan as the objective for scheduling of machines and tools, where machines are allowed to share tools without considering transfer times of jobs and tools, and the results are compared with those of existing methods. The results show that SOS outperforms them. The same SOS algorithm is then used for simultaneous scheduling of machines, AGVs and tools, where machines are allowed to share tools, considering transfer times of jobs and tools, to determine the best optimal sequences that minimize makespan.
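For illustration, here is a mutualism-only sketch of the SOS metaheuristic named above (the commensalism and parasitism phases are omitted, and it is applied to a toy continuous objective rather than the FMS scheduling encoding, which would require a permutation representation):

    import numpy as np

    rng = np.random.default_rng(0)

    def sos_minimize(f, dim, pop_size=30, iters=300):
        """Mutualism-only sketch of Symbiotic Organisms Search."""
        eco = rng.uniform(-5.0, 5.0, (pop_size, dim))          # the "ecosystem"
        fit = np.array([f(x) for x in eco])
        for _ in range(iters):
            for i in range(pop_size):
                j = rng.choice([k for k in range(pop_size) if k != i])
                best = eco[fit.argmin()]
                mutual = 0.5*(eco[i] + eco[j])                 # mutual benefit vector
                bf1, bf2 = rng.integers(1, 3, size=2)          # benefit factors in {1, 2}
                new_i = eco[i] + rng.random(dim)*(best - mutual*bf1)
                new_j = eco[j] + rng.random(dim)*(best - mutual*bf2)
                for idx, cand in ((i, new_i), (j, new_j)):
                    fc = f(cand)
                    if fc < fit[idx]:                          # greedy acceptance
                        eco[idx], fit[idx] = cand, fc
        return eco[fit.argmin()], fit.min()

    sphere = lambda x: float(np.sum(x**2))
    x_best, f_best = sos_minimize(sphere, dim=5)
    print(f"best objective found: {f_best:.2e}")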
Zhang, Huaguang; Feng, Tao; Yang, Guang-Hong; Liang, Hongjing
2015-07-01
In this paper, the inverse optimal approach is employed to design distributed consensus protocols that guarantee consensus and global optimality with respect to some quadratic performance indexes for identical linear systems on a directed graph. The inverse optimal theory is developed by introducing the notion of partial stability. As a result, the necessary and sufficient conditions for inverse optimality are proposed. By means of the developed inverse optimal theory, the necessary and sufficient conditions are established for globally optimal cooperative control problems on directed graphs. Basic optimal cooperative design procedures are given based on asymptotic properties of the resulting optimal distributed consensus protocols, and the multiagent systems can reach desired consensus performance (convergence rate and damping rate) asymptotically. Finally, two examples are given to illustrate the effectiveness of the proposed methods.
Global Load Balancing with Parallel Mesh Adaption on Distributed-Memory Systems
NASA Technical Reports Server (NTRS)
Biswas, Rupak; Oliker, Leonid; Sohn, Andrew
1996-01-01
Dynamic mesh adaptation on unstructured grids is a powerful tool for efficiently computing unsteady problems to resolve solution features of interest. Unfortunately, it causes load imbalances among processors on a parallel machine. This paper describes the parallel implementation of a tetrahedral mesh adaption scheme and a new global load balancing method. A heuristic remapping algorithm is presented that assigns partitions to processors such that the redistribution cost is minimized. Results indicate that the parallel performance of the mesh adaption code depends on the nature of the adaption region and show a 35.5X speedup on 64 processors of an SP2 when 35 percent of the mesh is randomly adapted. For large-scale scientific computations, our load balancing strategy gives an almost sixfold reduction in solver execution times over non-balanced loads. Furthermore, our heuristic remapper yields processor assignments that are within 3 percent of the optimal solutions, but requires only 1 percent of the computational time.
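The idea behind such a remapping heuristic can be sketched in a few lines: place heavy partitions first, balance loads greedily, but prefer a partition's previous owner when that owner is nearly as light, so that less mesh data has to move. This is a hedged illustration of the general technique, not the paper's algorithm; all names and the 10% tolerance are invented for the example.

```python
import heapq

def remap(partition_load, old_owner, n_procs):
    """Greedy heuristic: heaviest partitions first on the lightest processor,
    keeping a partition on its previous owner when that owner is within a
    tolerance of the lightest load, to reduce redistribution cost."""
    loads = [0.0] * n_procs
    heap = [(0.0, p) for p in range(n_procs)]
    heapq.heapify(heap)
    assignment = {}
    for part in sorted(partition_load, key=partition_load.get, reverse=True):
        light_load, light_proc = heap[0]
        prev = old_owner[part]
        # Prefer the previous owner if it is nearly as light as the lightest.
        target = prev if loads[prev] <= light_load * 1.1 + 1e-12 else light_proc
        assignment[part] = target
        loads[target] += partition_load[part]
        heapq.heappush(heap, (loads[target], target))
        # Drop stale heap entries left over from earlier updates.
        while heap and heap[0][0] != loads[heap[0][1]]:
            heapq.heappop(heap)
    return assignment, loads

parts = {0: 5.0, 1: 3.0, 2: 2.0, 3: 2.0, 4: 1.0}   # partition work estimates
prev = {0: 0, 1: 0, 2: 1, 3: 1, 4: 1}              # previous owners
print(remap(parts, prev, n_procs=2))
```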
Combining Simulation Tools for End-to-End Trajectory Optimization
NASA Technical Reports Server (NTRS)
Whitley, Ryan; Gutkowski, Jeffrey; Craig, Scott; Dawn, Tim; Williams, Jacobs; Stein, William B.; Litton, Daniel; Lugo, Rafael; Qu, Min
2015-01-01
Trajectory simulations with advanced optimization algorithms are invaluable tools in the process of designing spacecraft. Due to the need for complex models, simulations are often highly tailored to the needs of the particular program or mission. NASA's Orion and SLS programs are no exception. While independent analyses are valuable to assess individual spacecraft capabilities, a complete end-to-end trajectory from launch to splashdown maximizes potential performance and ensures a continuous solution. In order to obtain end-to-end capability, Orion's in-space tool (Copernicus) was made to interface directly with the SLS ascent tool (POST2), and a new tool that optimizes the full problem by operating both simulations simultaneously was developed.
Challenges of NDE simulation tool validation, optimization, and utilization for composites
NASA Astrophysics Data System (ADS)
Leckey, Cara A. C.; Seebo, Jeffrey P.; Juarez, Peter
2016-02-01
Rapid, realistic nondestructive evaluation (NDE) simulation tools can aid in inspection optimization and prediction of inspectability for advanced aerospace materials and designs. NDE simulation tools may someday aid in the design and certification of aerospace components; potentially shortening the time from material development to implementation by industry and government. Furthermore, ultrasound modeling and simulation are expected to play a significant future role in validating the capabilities and limitations of guided wave based structural health monitoring (SHM) systems. The current state-of-the-art in ultrasonic NDE/SHM simulation is still far from the goal of rapidly simulating damage detection techniques for large scale, complex geometry composite components/vehicles containing realistic damage types. Ongoing work at NASA Langley Research Center is focused on advanced ultrasonic simulation tool development. This paper discusses challenges of simulation tool validation, optimization, and utilization for composites. Ongoing simulation tool development work is described along with examples of simulation validation and optimization challenges that are more broadly applicable to all NDE simulation tools. The paper will also discuss examples of simulation tool utilization at NASA to develop new damage characterization methods for composites, and associated challenges in experimentally validating those methods.
Watershed Management Optimization Support Tool (WMOST) v2: User Manual and Case Studies
The Watershed Management Optimization Support Tool (WMOST) is a decision support tool that evaluates the relative cost-effectiveness of management practices at the local or watershed scale. WMOST models the environmental effects and costs of management decisions in a watershed c...
Decision Support for Resilient Communities: EPA’s Watershed Management Optimization Support Tool
The U.S. EPA Atlantic Ecology Division is releasing version 3 of the Watershed Management Optimization Support Tool (WMOST v3) in February 2018. WMOST is a decision-support tool that facilitates integrated water resources management (IWRM) by communities and watershed organizati...
The Watershed Management Optimization Support Tool (WMOST) is a public-domain software application designed to aid decision makers with integrated water resources management. The tool allows water resource managers and planners to screen a wide-range of management practices for c...
Global Profiling of Various Metabolites in Platycodon grandiflorum by UPLC-QTOF/MS.
Lee, Jae Won; Ji, Seung-Heon; Kim, Geum-Soog; Song, Kyung-Sik; Um, Yurry; Kim, Ok Tae; Lee, Yi; Hong, Chang Pyo; Shin, Dong-Ho; Kim, Chang-Kug; Lee, Seung-Eun; Ahn, Young-Sup; Lee, Dae-Young
2015-11-09
In this study, a method of metabolite profiling based on UPLC-QTOF/MS was developed to analyze Platycodon grandiflorum. Under the optimized UPLC conditions, various metabolites, including the major platycosides, were well separated within 15 min. The metabolite extraction protocol was also optimized by selecting the extraction solvent, the solvent-to-sample ratio, and the sonication time. This method was used to profile two different parts of P. grandiflorum, i.e., the roots of P. grandiflorum (PR) and the stems and leaves of P. grandiflorum (PS), in the positive and negative ion modes. As a result, PR and PS showed qualitatively and quantitatively different metabolite profiles. Furthermore, their metabolite compositions differed among individual plant samples. These results indicate that the UPLC-QTOF/MS-based profiling method is a good tool to analyze various metabolites in P. grandiflorum. This metabolomics approach can also be applied to evaluate the overall quality of P. grandiflorum, as well as to discriminate cultivars for the medicinal plant industry. PMID:26569219
Merging spatially variant physical process models under an optimized systems dynamics framework.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cain, William O.; Lowry, Thomas Stephen; Pierce, Suzanne A.
The complexity of water resource issues, its interconnectedness to other systems, and the involvement of competing stakeholders often overwhelm decision-makers and inhibit the creation of clear management strategies. While a range of modeling tools and procedures exist to address these problems, they tend to be case specific and generally emphasize either a quantitative and overly analytic approach or present a qualitative dialogue-based approach lacking the ability to fully explore consequences of different policy decisions. The integration of these two approaches is needed to drive toward final decisions and engender effective outcomes. Given these limitations, the Computer Assisted Dispute Resolution system (CADRe) was developed to aid in stakeholder inclusive resource planning. This modeling and negotiation system uniquely addresses resource concerns by developing a spatially varying system dynamics model as well as innovative global optimization search techniques to maximize outcomes from participatory dialogues. Ultimately, the core system architecture of CADRe also serves as the cornerstone upon which key scientific innovation and challenges can be addressed.
Modeling Self-Healing of Concrete Using Hybrid Genetic Algorithm–Artificial Neural Network
Ramadan Suleiman, Ahmed; Nehdi, Moncef L.
2017-01-01
This paper presents an approach to predicting the intrinsic self-healing in concrete using a hybrid genetic algorithm–artificial neural network (GA–ANN). A genetic algorithm was implemented in the network as a stochastic optimizing tool for the initial optimal weights and biases. This approach can assist the network in achieving a global optimum and avoid the possibility of the network getting trapped at local optima. The proposed model was trained and validated using an especially built database using various experimental studies retrieved from the open literature. The model inputs include the cement content, water-to-cement ratio (w/c), type and dosage of supplementary cementitious materials, bio-healing materials, and both expansive and crystalline additives. Self-healing indicated by means of crack width is the model output. The results showed that the proposed GA–ANN model is capable of capturing the complex effects of various self-healing agents (e.g., biochemical material, silica-based additive, expansive and crystalline components) on the self-healing performance in cement-based materials. PMID:28772495
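The GA-ANN hybrid described here follows a well-known pattern: a genetic algorithm searches the weight space globally, and its best individual seeds subsequent gradient training, reducing the risk of local optima. A minimal numpy sketch of that pattern, with a toy dataset standing in for the self-healing database and all settings (network size, population size, mutation rate) chosen arbitrarily:

```python
import numpy as np

rng = np.random.default_rng(1)

def mlp(w, X, hidden=4):
    """Tiny 1-hidden-layer network; w packs all weights and biases."""
    n_in = X.shape[1]
    i = 0
    W1 = w[i:i + n_in * hidden].reshape(n_in, hidden); i += n_in * hidden
    b1 = w[i:i + hidden]; i += hidden
    W2 = w[i:i + hidden]; i += hidden
    b2 = w[i]
    return np.tanh(X @ W1 + b1) @ W2 + b2

def mse(w, X, y):
    return float(np.mean((mlp(w, X) - y) ** 2))

def ga_init_weights(X, y, n_weights, pop=40, gens=100):
    """GA search for good initial weights: elitism, 2-way tournaments,
    uniform crossover, sparse Gaussian mutation. The best individual
    would then seed ordinary backpropagation (not shown here)."""
    P = rng.normal(0, 1, (pop, n_weights))
    for _ in range(gens):
        fit = np.array([mse(w, X, y) for w in P])
        new = [P[np.argmin(fit)]]                      # keep the elite
        while len(new) < pop:
            a, b = rng.integers(pop, size=2), rng.integers(pop, size=2)
            p1, p2 = P[a[np.argmin(fit[a])]], P[b[np.argmin(fit[b])]]
            mask = rng.random(n_weights) < 0.5         # uniform crossover
            child = np.where(mask, p1, p2)
            child = child + rng.normal(0, 0.1, n_weights) * (rng.random(n_weights) < 0.2)
            new.append(child)
        P = np.array(new)
    fit = np.array([mse(w, X, y) for w in P])
    return P[np.argmin(fit)]

# Toy data standing in for the database (mix inputs vs. crack-width healing).
X = rng.random((50, 3))
y = 0.3 * X[:, 0] - 0.2 * X[:, 1] * X[:, 2]
hidden = 4
n_w = X.shape[1] * hidden + hidden + hidden + 1
w0 = ga_init_weights(X, y, n_w)
print("GA-selected initial weights give MSE:", mse(w0, X, y))
```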
Solving multi-objective water management problems using evolutionary computation.
Lewis, A; Randall, M
2017-12-15
Water as a resource is becoming increasingly more valuable given the changes in global climate. In an agricultural sense, the role of water is vital to ensuring food security. Therefore the management of it has become a subject of increasing attention and the development of effective tools to support participative decision-making in water management will be a valuable contribution. In this paper, evolutionary computation techniques and Pareto optimisation are incorporated in a model-based system for water management. An illustrative test case modelling optimal crop selection across dry, average and wet years based on data from the Murrumbidgee Irrigation Area in Australia is presented. It is shown that sets of trade-off solutions that provide large net revenues, or minimise environmental flow deficits can be produced rapidly, easily and automatically. The system is capable of providing detailed information on optimal solutions to achieve desired outcomes, responding to a variety of factors including climate conditions and economics. Copyright © 2017 Elsevier Ltd. All rights reserved.
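The core of any such Pareto-based system is a non-dominance test over candidate plans. A small sketch, with randomly generated stand-in data in place of the crop model (net revenue to be maximized, environmental flow deficit to be minimized):

```python
import numpy as np

def pareto_front(points):
    """Indices of non-dominated points, both objectives to be minimized
    (negate revenue beforehand to turn maximization into minimization)."""
    idx = []
    for i, p in enumerate(points):
        dominated = any(np.all(q <= p) and np.any(q < p)
                        for j, q in enumerate(points) if j != i)
        if not dominated:
            idx.append(i)
    return idx

rng = np.random.default_rng(7)
revenue = rng.uniform(1.0, 5.0, 200)            # hypothetical crop plans
deficit = rng.uniform(0.0, 1.0, 200) / revenue  # crude revenue/deficit trade-off
objs = np.column_stack([-revenue, deficit])     # minimize both columns
front = pareto_front(objs)
print(f"{len(front)} trade-off plans out of {len(objs)}")
```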
Hanna, Debra; Romero, Klaus; Schito, Marco
2017-03-01
The development of novel tuberculosis (TB) multi-drug regimens that are more efficacious and of shorter duration requires a robust drug development pipeline. Advances in quantitative modeling and simulation can be used to maximize the utility of patient-level data from prior and contemporary clinical trials, thus optimizing study design for anti-TB regimens. This perspective article highlights the work of seven project teams developing first-in-class translational and quantitative methodologies that aim to inform drug development decision-making, dose selection, trial design, and safety assessments, in order to achieve shorter and safer therapies for patients in need. These tools offer the opportunity to evaluate multiple hypotheses and provide a means to identify, quantify, and understand relevant sources of variability, to optimize translation and clinical trial design. When incorporated into the broader regulatory sciences framework, these efforts have the potential to transform the development paradigm for TB combination development, as well as other areas of global health. Copyright © 2016. Published by Elsevier Ltd.
Azunre, P.
2016-09-21
In this paper, two novel techniques for bounding the solutions of parametric weakly coupled second-order semilinear parabolic partial differential equations are developed. The first provides a theorem to construct interval bounds, while the second provides a theorem to construct lower bounds convex and upper bounds concave in the parameter. The convex/concave bounds can be significantly tighter than the interval bounds because of the wrapping effect suffered by interval analysis in dynamical systems. Both types of bounds are computationally cheap to construct, requiring the solution of auxiliary systems twice and four times larger than the original system, respectively. An illustrative numerical example of bound construction and use for deterministic global optimization within a simple serial branch-and-bound algorithm, implemented numerically using interval arithmetic and a generalization of McCormick's relaxation technique, is presented. Finally, problems within the important class of reaction-diffusion systems may be optimized with these tools.
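The interplay of cheap rigorous lower bounds and branch-and-bound can be illustrated with a one-dimensional toy, far simpler than the paper's PDE-constrained setting: naive interval arithmetic supplies a valid enclosure of f over each box, and boxes whose lower bound cannot beat the incumbent are pruned.

```python
import heapq

class Interval:
    """Minimal interval arithmetic, enough to enclose f(x) = x^2 - 2x."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi
    def __sub__(self, o):
        return Interval(self.lo - o.hi, self.hi - o.lo)
    def __mul__(self, o):
        p = [self.lo * o.lo, self.lo * o.hi, self.hi * o.lo, self.hi * o.hi]
        return Interval(min(p), max(p))

def f(x):                      # point evaluation of f(x) = x^2 - 2x
    return x * x - 2.0 * x

def f_enclosure(iv):           # interval enclosure of f over a box
    return iv * iv - Interval(2.0, 2.0) * iv

def branch_and_bound(lo, hi, tol=1e-6):
    """Serial branch-and-bound: interval lower bounds prune boxes that
    cannot beat the incumbent; midpoint evaluations tighten the incumbent."""
    upper = min(f(lo), f(hi))
    heap = [(f_enclosure(Interval(lo, hi)).lo, lo, hi)]
    while heap:
        lb, a, b = heapq.heappop(heap)
        if lb > upper - tol:               # box cannot improve the incumbent
            continue
        m = 0.5 * (a + b)
        upper = min(upper, f(m))
        for a2, b2 in ((a, m), (m, b)):    # bisect and bound the halves
            lb2 = f_enclosure(Interval(a2, b2)).lo
            if lb2 <= upper - tol:
                heapq.heappush(heap, (lb2, a2, b2))
    return upper

print(branch_and_bound(-3.0, 4.0))  # global minimum of x^2 - 2x is -1 at x = 1
```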
Mitigating energy loss on distribution lines through the allocation of reactors
NASA Astrophysics Data System (ADS)
Miranda, T. M.; Romero, F.; Meffe, A.; Castilho Neto, J.; Abe, L. F. T.; Corradi, F. E.
2018-03-01
This paper presents a methodology for the automatic allocation of reactors on medium-voltage distribution lines to reduce energy loss. In Brazil, some feeders are distinguished by their long lengths and very low load, which results in a high influence of the line capacitance on the circuit’s performance, requiring compensation through the installation of reactors. The automatic allocation is accomplished using an optimization meta-heuristic called the Global Neighbourhood Algorithm. Given a set of reactor models and a circuit, it outputs an optimal solution in terms of energy loss reduction. The algorithm also verifies that user-defined voltage limits are not violated, besides checking for energy quality. The methodology was implemented in a software tool, which can also show the allocation graphically. A simulation with four real feeders is presented in the paper. The obtained allocations reduced energy losses significantly, by 50.56% in the worst case and up to 93.10% in the best case.
Yang, Zhen-Lun; Wu, Angus; Min, Hua-Qing
2015-01-01
An improved quantum-behaved particle swarm optimization with elitist breeding (EB-QPSO) for unconstrained optimization is presented and empirically studied in this paper. In EB-QPSO, a novel elitist breeding strategy acts on the elitists of the swarm to escape from likely local optima and guide the swarm toward a more efficient search. During the iterative optimization process of EB-QPSO, when the criteria are met, the personal best of each particle and the global best of the swarm are used to generate new diverse individuals through transposon operators. The newly generated individuals with better fitness are selected as the new personal bests and global best to guide the swarm in further solution exploration. A comprehensive simulation study is conducted on a set of twelve benchmark functions. Compared with five state-of-the-art quantum-behaved particle swarm optimization algorithms, the proposed EB-QPSO performs more competitively on all of the benchmark functions in terms of global search capability and convergence rate.
NASA Technical Reports Server (NTRS)
Murthy, Pappu L. N.; Naghipour Ghezeljeh, Paria; Bednarcyk, Brett A.
2018-01-01
This document describes a recently developed analysis tool that enhances the resident capabilities of the Micromechanics Analysis Code with the Generalized Method of Cells (MAC/GMC) and its application. MAC/GMC is a composite material and laminate analysis software package developed at NASA Glenn Research Center. The primary focus of the current effort is to provide a graphical user interface (GUI) capability that helps users optimize highly nonlinear viscoplastic constitutive law parameters by fitting experimentally observed/measured stress-strain responses under various thermo-mechanical conditions for braided composites. The tool has been developed utilizing the MATrix LABoratory (MATLAB) (The Mathworks, Inc., Natick, MA) programming language. Illustrative examples are shown for a specific braided composite system wherein the matrix viscoplastic behavior is represented by a constitutive law described by seven parameters. The tool is general enough to fit any number of experimentally observed stress-strain responses of the material. The number of parameters to be optimized, as well as the importance given to each stress-strain response, are user choices. Three different optimization algorithms are included: (1) gradient-based optimization, (2) genetic algorithm (GA) based optimization, and (3) particle swarm optimization (PSO). The user can mix and match the three algorithms; for example, one can start optimization with either (2) or (3) and then use the optimized solution to further fine tune with approach (1). The secondary focus of this paper is to demonstrate the application of this tool to optimize/calibrate parameters for a nonlinear viscoplastic matrix to predict stress-strain curves (at constituent and composite levels) at different rates, temperatures and/or loading conditions utilizing the Generalized Method of Cells. After preliminary validation of the tool through comparison with experimental results, a detailed virtual parametric study is presented wherein the combined effects of temperature and loading rate on the predicted response of a braided composite are investigated.
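The "mix and match" strategy (global search followed by gradient fine tuning) is easy to sketch. The following example uses scipy's differential evolution as a stand-in for the GA/PSO stage and L-BFGS-B as the gradient stage, with a hypothetical two-parameter law replacing the seven-parameter viscoplastic model; the curve weights mimic the user-assigned importance mentioned above.

```python
import numpy as np
from scipy.optimize import differential_evolution, minimize

# Hypothetical 2-parameter stand-in for a constitutive law:
# stress = A * (1 - exp(-strain / B)); the real MAC/GMC law has 7 parameters.
def model(params, strain):
    A, B = params
    return A * (1.0 - np.exp(-strain / B))

def misfit(params, curves, weights):
    """Weighted sum of squared errors over several experimental curves;
    the weights express the user-chosen importance of each test."""
    return sum(w * np.sum((model(params, e) - s) ** 2)
               for w, (e, s) in zip(weights, curves))

rng = np.random.default_rng(0)
strain = np.linspace(0, 0.05, 40)
curves = [(strain, model((120.0, 0.010), strain) + rng.normal(0, 1.0, strain.size)),
          (strain, model((118.0, 0.011), strain) + rng.normal(0, 1.0, strain.size))]
weights = [1.0, 0.5]

# Step 1: global search (GA/PSO analogue) over wide parameter bounds.
res_g = differential_evolution(misfit, bounds=[(10, 300), (1e-3, 0.1)],
                               args=(curves, weights), seed=1, tol=1e-8)
# Step 2: gradient-based fine tuning started from the global result.
res_l = minimize(misfit, res_g.x, args=(curves, weights), method="L-BFGS-B",
                 bounds=[(10, 300), (1e-3, 0.1)])
print(res_l.x)  # should land near the generating parameters (~119, ~0.0105)
```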
Open Source Tools for Assessment of Global Water Availability, Demands, and Scarcity
NASA Astrophysics Data System (ADS)
Li, X.; Vernon, C. R.; Hejazi, M. I.; Link, R. P.; Liu, Y.; Feng, L.; Huang, Z.; Liu, L.
2017-12-01
Water availability and water demands are essential factors for estimating water scarcity conditions. To reproduce historical observations and to quantify future changes in water availability and water demand, two open source tools have been developed by the JGCRI (Joint Global Change Research Institute): Xanthos and GCAM-STWD. Xanthos is a gridded global hydrologic model, designed to quantify and analyze water availability in 235 river basins. Xanthos uses runoff generation and river routing modules to simulate both historical and future estimates of total runoff and streamflows on a monthly time step at a spatial resolution of 0.5 degrees. GCAM-STWD is a spatiotemporal water disaggregation model used with the Global Change Assessment Model (GCAM) to spatially downscale global water demands for the major end-use sectors (irrigation, domestic, electricity generation, mining, and manufacturing) from the region scale to the scale of 0.5 degrees. GCAM-STWD then temporally downscales the gridded annual global water demands to monthly results. These two tools, written in Python, can be integrated to assess global, regional or basin-scale water scarcity or water stress. Both of the tools are extensible to ensure flexibility and promote contributions from researchers that utilize GCAM and study global water use and supply.
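The disaggregation step can be pictured as two proportional splits: a regional annual demand is spread over grid cells by proxy weights, then over months by a seasonal profile. A sketch under invented weights (GCAM-STWD's actual proxies and profiles are more elaborate):

```python
import numpy as np

def downscale(annual_regional_demand, cell_weight, monthly_profile):
    """Spatially then temporally disaggregate one region-year of demand.
    cell_weight: proxy weights per grid cell (e.g., population share);
    monthly_profile: 12 fractions summing to 1 (e.g., irrigation seasonality).
    Returns an array of shape (n_cells, 12) in the input's units."""
    w = cell_weight / cell_weight.sum()          # spatial shares
    m = monthly_profile / monthly_profile.sum()  # temporal shares
    return annual_regional_demand * np.outer(w, m)

cells = np.array([5.0, 1.0, 3.0, 1.0])                     # proxy weights
season = np.array([0, 0, 1, 2, 4, 6, 6, 4, 2, 1, 0, 0], float)
grid = downscale(1000.0, cells, season)                    # e.g., million m^3/yr
assert np.isclose(grid.sum(), 1000.0)                      # mass is conserved
print(grid.shape, grid.sum())
```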
Thermodynamics of Gas Turbine Cycles with Analytic Derivatives in OpenMDAO
NASA Technical Reports Server (NTRS)
Gray, Justin; Chin, Jeffrey; Hearn, Tristan; Hendricks, Eric; Lavelle, Thomas; Martins, Joaquim R. R. A.
2016-01-01
A new equilibrium thermodynamics analysis tool was built based on the CEA method using the OpenMDAO framework. The new tool provides forward and adjoint analytic derivatives for use with gradient-based optimization algorithms. The new tool was validated against the original CEA code to ensure an accurate analysis, and the analytic derivatives were validated against finite-difference approximations. Performance comparisons between analytic and finite-difference methods showed a significant speed advantage for the analytic methods. To further test the new analysis tool, a sample optimization was performed to find the optimal air-fuel equivalence ratio, φ, maximizing combustion temperature for a range of different pressures. Collectively, the results demonstrate the viability of the new tool to serve as the thermodynamic backbone for future work on a full propulsion modeling tool.
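The finite-difference validation mentioned above is a standard check that is easy to reproduce: compare a hand-coded analytic gradient against central differences and confirm the relative error is tiny. A generic sketch with a stand-in function (not the CEA thermodynamics):

```python
import numpy as np

def f(x):
    """Stand-in analysis with a hand-coded analytic gradient."""
    return float(np.sum(np.sin(x) * x**2))

def grad_f(x):
    return 2 * x * np.sin(x) + x**2 * np.cos(x)

def check_gradient(f, grad, x, h=1e-6):
    """Compare an analytic gradient with central finite differences; the
    relative error should be tiny for a correct implementation, while
    analytic evaluation is typically far cheaper than n+1 function calls."""
    g_fd = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        g_fd[i] = (f(x + e) - f(x - e)) / (2 * h)
    g_an = grad(x)
    return np.max(np.abs(g_an - g_fd) / (np.abs(g_an) + 1e-12))

x = np.linspace(0.1, 2.0, 8)
print("max relative error:", check_gradient(f, grad_f, x))
```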
Extension of an Object-Oriented Optimization Tool: User's Reference Manual
NASA Technical Reports Server (NTRS)
Pak, Chan-Gi; Truong, Samson S.
2015-01-01
The National Aeronautics and Space Administration Armstrong Flight Research Center has developed a cost-effective and flexible object-oriented optimization (O³) tool that leverages existing tools and practices and allows easy integration and adoption of new state-of-the-art software. This object-oriented framework can integrate the analysis codes for multiple disciplines, as opposed to relying on one code to perform analysis for all disciplines. Optimization can thus take place within each discipline module, or in a loop between the O³ tool and the discipline modules, or both. Six different sample mathematical problems are presented to demonstrate the performance of the O³ tool. Instructions for preparing input data for the O³ tool are detailed in this user's manual.
NASA Astrophysics Data System (ADS)
Sarni, W.
2017-12-01
Water scarcity and poor quality impact economic development, business growth, and social well-being. Water has become, in our generation, the foremost critical local, regional, and global issue of our time. Despite these needs, there is no water hub or water technology accelerator solely dedicated to water data and tools. The public and private sectors need vastly improved data management and visualization tools. This is the WetDATA opportunity: to develop a water data tech hub dedicated to water data acquisition, analytics, and visualization tools for informed policy and business decisions. WetDATA's tools will help incubate disruptive water data technologies and accelerate adoption of current water data solutions. WetDATA is a Colorado-based (501c3) global hub for water data analytics and technology innovation. WetDATA's vision is to be a global leader in water information and data technology innovation and to collaborate with other US and global water technology hubs. Roadmap: a portal (www.wetdata.org) to provide stakeholders with tools and resources to understand related water risks; initial activities providing education, awareness, and tools to stakeholders to support the implementation of the Colorado State Water Plan; leveraging the Western States Water Council Water Data Exchange database; and development of visualization, predictive analytics, and AI tools to engage stakeholders and provide actionable data and information. Tools: Education - information on water issues and risks at the local, state, national, and global scale. Visualizations - data analytics and visualization tools based upon the 2030 Water Resources Group methodology to support the implementation of the Colorado State Water Plan. Predictive analytics - accessing publicly available water databases and using machine learning to develop water availability forecasting tools, and time-lapse images to support city and urban planning.
Optimizing human activity patterns using global sensitivity analysis
Fairchild, Geoffrey; Hickmann, Kyle S.; Mniszewski, Susan M.; ...
2013-12-10
Implementing realistic activity patterns for a population is crucial for modeling, for example, disease spread, supply and demand, and disaster response. Using the dynamic activity simulation engine, DASim, we generate schedules for a population that capture regular (e.g., working, eating, and sleeping) and irregular activities (e.g., shopping or going to the doctor). We use the sample entropy (SampEn) statistic to quantify a schedule's regularity for a population. We show how to tune an activity's regularity by adjusting SampEn, thereby making it possible to realistically design activities when creating a schedule. The tuning process sets up a computationally intractable high-dimensional optimization problem. To reduce the computational demand, we use Bayesian Gaussian process regression to compute global sensitivity indices and identify the parameters that have the greatest effect on the variance of SampEn. Here we use the harmony search (HS) global optimization algorithm to locate global optima. Our results show that HS combined with global sensitivity analysis can efficiently tune the SampEn statistic with few search iterations. We demonstrate how global sensitivity analysis can guide statistical emulation and global optimization algorithms to efficiently tune activities and generate realistic activity patterns. Finally, though our tuning methods are applied to dynamic activity schedule generation, they are general and represent a significant step in the direction of automated tuning and optimization of high-dimensional computer simulations.
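Sample entropy itself is compact enough to state in code. A direct implementation of the common SampEn(m, r) definition, applied to a regular versus an erratic toy signal (a simplified variant; parameter choices are illustrative):

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """Sample entropy SampEn(m, r): -ln(A/B), where B counts pairs of
    length-m templates within Chebyshev tolerance r and A counts the
    same for length m+1. Lower values indicate more regularity."""
    x = np.asarray(x, float)
    if r is None:
        r = 0.2 * x.std()
    def count(mm):
        templ = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        c = 0
        for i in range(len(templ)):
            d = np.max(np.abs(templ - templ[i]), axis=1)
            c += int(np.sum(d <= r)) - 1          # exclude the self-match
        return c
    B, A = count(m), count(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else np.inf

rng = np.random.default_rng(0)
regular = np.sin(np.arange(300) * 0.3)            # e.g., a fixed daily cycle
irregular = regular + rng.normal(0, 0.5, 300)     # add erratic activity
print(sample_entropy(regular), sample_entropy(irregular))
```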
Rafati, Hasan; Talebpour, Zahra; Adlnasab, Laleh; Ebrahimi, Samad Nejad
2009-07-01
In this study, pH responsive macroparticles incorporating peppermint oil (PO) were prepared using a simple emulsification/polymer precipitation technique. The formulations were examined for their properties and the desired quality was then achieved using a quality by design (QbD) approach. For this purpose, a Draper-Lin small composite design was employed to investigate the effect of four independent variables, including the PO to water ratio, the concentration of pH sensitive polymer (hydroxypropyl methylcellulose phthalate), and the acid and plasticizer concentrations, on the encapsulation efficiency and PO loading. The analysis of variance showed that the polymer concentration was the most important variable for encapsulation efficiency (p < 0.05). The multiple regression analysis of the results led to equations that adequately described the influence of the independent variables on the selected responses. Furthermore, the desirability function was employed as an effective tool for transforming each response separately and encompassing all of these responses in an overall desirability function for global optimization of the encapsulation process. The optimized macroparticles were predicted to yield 93.4% encapsulation efficiency and 72.8% PO loading, remarkably close to the experimental values of 89.2% and 69.5%, respectively.
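The desirability approach combines the fitted response surfaces into a single objective: each response is mapped to [0, 1] and the geometric mean is maximized. A sketch with hypothetical quadratic surfaces standing in for the fitted regression equations:

```python
import numpy as np
from scipy.optimize import differential_evolution

def d_larger_is_better(y, lo, hi, s=1.0):
    """Derringer-type one-sided desirability: 0 below lo, 1 above hi."""
    return np.clip((y - lo) / (hi - lo), 0.0, 1.0) ** s

# Hypothetical quadratic response surfaces in two coded factors
# (polymer concentration, oil-to-water ratio), standing in for the
# regression equations fitted from the Draper-Lin design.
def encapsulation(x):
    return 90 - 8 * (x[0] - 0.3)**2 - 5 * (x[1] - 0.1)**2

def loading(x):
    return 70 - 6 * (x[0] + 0.2)**2 - 4 * (x[1] - 0.4)**2

def overall_desirability(x):
    d1 = d_larger_is_better(encapsulation(x), 60.0, 95.0)
    d2 = d_larger_is_better(loading(x), 40.0, 75.0)
    return (d1 * d2) ** 0.5          # geometric mean of the two responses

res = differential_evolution(lambda x: -overall_desirability(x),
                             bounds=[(-1, 1), (-1, 1)], seed=3)
print("optimal coded factors:", res.x, "D =", overall_desirability(res.x))
```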
Baxter, John S. H.; Inoue, Jiro; Drangova, Maria; Peters, Terry M.
2016-01-01
Optimization-based segmentation approaches deriving from discrete graph-cuts and continuous max-flow have become increasingly nuanced, allowing for topological and geometric constraints on the resulting segmentation while retaining global optimality. However, these two considerations, topological and geometric, have yet to be combined in a unified manner. The concept of “shape complexes,” which combine geodesic star convexity with extendable continuous max-flow solvers, is presented. These shape complexes allow more complicated shapes to be created through the use of multiple labels and super-labels, with geodesic star convexity governed by a topological ordering. These problems can be optimized using extendable continuous max-flow solvers. Previous approaches required computationally expensive coordinate system warping, which is ill-defined and ambiguous in the general case. The shape complexes are demonstrated on a set of synthetic images as well as vessel segmentation in ultrasound, valve segmentation in ultrasound, and atrial wall segmentation from contrast-enhanced CT. Shape complexes represent an extendable tool alongside other continuous max-flow methods that may be suitable for a wide range of medical image segmentation problems. PMID:28018937
A Matter of Time: Faster Percolator Analysis via Efficient SVM Learning for Large-Scale Proteomics.
Halloran, John T; Rocke, David M
2018-05-04
Percolator is an important tool for greatly improving the results of a database search and subsequent downstream analysis. Using support vector machines (SVMs), Percolator recalibrates peptide-spectrum matches based on the learned decision boundary between targets and decoys. To improve analysis time for large-scale data sets, we update Percolator's SVM learning engine through software and algorithmic optimizations rather than heuristic approaches that necessitate the careful study of their impact on learned parameters across different search settings and data sets. We show that by optimizing Percolator's original learning algorithm, l2-SVM-MFN, large-scale SVM learning requires nearly only a third of the original runtime. Furthermore, we show that by employing the widely used Trust Region Newton (TRON) algorithm instead of l2-SVM-MFN, large-scale Percolator SVM learning is reduced to nearly only a fifth of the original runtime. Importantly, these speedups only affect the speed at which Percolator converges to a global solution and do not alter recalibration performance. The upgraded versions of both l2-SVM-MFN and TRON are optimized within the Percolator codebase for multithreaded and single-thread use and are available under Apache license at bitbucket.org/jthalloran/percolator_upgrade.
Particle Swarm Optimization with Double Learning Patterns
Shen, Yuanxia; Wei, Linna; Zeng, Chuanhua; Chen, Jian
2016-01-01
Particle Swarm Optimization (PSO) is an effective tool for solving optimization problems. However, PSO usually suffers from premature convergence due to the rapid loss of swarm diversity. In this paper, we first analyze the motion behavior of the swarm based on the probability characteristics of the learning parameters. Then a PSO with double learning patterns (PSO-DLP) is developed, which employs a master swarm and a slave swarm with different learning patterns to achieve a trade-off between convergence speed and swarm diversity. The particles in the master swarm are encouraged to explore the search space to maintain swarm diversity, while those in the slave swarm learn from the global best particle to refine promising solutions. When the evolutionary states of the two swarms interact, an interaction mechanism is enabled. This mechanism can help the slave swarm jump out of local optima and improve the convergence precision of the master swarm. The proposed PSO-DLP is evaluated on 20 benchmark functions, including rotated multimodal and complex shifted problems. The simulation results and statistical analysis show that PSO-DLP obtains a promising performance and outperforms eight PSO variants. PMID:26858747
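A two-swarm PSO of this general flavor can be sketched with a standard gbest PSO core: one swarm biased toward exploration, the other toward exploitation, plus a periodic interaction step. This is not PSO-DLP's transposon-based operators, just the master/slave split in its simplest form; coefficients are conventional defaults.

```python
import numpy as np

def two_swarm_pso(f, dim, lo, hi, n=20, iters=300, seed=0):
    rng = np.random.default_rng(seed)
    def init():
        x = rng.uniform(lo, hi, (n, dim))
        return {"x": x, "v": np.zeros((n, dim)),
                "p": x.copy(), "fp": np.apply_along_axis(f, 1, x)}
    master, slave = init(), init()  # master explores (w=0.9), slave exploits (w=0.4)
    gbest, fg = None, np.inf
    for t in range(iters):
        for s, w in ((master, 0.9), (slave, 0.4)):
            i = int(np.argmin(s["fp"]))
            if s["fp"][i] < fg:
                gbest, fg = s["p"][i].copy(), float(s["fp"][i])
            r1, r2 = rng.random((2, n, dim))
            s["v"] = w * s["v"] + 1.5 * r1 * (s["p"] - s["x"]) \
                                + 1.5 * r2 * (gbest - s["x"])
            s["x"] = np.clip(s["x"] + s["v"], lo, hi)
            fx = np.apply_along_axis(f, 1, s["x"])
            better = fx < s["fp"]
            s["p"][better], s["fp"][better] = s["x"][better], fx[better]
        if t % 50 == 49:  # crude interaction: restart the slave's worst
            j = int(np.argmax(slave["fp"]))  # particle near the global best
            slave["x"][j] = np.clip(gbest + rng.normal(0, 0.1, dim), lo, hi)
    return gbest, fg

rastrigin = lambda x: 10 * len(x) + float(np.sum(x**2 - 10 * np.cos(2 * np.pi * x)))
print(two_swarm_pso(rastrigin, 5, -5.12, 5.12))
```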
NASA Astrophysics Data System (ADS)
Martowicz, Adam; Uhl, Tadeusz
2012-10-01
The paper discusses the applicability of a reliability- and performance-based multi-criteria robust design optimization technique for micro-electromechanical systems, considering their technological uncertainties. Nowadays, micro-devices are widely used, especially in the automotive industry, taking advantage of combining the mechanical structure and the electronic control circuit on one board. Their frequent use motivates the elaboration of virtual prototyping tools that can be applied in design optimization with the introduction of technological uncertainties and reliability. The authors present a procedure for the optimization of micro-devices that is based on the theory of reliability-based robust design optimization. It takes into consideration the performance of a micro-device and its reliability, assessed by means of uncertainty analysis. The procedure assumes that, for each checked design configuration, the assessment of uncertainty propagation is performed with a meta-modeling technique. The described procedure is illustrated with an example of the optimization carried out for a finite element model of a micro-mirror. The multi-physics approach allowed the introduction of several physical phenomena to correctly model the electrostatic actuation and the squeezing effect present between electrodes. The optimization was preceded by a sensitivity analysis to establish the design and uncertain domains. The genetic algorithms fulfilled the defined optimization task effectively. The best discovered individuals are characterized by a minimized value of the multi-criteria objective function while satisfying the constraint on material strength. The restriction on the maximum equivalent stresses was introduced through a conditionally formulated objective function with a penalty component. The results were successfully verified with a global uniform search through the input design domain.
NASA Astrophysics Data System (ADS)
Maringanti, Chetan; Chaubey, Indrajeet; Popp, Jennie
2009-06-01
Best management practices (BMPs) are effective in reducing the transport of agricultural nonpoint source pollutants to receiving water bodies. However, selection of BMPs for placement in a watershed requires optimization of the available resources to obtain the maximum possible pollution reduction. In this study, an optimization methodology is developed to select and place BMPs in a watershed to provide solutions that are both economically and ecologically effective. This novel approach develops and utilizes a BMP tool, a database that stores the pollution reduction and cost information of the different BMPs under consideration. The BMP tool replaces the dynamic linkage of the distributed parameter watershed model during optimization and therefore reduces the computation time considerably. Total pollutant load from the watershed and net cost increase from the baseline were the two objective functions minimized during the optimization process. The optimization model, consisting of a multiobjective genetic algorithm (NSGA-II) in combination with a watershed simulation tool (the Soil and Water Assessment Tool, SWAT), was developed and tested for nonpoint source pollution control in the L'Anguille River watershed located in eastern Arkansas. The optimized solutions provided a trade-off between the two objective functions for sediment, phosphorus, and nitrogen reduction. The results indicated that buffer strips were very effective in controlling the nonpoint source pollutants from leaving the croplands. The optimized BMP plans resulted in potential reductions of 33%, 32%, and 13% in sediment, phosphorus, and nitrogen loads, respectively, from the watershed.
Bousquet, J; Schunemann, H J; Fonseca, J; Samolinski, B; Bachert, C; Canonica, G W; Casale, T; Cruz, A A; Demoly, P; Hellings, P; Valiulis, A; Wickman, M; Zuberbier, T; Bosnic-Anticevitch, S; Bedbrook, A; Bergmann, K C; Caimmi, D; Dahl, R; Fokkens, W J; Grisle, I; Lodrup Carlsen, K; Mullol, J; Muraro, A; Palkonen, S; Papadopoulos, N; Passalacqua, G; Ryan, D; Valovirta, E; Yorgancioglu, A; Aberer, W; Agache, I; Adachi, M; Akdis, C A; Akdis, M; Annesi-Maesano, I; Ansotegui, I J; Anto, J M; Arnavielhe, S; Arshad, H; Baiardini, I; Baigenzhin, A K; Barbara, C; Bateman, E D; Beghé, B; Bel, E H; Ben Kheder, A; Bennoor, K S; Benson, M; Bewick, M; Bieber, T; Bindslev-Jensen, C; Bjermer, L; Blain, H; Boner, A L; Boulet, L P; Bonini, M; Bonini, S; Bosse, I; Bourret, R; Bousquet, P J; Braido, F; Briggs, A H; Brightling, C E; Brozek, J; Buhl, R; Burney, P G; Bush, A; Caballero-Fonseca, F; Calderon, M A; Camargos, P A M; Camuzat, T; Carlsen, K H; Carr, W; Cepeda Sarabia, A M; Chavannes, N H; Chatzi, L; Chen, Y Z; Chiron, R; Chkhartishvili, E; Chuchalin, A G; Ciprandi, G; Cirule, I; Correia de Sousa, J; Cox, L; Crooks, G; Costa, D J; Custovic, A; Dahlen, S E; Darsow, U; De Carlo, G; De Blay, F; Dedeu, T; Deleanu, D; Denburg, J A; Devillier, P; Didier, A; Dinh-Xuan, A T; Dokic, D; Douagui, H; Dray, G; Dubakiene, R; Durham, S R; Dykewicz, M S; El-Gamal, Y; Emuzyte, R; Fink Wagner, A; Fletcher, M; Fiocchi, A; Forastiere, F; Gamkrelidze, A; Gemicioğlu, B; Gereda, J E; González Diaz, S; Gotua, M; Grouse, L; Guzmán, M A; Haahtela, T; Hellquist-Dahl, B; Heinrich, J; Horak, F; Hourihane, J O 'b; Howarth, P; Humbert, M; Hyland, M E; Ivancevich, J C; Jares, E J; Johnston, S L; Joos, G; Jonquet, O; Jung, K S; Just, J; Kaidashev, I; Kalayci, O; Kalyoncu, A F; Keil, T; Keith, P K; Khaltaev, N; Klimek, L; Koffi N'Goran, B; Kolek, V; Koppelman, G H; Kowalski, M L; Kull, I; Kuna, P; Kvedariene, V; Lambrecht, B; Lau, S; Larenas-Linnemann, D; Laune, D; Le, L T T; Lieberman, P; Lipworth, B; Li, J; Louis, R; Magard, Y; Magnan, A; Mahboub, B; Majer, I; Makela, M J; Manning, P; De Manuel Keenoy, E; Marshall, G D; Masjedi, M R; Maurer, M; Mavale-Manuel, S; Melén, E; Melo-Gomes, E; Meltzer, E O; Merk, H; Miculinic, N; Mihaltan, F; Milenkovic, B; Mohammad, Y; Molimard, M; Momas, I; Montilla-Santana, A; Morais-Almeida, M; Mösges, R; Namazova-Baranova, L; Naclerio, R; Neou, A; Neffen, H; Nekam, K; Niggemann, B; Nyembue, T D; O'Hehir, R E; Ohta, K; Okamoto, Y; Okubo, K; Ouedraogo, S; Paggiaro, P; Pali-Schöll, I; Palmer, S; Panzner, P; Papi, A; Park, H S; Pavord, I; Pawankar, R; Pfaar, O; Picard, R; Pigearias, B; Pin, I; Plavec, D; Pohl, W; Popov, T A; Portejoie, F; Postma, D; Potter, P; Price, D; Rabe, K F; Raciborski, F; Radier Pontal, F; Repka-Ramirez, S; Robalo-Cordeiro, C; Rolland, C; Rosado-Pinto, J; Reitamo, S; Rodenas, F; Roman Rodriguez, M; Romano, A; Rosario, N; Rosenwasser, L; Rottem, M; Sanchez-Borges, M; Scadding, G K; Serrano, E; Schmid-Grendelmeier, P; Sheikh, A; Simons, F E R; Sisul, J C; Skrindo, I; Smit, H A; Solé, D; Sooronbaev, T; Spranger, O; Stelmach, R; Strandberg, T; Sunyer, J; Thijs, C; Todo-Bom, A; Triggiani, M; Valenta, R; Valero, A L; van Hage, M; Vandenplas, O; Vezzani, G; Vichyanond, P; Viegi, G; Wagenmann, M; Walker, S; Wang, D Y; Wahn, U; Williams, D M; Wright, J; Yawn, B P; Yiallouros, P K; Yusuf, O M; Zar, H J; Zernotti, M E; Zhang, L; Zhong, N; Zidarn, M; Mercier, J
2015-11-01
Several unmet needs have been identified in allergic rhinitis: identification of the time of onset of the pollen season, optimal control of rhinitis and comorbidities, patient stratification, multidisciplinary team for integrated care pathways, innovation in clinical trials and, above all, patient empowerment. MASK-rhinitis (MACVIA-ARIA Sentinel NetworK for allergic rhinitis) is a simple system centred around the patient which was devised to fill many of these gaps using Information and Communications Technology (ICT) tools and a clinical decision support system (CDSS) based on the most widely used guideline in allergic rhinitis and its asthma comorbidity (ARIA 2015 revision). It is one of the implementation systems of Action Plan B3 of the European Innovation Partnership on Active and Healthy Ageing (EIP on AHA). Three tools are used for the electronic monitoring of allergic diseases: a cell phone-based daily visual analogue scale (VAS) assessment of disease control, CARAT (Control of Allergic Rhinitis and Asthma Test) and e-Allergy screening (premedical system of early diagnosis of allergy and asthma based on online tools). These tools are combined with a clinical decision support system (CDSS) and are available in many languages. An e-CRF and an e-learning tool complete MASK. MASK is flexible and other tools can be added. It appears to be an advanced, global and integrated ICT answer for many unmet needs in allergic diseases which will improve policies and standards. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Autonomous Modelling of X-ray Spectra Using Robust Global Optimization Methods
NASA Astrophysics Data System (ADS)
Rogers, Adam; Safi-Harb, Samar; Fiege, Jason
2015-08-01
The standard approach to model fitting in X-ray astronomy is by means of local optimization methods. However, these local optimizers suffer from a number of problems, such as a tendency for the fit parameters to become trapped in local minima, and can require an involved process of detailed user intervention to guide them through the optimization process. In this work we introduce a general GUI-driven global optimization method for fitting models to X-ray data, written in MATLAB, which searches for optimal models with minimal user interaction. We directly interface with the commonly used XSPEC libraries to access the full complement of pre-existing spectral models that describe a wide range of physics appropriate for modelling astrophysical sources, including supernova remnants and compact objects. Our algorithm is powered by the Ferret genetic algorithm and Locust particle swarm optimizer from the Qubist Global Optimization Toolbox, which are robust at finding families of solutions and identifying degeneracies. This technique will be particularly instrumental for multi-parameter models and high-fidelity data. In this presentation, we provide details of the code and use our techniques to analyze X-ray data obtained from a variety of astrophysical sources.
Global Optimality of the Successive Maxbet Algorithm.
ERIC Educational Resources Information Center
Hanafi, Mohamed; ten Berge, Jos M. F.
2003-01-01
It is known that the Maxbet algorithm, which is an alternative to the method of generalized canonical correlation analysis and Procrustes analysis, may converge to local maxima. Discusses an eigenvalue criterion that is sufficient, but not necessary, for global optimality of the successive Maxbet algorithm. (SLD)
NASA Astrophysics Data System (ADS)
Shevtsov, S.; Zhilyaev, I.; Oganesyan, P.; Axenov, V.
2017-01-01
Glass/carbon fiber composites are widely used in the design of various aircraft and rotorcraft components, such as fairings and cowlings, which have predominantly shell-like geometry and are made of quasi-isotropic laminates. The main requirements on such composite parts are a specified mechanical stiffness, to withstand the non-uniform air pressure at different flight conditions and to reduce the level of noise caused by airflow-induced vibrations, at a constrained weight of the part. The main objective of the present study is the optimization of the wall thickness and lay-up of a composite shell-like cowling. The present approach assumes conversion of the CAD model of the cowling surface to a finite element (FE) representation, followed by simulation of its wind tunnel testing at different orientations of the airflow to find the most stressed flight mode. Numerical solutions of the Reynolds-averaged Navier-Stokes (RANS) equations supplemented by the k-ω turbulence model provide the spatial distributions of air pressure applied to the shell surface. In the formulation of the optimization problem, the global strain energy calculated within the optimized shell was taken as the objective. The wall thickness of the shell was allowed to vary over its surface to minimize the objective at constrained weight. We used a parameterization of the problem based on an auxiliary sphere whose radius and center coordinates were the design variables. The curve formed by the intersection of the shell with the sphere defined the boundary of the area to be reinforced by locally thickening the shell wall. To eliminate local stress concentration, this thickness increment was defined as a smooth function on the shell surface. As a result of the structural optimization, we obtained a distribution of the shell wall thickness, which was then used to design the draping and lay-up of the composite prepreg layers. The global strain energy in the optimized cowling was reduced by a factor of 2.5 at a weight growth of up to 15%, whereas the eigenfrequencies of the first 6 natural vibration modes increased by 5-15%. The present approach and the developed programming tools, which demonstrated good efficiency and stability at acceptable computational costs, can be used to optimize a wide range of shell-like structures made of quasi-isotropic laminates.
The Impact of Machine Translation and Computer-aided Translation on Translators
NASA Astrophysics Data System (ADS)
Peng, Hao
2018-03-01
Under the context of globalization, communication between countries and cultures is becoming increasingly frequent, which makes it imperative to use techniques that help translation. This paper explores the influence of computer-aided translation (CAT) and machine translation (MT) on translators. Following an introduction to the development of machine and computer-aided translation, it describes the technologies available to translators, analyzes the demands that current translation practice places on the design of computer-aided translation tools, and examines how the design of CAT techniques can be optimized and how operable they are in translation work. The findings underline the advantages and disadvantages of MT and CAT tools, their serviceability, and the future development of MT and CAT technologies. Finally, the paper probes into the impact of these new technologies on translators, in the hope that more translators and translation researchers can learn to use such tools to improve their productivity.
Numerical study on injection parameters optimization of thin wall and biodegradable polymers parts
NASA Astrophysics Data System (ADS)
Santos, C.; Mendes, A.; Carreira, P.; Mateus, A.; Malça, C.
2017-07-01
Nowadays, the molds industry searches for new markets with diversified and added-value products. The concept associated with the production of thin-walled and biodegradable parts, mostly manufactured by the injection process, has assumed a relevant importance due to environmental and economic factors. The growth of a global consciousness about the harmful effects of conventional polymers on our quality of life, together with the legislation imposed, has become a key factor in the choice of a particular product by the consumer. The target of this work is to provide an integrated solution for the injection of parts with thin walls manufactured from biodegradable materials. This integrated solution includes the design and manufacture of the mold as well as finding the optimum values for the injection parameters in order to make the process effective and competitive. For this, the Moldflow software was used. It was demonstrated that this computational tool provides effective responsiveness and can constitute an important support tool for the injection molding of thin-walled and biodegradable parts.
Holyoke, Caleb W; Cordova, Daniel; Zhang, Wenming; Barry, James D; Leighty, Robert M; Dietrich, Robert F; Rauh, James J; Pahutski, Thomas F; Lahm, George P; Tong, My-Hanh Thi; Benner, Eric A; Andreassi, John L; Smith, Rejane M; Vincent, Daniel R; Christianson, Laurie A; Teixeira, Luis A; Singh, Vineet; Hughes, Kenneth A
2017-04-01
As the world population grows towards 9 billion by 2050, it is projected that food production will need to increase by 60%. A critical part of this growth includes the safe and effective use of insecticides to reduce the estimated 20-49% loss of global crop yields owing to pests. The development of new insecticides will help to sustain this protection and overcome insecticide resistance. A novel class of mesoionic compounds has been discovered, with exceptional insecticidal activity on a range of Hemiptera and Lepidoptera. These compounds bind to the orthosteric site of the nicotinic acetylcholine receptor and result in a highly potent inhibitory action at the receptor with minimal agonism. The synthesis, biological activity, optimization and mode of action will be discussed. Triflumezopyrim insect control will provide a powerful tool for control of hopper species in rice throughout Asia. Dicloromezotiaz can provide a useful control tool for lepidopteran pests, with an underexploited mode of action among these pests. © 2016 Society of Chemical Industry.
A practical globalization of one-shot optimization for optimal design of tokamak divertors
NASA Astrophysics Data System (ADS)
Blommaert, Maarten; Dekeyser, Wouter; Baelmans, Martine; Gauger, Nicolas R.; Reiter, Detlev
2017-01-01
In past studies, nested optimization methods were successfully applied to the design of the magnetic divertor configuration in nuclear fusion reactors. In this paper, so-called one-shot optimization methods are pursued. Due to convergence issues, a globalization strategy for the one-shot solver is sought. Whereas Griewank introduced a globalization strategy using a doubly augmented Lagrangian function that includes primal and adjoint residuals, its practical usability is limited by the necessity of second order derivatives and expensive line search iterations. In this paper, a practical alternative is offered that avoids these drawbacks by using a regular augmented Lagrangian merit function that penalizes only state residuals. Additionally, robust rank-two Hessian estimation is achieved by adaptation of Powell's damped BFGS update rule. The application of the novel one-shot approach to magnetic divertor design is considered in detail. For this purpose, the approach is adapted to be complementary with practical in parts adjoint sensitivities. Using the globalization strategy, stable convergence of the one-shot approach is achieved.
NASA Astrophysics Data System (ADS)
Rocha, Ana Maria A. C.; Costa, M. Fernanda P.; Fernandes, Edite M. G. P.
2016-12-01
This article presents a shifted hyperbolic penalty function and proposes an augmented Lagrangian-based algorithm for non-convex constrained global optimization problems. Convergence to an ε-global minimizer is proved. At each iteration k, the algorithm requires the εk-global minimization of a bound constrained optimization subproblem, where εk → ε. The subproblems are solved by a stochastic population-based metaheuristic that relies on the artificial fish swarm paradigm and a two-swarm strategy. To enhance the speed of convergence, the algorithm invokes the Nelder-Mead local search with a dynamically defined probability. Numerical experiments with benchmark functions and engineering design problems are presented. The results show that the proposed shifted hyperbolic augmented Lagrangian compares favorably with other deterministic and stochastic penalty-based methods.
New estimation architecture for multisensor data fusion
NASA Astrophysics Data System (ADS)
Covino, Joseph M.; Griffiths, Barry E.
1991-07-01
This paper describes a novel method of hierarchical asynchronous distributed filtering called the Net Information Approach (NIA). The NIA is a Kalman-filter-based estimation scheme for spatially distributed sensors which must retain their local optimality yet require a nearly optimal global estimate. The key idea of the NIA is that each local sensor-dedicated filter tells the global filter 'what I've learned since the last local-to-global transmission,' whereas in other estimation architectures the local-to-global transmission consists of 'what I think now.' An algorithm based on this idea has been demonstrated on a small-scale target-tracking problem with many encouraging results. Feasibility of this approach was demonstrated by comparing NIA performance to an optimal centralized Kalman filter (lower bound) via Monte Carlo simulations.
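The "what I've learned since the last transmission" idea is most transparent in information form, where new evidence is additive. A toy scalar example with a static state and two asynchronous sensors (all numbers invented): each local filter transmits only its information increment, and the fused estimate matches the centralized one.

```python
import numpy as np

rng = np.random.default_rng(4)
truth = 3.7                      # static scalar state observed by two sensors
noise = {"A": 0.5, "B": 1.0}     # measurement standard deviations

class LocalFilter:
    """Scalar information filter: Y is information, y is the information
    state. transmit() reports only what was learned since the last report."""
    def __init__(self, r):
        self.r2 = r * r
        self.Y = self.y = 0.0          # running information
        self.sentY = self.senty = 0.0  # information already reported
    def update(self, z):
        self.Y += 1.0 / self.r2
        self.y += z / self.r2
    def transmit(self):
        dY, dy = self.Y - self.sentY, self.y - self.senty
        self.sentY, self.senty = self.Y, self.y
        return dY, dy

A, B = LocalFilter(noise["A"]), LocalFilter(noise["B"])
gY = gy = 0.0                          # global fused information
for k in range(100):
    A.update(truth + rng.normal(0, noise["A"]))
    B.update(truth + rng.normal(0, noise["B"]))
    if k % 7 == 0:                     # sensor A reports often ...
        dY, dy = A.transmit(); gY += dY; gy += dy
    if k % 23 == 0:                    # ... sensor B rarely (asynchronous)
        dY, dy = B.transmit(); gY += dY; gy += dy
# After flushing the remaining increments, the fused estimate equals the
# centralized least-squares estimate over every measurement.
for s in (A, B):
    dY, dy = s.transmit(); gY += dY; gy += dy
print("fused estimate:", gy / gY)
```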
Optimal Design of Grid-Stiffened Composite Panels Using Global and Local Buckling Analysis
NASA Technical Reports Server (NTRS)
Ambur, Damodar R.; Jaunky, Navin; Knight, Norman F., Jr.
1996-01-01
A design strategy for optimal design of composite grid-stiffened panels subjected to global and local buckling constraints is developed using a discrete optimizer. An improved smeared stiffener theory is used for the global buckling analysis. Local buckling of skin segments is assessed using a Rayleigh-Ritz method that accounts for material anisotropy and transverse shear flexibility. The local buckling of stiffener segments is also assessed. Design variables are the axial and transverse stiffener spacing, stiffener height and thickness, skin laminate, and stiffening configuration. The design optimization process is adapted to identify the lightest-weight stiffening configuration and pattern for grid stiffened composite panels given the overall panel dimensions, design in-plane loads, material properties, and boundary conditions of the grid-stiffened panel.
Implementation of a Low-Thrust Trajectory Optimization Algorithm for Preliminary Design
NASA Technical Reports Server (NTRS)
Sims, Jon A.; Finlayson, Paul A.; Rinderle, Edward A.; Vavrina, Matthew A.; Kowalkowski, Theresa D.
2006-01-01
A tool developed for the preliminary design of low-thrust trajectories is described. The trajectory is discretized into segments and a nonlinear programming method is used for optimization. The tool is easy to use, has robust convergence, and can handle many intermediate encounters. In addition, the tool has a wide variety of features, including several options for objective function and different low-thrust propulsion models (e.g., solar electric propulsion, nuclear electric propulsion, and solar sail). High-thrust, impulsive trajectories can also be optimized.
NASA Astrophysics Data System (ADS)
Chandra, Rishabh
Partial differential equation-constrained combinatorial optimization (PDECCO) problems are a mixture of continuous and discrete optimization problems. PDECCO problems have discrete controls, but since the partial differential equations (PDE) are continuous, the optimization space is continuous as well. Such problems have several applications, such as gas/water network optimization, traffic optimization, micro-chip cooling optimization, etc. Currently, no efficient classical algorithm which guarantees a global minimum for PDECCO problems exists. A new mapping has been developed that transforms PDECCO problems that have only linear PDEs as constraints into quadratic unconstrained binary optimization (QUBO) problems that can be solved using an adiabatic quantum optimizer (AQO). The mapping is efficient: it scales polynomially with the size of the PDECCO problem, requires only one PDE solve to form the QUBO problem, and, if the QUBO problem is solved correctly and efficiently on an AQO, guarantees a globally optimal solution for the original PDECCO problem.
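The QUBO target format is simply an energy x^T Q x over binary x, with constraints folded in as quadratic penalties. A toy illustration of that mapping (not the paper's PDE-based construction), solved by exhaustive search standing in for the AQO:

```python
import itertools
import numpy as np

# Toy discrete control problem standing in for a PDECCO instance:
# choose exactly 2 of 4 binary controls to minimize a linear cost c.x.
# The equality constraint sum(x) = 2 is folded in as a quadratic penalty
# P * (sum(x) - 2)^2, which puts the problem in QUBO form.
c = np.array([3.0, 1.0, 4.0, 1.5])
P = 10.0                                   # penalty weight > largest cost gap
n = len(c)
# Using x_i^2 = x_i for binary x, the penalty adds P to every off-diagonal
# entry and (P - 4P) = -3P to the diagonal (the constant 4P is dropped).
Q = P * (np.ones((n, n)) - np.eye(n)) + np.diag(c - 3.0 * P)

# Exhaustive search stands in for the adiabatic quantum optimizer, which
# would minimize the same x^T Q x energy function.
best = min(itertools.product([0, 1], repeat=n),
           key=lambda x: float(np.array(x) @ Q @ np.array(x)))
print(best)  # (0, 1, 0, 1): the two cheapest controls, constraint satisfied
```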
Topology and boundary shape optimization as an integrated design tool
NASA Technical Reports Server (NTRS)
Bendsoe, Martin Philip; Rodrigues, Helder Carrico
1990-01-01
The optimal topology of a two-dimensional linear elastic body can be computed by regarding the body as a domain of the plane with a high density of material. Such an optimal topology can then be used as the basis for a shape optimization method that computes the optimal form of the boundary curves of the body. This results in an efficient and reliable design tool, which can be implemented via a common FEM mesh generator and CAD-type input-output facilities.
Topology optimization under stochastic stiffness
NASA Astrophysics Data System (ADS)
Asadpoure, Alireza
Topology optimization is a systematic computational tool for optimizing the layout of materials within a domain for engineering design problems. It allows variation of structural boundaries and connectivities. This freedom in the design space often enables discovery of new, high-performance designs. However, solutions obtained by performing the optimization in a deterministic setting may be impractical or suboptimal when considering real-world engineering conditions with inherent variabilities in, for example, fabrication processes and operating conditions. The aim of this work is to provide a computational methodology for topology optimization in the presence of uncertainties associated with structural stiffness, such as uncertain material properties and/or structural geometry. Existing methods for topology optimization under deterministic conditions are first reviewed. Modifications are then proposed to improve the numerical performance of the so-called Heaviside Projection Method (HPM) in continuum domains. Next, two approaches, perturbation and Polynomial Chaos Expansion (PCE), are proposed to account for uncertainties in the optimization procedure. These approaches are intrusive, allowing tight and efficient coupling of the uncertainty quantification with the optimization sensitivity analysis. The work herein develops a robust topology optimization framework aimed at reducing the sensitivity of optimized solutions to uncertainties. The perturbation-based approach combines deterministic topology optimization with a perturbation method for the quantification of uncertainties. The use of perturbation transforms the problem of topology optimization under uncertainty into an augmented deterministic topology optimization problem. The PCE approach combines the spectral stochastic approach for the representation and propagation of uncertainties with an existing deterministic topology optimization technique. The resulting compact representations for the response quantities allow for efficient and accurate calculation of sensitivities of response statistics with respect to the design variables. The proposed methods are shown to be successful at generating robust optimal topologies. Examples from topology optimization in continuum and discrete domains (truss structures) under uncertainty are presented. It is also shown that the proposed methods lead to significant computational savings when compared to Monte Carlo-based optimization, which involves multiple formations and inversions of the global stiffness matrix, and that results obtained from the proposed method are in excellent agreement with those obtained from a Monte Carlo-based optimization algorithm.
Using Optimization to Improve Test Planning
2017-09-01
With modifications to make the input more user-friendly and to display the output differently, the test and evaluation test schedule optimization model would be a good tool for test and evaluation schedulers. Subject terms: schedule optimization, test planning.
Optimal harvesting of a stochastic delay logistic model with Lévy jumps
NASA Astrophysics Data System (ADS)
Qiu, Hong; Deng, Wenmin
2016-10-01
The optimal harvesting problem of a stochastic time delay logistic model with Lévy jumps is considered in this article. We first show that the model has a unique global positive solution and discuss the uniform boundedness of its pth moment with harvesting. Then we prove that the system is globally attractive and asymptotically stable in distribution under our assumptions. Furthermore, we obtain the existence of the optimal harvesting effort by the ergodic method, and then we give the explicit expression of the optimal harvesting policy and maximum yield.
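A crude numerical counterpart to the analytical result above is to simulate the harvested stochastic logistic model and scan the effort for the largest long-run yield. This Monte Carlo sketch omits the time delay and Lévy jumps for brevity, and all parameters are illustrative.

```python
# Minimal sketch: scan harvesting effort E for the largest long-run yield E*E[x]
# in a stochastic logistic model (delay and jumps omitted for brevity).
import numpy as np

rng = np.random.default_rng(0)
r, K, sigma, dt = 1.0, 1.0, 0.2, 0.01

def mean_population(E, n=20000):
    x, xs = K / 2, []
    for _ in range(n):
        dw = rng.normal(scale=np.sqrt(dt))
        x += x * (r - E - (r / K) * x) * dt + sigma * x * dw   # Euler-Maruyama step
        x = max(x, 1e-9)
        xs.append(x)
    return np.mean(xs[n // 2:])          # discard the transient half

efforts = np.linspace(0.05, 0.9, 18)
yields = [E * mean_population(E) for E in efforts]
print(efforts[int(np.argmax(yields))])   # near r/2 for the deterministic model
```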
Electronic neural networks for global optimization
NASA Technical Reports Server (NTRS)
Thakoor, A. P.; Moopenn, A. W.; Eberhardt, S.
1990-01-01
An electronic neural network with feedback architecture, implemented in custom analog VLSI, is described. Its application to problems of global optimization for dynamic assignment is discussed. The convergence properties of the neural network hardware are compared with computer simulation results. The neural network's ability to provide optimal or near-optimal solutions within only a few neuron time constants, a speed enhancement of several orders of magnitude over conventional search methods, is demonstrated. The effect of noise on the circuit dynamics and the convergence behavior of the neural network hardware is also examined.
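The following sketch mimics, in software, the Hopfield-style relaxation such hardware implements for assignment problems: neuron activations descend an energy that rewards low assignment cost and penalizes violations of the one-to-one constraints. The gain, penalty weight, and step size are illustrative, not the values used in the VLSI circuit.

```python
# Minimal sketch of Hopfield-style analog relaxation for a 4x4 assignment problem.
import numpy as np

rng = np.random.default_rng(1)
C = rng.random((4, 4))                 # assignment costs
A, tau, dt = 5.0, 1.0, 0.05            # constraint weight, time constant, step
u = rng.normal(scale=0.1, size=(4, 4)) # neuron internal states

for _ in range(2000):
    v = 1 / (1 + np.exp(-4 * u))               # activations in (0, 1)
    row_err = v.sum(1, keepdims=True) - 1      # row-sum constraint violation
    col_err = v.sum(0, keepdims=True) - 1      # column-sum constraint violation
    dE = C + A * (row_err + col_err)           # gradient of the energy function
    u += dt * (-u / tau - dE)

print(v.round(2))   # relaxes toward a low-cost, near-permutation matrix
```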
OPTIMAL WELL LOCATOR (OWL): A SCREENING TOOL FOR EVALUATING LOCATIONS OF MONITORING WELLS
The Optimal Well Locator (OWL) program was designed and developed by USEPA to be a screening tool to evaluate and optimize the placement of wells in long term monitoring networks at small sites. The first objective of the OWL program is to allow the user to visualize the change ...
Offering Global Collaboration Services beyond CERN and HEP
NASA Astrophysics Data System (ADS)
Fernandes, J.; Ferreira, P.; Baron, T.
2015-12-01
The CERN IT department has built over the years a performant and integrated ecosystem of collaboration tools, from videoconference and webcast services to event management software. These services have been designed and evolved in very close collaboration with the various communities surrounding the laboratory and have been massively adopted by CERN users. To cope with this very heavy usage, global infrastructures have been deployed which take full advantage of CERN's international and global nature. If these services and tools are instrumental in enabling the worldwide collaboration which generates major HEP breakthroughs, they would certainly also benefit other sectors of science in which globalization has already taken place. Some of these services are driven by commercial software (Vidyo or Wowza, for example); others have been developed internally and have already been made available to the world as Open Source Software, in line with CERN's spirit and mission. Indico, for example, is now installed in 100+ institutes worldwide. But providing the software is often not enough, and institutes, collaborations and project teams do not always possess the expertise or the human or material resources needed to set up and maintain such services. Regional and national institutions have to answer needs that are increasingly global and that often exceed their operational capabilities or organizational mandate, and so they are looking at existing worldwide service offers such as CERN's. We believe that the accumulated experience obtained through the operation of a large-scale worldwide collaboration service, combined with CERN's global network and its recently-deployed Agile Infrastructure, would allow the Organization to set up and operate collaborative services, such as Indico and Vidyo, at a much larger scale and on behalf of worldwide research and education institutions, and thus answer these pressing demands while optimizing resources at a global level. Such services would be built over a robust and massively scalable Indico server to which the concept of communities would be added, and which would then serve as a hub for accessing other collaboration services such as Vidyo, on the same simple and successful model currently in place for CERN users. This talk will describe this vision, its benefits and the steps that have already been taken to make it come to life.
Integrated Modeling Approach for Optimal Management of Water, Energy and Food Security Nexus
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Xiaodong; Vesselinov, Velimir Valentinov
2016-12-28
We report that water, energy and food (WEF) are inextricably interrelated. Effective planning and management of limited WEF resources to meet current and future socioeconomic demands for sustainable development is challenging. WEF production/delivery may also produce environmental impacts; as a result, greenhouse-gas emission control will impact WEF nexus management as well. Nexus management for WEF security necessitates integrated tools for predictive analysis that are capable of identifying the tradeoffs among various sectors, generating cost-effective planning and management strategies and policies. To address these needs, we have developed an integrated model analysis framework and tool called WEFO. WEFO provides a multi-period socioeconomic model for predicting how to satisfy WEF demands based on model inputs representing production costs, socioeconomic demands, and environmental controls. WEFO is applied to quantitatively analyze the interrelationships and trade-offs among system components including energy supply, electricity generation, water supply-demand, and food production, as well as mitigation of environmental impacts. WEFO is demonstrated to solve a hypothetical nexus management problem consistent with real-world management scenarios. Model parameters are analyzed using global sensitivity analysis and their effects on total system cost are quantified. Lastly, the obtained results demonstrate how these types of analyses can be helpful for decision-makers and stakeholders to make cost-effective decisions for optimal WEF management.
Dynamic least-cost optimisation of wastewater system remedial works requirements.
Vojinovic, Z; Solomatine, D; Price, R K
2006-01-01
In recent years, there has been increasing concern about wastewater system failure and the identification of an optimal set of remedial works. So far, several methodologies have been developed and applied in asset management activities by various water companies worldwide, but often with limited success. To fill the gap, several research projects have explored various algorithms to optimise remedial works requirements, but mostly for drinking water supply systems; very limited work has been carried out for wastewater assets. Some of the major deficiencies of commonly used methods lie in one or more of the following aspects: inadequate representation of system complexity, failure to incorporate a dynamic model into the decision-making loop, the choice of an appropriate optimisation technique, and experience in applying that technique. This paper addresses these issues and discusses a new approach to the optimisation of wastewater system remedial works requirements. It is proposed that the search for optimal solutions is performed by a global optimisation tool (with various random search algorithms) while system performance is simulated by a hydrodynamic pipe network model. The work on assembling all required elements and developing appropriate interface protocols between the two tools, aimed at decoding potential remedial solutions into the pipe network model and calculating the corresponding scenario costs, is currently underway.
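The proposed optimizer-simulator loop can be caricatured in a few lines: a global random search proposes remedial plans, and a stand-in for the hydrodynamic network model scores each one. The scoring stub and unit costs below are placeholders for the real simulator and asset data.

```python
# Minimal sketch: random-search optimizer wrapped around a simulator stub.
import numpy as np

rng = np.random.default_rng(2)
n_pipes = 12
rehab_cost = rng.uniform(1, 5, n_pipes)      # placeholder rehabilitation costs

def flood_penalty(plan):
    """Stub for the hydrodynamic model: unrehabilitated pipes risk flooding."""
    return 3.0 * np.sum((1 - plan) * rng.random(n_pipes) > 0.55)

best_plan, best_cost = None, np.inf
for _ in range(5000):
    plan = rng.integers(0, 2, n_pipes)       # rehabilitate pipe i: yes/no
    cost = plan @ rehab_cost + flood_penalty(plan)
    if cost < best_cost:
        best_plan, best_cost = plan, cost
print(best_plan, round(best_cost, 2))
```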
Nonstationary Extreme Value Analysis in a Changing Climate: A Software Package
NASA Astrophysics Data System (ADS)
Cheng, L.; AghaKouchak, A.; Gilleland, E.
2013-12-01
Numerous studies show that climatic extremes have increased substantially in the second half of the 20th century. For this reason, analysis of extremes under a nonstationary assumption has received a great deal of attention. This paper presents a software package developed for estimation of return levels, return periods, and risks of climatic extremes in a changing climate. This MATLAB software package offers tools for analysis of climate extremes under both stationary and nonstationary assumptions. The Nonstationary Extreme Value Analysis (hereafter, NEVA) provides an efficient and generalized framework for analyzing extremes using Bayesian inference. NEVA estimates the extreme value parameters using a Differential Evolution Markov Chain (DE-MC) approach, which combines the genetic algorithm Differential Evolution (DE) for global optimization over the real parameter space with the Markov Chain Monte Carlo (MCMC) approach, and which has the advantage of simplicity, speed of calculation and convergence over conventional MCMC. NEVA also offers confidence intervals and uncertainty bounds for estimated return levels based on the sampled parameters. NEVA integrates extreme value design concepts, data analysis tools, optimization and visualization, explicitly designed to facilitate analysis of extremes in the geosciences. The generalized input and output files of this software package make it attractive for users across different fields. Both stationary and nonstationary components of the package are validated for a number of case studies using empirical return levels. The results show that NEVA reliably describes extremes and their return levels.
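As a simplified illustration of the DE-plus-extreme-value idea (without the MCMC sampling that DE-MC adds), the sketch below fits GEV parameters to synthetic annual maxima by minimizing the negative log-likelihood with SciPy's Differential Evolution. Data, bounds, and seeds are illustrative.

```python
# Minimal sketch: GEV parameter estimation via Differential Evolution.
import numpy as np
from scipy.optimize import differential_evolution
from scipy.stats import genextreme

rng = np.random.default_rng(3)
data = genextreme.rvs(c=-0.1, loc=30, scale=5, size=200, random_state=rng)

def nll(theta):
    c, loc, scale = theta
    return -np.sum(genextreme.logpdf(data, c, loc=loc, scale=scale))

res = differential_evolution(nll,
                             bounds=[(-0.5, 0.5), (10, 50), (0.1, 20)],
                             seed=0)
print(res.x)   # (shape, location, scale) estimates
```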
3Drefine: an interactive web server for efficient protein structure refinement.
Bhattacharya, Debswapna; Nowotny, Jackson; Cao, Renzhi; Cheng, Jianlin
2016-07-08
3Drefine is an interactive web server for consistent and computationally efficient protein structure refinement with the capability to perform web-based statistical and visual analysis. The 3Drefine refinement protocol utilizes iterative optimization of the hydrogen-bonding network combined with atomic-level energy minimization on the optimized model using composite physics- and knowledge-based force fields for efficient protein structure refinement. The method has been extensively evaluated on blind CASP experiments as well as on large-scale and diverse benchmark datasets, and it exhibits consistent improvement over the initial structure in both global and local structural quality measures. The 3Drefine web server allows for convenient protein structure refinement through a text or file input submission, email notification, and a provided example submission, and it is freely available without any registration requirement. The server also provides comprehensive analysis of submissions through various energy and statistical feedback and interactive visualization of multiple refined models through the JSmol applet, which is equipped with numerous protein model analysis tools. The web server has been extensively tested and used by many users. As a result, the 3Drefine web server conveniently provides a useful tool easily accessible to the community. The 3Drefine web server has been made publicly available at the URL: http://sysbio.rnet.missouri.edu/3Drefine/. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
Cereda, Carlo W; Christensen, Søren; Campbell, Bruce Cv; Mishra, Nishant K; Mlynash, Michael; Levi, Christopher; Straka, Matus; Wintermark, Max; Bammer, Roland; Albers, Gregory W; Parsons, Mark W; Lansberg, Maarten G
2016-10-01
Differences in research methodology have hampered the optimization of Computed Tomography Perfusion (CTP) for identification of the ischemic core. We aim to optimize CTP core identification using a novel benchmarking tool. The benchmarking tool consists of an imaging library and a statistical analysis algorithm to evaluate the performance of CTP. The tool was used to optimize and evaluate an in-house developed CTP software algorithm. Imaging data of 103 acute stroke patients were included in the benchmarking tool. Median time from stroke onset to CT was 185 min (IQR 180-238), and the median time between completion of CT and start of MRI was 36 min (IQR 25-79). Volumetric accuracy of the CTP-derived regions of interest (CTP-ROIs) was optimal at an rCBF threshold of <38%; at this threshold, the mean difference was 0.3 ml (SD 19.8 ml), the mean absolute difference was 14.3 (SD 13.7) ml, and CTP was 67% sensitive and 87% specific for identification of DWI-positive tissue voxels. The benchmarking tool can play an important role in optimizing CTP software as it provides investigators with a novel method to directly compare the performance of alternative CTP software packages. © The Author(s) 2015.
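The threshold search behind the reported rCBF cutoff can be sketched as a sweep that compares CTP core masks against DWI ground truth and scores the volumetric error per patient. The synthetic arrays below stand in for the imaging library; only the sweep logic is the point.

```python
# Minimal sketch: sweep an rCBF cutoff and score volumetric error vs. DWI.
import numpy as np

rng = np.random.default_rng(4)
rcbf = rng.uniform(0, 1.5, (50, 64, 64))                   # relative CBF, 50 "patients"
dwi_core = rcbf + rng.normal(0, 0.15, rcbf.shape) < 0.40   # synthetic ground truth

voxel_ml = 0.008
def mean_abs_volume_error(threshold):
    ctp_core = rcbf < threshold
    diff = (ctp_core.sum(axis=(1, 2)) - dwi_core.sum(axis=(1, 2))) * voxel_ml
    return np.abs(diff).mean()

thresholds = np.linspace(0.2, 0.6, 21)
best = thresholds[np.argmin([mean_abs_volume_error(t) for t in thresholds])]
print(round(best, 2))   # the cutoff with the smallest mean absolute volume error
```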
Eilbeigi, Elnaz; Setarehdan, Seyed Kamaledin
2018-05-26
Brain-computer interfaces (BCIs) are a promising tool in neurorehabilitation. The intention to perform a motor action can be detected from brain signals and used to control robotic devices. Most previous studies have focused on the starting of movements from a resting state, while in daily life activities, motions occur continuously and the neural activities correlated to the evolving movements are yet to be investigated. First, we investigate the existence of neural correlates of the intention to replace an object on the table during a holding phase. Next, we present a new method to extract movement-related cortical potentials (MRCP) from single-trial EEG. A novel method called Global optimal constrained ICA (GocICA) is proposed to overcome the limitations of constrained ICA (cICA); it is implemented using Particle Swarm Optimization (PSO) and Charged System Search (CSS) techniques. GocICA was then utilized for decoding the intention to grasp and lift and the intention to replace movements, and the results were compared. It was found that GocICA significantly improves intention detection performance. The best results in offline detection were obtained with CSS-cICA for both kinds of intentions. Furthermore, pseudo-online decoding showed that GocICA was able to predict both intentions before the onset of the related movements with the highest probability. Decoding of the next movement intention during the current movement is thus possible, which can be used to create more natural neuroprostheses. The results demonstrate that GocICA is a promising new algorithm for single-trial MRCP detection which can also be used for detecting other types of ERPs, such as P300. Copyright © 2018 Elsevier Ltd. All rights reserved.
Parallel Computational Protein Design.
Zhou, Yichao; Donald, Bruce R; Zeng, Jianyang
2017-01-01
Computational structure-based protein design (CSPD) is an important problem in computational biology, which aims to design or improve a prescribed protein function based on a protein structure template. It provides a practical tool for real-world protein engineering applications. A popular CSPD method that is guaranteed to find the global minimum-energy conformation (GMEC) is to combine dead-end elimination (DEE) with an A* tree search algorithm. However, in this framework, the A* search algorithm can run in exponential time in the worst case, which may become the computational bottleneck of a large-scale computational protein design process. To address this issue, we extend and add a new module to the OSPREY program that was previously developed in the Donald lab (Gainza et al., Methods Enzymol 523:87, 2013) to implement a GPU-based massively parallel A* algorithm for improving the protein design pipeline. By exploiting the modern GPU computational framework and optimizing the computation of the heuristic function for the A* search, our new program, called gOSPREY, can provide up to four orders of magnitude speedup in large protein design cases with a small memory overhead compared to the traditional A* search algorithm implementation, while still guaranteeing optimality. In addition, gOSPREY can be configured to run in a bounded-memory mode to tackle problems in which the conformation space is too large and the global optimal solution could not be computed previously. Furthermore, the GPU-based A* algorithm implemented in the gOSPREY program can be combined with state-of-the-art rotamer pruning algorithms such as iMinDEE (Gainza et al., PLoS Comput Biol 8:e1002335, 2012) and DEEPer (Hallen et al., Proteins 81:18-39, 2013) to also consider continuous backbone and side-chain flexibility.
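A toy version of the A* stage of the DEE/A* framework is shown below: partial rotamer assignments are expanded in order of g (energy assigned so far) plus an admissible lower bound h. Energies are random placeholders; a real pipeline computes molecular-mechanics terms and prunes with DEE first.

```python
# Minimal sketch: A* over rotamer assignments with an admissible heuristic.
import heapq
import numpy as np

rng = np.random.default_rng(5)
n_pos, n_rot = 5, 4
E_self = rng.random((n_pos, n_rot))            # per-position rotamer energies
E_pair = rng.random((n_pos, n_pos, n_rot, n_rot))  # nonnegative pair energies

def g(assign):
    e = sum(E_self[i, r] for i, r in enumerate(assign))
    e += sum(E_pair[i, j, assign[i], assign[j]]
             for i in range(len(assign)) for j in range(i))
    return e

def h(assign):
    # admissible bound: best self energy for each unassigned position,
    # ignoring (nonnegative) pair terms
    return sum(E_self[i].min() for i in range(len(assign), n_pos))

heap = [(h(()), ())]
while heap:
    f, assign = heapq.heappop(heap)
    if len(assign) == n_pos:
        print(assign, round(f, 3))   # global minimum-energy conformation
        break
    for r in range(n_rot):
        nxt = assign + (r,)
        heapq.heappush(heap, (g(nxt) + h(nxt), nxt))
```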
Kasper, Jennifer; Greene, Jeremy A; Farmer, Paul E; Jones, David S
2016-05-01
As physicians work to achieve optimal health outcomes for their patients, they often struggle to address the issues that arise outside the clinic. Social, economic, and political factors influence patients' burden of disease, access to treatment, and health outcomes. This challenge has motivated recent calls for increased attention to the social determinants of health. At the same time, advocates have called for increased attention to global health. Each year, more U.S. medical students participate in global health experiences. Yet, the global health training that is available varies widely. The discipline of social medicine, which attends to the social determinants of disease, social meanings of disease, and social responses to disease, offers a solution to both challenges. The analyses and techniques of social medicine provide an invaluable toolkit for providing health care in the United States and abroad. In 2007, Harvard Medical School implemented a new course, required for all first-year students, that teaches social medicine in a way that integrates global health. In this article, the authors argue for the importance of including social medicine and global health in the preclinical curriculum; describe Harvard Medical School's innovative, integrated approach to teaching these disciplines, which can be used at other medical schools; and explore the barriers that educators may face in implementing such a curriculum, including resistance from students. Such a course can equip medical students with the knowledge and tools that they will need to address complex health problems in the United States and abroad.
Watershed Management Optimization Support Tool (WMOST) v1: User Manual and Case Study Examples
The Watershed Management Optimization Support Tool (WMOST) is intended to be used as a screening tool as part of an integrated watershed management process such as that described in EPA’s watershed planning handbook (EPA 2008). The objective of WMOST is to serve as a public-doma...
NASA Astrophysics Data System (ADS)
Vasu, M.; Shivananda, Nayaka H.
2018-04-01
EN47 steel samples are machined on a self-centered lathe using chemical vapor deposition (CVD)-coated TiCN/Al2O3/TiN and uncoated tungsten carbide tool inserts with a nose radius of 0.8 mm. Results are compared with each other and optimized using statistical tools. The input (cutting) parameters considered in this work are feed rate (f), cutting speed (Vc), and depth of cut (ap); the optimization is based on a Taguchi L9 orthogonal array. The ANOVA method is adopted to evaluate the statistical significance of, and the percentage contribution for, each model term. Multiple response characteristics, namely cutting force (Fz), tool tip temperature (T) and surface roughness (Ra), are evaluated. The results show that the coated tool insert (TiCN/Al2O3/TiN) performs 1.27 and 1.29 times better than the uncoated tool insert for tool tip temperature and surface roughness, respectively. A slight increase in cutting force was observed for coated tools.
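The Taguchi analysis used above can be reproduced in outline: compute smaller-is-better S/N ratios over an L9 array and average them per factor level to rank settings. The response values below are placeholders, not the paper's measurements.

```python
# Minimal sketch: Taguchi smaller-is-better S/N analysis over an L9 array.
import numpy as np

# L9 orthogonal array: columns = (cutting speed, feed, depth of cut), levels 0-2
L9 = np.array([[0,0,0],[0,1,1],[0,2,2],[1,0,1],[1,1,2],
               [1,2,0],[2,0,2],[2,1,0],[2,2,1]])
Ra = np.array([1.8, 2.1, 2.6, 1.6, 2.2, 2.0, 1.9, 1.7, 2.3])  # placeholder responses

sn = -10 * np.log10(Ra**2)   # smaller-is-better S/N ratio, one replicate per run
for col, name in enumerate(["Vc", "f", "ap"]):
    means = [sn[L9[:, col] == lvl].mean() for lvl in range(3)]
    print(name, np.round(means, 2), "-> best level", int(np.argmax(means)))
```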
Efficient global fiber tracking on multi-dimensional diffusion direction maps
NASA Astrophysics Data System (ADS)
Klein, Jan; Köhler, Benjamin; Hahn, Horst K.
2012-02-01
Global fiber tracking algorithms have recently been proposed which were able to compute results of unprecedented quality. They account for avoiding accumulation errors by a global optimization process at the cost of a high computation time of several hours or even days. In this paper, we introduce a novel global fiber tracking algorithm which, for the first time, globally optimizes the underlying diffusion direction map obtained from DTI or HARDI data, instead of single fiber segments. As a consequence, the number of iterations in the optimization process can drastically be reduced by about three orders of magnitude. Furthermore, in contrast to all previous algorithms, the density of the tracked fibers can be adjusted after the optimization within a few seconds. We evaluated our method for diffusion-weighted images obtained from software phantoms, healthy volunteers, and tumor patients. We show that difficult fiber bundles, e.g., the visual pathways or tracts for different motor functions can be determined and separated in an excellent quality. Furthermore, crossing and kissing bundles are correctly resolved. On current standard hardware, a dense fiber tracking result of a whole brain can be determined in less than half an hour which is a strong improvement compared to previous work.
Agroforestry landscapes and global change: landscape ecology tools for management and conservation
Guillermo Martinez Pastur; Emilie Andrieu; Louis R. Iverson; Pablo Luis Peri
2012-01-01
Forest ecosystems are impacted by multiple uses under the influence of global drivers, and landscape ecology tools may substantially facilitate the management and conservation of agroforestry ecosystems. The use of landscape ecology tools is described in the eight papers of the present special issue, including changes in forested landscapes due to...
ERIC Educational Resources Information Center
Glover, Alison; Peters, Carl; Haslett, Simon K.
2011-01-01
Purpose: The purpose of this paper is to test the validity of the curriculum auditing tool Sustainability Tool for Auditing University Curricula in Higher Education (STAUNCH©), which was designed to audit the education for sustainability and global citizenship content of higher education curricula. The Welsh Assembly Government aspires to…
Open Source GIS based integrated watershed management
NASA Astrophysics Data System (ADS)
Byrne, J. M.; Lindsay, J.; Berg, A. A.
2013-12-01
Optimal land and water management to address future and current resource stresses and allocation challenges requires the development of state-of-the-art geomatics and hydrological modelling tools. Future hydrological modelling tools should be high-resolution and process-based, with real-time capability to assess changing resource issues critical to short-, medium- and long-term environmental management. The objective here is to merge two renowned, well-published resource modelling programs to create an open-source toolbox for integrated land and water management applications. This work will facilitate a much increased efficiency in land and water resource security, management and planning. Following an 'open-source' philosophy, the tools will be computer-platform independent with source code freely available, maximizing knowledge transfer and the global value of the proposed research. The envisioned set of water resource management tools will be housed within 'Whitebox Geospatial Analysis Tools'. Whitebox is an open-source geographical information system (GIS) developed by Dr. John Lindsay at the University of Guelph. The emphasis of the Whitebox project has been to develop a user-friendly interface for advanced spatial analysis in environmental applications. The plugin architecture of the software is ideal for the tight integration of spatially distributed models and spatial analysis algorithms such as those contained within the GENESYS suite. Open-source development extends knowledge and technology transfer to a broad range of end-users and builds Canadian capability to address complex resource management problems with better tools and expertise for managers in Canada and around the world. GENESYS (Generate Earth Systems Science input) is an innovative, efficient, high-resolution hydro- and agro-meteorological model for complex-terrain watersheds developed under the direction of Dr. James Byrne. GENESYS is an outstanding research and applications tool to address challenging resource management issues in industry, government and nongovernmental agencies. Current research and analysis tools were developed to manage meteorological, climatological, and land and water resource data efficiently at high resolution in space and time. The deliverable for this work is a Whitebox-GENESYS open-source resource management capacity with routines for GIS-based watershed management, including water in agriculture and food production. We are adding urban water management routines through GENESYS in 2013-15 with an engineering PhD candidate. Both Whitebox-GAT and GENESYS are already well-established tools. The proposed research will combine these products to create an open-source geomatics-based water resource management tool that is revolutionary in both capacity and availability to a wide array of Canadian and global users.
Run-of-river power plants in Alpine regions: whither optimal capacity?
NASA Astrophysics Data System (ADS)
Lazzaro, Gianluca; Botter, Gianluca
2015-04-01
Hydropower is the major renewable electricity generation technology worldwide. The future expansion of this technology mostly relies on the development of small run-of-river projects, in which a fraction of the running flows is diverted from the river to a turbine for energy production. Even though small hydro inflicts a smaller impact on aquatic ecosystems and local communities compared to large dams, it cannot prevent stresses on plant, animal, and human well-being. This is especially true in mountain regions where the plant outflow is located several kilometers downstream of the intake, thereby depleting river reaches of considerable length. Moreover, the negative cumulative effects of run-of-river systems operating along the same river threaten the ability of stream networks to supply ecological corridors for plants, invertebrates or fishes, and to support biodiversity. Research in this area is severely lacking. Therefore, the prediction of the long-term impacts associated with the expansion of run-of-river projects induced by global-scale incentive policies remains highly uncertain. This contribution aims at providing objective tools to address the preliminary choice of the capacity of a run-of-river hydropower plant when the economic value of the plant and the alteration of the flow regime are simultaneously accounted for. This is done using the concepts of Pareto optimality and Pareto dominance, which are powerful tools suited to multi-objective optimization in the presence of conflicting goals, such as the maximization of profitability and the minimization of the hydrologic disturbance induced by the plant in the river reach between the intake and the outflow. The application to a set of case studies belonging to the Piave River basin (Italy) suggests that optimal solutions are strongly dependent on the natural flow regime at the plant intake. While in some cases (namely, reduced streamflow variability) the optimal trade-off between economic profitability and hydrologic disturbance is well identified, in other cases (enhanced streamflow variability) multiple options and/or ranges of optimal capacities may be devised. Such alternatives offer water managers an objective basis to identify optimal allocation of resources and policy actions. Small hydro technology is likely to gain a higher social value in the next decades if the environmental and hydrologic footprint associated with the energetic exploitation of surface water takes a higher priority in civil infrastructure planning.
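The Pareto screening described above reduces to a dominance filter over candidate capacities, as the sketch below shows. The two objective curves are invented shapes (diminishing profit, growing alteration), not Piave-basin results.

```python
# Minimal sketch: Pareto-dominance filter over candidate plant capacities.
import numpy as np

capacity = np.linspace(0.5, 10, 40)                 # m^3/s diverted (candidates)
profit = np.log1p(capacity) - 0.25 * capacity       # toy profit, peaks mid-range
alteration = capacity / (1 + 0.3 * capacity)        # toy flow-regime disturbance

def pareto_mask(profit, alteration):
    keep = np.ones(len(profit), dtype=bool)
    for i in range(len(profit)):
        dominated = (profit >= profit[i]) & (alteration <= alteration[i]) \
                    & ((profit > profit[i]) | (alteration < alteration[i]))
        keep[i] = not dominated.any()
    return keep

print(capacity[pareto_mask(profit, alteration)])    # Pareto-optimal capacities
```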
Preliminary Development of an Object-Oriented Optimization Tool
NASA Technical Reports Server (NTRS)
Pak, Chan-gi
2011-01-01
The National Aeronautics and Space Administration Dryden Flight Research Center has developed a FORTRAN-based object-oriented optimization (O3) tool that leverages existing tools and practices and allows easy integration and adoption of new state-of-the-art software. The object-oriented framework can integrate the analysis codes for multiple disciplines, as opposed to relying on one code to perform analysis for all disciplines. Optimization can thus take place within each discipline module, or in a loop between the central executive module and the discipline modules, or both. Six sample optimization problems are presented. The first four sample problems are based on simple mathematical equations; the fifth and sixth problems consider a three-bar truss, which is a classical example in structural synthesis. Instructions for preparing input data for the O3 tool are presented.
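The three-bar truss mentioned in the fifth and sixth sample problems is a standard structural-synthesis benchmark; a minimal sketch of it, solved here with a generic NLP method rather than the O3 tool itself, follows. The load, stress limit, and bar length follow the common textbook statement of the problem.

```python
# Minimal sketch: classical three-bar truss weight minimization with stress limits.
import numpy as np
from scipy.optimize import minimize

P, sigma, L = 2.0, 2.0, 100.0        # load, allowable stress, bar length

def weight(x):
    a1, a2 = x
    return L * (2 * np.sqrt(2) * a1 + a2)

def stress_margins(x):               # all must be >= 0 at a feasible design
    a1, a2 = x
    d = np.sqrt(2) * a1**2 + 2 * a1 * a2
    return [sigma - P * (np.sqrt(2) * a1 + a2) / d,
            sigma - P * a2 / d,
            sigma - P / (a1 + np.sqrt(2) * a2)]

res = minimize(weight, [0.5, 0.5], method="SLSQP",
               bounds=[(1e-3, 1.0)] * 2,
               constraints={"type": "ineq", "fun": stress_margins})
print(res.x.round(4), round(res.fun, 2))   # ~ (0.789, 0.408), weight ~ 263.9
```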
Lukashin, A V; Fuchs, R
2001-05-01
Cluster analysis of genome-wide expression data from DNA microarray hybridization studies has proved to be a useful tool for identifying biologically relevant groupings of genes and samples. In the present paper, we focus on several important issues related to clustering algorithms that have not yet been fully studied. We describe a simple and robust algorithm for the clustering of temporal gene expression profiles that is based on the simulated annealing procedure. In general, this algorithm is guaranteed to eventually find the globally optimal distribution of genes over clusters. We introduce an iterative scheme that serves to evaluate quantitatively the optimal number of clusters for each specific data set. The scheme is based on standard approaches used in regular statistical tests. The basic idea is to organize the search for the optimal number of clusters simultaneously with the optimization of the distribution of genes over clusters. The efficiency of the proposed algorithm has been evaluated by means of a reverse engineering experiment, that is, a situation in which the correct distribution of genes over clusters is known a priori. The employment of this statistically rigorous test has shown that our algorithm places greater than 90% of genes into correct clusters. Finally, the algorithm has been tested on real gene expression data (expression changes during the yeast cell cycle) for which the fundamental patterns of gene expression and the assignment of genes to clusters are well understood from numerous previous studies.
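In outline, the annealing procedure looks like the sketch below: propose relocating one profile to another cluster, accept with the Metropolis rule, and cool geometrically. The within-cluster sum of squares stands in for the paper's objective, and the data are synthetic.

```python
# Minimal sketch: simulated-annealing clustering of expression profiles.
import numpy as np

rng = np.random.default_rng(6)
profiles = np.vstack([rng.normal(m, 0.3, (30, 8)) for m in (-1, 0, 1)])
k, n = 3, len(profiles)
labels = rng.integers(0, k, n)

def cost(labels):
    return sum(((profiles[labels == c] - profiles[labels == c].mean(0))**2).sum()
               for c in range(k) if (labels == c).any())

T, current = 5.0, cost(labels)
for step in range(20000):
    i, c_new = rng.integers(n), rng.integers(k)   # move one gene to a new cluster
    trial = labels.copy()
    trial[i] = c_new
    delta = cost(trial) - current
    if delta < 0 or rng.random() < np.exp(-delta / T):   # Metropolis acceptance
        labels, current = trial, current + delta
    T *= 0.9997                                   # geometric cooling schedule
print(round(current, 2), np.bincount(labels))
```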
NASA Astrophysics Data System (ADS)
Shoemaker, C. A.; Pang, M.; Akhtar, T.; Bindel, D.
2016-12-01
New parallel surrogate global optimization algorithms are developed and applied to objective functions that are expensive simulations (possibly with multiple local minima). The algorithms can be applied to most geophysical simulations, including those with nonlinear partial differential equations. The optimization does not require that the simulations themselves be parallelized. Asynchronous (and synchronous) parallel execution is available in the optimization toolbox "pySOT". The parallel algorithms are modified from their serial versions to eliminate fine-grained parallelism. The optimization is computed with the open-source software pySOT, a Surrogate Global Optimization Toolbox that allows the user to pick the type of surrogate (or ensembles), the search procedure on the surrogate, and the type of parallelism (synchronous or asynchronous). pySOT also allows the user to develop new algorithms by modifying parts of the code. In the applications here, the objective function takes up to 30 minutes for one simulation, and serial optimization can take over 200 hours. Results from the Yellowstone (NSF) and NCSS (Singapore) supercomputers are given for groundwater contaminant hydrology simulations, with applications to model parameter estimation and decontamination management. All results are compared with alternatives. The first results are for optimization of pumping at many wells to reduce the cost of decontamination of groundwater at a superfund site. The optimization runs with up to 128 processors. Superlinear speedup is obtained for up to 16 processors, and efficiency with 64 processors is over 80%. Each evaluation of the objective function requires the solution of nonlinear partial differential equations to describe the impact of spatially distributed pumping and model parameters on model predictions for the spatial and temporal distribution of groundwater contaminants. The second application uses asynchronous parallel global optimization for groundwater quality model calibration. The time for a single objective function evaluation varies unpredictably, so efficiency is improved with asynchronous parallel calculations to improve load balancing. The third application (done at NCSS) incorporates new global surrogate multi-objective parallel search algorithms into pySOT and applies them to a large watershed calibration problem.
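Conceptually, the serial core that pySOT parallelizes is a fit-search-evaluate loop like the one below (written against generic SciPy, not pySOT's own API). The quadratic test function is a stand-in for an expensive PDE-based simulation, and the pure candidate-search step omits the exploration-exploitation balancing that pySOT's strategies provide.

```python
# Minimal sketch: serial surrogate global optimization loop with an RBF surrogate.
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(7)

def expensive(x):                    # placeholder for a long-running simulation
    return float(np.sum((x - 0.3)**2))

dim, n_init, budget = 4, 8, 40
X = rng.uniform(0, 1, (n_init, dim))
y = np.array([expensive(x) for x in X])

while len(X) < budget:
    surrogate = RBFInterpolator(X, y, kernel="cubic")  # fit surrogate to all data
    cand = rng.uniform(0, 1, (2000, dim))              # cheap candidate search
    x_new = cand[np.argmin(surrogate(cand))]           # most promising candidate
    X = np.vstack([X, x_new])
    y = np.append(y, expensive(x_new))

print(X[np.argmin(y)].round(3), round(y.min(), 5))
```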
Resistance Genes in Global Crop Breeding Networks.
Garrett, K A; Andersen, K F; Asche, F; Bowden, R L; Forbes, G A; Kulakow, P A; Zhou, B
2017-10-01
Resistance genes are a major tool for managing crop diseases. The networks of crop breeders who exchange resistance genes and deploy them in varieties help to determine the global landscape of resistance and epidemics, an important system for maintaining food security. These networks function as a complex adaptive system, with associated strengths and vulnerabilities, and implications for policies to support resistance gene deployment strategies. Extensions of epidemic network analysis can be used to evaluate the multilayer agricultural networks that support and influence crop breeding networks. Here, we evaluate the general structure of crop breeding networks for cassava, potato, rice, and wheat. All four are clustered due to phytosanitary and intellectual property regulations, and linked through CGIAR hubs. Cassava networks primarily include public breeding groups, whereas others are more mixed. These systems must adapt to global change in climate and land use, the emergence of new diseases, and disruptive breeding technologies. Research priorities to support policy include how best to maintain both diversity and redundancy in the roles played by individual crop breeding groups (public versus private and global versus local), and how best to manage connectivity to optimize resistance gene deployment while avoiding risks to the useful life of resistance genes. Copyright © 2017 The Author(s). This is an open access article distributed under the CC BY 4.0 International license.
Global dynamic optimization approach to predict activation in metabolic pathways.
de Hijas-Liste, Gundián M; Klipp, Edda; Balsa-Canto, Eva; Banga, Julio R
2014-01-06
During the last decade, a number of authors have shown that the genetic regulation of metabolic networks may follow optimality principles. Optimal control theory has been successfully used to compute optimal enzyme profiles for simple metabolic pathways. However, applying this optimal control framework to more general networks (e.g. branched networks, or networks incorporating enzyme production dynamics) yields problems that are analytically intractable and/or numerically very challenging. Further, these previous studies have only considered a single-objective framework. In this work we consider a more general multi-objective formulation and we present solutions based on recent developments in global dynamic optimization techniques. We illustrate the performance and capabilities of these techniques on two sets of problems. First, we consider a set of single-objective examples of increasing complexity taken from the recent literature. We analyze the multimodal character of the associated nonlinear optimization problems, and we also evaluate different global optimization approaches in terms of numerical robustness, efficiency and scalability. Second, we consider generalized multi-objective formulations for several examples, and we show how this framework yields more biologically meaningful results. The proposed strategy was used to solve a set of single-objective case studies related to unbranched and branched metabolic networks of different levels of complexity. All problems were successfully solved in reasonable computation times with our global dynamic optimization approach, reaching solutions which were comparable to or better than those reported in the previous literature. Further, we considered, for the first time, multi-objective formulations, illustrating how activation in metabolic pathways can be explained in terms of the best trade-offs between conflicting objectives. This new methodology can be applied to metabolic networks with arbitrary topologies, nonlinear dynamics and constraints.
Has the time come for big science in wildlife health?
Sleeman, Jonathan M.
2013-01-01
The consequences of wildlife emerging diseases are global and profound with increased burden on the public health system, negative impacts on the global economy, declines and extinctions of wildlife species, and subsequent loss of ecological integrity. Examples of health threats to wildlife include Batrachochytrium dendrobatidis, which causes a cutaneous fungal infection of amphibians and is linked to declines of amphibians globally; and the recently discovered Pseudogymnoascus (Geomyces) destructans, the etiologic agent of white nose syndrome which has caused precipitous declines of North American bat species. Of particular concern are the novel pathogens that have emerged as they are particularly devastating and challenging to manage. A big science approach to wildlife health research is needed if we are to make significant and enduring progress in managing these diseases. The advent of new analytical models and bench assays will provide us with the mathematical and molecular tools to identify and anticipate threats to wildlife, and understand the ecology and epidemiology of these diseases. Specifically, new molecular diagnostic techniques have opened up avenues for pathogen discovery, and the application of spatially referenced databases allows for risk assessments that can assist in targeting surveillance. Long-term, systematic collection of data for wildlife health and integration with other datasets is also essential. Multidisciplinary research programs should be expanded to increase our understanding of the drivers of emerging diseases and allow for the development of better disease prevention and management tools, such as vaccines. Finally, we need to create a National Fish and Wildlife Health Network that provides the operational framework (governance, policies, procedures, etc.) by which entities with a stake in wildlife health cooperate and collaborate to achieve optimal outcomes for human, animal, and ecosystem health.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Modiri, A; Gu, X; Sawant, A
2014-06-15
Purpose: We present a particle swarm optimization (PSO)-based 4D IMRT planning technique designed for dynamic MLC tracking delivery to lung tumors. The key idea is to utilize the temporal dimension as an additional degree of freedom rather than a constraint in order to achieve improved sparing of organs at risk (OARs). Methods: The target and normal structures were manually contoured on each of the ten phases of a 4DCT scan acquired from a lung SBRT patient who exhibited 1.5 cm tumor motion despite the use of abdominal compression. Ten corresponding IMRT plans were generated using the Eclipse treatment planning system. These plans served as initial guess solutions for the PSO algorithm. Fluence weights were optimized over the entire solution space, i.e., 10 phases × 12 beams × 166 control points. The size of the solution space motivated our choice of PSO, which is a highly parallelizable stochastic global optimization technique that is well-suited for such large problems. A summed fluence map was created using an in-house B-spline deformable image registration. Each plan was compared with a corresponding, internal target volume (ITV)-based IMRT plan. Results: The PSO 4D IMRT plan yielded comparable PTV coverage and significantly higher dose sparing for parallel and serial OARs compared to the ITV-based plan. The dose sparing achieved via PSO-4DIMRT was: lung Dmean = 28%; lung V20 = 90%; spinal cord Dmax = 23%; esophagus Dmax = 31%; heart Dmax = 51%; heart Dmean = 64%. Conclusion: Truly 4D IMRT that uses the temporal dimension as an additional degree of freedom can achieve significant dose sparing of serial and parallel OARs. Given the large solution space, PSO represents an attractive, parallelizable tool to achieve globally optimal solutions for such problems. This work was supported through funding from the National Institutes of Health and Varian Medical Systems. Amit Sawant has research funding from Varian Medical Systems, VisionRT Ltd. and Elekta.
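Stripped of the clinical machinery, the PSO engine referenced above is a few lines of vectorized updates; the sketch below minimizes a generic objective in place of a fluence-map cost. The inertia and acceleration coefficients are common textbook defaults, not the study's settings.

```python
# Minimal sketch: particle swarm optimization on a generic objective.
import numpy as np

rng = np.random.default_rng(8)

def objective(x):                    # placeholder for the 4D plan cost
    return np.sum(x**2, axis=1)

n_particles, dim, iters = 30, 10, 200
w, c1, c2 = 0.7, 1.5, 1.5            # inertia, cognitive and social coefficients
x = rng.uniform(-5, 5, (n_particles, dim))
v = np.zeros_like(x)
pbest, pbest_val = x.copy(), objective(x)
gbest = pbest[np.argmin(pbest_val)]

for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, dim))
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = x + v
    val = objective(x)
    improved = val < pbest_val
    pbest[improved], pbest_val[improved] = x[improved], val[improved]
    gbest = pbest[np.argmin(pbest_val)]

print(gbest.round(3))
```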
Interior search algorithm (ISA): a novel approach for global optimization.
Gandomi, Amir H
2014-07-01
This paper presents the interior search algorithm (ISA) as a novel method for solving optimization tasks. The proposed ISA is inspired by interior design and decoration. The algorithm is different from other metaheuristic algorithms and provides new insight for global optimization. The proposed method is verified using benchmark mathematical and engineering problems commonly used in the area of optimization. ISA results are further compared with those of well-known optimization algorithms; the results show that the ISA is efficiently capable of solving optimization problems and can outperform other well-known algorithms. Further, the proposed algorithm is very simple and has only one parameter to tune. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.
Semidefinite Relaxation-Based Optimization of Multiple-Input Wireless Power Transfer Systems
NASA Astrophysics Data System (ADS)
Lang, Hans-Dieter; Sarris, Costas D.
2017-11-01
An optimization procedure for multi-transmitter (MISO) wireless power transfer (WPT) systems based on tight semidefinite relaxation (SDR) is presented. This method ensures the physical realizability of MISO WPT systems designed via convex optimization, a robust, semi-analytical and intuitive route to optimizing such systems. To that end, the nonconvex constraints requiring that power is fed into rather than drawn from the system via all transmitter ports are incorporated in a convex semidefinite relaxation, which is efficiently and reliably solvable by dedicated algorithms. A test of the solution then confirms that this modified problem is equivalent to the original (nonconvex) one, i.e. the relaxation is tight, and that the true global optimum has been found. This is a clear advantage over global optimization methods (e.g. genetic algorithms), where convergence to the true global optimum cannot be ensured or tested. Discussions of numerical results yielded by both the closed-form expressions and the refined technique illustrate the importance and practicability of the new method. It is shown that this technique offers a rigorous optimization framework for a broad range of current and emerging WPT applications.
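The SDR pattern the paper builds on can be sketched with a generic convex-optimization library: lift xx^T to a PSD matrix X, solve the relaxed semidefinite program, and check whether X is numerically rank one (i.e., the relaxation is tight). The matrices below are toys, not a WPT coupling model, and the use of cvxpy is an assumption made for illustration.

```python
# Minimal sketch: semidefinite relaxation of a quadratic problem + tightness check.
import cvxpy as cp
import numpy as np

rng = np.random.default_rng(9)
n = 4
A = rng.normal(size=(n, n)); A = A + A.T   # toy objective (e.g. delivered power)
B = np.eye(n)                              # toy constraint (e.g. input power budget)

X = cp.Variable((n, n), PSD=True)          # relaxation of the rank-one x x^T
prob = cp.Problem(cp.Maximize(cp.trace(A @ X)), [cp.trace(B @ X) <= 1])
prob.solve()

eigvals = np.linalg.eigvalsh(X.value)
print("tight (rank one):", eigvals[-1] / eigvals.sum() > 0.99)
```

When the test passes, the optimizer of the original nonconvex problem can be recovered from the dominant eigenvector of X, which is the step the tightness check justifies.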
Flyby Geometry Optimization Tool
NASA Technical Reports Server (NTRS)
Karlgaard, Christopher D.
2007-01-01
The Flyby Geometry Optimization Tool is a computer program for computing trajectories and trajectory-altering impulsive maneuvers for spacecraft used in radio relay of scientific data to Earth from an exploratory airplane flying in the atmosphere of Mars.
Data and Tools | Concentrating Solar Power | NREL
Solar Power tower Integrated Layout and Optimization Tool (SolarPILOT™), available for download: SolarPILOT combines the rapid layout and optimization capability of the analytical DELSOL3 program with the accuracy and ...
Watershed Management Optimization Support Tool (WMOST) Webinar
This webinar will highlight version 3 of EPA’s Watershed Management Optimization Support Tool (WMOST). WMOST facilitates implementation of integrated water management by communities, utilities, watershed management organizations, consultants, and others. There can be many o...
DOT National Transportation Integrated Search
2015-09-01
This report describes an Alternative Fuel Transportation Optimization Tool (AFTOT), developed by the U.S. Department of Transportation (DOT) Volpe National Transportation Systems Center (Volpe) in support of the Federal Aviation Administration (FAA)....
GRA prospectus: optimizing design and management of protected areas
Bernknopf, Richard; Halsing, David
2001-01-01
Protected areas comprise one major type of global conservation effort that has been in the form of parks, easements, or conservation concessions. Though protected areas are increasing in number and size throughout tropical ecosystems, there is no systematic method for optimally targeting specific local areas for protection, designing the protected area, and monitoring it, or for guiding follow-up actions to manage it or its surroundings over the long run. Without such a system, conservation projects often cost more than necessary and/or risk protecting ecosystems and biodiversity less efficiently than desired. Correcting these failures requires tools and strategies for improving the placement, design, and long-term management of protected areas. The objective of this project is to develop a set of spatially based analytical tools to improve the selection, design, and management of protected areas. In this project, several conservation concessions will be compared using an economic optimization technique. The forest land use portfolio model is an integrated assessment that measures investment in different land uses in a forest. The case studies of individual tropical ecosystems are developed as forest (land) use and preservation portfolios in a geographic information system (GIS). Conservation concessions involve a private organization purchasing development and resource access rights in a certain area and retiring them. Forests are put into conservation, and those people who would otherwise have benefited from extracting resources or selling the right to do so are compensated. Concessions are legal agreements wherein the exact amount and nature of the compensation result from a negotiated agreement between an agent of the conservation community and the local community. Funds are placed in a trust fund, and annual payments are made to local communities and regional/national governments. The payments are made pending third-party verification that the forest expanse and quality have been maintained.
Soaring energetics and glide performance in a moving atmosphere
Reynolds, Kate V.; Thomas, Adrian L. R.
2016-01-01
Here, we analyse the energetics, performance and optimization of flight in a moving atmosphere. We begin by deriving a succinct expression describing all of the mechanical energy flows associated with gliding, dynamic soaring and thermal soaring, which we use to explore the optimization of gliding in an arbitrary wind. We use this optimization to revisit the classical theory of the glide polar, which we expand upon in two significant ways. First, we compare the predictions of the glide polar for different species under the various published models. Second, we derive a glide optimization chart that maps every combination of headwind and updraft speed to the unique combination of airspeed and inertial sink rate at which the aerodynamic cost of transport is expected to be minimized. With these theoretical tools in hand, we test their predictions using empirical data collected from a captive steppe eagle (Aquila nipalensis) carrying an inertial measurement unit, global positioning system, barometer and pitot tube. We show that the bird adjusts airspeed in relation to headwind speed as expected if it were seeking to minimize its aerodynamic cost of transport, but find only weak evidence to suggest that it adjusts airspeed similarly in response to updrafts during straight and interthermal glides. This article is part of the themed issue ‘Moving in a moving medium: new perspectives on flight’. PMID:27528788
Barmpalexis, Panagiotis; Kachrimanis, Kyriakos; Georgarakis, Emanouil
2011-01-01
The present study investigates the use of nimodipine-polyethylene glycol solid dispersions for the development of effervescent controlled-release floating tablet formulations. The physical state of the dispersed nimodipine in the polymer matrix was characterized by differential scanning calorimetry, powder X-ray diffraction, FT-IR spectroscopy and polarized light microscopy, and the mixture proportions of polyethylene glycol (PEG), polyvinylpyrrolidone (PVP), hydroxypropylmethylcellulose (HPMC), effervescent agents (EFF) and nimodipine were optimized in relation to drug release (% release at 60 min, and time at which 90% of the drug was dissolved) and floating properties (the tablet's floating strength and duration), employing a 25-run D-optimal mixture design combined with artificial neural networks (ANNs) and genetic programming (GP). It was found that nimodipine exists as mod I microcrystals in the solid dispersions and is stable for at least a three-month period. The tablets showed good floating properties and controlled release profiles, with drug release proceeding via the concomitant operation of swelling and erosion of the polymer matrix. ANNs and GP both proved to be efficient tools in the optimization of the tablet formulation, and the global optimum formulation suggested by the GP equations consisted of PEG=9%, PVP=30%, HPMC=36%, EFF=11%, nimodipine=14%. Copyright © 2010 Elsevier B.V. All rights reserved.
Cihan, Abdullah; Birkholzer, Jens; Bianchi, Marco
2014-12-31
Large-scale pressure increases resulting from carbon dioxide (CO2) injection in the subsurface can potentially impact caprock integrity, induce reactivation of critically stressed faults, and drive CO2 or brine through conductive features into shallow groundwater. Pressure management involving the extraction of native fluids from storage formations can be used to minimize pressure increases while maximizing CO2 storage. However, brine extraction requires pumping, transportation, possibly treatment, and disposal of substantial volumes of extracted brackish or saline water, all of which can be technically challenging and expensive. This paper describes a constrained differential evolution (CDE) algorithm for optimal well placement and injection/extraction control with the goal of minimizing brine extraction while achieving predefined pressure constraints. The CDE methodology was tested on a simple optimization problem whose solution can be partially obtained with a gradient-based optimization methodology. The CDE successfully estimated the true global optimum for both the extraction well location and the extraction rate needed for the test problem. A more complex example application of the developed strategy is also presented for a hypothetical CO2 storage scenario in a heterogeneous reservoir with a critically stressed fault near an injection zone. Through the CDE optimization algorithm coupled to a numerical vertically-averaged reservoir model, we successfully estimated optimal rates and locations for CO2 injection and brine extraction wells while simultaneously satisfying multiple pressure buildup constraints to avoid fault activation and caprock fracturing. The study shows that the CDE methodology is a very promising tool for solving other optimization problems related to GCS as well, such as reducing the 'Area of Review', monitoring design, reducing the risk of leakage, and increasing storage capacity and trapping.
NASA Astrophysics Data System (ADS)
Wang, Jia; Hou, Xi; Wan, Yongjian; Shi, Chunyan
2017-10-01
An optimized method for calculating the error correction capability of the tool influence function (TIF) under given polishing conditions is proposed, based on the smoothing spectral function. The basic mathematical model for this method is established in theory. A set of polishing experimental data obtained with a rigid conformal tool is used to validate the optimized method. The calculated results quantitatively indicate the error correction capability of the TIF for different spatial frequency errors under given polishing conditions. Comparative analysis shows that the optimized method is simpler in form and achieves the same accuracy as the previous method in less computation time.
Perceptual attraction in tool use: evidence for a reliability-based weighting mechanism.
Debats, Nienke B; Ernst, Marc O; Heuer, Herbert
2017-04-01
Humans are well able to operate tools whereby their hand movement is linked, via a kinematic transformation, to a spatially distant object moving in a separate plane of motion. An everyday example is controlling a cursor on a computer monitor. Despite these separate reference frames, the perceived positions of the hand and the object were found to be biased toward each other. We propose that this perceptual attraction is based on the principles by which the brain integrates redundant sensory information of single objects or events, known as optimal multisensory integration. That is, (1) sensory information about the hand and the tool are weighted according to their relative reliability (i.e., inverse variances), and (2) the unisensory reliabilities sum up in the integrated estimate. We assessed whether perceptual attraction is consistent with optimal multisensory integration model predictions. We used a cursor-control tool-use task in which we manipulated the relative reliability of the unisensory hand and cursor position estimates. The perceptual biases shifted according to these relative reliabilities, with an additional bias due to contextual factors that were present in experiment 1 but not in experiment 2. The biased position judgments' variances were, however, systematically larger than the predicted optimal variances. Our findings suggest that the perceptual attraction in tool use results from a reliability-based weighting mechanism similar to optimal multisensory integration, but that certain boundary conditions for optimality might not be satisfied. NEW & NOTEWORTHY Kinematic tool use is associated with a perceptual attraction between the spatially separated hand and the effective part of the tool. We provide a formal account for this phenomenon, thereby showing that the process behind it is similar to optimal integration of sensory information relating to single objects. Copyright © 2017 the American Physiological Society.
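The optimal-integration model being tested reduces to two equations: weights proportional to inverse variances, and an integrated variance equal to the inverse of the summed reliabilities. A minimal numeric sketch with made-up variances:

```python
# Reliability-weighted integration of two position estimates (hand, cursor).
def integrate(mu_hand, var_hand, mu_cursor, var_cursor):
    r_hand, r_cursor = 1.0 / var_hand, 1.0 / var_cursor
    w_hand = r_hand / (r_hand + r_cursor)
    mu = w_hand * mu_hand + (1.0 - w_hand) * mu_cursor
    var = 1.0 / (r_hand + r_cursor)  # never larger than either unisensory variance
    return mu, var

mu, var = integrate(mu_hand=0.0, var_hand=4.0, mu_cursor=2.0, var_cursor=1.0)
print(f"integrated position {mu:.2f}, variance {var:.2f}")  # 1.60, 0.80
```

The more reliable cursor estimate dominates, pulling the integrated position toward it, which is the attraction pattern the study manipulates.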
Ryan, Kelsey N; Adams, Katherine P; Vosti, Stephen A; Ordiz, M Isabel; Cimo, Elizabeth D; Manary, Mark J
2014-12-01
Ready-to-use therapeutic food (RUTF) is the standard of care for children suffering from noncomplicated severe acute malnutrition (SAM). The objective was to develop a comprehensive linear programming (LP) tool to create novel RUTF formulations for Ethiopia. A systematic approach that surveyed international and national crop and animal food databases was used to create a global and local candidate ingredient database. The database included information about each ingredient regarding nutrient composition, ingredient category, regional availability, and food safety, processing, and price. An LP tool was then designed to compose novel RUTF formulations. For the example case of Ethiopia, the objective was to minimize the ingredient cost of RUTF; the decision variables were ingredient weights and the extent of use of locally available ingredients, and the constraints were nutritional and product-quality related. Of the new RUTF formulations found by the LP tool for Ethiopia, 32 were predicted to be feasible for creating a paste, and these were prepared in the laboratory. Palatable final formulations contained a variety of ingredients, including fish, different dairy powders, and various seeds, grains, and legumes. Nearly all of the macronutrient values calculated by the LP tool differed by <10% from results produced by laboratory analyses, but the LP tool consistently underestimated total energy. The LP tool can be used to develop new RUTF formulations that make more use of locally available ingredients. This tool has the potential to lead to production of a variety of low-cost RUTF formulations that meet international standards and thereby potentially allow more children to be treated for SAM. © 2014 American Society for Nutrition.
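The LP core of such a tool can be sketched with scipy's linprog: minimize ingredient cost subject to nutrient floors and a fixed batch weight. The ingredients, prices, compositions, and nutrient bounds below are illustrative assumptions, not the study's database values.

```python
import numpy as np
from scipy.optimize import linprog

ingredients = ["peanut", "milk powder", "soy flour", "sugar", "oil"]
cost = np.array([0.30, 0.60, 0.20, 0.10, 0.15])        # $ per 100 g, invented

# Per-gram composition of each ingredient (illustrative values).
energy  = np.array([5.7, 5.0, 4.3, 4.0, 9.0])          # kcal/g
protein = np.array([0.26, 0.26, 0.45, 0.00, 0.00])     # g/g
fat     = np.array([0.49, 0.26, 0.20, 0.00, 1.00])     # g/g

# Per 100 g of product: energy >= 520 kcal, protein >= 13 g, fat >= 26 g.
# linprog uses A_ub @ x <= b_ub, so ">=" rows are negated.
A_ub = -np.vstack([energy, protein, fat])
b_ub = -np.array([520.0, 13.0, 26.0])
A_eq = [np.ones(5)]            # ingredient weights must total 100 g
b_eq = [100.0]

res = linprog(cost, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, 100)] * 5)
print(dict(zip(ingredients, res.x.round(1))), "| cost:", round(res.fun, 2))
```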
NASA Astrophysics Data System (ADS)
Xu, Shuo; Ji, Ze; Truong Pham, Duc; Yu, Fan
2011-11-01
The simultaneous mission assignment and home allocation problem for hospital service robots studied here is a Multidimensional Assignment Problem (MAP) with multiple objectives and multiple constraints. A population-based metaheuristic, the Binary Bees Algorithm (BBA), is proposed to optimize this NP-hard problem. Inspired by the foraging mechanism of honeybees, the BBA's most important feature is an explicit functional partitioning between global search and local search for exploration and exploitation, respectively. Its key parts consist of adaptive global search, three-step elitism selection (constraint handling, non-dominated solutions selection, and diversity preservation), and elites-centred local search within a Hamming neighbourhood. Two comparative experiments were conducted to investigate its single objective optimization, optimization effectiveness (indexed by the S-metric and C-metric) and optimization efficiency (indexed by computational burden and CPU time) in detail. The BBA outperformed its competitors in almost all the quantitative indices. Hence, the overall scheme, and particularly the search-history-adapted global search strategy, was validated.
Multilevel algorithms for nonlinear optimization
NASA Technical Reports Server (NTRS)
Alexandrov, Natalia; Dennis, J. E., Jr.
1994-01-01
Multidisciplinary design optimization (MDO) gives rise to nonlinear optimization problems characterized by a large number of constraints that naturally occur in blocks. We propose a class of multilevel optimization methods motivated by the structure and number of constraints and by the expense of the derivative computations for MDO. The algorithms are an extension to the nonlinear programming problem of the successful class of local Brown-Brent algorithms for nonlinear equations. Our extensions allow the user to partition constraints into arbitrary blocks to fit the application, and they separately process each block and the objective function, restricted to certain subspaces. The methods use trust regions as a globalization strategy, and they have been shown to be globally convergent under reasonable assumptions. The multilevel algorithms can be applied to all classes of MDO formulations. Multilevel algorithms for solving nonlinear systems of equations are a special case of the multilevel optimization methods. In this case, they can be viewed as a trust-region globalization of the Brown-Brent class.
Multidisciplinary optimization of controlled space structures with global sensitivity equations
NASA Technical Reports Server (NTRS)
Padula, Sharon L.; James, Benjamin B.; Graves, Philip C.; Woodard, Stanley E.
1991-01-01
A new method for the preliminary design of controlled space structures is presented. The method coordinates standard finite element structural analysis, multivariable controls, and nonlinear programming codes and allows simultaneous optimization of the structures and control systems of a spacecraft. Global sensitivity equations are a key feature of this method. The preliminary design of a generic geostationary platform is used to demonstrate the multidisciplinary optimization method. Fifteen design variables are used to optimize truss member sizes and feedback gain values. The goal is to reduce the total mass of the structure and the vibration control system while satisfying constraints on vibration decay rate. Incorporating the nonnegligible mass of actuators causes an essential coupling between structural design variables and control design variables. The solution of the demonstration problem is an important step toward a comprehensive preliminary design capability for structures and control systems. Use of global sensitivity equations helps solve optimization problems that have a large number of design variables and a high degree of coupling between disciplines.
NASA Technical Reports Server (NTRS)
Brown, Aaron J.
2015-01-01
The International Space Station's (ISS) trajectory is coordinated and executed by the Trajectory Operations and Planning (TOPO) group at NASA's Johnson Space Center. TOPO group personnel routinely generate look-ahead trajectories for the ISS that incorporate translation burns needed to maintain its orbit over the next three to twelve months. The burns are modeled as in-plane, horizontal burns, and must meet operational trajectory constraints imposed by both NASA and the Russian Space Agency. In generating these trajectories, TOPO personnel must determine the number of burns to model, each burn's Time of Ignition (TIG), and magnitude (i.e. deltaV) that meet these constraints. The current process for targeting these burns is manually intensive, and does not take advantage of more modern techniques that can reduce the workload needed to find feasible burn solutions, i.e. solutions that simply meet the constraints, or provide optimal burn solutions that minimize the total deltaV while simultaneously meeting the constraints. A two-level, hybrid optimization technique is proposed to find both feasible and globally optimal burn solutions for ISS trajectory planning. For optimal solutions, the technique breaks the optimization problem into two distinct sub-problems, one for choosing the optimal number of burns and each burn's optimal TIG, and the other for computing the minimum total deltaV burn solution that satisfies the trajectory constraints. Each of the two aforementioned levels uses a different optimization algorithm to solve one of the sub-problems, giving rise to a hybrid technique. Level 2, or the outer level, uses a genetic algorithm to select the number of burns and each burn's TIG. Level 1, or the inner level, uses the burn TIGs from Level 2 in a sequential quadratic programming (SQP) algorithm to compute a minimum total deltaV burn solution subject to the trajectory constraints. The total deltaV from Level 1 is then used as a fitness function by the genetic algorithm in Level 2 to select the number of burns and their TIGs for the next generation. In this manner, the two levels solve their respective sub-problems separately but collaboratively until a burn solution is found that globally minimizes the deltaV across the entire trajectory. Feasible solutions can also be found by simply using the SQP algorithm in Level 1 with a zero cost function. This paper discusses the formulation of the Level 1 sub-problem and the development of a prototype software tool to solve it. The Level 2 sub-problem will be discussed in a future work. Following the Level 1 formulation and solution, several look-ahead trajectory examples for the ISS are explored. In each case, the burn targeting results using the current process are compared against a feasible solution found using Level 1 in the proposed technique. Level 1 is then used to find a minimum deltaV solution given the fixed number of burns and burn TIGs. The optimal solution is compared with the previously found feasible solution to determine the deltaV (and therefore propellant) savings. The proposed technique seeks to both improve the current process for targeting ISS burns, and to add the capability to optimize ISS burns in a novel fashion. The optimal solutions found using this technique can potentially save hundreds of kilograms of propellant over the course of the ISS mission compared to feasible solutions alone.
While the software tool being developed to implement this technique is specific to ISS, the concept is extensible to other long-duration, central-body orbiting missions that must perform orbit maintenance burns to meet operational trajectory constraints.
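A toy version of the two-level scheme, under invented orbit-decay dynamics and limits: an outer evolutionary loop proposes the number of burns and their TIGs, and an inner SLSQP solve returns the minimum total delta-V that keeps the altitude margin non-negative. None of the constants below come from the paper.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

HORIZON = 180.0      # planning horizon (days), invented
DECAY = 0.05         # altitude decay rate (km/day), invented
MARGIN0 = 5.0        # initial altitude margin above the floor (km), invented
RAISE_PER_MS = 1.8   # km of altitude raise per m/s of delta-V, invented

def inner_min_dv(tigs):
    """Level 1: minimize sum(dv) s.t. the altitude margin never goes negative."""
    n = len(tigs)

    def margin_lows(dv):
        # Margin just before each burn and at the horizon; decay is monotonic,
        # so these are the only candidate minima.
        times = np.append(tigs, HORIZON)
        h, lows, t_prev = MARGIN0, [], 0.0
        for i, t in enumerate(times):
            h -= DECAY * (t - t_prev)
            lows.append(h)
            if i < n:
                h += RAISE_PER_MS * dv[i]
            t_prev = t
        return np.array(lows)

    res = minimize(lambda dv: dv.sum(), x0=np.full(n, 1.0),
                   bounds=[(0.0, 5.0)] * n,
                   constraints=[{"type": "ineq", "fun": margin_lows}],
                   method="SLSQP")
    return res.fun if res.success else np.inf

# Level 2: crude evolutionary stand-in for the genetic algorithm, searching
# over the number of burns and their TIGs.
best_dv, best_tigs = np.inf, None
for _ in range(200):
    n = rng.integers(1, 4)
    tigs = np.sort(rng.uniform(1.0, HORIZON - 1.0, size=n))
    dv = inner_min_dv(tigs)
    if dv < best_dv:
        best_dv, best_tigs = dv, tigs
print(f"best total delta-V {best_dv:.2f} m/s at TIGs (days) {np.round(best_tigs, 1)}")
```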
Hamidi, Ahd; Kreeftenberg, Hans; V D Pol, Leo; Ghimire, Saroj; V D Wielen, Luuk A M; Ottens, Marcel
2016-05-01
Vaccination is one of the most successful public health interventions and a cost-effective tool in preventing deaths among young children. The earliest vaccines were developed following empirical methods, creating vaccines by trial and error. New process development tools, for example mathematical modeling, as well as new regulatory initiatives requiring better understanding of both the product and the process, are being applied to well-characterized biopharmaceuticals (for example recombinant proteins). The vaccine industry still lags behind these industries. A production process for a new Haemophilus influenzae type b (Hib) conjugate vaccine, including related quality control (QC) tests, was developed and transferred to a number of emerging vaccine manufacturers. This contributed to a sustainable global supply of affordable Hib conjugate vaccines, as illustrated by the market launch of the first Hib vaccine based on this technology in 2007 and a concomitant price reduction of Hib vaccines. This paper describes the development approach followed for this Hib conjugate vaccine as well as the mathematical modeling tool recently applied to identify options for further improvements of the initial Hib process. The strategy followed during the process development of this Hib conjugate vaccine was a targeted and integrated approach based on prior knowledge and experience with similar products, using multi-disciplinary expertise. Mathematical modeling was used to develop a predictive model for the initial Hib process (the 'baseline' model) as well as an 'optimized' model, by proposing a number of process changes that could lead to further price reduction. © 2016 American Institute of Chemical Engineers Biotechnol. Prog., 32:568-580, 2016. © 2016 American Institute of Chemical Engineers.
[Added value of family practitioners' supervision of junior doctors in a walk-in clinic].
Perdrix, J; Gubser, R; Gilgien, W; Bischoff, T
2011-05-18
The pending workforce crisis in family medicine has triggered various initiatives. This article describes the PMU-FLON walk-in clinic, a project of the Institute of General Medicine, University of Lausanne. The working conditions in this clinic are close to those of a family practice. Doctors in training are supervised by family doctors who work part-time in the clinic. The objective is to improve training in the various fields of family medicine, from technical skills (improving optimal use of diagnostic tools) to integrating patients' requests in a more global patient-centered approach. This new educational model allows doctors in training to benefit from the specific approaches of different trainers. It will contribute to promoting quality family medicine in the future.
PubChem applications in drug discovery: a bibliometric analysis
Cheng, Tiejun; Pan, Yongmei; Hao, Ming; Wang, Yanli; Bryant, Stephen H.
2014-01-01
A bibliometric analysis of PubChem applications is presented by reviewing 1132 research articles. The massive volume of chemical structure and bioactivity data in PubChem and its online services has been used globally in various fields including chemical biology, medicinal chemistry and informatics research. PubChem supports drug discovery in many aspects such as lead identification and optimization, compound–target profiling, polypharmacology studies and unknown chemical identity elucidation. PubChem has also become a valuable resource for developing secondary databases, informatics tools and web services. The growing PubChem resource with its public availability offers support and great opportunities for the interrogation of pharmacological mechanisms and the genetic basis of diseases, which are vital for drug innovation and repurposing. PMID:25168772
Song, Ting; Li, Nan; Zarepisheh, Masoud; Li, Yongbao; Gautier, Quentin; Zhou, Linghong; Mell, Loren; Jiang, Steve; Cerviño, Laura
2016-01-01
Intensity-modulated radiation therapy (IMRT) currently plays an important role in radiotherapy, but its treatment plan quality can vary significantly among institutions and planners. Treatment plan quality control (QC) is a necessary component for individual clinics to ensure that patients receive treatments with high therapeutic gain ratios. The voxel-weighting factor-based plan re-optimization mechanism has been proved able to explore a larger Pareto surface (solution domain) and therefore increase the possibility of finding an optimal treatment plan. In this study, we incorporated additional modules into an in-house developed voxel weighting factor-based re-optimization algorithm, which was enhanced as a highly automated and accurate IMRT plan QC tool (TPS-QC tool). After importing an under-assessment plan, the TPS-QC tool was able to generate a QC report within 2 minutes. This QC report contains the plan quality determination as well as information supporting the determination. Finally, the IMRT plan quality can be controlled by approving quality-passed plans and replacing quality-failed plans using the TPS-QC tool. The feasibility and accuracy of the proposed TPS-QC tool were evaluated using 25 clinically approved cervical cancer patient IMRT plans and 5 manually created poor-quality IMRT plans. The results showed high consistency between the QC report quality determinations and the actual plan quality. In the 25 clinically approved cases that the TPS-QC tool identified as passed, a greater difference could be observed in dosimetric endpoints for organs at risk (OAR) than for the planning target volume (PTV), implying that better dose sparing could be achieved in OAR than in PTV. Moreover, the dose-volume histogram (DVH) curves of the TPS-QC tool re-optimized plans satisfied the dosimetric criteria more frequently than did the under-assessment plans, and the criteria for unsatisfied dosimetric endpoints in the 5 poor-quality plans could typically be satisfied when the TPS-QC tool generated re-optimized plans without sacrificing other dosimetric endpoints. Beyond its feasibility and accuracy, the proposed TPS-QC tool is also user-friendly and easy to operate, both necessary characteristics for clinical use.
Nonlinear Shaping Architecture Designed with Using Evolutionary Structural Optimization Tools
NASA Astrophysics Data System (ADS)
Januszkiewicz, Krystyna; Banachowicz, Marta
2017-10-01
The paper explores the possibilities of using Evolutionary Structural Optimization (ESO) digital tools in an integrated structural and architectural design in response to current needs geared towards sustainability, combining ecological and economic efficiency. The first part of the paper defines the Evolutionary Structural Optimization tools, which were developed specifically for engineering purposes using finite element analysis as a framework. The development of ESO has led to several incarnations, which are all briefly discussed (Additive ESO, Bi-directional ESO, Extended ESO). The second part presents results of using these tools in structural and architectural design. Actual building projects which involve optimization as a part of the original design process are presented (the Crematorium in Kakamigahara, Gifu, Japan, 2006, and SANAA's Learning Centre, EPFL in Lausanne, Switzerland, 2008, among others). The conclusion emphasizes that structural engineering and architectural design mean directing attention to the solutions used by Nature, designing works optimally shaped and forming their own environments. Architectural forms never constitute the optimum shape derived through a form-finding process driven only by structural optimization, but rather embody and integrate a multitude of parameters. It might be assumed that there is a similarity between these processes in nature and the presented design methods. Contemporary digital methods make the simulation of such processes possible, and thus enable us to refer back to the empirical methods of previous generations.
ERIC Educational Resources Information Center
Brusco, Michael J.; Kohn, Hans-Friedrich; Stahl, Stephanie
2008-01-01
Dynamic programming methods for matrix permutation problems in combinatorial data analysis can produce globally-optimal solutions for matrices up to size 30x30, but are computationally infeasible for larger matrices because of enormous computer memory requirements. Branch-and-bound methods also guarantee globally-optimal solutions, but computation…
A practical globalization of one-shot optimization for optimal design of tokamak divertors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blommaert, Maarten, E-mail: maarten.blommaert@kuleuven.be; Dekeyser, Wouter; Baelmans, Martine
In past studies, nested optimization methods were successfully applied to design of the magnetic divertor configuration in nuclear fusion reactors. In this paper, so-called one-shot optimization methods are pursued. Due to convergence issues, a globalization strategy for the one-shot solver is sought. Whereas Griewank introduced a globalization strategy using a doubly augmented Lagrangian function that includes primal and adjoint residuals, its practical usability is limited by the necessity of second order derivatives and expensive line search iterations. In this paper, a practical alternative is offered that avoids these drawbacks by using a regular augmented Lagrangian merit function that penalizes only state residuals. Additionally, robust rank-two Hessian estimation is achieved by adaptation of Powell's damped BFGS update rule. The application of the novel one-shot approach to magnetic divertor design is considered in detail. For this purpose, the approach is adapted to be complementary with practical in parts adjoint sensitivities. Using the globalization strategy, stable convergence of the one-shot approach is achieved.
NASA Astrophysics Data System (ADS)
Zhuang, Yufei; Huang, Haibin
2014-02-01
A hybrid algorithm combining particle swarm optimization (PSO) algorithm with the Legendre pseudospectral method (LPM) is proposed for solving time-optimal trajectory planning problem of underactuated spacecrafts. At the beginning phase of the searching process, an initialization generator is constructed by the PSO algorithm due to its strong global searching ability and robustness to random initial values, however, PSO algorithm has a disadvantage that its convergence rate around the global optimum is slow. Then, when the change in fitness function is smaller than a predefined value, the searching algorithm is switched to the LPM to accelerate the searching process. Thus, with the obtained solutions by the PSO algorithm as a set of proper initial guesses, the hybrid algorithm can find a global optimum more quickly and accurately. 200 Monte Carlo simulations results demonstrate that the proposed hybrid PSO-LPM algorithm has greater advantages in terms of global searching capability and convergence rate than both single PSO algorithm and LPM algorithm. Moreover, the PSO-LPM algorithm is also robust to random initial values.
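The switching logic can be sketched as follows, with scipy's BFGS standing in for the Legendre pseudospectral stage and the Rastrigin function standing in for the trajectory-planning cost; the thresholds and PSO coefficients are illustrative assumptions, not the paper's settings.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)

def fitness(x):     # multimodal stand-in for the trajectory cost (Rastrigin)
    return 10 * x.size + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

dim, n_particles = 4, 30
x = rng.uniform(-5.12, 5.12, (n_particles, dim))
v = np.zeros_like(x)
pbest = x.copy()
pbest_f = np.apply_along_axis(fitness, 1, x)
g = pbest[pbest_f.argmin()].copy()

prev_best = np.inf
for it in range(500):
    r1, r2 = rng.random((2, n_particles, dim))
    v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (g - x)
    x = np.clip(x + v, -5.12, 5.12)
    f = np.apply_along_axis(fitness, 1, x)
    better = f < pbest_f
    pbest[better], pbest_f[better] = x[better], f[better]
    g = pbest[pbest_f.argmin()].copy()
    if it > 50 and prev_best - pbest_f.min() < 1e-8:
        break                   # improvement stalled: hand over to local stage
    prev_best = pbest_f.min()

res = minimize(fitness, g, method="BFGS")   # local refinement from the PSO guess
print("PSO best:", round(float(pbest_f.min()), 6), "-> refined:", round(res.fun, 6))
```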
Optimization of Shipboard Manning Levels Using Imprint Pro Forces Module
2015-09-01
NPS-OR-15-008, Naval Postgraduate School, Monterey, California. The Improved Performance Research Integration Tool (IMPRINT) is a dynamic, stochastic, discrete-event modeling tool used to develop a model…
[Optimization of end-tool parameters based on robot hand-eye calibration].
Zhang, Lilong; Cao, Tong; Liu, Da
2017-04-01
A new one-time registration method was developed in this research for hand-eye calibration of a surgical robot, to simplify the operation process and reduce the preparation time. A new and practical method is also introduced to optimize the end-tool parameters of the surgical robot, based on analysis of the error sources in this registration method. In the one-time registration method, a marker on the end-tool of the robot is first recognized by a fixed binocular camera, and the orientation and position of the marker are calculated based on the joint parameters of the robot. The relationship between the camera coordinate system and the robot base coordinate system can then be established to complete the hand-eye calibration. Because of manufacturing and assembly errors of the robot end-tool, an error equation was established with the transformation matrix between the robot end coordinate system and the robot end-tool coordinate system as the variable. Numerical optimization was employed to optimize the end-tool parameters of the robot. The experimental results showed that the one-time registration method could significantly improve the efficiency of robot hand-eye calibration compared with existing methods, and that the parameter optimization method could significantly improve its absolute positioning accuracy, which can meet the requirements of clinical surgery.
Blended near-optimal tools for flexible water resources decision making
NASA Astrophysics Data System (ADS)
Rosenberg, David
2015-04-01
State-of-the-art systems analysis techniques focus on efficiently finding optimal solutions. Yet an optimal solution is optimal only for the static, modelled issues, and managers often seek near-optimal alternatives that address un-modelled or changing objectives, preferences, limits, uncertainties, and other issues. Early on, Modelling to Generate Alternatives (MGA) formalized near-optimal as performance within a tolerable deviation from the optimal objective function value and identified a few maximally-different alternatives that addressed select un-modelled issues. This paper presents new stratified Markov chain Monte Carlo sampling and parallel coordinate plotting tools that generate and communicate the structure and full extent of the near-optimal region of an optimization problem. Plot controls allow users to interactively explore the region features of most interest. Controls also streamline the process of eliciting un-modelled issues and updating the model formulation in response to elicited issues. Use for a single-objective water quality management problem at Echo Reservoir, Utah, identifies numerous and flexible practices to reduce the phosphorus load to the reservoir while maintaining close-to-optimal performance. Compared to MGA, the new blended tools generate more numerous alternatives faster, more fully show the near-optimal region, help elicit a larger set of un-modelled issues, and offer managers greater flexibility to cope in a changing world.
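The near-optimal notion is straightforward to operationalize: solve for the optimum, then retain sampled alternatives whose objective lies within a tolerable deviation of it. A sketch with an invented two-practice load-reduction cost model and a 10% tolerance:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(3)

# Hypothetical cost of implementing two phosphorus-reduction practices at
# levels a and b; the functional form and target are illustrative only.
def cost(a, b):
    return 3.0 * a**2 + b**2 + a * b

TARGET, TOL = 10.0, 0.10        # required total load reduction; tolerance
opt = minimize(lambda x: cost(*x), x0=[5.0, 5.0],
               bounds=[(0, 20), (0, 20)],
               constraints=[{"type": "ineq", "fun": lambda x: x.sum() - TARGET}])
f_star = opt.fun

# Keep sampled alternatives that meet the target within 110% of optimal cost.
samples = rng.uniform(0, 20, size=(100_000, 2))
a, b = samples[:, 0], samples[:, 1]
near_opt = samples[(a + b >= TARGET) & (cost(a, b) <= (1 + TOL) * f_star)]
print(f"optimum {f_star:.2f}; {len(near_opt)} near-optimal alternatives kept")
print("variable ranges:", near_opt.min(axis=0).round(2), "to",
      near_opt.max(axis=0).round(2))
```

The retained set is what the paper's parallel coordinate plots would display, making the flexibility within the tolerance visible to managers.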
Habitat Design Optimization and Analysis
NASA Technical Reports Server (NTRS)
SanSoucie, Michael P.; Hull, Patrick V.; Tinker, Michael L.
2006-01-01
Long-duration surface missions to the Moon and Mars will require habitats for the astronauts. The materials chosen for the habitat walls play a direct role in the protection against the harsh environments found on the surface. Choosing the best materials, their configuration, and the amount required is extremely difficult due to the immense size of the design region. Advanced optimization techniques are necessary for habitat wall design. Standard optimization techniques are not suitable for problems with such large search spaces; therefore, a habitat design optimization tool utilizing genetic algorithms has been developed. Genetic algorithms use a "survival of the fittest" philosophy, where the most fit individuals are more likely to survive and reproduce. This habitat design optimization tool is a multi-objective formulation of structural analysis, heat loss, radiation protection, and meteoroid protection. This paper presents the research and development of this tool.
Advanced Structural Optimization Under Consideration of Cost Tracking
NASA Astrophysics Data System (ADS)
Zell, D.; Link, T.; Bickelmaier, S.; Albinger, J.; Weikert, S.; Cremaschi, F.; Wiegand, A.
2014-06-01
In order to improve the design process of launcher configurations in the early development phase, the software Multidisciplinary Optimization (MDO) was developed. The tool combines efficient software tools such as Optimal Design Investigations (ODIN) for structural optimization and Aerospace Trajectory Optimization Software (ASTOS) for trajectory and vehicle design optimization for a defined payload and mission. The present paper focuses on the integration and validation of ODIN. ODIN enables the user to optimize typical axisymmetric structures by sizing the stiffening designs for strength and stability while minimizing the structural mass. In addition, a fully automatic finite element model (FEM) generator module creates ready-to-run FEM models of a complete stage or launcher assembly. Cost tracking and future improvements concerning cost optimization are indicated.
On computing the global time-optimal motions of robotic manipulators in the presence of obstacles
NASA Technical Reports Server (NTRS)
Shiller, Zvi; Dubowsky, Steven
1991-01-01
A method for computing the time-optimal motions of robotic manipulators is presented that considers the nonlinear manipulator dynamics, actuator constraints, joint limits, and obstacles. The optimization problem is reduced to a search for the time-optimal path in the n-dimensional position space. A small set of near-optimal paths is first efficiently selected from a grid, using a branch and bound search and a series of lower bound estimates on the traveling time along a given path. These paths are further optimized with a local path optimization to yield the global optimal solution. Obstacles are considered by eliminating the collision points from the tessellated space and by adding a penalty function to the motion time in the local optimization. The computational efficiency of the method stems from the reduced dimensionality of the searched space and from combining the grid search with a local optimization. The method is demonstrated in several examples for two- and six-degree-of-freedom manipulators with obstacles.
Hybrid ABC Optimized MARS-Based Modeling of the Milling Tool Wear from Milling Run Experimental Data
García Nieto, Paulino José; García-Gonzalo, Esperanza; Ordóñez Galán, Celestino; Bernardo Sánchez, Antonio
2016-01-01
Milling cutters are important cutting tools used in milling machines to perform milling operations, which are prone to wear and subsequent failure. In this paper, a practical new hybrid model to predict the milling tool wear in a regular cut, as well as entry cut and exit cut, of a milling tool is proposed. The model was based on the optimization tool termed artificial bee colony (ABC) in combination with multivariate adaptive regression splines (MARS) technique. This optimization mechanism involved the parameter setting in the MARS training procedure, which significantly influences the regression accuracy. Therefore, an ABC–MARS-based model was successfully used here to predict the milling tool flank wear (output variable) as a function of the following input variables: the time duration of experiment, depth of cut, feed, type of material, etc. Regression with optimal hyperparameters was performed and a determination coefficient of 0.94 was obtained. The ABC–MARS-based model's goodness of fit to experimental data confirmed the good performance of this model. This new model also allowed us to ascertain the most influential parameters on the milling tool flank wear with a view to proposing milling machine's improvements. Finally, conclusions of this study are exposed. PMID:28787882
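For readers unfamiliar with ABC, a generic version of the algorithm (employed, onlooker, and scout phases) is sketched below on a toy objective standing in for the cross-validated MARS regression error. The colony size, abandonment limit, and objective are illustrative assumptions, not the authors' settings.

```python
import numpy as np

rng = np.random.default_rng(4)

def objective(x):    # toy stand-in for the hyperparameter-tuning error surface
    return (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2 + np.sin(3 * x[0]) ** 2

LOW, HIGH, N_FOOD, LIMIT, CYCLES = -5.0, 5.0, 20, 30, 200
food = rng.uniform(LOW, HIGH, (N_FOOD, 2))
fit = np.apply_along_axis(objective, 1, food)
trials = np.zeros(N_FOOD, dtype=int)

def try_neighbor(i):
    """Perturb one coordinate of source i toward a random partner; keep if better."""
    k, j = rng.integers(N_FOOD), rng.integers(2)
    cand = food[i].copy()
    cand[j] += rng.uniform(-1, 1) * (food[i, j] - food[k, j])
    cand = np.clip(cand, LOW, HIGH)
    f = objective(cand)
    if f < fit[i]:
        food[i], fit[i], trials[i] = cand, f, 0
    else:
        trials[i] += 1

for _ in range(CYCLES):
    for i in range(N_FOOD):                       # employed bee phase
        try_neighbor(i)
    probs = 1.0 / (1.0 + fit)                     # onlookers favor good sources
    for i in rng.choice(N_FOOD, N_FOOD, p=probs / probs.sum()):
        try_neighbor(i)
    exhausted = trials >= LIMIT                   # scouts replace stale sources
    if exhausted.any():
        food[exhausted] = rng.uniform(LOW, HIGH, (int(exhausted.sum()), 2))
        fit[exhausted] = np.apply_along_axis(objective, 1, food[exhausted])
        trials[exhausted] = 0

print("best point:", food[fit.argmin()].round(3), "value:", round(float(fit.min()), 5))
```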
The HYTHIRM Project: Flight Thermography of the Space Shuttle During the Hypersonic Re-entry
NASA Technical Reports Server (NTRS)
Horvath, Thomas J.; Tomek, Deborah M.; Berger, Karen T.; Zalameda, Joseph N.; Splinter, Scott C.; Krasa, Paul W.; Schwartz, Richard J.; Gibson, David M.; Tietjen, Alan B.; Tack, Steve
2010-01-01
This report describes a NASA Langley led endeavor sponsored by the NASA Engineering Safety Center, the Space Shuttle Program Office and the NASA Aeronautics Research Mission Directorate to demonstrate a quantitative thermal imaging capability. A background and an overview of several multidisciplinary efforts that culminated in the acquisition of high resolution calibrated infrared imagery of the Space Shuttle during hypervelocity atmospheric entry is presented. The successful collection of thermal data has demonstrated the feasibility of obtaining remote high-resolution infrared imagery during hypersonic flight for the accurate measurement of surface temperature. To maximize science and engineering return, the acquisition of quantitative thermal imagery and capability demonstration was targeted towards three recent Shuttle flights - two of which involved flight experiments flown on Discovery. In coordination with these two Shuttle flight experiments, a US Navy NP-3D aircraft was flown between 26-41 nautical miles below Discovery and remotely monitored surface temperature of the Orbiter at Mach 8.4 (STS-119) and Mach 14.7 (STS-128) using a long-range infrared optical package referred to as Cast Glance. This same Navy aircraft successfully monitored the Orbiter Atlantis traveling at approximately Mach 14.3 during its return from the successful Hubble repair mission (STS-125). The purpose of this paper is to describe the systematic approach used by the Hypersonic Thermodynamic Infrared Measurements team to develop and implement a set of mission planning tools designed to establish confidence in the ability of an imaging platform to reliably acquire, track and return global quantitative surface temperatures of the Shuttle during entry. The mission planning tools included a pre-flight capability to predict the infrared signature of the Shuttle. Such tools permitted optimization of the hardware configuration to increase signal-to-noise and to maximize the available dynamic range while mitigating the potential for saturation. Post flight, analysis tools were used to assess atmospheric effects and to convert the 2-D intensity images to 3-D temperature maps of the windward surface. Comparison of the spatially resolved global thermal measurements to surface thermocouples and CFD prediction is made. Successful demonstration of a quantitative, spatially resolved, global temperature measurement on the Shuttle suggests future applications towards hypersonic flight test programs within NASA, DoD and DARPA along with flight test opportunities supporting NASA's project Constellation.
OPTIMIZING BMP PLACEMENT AT WATERSHED-SCALE USING SUSTAIN
Watershed and stormwater managers need modeling tools to evaluate alternative plans for environmental quality restoration and protection needs in urban and developing areas. A watershed-scale decision-support system, based on cost optimization, provides an essential tool to suppo...
A trust region-based approach to optimize triple response systems
NASA Astrophysics Data System (ADS)
Fan, Shu-Kai S.; Fan, Chihhao; Huang, Chia-Fen
2014-05-01
This article presents a new computing procedure for the global optimization of the triple response system (TRS) where the response functions are non-convex quadratics and the input factors satisfy a radial constrained region of interest. The TRS arising from response surface modelling can be approximated using a nonlinear mathematical program that considers one primary objective function and two secondary constraint functions. An optimization algorithm named the triple response surface algorithm (TRSALG) is proposed to determine the global optimum for the non-degenerate TRS. In TRSALG, the Lagrange multipliers of the secondary functions are determined using the Hooke-Jeeves search method and the Lagrange multiplier of the radial constraint is located using the trust region method within the global optimality space. The proposed algorithm is illustrated in terms of three examples appearing in the quality-control literature. The results of TRSALG compared to a gradient-based method are also presented.
Visualizing and improving the robustness of phase retrieval algorithms
Tripathi, Ashish; Leyffer, Sven; Munson, Todd; ...
2015-06-01
Coherent x-ray diffractive imaging is a novel imaging technique that utilizes phase retrieval and nonlinear optimization methods to image matter at nanometer scales. We explore how the convergence properties of a popular phase retrieval algorithm, Fienup's HIO, behave by introducing a reduced dimensionality problem allowing us to visualize and quantify convergence to local minima and the globally optimal solution. We then introduce generalizations of HIO that improve upon the original algorithm's ability to converge to the globally optimal solution.
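For reference, the HIO iteration under study is short enough to sketch in full: alternate between imposing the measured Fourier magnitudes and applying a feedback step outside the object support. The test object, support, and feedback parameter beta below are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)

n = 64
support = np.zeros((n, n), dtype=bool)
support[24:40, 24:40] = True
truth = np.zeros((n, n))
truth[support] = rng.random(int(support.sum()))
measured_mag = np.abs(np.fft.fft2(truth))     # the only "measurement" retained

beta = 0.9
g = rng.random((n, n))                        # random initial guess
for _ in range(500):
    G = np.fft.fft2(g)
    G = measured_mag * np.exp(1j * np.angle(G))          # impose Fourier magnitudes
    g_prime = np.real(np.fft.ifft2(G))
    violates = ~support | (g_prime < 0)                  # support + positivity
    g = np.where(violates, g - beta * g_prime, g_prime)  # HIO feedback step

err = np.linalg.norm(np.abs(np.fft.fft2(g)) - measured_mag) / np.linalg.norm(measured_mag)
print(f"relative Fourier-magnitude error after 500 iterations: {err:.3e}")
# Depending on the random start, the iterate may stall in a local minimum,
# which is exactly the behavior the reduced-dimensionality study visualizes.
```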
Synergy optimization and operation management on syndicate complementary knowledge cooperation
NASA Astrophysics Data System (ADS)
Tu, Kai-Jan
2014-10-01
The number of multi-enterprise knowledge cooperations has grown steadily as a result of global innovation competition. This article presents research based on optimization and operation studies and concludes that synergy management is an effective means to break through various management barriers and to resolve the chaotic systems that arise in cooperation. Enterprises must communicate a system vision and access complementary knowledge. These are crucial considerations for enterprises seeking to exert their optimization and operation knowledge cooperation synergy to meet global marketing challenges.
Physical Modeling of Contact Processes on the Cutting Tools Surfaces of STM When Turning
NASA Astrophysics Data System (ADS)
Belozerov, V. A.; Uteshev, M. H.
2016-08-01
This article describes the creation of an optimization model of the process of fine turning of superalloys and steels with cutting tools from STM on CNC machines, flexible manufacturing units (GPM), and machining centers. The optimization model links contact processes occurring simultaneously on the front and back surfaces of the tool from STM, making it possible to manage the contact processes and the dynamic strength of the cutting tool at the STM tip. The established optimization model for managing the dynamic strength of STM cutters in fine turning is based on a previously developed thermomechanical (physical, heat) model, which enables a systematic thermomechanical approach to choosing STM grades (domestic and foreign) for cutting tools designed for fine turning of heat-resistant alloys and steels.
A Novel Particle Swarm Optimization Algorithm for Global Optimization
Wang, Chun-Feng; Liu, Kui
2016-01-01
Particle Swarm Optimization (PSO) is a recently developed optimization method, which has attracted interest of researchers in various areas due to its simplicity and effectiveness, and many variants have been proposed. In this paper, a novel Particle Swarm Optimization algorithm is presented, in which the information of the best neighbor of each particle and the best particle of the entire population in the current iteration is considered. Meanwhile, to avoid premature convergence, an abandonment mechanism is used. Furthermore, to improve the global convergence speed of the algorithm, a chaotic search is adopted around the best solution of the current iteration. To verify the performance of the algorithm, standard test functions have been employed. The experimental results show that the algorithm is much more robust and efficient than some existing Particle Swarm Optimization algorithms. PMID:26955387
Calibration of an agricultural-hydrological model (RZWQM2) using surrogate global optimization
Xi, Maolong; Lu, Dan; Gui, Dongwei; ...
2016-11-27
Robust calibration of an agricultural-hydrological model is critical for simulating crop yield and water quality and making reasonable agricultural management. However, calibration of the agricultural-hydrological system models is challenging because of model complexity, the existence of strong parameter correlation, and significant computational requirements. Therefore, only a limited number of simulations can be allowed in any attempt to find a near-optimal solution within an affordable time, which greatly restricts the successful application of the model. The goal of this study is to locate the optimal solution of the Root Zone Water Quality Model (RZWQM2) given a limited simulation time, so as to improve the model simulation and help make rational and effective agricultural-hydrological decisions. To this end, we propose a computationally efficient global optimization procedure using sparse-grid based surrogates. We first used advanced sparse grid (SG) interpolation to construct a surrogate system of the actual RZWQM2, and then we calibrate the surrogate model using the global optimization algorithm, Quantum-behaved Particle Swarm Optimization (QPSO). As the surrogate model is a polynomial with fast evaluation, it can be efficiently evaluated with a sufficiently large number of times during the optimization, which facilitates the global search. We calibrate seven model parameters against five years of yield, drain flow, and NO3-N loss data from a subsurface-drained corn-soybean field in Iowa. Results indicate that an accurate surrogate model can be created for the RZWQM2 with a relatively small number of SG points (i.e., RZWQM2 runs). Compared to the conventional QPSO algorithm, our surrogate-based optimization method can achieve a smaller objective function value and better calibration performance using a fewer number of expensive RZWQM2 executions, which greatly improves computational efficiency.
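The surrogate workflow can be sketched in three steps, with a least-squares quadratic in place of the paper's sparse-grid interpolant and scipy's differential evolution in place of QPSO; the "expensive model" here is a toy function, not RZWQM2.

```python
import numpy as np
from scipy.optimize import differential_evolution

rng = np.random.default_rng(6)

def expensive_model(p):        # placeholder for one RZWQM2 run
    return (p[0] - 0.3) ** 2 + 2.0 * (p[1] - 0.7) ** 2 + 0.1 * p[0] * p[1]

# 1) A small design of expensive runs (random here; the paper uses sparse grids).
X = rng.uniform(0.0, 1.0, (40, 2))
y = np.array([expensive_model(p) for p in X])

# 2) Fit a cheap quadratic surrogate by least squares.
def features(P):
    a, b = P[:, 0], P[:, 1]
    return np.column_stack([np.ones_like(a), a, b, a * b, a**2, b**2])

coef, *_ = np.linalg.lstsq(features(X), y, rcond=None)
surrogate = lambda p: (features(np.atleast_2d(p)) @ coef).item()

# 3) Calibrate on the surrogate: global-search evaluations are now cheap.
res = differential_evolution(surrogate, [(0, 1), (0, 1)], seed=0)
print("surrogate optimum:", res.x.round(3),
      "| expensive model there:", round(expensive_model(res.x), 5))
```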
Enders, Philip; Adler, Werner; Schaub, Friederike; Hermann, Manuel M; Diestelhorst, Michael; Dietlein, Thomas; Cursiefen, Claus; Heindl, Ludwig M
2017-10-24
To compare a simultaneously optimized continuous minimum rim surface parameter between Bruch's membrane opening (BMO) and the internal limiting membrane to the standard sequential minimization used for calculating the BMO minimum rim area in spectral domain optical coherence tomography (SD-OCT). In this case-control, cross-sectional study, 704 eyes of 445 participants underwent SD-OCT of the optic nerve head (ONH), visual field testing, and clinical examination. Globally and clock-hour sector-wise optimized BMO-based minimum rim area was calculated independently. Outcome parameters included BMO-globally optimized minimum rim area (BMO-gMRA) and sector-wise optimized BMO-minimum rim area (BMO-MRA). BMO area was 1.89 ± 0.05 mm². Mean global BMO-MRA was 0.97 ± 0.34 mm², mean global BMO-gMRA was 1.01 ± 0.36 mm². Both parameters correlated with r = 0.995 (P < 0.001); mean difference was 0.04 mm² (P < 0.001). In all sectors, parameters differed by 3.0-4.2%. In receiver operating characteristics, the calculated area under the curve (AUC) to differentiate glaucoma was 0.873 for BMO-MRA, compared to 0.866 for BMO-gMRA (P = 0.004). Among ONH sectors, the temporal inferior location showed the highest AUC. Optimization strategies to calculate BMO-based minimum rim area led to significantly different results. Imposing an additional adjacency constraint within calculation of BMO-MRA does not improve diagnostic power. Global and temporal inferior BMO-MRA performed best in differentiating glaucoma patients.
Duarte, Belmiro P.M.; Wong, Weng Kee; Atkinson, Anthony C.
2016-01-01
T-optimum designs for model discrimination are notoriously difficult to find because of the computational difficulty involved in solving an optimization problem that involves two layers of optimization. Only a handful of analytical T-optimal designs are available for the simplest problems; the rest in the literature are found using specialized numerical procedures for a specific problem. We propose a potentially more systematic and general way for finding T-optimal designs using a Semi-Infinite Programming (SIP) approach. The strategy requires that we first reformulate the original minimax or maximin optimization problem into an equivalent semi-infinite program and solve it using an exchange-based method where lower and upper bounds produced by solving the outer and the inner programs, are iterated to convergence. A global Nonlinear Programming (NLP) solver is used to handle the subproblems, thus finding the optimal design and the least favorable parametric configuration that minimizes the residual sum of squares from the alternative or test models. We also use a nonlinear program to check the global optimality of the SIP-generated design and automate the construction of globally optimal designs. The algorithm is successfully used to produce results that coincide with several T-optimal designs reported in the literature for various types of model discrimination problems with normally distributed errors. However, our method is more general, merely requiring that the parameters of the model be estimated by a numerical optimization. PMID:27330230
Addressing an I/O Bottleneck in a Web-Based CERES QC Tool
NASA Astrophysics Data System (ADS)
Heckert, E.; Sun-Mack, S.; Chen, Y.; Chu, C.; Smith, R. A.
2016-12-01
In this poster, we explore the technologies we have used to overcome the problem of transmitting and analyzing large datasets in our web-based CERES Quality Control tool and consider four technologies to potentially adopt for future performance improvements. The CERES team uses this tool to validate pixel-level data from Terra, Aqua, SNPP, MSG, MTSAT, and many geostationary GOES satellites, as well as to develop cloud retrieval algorithms. The tool includes a histogram feature that allows the user to aggregate data from many different timestamps and different scenes globally or locally selected by the user by drawing bounding boxes. In order to provide a better user experience, the tool passes a large amount of data to the user's browser. The browser then processes the data in order to present it to users in various formats, for example as a histogram. In addition to using multiple servers to subset data and pass a smaller set of data to the browser, the tool also makes use of a compression technology, Gzip, to reduce the size of the data. However, sometimes the application in the browser is still slow when dealing with these large sets of data due to the delay in the browser receiving the server's response. To address this I/O bottleneck, we will investigate four alternatives and present the results in this poster: 1) sending uncompressed data, 2) ESRI's Limited Error Raster Compression (LERC), 3) Gzip, and 4) WebSocket protocol. These approaches are compared to each other and to the uncompressed control to determine the optimal solution.
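Option 3 is easy to benchmark offline: the snippet below gzips a synthetic histogram-like JSON payload to show the order of magnitude of savings. The payload shape and size are assumptions, not CERES data.

```python
import gzip
import json
import random

random.seed(7)
# Synthetic pixel-level values resembling a large numeric payload.
payload = {"pixels": [round(random.gauss(250.0, 30.0), 2) for _ in range(200_000)]}
raw = json.dumps(payload).encode("utf-8")
compressed = gzip.compress(raw, compresslevel=6)

print(f"raw:  {len(raw) / 1e6:.2f} MB")
print(f"gzip: {len(compressed) / 1e6:.2f} MB "
      f"({100 * len(compressed) / len(raw):.0f}% of raw)")
# The remaining cost is client-side decompression and JSON parsing, which is
# why binary encodings such as LERC or streaming over WebSocket may still win.
```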
Cross-Polar Aircraft Trajectory Optimization and the Potential Climate Impact
NASA Technical Reports Server (NTRS)
Ng, Hok K.; Sridhar, Banavar; Grabbe, Shon; Chen, Neil
2011-01-01
Cross-Polar routes offer new opportunities for air travel markets. Transpolar flights reduce travel times, fuel burns, and associated environmental emissions by flying direct paths between many North American and Asian cities. This study evaluates the potential benefits of flying wind-optimal polar routes and assesses their potential impact on climate change. An optimization algorithm is developed for transpolar flights to generate wind-optimal trajectories that minimize the climate impact of aircraft, in terms of global warming potentials (relative to warming by one kg of CO2) of several types of emissions, while avoiding regions of airspace that facilitate persistent contrail formation. Estimations of global warming potential are incorporated into the objective function of the optimization algorithm to assess the climate impact of aircraft emissions discharged at a given location and altitude. The regions of airspace with very low ambient temperature and areas favorable to persistent contrail formation are modeled as undesirable regions that aircraft should avoid and are formulated as soft state constraints. The fuel burn and climate impact of cross-polar air traffic flying various types of trajectory, including flight plan, great circle, wind-optimal, and contrail-avoidance, are computed for 15 origin-destination pairs between major international airports in the U.S. and Asia. Wind-optimal routes reduce the average fuel burn of flight plan routes by 4.4% on December 4, 2010 and 8.0% on August 7, 2010, respectively. The tradeoff between persistent contrail formation and additional global warming potential of aircraft emissions is investigated with and without altitude optimization. Without altitude optimization, the reduction in contrail travel times is gradual with increases in total fuel consumption. When altitude is optimized, a one percent increase in additional global warming potential, a climate impact equivalent to that of 4070 kg and 4220 kg of CO2 emissions, reduces persistent contrail formation by 135 and 105 minutes per flight during a day with medium and high contrail formation, respectively.
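The augmented objective can be sketched per trajectory segment: fuel burn plus GWP-weighted emissions, with a soft penalty added in contrail-favorable regions. The emission indices below are typical published values; the GWP weights and penalty factor are placeholders, not the paper's coefficients.

```python
# GWP weights relative to 1 kg of CO2 (placeholder values).
GWP = {"CO2": 1.0, "NOx": 120.0, "H2O": 0.5}
# Emission indices: kg of species emitted per kg of fuel burned (typical values).
EMISSION_INDEX = {"CO2": 3.16, "NOx": 0.014, "H2O": 1.23}

def segment_cost(fuel_kg, in_contrail_region, contrail_weight=0.5):
    """Return (fuel, CO2-equivalent climate cost) for one trajectory segment."""
    climate = sum(GWP[s] * EMISSION_INDEX[s] * fuel_kg for s in GWP)
    soft_penalty = contrail_weight * climate if in_contrail_region else 0.0
    return fuel_kg, climate + soft_penalty

# Score a toy 3-segment trajectory; a wind-optimal router would minimize the
# summed climate term over candidate paths and altitudes.
segments = [(1200.0, False), (900.0, True), (1100.0, False)]
fuel_total, climate_total = map(sum, zip(*[segment_cost(f, c) for f, c in segments]))
print(f"fuel {fuel_total:.0f} kg, CO2-equivalent climate cost {climate_total:.0f} kg")
```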
The Nitrogen Footprint Tool Network: A Multi-Institution Program To Reduce Nitrogen Pollution
Leach, Allison M.; Leary, Neil; Baron, Jill; Compton, Jana E.; Galloway, James N.; Hastings, Meredith G.; Kimiecik, Jacob; Lantz-Trissel, Jonathan; de la Reguera, Elizabeth; Ryals, Rebecca
2017-01-01
Anthropogenic sources of reactive nitrogen have local and global impacts on air and water quality and detrimental effects on human and ecosystem health. This article uses the Nitrogen Footprint Tool (NFT) to determine the amount of nitrogen (N) released as a result of institutional consumption. The sectors accounted for include food (consumption and upstream production), energy, transportation, fertilizer, research animals, and agricultural research. The NFT is then used for scenario analysis to manage and track reductions, which are driven by the consumption behaviors of both the institution itself and its constituent individuals. In this article, the first seven completed institution nitrogen footprint results are presented. The Nitrogen Footprint Tool Network aims to develop footprints for many institutions to encourage widespread upper-level management strategies that will create significant reductions in reactive nitrogen released to the environment. Energy use and food purchases are the two largest sectors contributing to institution nitrogen footprints. Ongoing efforts by institutions to reduce greenhouse gas emissions also help to reduce the nitrogen footprint, but the impact of food production on nitrogen pollution has not been directly addressed by the higher education sustainability community. The Nitrogen Footprint Tool Network found that institutions could reduce their nitrogen footprints by optimizing food purchasing to reduce consumption of animal products and minimize food waste, as well as by reducing dependence on fossil fuels for energy. PMID:29350216
Recent Updates in the Endoscopic Diagnosis of Barrett's Oesophagus.
Sharma, Neel; Ho, Khek Yu
2016-10-01
Barrett's oesophagus (BO) is a premalignant condition associated with the development of oesophageal adenocarcinoma (OAC). Despite the low risk of progression per annum, OAC is associated with significant morbidity and mortality, with an estimated 5-year survival of 10%. Furthermore, the incidence of OAC continues to rise globally. Therefore, it is imperative to detect the premalignant phase of BO and follow up such patients accordingly. The mainstay of BO diagnosis is endoscopy and biopsy sampling. However, limitations with white light endoscopy (WLE) and undertaking biopsies have shifted the current focus towards real-time image analysis. Utilization of additional tools such as chromoendoscopy, narrow-band imaging (NBI), confocal laser endomicroscopy (CLE), and optical coherence tomography (OCT) is proving beneficial. It is also becoming apparent that these tools are often utilized by experts in the field; therefore, for the non-expert, training in these systems is key. As yet, the methodologies used for training optimization require further inquiry. (1) Real-time imaging can serve to minimize excess biopsies. (2) Tools such as chromoendoscopy, NBI, CLE, and OCT can help to complement WLE, which is associated with limited sensitivity. Biopsy sampling is cost-ineffective and associated with sampling error. Hence, from a practical perspective, endoscopists should aim to utilize additional tools to help in real-time image interpretation and minimize an overreliance on histology.
CNV detection method optimized for high-resolution arrayCGH by normality test.
Ahn, Jaegyoon; Yoon, Youngmi; Park, Chihyun; Park, Sanghyun
2012-04-01
High-resolution arrayCGH platforms make it possible to detect small gains and losses that previously could not be measured. However, current CNV detection tools fitted to early low-resolution data are not applicable to larger high-resolution datasets. When applied to high-resolution data, they suffer from high false-positive rates, which increases validation cost. Existing CNV detection tools also require optimal parameter values, which in most cases are difficult to obtain. This study developed a CNV detection algorithm that is optimized for high-resolution arrayCGH data. The tool operates up to 1500 times faster than existing tools on a high-resolution arrayCGH of whole human chromosomes, comprising 42 million probes with an average length of 50 bases, while preserving false positive/negative rates. The algorithm also uses a normality test, thereby removing the need for optimal parameters. To our knowledge, this is the first formulation of the CNV detection problem that results in a near-linear empirical overall complexity for real high-resolution data.
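A loose sketch of the windowed, normality-based idea (not the authors' algorithm): verify that the background log-ratios are plausibly normal, then flag windows whose mean is extreme under the fitted normal, so the only tuning knob is a significance level. Data and thresholds below are synthetic.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
log_ratio = rng.normal(0.0, 0.2, 10_000)
log_ratio[4000:4200] += 0.8            # implanted gain (a small synthetic CNV)

# Check that the diploid background is plausibly normal on a subsample,
# then flag windows whose mean is extreme under that fitted normal.
stat, p = stats.shapiro(log_ratio[:500])
print(f"baseline normality p = {p:.3f}")

w = 100
means = log_ratio[: len(log_ratio) // w * w].reshape(-1, w).mean(axis=1)
z = (means - log_ratio.mean()) / (log_ratio.std() / np.sqrt(w))
hits = np.nonzero(np.abs(z) > stats.norm.ppf(1 - 0.001 / 2))[0]
print("candidate CNV window starts:", hits * w)
```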
An optimization model for the agroindustrial sector in Antioquia (Colombia, South America)
NASA Astrophysics Data System (ADS)
Fernandez, J.
2015-06-01
This paper proposes a general optimization model for the flower industry, built with discrete simulation and nonlinear optimization; the mathematical models are solved using ProModel simulation tools and GAMS optimization software. The model defines the operations that constitute production and marketing in the sector, using statistically validated data collected directly from each operation through field work, and formulates both a discrete simulation model of the operations and an optimization model of the entire industry chain. The model is solved with the tools described above, and the results are validated in a case study.
Application of simulation models for the optimization of business processes
NASA Astrophysics Data System (ADS)
Jašek, Roman; Sedláček, Michal; Chramcov, Bronislav; Dvořák, Jiří
2016-06-01
The paper deals with applications of modeling and simulation tools in the optimization of business processes, especially in optimizing signal flow in a security company. Simul8, a process-modeling tool based on discrete-event simulation, was selected because it enables the creation of visual models of production and distribution processes.
A simulation-optimization-based decision support tool for mitigating traffic congestion.
DOT National Transportation Integrated Search
2009-12-01
"Traffic congestion has grown considerably in the United States over the past twenty years. In this paper, we develop : a robust decision support tool based on simulation optimization to evaluate and recommend congestion-mitigation : strategies to tr...
Kastner, Monika; Perrier, Laure; Hamid, Jemila; Tricco, Andrea C; Cardoso, Roberta; Ivers, Noah M; Liu, Barbara; Marr, Sharon; Holroyd-Leduc, Jayna; Wong, Geoff; Graves, Lisa; Straus, Sharon E
2015-02-03
The burden of chronic disease is a global phenomenon, particularly among people aged 65 years and older. More than half of older adults have more than one chronic disease and their care is not optimal. Chronic disease management (CDM) tools have the potential to meet this challenge but they are primarily focused on a single disease, which fails to address the growing number of seniors with multiple chronic conditions. We will conduct a systematic review alongside a realist review to identify effective CDM tools that integrate one or more high-burden chronic diseases affecting older adults and to better understand for whom, under what circumstances, how and why they produce their outcomes. We will search MEDLINE, EMBASE, CINAHL, AgeLine and the Cochrane Library for experimental, quasi-experimental, observational and qualitative studies in any language investigating CDM tools that facilitate optimal disease management in one or more high-burden chronic diseases affecting adults aged ≥65 years. Study selection will involve calibration of reviewers to ensure reliability of screening and duplicate assessment of articles. Data abstraction and risk of bias assessment will also be performed independently. Analysis will include descriptive summaries of study and appraisal characteristics, effectiveness of each CDM tool (meta-analysis if appropriate); and a realist programme theory will be developed and refined to explain the outcome patterns within the included studies. Ethics approval is not required for this study. We anticipate that our findings, pertaining to gaps in care across high-burden chronic diseases affecting seniors and highlighting specific areas that may require more research, will be of interest to a wide range of knowledge users and stakeholders. We will publish and present our findings widely, and also plan more active dissemination strategies such as workshops with our key stakeholders. Our protocol is registered with PROSPERO (registration number CRD42014014489).
Trincado, Claudia; Molina, Víctor; Urcelay, Gonzalo; Dellepiane, Paulina
2018-02-01
The echocardiographic evaluation of patients after heart transplantation is a useful tool. However, it is still necessary to define an optimal follow-up protocol. To describe the results of the application of a functional echocardiographic protocol in patients being followed after pediatric heart transplantation. All patients followed at our institution after pediatric heart transplantation underwent an echocardiographic examination with a functional protocol that included global longitudinal strain. Contemporaneous endomyocardial biopsy results and hemodynamic data were recorded. Nine patients were evaluated with our echocardiographic functional protocol. Of these, only one showed systolic left ventricular dysfunction according to classic parameters. However, almost all patients had an abnormal global longitudinal strain. Right ventricular systolic dysfunction was observed in all patients. No episodes of moderate to severe rejection were recorded. No correlation was observed between these parameters and pulmonary artery pressure. Subclinical biventricular systolic dysfunction was observed in the majority of the patients in this study. No association with rejection episodes or pulmonary hypertension was observed, which may be related to the absence of moderate or severe rejection episodes during the study period and to the small sample size. Long-term follow-up of these patients may better define the clinical relevance of our findings.
Marine Algae: a Source of Biomass for Biotechnological Applications.
Stengel, Dagmar B; Connan, Solène
2015-01-01
Biomass derived from marine microalgae and macroalgae is globally recognized as a source of valuable chemical constituents with applications in the agri-horticultural sector (including animal feeds, health products, and plant stimulants), as human food and food ingredients, as well as in the nutraceutical, cosmeceutical, and pharmaceutical industries. Supplying algal biomass of sufficient quality and quantity nevertheless remains a concern, with increasing environmental pressures conflicting with the growing demand. Recent attempts at supplying consistent, safe and environmentally acceptable biomass through cultivation of (macro- and micro-) algae have concentrated on characterizing natural variability in bioactives and optimizing cultivated materials through strain selection and hybridization, as well as breeding and, more recently, genetic improvement of biomass. Biotechnological tools including metabolomics, transcriptomics, and genomics have recently been extended to algae but, in comparison to microbial or plant biomass, still remain underdeveloped. Current progress in algal biotechnology is driven by an increased demand for new sources of biomass due to several global challenges, by new discoveries and technologies available, and by an increased global awareness of the many applications of algae. Algal diversity and complexity provide significant potential, provided that shortages in suitable and safe biomass can be met and consumer demands are matched by commercial investment in product development.
NASA Astrophysics Data System (ADS)
Godinez, H. C.; Rougier, E.; Osthus, D.; Srinivasan, G.
2017-12-01
Fracture propagation plays a key role in a number of applications of interest to the scientific community. From dynamic fracture processes like spall and fragmentation in metals to the detection of gas flow in static fractures in rock and the subsurface, the dynamics of fracture propagation is important to various engineering and scientific disciplines. In this work we apply a global sensitivity analysis to the Hybrid Optimization Software Suite (HOSS), a multi-physics software tool based on the combined finite-discrete element method that is used to describe material deformation and failure (i.e., fracture and fragmentation) under a number of user-prescribed boundary conditions. We explore the sensitivity of HOSS to various model parameters that influence how fractures propagate through a material of interest. These parameters control the softening curve that the model relies on to determine fractures within each element in the mesh, as well as other internal parameters that influence fracture behavior. The sensitivity method we apply is the Fourier Amplitude Sensitivity Test (FAST), a global sensitivity method, used to explore how each parameter influences the modeled fracture and to determine the key model parameters that have the most impact on the model. We present several sensitivity experiments for different combinations of model parameters and compare against experimental data for verification.
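For readers who want to try a FAST-style analysis themselves, the sketch below uses the open-source SALib package (assuming its fast_sampler/fast modules behave as documented) with an invented three-parameter toy model standing in for a HOSS run; parameter names and bounds are hypothetical.

```python
import numpy as np
from SALib.sample import fast_sampler
from SALib.analyze import fast

# Hypothetical stand-ins for softening-curve parameters; HOSS itself is not invoked here.
problem = {
    "num_vars": 3,
    "names": ["tensile_strength", "fracture_energy", "softening_exponent"],
    "bounds": [[5.0, 15.0], [50.0, 200.0], [0.5, 2.0]],
}

X = fast_sampler.sample(problem, 1024)      # FAST sampling curves over the box

def toy_model(x):                           # placeholder for one HOSS evaluation
    s, g, n = x
    return s ** 1.5 / g + 0.1 * n ** 2

Y = np.array([toy_model(x) for x in X])
Si = fast.analyze(problem, Y)               # first-order (S1) and total (ST) indices
for name, s1, st in zip(problem["names"], Si["S1"], Si["ST"]):
    print(f"{name:20s} S1={s1:.3f}  ST={st:.3f}")
```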
NASA Astrophysics Data System (ADS)
Yazdanpanah Moghadam, Peyman; Quaegebeur, Nicolas; Masson, Patrice
2015-01-01
Piezoelectric transducers are commonly used in structural health monitoring systems to generate and measure ultrasonic guided waves (GWs) by applying interfacial shear and normal stresses to the host structure. In most cases, in order to perform damage detection, advanced signal processing techniques are required, since a minimum of two dispersive modes are propagating in the host structure. In this paper, a systematic approach for mode selection is proposed by optimizing the interfacial shear stress profile applied to the host structure, representing the first step of a global optimization of selective mode actuator design. This approach has the potential of reducing the complexity of signal processing tools, as the number of propagating modes could be reduced. Using the superposition principle, an analytical method is first developed for GW excitation by a finite number of uniform segments, each contributing a given elementary shear stress profile. Based on this, cost functions are defined in order to minimize the undesired modes and amplify the selected mode, and the optimization problem is solved with a parallel genetic algorithm optimization framework. Advantages of this method over more conventional transducer tuning approaches are that (1) the shear stress can be explicitly optimized to both excite one mode and suppress other undesired modes, (2) the size of the excitation area is not constrained and mode-selective excitation is still possible even if the excitation width is smaller than all excited wavelengths, and (3) the selectivity is increased and the bandwidth extended. The complexity of the optimal shear stress profile obtained is shown considering two cost functions with various optimal excitation widths and numbers of segments. Results illustrate that the desired mode (A0 or S0) can be excited dominantly over other modes up to a wave power ratio of 10^10 using an optimal shear stress profile.
Development of optimal grinding and polishing tools for aspheric surfaces
NASA Astrophysics Data System (ADS)
Burge, James H.; Anderson, Bill; Benjamin, Scott; Cho, Myung K.; Smith, Koby Z.; Valente, Martin J.
2001-12-01
The ability to grind and polish steep aspheric surfaces to high quality is limited by the tools used for working the surface. The optician prefers to use large, stiff tools to get good natural smoothing, avoiding small scale surface errors. This is difficult for steep aspheres because the tools must have sufficient compliance to fit the aspheric surface, yet we wish the tools to be stiff so they wear down high regions on the surface. This paper presents a toolkit for designing optimal tools that provide large scale compliance to fit the aspheric surface, yet maintain small scale stiffness for efficient polishing.
Shape Optimization of Supersonic Turbines Using Response Surface and Neural Network Methods
NASA Technical Reports Server (NTRS)
Papila, Nilay; Shyy, Wei; Griffin, Lisa W.; Dorney, Daniel J.
2001-01-01
Turbine performance directly affects engine specific impulse, thrust-to-weight ratio, and cost in a rocket propulsion system. A global optimization framework combining the radial basis neural network (RBNN) and the polynomial-based response surface method (RSM) is constructed for shape optimization of a supersonic turbine. Based on the optimized preliminary design, shape optimization is performed for the first vane and blade of a 2-stage supersonic turbine, involving O(10) design variables. The design of experiment approach is adopted to reduce the data size needed by the optimization task. It is demonstrated that a major merit of the global optimization approach is that it enables one to adaptively revise the design space to perform multiple optimization cycles. This benefit is realized when an optimal design approaches the boundary of a pre-defined design space. Furthermore, by inspecting the influence of each design variable, one can also gain insight into the existence of multiple design choices and select the optimum design based on other factors such as stress and materials considerations.
An optimization tool for satellite equipment layout
NASA Astrophysics Data System (ADS)
Qin, Zheng; Liang, Yan-gang; Zhou, Jian-ping
2018-01-01
Selection of the satellite equipment layout under performance constraints is a complex task that can be viewed as a constrained multi-objective optimization and a multiple criteria decision making problem. The layout design of a satellite cabin involves locating the required equipment in a limited space while satisfying various behavioral constraints of the interior and exterior environments. The layout optimization of the satellite cabin in this paper accounts for the C.G. offset, the moments of inertia, and the space debris impact risk of the system, where the impact risk index is developed to quantify the risk to a satellite cabin of coming into contact with space debris. This paper presents an optimization tool integrating CAD software with optimization algorithms, developed to automatically find solutions for a three-dimensional layout of equipment in a satellite. The effectiveness of the tool is demonstrated by applying it to the layout optimization of a satellite platform.
Egli, Lukas; Meyer, Carsten; Scherber, Christoph; Kreft, Holger; Tscharntke, Teja
2018-05-01
Closing yield gaps within existing croplands, and thereby avoiding further habitat conversions, is a prominently and controversially discussed strategy to meet the rising demand for agricultural products, while minimizing biodiversity impacts. The agricultural intensification associated with such a strategy poses additional threats to biodiversity within agricultural landscapes. The uneven spatial distribution of both yield gaps and biodiversity provides opportunities for reconciling agricultural intensification and biodiversity conservation through spatially optimized intensification. Here, we integrate distribution and habitat information for almost 20,000 vertebrate species with land-cover and land-use datasets. We estimate that projected agricultural intensification between 2000 and 2040 would reduce the global biodiversity value of agricultural lands by 11%, relative to 2000. Contrasting these projections with spatial land-use optimization scenarios reveals that 88% of projected biodiversity loss could be avoided through globally coordinated land-use planning, implying huge efficiency gains through international cooperation. However, global-scale optimization also implies a highly uneven distribution of costs and benefits, resulting in distinct "winners and losers" in terms of national economic development, food security, food sovereignty or conservation. Given conflicting national interests and lacking effective governance mechanisms to guarantee equitable compensation of losers, multinational land-use optimization seems politically unlikely. In turn, 61% of projected biodiversity loss could be avoided through nationally focused optimization, and 33% through optimization within just 10 countries. Targeted efforts to improve the capacity for integrated land-use planning for sustainable intensification especially in these countries, including the strengthening of institutions that can arbitrate subnational land-use conflicts, may offer an effective, yet politically feasible, avenue to better reconcile future trade-offs between agriculture and conservation. The efficiency gains of optimization remained robust when assuming that yields could only be increased to 80% of their potential. Our results highlight the need to better integrate real-world governance, political and economic challenges into sustainable development and global change mitigation research.
Global Budgeting in the OECD Countries
Wolfe, Patrice R.; Moran, Donald W.
1993-01-01
Many of the Organization for Economic Cooperation and Development countries use global budgeting to control all or certain portions of their health care expenditures. Although the use of global budgets as a cost-containment tool has not been implemented in the United States in any comprehensive way, recent health care reform initiatives have increased the need for research into such tools. In general, the structure, process, and effectiveness of global budgets vary enormously from country to country, in part because the underlying social welfare system of each country is unique. PMID:10130584
NASA Astrophysics Data System (ADS)
Chaudhuri, Anirban
Global optimization based on expensive and time-consuming simulations or experiments usually cannot be carried out to convergence, but must be stopped because of time constraints, or because the cost of additional function evaluations exceeds the benefit of improving the objective(s). This dissertation sets out to explore the implications of such budget and time constraints on the balance between exploration and exploitation and on the decision of when to stop. Three aspects are considered in terms of their effects on the balance between exploration and exploitation: 1) the history of the optimization, 2) a fixed evaluation budget, and 3) cost as a part of the objective function. To this end, this research develops modifications to the surrogate-based Efficient Global Optimization algorithm that better control the balance between exploration and exploitation, along with stopping criteria facilitated by these modifications. The focus then shifts to experimental optimization, which shares the issues of cost and time constraints. Through a study on optimization of thrust and power for a small flapping wing for micro air vehicles, important differences and similarities between experimental and simulation-based optimization are identified. The most important difference is that reduction of noise in experiments becomes a major time and cost issue; a second difference is that parallelism as a way to cut cost is more challenging. The experimental optimization reveals the tendency of the surrogate to display optimistic bias near the surrogate optimum, a tendency then verified to also occur in simulation-based optimization.
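The exploration-exploitation balance in Efficient Global Optimization is usually mediated by the expected improvement criterion; a minimal sketch (for minimization, given a surrogate's predicted mean and uncertainty) follows. The xi parameter shown is one common way to bias the balance; the dissertation's specific modifications are not reproduced here.

```python
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, f_best, xi=0.01):
    """EI for minimization: trades off low predicted mean (exploitation)
    against high predictive uncertainty (exploration)."""
    sigma = np.maximum(sigma, 1e-12)
    imp = f_best - mu - xi            # xi > 0 nudges the search towards exploration
    z = imp / sigma
    return imp * norm.cdf(z) + sigma * norm.pdf(z)

mu = np.array([0.2, 0.5, 0.1])        # surrogate (e.g. kriging) predictions
sigma = np.array([0.05, 0.4, 0.30])   # surrogate uncertainty at the same points
print(expected_improvement(mu, sigma, f_best=0.25))
```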
Fong, Simon; Deb, Suash; Yang, Xin-She; Zhuang, Yan
2014-01-01
Traditional K-means clustering algorithms have the drawback of getting stuck at local optima that depend on the random values of the initial centroids. Optimization algorithms have the advantage of guiding iterative computation towards global optima while avoiding local optima, and they help speed up the clustering process by converging on a global optimum early with multiple search agents in action. Inspired by nature, some contemporary optimization algorithms, including Ant, Bat, Cuckoo, Firefly, and Wolf search algorithms, mimic swarming behavior, allowing agents to cooperatively steer towards an optimal objective within a reasonable time. These so-called nature-inspired optimization algorithms have their own characteristics as well as pros and cons in different applications. When combined with the K-means clustering mechanism to enhance clustering quality by avoiding local optima and finding global optima, the resulting hybrids are anticipated to produce unprecedented performance. In this paper, we report the results of our evaluation experiments on the integration of nature-inspired optimization methods into K-means algorithms. In addition to standard metrics of clustering quality, the extended K-means algorithms empowered by nature-inspired optimization methods are applied to image segmentation as a case study.
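A compact sketch of the hybrid idea, with a deliberately simplified "swarm" standing in for the firefly/cuckoo-style moves (the agent dynamics here are invented for illustration, not any published algorithm): candidate centroid sets drift toward the best agent, and the winner seeds standard Lloyd iterations.

```python
import numpy as np

rng = np.random.default_rng(2)
X = np.vstack([rng.normal(m, 0.5, (100, 2)) for m in ((0, 0), (4, 4), (0, 5))])
k = 3

def sse(centroids):
    """Sum of squared distances of every point to its nearest centroid."""
    d = ((X[:, None, :] - centroids[None]) ** 2).sum(-1)
    return d.min(axis=1).sum()

# A small swarm of candidate centroid sets; each agent drifts towards the
# current best agent with a random perturbation (simplified swarm move).
agents = [X[rng.choice(len(X), k, replace=False)] for _ in range(20)]
for _ in range(30):
    best = min(agents, key=sse)
    agents = [a + 0.3 * (best - a) + rng.normal(0, 0.05, a.shape) for a in agents]

# Seed standard Lloyd iterations with the best swarm solution.
centroids = min(agents, key=sse)
for _ in range(50):
    labels = ((X[:, None, :] - centroids[None]) ** 2).sum(-1).argmin(axis=1)
    centroids = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                          else centroids[j] for j in range(k)])
print(np.round(centroids, 2))
```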
Raja, Muhammad Asif Zahoor; Zameer, Aneela; Khan, Aziz Ullah; Wazwaz, Abdul Majid
2016-01-01
In this study, a novel bio-inspired computing approach is developed to analyze the dynamics of the nonlinear singular Thomas-Fermi equation (TFE), arising in potential and charge density models of an atom, by exploiting the strength of a finite difference scheme (FDS) for discretization and optimization through genetic algorithms (GAs) hybridized with sequential quadratic programming (SQP). The FDS procedure transforms the TFE into a system of nonlinear equations. A fitness function is constructed from the residual error of the constituent equations in the mean square sense and formulated as a minimization problem. Optimization of the system parameters is carried out with GAs, used as a tool for viable global search, integrated with the SQP algorithm for rapid refinement of the results. The design scheme is applied to solve the TFE for five different scenarios by taking various step sizes and input intervals. Comparison of the proposed results with state-of-the-art numerical and analytical solutions reveals the worth of our scheme in terms of accuracy and convergence. The reliability and effectiveness of the proposed scheme are validated by consistently obtaining optimal values of statistical performance indices over a sufficiently large number of independent runs.
Illumina Production Sequencing at the DOE Joint Genome Institute - Workflow and Optimizations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tarver, Angela; Fern, Alison; Diego, Matthew San
2010-06-18
The U.S. Department of Energy (DOE) Joint Genome Institute's (JGI) Production Sequencing group is committed to the generation of high-quality genomic DNA sequence to support the DOE mission areas of renewable energy generation, global carbon management, and environmental characterization and clean-up. Within the JGI's Production Sequencing group, the Illumina Genome Analyzer pipeline has been established as one of three sequencing platforms, along with Roche/454 and ABI/Sanger. Optimization of the Illumina pipeline has been ongoing with the aim of continual process improvement of the laboratory workflow. These process improvement projects are being led by the JGI's Process Optimization, Sequencing Technologies, Instrumentation & Engineering, and New Technology Production groups. Primary focus has been on improving the procedural ergonomics and the technicians' operating environment, reducing manually intensive technician operations with different tools, reducing associated production costs, and improving the overall process and generated sequence quality. The U.S. DOE JGI was established in 1997 in Walnut Creek, CA, to unite the expertise and resources of five national laboratories (Lawrence Berkeley, Lawrence Livermore, Los Alamos, Oak Ridge, and Pacific Northwest) along with the HudsonAlpha Institute for Biotechnology. JGI is operated by the University of California for the U.S. DOE.
Bodien, Yelena G; McCrea, Michael; Dikmen, Sureyya; Temkin, Nancy; Boase, Kim; Machamer, Joan; Taylor, Sabrina R; Sherer, Mark; Levin, Harvey; Kramer, Joel H; Corrigan, John D; McAllister, Thomas W; Whyte, John; Manley, Geoffrey T; Giacino, Joseph T
Traumatic brain injury (TBI) is a global public health problem that affects the long-term cognitive, physical, and psychological health of patients, while also having a major impact on family and caregivers. In stark contrast to the effective trials that have been conducted in other neurological diseases, nearly 30 studies of interventions employed during acute hospital care for TBI have failed to identify treatments that improve outcome. Many factors may confound the ability to detect true and meaningful treatment effects. One promising area for improving the precision of intervention studies is to optimize the validity of the outcome assessment battery by using well-designed tools and data collection strategies to reduce variability in the outcome data. The Transforming Research and Clinical Knowledge in TBI (TRACK-TBI) study, conducted at 18 sites across the United States, implemented a multidimensional outcome assessment battery with 22 measures aimed at characterizing TBI outcome up to 1 year postinjury. In parallel, through the TBI Endpoints Development (TED) Initiative, federal agencies and investigators have partnered to identify the most valid, reliable, and sensitive outcome assessments for TBI. Here, we present lessons learned from the TRACK-TBI and TED initiatives aimed at optimizing the validity of outcome assessment in TBI.
Global, Multi-Objective Trajectory Optimization With Parametric Spreading
NASA Technical Reports Server (NTRS)
Vavrina, Matthew A.; Englander, Jacob A.; Phillips, Sean M.; Hughes, Kyle M.
2017-01-01
Mission design problems are often characterized by multiple, competing trajectory optimization objectives. Recent multi-objective trajectory optimization formulations enable generation of globally-optimal, Pareto solutions via a multi-objective genetic algorithm. A byproduct of these formulations is that clustering in design space can occur in evolving the population towards the Pareto front. This clustering can be a drawback, however, if parametric evaluations of design variables are desired. This effort addresses clustering by incorporating operators that encourage a uniform spread over specified design variables while maintaining Pareto front representation. The algorithm is demonstrated on a Neptune orbiter mission, and enhanced multidimensional visualization strategies are presented.
Streamflow Prediction based on Chaos Theory
NASA Astrophysics Data System (ADS)
Li, X.; Wang, X.; Babovic, V. M.
2015-12-01
Chaos theory is a popular method for hydrologic time series prediction. The local model (LM) based on this theory uses time-delay embedding to reconstruct the phase-space diagram, so its efficacy depends on the embedding parameters: embedding dimension, time lag, and number of nearest neighbors. Optimal estimation of these parameters is thus critical to the application of the local model. However, these embedding parameters are conventionally estimated separately, using Average Mutual Information (AMI) and False Nearest Neighbors (FNN), which may lead to a local optimum and thus limit prediction accuracy. In view of these limitations, this paper applies a local model combined with simulated annealing (SA) to find the global optimum of the embedding parameters, and compares it with another global optimization approach, the Genetic Algorithm (GA). The proposed hybrid methods are applied to daily and monthly streamflow time series for examination. The results show that global optimization helps the local model provide more accurate predictions than local optimization, with the LM combined with SA showing advantages in computational efficiency. The proposed scheme can also be applied to other fields such as prediction of hydro-climatic time series, error correction, etc.
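A minimal sketch of the approach on synthetic data: time-delay embedding plus a k-nearest-neighbor local model, with simulated annealing searching jointly over embedding dimension, lag, and neighbor count (the parameter bounds and annealing schedule are invented for illustration).

```python
import numpy as np

rng = np.random.default_rng(3)
t = np.arange(2000)
series = np.sin(0.05 * t) + 0.5 * np.sin(0.013 * t) + rng.normal(0, 0.02, t.size)

def embed(x, dim, lag):
    """Time-delay embedding: rows are delay vectors [x(i), x(i+lag), ...]."""
    n = x.size - (dim - 1) * lag
    return np.column_stack([x[i * lag : i * lag + n] for i in range(dim)])

def one_step_error(x, dim, lag, k):
    """Mean squared one-step k-NN local-model prediction error on the last 200 points."""
    V = embed(x, dim, lag)
    targets = x[(dim - 1) * lag + 1 :]
    V = V[: targets.size]
    err = []
    for i in range(V.shape[0] - 200, V.shape[0]):
        d = np.abs(V[: i - 1] - V[i]).max(axis=1)   # Chebyshev distance to past states
        nn = np.argsort(d)[:k]
        err.append((targets[nn].mean() - targets[i]) ** 2)
    return float(np.mean(err))

# Simulated annealing jointly over (dim, lag, k), instead of fixing lag via AMI
# and dim via FNN separately; bounds [1, 12] keep the embedding well defined.
state = (3, 5, 4)
cost = one_step_error(series, *state)
T = 1.0
for _ in range(200):
    cand = tuple(int(np.clip(s + rng.integers(-1, 2), 1, 12)) for s in state)
    c = one_step_error(series, *cand)
    if c < cost or rng.random() < np.exp((cost - c) / T):
        state, cost = cand, c
    T *= 0.98
print("dim, lag, k =", state, " mse =", round(cost, 6))
```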
NASA Astrophysics Data System (ADS)
Alimohammadi, Shahrouz; Cavaglieri, Daniele; Beyhaghi, Pooriya; Bewley, Thomas R.
2016-11-01
This work applies a recently developed Derivative-free optimization algorithm to derive a new mixed implicit-explicit (IMEX) time integration scheme for Computational Fluid Dynamics (CFD) simulations. This algorithm allows imposing a specified order of accuracy for the time integration and other important stability properties in the form of nonlinear constraints within the optimization problem. In this procedure, the coefficients of the IMEX scheme should satisfy a set of constraints simultaneously. Therefore, the optimization process, at each iteration, estimates the location of the optimal coefficients using a set of global surrogates, for both the objective and constraint functions, as well as a model of the uncertainty function of these surrogates based on the concept of Delaunay triangulation. This procedure has been proven to converge to the global minimum of the constrained optimization problem provided the constraints and objective functions are twice differentiable. As a result, a new third-order, low-storage IMEX Runge-Kutta time integration scheme is obtained with remarkably fast convergence. Numerical tests are then performed leveraging the turbulent channel flow simulations to validate the theoretical order of accuracy and stability properties of the new scheme.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bartlett, Roscoe
2010-03-31
GlobiPack contains a small collection of optimization globalization algorithms. These algorithms are used by optimization and various nonlinear equation solver algorithms, serving as the line-search procedure within Newton and quasi-Newton optimization and nonlinear equation solver methods. They are standard, published 1-D line search algorithms such as those described in Nocedal and Wright, Numerical Optimization, 2nd edition, 2006. One set of algorithms was copied and refactored from the existing open-source Trilinos package MOOCHO, where the line search code is used to globalize SQP methods. This software is generic to any mathematical optimization problem where smooth derivatives exist; there is no specific connection to any particular application.
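A textbook example of the kind of 1-D globalization such a package provides (this is the standard Armijo backtracking rule from Nocedal and Wright, not GlobiPack's actual source):

```python
import numpy as np

def backtracking_line_search(f, grad, x, p, alpha0=1.0, c=1e-4, rho=0.5):
    """Armijo backtracking: shrink the step until sufficient decrease holds."""
    fx, slope = f(x), grad(x) @ p
    assert slope < 0, "p must be a descent direction"
    alpha = alpha0
    while f(x + alpha * p) > fx + c * alpha * slope:
        alpha *= rho
    return alpha

# Usage on a simple quadratic with a steepest-descent direction.
f = lambda x: 0.5 * x @ x
grad = lambda x: x
x = np.array([3.0, -4.0])
p = -grad(x)
print("step length:", backtracking_line_search(f, grad, x, p))
```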
Ruan, Jujun; Zhang, Chao; Li, Ya; Li, Peiyi; Yang, Zaizhi; Chen, Xiaohong; Huang, Mingzhi; Zhang, Tao
2017-02-01
This work proposes an on-line hybrid intelligent control system, based on a genetic algorithm (GA) evolving a fuzzy wavelet neural network software sensor, to control dissolved oxygen (DO) in an anaerobic/anoxic/oxic process for treating papermaking wastewater. Combining the self-learning and memory abilities of neural networks, the uncertainty-handling capacity of fuzzy logic, the local detail analysis of the wavelet transform, and the global search of GAs, the proposed control system can extract the dynamic behavior and complex interrelationships between the various operation variables. The results indicate that reasonable forecasting and control performance was achieved with optimal DO, and that effluent quality was kept stable at or below the desired values in real time. The proposed hybrid approach proved to be a robust and effective DO control tool, attaining not only adequate effluent quality but also minimizing the demand for energy, and it is easily integrated into a global monitoring system for purposes of cost management.
Simulation of the spatial frequency-dependent sensitivities of Acoustic Emission sensors
NASA Astrophysics Data System (ADS)
Boulay, N.; Lhémery, A.; Zhang, F.
2018-05-01
Typical configurations of nondestructive testing by Acoustic Emission (NDT/AE) make use of multiple sensors positioned on the tested structure for detecting evolving flaws and possibly locating them by triangulation. Sensor positions must be optimized to ensure global coverage sensitivity to AE events while minimizing the number of sensors. A simulator of NDT/AE is under development to help design testing configurations and interpret measurements. A global model chains sub-models simulating the various phenomena taking place at different spatial and temporal scales (crack growth, AE source and radiation, wave propagation in the structure, reception by sensors). In this context, accurate modelling of sensor behaviour must be developed. These sensors generally consist of a cylindrical piezoelectric element of radius approximately equal to its thickness, without damping and bonded to its case; the sensors themselves are bonded to the structure being tested. Here, a multiphysics finite element simulation tool is used to study the complex behaviour of an AE sensor. The simulated behaviour is shown to accurately reproduce the high-amplitude measured contributions used in AE practice.
Climate targets and cost-effective climate stabilization pathways
NASA Astrophysics Data System (ADS)
Held, H.
2015-08-01
Climate economics has developed two main tools to derive an economically adequate response to the climate problem. Cost-benefit analysis weighs any available information on mitigation costs and benefits and thereby derives an "optimal" global mean temperature. By contrast, cost-effectiveness analysis derives the costs of potential policy targets and the corresponding cost-minimizing investment paths. The article highlights the pros and cons of both approaches and then focuses on the implications of a policy that strives to limit global warming to 2 °C above pre-industrial values. The related mitigation costs and changes in the energy sector are summarized according to the IPCC report of 2014. The article then points to conceptual difficulties in internalizing uncertainty in these types of analyses and suggests pragmatic solutions. Key statements on mitigation economics remain valid under uncertainty when given the adequate interpretation. Furthermore, the expected economic value of perfect climate information is found to be on the order of hundreds of billions of euros per year if a 2 °C policy were pursued. Finally, the prospects of climate policy are sketched.
Detecting Surgical Tools by Modelling Local Appearance and Global Shape.
Bouget, David; Benenson, Rodrigo; Omran, Mohamed; Riffaud, Laurent; Schiele, Bernt; Jannin, Pierre
2015-12-01
Detecting tools in surgical videos is an important ingredient for context-aware computer-assisted surgical systems. To this end, we present a new surgical tool detection dataset and a method for joint tool detection and pose estimation in 2d images. Our two-stage pipeline is data-driven and relaxes strong assumptions made by previous works regarding the geometry, number, and position of tools in the image. The first stage classifies each pixel based on local appearance only, while the second stage evaluates a tool-specific shape template to enforce global shape. Both local appearance and global shape are learned from training data. Our method is validated on a new surgical tool dataset of 2,476 images from neurosurgical microscopes, which is made freely available. It improves over existing datasets in size, diversity, and detail of annotation. We show that our method significantly improves over competitive baselines from the computer vision field, achieving a 15% detection miss-rate at 10^-1 false positives per image (for the suction tube) on our surgical tool dataset. Results indicate that performing semantic labelling as an intermediate task is key for high-quality detection.
Simple Example of Backtest Overfitting (SEBO)
DOE Office of Scientific and Technical Information (OSTI.GOV)
In the field of mathematical finance, a "backtest" is the use of historical market data to assess the performance of a proposed trading strategy. It is a relatively simple matter for a present-day computer system to explore thousands, millions or even billions of variations of a proposed strategy, and pick the best-performing variant as the "optimal" strategy "in sample" (i.e., on the input dataset). Unfortunately, such an "optimal" strategy often performs very poorly "out of sample" (i.e., on another dataset), because the parameters of the investment strategy have been overfit to the in-sample data, a situation known as "backtest overfitting". While the mathematics of backtest overfitting has been examined in several recent theoretical studies, here we pursue a more tangible analysis of this problem, in the form of an online simulator tool. Given an input random-walk time series, the tool develops an "optimal" variant of a simple strategy by exhaustively exploring all integer parameter values among a handful of parameters. That "optimal" strategy is overfit, since by definition a random walk is unpredictable. The tool then tests the resulting "optimal" strategy on a second random-walk time series. In most runs of our online tool, the "optimal" strategy derived from the first time series performs poorly on the second, demonstrating how hard it is not to overfit a backtest. We offer this online tool, "Simple Example of Backtest Overfitting (SEBO)", to facilitate further research in this area.
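The same demonstration is easy to reproduce offline; the sketch below grid-searches a moving-average crossover on one random walk and then evaluates the "optimal" parameters on an independent one (the strategy form, parameter grid, and Sharpe annualization are arbitrary choices, not SEBO's internals).

```python
import numpy as np

rng = np.random.default_rng(7)

def sharpe(prices, fast, slow):
    """Annualized Sharpe of a moving-average crossover strategy on a price series."""
    ma_f = np.convolve(prices, np.ones(fast) / fast, mode="valid")
    ma_s = np.convolve(prices, np.ones(slow) / slow, mode="valid")
    n = min(ma_f.size, ma_s.size)
    pos = np.where(ma_f[-n:] > ma_s[-n:], 1.0, -1.0)[:-1]  # position held at time t
    rets = np.diff(prices[-n:]) * pos                      # earned over t -> t+1
    return np.sqrt(252) * rets.mean() / (rets.std() + 1e-12)

in_sample = 100 + np.cumsum(rng.normal(0, 1, 2000))    # a pure random walk
out_sample = 100 + np.cumsum(rng.normal(0, 1, 2000))   # an independent one

# Exhaustive in-sample search yields an "optimal" but overfit parameter pair.
grid = [(f, s) for f in range(2, 30) for s in range(f + 5, 120, 5)]
best = max(grid, key=lambda p: sharpe(in_sample, *p))
print("in-sample Sharpe :", round(sharpe(in_sample, *best), 2), "with", best)
print("out-of-sample    :", round(sharpe(out_sample, *best), 2))
```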
NASA Astrophysics Data System (ADS)
Pedersen, N. L.
2015-06-01
The strength of a gear is typically defined relative to durability (pitting) and load capacity (tooth-breakage). Tooth-breakage is controlled by the root shape and this gear part can be designed because there is no contact between gear pairs here. The shape of gears is generally defined by different standards, with the ISO standard probably being the most common one. Gears are manufactured using two principally different tools: rack tools and gear tools. In this work, the bending stress of involute teeth is minimized by shape optimization made directly on the final gear. This optimized shape is then used to find the cutting tool (the gear envelope) that can create this optimized gear shape. A simple but sufficiently flexible root parameterization is applied and emphasis is put on the importance of separating the shape parameterization from the finite element analysis of stresses. Large improvements in the stress level are found.
Van Bockstal, Pieter-Jan; Mortier, Séverine Thérèse F C; Corver, Jos; Nopens, Ingmar; Gernaey, Krist V; De Beer, Thomas
2018-02-01
Pharmaceutical batch freeze-drying is commonly used to improve the stability of biological therapeutics. The primary drying step is regulated by the dynamic settings of the adaptable process variables, shelf temperature T_s and chamber pressure P_c. Mechanistic modelling of the primary drying step leads to the optimal dynamic combination of these adaptable process variables as a function of time. According to Good Modelling Practices, a Global Sensitivity Analysis (GSA) is essential for appropriate model building. In this study, both a regression-based and a variance-based GSA were conducted on a validated mechanistic primary drying model to estimate the impact of several model input parameters on two output variables, the product temperature at the sublimation front T_i and the sublimation rate ṁ_sub. T_s was identified as the most influential parameter on both T_i and ṁ_sub, followed by P_c and the dried product mass transfer resistance R_p for T_i and ṁ_sub, respectively. The GSA findings were experimentally validated for ṁ_sub via a Design of Experiments (DoE) approach. The results indicated that GSA is a very useful tool for evaluating the impact of different process variables on the model outcome, leading to essential process knowledge without the need for time-consuming experiments (e.g., DoE).
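A minimal sketch of the regression-based flavor of GSA: sample the inputs, run the model, and rank inputs by standardized regression coefficients. The sublimation-rate law and parameter ranges below are invented placeholders, not the validated mechanistic model.

```python
import numpy as np

rng = np.random.default_rng(4)
N = 2000

# Hypothetical ranges for shelf temperature (K), chamber pressure (Pa) and
# dried-layer resistance; the true model is replaced by a toy sublimation law.
Ts = rng.uniform(233, 263, N)
Pc = rng.uniform(5, 30, N)
Rp = rng.uniform(1e4, 1e5, N)
m_sub = 2e-3 * (Ts - 230) / np.sqrt(Pc) + 50.0 / Rp + rng.normal(0, 1e-4, N)

# Standardized regression coefficients (SRC): regress the standardized output
# on standardized inputs; |SRC| ranks first-order influence for near-linear models.
Xs = np.column_stack([(v - v.mean()) / v.std() for v in (Ts, Pc, Rp)])
ys = (m_sub - m_sub.mean()) / m_sub.std()
src, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
for name, b in zip(("T_s", "P_c", "R_p"), src):
    print(f"{name}: SRC = {b:+.3f}")
```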
Analysis and design of a genetic circuit for dynamic metabolic engineering.
Anesiadis, Nikolaos; Kobayashi, Hideki; Cluett, William R; Mahadevan, Radhakrishnan
2013-08-16
Recent advances in synthetic biology have equipped us with new tools for bioprocess optimization at the genetic level. Previously, we have presented an integrated in silico design for the dynamic control of gene expression based on a density-sensing unit and a genetic toggle switch. In the present paper, analysis of a serine-producing Escherichia coli mutant shows that an instantaneous ON-OFF switch leads to a maximum theoretical productivity improvement of 29.6% compared to the mutant. To further the design, global sensitivity analysis is applied here to a mathematical model of serine production in E. coli coupled with a genetic circuit. The model of the quorum sensing and the toggle switch involves 13 parameters of which 3 are identified as having a significant effect on serine concentration. Simulations conducted in this reduced parameter space further identified the optimal ranges for these 3 key parameters to achieve productivity values close to the maximum theoretical values. This analysis can now be used to guide the experimental implementation of a dynamic metabolic engineering strategy and reduce the time required to design the genetic circuit components.
Cost-effectiveness on a local level: whether and when to adopt a new technology.
Woertman, Willem H; Van De Wetering, Gijs; Adang, Eddy M M
2014-04-01
Cost-effectiveness analysis has become a widely accepted tool for decision making in health care. The standard textbook cost-effectiveness analysis focuses on whether to make the switch from an old or common practice technology to an innovative technology, and in doing so, it takes a global perspective. In this article, we are interested in a local perspective, and we look at the questions of whether and when the switch from old to new should be made. A new approach to cost-effectiveness from a local (e.g., a hospital) perspective, by means of a mathematical model for cost-effectiveness that explicitly incorporates time, is proposed. A decision rule is derived for establishing whether a new technology should be adopted, as well as a general rule for establishing when it pays to postpone adoption by 1 more period, and a set of decision rules that can be used to determine the optimal timing of adoption. Finally, a simple example is presented to illustrate our model and how it leads to optimal decision making in a number of cases.
Tour of Jupiter Galilean moons: Winning solution of GTOC6
NASA Astrophysics Data System (ADS)
Colasurdo, Guido; Zavoli, Alessandro; Longo, Alessandro; Casalino, Lorenzo; Simeoni, Francesco
2014-09-01
The paper presents the trajectory designed by the Italian joint team Politecnico di Torino & Sapienza Università di Roma (Team5), winner of the 6th edition of the Global Trajectory Optimization Competition (GTOC6). In the short time available in these competitions, Team5 resorted to basic knowledge, simple tools and a powerful indirect optimization procedure. The mission concerns a 4-year tour of the Jupiter Galilean moons. The paper explains the strategy that was preliminarily devised and eventually implemented by looking for a viable trajectory. The first phase is a capture that moves the spacecraft from the arrival hyperbola to a low-energy orbit around Jupiter. Six series of flybys follow; in each one the spacecraft orbits Jupiter in resonance with a single moon; criteria to construct efficient chains of resonant flybys are presented. Transfer legs move the spacecraft from resonance with a moon to another one; precise phasing of the relevant moons is required; mission opportunities in a 11-year launch window are found by assuming ballistic trajectories and coplanar circular orbits for the Jovian satellites. The actual trajectory is found by using an indirect technique.
Multi-Sensor Optimal Data Fusion Based on the Adaptive Fading Unscented Kalman Filter.
Gao, Bingbing; Hu, Gaoge; Gao, Shesheng; Zhong, Yongmin; Gu, Chengfan
2018-02-06
This paper presents a new optimal data fusion methodology based on the adaptive fading unscented Kalman filter for multi-sensor nonlinear stochastic systems. This methodology has a two-level fusion structure: at the bottom level, an adaptive fading unscented Kalman filter based on the Mahalanobis distance is developed and serves as local filters to improve the adaptability and robustness of local state estimations against process-modeling error; at the top level, an unscented transformation-based multi-sensor optimal data fusion for the case of N local filters is established according to the principle of linear minimum variance to calculate globally optimal state estimation by fusion of local estimations. The proposed methodology effectively refrains from the influence of process-modeling error on the fusion solution, leading to improved adaptability and robustness of data fusion for multi-sensor nonlinear stochastic systems. It also achieves globally optimal fusion results based on the principle of linear minimum variance. Simulation and experimental results demonstrate the efficacy of the proposed methodology for INS/GNSS/CNS (inertial navigation system/global navigation satellite system/celestial navigation system) integrated navigation.
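The top-level fusion rule is the classical information-weighted combination; a minimal sketch (assuming independent local estimation errors, which the paper's derivation treats more carefully) is:

```python
import numpy as np

def fuse(estimates, covariances):
    """Information-weighted fusion of N local state estimates
    (linear minimum variance under independent local errors)."""
    info = [np.linalg.inv(P) for P in covariances]
    P_fused = np.linalg.inv(sum(info))
    x_fused = P_fused @ sum(I @ x for I, x in zip(info, estimates))
    return x_fused, P_fused

# Two hypothetical local filters (e.g. INS/GNSS and INS/CNS) estimating a 2-state vector.
x1, P1 = np.array([1.02, 0.48]), np.diag([0.04, 0.09])
x2, P2 = np.array([0.97, 0.52]), np.diag([0.09, 0.04])
x, P = fuse([x1, x2], [P1, P2])
print(np.round(x, 3), np.round(np.diag(P), 4))
```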
Optimization of Multiple Related Negotiation through Multi-Negotiation Network
NASA Astrophysics Data System (ADS)
Ren, Fenghui; Zhang, Minjie; Miao, Chunyan; Shen, Zhiqi
In this paper, a Multi-Negotiation Network (MNN) and a Multi-Negotiation Influence Diagram (MNID) are proposed to optimally handle Multiple Related Negotiations (MRN) in a multi-agent system. Most state-of-the-art approaches perform MRN sequentially. However, a sequential procedure may not execute MRN optimally in terms of maximizing the global outcome, and may even lead to unnecessary losses in some situations. The motivation of this research is to use an MNN to handle MRN concurrently so as to maximize the expected utility of MRN. Firstly, both the joint success rate and the joint utility, considering all related negotiations, are dynamically calculated based on an MNN. Secondly, by employing an MNID, an agent's possible decision on each related negotiation is reflected by the value of expected utility. Lastly, by comparing expected utilities across all possible policies for conducting MRN, an optimal policy is generated that optimizes the global outcome of MRN. The experimental results indicate that the proposed approach can improve the global outcome of MRN in a successful-end scenario and avoid unnecessary losses in an unsuccessful-end scenario.
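A minimal sketch of the policy-comparison step described above, with hypothetical success rates and utilities and an independence assumption between negotiations (the MNN computes these joint quantities dynamically; the product/sum aggregation here is an illustrative choice):

```python
from itertools import product

# Hypothetical per-negotiation success rates and utilities per decision
success = {'n1': {'accept': 0.9, 'decline': 0.6},
           'n2': {'accept': 0.7, 'decline': 0.8}}
utility = {'n1': {'accept': 10.0, 'decline': 14.0},
           'n2': {'accept': 8.0, 'decline': 5.0}}
negotiations = list(success)

def expected_utility(policy):
    # Joint success rate (product) times joint utility (sum), assuming
    # independence between the related negotiations in this sketch.
    joint_p, joint_u = 1.0, 0.0
    for neg, decision in policy.items():
        joint_p *= success[neg][decision]
        joint_u += utility[neg][decision]
    return joint_p * joint_u

policies = [dict(zip(negotiations, combo))
            for combo in product(*(success[n] for n in negotiations))]
best = max(policies, key=expected_utility)   # optimal policy over all of MRN
print(best, expected_utility(best))
```

Because the policy is scored jointly, the optimum can differ from what per-negotiation greedy choices would produce, which is the concurrency advantage the paper argues for.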
OPTIMIZING USABILITY OF AN ECONOMIC DECISION SUPPORT TOOL: PROTOTYPE OF THE EQUIPT TOOL.
Cheung, Kei Long; Hiligsmann, Mickaël; Präger, Maximilian; Jones, Teresa; Józwiak-Hagymásy, Judit; Muñoz, Celia; Lester-George, Adam; Pokhrel, Subhash; López-Nicolás, Ángel; Trapero-Bertran, Marta; Evers, Silvia M A A; de Vries, Hein
2018-01-01
Economic decision-support tools can provide valuable information for tobacco control stakeholders, but their usability may affect the adoption of such tools. This study illustrates a mixed-method usability evaluation of an economic decision-support tool for tobacco control, using the EQUIPT Return on Investment (ROI) tool prototype as a case study. A cross-sectional mixed-methods design was used, including a heuristic evaluation, a thinking-aloud approach, and a questionnaire testing and exploring the usability of the ROI tool. A total of sixty-six users evaluated the tool (thinking aloud) and completed the questionnaire; for the heuristic evaluation, four experts evaluated the interface. In total, twenty-one percent of the respondents perceived good usability. A total of 118 usability problems were identified, of which twenty-six were categorized as most severe, indicating a high priority for fixing them before implementation. Combining user-based and expert-based evaluation methods is recommended, as each was shown to identify unique usability problems. The evaluation provides input to optimize the usability of a decision-support tool and may serve as a vantage point for other developers conducting usability evaluations to refine similar tools before wide-scale implementation. Such studies could reduce implementation gaps by optimizing usability, in turn enhancing the research impact of such interventions.
NASA Astrophysics Data System (ADS)
Dasgupta, S.; Mukherjee, S.
2016-09-01
One of the most significant factors in metal cutting is tool life. In this research work, the effects of machining parameters on tool life under a wet machining environment were studied. Tool life characteristics of a brazed carbide cutting tool machining mild steel were examined, and the machining parameters were optimized based on a Taguchi design of experiments. The experiments used three factors, spindle speed, feed rate and depth of cut, each at three levels. Nine experiments were performed on a high-speed semi-automatic precision centre lathe. ANOVA was used to determine the relative importance of the machining parameters for tool life, and the optimum parameter combination was obtained by analysis of the S/N ratio. A mathematical model based on multiple regression analysis was developed to predict tool life. Taguchi's orthogonal array analysis revealed the optimal combination at the lower levels of spindle speed, feed rate and depth of cut, namely 550 rpm, 0.2 mm/rev and 0.5 mm respectively; the main effects plot confirmed the same. The variation of tool life with the different process parameters has been plotted. Feed rate has the most significant effect on tool life, followed by spindle speed and depth of cut.
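For reference, since tool life is a larger-the-better response, the Taguchi S/N ratio used in such analyses takes the standard form below; the replicate values are hypothetical, not from the study.

```python
import numpy as np

def sn_larger_is_better(y):
    """Taguchi S/N ratio (dB) for a larger-the-better response such as
    tool life: -10 log10(mean(1 / y_i^2)) over the replicates."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / y**2))

# Hypothetical replicate tool-life measurements (min) for one L9 run
print(sn_larger_is_better([42.0, 45.5, 40.8]))
```

Averaging this ratio over the runs at each factor level gives the main-effect values from which the optimal 550 rpm / 0.2 mm/rev / 0.5 mm combination is read off.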
NASA Astrophysics Data System (ADS)
Abdeh-Kolahchi, A.; Satish, M.; Datta, B.
2004-05-01
A state-of-the-art groundwater monitoring network design method is introduced. The method combines groundwater flow and transport simulation results with a Genetic Algorithm (GA) to identify optimal monitoring well locations. Optimization theory uses various techniques to find a set of parameter values that minimize or maximize objective functions. The proposed design maximizes the probability of tracking a transient contamination plume by determining sequential monitoring locations. The MODFLOW and MT3DMS models, included as separate modules within the Groundwater Modeling System (GMS), are used to develop three-dimensional groundwater flow and contaminant transport simulations. The simulation results are supplied as input to the optimization model, which uses a GA with binary variables representing potential monitoring locations to select the optimal network design from several candidate locations. As the numbers of decision variables and constraints increase, the non-linearity of the objective function also increases, making it difficult to obtain optimal solutions. The genetic algorithm is an evolutionary global optimization technique capable of finding optimal solutions to many complex problems. In this study, the GA's ability to find the global optimum of a monitoring network design problem involving 18.4 × 10^18 feasible solutions is discussed. However, to ensure the efficiency of the solution process and the global optimality of the solution obtained with a GA, appropriate GA parameter values must be specified; a sensitivity analysis of GA parameters such as the random seed, crossover probability, mutation probability, and elitism is therefore presented for the monitoring network design solution.
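A minimal sketch of the binary-chromosome GA described above; the detection probabilities, candidate count and well budget are all hypothetical stand-ins for the MODFLOW/MT3DMS-derived plume-tracking objective.

```python
import numpy as np

rng = np.random.default_rng(42)
n_candidates, n_wells = 61, 4           # hypothetical candidate grid and budget
pop_size, gens, p_cross, p_mut = 60, 200, 0.8, 0.02

# Hypothetical per-location plume-detection probabilities; in the study these
# would be derived from the flow-and-transport simulation results.
detect_p = rng.uniform(0.05, 0.6, n_candidates)

def fitness(chrom):
    # Probability that at least one selected well detects the plume,
    # assuming independent detections; infeasible well counts are penalized.
    if chrom.sum() != n_wells:
        return -1.0
    return 1.0 - np.prod(1.0 - detect_p[chrom.astype(bool)])

def random_chrom():
    c = np.zeros(n_candidates, dtype=int)
    c[rng.choice(n_candidates, n_wells, replace=False)] = 1
    return c

pop = [random_chrom() for _ in range(pop_size)]
for _ in range(gens):
    scores = np.array([fitness(c) for c in pop])
    def pick():                          # binary tournament selection
        i, j = rng.integers(pop_size, size=2)
        return pop[i] if scores[i] > scores[j] else pop[j]
    children = [pop[int(scores.argmax())].copy()]    # elitism
    while len(children) < pop_size:
        a, b = pick().copy(), pick().copy()
        if rng.random() < p_cross:       # one-point crossover
            cut = int(rng.integers(1, n_candidates))
            a[:cut], b[:cut] = b[:cut].copy(), a[:cut].copy()
        for c in (a, b):
            flip = rng.random(n_candidates) < p_mut  # bit-flip mutation
            c[flip] ^= 1
        children += [a, b]
    pop = children[:pop_size]

best = max(pop, key=fitness)
print(best.nonzero()[0], fitness(best))  # selected wells and detection prob.
```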
MO-AB-BRA-01: A Global Level Set Based Formulation for Volumetric Modulated Arc Therapy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nguyen, D; Lyu, Q; Ruan, D
2016-06-15
Purpose: The current clinical Volumetric Modulated Arc Therapy (VMAT) optimization is formulated as a non-convex problem, and various greedy heuristics have been employed for an empirical solution, jeopardizing plan consistency and quality. We introduce a novel global direct aperture optimization method for VMAT to overcome these limitations. Methods: The global VMAT (gVMAT) planning was formulated as an optimization problem with an L2-norm fidelity term and an anisotropic total variation term. A level set function was used to describe the aperture shapes, and adjacent aperture shapes were penalized to control MLC motion range. An alternating optimization strategy was implemented to solve the fluence intensity and aperture shapes simultaneously. Single-arc gVMAT plans, utilizing 180 beams with 2° angular resolution, were generated for a glioblastoma multiforme (GBM), lung (LNG), and 2 head and neck cases, one with 3 PTVs (H&N3PTV) and one with 4 PTVs (H&N4PTV). The plans were compared against the clinical VMAT (cVMAT) plans utilizing two overlapping coplanar arcs. Results: The optimization of the gVMAT plans converged within 600 iterations. gVMAT reduced the average max and mean OAR dose by 6.59% and 7.45% of the prescription dose. Reductions in max dose and mean dose were as high as 14.5 Gy in the LNG case and 15.3 Gy in the H&N3PTV case. PTV coverages (D95, D98, D99) were within 0.25% of the prescription dose. By globally considering all beams, the gVMAT optimizer allowed some beams to deliver higher intensities, yielding a dose distribution that resembles a static-beam IMRT plan with beam orientation optimization. Conclusions: The novel VMAT approach allows for the search of an optimal plan in the global solution space and generates deliverable apertures directly. The single-arc VMAT approach fully utilizes digital linacs' capability in dose rate and gantry rotation speed modulation. Funding: Varian Medical Systems; NIH grants R01CA188300 and R43CA183390.
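Reading the Methods, the underlying problem has the generic form sketched below; the symbols (dose-deposition matrix A, fluence x, prescribed dose d, level set functions phi_k, weight lambda) are our notation for illustration, not the authors':

$$\min_{x \ge 0}\; \tfrac{1}{2}\,\lVert A x - d \rVert_2^2 \;+\; \lambda \sum_{k=1}^{K-1} \lVert \phi_{k+1} - \phi_k \rVert_1,$$

where $\phi_k$ is the level set function describing the aperture at beam $k$; the anisotropic total-variation penalty on adjacent apertures limits MLC travel between control points, and $x$ and $\{\phi_k\}$ are updated alternately, matching the alternating strategy described in the abstract.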
Zhong, Shangping; Chen, Tianshun; He, Fengying; Niu, Yuzhen
2014-09-01
For a practical pattern classification task solved by kernel methods, the computing time is mainly spent on kernel learning (or training). However, current kernel learning approaches are based on local optimization techniques and struggle to achieve good time performance, especially for large datasets, so the existing algorithms cannot be easily extended to large-scale tasks. In this paper, we present a fast Gaussian kernel learning method by solving a specially structured global optimization (SSGO) problem. We optimize the Gaussian kernel function by using the formulated kernel target alignment criterion, which is a difference of increasing (d.i.) functions. Through a power-transformation based convexification method, the objective criterion can be represented as a difference of convex (d.c.) functions with a fixed power-transformation parameter, and the programming problem can then be converted to a SSGO problem: globally minimizing a concave function over a convex set. The SSGO problem is classical and has good solvability. Thus, to find the global optimal solution efficiently, we can adopt the improved Hoffman's outer approximation method, which does not need to repeat the search procedure with different starting points to locate the best local minimum. The proposed method can also be proven to converge to the global solution for any classification task. We evaluate the proposed method on twenty benchmark datasets and compare it with four other Gaussian kernel learning methods. Experimental results show that the proposed method stably achieves both good time efficiency and good classification performance. Copyright © 2014 Elsevier Ltd. All rights reserved.
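A minimal sketch of the kernel target alignment criterion that the method optimizes; here a simple grid scan over the kernel width illustrates the criterion, whereas the paper solves a d.c. program by outer approximation. The data are synthetic.

```python
import numpy as np
from scipy.spatial.distance import cdist

def gaussian_kernel(X, sigma):
    return np.exp(-cdist(X, X, 'sqeuclidean') / (2.0 * sigma**2))

def kernel_target_alignment(K, y):
    """Alignment <K, yy^T>_F / (||K||_F ||yy^T||_F) between the kernel
    matrix and the ideal label kernel; higher means better suited."""
    Y = np.outer(y, y)
    return (K * Y).sum() / (np.linalg.norm(K) * np.linalg.norm(Y))

# Synthetic two-class data; scan sigma and keep the best-aligned kernel.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (20, 2)), rng.normal(3, 1, (20, 2))])
y = np.array([-1] * 20 + [1] * 20)
best = max((kernel_target_alignment(gaussian_kernel(X, s), y), s)
           for s in [0.1, 0.5, 1.0, 2.0, 5.0])
print('alignment %.3f at sigma %.1f' % best)
```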
Optimal stomatal behaviour around the world
NASA Astrophysics Data System (ADS)
Lin, Yan-Shih; Medlyn, Belinda E.; Duursma, Remko A.; Prentice, I. Colin; Wang, Han; Baig, Sofia; Eamus, Derek; de Dios, Victor Resco; Mitchell, Patrick; Ellsworth, David S.; de Beeck, Maarten Op; Wallin, Göran; Uddling, Johan; Tarvainen, Lasse; Linderson, Maj-Lena; Cernusak, Lucas A.; Nippert, Jesse B.; Ocheltree, Troy W.; Tissue, David T.; Martin-Stpaul, Nicolas K.; Rogers, Alistair; Warren, Jeff M.; de Angelis, Paolo; Hikosaka, Kouki; Han, Qingmin; Onoda, Yusuke; Gimeno, Teresa E.; Barton, Craig V. M.; Bennie, Jonathan; Bonal, Damien; Bosc, Alexandre; Löw, Markus; Macinins-Ng, Cate; Rey, Ana; Rowland, Lucy; Setterfield, Samantha A.; Tausz-Posch, Sabine; Zaragoza-Castells, Joana; Broadmeadow, Mark S. J.; Drake, John E.; Freeman, Michael; Ghannoum, Oula; Hutley, Lindsay B.; Kelly, Jeff W.; Kikuzawa, Kihachiro; Kolari, Pasi; Koyama, Kohei; Limousin, Jean-Marc; Meir, Patrick; Lola da Costa, Antonio C.; Mikkelsen, Teis N.; Salinas, Norma; Sun, Wei; Wingate, Lisa
2015-05-01
Stomatal conductance (gs) is a key land-surface attribute as it links transpiration, the dominant component of global land evapotranspiration, and photosynthesis, the driving force of the global carbon cycle. Despite the pivotal role of gs in predictions of global water and carbon cycle changes, a global-scale database and an associated globally applicable model of gs that allow predictions of stomatal behaviour are lacking. Here, we present a database of globally distributed gs obtained in the field for a wide range of plant functional types (PFTs) and biomes. We find that stomatal behaviour differs among PFTs according to their marginal carbon cost of water use, as predicted by the theory underpinning the optimal stomatal model and the leaf and wood economics spectrum. We also demonstrate a global relationship with climate. These findings provide a robust theoretical framework for understanding and predicting the behaviour of gs across biomes and across PFTs that can be applied to regional, continental and global-scale modelling of ecosystem productivity, energy balance and ecohydrological processes in a future changing climate.
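To make the underlying model concrete: the "optimal stomatal model" referred to here is, on our reading, the unified form of Medlyn et al. (2011); a minimal sketch follows, with illustrative parameter values that are not taken from the database.

```python
def stomatal_conductance(A, D, Ca, g1, g0=0.0):
    """Medlyn et al. (2011) optimal stomatal model:
    gs = g0 + 1.6 * (1 + g1 / sqrt(D)) * A / Ca.

    A  : net photosynthesis (umol m-2 s-1)
    D  : leaf-to-air vapour pressure deficit (kPa)
    Ca : atmospheric CO2 concentration (umol mol-1)
    g1 : fitted slope (kPa^0.5), inversely related to the marginal
         carbon cost of water; the quantity that varies among PFTs
    Returns gs in mol m-2 s-1.
    """
    return g0 + 1.6 * (1.0 + g1 / D**0.5) * A / Ca

# Illustrative mid-range conditions for a broadleaf tree (values hypothetical)
print(stomatal_conductance(A=12.0, D=1.5, Ca=400.0, g1=4.0))
```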
DOE Office of Scientific and Technical Information (OSTI.GOV)
Quock, D. E. R.; Cianciarulo, M. B.; APS Engineering Support Division
2007-01-01
The Integrated Relational Model of Installed Systems (IRMIS) is a relational database tool that has been implemented at the Advanced Photon Source to maintain an updated account of approximately 600 control system software applications, 400,000 process variables, and 30,000 control system hardware components. To effectively display this large amount of control system information to operators and engineers, IRMIS was initially built with nine Web-based viewers: Applications Organizing Index, IOC, PLC, Component Type, Installed Components, Network, Controls Spares, Process Variables, and Cables. However, since each viewer is designed to provide details from only one major category of the control system, the necessity for a one-stop global search tool for the entire database became apparent. The user requirements for extremely fast database search times and easy navigation through search results led to the choice of Asynchronous JavaScript and XML (AJAX) technology in the implementation of the IRMIS global search tool. Unique features of the global search tool include a two-tier display of search results and a database data-integrity validation and reporting mechanism.
Watershed Management Optimization Support Tool (WMOST) is a software application designed tofacilitate integrated water resources management across wet and dry climate regions. It allows waterresources managers and planners to screen a wide range of practices across their watersh...
Evans, Steven T; Stewart, Kevin D; Afdahl, Chris; Patel, Rohan; Newell, Kelcy J
2017-07-14
In this paper, we discuss the optimization and implementation of a high-throughput process development (HTPD) tool that utilizes commercially available microliter-sized column technology for the purification of multiple clinically significant monoclonal antibodies. Chromatographic profiles generated using this optimized tool are shown to overlay with comparable profiles from the conventional bench scale and the clinical manufacturing scale. Further, all measured product quality attributes are comparable across scales for the mAb purifications. In addition to supporting chromatography process development efforts (e.g., optimization screening), the comparable product quality results at all scales make this tool an appropriate scale model for comparing purification and product quality across HTPD bioreactor conditions. The ability to perform up to 8 chromatography purifications in parallel with reduced material requirements per run creates opportunities for gathering more process knowledge in less time. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
Optimization of Turbine Engine Cycle Analysis with Analytic Derivatives
NASA Technical Reports Server (NTRS)
Hearn, Tristan; Hendricks, Eric; Chin, Jeffrey; Gray, Justin; Moore, Kenneth T.
2016-01-01
A new engine cycle analysis tool, called Pycycle, was recently built using the OpenMDAO framework. This tool uses equilibrium chemistry based thermodynamics, and provides analytic derivatives. This allows for stable and efficient use of gradient-based optimization and sensitivity analysis methods on engine cycle models, without requiring the use of finite difference derivative approximation methods. To demonstrate this, a gradient-based design optimization was performed on a multi-point turbofan engine model. Results demonstrate very favorable performance compared to an optimization of an identical model using finite-difference approximated derivatives.
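The benefit of analytic derivatives can be shown generically: supplying an exact gradient to a gradient-based optimizer removes the finite-difference stepping the abstract contrasts against. A minimal sketch with a toy objective standing in for a cycle model (this is not the Pycycle or OpenMDAO API):

```python
import numpy as np
from scipy.optimize import minimize

# Toy stand-in for a cycle-analysis objective over two design variables.
def f(x):
    return (x[0] - 2.0)**2 + 10.0 * (x[1] - x[0]**2)**2

# Hand-derived analytic gradient: no finite-difference perturbations needed,
# so each optimizer iteration is cheaper and numerically stable.
def grad_f(x):
    return np.array([2.0 * (x[0] - 2.0) - 40.0 * x[0] * (x[1] - x[0]**2),
                     20.0 * (x[1] - x[0]**2)])

res = minimize(f, x0=[0.0, 0.0], jac=grad_f, method='SLSQP')
print(res.x, res.fun)   # converges to x = (2, 4)
```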
Multidisciplinary Design, Analysis, and Optimization Tool Development Using a Genetic Algorithm
NASA Technical Reports Server (NTRS)
Pak, Chan-gi; Li, Wesley
2009-01-01
Multidisciplinary design, analysis, and optimization using a genetic algorithm is being developed at the National Aeronautics and Space Administration Dryden Flight Research Center (Edwards, California) to automate the analysis and design process by leveraging existing tools, enabling true multidisciplinary optimization in the preliminary design stage of subsonic, transonic, supersonic, and hypersonic aircraft. This is a promising technology, but it faces many challenges in large-scale, real-world application. This report describes current approaches, recent results, and challenges for multidisciplinary design, analysis, and optimization as demonstrated by experience with the Ikhana fire pod design.
Methodology of Numerical Optimization for Orbital Parameters of Binary Systems
NASA Astrophysics Data System (ADS)
Araya, I.; Curé, M.
2010-02-01
The use of a numerical maximization (or minimization) method in optimization processes allows us to obtain a great number of solutions; we can therefore find the global maximum or minimum of the problem, but only if a suitable methodology is used. To obtain the globally optimal values, we use the genetic algorithm PIKAIA (P. Charbonneau) and four other algorithms implemented in Mathematica. We demonstrate that the orbital parameters of binary systems published in some papers, derived from radial velocity measurements, are local minima instead of global ones.
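A minimal sketch of the multi-start strategy such a methodology relies on, with a toy one-dimensional stand-in for the chi-square surface of a radial-velocity fit; the function and starting ranges are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize

# Toy multi-modal objective: many local wells, one global minimum.
def chi2(p):
    return np.sin(3.0 * p[0]) + 0.1 * (p[0] - 2.0)**2

rng = np.random.default_rng(1)
starts = rng.uniform(-10, 10, size=(50, 1))     # multi-start initial guesses
fits = [minimize(chi2, s, method='Nelder-Mead') for s in starts]
best = min(fits, key=lambda r: r.fun)           # global candidate
print(best.x, best.fun)
# A single local run can stop in any of the sin() wells, which is the
# pitfall the paper attributes to some published orbital solutions.
```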
Kazachenko, Sergey; Bulusu, Satya; Thakkar, Ajit J
2013-06-14
Putative global minima are reported for methanol clusters (CH3OH)n with n ≤ 15. The predictions are based on global optimization of three intermolecular potential energy models followed by local optimization and single-point energy calculations using two variants of dispersion-corrected density functional theory. Recurring structural motifs include folded and/or twisted rings, folded rings with a short branch, and stacked rings. Many of the larger structures are stabilized by weak C-H···O bonds.
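A minimal sketch of the basin-hopping style of global optimization used to locate such putative minima, with a Lennard-Jones potential standing in for the intermolecular methanol models (the paper's actual potentials and DFT refinement are not reproduced here):

```python
import numpy as np
from scipy.optimize import basinhopping

def lj_energy(flat_coords):
    """Lennard-Jones cluster energy in reduced units; a simple stand-in
    for the intermolecular potential energy models used in the paper."""
    x = flat_coords.reshape(-1, 3)
    d = np.linalg.norm(x[:, None] - x[None, :], axis=-1)
    r = d[np.triu_indices(len(x), k=1)]          # unique pair distances
    return float(np.sum(4.0 * (r**-12 - r**-6)))

rng = np.random.default_rng(7)
x0 = rng.uniform(-1.5, 1.5, size=13 * 3)         # 13-atom cluster, random start
res = basinhopping(lj_energy, x0, niter=500,
                   minimizer_kwargs={'method': 'L-BFGS-B'})
print(res.fun)   # good runs approach the icosahedral LJ13 minimum, -44.3268
```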
Dispositional Optimism and Terminal Decline in Global Quality of Life
ERIC Educational Resources Information Center
Zaslavsky, Oleg; Palgi, Yuval; Rillamas-Sun, Eileen; LaCroix, Andrea Z.; Schnall, Eliezer; Woods, Nancy F.; Cochrane, Barbara B.; Garcia, Lorena; Hingle, Melanie; Post, Stephen; Seguin, Rebecca; Tindle, Hilary; Shrira, Amit
2015-01-01
We examined whether dispositional optimism relates to change in global quality of life (QOL) as a function of either chronological age or years to impending death. We used a sample of 2,096 deceased postmenopausal women from the Women's Health Initiative clinical trials who were enrolled in the 2005-2010 Extension Study and for whom at least 1…
Digitalizing the Circular Economy
NASA Astrophysics Data System (ADS)
Reuter, Markus A.
2016-12-01
Metallurgy is a key enabler of a circular economy (CE), its digitalization is the metallurgical Internet of Things (m-IoT). In short: Metallurgy is at the heart of a CE, as metals all have strong intrinsic recycling potentials. Process metallurgy, as a key enabler for a CE, will help much to deliver its goals. The first-principles models of process engineering help quantify the resource efficiency (RE) of the CE system, connecting all stakeholders via digitalization. This provides well-argued and first-principles environmental information to empower a tax paying consumer society, policy, legislators, and environmentalists. It provides the details of capital expenditure and operational expenditure estimates. Through this path, the opportunities and limits of a CE, recycling, and its technology can be estimated. The true boundaries of sustainability can be determined in addition to the techno-economic evaluation of RE. The integration of metallurgical reactor technology and systems digitally, not only on one site but linking different sites globally via hardware, is the basis for describing CE systems as dynamic feedback control loops, i.e., the m-IoT. It is the linkage of the global carrier metallurgical processing system infrastructure that maximizes the recovery of all minor and technology elements in its associated refining metallurgical infrastructure. This will be illustrated through the following: (1) System optimization models for multimetal metallurgical processing. These map large-scale m-IoT systems linked to computer-aided design tools of the original equipment manufacturers and then establish a recycling index through the quantification of RE. (2) Reactor optimization and industrial system solutions to realize the "CE (within a) Corporation—CEC," realizing the CE of society. (3) Real-time measurement of ore and scrap properties in intelligent plant structures, linked to the modeling, simulation, and optimization of industrial extractive process metallurgical reactors and plants for both primary and secondary materials processing. (4) Big-data analysis and process control of industrial metallurgical systems, processes, and reactors by the application of, among others, artificial intelligence techniques and computer-aided engineering. (5) Minerals processing and process metallurgical theory, technology, simulation, and analytical tools, which are all key enablers of the CE. (6) Visualizing the results of all the tools used for estimating the RE of the CE system in a form that the consumer and general public can understand. (7) The smart integration of tools and methods that quantify RE and deliver sustainable solutions, named in this article as circular economy engineering. In view of space limitations, this message will be colored in by various publications also with students and colleagues, referring to (often commercial) software that acts as a conduit to capture and formalize the research of the large body of work in the literature by distinguished metallurgical engineers and researchers and realized in innovative industrial solutions. The author stands humbly on the shoulders of these developments and their distinguished developers. This award lecture article implicitly also refers to work done while working for Ausmelt (Australia), Outotec (Finland and Australia), Mintek (South Africa), and Anglo American Corporation (South Africa), honoring the many colleagues the author has worked with over the years.
Optimal Wonderful Life Utility Functions in Multi-Agent Systems
NASA Technical Reports Server (NTRS)
Wolpert, David H.; Tumer, Kagan; Swanson, Keith (Technical Monitor)
2000-01-01
The mathematics of Collective Intelligence (COINs) is concerned with the design of multi-agent systems so as to optimize an overall global utility function when those systems lack centralized communication and control. Typically in COINs each agent runs a distinct Reinforcement Learning (RL) algorithm, so that much of the design problem reduces to how best to initialize/update each agent's private utility function, as far as the ensuing value of the global utility is concerned. Traditional team game solutions to this problem assign to each agent the global utility as its private utility function. In previous work we used the COIN framework to derive the alternative Wonderful Life Utility (WLU), and experimentally established that having the agents use it induces global utility performance up to orders of magnitude superior to that induced by use of the team game utility. The WLU has a free parameter (the clamping parameter) which we simply set to zero in that previous work. Here we derive the optimal value of the clamping parameter, and demonstrate experimentally that using that optimal value can result in significantly improved performance over that of clamping to zero, over and above the improvement beyond traditional approaches.
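A minimal sketch of the WLU construction described above; the global utility G and the action values are hypothetical, and the paper's contribution is to derive the optimal clamping value rather than assume one.

```python
import numpy as np

def wonderful_life_utility(G, joint_action, i, clamp):
    """WLU for agent i: global utility of the joint action minus the
    counterfactual global utility with agent i's action clamped.
    Earlier work fixed clamp = 0; the paper derives the optimal value."""
    counterfactual = list(joint_action)
    counterfactual[i] = clamp
    return G(joint_action) - G(counterfactual)

# Hypothetical global utility over three agents' real-valued actions
G = lambda a: -np.sum((np.asarray(a) - 1.0)**2) + np.prod(a)
joint = [0.5, 1.2, 0.8]
print(wonderful_life_utility(G, joint, i=0, clamp=0.0))
```

The subtraction removes the part of the global utility that agent 0 cannot influence, which is what gives WLU its cleaner learning signal relative to handing every agent the raw team-game utility G.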
Convex relaxations for gas expansion planning
Borraz-Sanchez, Conrado; Bent, Russell Whitford; Backhaus, Scott N.; ...
2016-01-01
Expansion of natural gas networks is a critical process involving substantial capital expenditures with complex decision-support requirements. Given the non-convex nature of gas transmission constraints, global optimality and infeasibility guarantees can only be offered by global optimisation approaches. Unfortunately, state-of-the-art global optimisation solvers are unable to scale up to real-world size instances. In this study, we present a convex mixed-integer second-order cone relaxation for the gas expansion planning problem under steady-state conditions. The underlying model offers tight lower bounds with high computational efficiency. In addition, the optimal solution of the relaxation can often be used to derive high-quality solutions to the original problem, leading to provably tight optimality gaps and, in some cases, globally optimal solutions. The convex relaxation is based on a few key ideas, including the introduction of flux direction variables, exact McCormick relaxations, on/off constraints, and integer cuts. Numerical experiments are conducted on the traditional Belgian gas network as well as on other, larger real networks. The results demonstrate both the accuracy and the computational speed of the relaxation and its ability to produce high-quality solutions.
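For readers unfamiliar with the McCormick relaxations mentioned in the abstract: for a bilinear term $w = xy$ with bounds $x \in [x^L, x^U]$ and $y \in [y^L, y^U]$, the standard McCormick envelope replaces the equation with the four linear inequalities

$$\begin{aligned} w &\ge x^{L}y + x\,y^{L} - x^{L}y^{L}, &\qquad w &\ge x^{U}y + x\,y^{U} - x^{U}y^{U},\\ w &\le x^{L}y + x\,y^{U} - x^{L}y^{U}, &\qquad w &\le x^{U}y + x\,y^{L} - x^{U}y^{L}, \end{aligned}$$

which are exact at the variable bounds and define the convex hull of the bilinear set; this is the standard form the authors build on, with the paper's specific application details not reproduced here.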
Song, Ting; Li, Nan; Zarepisheh, Masoud; Li, Yongbao; Gautier, Quentin; Zhou, Linghong; Mell, Loren; Jiang, Steve; Cerviño, Laura
2016-01-01
Intensity-modulated radiation therapy (IMRT) currently plays an important role in radiotherapy, but its treatment plan quality can vary significantly among institutions and planners. Treatment plan quality control (QC) is a necessary component for individual clinics to ensure that patients receive treatments with high therapeutic gain ratios. The voxel-weighting factor-based plan re-optimization mechanism has been shown to explore a larger Pareto surface (solution domain) and therefore to increase the possibility of finding an optimal treatment plan. In this study, we incorporated additional modules into an in-house developed voxel-weighting factor-based re-optimization algorithm, which was enhanced into a highly automated and accurate IMRT plan QC tool (TPS-QC tool). After importing an under-assessment plan, the TPS-QC tool was able to generate a QC report within 2 minutes. This QC report contains the plan quality determination as well as information supporting the determination. Finally, IMRT plan quality can be controlled by approving quality-passed plans and replacing quality-failed plans using the TPS-QC tool. The feasibility and accuracy of the proposed TPS-QC tool were evaluated using 25 clinically approved cervical cancer patient IMRT plans and 5 manually created poor-quality IMRT plans. The results showed high consistency between the QC report quality determinations and the actual plan quality. In the 25 clinically approved cases that the TPS-QC tool identified as passed, a greater difference could be observed in dosimetric endpoints for organs at risk (OAR) than for the planning target volume (PTV), implying that better dose sparing could be achieved in OAR than in PTV. Moreover, the dose-volume histogram (DVH) curves of the TPS-QC tool re-optimized plans satisfied the dosimetric criteria more frequently than did the under-assessment plans, and the criteria for unsatisfied dosimetric endpoints in the 5 poor-quality plans could typically be satisfied when the TPS-QC tool generated re-optimized plans without sacrificing other dosimetric endpoints. In addition to its feasibility and accuracy, the proposed TPS-QC tool is user-friendly and easy to operate, both necessary characteristics for clinical use. PMID:26930204
Eckermann, Simon; Willan, Andrew R
2013-05-01
Risk sharing arrangements relate to adjusting payments for new health technologies given evidence of their performance over time. Such arrangements rely on prospective information regarding the incremental net benefit of the new technology, and on its use in practice. However, once the new technology has been adopted in a particular jurisdiction, randomized clinical trials within that jurisdiction are likely to be infeasible and unethical in the cases where they would be most helpful, i.e., where current evidence points to positive but uncertain incremental health and net monetary benefit. Informed patients in these cases would likely be reluctant to participate in a trial, preferring instead to receive the new technology with certainty. Consequently, informing risk sharing arrangements within a jurisdiction is problematic given the infeasibility of collecting prospective trial data. To overcome such problems, we demonstrate that global trials facilitate trialling post adoption, leading to more complete and robust risk sharing arrangements that mitigate the impact of costs of reversal on the expected value of information in jurisdictions that adopt while a global trial is undertaken. More generally, optimally designed global trials offer distinct advantages over locally optimal solutions for decision makers and manufacturers alike: avoiding opportunity costs of delay in jurisdictions that adopt; overcoming barriers to evidence collection; and improving levels of expected implementation. Further, the greater strength and translatability of evidence across jurisdictions inherent in optimal global trial design reduces the barriers to translation across jurisdictions that are characteristic of local trials. Consequently, efficiently designed global trials better align the interests of decision makers and manufacturers, increasing the feasibility of risk sharing and the expected strength of evidence over local trials, up until the point that current evidence is globally sufficient.
Zhang, Hanyuan; Tian, Xuemin; Deng, Xiaogang; Cao, Yuping
2018-05-16
As an attractive nonlinear dynamic data analysis tool, global preserving kernel slow feature analysis (GKSFA) has achieved great success in extracting the high nonlinearity and inherently time-varying dynamics of batch processes. However, GKSFA is an unsupervised feature extraction method and lacks the ability to utilize batch process class label information, which may not offer the most effective means of batch process monitoring. To overcome this problem, we propose a novel batch process monitoring method based on a modified GKSFA, referred to as discriminant global preserving kernel slow feature analysis (DGKSFA), which closely integrates discriminant analysis and GKSFA. The proposed DGKSFA method can extract discriminant features of a batch process while preserving the global and local geometrical structure information of the observed data. For the purpose of fault detection, a monitoring statistic is constructed based on the distance between the optimal kernel feature vectors of test data and normal data. To tackle the challenging issue of nonlinear fault variable identification, a new nonlinear contribution plot method is also developed to help identify the fault variables after a fault is detected, derived from the idea of variable pseudo-sample trajectory projection in the DGKSFA nonlinear biplot. Simulation results on a numerical nonlinear dynamic system and the benchmark fed-batch penicillin fermentation process demonstrate that the proposed process monitoring and fault diagnosis approach can effectively detect faults and distinguish fault variables from normal variables. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.
Lenas, Petros; Moreno, Angel; Ikonomou, Laertis; Mayer, Joerg; Honda, Hiroyuki; Novellino, Antonio; Pizarro, Camilo; Nicodemou-Lena, Eleni; Rodergas, Silvia; Pintor, Jesus
2008-09-01
Although tissue engineering uses powerful biological tools, it still has a weak conceptual foundation, which remains restricted to the cell level. The design criteria at the cell level are not directly related to tissue functions, and consequently such functions cannot be implemented in bioartificial tissues with the currently used methods. On the contrary, the field of artificial organs focuses on the function of the artificial organs, which are treated in the design as integral entities rather than through optimization of their components. The field of artificial organs has already developed and tested methodologies based on system concepts and on mathematical-computational methods that connect component properties with the desired global organ function. Such methodologies are needed in tissue engineering for the design of bioartificial tissues with tissue functions. Under the framework of biomedical engineering, artificial organs and tissue engineering are not competing approaches but complementary ones, and they should therefore design a common future for the benefit of patients.
Assessing the Impact of Observations on Numerical Weather Forecasts Using the Adjoint Method
NASA Technical Reports Server (NTRS)
Gelaro, Ronald
2012-01-01
The adjoint of a data assimilation system provides a flexible and efficient tool for estimating observation impacts on short-range weather forecasts. The impacts of any or all observations can be estimated simultaneously based on a single execution of the adjoint system. The results can be easily aggregated according to data type, location, channel, etc., making this technique especially attractive for examining the impacts of new hyperspectral satellite instruments and for conducting regular, even near-real-time, monitoring of the entire observing system. This talk provides a general overview of the adjoint method, including the theoretical basis and practical implementation of the technique. Results are presented from the adjoint-based observation impact monitoring tool in NASA's GEOS-5 global atmospheric data assimilation and forecast system. When performed in conjunction with standard observing system experiments (OSEs), the adjoint results reveal both redundancies and dependencies between observing-system impacts as observations are added or removed from the assimilation system. Understanding these dependencies may be important for optimizing the use of the current observational network and defining requirements for future observing systems.
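For orientation, the standard adjoint-based impact measure in this line of work is, to the best of our understanding, the Langland and Baker (2004) formulation; in our notation (not taken from the talk):

$$\delta e \;=\; e(\mathbf{x}^f_a) - e(\mathbf{x}^f_b) \;\approx\; \big\langle\, \delta\mathbf{y},\; \mathbf{K}^{\top}\mathbf{M}^{\top}\mathbf{C}\,(\mathbf{d}^f_a + \mathbf{d}^f_b) \,\big\rangle, \qquad \delta\mathbf{y} = \mathbf{y} - H(\mathbf{x}_b),$$

where $e$ is a quadratic forecast-error measure with weighting $\mathbf{C}$, $\mathbf{d}^f = \mathbf{x}^f - \mathbf{x}^v$ are forecast errors relative to a verifying analysis, $\mathbf{M}^{\top}$ is the adjoint of the forecast model, and $\mathbf{K}^{\top}$ the adjoint of the assimilation gain. Because the right-hand side is an inner product over the innovation vector, it decomposes observation by observation, which is exactly what permits the aggregation by data type, location, or channel described above.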
Cross-verification of the GENE and XGC codes in preparation for their coupling
NASA Astrophysics Data System (ADS)
Jenko, Frank; Merlo, Gabriele; Bhattacharjee, Amitava; Chang, Cs; Dominski, Julien; Ku, Seunghoe; Parker, Scott; Lanti, Emmanuel
2017-10-01
A high-fidelity Whole Device Model (WDM) of a magnetically confined plasma is a crucial tool for planning and optimizing the design of future fusion reactors, including ITER. Aiming at building such a tool, within the framework of the Exascale Computing Project (ECP) the two existing gyrokinetic codes GENE (Eulerian delta-f) and XGC (PIC full-f) will be coupled, enabling first-principles kinetic WDM simulations. In preparation for this ultimate goal, a benchmark between the two codes is carried out looking at ITG modes in the adiabatic electron limit. This verification exercise is also joined by the global Lagrangian PIC code ORB5. Linear and nonlinear comparisons have been carried out, neglecting collisions and sources for simplicity. Very good agreement is recovered on the frequency, growth rate and mode structure of linear modes. A similarly excellent agreement is observed when comparing the evolution of the heat flux and of the background temperature profile during nonlinear simulations. Work supported by the US DOE under the Exascale Computing Project (17-SC-20-SC).
Fusarium species-a promising tool box for industrial biotechnology.
Pessôa, Marina Gabriel; Paulino, Bruno Nicolau; Mano, Mario Cezar Rodrigues; Neri-Numa, Iramaia Angélica; Molina, Gustavo; Pastore, Glaucia Maria
2017-05-01
Global demand for biotechnological products has increased steadily over the years; thus, optimized processes and reduced costs appear as key factors in the success of this market. A process tool of high importance is the direct or indirect use of enzymes to catalyze the generation of various substances. Also, obtaining aromas and pigments from natural sources has become a priority in the cosmetic and food industries in order to meet consumer demand for substitutes for synthetic compounds, especially when by-products can be used as the starting material. Species of the genus Fusarium are recognized as promising sources of several enzymes for industrial application, as well as biocatalysts in the production of aromas, pigments and second-generation biofuels, among others. In addition, secondary metabolites from these strains can present important biological activities for the medical field. In this context, this review focuses on the use of Fusarium sp. strains in the biotechnological production of compounds of industrial interest, surveying the most recent research in this area, the results obtained and the best process conditions for each case.
Kann, Maricel G.; Sheetlin, Sergey L.; Park, Yonil; Bryant, Stephen H.; Spouge, John L.
2007-01-01
The sequencing of complete genomes has created a pressing need for automated annotation of gene function. Because domains are the basic units of protein function and evolution, a gene can be annotated from a domain database by aligning domains to the corresponding protein sequence. Ideally, complete domains are aligned to protein subsequences, in a ‘semi-global alignment’. Local alignment, which aligns pieces of domains to subsequences, is common in high-throughput annotation applications, however. It is a mature technique, with the heuristics and accurate E-values required for screening large databases and evaluating the screening results. Hidden Markov models (HMMs) provide an alternative theoretical framework for semi-global alignment, but their use is limited because they lack heuristic acceleration and accurate E-values. Our new tool, GLOBAL, overcomes some limitations of previous semi-global HMMs: it has accurate E-values and the possibility of the heuristic acceleration required for high-throughput applications. Moreover, according to a standard of truth based on protein structure, two semi-global HMM alignment tools (GLOBAL and HMMer) had comparable performance in identifying complete domains, but distinctly outperformed two tools based on local alignment. When searching for complete protein domains, therefore, GLOBAL avoids disadvantages commonly associated with HMMs, yet maintains their superior retrieval performance. PMID:17596268
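To illustrate what "semi-global alignment" means operationally, here is a minimal dynamic-programming sketch with toy match/mismatch/gap scores; GLOBAL itself uses HMM scoring, not these values. The key property is that the complete domain must be aligned while leading and trailing stretches of the protein are penalty-free.

```python
def semi_global_align(domain, protein, match=2, mismatch=-1, gap=-2):
    """Score a complete-domain-to-subsequence ('semi-global') alignment:
    gaps before and after the matched protein region are free, but the
    whole domain must be consumed. Toy scores, not GLOBAL's HMM model."""
    m, n = len(domain), len(protein)
    # dp[i][j]: best score aligning domain[:i], ending at protein position j
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        dp[i][0] = dp[i - 1][0] + gap      # unmatched domain residues cost
    # dp[0][j] stays 0: skipping a protein prefix is free
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            s = match if domain[i - 1] == protein[j - 1] else mismatch
            dp[i][j] = max(dp[i - 1][j - 1] + s,   # align residues
                           dp[i - 1][j] + gap,     # gap in protein
                           dp[i][j - 1] + gap)     # gap in domain
    return max(dp[m])                      # protein suffix is also free

print(semi_global_align("HEAGAWGHEE", "PAWHEAE"))
```

Local alignment would additionally allow the domain itself to be truncated, which is precisely the behaviour the paper argues against when annotating complete domains.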
Neoliberal Optimism: Applying Market Techniques to Global Health.
Mei, Yuyang
2017-01-01
Global health and neoliberalism are becoming increasingly intertwined as organizations utilize markets and profit motives to solve the traditional problems of poverty and population health. I use field work conducted over 14 months in a global health technology company to explore how the promise of neoliberalism re-envisions humanitarian efforts. In this company's vaccine refrigerator project, staff members expect their investors and their market to allow them to achieve scale and develop accountability to their users in developing countries. However, the translation of neoliberal techniques to the global health sphere falls short of the ideal, as profits are meager and purchasing power remains with donor organizations. The continued optimism in market principles amidst such a non-ideal market reveals the tenacious ideological commitment to neoliberalism in these global health projects.
ISOT_Calc: A versatile tool for parameter estimation in sorption isotherms
NASA Astrophysics Data System (ADS)
Beltrán, José L.; Pignatello, Joseph J.; Teixidó, Marc
2016-09-01
Geochemists and soil chemists commonly use parametrized sorption data to assess the transport and impact of pollutants in the environment. However, this evaluation is often hampered by a lack of detailed sorption data analysis, which in turn leads to inaccurate transport modeling. To this end, we present a novel software tool to precisely analyze and interpret sorption isotherm data. Our tool, coded in Visual Basic for Applications (VBA), operates embedded within the Microsoft Excel™ environment. It consists of a user-defined function named ISOT_Calc, followed by a supplementary optimization Excel macro (Ref_GN_LM). The ISOT_Calc function estimates the solute equilibrium concentrations in the aqueous and solid phases (Ce and q, respectively). Hence, it offers a very flexible way to optimize the sorption isotherm parameters, as the optimization can be carried out over the residuals of q, of Ce, or of both simultaneously (i.e., orthogonal distance regression). The function includes the most common sorption isotherm models as predefined equations, as well as the possibility to easily introduce custom-defined ones. The Ref_GN_LM macro performs the parameter optimization using a Levenberg-Marquardt modified Gauss-Newton iterative procedure. To evaluate the performance of the presented tool, both the function and the optimization macro were applied to different sorption data examples described in the literature. Results showed that the optimization of the isotherm parameters was successfully achieved in all cases, indicating the robustness and reliability of the developed tool. The presented software, available to researchers and students for free, has thus proven to be a user-friendly and interesting alternative to conventional fitting tools used in sorption data analysis.
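As a Python analogue of what ISOT_Calc and Ref_GN_LM do inside Excel, here is a minimal sketch fitting a Freundlich isotherm by Levenberg-Marquardt over the q residuals; the data values are hypothetical, and the tool itself can also fit over Ce or both residual sets.

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical sorption data: equilibrium aqueous concentration Ce (mg/L)
# and sorbed concentration q (mg/kg)
Ce = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 20.0])
q = np.array([12.0, 20.0, 33.0, 60.0, 88.0, 120.0])

def freundlich(params, Ce):
    Kf, n = params
    return Kf * Ce**n               # q = Kf * Ce^n

def residuals(params):
    # Residuals over q only; orthogonal distance regression would also
    # include residuals over Ce, as ISOT_Calc allows.
    return freundlich(params, Ce) - q

fit = least_squares(residuals, x0=[10.0, 0.5], method='lm')  # Levenberg-Marquardt
Kf, n = fit.x
print(Kf, n)
```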
Watershed Management Optimization Support Tool (WMOST) v1: Theoretical Documentation
The Watershed Management Optimization Support Tool (WMOST) is a screening model that is spatially lumped with options for a daily or monthly time step. It is specifically focused on modeling the effect of management decisions on the watershed. The model considers water flows and ...
Analysis on design and optimization of dispersion-managed communication systems
NASA Astrophysics Data System (ADS)
El-Aasser, Mostafa A.; Dua, Puneit; Dutta, Niloy K.
2002-07-01
The variational method is a useful tool that can be used for design and optimization of dispersion-managed communication systems. Using this powerful tool, we evaluate the characteristics of a carrier signal for certain system parameters and describe several features of a dispersion-managed soliton.
Expert systems tools for Hubble Space Telescope observation scheduling
NASA Technical Reports Server (NTRS)
Miller, Glenn; Rosenthal, Don; Cohen, William; Johnston, Mark
1987-01-01
The utility of expert system techniques for Hubble Space Telescope (HST) planning and scheduling is discussed, and a plan for the development of expert system tools to augment the existing ground system is described. Additional capabilities provided by these tools will include graphics-oriented plan evaluation, long-range analysis of the observation pool, analysis of optimal scheduling time intervals, construction of spacecraft activity sequences that minimize operational overhead, and optimization of linkages between observations. Initial prototyping of a scheduler used the Automated Reasoning Tool running on a LISP workstation.